About FrancisChung

  1. I ran the robots.txt file through this, and it says:

     Cache-Control => max-age=2592000
     Expires => Wed, 21 Feb 2018 18:21:44 GMT

     Which is a bit baffling. The homepage and main category pages are showing Cache-Control: no-cache at the moment. I do have ProCache installed, but it is currently off. These are our current Cache-Control settings:

```apache
<IfModule mod_headers.c>
# Ignore comments, everything is set to 8 days atm.
# 1 week for fonts
<FilesMatch "\.(ico|jpe?g|jpg|png|gif|swf)$">
    Header set Cache-Control "max-age=691200, public"
</FilesMatch>
<FilesMatch "\.(eot|svg|ttf|woff|woff2)$">
    Header set Cache-Control "max-age=691200, public"
</FilesMatch>
# 1 day for css / js files
<FilesMatch "\.(css)$">
    Header set Cache-Control "max-age=691200, public"
</FilesMatch>
<FilesMatch "\.(js)$">
    Header set Cache-Control "max-age=691200, private"
</FilesMatch>
<FilesMatch "\.(x?html?|php)$">
    Header set Cache-Control "private, must-revalidate"
</FilesMatch>
</IfModule>
```

     I will add a separate FilesMatch block for robots.txt (with the dot escaped in the pattern) and see what happens:

```apache
<FilesMatch "^robots\.(txt|php)$">
    Header set Cache-Control "max-age=0, public"
</FilesMatch>
```
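Those numbers can be sanity-checked with a few lines of PHP: 2592000 seconds is 30 days, which matches none of the 691200-second (8-day) rules above, so the header is being set somewhere else. A minimal sketch; `cacheControlMaxAge` is a hypothetical helper, not something from the thread:

```php
<?php
// Pull the max-age value (in seconds) out of a Cache-Control header string,
// returning null when no max-age directive is present.
function cacheControlMaxAge(string $header): ?int
{
    if (preg_match('/max-age=(\d+)/i', $header, $m)) {
        return (int) $m[1];
    }
    return null;
}

// The value reported for robots.txt: 2592000 seconds.
echo cacheControlMaxAge('max-age=2592000') / 86400;                          // 30 (days)

// The value configured in .htaccess above: 691200 seconds.
echo cacheControlMaxAge('Cache-Control: max-age=691200, public') / 86400;    // 8 (days)
```

Comparing the parsed value against what .htaccess actually sets is a quick way to spot a rule being overridden upstream (CDN, proxy, or another module).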
  2. Hi @teppo, it's been going on for a few weeks now. Very frustrating. It's definitely beyond the wait-and-see stage now. Is the robots.txt max-age defined in .htaccess? I don't recall creating such a setting for robots.txt specifically, but I do recall a cache-headers setting for the website in general.
  3. Hi, I have an ongoing issue with Google SEO that I can't seem to fix. I'm wondering if anyone has come across a similar situation? We deployed a new version of the website using a new deployment methodology and, unfortunately, the wrong robots.txt file was deployed, basically telling Googlebot not to crawl the site. The end result is that if our target keywords are used in a Google search, our website is displayed on the results page with "No information is available for this page." Google provides a link on the search listing for fixing this situation, but so far nothing I have tried there has fixed it. I was wondering if anyone has gone through this scenario, and what the steps were to remedy it? Or perhaps it has worked and I have misunderstood how it works? The steps I have tried in the Google Webmaster Tools:

     - Gone through all crawl errors
     - Restored the robots.txt file and verified it with the robots.txt Tester
     - Fetched / Fetched and Rendered as Google, as both desktop and mobile, using the root URL and other URLs, requesting indexing for the URL alone and for the URL and its linked pages
     - Uploaded a new sitemap.xml

     On the Sitemap page in particular, it says 584 submitted, 94 indexed. Would the search engine return "No information is available" because the page is not indexed? The pages I'm searching for are our two most popular keywords and entry points into the site, and they are also two of our most popular category pages. So I'm thinking that probably isn't the case, but ... how can I prove / disprove that the category pages are being indexed? The site in question is
     The keywords to search for are fingerspiele and kindergedichte.
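For reference, the difference between a robots.txt that blocks all crawling and one that allows it is a single character. This is a generic illustration of the two cases; the actual file deployed by mistake isn't quoted in the thread:

```
# Blocks everything -- the kind of rule that gets deployed by mistake:
User-agent: *
Disallow: /

# Allows everything (empty Disallow, or omit the rule entirely):
User-agent: *
Disallow:
```

Note that a blocking robots.txt prevents crawling, not indexing: a page Google already knows about can stay in the index but show "No information is available for this page" until it is successfully recrawled.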
  4. Awesome tip, @Robin S. Local history is something I use quite regularly, but I suppose I should be committing more often, as per the Git mantra. Perhaps users who rely on local history can back up the system cache first, before invalidating the cache and seeing what sort of performance gains they get. The locations are:

     - Windows: <User home>\.PhpStormXX\system stores PhpStorm data caches.
     - Linux: ~/.PhpStormXX/system stores PhpStorm data caches.
     - macOS: ~/Library/Caches/PhpStormXX contains data caches, logs, local history, etc.

     These files can be quite significant in size. Source:
  5. Completely understand if you don't wish to create accounts for the sake of it. I happen to use Bitbucket quite a lot as well, which integrates well with Sourcetree since they share the same maker.
  6. I had another read, and I think the best course of action is to try to reverse engineer a working plugin?
  7. I use Atlassian's Sourcetree, if people want an alternative.
  8. @maxf5, I tried about a year ago but I couldn't get it to work. I assume you're on the latest build of PHPStorm?
  9. I forgot about this ... everyone should upvote this to give it a chance of being implemented. Or even better, are there any Java/PW gurus out there?
  10. Wait till you get to factory methods, contravariance, covariance, and the Builder and Specification patterns ... haha. Ideally, a class should do one thing and only one thing. So you should probably make multiple smaller classes, each with a very distinct and easily identifiable singular purpose, rather than one monolithic class that tries to do everything.
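The "one thing per class" idea can be sketched in a few lines of PHP. The class names and the form-handling scenario here are hypothetical, just to show validation and rendering living in separate small classes instead of one monolith:

```php
<?php
// Each class has a single, easily identifiable purpose.
class FormValidator
{
    /** @return string[] List of error messages; empty when the input is valid. */
    public function validate(array $input): array
    {
        $errors = [];
        if (empty($input['email'])) {
            $errors[] = 'Email is required.';
        }
        return $errors;
    }
}

class FormRenderer
{
    // Only knows how to turn errors into output; no validation logic here.
    public function renderErrors(array $errors): string
    {
        return implode("\n", $errors);
    }
}

$errors = (new FormValidator())->validate(['email' => '']);
echo (new FormRenderer())->renderErrors($errors); // Email is required.
```

Because each class does one job, either can be swapped or unit-tested without touching the other.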
  11. Wow, it's like Night & Day compared to your first iteration. Looking very good.
  12. One of OO programming's commandments is: "Thou shalt favour composition over inheritance (or extending a class)." It can seem complex, especially if you're starting out with OO concepts. The has one of the best explanations I've seen. @bernhard's code above is an example of composition.
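A minimal sketch of the idea in PHP: the `Mailer` *has a* `Logger` (composition) rather than *being* one (inheritance). The class names are hypothetical, chosen only to illustrate the pattern:

```php
<?php
class Logger
{
    private array $lines = [];

    public function log(string $msg): void
    {
        $this->lines[] = $msg;
    }

    public function lines(): array
    {
        return $this->lines;
    }
}

class Mailer
{
    private Logger $logger;

    // The collaborator is injected rather than inherited,
    // so it can be swapped or mocked without touching Mailer.
    public function __construct(Logger $logger)
    {
        $this->logger = $logger;
    }

    public function send(string $to): bool
    {
        // ... actual delivery would happen here ...
        $this->logger->log("Mail queued for $to");
        return true;
    }
}

$logger = new Logger();
(new Mailer($logger))->send('user@example.com');
echo $logger->lines()[0]; // Mail queued for user@example.com
```

Had `Mailer` extended `Logger` instead, it would be stuck with one logging implementation forever; with composition, the dependency is just another constructor argument.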
  13. Everyone starts somewhere at the bottom and makes their way up. Your code wasn't that terrible; it just needed a small house clean. Glad to hear this. Makes it all worthwhile.
  14. You could take Bernhard's example and encapsulate it even further ... I've added two new functions, NoErrors and DisplayErrors, to encapsulate and hide away more of the logic.
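Bernhard's original code isn't quoted here, so this is only a hedged sketch of the encapsulation idea: wrap the error list in a small class so calling code asks `noErrors()` / `displayErrors()` instead of poking at a raw array. The `ErrorBag` name and the `<br>` separator are assumptions:

```php
<?php
class ErrorBag
{
    private array $errors = [];

    public function add(string $msg): void
    {
        $this->errors[] = $msg;
    }

    // Callers ask a question instead of inspecting the array themselves.
    public function noErrors(): bool
    {
        return $this->errors === [];
    }

    // Formatting lives in one place, hidden from the main logic.
    public function displayErrors(): string
    {
        return implode('<br>', $this->errors);
    }
}

$bag = new ErrorBag();
$bag->add('Name is required.');
if (!$bag->noErrors()) {
    echo $bag->displayErrors(); // Name is required.
}
```

If the error format ever changes (say, to a `<ul>` list), only `displayErrors()` needs editing.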
  15. You want to define $errorMsg outside the function, where your main logic resides. I think I should also pass the message by reference, not by value, which is the default way of passing arguments. Funnily enough, has a very similar example there.
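In PHP, scalars are passed by value unless the parameter is declared with `&`; with `&$errorMsg`, the function writes into the caller's variable. A minimal sketch of that pattern (the `validateAge` function and its rule are made up for illustration):

```php
<?php
// The & on $errorMsg makes it a reference to the caller's variable,
// so the function can report an error message back without returning it.
function validateAge(int $age, string &$errorMsg): bool
{
    if ($age < 0) {
        $errorMsg = 'Age cannot be negative.';
        return false;
    }
    return true;
}

$errorMsg = '';                   // defined outside, where the main logic lives
if (!validateAge(-1, $errorMsg)) {
    echo $errorMsg;               // Age cannot be negative.
}
```

Without the `&`, the assignment inside the function would change only a local copy and `$errorMsg` in the main logic would stay empty.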