Hello everyone. I was trying to update the SEO meta title, description, and meta keywords for my website in ProcessWire CMS. The changes save in the backend, but they are not reflected on the website itself. Please help me with this issue; please find the attached screenshot for your reference. TIA.
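Before digging into the module, it can help to confirm what the rendered HTML actually contains, since "saved in the backend but not on the site" often means the template isn't outputting the fields or a cache is serving stale markup. A minimal sketch (the file below stands in for a saved copy of the live page, e.g. fetched with curl or wget; the field values are made up):

```shell
# page.html stands in for the live page's HTML (e.g. saved via curl/wget).
cat > page.html <<'EOF'
<html><head>
<title>Old title</title>
<meta name="description" content="Old description">
</head><body></body></html>
EOF

# If the old values still appear here, the template is likely not outputting
# the updated fields, or a template/page cache is serving stale markup.
grep -i -E '<title>|name="description"' page.html
```

If the old values show up in the live HTML, clearing ProcessWire's template cache (if enabled) is a common first thing to try.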
I have been asked by a client whether we can set up load balancing for their existing ProcessWire site.
From my investigations on Google and within these forums, it definitely seems possible, but as a newbie with only a basic understanding of the subject I'm a bit lost.
Does anyone know of any existing tutorials for setting up load balancing with PW?
What items would need to be changed on their current stand-alone install? Is there a list of best practices worth consulting, etc.?
As I understand it, we would need some sort of copying mechanism (most likely an rsync script) to make sure any uploaded assets are shared between the main server and the fallback ones; other than that I'm not sure what else would need to be amended.
Any thoughts/help would be greatly appreciated.
I've encountered a strange problem with PageReference + MapMarker.
The PageReference is being used for creating "tags", à la @renobird, from this comment here: (thx renobird!)
I then have a template with the PageReference on it, then below that a simple and standard MapMarker.
If I set the Page Reference to "Allow New Pages Created", it's great: new tags can be added. But when you then tab into the MapMarker and set the location, all appears fine until you either Save Page or Publish. At that point the page is saved and the new tags are added correctly, but no location is saved in the MapMarker.
If I set the Page Reference to NOT "Allow New Pages Created", the MapMarker location is saved perfectly well.
So does anyone know if something weird happens in the "Save" that successfully adds the new tags but loses what was entered into the MapMarker location?
Hi, I have an ongoing issue with Google SEO that I can't seem to fix. Wondering if anyone has come across a similar situation?
We deployed a new version of the website using a new deployment methodology and, unfortunately, the wrong robots.txt file was deployed, basically telling Googlebot not to crawl the site.
The end result is that if our target keywords are used for a (Google) search, our website is displayed on the search page with "No information is available for this page."
Google provides a link to fix this situation on the search listing, but so far everything I have tried in it hasn't fixed the situation.
I was wondering if anyone has gone through this scenario, and what the steps were to remedy it?
Or perhaps it has worked and I have misunderstood how it works?
The steps I have tried in the Google Webmaster Tool:
- Gone through all crawl errors
- Restored the robots.txt file and verified it with the robots.txt tester
- Fetch / Fetch and Render as Google, as both Desktop and Mobile, using the root URL and other URLs, with "Indexing Requested" and "Indexing Requested for URL and Linked Pages"
- Uploaded a new sitemap.xml
On the Sitemap page in particular, it says 584 submitted, 94 indexed.
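Alongside the Webmaster Tools checks above, it's worth confirming that the robots.txt actually being served no longer contains a blanket disallow. A minimal sketch (the file written here is an example of a permissive robots.txt, standing in for the deployed one):

```shell
# A permissive robots.txt: an empty Disallow value allows everything.
cat > robots.txt <<'EOF'
User-agent: *
Disallow:

Sitemap: https://sprachspielspass.de/sitemap.xml
EOF

# The deployment bug would show up as a blanket "Disallow: /" line.
if grep -q '^Disallow: */ *$' robots.txt; then
  echo "BLOCKED: robots.txt disallows the whole site"
else
  echo "OK: crawlers are allowed"
fi
```

Running the same grep against the live file (fetched with curl) confirms what Googlebot is actually seeing, independent of what the tester UI reports.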
Would the search engine return "No information available" because the page is not indexed? The pages I'm searching for correspond to our two most popular keywords and entry points into the site; they are also our two most popular category pages. So I'm thinking that probably isn't the case, but ...
How can I prove / disprove the category pages are being indexed?
The site in question is Sprachspielspass.de. The keywords to search for are "fingerspiele" and "kindergedichte".
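One cheap cross-check on the sitemap side is to count the <loc> entries being submitted and confirm the category pages are among them. A sketch with a cut-down example sitemap (the two URLs mirror the keywords above and are illustrative; the real file submits 584 URLs):

```shell
# A miniature sitemap.xml for illustration; the live one is much larger.
cat > sitemap.xml <<'EOF'
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url><loc>https://sprachspielspass.de/fingerspiele/</loc></url>
  <url><loc>https://sprachspielspass.de/kindergedichte/</loc></url>
</urlset>
EOF

# Count the URLs submitted, and check the two category pages are present.
grep -c '<loc>' sitemap.xml
grep -c 'fingerspiele' sitemap.xml
```

If the category URLs are in the sitemap and robots.txt is clean, a site:sprachspielspass.de/fingerspiele search in Google is the usual quick way to see whether the page itself has been re-indexed yet.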