Gallery Images Blocked For Google Indexing


clickpass

Hello again dear forum!

This is my latest website: http://lobo-taberna.de, powered by ProcessWire 3.0.98. The galleries are image fields, of course, which are displayed in a Featherlight gallery (https://noelboss.github.io/featherlight/). I am also using the Masonry plugin for the layout (https://masonry.desandro.com/).

My code goes like this:

<div class="maze"
	data-featherlight-gallery
    data-featherlight-filter="a"
    >
    	<?php
        $a = 0;
        foreach ($page->taberna_imgs as $image) {
            $a++;
            echo "<div class='item'>
            		<p class='p-google'><a href='$image->url'>
            			<img src='$image->url' alt='$image->description'>
            		</a></p>
            	  </div>"; 
        } ?>
</div>

There is no problem with that. The rendering works. But Google won't index the images.

Somewhere I read that an image linked directly inside a <div> container wouldn't be indexed. So I wrapped the <a> tag in a <p></p>, but that doesn't seem to be the problem; at least the images are not indexed yet. Then I read a forum post by Ryan saying that all ProcessWire folders except the templates folder are excluded from searches by robots rules, which might be documented somewhere in the PW docs. (I am sorry that I lost the link to the post.) I am using a robots.txt in the root folder which allows crawling everything.

Simple question: How can I get my images indexed? Of course they are saved inside the assets folder.


Is it a brand-new site? It might take a few days if you just launched.

1 hour ago, clickpass said:

Somewhere I read that an image linked directly inside a <div> container wouldn't be indexed.

That's nonsense.

1 hour ago, clickpass said:

Then I read a forum post by Ryan saying that all ProcessWire folders except the templates folder are excluded from searches by robots rules, which might be documented somewhere in the PW docs.

You don't even have a robots.txt file?

You probably mean .htaccess. Yes, there's a huge block in ProcessWire's stock .htaccess where direct access to all kinds of internal paths is forbidden and directory listings (file lists) are disabled. But that hasn't got anything to do with your issue.
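For illustration, that blocking section has roughly the following shape. This is a condensed sketch from memory, not the literal file; check the .htaccess shipped with your ProcessWire version for the exact rules:

RewriteEngine On
# Deny direct HTTP access to internal ProcessWire directories (simplified sketch)
RewriteCond %{REQUEST_URI} (^|/)site(-[^/]+)?/assets/(cache|logs|backups|sessions|config|install|tmp)($|/) [OR]
RewriteCond %{REQUEST_URI} (^|/)site(-[^/]+)?/install($|/) [OR]
RewriteCond %{REQUEST_URI} (^|/)(wire|site(-[^/]+)?)/(core|modules)/
RewriteRule ^.*$ - [F,L]

Note that site/assets/files/, where your uploaded images actually live, is not in that list, so crawlers can fetch the images just fine.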

But you have a sitemap.xml in place that even lists images. Looking at that sitemap.xml actually puzzles me: you have no (visible) page /tapas, only #tapas on the root page. Since you have a one-page site, you have to adjust the setup somehow. Either turn #foo into /foo or adjust your sitemap.xml accordingly.
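Crawlers ignore everything after the # in a URL, so /#tapas can't work as a <loc> entry. A minimal sitemap for a one-page site that also announces its images could look like this (a sketch using Google's image sitemap extension; the image URL is a placeholder):

<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9"
        xmlns:image="http://www.google.com/schemas/sitemap-image/1.1">
  <url>
    <loc>https://www.lobo-taberna.de/</loc>
    <image:image>
      <image:loc>https://www.lobo-taberna.de/site/assets/files/1/example.jpg</image:loc>
    </image:image>
  </url>
</urlset>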


One image is actually indexed...

[Screenshot: Google Images result showing the one indexed image]

There are a few things you might want to change or at least take a closer look at.

  1. use only https:// and redirect to it
  2. choose a preferred domain (see below)
  3. reduce page size drastically (~13MB is too large) and
  4. speed it up
  5. remove the preloader as it stays for hours
  6. fix your sitemap (as mentioned by @dragan)
  7. set up Google Search Console to figure out problems with your site
  8. add a robots.txt (as mentioned above) and reference your sitemap in it (see the sketch after this list)
  9. and maybe add a nice favicon.ico
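A permissive robots.txt that also announces the sitemap is tiny. Assuming the sitemap sits in the document root, it could be as simple as:

User-agent: *
Disallow:

Sitemap: https://www.lobo-taberna.de/sitemap.xml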

Your site is available through 4 different addresses:

  1. http://lobo-taberna.de/
  2. http://www.lobo-taberna.de/
  3. https://lobo-taberna.de/
  4. https://www.lobo-taberna.de/

Choose one.
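If you settle on https://www.lobo-taberna.de/, for example, one common way to fold the other three addresses into it is a 301 redirect in .htaccess (a sketch; adjust the host if you prefer the non-www variant):

RewriteEngine On
# Redirect plain-http and non-www requests to the canonical https://www. host
RewriteCond %{HTTPS} off [OR]
RewriteCond %{HTTP_HOST} !^www\. [NC]
RewriteRule ^ https://www.lobo-taberna.de%{REQUEST_URI} [R=301,L]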

Reduce page size to ... way less than that.

[Screenshot: network panel showing the ~13 MB total page weight]


Thank you, dragan & wbmnfktr!

@dragan, this information was very interesting to me. I am using Mike Rockett's marvellous sitemap module now; that should do it after some crawl time. The paths are fixed.

@wbmnfktr, I am using a hacky iframe for the Google previews, which can't be cross-referenced under HTTPS. I also pay some loading time for this goody, but I find it so nice and found no alternative. But I really should use only one address, yes. Of course I won't remove the preloader. The site needs time, like all good things (kidding), but it's better to prevent FOUTs. Google Search Console is set up and working. I find that a page size of a ridiculous 12 MB shouldn't be a problem anymore. But you are right: the images must be reduced in size.
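For the image sizes: ProcessWire can generate resized variants on the fly via the image API, so the grid can load small thumbnails while the lightbox link points to a capped larger variant. A sketch based on the loop from the first post (the 1600 and 400 pixel widths are arbitrary choices):

<?php foreach ($page->taberna_imgs as $image): ?>
    <div class="item">
        <p class="p-google">
            <!-- lightbox opens a capped large variant instead of the original -->
            <a href="<?= $image->width(1600)->url ?>">
                <!-- the grid itself only loads a small thumbnail -->
                <img src="<?= $image->width(400)->url ?>" alt="<?= $image->description ?>">
            </a>
        </p>
    </div>
<?php endforeach; ?>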

Thank you very much!


6 hours ago, clickpass said:

I find that a page size of a ridiculous 12 MB shouldn't be a problem anymore.

Well, well, well. I don't even know where to start. Speed is a big factor, and you should always strive to optimize for it. People on spotty, slow mobile networks might beg to differ with your opinion, and Google too. In the case of a typical single-page website, you could at least implement some sort of lazy loading, which would improve the perceived loading time.
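The cheapest option is the native loading attribute; browsers that don't support it simply ignore it, so there's no downside. A sketch applied to the <img> tag from the gallery loop above:

<img src="<?= $image->width(400)->url ?>"
     alt="<?= $image->description ?>"
     loading="lazy">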

Also, run Google Chrome's Lighthouse speed audit, and you'll spot some strange problems there related to the site's images.

[Screenshot: Lighthouse audit flagging images it was unable to decode]


