heldercervantes

IMagick vs Google PageSpeed Insights

Hey there brothers in arms.

I'm doing some PageSpeed optimizations on my site (very close to cracking open a new version of SuperTINY). And damn, this thing is complaining about almost every image I've got that comes from the CMS.

I've already experimented with the compression setting in the module, reducing the quality from the default 80 all the way down to 50, but Google still insists the images need compression.

Does anyone have any tricks to handle this?

Do you have any example URLs? Do you allow images inside WYSIWYG fields? Did you do any tests with desktop image-processing software (e.g. Photoshop vs. PW)?

re: "setting in the module": Are you talking about a specific module? Why not post in that specific forum thread?

Maybe this can help: I often use delayed (lazy) loading for images, with this library:

https://appelsiini.net/projects/lazyload/

Add the script at the bottom of the page, with async or defer, to improve page load time:

<script src="<?php echo $templ_url; ?>assets/js/lazyload.min.js" defer></script>
<script>
    // initialize lazyload once the rest of the page has loaded
    window.addEventListener("load", function(){
        lazyload();
    });
</script>


Then I add the lazyload class to each image, along with a data-src attribute in place of src:

<img class='lazyload' data-src="/your-image-url" alt="img-alt">
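
One caveat worth adding (my note, not part of the original post): with only a data-src, the image never appears for visitors without JavaScript, so a noscript fallback can help:

<noscript>
    <img src="/your-image-url" alt="img-alt">
</noscript>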

 

Are you using the HTML5 srcset attribute to provide different sizes of the images for different screen resolutions?
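
For what it's worth, a minimal sketch of srcset markup with PW's image API (the 'images' field name and the breakpoints are just examples):

<?php
// Sketch, assuming an 'images' field on the page. width() creates
// (or reuses) a resized variation at the given pixel width.
$image = $page->images->first();
?>
<img src="<?= $image->width(800)->url ?>"
     srcset="<?= $image->width(400)->url ?> 400w,
             <?= $image->width(800)->url ?> 800w,
             <?= $image->width(1200)->url ?> 1200w"
     sizes="(max-width: 600px) 400px, 100vw"
     alt="<?= $image->description ?>">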

I've found a quality of about 60 is as optimised as needed, so long as the images are the right size for the resolution they're displayed at.

Even if an image is only a few pixels larger than the resolution it's rendered at, I think PageSpeed Insights can make a fuss.

Also, somewhere in the core (I can't remember the exact module) I changed the image resizing function to use progressive encoding, as that can result in faster perceived rendering of JPEGs: they can start rendering before they're fully loaded. I can't find where I did this now, and it would be nice for progressive-encoded JPEGs to be an official option in the core.
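
For reference, this is roughly what progressive encoding looks like at the PHP level (a sketch for illustration, not the actual core change):

<?php
// Sketch only: writing a progressive (interlaced) JPEG.

// With GD:
$im = imagecreatefromjpeg($srcPath);
imageinterlace($im, true);  // switch output to progressive
imagejpeg($im, $dstPath, 80);
imagedestroy($im);

// With IMagick:
$imagick = new Imagick($srcPath);
$imagick->setInterlaceScheme(Imagick::INTERLACE_PLANE);
$imagick->setImageCompressionQuality(80);
$imagick->writeImage($dstPath);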

 


Like you, I've tried lots of different compression techniques. It seems Google only likes its own compressed files. Here's what I do to get the scores up:

1. Run the test, then download and expand the zip of compressed images generated by Google PageSpeed Insights.

2. Upload those images to the assets/files/[page#] directory, overwriting the variations generated by PW.

3. Run GPSI again and your score will soar.

It's not a long-term answer, but it's helpful during development.

What g**gle wants you to do is optimize your images beyond what GD, IMagick, or other image libraries do.

They want you to optimize *every* *final* image variation with tools like jpegoptim, or to use an image optimization service on your final variation images.
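
For example, something like this could run jpegoptim over each variation as it's created (a rough sketch: it assumes jpegoptim is installed on the server and that Pageimage::size is hookable in your PW version):

<?php
// Rough sketch: post-process each generated JPEG variation with jpegoptim.
$wire->addHookAfter('Pageimage::size', function($event) {
    $variation = $event->return; // the resized Pageimage
    if($variation && strtolower($variation->ext) == 'jpg') {
        // --strip-all removes metadata, --max caps the quality
        exec('jpegoptim --strip-all --max=80 ' . escapeshellarg($variation->filename));
    }
});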

And besides the image compression, this analyzing system is not very intelligent! For example, when you use one single CSS file and one single JS file for a whole site, together with aggressive client-side caching, you will get bad results for a single page URL, because GPSI doesn't take into account that the browser already has those files in its cache after retrieving them once.

And another thing: I have never seen a bad rating for fetching g**gle fonts from their servers with every f**king page reload or page call. You know why those fonts are not kept in the browser cache, yes?

When self-hosting those fonts, every page load after the first one gets the fonts from the browser cache; no extra remote connection is needed!
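
Self-hosting is just a matter of serving the font files yourself and declaring them in your CSS (the font name and paths below are placeholders):

<style>
    /* placeholder name/path: serve the .woff2 files from your own site */
    @font-face {
        font-family: 'MyWebFont';
        src: url('/site/templates/fonts/mywebfont.woff2') format('woff2');
        font-display: swap; /* show fallback text while the font loads */
    }
    body { font-family: 'MyWebFont', sans-serif; }
</style>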


@horst Google is Google... not 100% certain, but I strongly suspect G adds meta to JPEGs it optimizes, and so gives higher scores even if the original was fully optimized first.

Even so, I'm still regularly getting GPSI scores of 75%+ on mobile and 90%+ on desktop with PW, ProCache etc., and with careful coding so there are no errors on https://validator.w3.org/. I can only do what I can do.

I use the PW site results as comparisons to WordPress site performances to show my clients that PW is the way to go.

For me, it's not about getting upset with G, it's more about proving how well PW performs.


Thanks for your input.

@psy: your idea of downloading their version of the images and uploading those did cross my mind, but hell, are our clients supposed to go through all that trouble after we hand over the keys?

@dragan: no wysiwyg field images. Only controlled rendering of image fields.

<rant>

In this particular case I've added a condition that displays the analytics chunk for everyone except Google (see the sketch after this rant), because it was complaining about me not caching a file that lives on their own server. It also complains about the font I'm loading from Google Fonts.

The inline critical CSS thing is another insanity. We're already minifying, caching and gzipping everything, and STILL need to overengineer our frontends to appease the beast. So to make them happy, our users get to see fonts changing for a split second after the page loads, plus a whole bunch of other weirdness.

Then there's the 200 byte SVG that can be reduced by 20% if compressed.

</rant>
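
The Google-exclusion condition mentioned above could look something like this (a hypothetical sketch: the user-agent substrings are assumptions, so verify them against your own access logs):

<?php
// Hypothetical sketch: skip the analytics embed for PageSpeed/Lighthouse.
$ua = isset($_SERVER['HTTP_USER_AGENT']) ? $_SERVER['HTTP_USER_AGENT'] : '';
$isPsi = stripos($ua, 'Speed Insights') !== false
      || stripos($ua, 'Chrome-Lighthouse') !== false;
if(!$isPsi) {
    echo $analyticsSnippet; // placeholder for your tracking code
}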

13 minutes ago, heldercervantes said:

Then there's the 200 byte SVG that can be reduced by 20% if compressed.

I have similar issues with this. They're all already so small, but I get the feeling that Google penalizes based on the possible percentage reduction rather than the actual bytes saved.


 

2 minutes ago, adrian said:

I get the feeling that Google penalizes based on the possible percentage reduction rather than the actual bytes saved

I think it's just the minification style. I already export my SVGs minified, but there's something in there, like unnecessary metadata, that Google detects and immediately flags.
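
If it is leftover metadata, a pass with a tool like svgo should clear it (a sketch: assumes svgo is installed globally via npm, and $svgPath is a placeholder):

<?php
// Sketch: strip editor metadata, comments and unused defs from an SVG.
// svgo rewrites the file in place when no output file is given.
exec('svgo --multipass ' . escapeshellarg($svgPath));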

12 hours ago, horst said:

What g**gle wants you to do is...

So true, all of what you've said @horst.

It is insane that we are sort of forced to adhere to Google's standards and not to common sense. However, I still find that content is king, so I don't push optimization too far toward Google's liking. Instead, I tell my clients to write frequent news/blog posts. That works :)

On 07/12/2017 at 6:08 PM, heldercervantes said:

Does anyone have any tricks to handle this?

Yep, ignore it! Bunch of Google ass. My sites load well fast enough; Google says otherwise, of course. But my target audience is usually in the UK, where people have quickish internet.

