
IMagick vs Google PageSpeed Insights


heldercervantes


Hey there brothers in arms.

I'm doing some PageSpeed optimizations on my site (very close to cracking open a new version of SuperTINY). And damn, this thing is complaining about almost every image I've got that comes from the CMS.

I've already experimented with the default compression setting in the module, reducing the quality from the default 80 all the way down to 50, but Google still insists that they need compression.

Does anyone have any tricks to handle this?


Do you have any example URLs? Do you allow images inside WYSIWYG fields? Did you do any tests with desktop image-processing software (e.g. Photoshop vs. PW)?

Re: "setting in the module": which module are you talking about? Why not post in that module's support forum thread?


Maybe this can help...
I often use delayed (lazy) loading for images, with the library from this page:

https://appelsiini.net/projects/lazyload/

Add the script at the bottom of the page with async or defer to improve load time:

<script src="<?php echo $templ_url?>assets/js/lazyload.min.js" defer></script>
<script>
    // initialize lazyload once the page has finished loading
    window.addEventListener("load", function(){
        lazyload();
    });
</script>


Then I add the lazyload class to the image, along with a data-src attribute:

<img class='lazyload' data-src="/your-image-url" alt="img-alt">

 


Are you using the HTML5 srcset attribute to provide different sizes of the images for different screen resolutions?

I've found a quality setting of about 60 is as optimised as you need, as long as the images are the right size for the resolution they're displayed at.

Even if an image is only a few pixels larger than the size it's rendered at, I think PageSpeed Insights can make a fuss.
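To tie those two together, here's a rough sketch of how that might look with ProcessWire's image API (the field name 'images' and the widths are just examples, adjust to your setup):

$img = $page->images->first(); // assumes the page has at least one image
$small  = $img->width(480);
$medium = $img->width(960);
$large  = $img->width(1600);
// let the browser pick the smallest variation that fits the viewport
echo "<img src='{$medium->url}'
     srcset='{$small->url} 480w, {$medium->url} 960w, {$large->url} 1600w'
     sizes='(max-width: 600px) 480px, 960px'
     alt='{$img->description}'>";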

Also, somewhere in the core (I can't remember the exact module) I changed the image resizing function to use progressive encoding, as that can result in faster perceived rendering of JPEGs: they can start rendering before they're fully loaded. I can't find where I did this now, and it would be nice for progressive-encoded JPEGs to officially be an option in the core.
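For reference, progressive encoding with plain GD is just one extra call before saving. A minimal sketch (not the actual core change I made, and the file names are placeholders):

$im = imagecreatefromjpeg('photo.jpg');
imageinterlace($im, 1);                      // 1 = save as progressive/interlaced JPEG
imagejpeg($im, 'photo-progressive.jpg', 80); // re-save at quality 80
imagedestroy($im);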

 


Like you, I've tried lots of different compression techniques. It seems Google only likes its own compressed files. What I do to get the scores up:

1. Run the test, then download the zip containing the compressed images generated by Google PageSpeed Insights and expand the file

2. Upload the images to the assets/files/[page#] directory and overwrite the images generated by PW

3. Run GPSI again and your score will soar

It's not a long-term answer but helpful in development.

 

 


What G**gle wants you to do is optimize your images beyond what GD, IMagick or other image libraries do.

They want you to optimize *every* *final* image variation with tools like jpegOptim or similar, or to use an image optimization service for your final variation images.
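As a rough sketch of what that could look like from PHP (assuming the jpegoptim binary is installed on the server, and the path is just an example):

// losslessly strip metadata and cap quality on every final JPEG variation
$it = new RecursiveIteratorIterator(
    new RecursiveDirectoryIterator('/path/to/site/assets/files')
);
foreach ($it as $file) {
    if (strtolower($file->getExtension()) === 'jpg') {
        exec('jpegoptim --strip-all --max=80 ' . escapeshellarg($file->getPathname()));
    }
}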

And image compression aside, this analysis system is not very intelligent!
As an example, when you use one single CSS file and one single JS file for a whole site, together with aggressive client caching, you will still get bad results for a single page URL, because GPSI doesn't take into account that a browser already has those files in its cache after retrieving them once.

And another thing: I have never seen a bad ranking for fetching G**gle fonts from their servers with every f**king page reload or page call. You know why those fonts are not kept in the browser cache, yes?

When self-hosting those fonts, every page load after the first one gets the fonts from the browser cache; no extra remote connection is needed!
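For what it's worth, a minimal self-hosting sketch (the font name and file paths are placeholders): drop the woff/woff2 files into your templates folder and declare them yourself instead of pulling the stylesheet from Google:

/* font name and paths are placeholders, adjust to your setup */
@font-face {
    font-family: 'MyWebFont';
    src: url('/site/templates/fonts/mywebfont.woff2') format('woff2'),
         url('/site/templates/fonts/mywebfont.woff') format('woff');
    font-weight: 400;
    font-style: normal;
}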


@horst Google is Google... not 100% certain, but I strongly suspect G adds meta to JPEGs it optimizes and so gives them higher scores, even if the original was fully optimized first.

Even so, I still regularly get GPSI scores of 75%+ on mobile and 90%+ on desktop with PW, ProCache etc., and with careful coding so there are no errors on https://validator.w3.org/. Can only do what I can do.

I use the PW site results as comparisons to WordPress site performances to show my clients that PW is the way to go.

For me, it's not about getting upset with G, it's more about proving how well PW performs.


Thanks for your input.

@psy: your idea of downloading their version of the images and uploading those did cross my mind, but hell, are our clients supposed to go through all that trouble after we hand over the keys?

@dragan: no wysiwyg field images. Only controlled rendering of image fields.

<rant>

In this particular case I've added a condition to display the analytics chunk for everyone except Google, because it was complaining about me not caching the file that's on their server. It also complains about the font that I'm loading from Google Fonts.

The inline critical CSS thing is another insanity. We're already minifying, caching and gzipping everything and STILL need to overengineer our frontends to appease the beast. So to make them happy, our users will have to see fonts changing for a split second after the page loads, and a whole bunch of other weirdness.

Then there's the 200 byte SVG that can be reduced by 20% if compressed.

</rant>


13 minutes ago, heldercervantes said:

Then there's the 200 byte SVG that can be reduced by 20% if compressed.

I have similar issues with this - they're all already so small, but I get the feeling that Google penalizes based on the possible percentage reduction rather than the actual bytes saved.


 

2 minutes ago, adrian said:

I get the feeling that Google penalizes based on the possible percentage reduction rather than the actual bytes saved

I think it's just the minification style. I already export my SVGs minified, but there's something there, like unnecessary metadata, that Google detects and immediately flags.


12 hours ago, horst said:

What G**gle wants you to do is...

So true, everything you've said @horst.

It is insane that we are sort of forced to adhere to Google's standards and not to common sense. However, I still find that content is king, so I do not push it too far optimizing things to Google's liking. Instead, I tell my clients to write frequent news/blog posts. That works :)


On 07/12/2017 at 6:08 PM, heldercervantes said:

Does anyone have any tricks to handle this?

Yep, ignore it! Bunch of google ass. My sites load well fast enough, google says otherwise of course. But my target audience is usually in the UK where people have quickish internet.


Just a quick side note on Google PageSpeed Insights and Pingdom...

I analysed a page today out of curiosity because the images were loading slowly... devtools showed that the front page loaded 19.8 MB :D (WordPress, what else...); PageSpeed Insights says 80/100, "needs work". So I checked another site from a few days before which I knew got 22/100... that one loads 9.1 MB on the front page...

Pingdom at least shows the load time in seconds and a percentage of slower websites... but still, the 20 MB site (Pingdom said it's actually 28 MB) is faster than 38% of the tested sites... seriously?! OK, I get it... but jQuery with its 84 kB is too huge for the modern web :D:D


On 12/9/2017 at 10:02 PM, SamC said:

Yep, ignore it! Bunch of google ass. My sites load well fast enough, google says otherwise of course. But my target audience is usually in the UK where people have quickish internet.

Is this something that you can truly ignore? AFAIK your Google Pagespeed score does impact your SEO rankings. Otherwise, I would love to heed your advice and ignore what Google has to say.


Check out the AutoSmush module for image optimization; with it you can easily optimize images to a point Google PageSpeed accepts most of the time. Usually I can get PageSpeed into the "green" values and that's where I stop optimizing, unless there's something trivial I can solve easily.

The more optimization tricks or workarounds you add, the less maintainable your site becomes, so there is a point where you'd better stop. Page speed analyzers are only indicators, not standards; this is something the so-called SEO experts usually get wrong :)


Given that this thread is about Google Pagespeed, an easy improvement for those with enough control over the server to install Apache/Nginx modules is mod_pagespeed.

I experimented with it on a cheap VPS earlier this year and it's almost like witchcraft! And it doesn't break the PW admin like CloudFlare does/did (haven't checked for a while).
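In case anyone wants to try it, a minimal sketch for an Apache vhost or .htaccess (assuming the module is already installed; these filters are just examples):

# enable mod_pagespeed and let it recompress images and combine CSS on the fly
ModPagespeed on
ModPagespeedEnableFilters rewrite_images,combine_css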


5 hours ago, FrancisChung said:

Is this something that you can truly ignore? AFAIK your Google Pagespeed score does impact your SEO rankings. Otherwise, I would love to heed your advice and ignore what Google has to say.

I guess it depends on how you expect to find visitors. Imagine spending a few days trying to optimize for PageSpeed to please almighty Google (and maybe get some random tyre-kicker search visitors), then consider that in the same time you could have run a Facebook/Twitter campaign, or even physically gone out to find customers, and maybe got some serious leads.

For what I'm doing, word of mouth is going to be far more important than shaving a few milliseconds off page speed. You just have to use your time wisely: if PageSpeed is important and will benefit you directly, then sink a few days into it.

PS: if you ever find a way to get rid of "Eliminate render-blocking JavaScript and CSS in above-the-fold content", then please let me know. Inline CSS is bad, let's use separate stylesheet files. Actually, let's combine them into one file, and let's minify them. No, let's use webpack to use JS to pull in CSS inline. No, let's just load a small part of the CSS 'above the fold' and then load the rest later. What about smart watches?

At this rate, we don't need to worry about IPv4 running out, everyone's too busy optimizing existing sites and learning tools to make anything new.


2 hours ago, SamC said:

PS: if you ever find a way to get rid of "Eliminate render-blocking JavaScript and CSS in above-the-fold content", then please let me know

I did post a question to the community a little while back, but I've gotten zero responses to date.

https://processwire.com/talk/topic/17676-critical-a-tool-to-extract-the-critical-css-from-a-web-page/


On 12/7/2017 at 12:08 PM, heldercervantes said:

And damn, this thing is complaining about almost every image I've got that comes from the CMS.

Public enemy #1 for project completion. 

I have done unmentionable things, like "if-ing out" some scripts and stuff that gets Pagespeed into the red zone, all for the sake of sales.

