An Accidental WordPress Experiment.

Last week, I accidentally did an experiment with my WordPress site (yes, this one).

The results of the experiment that I’d accidentally done were a bit surprising. I’ve also been searching the internet for related information, to see if anyone else had experienced a similar kind of thing, but I came up with nothing.

I thought I’d write it all down and publish it as a post before I forget what happened.


The backstory.

I originally made this site at the beginning of May 2023. The ongoing effect of Liz Truss tanking the economy, the cost of living crisis, working in web hosting, and having a boss that didn’t mind me offering website creation services all brought this about.

I’ll be quite honest with you; although I’ve been working in web hosting for about 9 years now as a systems administrator, I didn’t know a huge amount about WordPress. This is chiefly because from a systems administration perspective, WordPress is just another web application that runs on our platform(s).

Sure, I knew how to fix a lot of things that can go wrong with WordPress, but these fixes aren’t generally WordPress specific, as most web applications work in the same way (a bunch of PHP, a database, and a config file connecting the two).
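For anyone curious what that “config file connecting the two” looks like in WordPress’s case, it’s wp-config.php, and the database half of the wiring is just a handful of constants. A minimal excerpt (the values are placeholders, obviously):

<?php
// Excerpt from a typical wp-config.php - the "config file" that connects
// the PHP application to its database. Values here are placeholders.
define( 'DB_NAME',     'example_db' );   // the database WordPress reads/writes
define( 'DB_USER',     'example_user' ); // database username
define( 'DB_PASSWORD', 'change-me' );    // database password
define( 'DB_HOST',     'localhost' );    // database server
$table_prefix = 'wp_';                   // prefix for all WordPress tables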

Needless to say, I was a bit “green” when it came to using WordPress itself, but hey, you’ve got to start somewhere, and I quite like working things out and reading up on how to do things, so I got stuck in.

The original iteration of this site was built using a theme called Architect Studio, which is effectively a child theme of Aasta, and the Elementor page builder plugin.

Due to being a bit “green” I was pretty much finding things out as I went along.

So I built my site, got it looking how I wanted it to, then ran it through https://pagespeed.web.dev/ to see how its page output performed when rendered by browsers. As is normally the case when one works like this, the result was disappointing, and I had quite a lot of work to do to optimise my page output.

So I set about optimising: minimising render-blocking resources, reducing unused JavaScript and CSS, and sorting out caching.

Needless to say, that all took a while, but the performance metrics in https://pagespeed.web.dev/ started to look a lot better. All good.

Or was it…

When I was doing the minimising-render-blocking-resources and reducing-unused-JavaScript part, I was using a plugin called Asset CleanUp (which is great, by the way) to unload JavaScript and CSS that was present in my page output but not actually being used.

The Asset CleanUp plugin scans your site, lists all the resources that you could potentially unload, and then you can unload them either site-wide or on specific pages. The “Managing Javascript Assets” section of this Reduce Unused Javascript post outlines the full process. Handy, eh?

Anyway, when I was doing all the unloading of unused assets, I was a bit surprised by how much CSS and JavaScript the Elementor page builder was adding to pages. I’ll come back to this in a moment.

So then I had another problem with my site. Some kind of update broke how the caching plugin I was using worked, my site slowed right down, and my Google stats began to drop. My site sped up if I disabled and removed the caching plugin, so I ended up doing just that, then using a different one. Things then began to improve. All good.

Or was it…


Something was bugging me.

Although things had improved with regard to page output and performance metrics in https://pagespeed.web.dev/, my site did feel a bit sluggish.

I’m also a big fan of “things not being there that don’t need to be” and I had one part of my site (Elementor) adding all this CSS and JS that I didn’t need to be there, and another part of my site (Asset Clean Up) unloading all the unused CSS and JS that Elementor was adding.

It crossed my mind that, from a website operation perspective, this was the equivalent of two children arguing over whether socks are actually a necessity or a nice to have. Essentially lots of noise for minimum effect (or minimal page output in my case).

Also, simple logic dictates that it would be better if Elementor wasn’t putting all this CSS and JS there in the first place, rather than it doing that and Asset CleanUp then unloading most of it.

Anyway, the sluggishness was bothering me, and one of the things I tried was the Query Monitor plugin, to see if there was some kind of database/query overhead causing it. To cut a long story short, Asset CleanUp appeared frequently in the reports this plugin generates, and it did kind of make sense that it would, because I was unloading quite a lot of assets with it.

It crossed my mind that it would be better not to have to use Asset CleanUp in the first place, but if I did that, what about all the unused scripts Elementor was adding? What was I going to do about them?


Elementor? Is that you?

At this point I started trying to find out if other people had experienced the same as me, or if anyone knew of any similar issues, and after a lot of Googling and reading, I was a bit surprised by what I found.

From what I can make out, there’s this kind of “fight”, or at least opposing points of view, being published on the internet.

Some people were adamant that Elementor didn’t add bloat, others were adamant it did. Some people were saying Elementor wasn’t great for SEO, others were saying Elementor did nothing of the sort.

I was kind of sad when I was reading all this, mostly because from a usability and content appearance perspective, Elementor is absolutely fantastic. It’s also the kind of thing you can use in a site you’re making for a not-so-computery type of person: hand it over to them, show them how it works, and they go “oh, that’s not as difficult to use as I thought it would be”. Elementor has an awful lot going for it in this capacity. Even the free version has lots of features, lots of different page elements, and lots of functionality, and the pro version can even be used to (kind of) make your own theme look exactly how you want. Amazing! Who could ask for more?

The other thing to bear in mind is that (as a systems administrator) I work in a world of fact. Logs, errors, manuals, and best practices are the world I come from. A lot of what I was reading about this particular situation was (in most cases) not backed up with any kind of testing, or proof of what was being stated. It was probably about 90% conjecture in both the “Elementor is amazing” and “Elementor isn’t great” camps.

Consequently, I can’t really accept that what I’m talking about in this post is simply a product of using Elementor.

Now, I’m not going to profess to have any proof either way (I’ve got a full-time job; time, effort, etc. etc.), because I don’t. But while all this was going on, a friend asked me to make them a website, and I thought I’d try building their site using a minimal plugin set and WordPress’s own built-in page builder to see what happened.


Making my friend’s site.

The site that my friend wanted wasn’t massively complicated: 4 pages, a contact form, a gallery, a “past work” kind of affair. Nothing crazy, and no monster functionality.

After a bit of research (and having played around making child themes in the past), I decided to try using the Astra theme and Spectra (which is an addon to WordPress’s default blocks editor). I’d also decided to use a child theme, as this would give me more control over CSS, and I could also “do things” in the child theme’s functions.php file.
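For anyone who hasn’t made one before: a child theme is just a directory containing a style.css whose header names the parent theme via “Template:”, plus an optional functions.php. A minimal sketch of the classic generic pattern is below; note that some parent themes (Astra included) handle loading their own stylesheet, so check your theme’s documentation before enqueueing it a second time.

<?php
// Minimal child theme functions.php - a generic sketch, not Astra-specific.
// The child's style.css needs a header containing "Template: astra" (the
// parent theme's directory name) for WordPress to treat it as a child theme.
add_action( 'wp_enqueue_scripts', function () {
    // Classic pattern: load the parent stylesheet first, then the child's
    // own style.css on top, so the child's CSS overrides the parent's.
    wp_enqueue_style( 'parent-style', get_template_directory_uri() . '/style.css' );
    wp_enqueue_style( 'child-style', get_stylesheet_uri(), array( 'parent-style' ) );
} );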

So in total, this friend’s site is: WordPress, the Astra theme, a child theme of Astra, and the Spectra plugin. That’s it.

So I got the site together, made the pages, did the basic SEO business, then ran the site through https://pagespeed.web.dev/ with my breath held. And what happened?

This:

[Screenshot: https://pagespeed.web.dev/ results for the friend’s site, showing scores of 100 across the board]

Notice how there’s no caching plugin mentioned in the list above. Nothing that would do anything like minifying and combining CSS either. No optimisation had been undertaken (at all). Yet out of the box, 100s across the board.

I must admit I was surprised. So surprised, in fact, that I fell off my chair. After picking myself up and dusting myself off, I wrote this blog post about saving yourself the hassle of WordPress optimisation.

Then an unnerving reality came over me…


I’m going to have to remake my site, aren’t I?

Do you like swearing? I love swearing. It adds an appropriate level of emphasis in applicable situations. Believe me, I swore A LOT when this particular penny dropped. Basil Fawlty paled in comparison to my epic rantings.

I also couldn’t help but wonder if your everyday “I think I’ll make my own website” type of person would be able to work things out enough to come to the same solution that I had. This does raise some questions about how usable WordPress can be, or whether plugins and themes should be checked and sanitised before being made available, but I guess that’s a blog post for another day.

I couldn’t help but feel for the people out there that had made a beautiful-looking site, only to run it through https://pagespeed.web.dev/ and burst into tears. It’s the equivalent of Michelangelo proudly looking up at the ceiling of the Sistine Chapel, only for a scaffolder to point out that Adam actually has blue, and not brown, eyes.

As well as the swearing and the sense of bitter disappointment, I was also a bit nervous about remaking my site. If all the stuff search engines read on my site changes, then that could be my historical SEO screwed, and nobody wants that to happen.

It did cross my mind to do a direct clone of my site, then remove the things that were causing a problem, but after a bit of research it didn’t sound like you could just disable Elementor, enable another page builder, and have it all be fine. Finding that out really sucked.

In the end, I decided to stop being all stampy about things, think about how I should do it, get a plan together, and give it a try on a subdomain (so I could leave my site live while I worked it all out).


The rebuild plan.

Did you know WordPress has its own importer/exporter? I only know about it because I get people saying things like “can you migrate this site for me?” and then giving me an XML file. The first time this happened, I looked at the file and thought “surely not!”.

This kind of thing led me to find out about the import/export options in the “Tools” menu of WordPress. You can use this to import/export all pages, posts, and a bunch of other stuff. I didn’t really know how much this would copy like for like, what would be missing, what wouldn’t be, whether all the CSS and JS that Elementor put there would still be there, and if it would copy all the images and so on. Well, there was only one way to find out.

Here’s what I did:

  • Set up a subdomain
  • Installed WordPress on the subdomain
  • Used the WordPress exporter on my live site to export all posts with “attachments” (?)
  • Used the WordPress exporter on my live site to export all pages with “attachments” (?)
  • Used the WordPress importer to import both of the above to the subdomain-based site
  • Copied wp-content/uploads at file level from my live site to the subdomain site
  • On the subdomain site installed an “import to media gallery from uploads directory” type plugin, then ran that
  • Uninstalled the “import to media gallery from uploads directory” type plugin
  • Installed the Astra theme and Spectra plugin
  • Installed a child theme on the subdomain-based site
  • Used the child theme to replicate the fonts and CSS of my live site (that part was probably the most challenging)
  • Customised Astra on the subdomain site so that it looked the same as my live site (I was DREADING this part, but thanks to Astra, it was a lot easier than I thought it would be… guess what my new favourite theme is?)
  • Installed Rank Math on the subdomain site (luckily this picked up all the SEO stuff I’d done on my live site in Rank Math, top marks to Rank Math, love those guys)
  • Installed and reconfigured Contact Form 7
  • Installed and reconfigured an SMTP plugin
  • Installed and configured Easy Updates Manager and Solid Security
  • Used the child theme’s functions.php to unload the JS that Contact Form 7 adds to every page, on the pages with no contact form (thanks to Ezoic for that one; there’s a sketch of this just below)
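For the curious, the gist of that functions.php trick looks something like the sketch below. I’m assuming the stock “contact-form-7” script/style handles that Contact Form 7 registers, and the “contact” page slug is just an example; check your own page output (or Query Monitor) for the handles your version actually uses.

<?php
// Child theme functions.php sketch: drop Contact Form 7's assets on pages
// that don't contain a contact form. The handles and the 'contact' slug
// are assumptions - verify them against your own site first.
add_action( 'wp_enqueue_scripts', function () {
    if ( ! is_page( 'contact' ) ) {            // everywhere except the contact page
        wp_dequeue_script( 'contact-form-7' ); // CF7's front-end JavaScript
        wp_dequeue_style( 'contact-form-7' );  // CF7's front-end CSS
    }
}, 20 ); // priority 20, so this runs after CF7 has enqueued its assets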

The good news was that the WordPress import/export tool had done what I’d hoped. There wasn’t any of the additional CSS and JS that I didn’t want to be present in page output, so I was pleasantly surprised at how this part had turned out. It might not be the same for everyone, but it looks like this is quite a good method to move JUST page and post content, without also moving a lot of bloat.

At that point, things were about as like for like as I could get them without scrutinising each page and post manually.

So then I started…


Scrutinising Each Page and Post Manually.

I didn’t really want to do this, but I had to, to check that everything appeared OK and that some of the SEO stuff was OK (alt tags on images, links working, that kind of thing). So I sighed, thought “it’s only 60ish posts/pages, you’ve got to start somewhere”, and started working through them…

This is when things started to get a bit weird. Not bad weird, like finding out that it’s not actually chocolate spread in the sandwich you’ve just taken a bite of; more kind of good weird. More along the lines of being hit in the head with a golf ball, then suddenly being able to see people’s auras, like “this is kind of good, but… am I tripping out?”.

At this point I should probably let you know a bit about Rank Math (just in case you don’t use it). Rank Math gives you a percentage “SEO score” that’s red, yellow, or green depending on how well you score. This is displayed down the right-hand side in the lists of posts and pages, and in the top right-hand corner when editing a page or post.

Got that? Right, good.

So here’s what would happen.

I’d go to Posts > All posts

There’d be a post with a Rank Math score of, let’s say, 75 (I would guess that Rank Math picked this up from the imported data). I’d edit that post using blocks, and for about 2 seconds the Rank Math score would display 75 (which you’d expect, given 75 was displayed in the posts list), then…

THEN!

Without me doing anything the 75 would increase by 2-3%, to 77 or 78.

I guess Rank Math had re-read the page or something, but anyway, why was it more than it was before?

So I logged in to the live site and checked the SAME POST. Rank Math score: 75. Edited the post: still 75 (so no change like the increase in score I’d seen on the subdomain-based site).

So the Rank Math SEO score was increasing on the subdomain site, just by me going to edit a page? Really?

You think that’s weird? Now get this next part.

I had to tidy up a few pages (most pages and posts, if I’m being honest). So I’m in the blocks editor, I click on the page content to edit it, and it pops up with “Convert to blocks”, which I click. When I do, the Rank Math score increases by another 2-3%, to 80 or 81.

So I’ve gained 4-6% of SEO score just by opening a page in the blocks editor and clicking “Convert to blocks” when it pops up.

It’s good, but a bit weird.

So I carried on getting all my posts and pages looking OK on the subdomain-based site, then I took the subdomain-based site live (making it this one, the one you’re reading), and I published (I think) 3 posts. Then…


Google Webmaster Tools.

Obviously, you’re going to keep an eye on things like Google Webmaster Tools after doing what’s effectively a site rebuild, so that’s what I did.

Again, it wasn’t a bad thing I was seeing; weird, but good. This is my impressions graph from Google Webmaster Tools for the entirety of my site: from the original site going live (in May 2023), all the way through what I’ve mentioned above, to the new site (made with Astra, blocks, and Spectra for page content) going live:

[Graph: impressions over time from Google Webmaster Tools, showing a marked increase after the rebuilt site went live]

What I’m mentioning here is by no means a complaint.

Don’t get me wrong, I’m not pointing the finger at any particular page builder causing a problem with SEO either.

From my point of view, I’m sat here having worked on a site for 6 months, getting a bumper increase in impressions that coincided with me remaking my site using Astra and Spectra. The obvious answer is:

Switching page builders = better impressions in Google Webmaster Tools

But there are other factors I’ve considered that might be at play, and I still can’t really swallow “it’s the page builder”:

Performance.

Google does take performance into account when ranking sites. After making the switch, the performance of my site did improve. Not enormously, just by about 5% in https://pagespeed.web.dev/, but nonetheless, there has been some improvement in performance.

Rank Math is actually seeing some kind of SEO improvement.

This is a little bit off whack, but it does make sense. My blog posts generally have keywords set that are to do with the content of the post (as you’d expect). Although pretty much all my posts are to do with web servers, the internet, and (mostly) WordPress, the keywords are very varied.

So if there’s a general, mild improvement over ALL pages and ALL posts for SEO, that could potentially result in the increase in impressions (maybe).

The other thing that I did when remaking the site was to use Rank Math’s local business schema on the contact page on my site. Then again, that’s one change to one page, so it seems a bit bananas to say that this caused what I was seeing.

Google have changed something in their algorithm.

And my site accidentally conforms to that change. This would be a massive coincidence, but it can’t be ruled out, and things do change.

WordPress is better.

Over the course of doing all that I’ve mentioned here, a new version of WordPress has been released, and I’ve updated to it. Again, probably a coincidence, and why that would affect SEO is beyond me.

Page weight.

The size of my site’s pages has dropped from about 360KB to around 230KB (a decrease of around 130KB), so my pages are smaller in bytes. That’s good. When I first started off doing the SEO thing, I couldn’t get Bing to crawl my site. They said it was too big! Like “we don’t crawl pages bigger than 125KB” (yep, that’s what the Bing guy said). Whilst 125KB seems REALLY small (https://www.google.com/ is just over 350KB, to put things in perspective), page size is something certain search engines take into account.

Number of posts.

As I mentioned, I posted 3 more blog posts (excluding this one) after making the switch. This took me up to 42 blog posts (I was on 39 before). I wondered if I’d crossed some kind of “take you a bit more seriously” threshold just by passing the 40 mark. One of my customers said he thought you needed about 100 blog posts for Google to take you seriously, and he’s been in the SEO game a lot longer than I have.

The page builder you use does make a difference.

Now, I don’t want to write this one, and I’ll hold my hands up and be up front about that. It would be harsh to point the finger at the page builder I used to use and go “yeah, it’s that”, but I’m afraid it does seem to be, at least, related. Sigh. Again, I can’t prove it; yes, this isn’t exactly a valid experiment, and I’ll agree that this is conjecture rather than fact. But switching page builders has been the MAJOR change that’s been undertaken, so it would make sense that this has had an effect.

Google know what you’re doing.

Paranoia time. We had this spate of customers (at the web hosting company where I work), some time around May/June (I think), who contacted us with queries along the lines of “Google thinks this site isn’t that great, rankings have dropped, and we haven’t changed anything, have you?” (we hadn’t). Now, MOST of these sites (if you ran them through https://pagespeed.web.dev/) were flagged as having an excessive DOM size. The sites in question were also a bit “let’s go nuts with what this thing can do”, with lots of swishy animations and funky page elements. I kind of got the impression that what MIGHT have been going on was Google doing a bit of a “please do things properly”. At around the same time, Gmail started enforcing SPF and DKIM checks for mail, which was again a bit of a “please do things properly”.

So, excuse the colloquialisms here: if Google initially crawled my site and its logic ended up with something like “this site’s content is good, but the guy who made it is a noob who doesn’t know what he’s doing, look at all the CSS/JS that doesn’t need to be there”, then it recrawled it (post switch-over) and its logic ended up with “this site’s content is good, and now the page output is cleaner”, that might explain what I’m seeing in impressions.

The reward centre in our brains.

If you thought the above was paranoid and sketchy, just wait until you read this next bit.

Social media is deliberately designed to trigger the “reward centre” in your brain. It’s this that keeps you pulling down to refresh, or posting content hoping it gets retweeted (or is that re-x’d now?), or liked by other people.

It’s this that makes social media addictive.

Bear with me on this….

Let’s say, just hypothetically, that Google Webmaster Tools also has some reward centre logic built into it, to (possibly) make you feel rewarded for churning out blog post after blog post and improving your site. If so, I could be getting hit with that right now.

Maybe, just maybe, Google are in some way saying something like “Well done Ralph, we can tell you’ve tried hard here, have a sweetie” (or in my case a better line on a graph).

Who knows?!

I don’t (for sure). If you do, or you’ve had a similar experience, or you have any kind of idea (or even a guess) as to what’s going on here, I’d LOVE to hear from you.

You can get in touch with me via the contact information page of my site. Feel free to drop me a line (I don’t bite) if you feel so inclined or have anything to add to this.

But wait… there’s more…


I couldn’t sleep last night.

It was windy and rainy (I live on a hill), I’ve got a cold, and the cat kept trying to go to sleep on my head…

Anyway, while I was lying there looking at the ceiling, I had a thought:

It’s a combination of things!

I think the effect that I’ve seen is a combination of a few of the things I’ve mentioned above, which are:

  • Page weight
  • The Google bits mentioned above
  • Rank Math is actually seeing an improvement
  • Performance
  • The page builder you use does actually make a difference

Up until last night, I’d been thinking of these factors as individual aspects, but I think they MIGHT all relate to each other. Here’s how.

Let’s start with page weight. As I mentioned, I was seeing a decrease of around 130KB. A page that used to be about 360KB was reduced in size to about 230KB after doing the remake and switch over. That’s about 36%. Over 1/3rd.

Now consider that the words that a human would read, and the pictures that a human would look at haven’t changed (as I didn’t want to affect SEO).

So the reduction in weight/size is likely to be specific to things like CSS, JS, and maybe a bit of the HTML, all of which are things browsers read. So my site would look leaner on the scripts front. This explains the improved https://pagespeed.web.dev/ performance metric.

Here’s the other thing that will have changed due to there being fewer scripts: the ratio of “human readable content” to “what the browser reads”.

Total page weight had decreased by about a third, and that’s a fair chunk, so it would shift this ratio quite significantly.

So let’s, just for example’s sake, say I used to have:

200KB of scripts (CSS and JS) and HTML.

160KB of human readable content.

This would mean human readable content made up about 44% of total page output.

As the human readable content hasn’t changed, and my site pages are now more like 230KB, this would mean that I now (roughly) have:

70KB of scripts (CSS and JS) and HTML.

160KB of human readable content.

This means human readable content now makes up about 69% of total page output.

That’s quite a difference.
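If you want to sanity-check that arithmetic, here it is in a few lines of PHP (the figures are my rough estimates from above, not precise measurements):

<?php
// Rough estimates from above, in KB - illustrative, not precisely measured.
$readable     = 160; // human readable content (unchanged by the rebuild)
$total_before = 360; // total page weight before the switch-over
$total_after  = 230; // total page weight after the switch-over

printf( "Readable share before: %.1f%%\n", 100 * $readable / $total_before ); // 44.4%
printf( "Readable share after:  %.1f%%\n", 100 * $readable / $total_after );  // 69.6%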

One of the things that Rank Math assesses to give the percentage score is keyword density. I don’t know if it works out the density based on the human readable content, or on the whole lot, including all the scripts. I do know that the keywords it’s referring to are in the human readable content, though (because changing the human readable content directly affects the keyword density metric displayed by Rank Math).

If we consider that it’s the latter (scripts and all, as well as human readable content) that’s used to work out keyword density, then this would explain the increased Rank Math percentage score.
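To make that concrete, here’s a toy illustration of how the choice of denominator changes the number. To be clear, this is NOT Rank Math’s actual algorithm (I don’t know their internals), and every figure below is invented purely to show the effect:

<?php
// Toy illustration only - not Rank Math's real algorithm, and all of these
// numbers are made up. The point: if density is worked out over the whole
// page output, removing scripts raises it without touching the content.
$keyword_hits        = 12;   // times the focus keyword appears (in readable text)
$readable_words      = 1500; // words a human actually reads (unchanged)
$script_words_before = 2500; // "words" of inline CSS/JS etc. before the switch
$script_words_after  = 700;  // the same, after the switch

// Basis 1: readable content only - removing scripts changes nothing.
printf( "Readable-only density: %.2f%%\n", 100 * $keyword_hits / $readable_words );

// Basis 2: whole page output - density rises when the scripts go away.
printf( "Whole-page, before: %.2f%%\n", 100 * $keyword_hits / ( $readable_words + $script_words_before ) );
printf( "Whole-page, after:  %.2f%%\n", 100 * $keyword_hits / ( $readable_words + $script_words_after ) );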

As Rank Math is an SEO plugin, and most SEO tends to be Google centric, if Rank Math are analysing page output in the same way that Google do, and this is based on the entire page content (scripts and all), then what COULD be going on here is:

  • Page weight has been decreased by the reduction of scripts
  • The reduction of page weight results in (to paraphrase) an increase in keyword density
  • Rank Math gives an improved SEO score
  • And as Rank Math is likely to be Google centric, the SEO is then also better from Google’s perspective
  • More impressions are then seen in Google Webmaster Tools accordingly

At this point, it wouldn’t be the craziest assumption to make that it was Elementor putting all those scripts there, given that they disappeared once it was no longer in use.

Throughout this, I’ve found the “It’s Elementor” train of thought a bit hard to accept. This is simply because:

  • It’s very widely used
  • There are a lot of advocates for Elementor publicly stating that it’s not problematic
  • It’s a good bit of software from a user perspective

People simply wouldn’t use Elementor if it universally caused problems with SEO and page performance, yet Elementor has a massive user base.

So here’s my guess: It’s not the use of Elementor alone, it’s how page builders in general are used to create page content.

Yep, it’s a guess, but it’s the best I’ve got!

Take this site, for example. It’s not very swishy: there aren’t lots of animations, and the styling is fairly minimal (I’m a more function-over-form kind of person), so I wasn’t really using a great deal of what Elementor can do, and WordPress’s blocks editor seems to be fine for the type of content I publish. I’m toward one end of a user spectrum in this capacity.

As you might recall, I mentioned that we had a spate of users at work who had seen rankings drop, and that most of these sites were being flagged as having an excessive DOM size (that was the main common factor).

A REALLY good way to get an excessive DOM size is to look at a page builder and think “let’s see what this can do, then use EVERYTHING”. You end up with a LOT of HTML elements if you work like this, and that contributes to an excessive DOM size. People who operate in this manner are at the other end of the user spectrum from me.

This isn’t specific to Elementor; it applies to any page builder that provides a lot of swishy, funky styling facilities.

I would suggest that the reason there’s been a fair amount of “noise” about Elementor being problematic is simply that it’s very widely used, and it also provides a lot of swishy, funky styling options… and some people (the ones complaining) have gone nuts with the styling.

It’s most likely not Elementor that’s the problem here; it’s more likely the way people are using it (going nuts with the styling).

If we factor in the “human readable” information on a page, and combine that with an “I’ve used ALL THE FEATURES” approach to the page builder, you’re probably going to end up with:

  • A big DOM
  • Lots of scripts, and therefore more page weight
  • A low “human readable” information to “total page output” ratio, and consequently…
  • A low keyword density

I’ll admit, a lot of this train of thought hinges on the keyword ratio/density being worked out on total page output, scripts and all, rather than just the “human readable” aspect, but I can’t think of much else that would cause the Rank Math SEO score to increase… after all, I did see an increase WITHOUT CHANGING THE HUMAN READABLE CONTENT.

Sure, I could be wrong about all of this. Sure, this isn’t a valid experiment. Sure, there’s a lot of conjecture in here. Sure, there could be other factors at play. But what if I’m right?

If I’m right about this, what this means is that if you’re low on actual readable page content (I’m talking human readable text here) and you try and pad this out with swishy fancy things that your page builder can do, then you’re most likely going to end up with something that might look nice to a person, but that doesn’t look very nice from the perspective of Google, SEO and search engine rankings.

The problem here doesn’t look to be the page builder, and it’s probably not the content; it’s how the page builder is used to display the content.

I guess if you’re going to use a lot of page builder type elements, with a lot of animation and styling, then you’re probably going to need a reasonable amount of content to effectively balance out what the page builder adds to your site.

After all, it’s the words that humans read, and it’s those that keep them looking at the page (most likely being advertised to while doing so). All the styling in the world doesn’t keep a human reading a page; it might make it look nice at an initial glance, but there’s no longevity of page visit with low human readable content and lots of styling.

More words, less styling? Makes sense I guess. It keeps people reading longer.

Anyone want to chip in with this? Anyone?…. Oh, just me then.

At least I wrote it all down before I forgot.
