With the SEO community still rushing to recoup rankings and traffic, or celebrate their good fortune at what the “Florida 2” update has done for their site, it’s important to dive into exactly what this Google update was all about.
Whilst it’s always difficult to home in on exactly what Google focuses on with each change to their core algorithm, there is a general consensus on several different factors for this particular update (based on hundreds, if not thousands, of first-hand accounts). But, before reading through the issues and potential fixes below, a standard warning about implementing changes post-update is in order.
Although several sites have given stories of their recovery, sometimes a rise in traffic of 50%+, you will always find differing levels of success. What works for one website may not work for another. This is because each site has its own basket of variables, and so they may only need to fix one thing, whereas others will have a plethora of issues that need fixing.
With that in mind, below are some of the core factors that caused the highest rises or falls for websites:
In the early days of SEO, content didn’t really matter. The name of the game was picking up as many links as possible, even if they were suspect at best. But Google quickly understood that links weren’t the only factor in ranking a website’s importance and authority (especially with SEOs manipulating their results with lots of links).
The user also needed to come across informative and helpful content on the site. It was no use sending hundreds of thousands of links to a website if it didn’t have good content (especially after the Google Penguin update). And this hasn’t changed with the so-called “Florida 2” update.
Content is still very much king. Although this update wasn’t focused solely on content, it was a large factor. As is always the case, Google stated that there was ‘no fix’ where drops in traffic were concerned, and that webmasters needed to focus on ‘creating better content’. This is a standard line that gets trotted out with each update (infuriatingly), but it should be taken seriously.
This doesn’t mean rushing off and writing content for content’s sake. You shouldn’t just fill your website with highly optimised, blatantly SEO-focused content. You should be writing genuinely interesting, informative and useful content. This means writing more articles or blogs about things users care about, answering frequently asked questions on your site, and giving in-depth knowledge about the products you sell.
It also couldn’t hurt to attach a real author to the things you write online. Google seems to reward content that has an author attached to it, rather than just a wall of text on a blog with no sign of a human in sight.
Other updates, for example the aforementioned “Google Penguin”, have focused solely on spammy links. Sites that built thousands of links on dodgy directories and link farm blogs were hit, and they were hit hard. Some saw their visibility drop by 70%+, and some, as in the famous SEO case of Interflora, were wiped out completely. This means getting de-indexed by Google, which is incredibly brutal in a world where Google basically owns the search engine market.
In comparison, this update has seemingly refrained from attacking, or rewarding, sites based on their backlink profiles. Some sites whose backlink profiles grew dramatically before the update did well and saw increases in traffic, while others didn’t. As a factor on its own, it would be difficult to prove that it had any real significance.
But as is always the case, it wouldn’t be a post-2010 update unless some effects were due to backlink profiles. The one small nugget that came from the SEO community analysing countless backlink profiles was that a smaller number of high-quality backlinks seemed to outperform a larger number of lower-quality ones.
Are you seeing a trend here? When it comes to Google, less is more, which is why the old SEO tactic of ploughing a site full of keywords and links really is dying out, meaning the SEO community has to be more proactive than ever.
If you’ve been in SEO for a while, you’ll be well aware of Google’s long history with exact match keywords. Gone are the days of building hundreds of exact match anchor text links and seeing your page soar to position 1. And now, there could be signals that exact match anchor text isn’t even safe to use on your own website.
Let’s be clear, this isn’t a blanket rule, and so don’t go away and remove every exact match keyword from your website. But, as was said previously, it looks as though less is more. It’s always been known that a high keyword density is actually a negative rather than a positive, but it looks as though the percentage is shrinking with each passing update. A move toward a more conversational, helpful style with long tail keywords is becoming the preferred option for Google (or at least that’s what early signs are indicating).
This could mean that user-generated signals are being prioritised over exact match keywords within on-page copy. It would make sense, as Google is doing more and more to give the user the best possible experience. If a user searches for a specific search term and lands on your website, stays, and then purchases and fills out a form, that would be a better positive marker than lots and lots of users hitting your page and bouncing straight off. And so having a small bit of text on a category page packed with exact match text wouldn’t exactly fulfil this need.
As said above, content, or at least useful content, is still very much king.
Ever since Mobile-First Indexing came into effect last year, having a streamlined, user-friendly mobile site has become increasingly important. Your desktop rankings will actually be affected by the way Google interacts with your mobile site. This was a big deal for those businesses whose mobile site was a direct copy of their desktop site, and it caused a whole range of SEO issues.
But Google did provide SEOs with a gift in the form of the new web.dev tool, released at their Chrome Dev Summit. If you haven’t tried it yet, it’s effectively an auditing tool for your website, tested through an emulated phone. It gives you an insight into the performance, accessibility, best practices and even the SEO of a webpage. It’s an invaluable tool, giving you concrete reasons why Google may not like your site.
It’s a must-do: helpful for you to know how Google is interacting with your site, and helpful for your developer. Instead of rummaging around for potential issues, Google hands them to you on a silver platter.
Part of the web.dev audit covers page load speed. In a world where everything is right at our fingertips, a user won’t wait around. If you have a slow page load speed, you can basically kiss that user goodbye. And Google will also take notice of this. If a user can’t view basic page information almost instantly, your rankings will suffer for it.
Just like in sports, milliseconds count. If you can reduce page load speed by even half a second, or less, it may just put you ahead of a competitor with a similar page. Your website needs to be light and fast, not bogged down by unnecessary resources. Strip back anything that isn’t absolutely essential, leaving a page that loads quickly and is even easier to navigate.
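If you want a rough, repeatable number for fetch time before and after stripping resources, a small script can help. The sketch below is a generic illustration, not part of the update or any Google tooling: the local test server simply stands in for your own URL, and the `measure` helper is hypothetical. A real audit should use web.dev/Lighthouse, which also measures rendering, not just raw fetch time.

```python
import http.server
import threading
import time
import urllib.request


def measure(url, runs=3):
    """Return the average time (in seconds) to fetch a URL over several runs."""
    timings = []
    for _ in range(runs):
        start = time.perf_counter()
        with urllib.request.urlopen(url) as resp:
            resp.read()  # read the full body so transfer time is included
        timings.append(time.perf_counter() - start)
    return sum(timings) / len(timings)


class QuietHandler(http.server.SimpleHTTPRequestHandler):
    def log_message(self, *args):  # silence per-request logging
        pass


# Stand-in server so the example is self-contained; point `measure`
# at your own page in practice.
server = http.server.ThreadingHTTPServer(("127.0.0.1", 0), QuietHandler)
threading.Thread(target=server.serve_forever, daemon=True).start()
port = server.server_address[1]

avg = measure(f"http://127.0.0.1:{port}/")
print(f"average fetch time: {avg * 1000:.1f} ms")
server.shutdown()
```

Averaging over several runs smooths out one-off network noise; for a fair before/after comparison, keep the run count and test conditions the same.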
As with any update, it’s been a confusing and frustrating event. No one person within the SEO community can really point to the exact recipe that makes up the update. It’s built on top of iteration. Google continues, every day, to grow and adapt their algorithm and so no one, other than the Algorithm elves, knows what’s happening.
There will always be that unknown factor: Google. It seems that the only way to escape these updates, or improve off the back of them, is to have a well-rounded and proactive approach to online marketing. You can’t focus too much on just one aspect of SEO. Instead, give equal weight to the core pillars: technical SEO, content and links.