Transforming your Online Business through SEO

Many of you may be familiar with the inverted pyramid writing style, where the most valuable and newsworthy content sits at the top and the least important content at the bottom. Every site I develop follows this structure; however, none of the points below should be slept on.

They’re all major issues that commonly appear among even the best sites.

If you want to get the most out of this article, check out every point here. As you know, there are no shortcuts when it comes to SEO.

Most Common SEO Issues for 2019

I do not accept or follow many so-called SEO gurus, because most of them are full of shit. I test, test again, and retest to ensure my SEO procedures will get the best results. My secrets are proven through testing and big data before I implement them.

There are countless ways to optimize conversion rates, improve sales copy, increase engagement… but ultimately the only way to truly scale is to drive more traffic.

Drive more traffic in your own business or online venture and your profits will increase.

The majority of SEO pros get these wrong. Here are the most common SEO issues I come across when re-optimizing a client’s site:
  • Index Management
  • Localization
  • Keyword Cannibalization
  • Over-Optimized Anchor Text
  • Poor Linking Strategy
  • Low Quality On-Page Content
  • Low Quality Affiliate Content
  • UX (User Performance Metrics)
  • Titles, Tags & Meta Descriptions
  • Internal Redirects
  • Low Quality Pillow Links

1. Index Management Problems

The first and most common issue that I’m seeing is accidental devaluation of the website because of indexing issues.

It stems from a common misunderstanding about how Google actually works.

(More on this in a bit…)

Most people think that if they build links and noindex junk pages they’re fine. However, it’s not that simple – and I’m about to show you a real example.

Below is a screenshot from Screaming Frog, taken from a crawl of an eCommerce website with a lot of onsite issues that needed fixing:
[Screenshot: Screaming Frog crawl of the client’s site]

It’s quite hard to see, but notice that I have highlighted the number of HTML pages filtered: a whopping 32,064 pages and, yes, it took us a helluva long time to crawl. The time and effort are worth it!

None of the 32,064 pages found in this crawl included a noindex tag, which means (in theory) Google should be able to crawl and index these pages. So, let’s check this against our numbers in the Google Search Console:

[Screenshot: Google Search Console index status]

When we check Search Console, we see 14,823 pages indexed. While this is a large volume, it’s still less than half of the pages Screaming Frog found.

This is the first sign that something is seriously wrong, but the next screenshot shows just how badly our client had been stung by Panda’s low-quality algorithm. We use the “site:domain.com” operator to pull up the number of indexed pages:

[Screenshot: site:domain.com search results]

Despite the website having 32,064 crawlable pages without noindex tags, and despite Search Console reporting 14,823 indexed, only 664 have made it into the actual index. This site search shows that Google has heavily devalued most of the website.

It is a crawling nightmare.

So, the question is, how can you fix this?

Thankfully the answer for most people is quite simple.

Start by performing a site:domain.com search and auditing Google’s index of your site. If you go to the final page and you’re greeted with the below message, you have work to do:

[Screenshot: Google’s end-of-results message]

Take a hard look at which pages shouldn’t be indexed and start proactively removing them so they stop eating your crawl budget.

The problem with Google is that even after you add a noindex tag, pages remain indexed until Google recrawls them. Some people block these pages in robots.txt to save crawl budget – which is a good idea, but only after the pages have dropped out of the index, because Googlebot can’t see a noindex tag on a page it’s forbidden to crawl.

For the rest of us, we’re going to need to use the URL Removal Tool.
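
If you want to sanity-check this yourself before reaching for robots.txt, here’s a minimal sketch in Python (using the common requests and BeautifulSoup libraries) that checks each URL in a hypothetical urls.txt for a noindex tag and warns when robots.txt would hide that tag from Googlebot – the ordering trap described above:

```python
# A minimal sketch, not our production tooling. urls.txt is a hypothetical
# file with one URL per line. Requires: pip install requests beautifulsoup4
import urllib.robotparser
from urllib.parse import urljoin

import requests
from bs4 import BeautifulSoup

def audit_url(url, robots):
    if not robots.can_fetch("Googlebot", url):
        # A page blocked in robots.txt can't be recrawled, so Google
        # never sees its noindex tag and the page can linger in the index.
        return "BLOCKED: Googlebot can't see this page's noindex tag"
    resp = requests.get(url, timeout=10)
    meta = BeautifulSoup(resp.text, "html.parser").find(
        "meta", attrs={"name": "robots"})
    if meta and "noindex" in meta.get("content", "").lower():
        return "NOINDEX: safe to block in robots.txt once it drops out"
    return "INDEXABLE: no noindex tag found"

with open("urls.txt") as f:
    urls = [line.strip() for line in f if line.strip()]

robots = urllib.robotparser.RobotFileParser()
robots.set_url(urljoin(urls[0], "/robots.txt"))
robots.read()

for url in urls:
    print(url, "->", audit_url(url, robots))
```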

2. Localization Issues

The second most common issue we see is clients with multiple languages. While it’s great to have international coverage and provide foreign users with localized text, it’s a nightmare for Panda penalties if not set up correctly.

Many people are familiar with the URL structure you should use for localized text, but plenty forget to set up hreflang on their website.

If you are looking to set up hreflang codes, I suggest you use this website to get the right country and language code every time.
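
To illustrate the idea, here’s a minimal sketch of generating hreflang tags that always carry both a language and a region code. The locales mapping and example.com URLs are purely illustrative:

```python
# A minimal sketch; the locales mapping is illustrative. Each localized
# page should output the full set of alternates, including itself.
locales = {
    "en-GB": "https://example.com/en-gb/",
    "en-US": "https://example.com/en-us/",
    "de-DE": "https://example.com/de-de/",
}

def hreflang_tags(default="https://example.com/"):
    tags = [f'<link rel="alternate" hreflang="{code}" href="{url}" />'
            for code, url in locales.items()]
    # x-default catches visitors whose locale you don't explicitly serve.
    tags.append(
        f'<link rel="alternate" hreflang="x-default" href="{default}" />')
    return "\n".join(tags)

print(hreflang_tags())
```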

Below is an example of an eCommerce client. Whereas the previous client had issues with index management, this time the problem is caused by hreflang – and one more thing that often goes unnoticed…

[Screenshot: hreflang tags in the client’s source code]

While the client had successfully included hreflang in their source code, they had not included both the language and the location code. The one time they tried, with en-GB, the page no longer exists and redirects to their sitemap.

On top of that, this covers just 50% of the languages the website operates in, creating an enormous amount of duplicate content to be indexed.

However, there’s still one more thing that was missed. Every page has its Open Graph locale set to en_US:

[Screenshot: og:locale set to en_US across pages]

This includes the pages that aren’t in English.

While this setting isn’t as clear-cut as hreflang, it does give Google information about locale, and therefore creates confusion.

If your website has a similar issue, we advise making the og:locale dynamic so it matches the current language.
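
As a sketch of what “dynamic” means here, the helper below derives the og:locale value from a page’s hreflang code. Note that og:locale uses an underscore (en_GB) where hreflang uses a hyphen (en-GB); the function name is my own:

```python
# A minimal sketch. og:locale expects language_TERRITORY (en_GB), while
# hreflang uses language-region (en-GB); hard-coding en_US mislabels
# every non-English page.
def og_locale_tag(hreflang_code):
    lang, _, region = hreflang_code.partition("-")
    locale = f"{lang}_{region.upper()}" if region else lang
    return f'<meta property="og:locale" content="{locale}" />'

print(og_locale_tag("en-GB"))  # <meta property="og:locale" content="en_GB" />
print(og_locale_tag("de-DE"))  # <meta property="og:locale" content="de_DE" />
```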

Client Testimonial

“The guys at SSS make SEO look easy! We were a completely new website planning to operate in arguably the most competitive online marketplace, which made the task ahead extremely difficult: we were going up against many well-known global businesses, and as a result many SEO agencies were reluctant to work with us. SSS (Douglas) wasted no time in implementing their campaign, and within 4 months our website was ranking on page 1 for some of our most profitable keywords. The guys at SSS constantly keep me updated and send me monthly reports on my campaign performance and keyword tracking. I would recommend their services in a heartbeat!” – Jon H

3. Keyword Cannibalization

This is a surprisingly common issue for most websites we encounter. Despite the large number of resources online to help with cannibalization, you would be surprised how many people still suffer from it.

Never heard of it?

Quite simply, it’s when you have multiple pages on your site competing for the same keywords.

And guess what?  Google doesn’t like it.

The first step is to learn to diagnose the culprit pages, because if you cannot find cannibalization – how can you fix what you can’t see?

At SEO Secret Sauce we have a few ways to find cannibalization, but here’s the easiest and most effective.

Use Keyword Tracking Tools

One of the benefits a client gets from working with SSS is that we track keywords up to twice daily with Diamond Analytics, one of our partners.

The tool includes an overview of the site’s overall keyword performance, such as below:

[Screenshot: keyword ranking overview, past 7 days]

Aside from showing us an overview of how this client has performed over the past 7 days, we can use this to track each keyword’s performance independently too:

[Screenshot: daily tracking for a single keyword]

In this screenshot, you can see a jump from the 3rd page to the 1st page for the target term after implementing some of our onsite advice. More importantly, you can also see that the ranking URL started flipping between the category page and the homepage.

This is an obvious sign of cannibalization, and once we noticed it we jumped into action to fix the problem.
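
You don’t need our tooling to run this check yourself. Here’s a minimal sketch that scans rank-tracking observations (the sample rows are made up) and flags any keyword whose ranking URL flips between pages:

```python
# A minimal sketch with made-up rank-tracking rows: (keyword, date, URL).
from collections import defaultdict

observations = [
    ("blue widgets", "2019-03-01", "https://example.com/widgets/"),
    ("blue widgets", "2019-03-02", "https://example.com/"),
    ("blue widgets", "2019-03-03", "https://example.com/widgets/"),
    ("red gadgets",  "2019-03-01", "https://example.com/red-gadgets/"),
]

urls_by_keyword = defaultdict(set)
for keyword, _, url in observations:
    urls_by_keyword[keyword].add(url)

for keyword, urls in sorted(urls_by_keyword.items()):
    if len(urls) > 1:
        # Two or more URLs ranking for one keyword is the classic
        # cannibalization signature described above.
        print(f"Possible cannibalization for '{keyword}': {sorted(urls)}")
```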

4. Over-Optimized Anchor Text

There was a significant update in September 2016 as Penguin 4.0 rolled out.

Penguin 4.0 was an update that changed how Google perceives and interacts with links.

As part of our auditing process for each new client, we analyze the existing anchor text and break it down into the categories below (a simple classifier sketch follows the list):

  • Branded – an anchor that includes your brand name or a slight variation, for example: ‘seosecretsauce’, ‘visit seosecretsauce’, or ‘SSS’.
  • Generic – an anchor that uses a generic term and no branding, for example: ‘here’, ‘read more’, or ‘visit site’.
  • Image – a link with no anchor text, generally shown as a blank in Ahrefs’ export feature. Another clue is a file extension in the alt attribute: ‘jpg’ is probably an image.
  • Miscellaneous – an anchor that does not qualify as generic but is otherwise unrelated to the website. Forum and comment spam often includes anchors such as ‘Steve’, ‘Stuart’, or ‘Stan’.
  • Low Quality – an anchor of more than 100 characters is generally irrelevant unless it’s a long URL. Foreign-language text and symbols also count as low quality.
  • Targeted – an anchor that includes the exact or partial term you are trying to rank for; effective for gaining rankings, but higher risk of tripping a Penguin filter against your site.
  • Topical – an anchor that is on topic but does not include your targeted term. For example, an affiliate site reviewing ‘best weight loss system’ might attract topical anchors such as ‘healthy workout’, ‘burn lots of calories’, or ‘high impact sport’.
  • URL – arguably the most obvious one: naked URLs such as ‘example.com’ and ‘https://example.com’ count as URL anchors.
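
Here’s a rough sketch of that classification in Python. The brand variants, target terms, and topical word list are illustrative assumptions, not our production rules:

```python
# A rough sketch of the categories above. Brand variants, target terms,
# and topical words are illustrative assumptions.
import re

BRAND_VARIANTS = ("seosecretsauce", "sss")
TARGET_TERMS = ("best weight loss system",)
GENERIC = {"here", "click here", "read more", "visit site"}
TOPICAL_WORDS = ("workout", "calories", "weight", "sport")

def classify_anchor(anchor):
    text = anchor.strip().lower()
    if not text or text.endswith((".jpg", ".png", ".gif")):
        return "Image"            # blank anchors in an Ahrefs export
    if re.match(r"^(https?://)?[\w.-]+\.[a-z]{2,}(/\S*)?$", text):
        return "URL"
    if any(b in text.replace(" ", "") for b in BRAND_VARIANTS):
        return "Branded"
    if text in GENERIC:
        return "Generic"
    if any(term in text for term in TARGET_TERMS):
        return "Targeted"
    if len(text) > 100 or not text.isascii():
        return "Low Quality"
    if any(word in text for word in TOPICAL_WORDS):
        return "Topical"          # crude stand-in for real topic matching
    return "Miscellaneous"

for a in ["visit seosecretsauce", "here", "best weight loss system",
          "healthy workout", "Steve", "https://example.com"]:
    print(f"{a!r} -> {classify_anchor(a)}")
```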

Here’s an example of a client that recently joined us with an anchor text issue. The labels match the descriptions above:

[Screenshot: anchor text distribution by category]

In this example, the website had chosen to use lots of targeted anchors, but had also picked up plenty of low quality anchors along the way. The solution was to increase the amount of topical, branded, and generic anchors until the profile matched the niche average.

By increasing the volume of those anchors, the client recovered the lost organic traffic and is now set up to survive future updates.

It’s important to note that most people use low quality pillow links and press releases to redistribute their anchors.

There are indeed some issues with this that are covered in the next two points.

5. Poor Linking Strategy

Up until this point, 3 of the top 4 issues have been onsite. While Issue #5 is another offsite, link-building issue, it’s important to recognize the connection.

When a website has fixed its technical issues, pumped out valuable content, and improved user performance metrics – link building becomes a lot easier.

Rather than needing hundreds of links to rank a site, you can achieve a lot more with less. Since link building and onsite work both cost money, you may be wondering: why not just spend the money on links?

The answer is simple…

Google has introduced many link-building filters to thwart your efforts; the more links you build, the more likely you are to get caught. By delivering better content you will not only improve your conversion rates, you will make it easier to rank higher, permanently.

Check out one of our older clients, who has been relaxing comfortably on page 1 for two years:

[Screenshot: two years of stable page 1 rankings]

So, the question is, what makes a link strategy good?

The first thing is to avoid over-optimizing your anchor text, because this will eventually cause a penalty. Use topical terms and branded anchors to hit your pages instead.

The second thing is to target pages other than your main core pages. If you have created a valuable blog post that internally links to one of your core pages, throw some links at that post too. Don’t let your money page become the black sheep of your link profile.

Not only will this help prevent overcooking your page, it’s going to help you rank for long tail keywords that you didn’t claim before.

The reason your competition doesn’t do this is that they fail to focus on any pages other than their money pages.

Big mistake.

If you fix user flow on your entire website, then every page becomes a money page.

Write that one down.  Post it on your wall.

6. Low Quality Affiliate Content

This should go without saying, but if your content is not good then you don’t deserve to rank. Period. Google wants to serve ONLY valuable content. Google has to trust you and your site. Is your site transparent?

However, what most affiliate sites are guilty of is failing to mop up all the juicy long tail keywords that are easy to rank for and provide noticeable traffic. It’s not that they don’t want to rank for those keywords; they just don’t know how. Lots of people don’t know what they don’t know.

I want to share a couple of images with you that show just how powerful content can be – and what can happen if you don’t value yours:

[Screenshot: overall search visibility]

This client recently joined us suspecting a penalty. At first glance there’s a dip in visibility, but nothing that seems too unusual – until you zoom into the top 10 positions:

[Screenshot: rankings within the top 10 positions]

What initially looks like a slight dip is really a huge drop in rankings, and this client has suffered from decreased traffic for about 8 months.

The culprit? Content.

This is no fault of their own: somebody scraped their entire website and spread duplicate content across the web. We are currently re-writing the entire site’s content, filing DMCA requests, and fixing holes in their linking strategy. We anticipate a return in rankings within the next 3 months.

While this highlights the power of content in Google’s algorithm, it’s slightly different from what I mean by low quality affiliate content. The main culprit we see is when every page has an affiliate link and there’s no actual user value.

It’s possible to rank this way, but there are some drawbacks.

Let’s look at User Performance Metrics and how this can help guide our content strategy.

Client Testimonial

“The SEO Secret Sauce team has helped to grow our business where four other digital marketers have failed. The team is not only flexible, responsive and reliable, they also make decisions that are data-driven with their proprietary tools. They are not testing and guessing with their work as most others do. We will be rounding out the year with them and looking forward to a long and profitable engagement.” – Vik C

7. User Performance Metrics

We have noticed that many affiliate sites and even some eCommerce companies are not focused on user performance metrics. This is bad for several reasons:

First and foremost, you’re going to be limiting yourself severely.

There is a finite number of people searching each month for what you offer. By avoiding the issue of content, you force yourself to spend money on links to brute-force rankings for terms that are neither relevant nor fruitful.

Instead, we suggest focusing on converting your existing traffic while simultaneously growing your potential traffic. Take this example:

If 1 in every 1,000 visitors purchases from you, then to triple your customers you have two options (the quick check below confirms the math):

  • Increase traffic from 1,000 to 3,000 visitors
  • Increase your conversion rate from 0.1% to 0.3%
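
A quick check of the arithmetic, using the example’s own numbers (1 customer per 1,000 visitors is a 0.1% conversion rate):

```python
# The example's own numbers: 1 customer per 1,000 visitors = 0.1%.
visitors, conversion_rate = 1_000, 0.001

baseline       = visitors * conversion_rate            # 1 customer
via_traffic    = (visitors * 3) * conversion_rate      # 3,000 at 0.1% = 3
via_conversion = visitors * (conversion_rate * 3)      # 1,000 at 0.3% = 3

print(baseline, via_traffic, via_conversion)           # 1.0 3.0 3.0
```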

We applied this strategy to one of the eCommerce websites mentioned earlier in the article. Their website had been devalued and it’s going to take time to grow traffic, so we decided to pursue conversion while working on fixing the huge issues.

Here are the results:

[Screenshot: analytics after conversion optimization]

In the past 2 weeks, we have managed to increase sessions by a modest 4.26% – lower than average for our clients, but something we’re happy to take considering the condition of the site.

However, the main thing to notice is the 79% increase in conversion rate, the 86% increase in transactions, and the 49% increase in revenue. These changes mean that as we fix the devaluations against the site, this client is primed for a significant revenue gain.

8. Titles & Meta Descriptions

This is like keyword cannibalization in that it’s surprising how many websites still get it wrong. It’s something I would expect most people to be getting right by now, since there are literally millions of pages on the topic:

[Screenshot: duplicate titles and meta descriptions]

I find it so confounding that the majority of SEOs still aren’t getting this. I think it’s primarily down to laziness and always searching for the next shiny thing to make their work easier. SEO is not fucking easy… if it were, many more sites would be ranking higher – or at the very least they’d finally be out of the Google “Sandbox”.
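
If you’d rather let a script do the grunt work, here’s a minimal sketch (again using the requests and BeautifulSoup libraries) that flags missing or duplicate titles and meta descriptions across a list of URLs of your choosing:

```python
# A minimal sketch; the urls list is illustrative.
from collections import defaultdict

import requests
from bs4 import BeautifulSoup

urls = ["https://example.com/", "https://example.com/about/"]

pages_by_title = defaultdict(list)
for url in urls:
    soup = BeautifulSoup(requests.get(url, timeout=10).text, "html.parser")
    title = soup.title.get_text(strip=True) if soup.title else ""
    desc = soup.find("meta", attrs={"name": "description"})
    if not title:
        print(f"Missing <title>: {url}")
    if not (desc and desc.get("content", "").strip()):
        print(f"Missing meta description: {url}")
    pages_by_title[title].append(url)

for title, pages in pages_by_title.items():
    if title and len(pages) > 1:
        print(f"Duplicate title {title!r} on {len(pages)} pages: {pages}")
```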

9. Internal Redirects

This is one of the most common issues websites face. A large volume of 3XX redirects seems fine to most people – as long as they’re 301s. However, that isn’t strictly true, and here’s why:

A 301 redirect is designed for when a user requests a page that is no longer available because it has been permanently moved – something that happens a lot across the internet. After a moment of latency, the server returns a different URL and the page loads as usual.

The issue is that word latency, and it’s something most webmasters ignore. The physical distance between a user and your server means that even a tiny amount of header information takes time to send and receive.

If you want to improve your user experience, make your website as fast as possible and remove internal 301s wherever they are not absolutely needed. This is better for your users and lets authority flow through the website unhindered.
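
Finding these is straightforward. Here’s a minimal sketch that requests each internal URL from an illustrative list and prints every redirect hop along the way:

```python
# A minimal sketch; the urls list is illustrative.
import requests

urls = [
    "https://example.com/old-page/",
    "https://example.com/current-page/",
]

for url in urls:
    resp = requests.get(url, allow_redirects=True, timeout=10)
    for hop in resp.history:  # one Response per redirect followed
        print(f"{hop.status_code}: {hop.url} -> "
              f"{hop.headers.get('Location', '?')}")
    if not resp.history:
        print(f"{resp.status_code}, no redirect: {url}")
```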

So, what’s the difference between a 301 and a 302 redirect?

While a 301 redirect signals a permanent move from one location to another, a 302 signals a temporary move. From Googlebot’s perspective, this means:

  • 301 redirects – Google should index the new URL rather than the previous one
  • 302 redirects – Google should keep the previous URL indexed and ignore the new one

Google has claimed to handle both the same way, but it doesn’t make sense that they would: the two status codes have different purposes and should be treated differently.

Make sure you’re using the right redirect on your website.

10. Low Quality Pillow Links

What are pillow links?

Pillow links are used to diversify your anchor text ratios – in the audio industry, this would be comparable to signal-to-noise ratio. When you buy a microphone you want high signal and low noise, but an SEO link profile is the complete opposite.

If your link profile is all signal and no noise, it’s easy for Google’s machine learning to analyze your links and pick up unnatural trends. For the audiophiles out there, you could consider pillow links to be dithering: you add noise to the signal to improve the result.

Here’s a shocking figure from a recent client who had built hundreds of pillow links attempting to dilute a heavily targeted anchor text distribution:

[Screenshot: index status of the client’s pillow links]

Despite the client having built 315 pillow links in the past 12 months, only 25 of those links were indexed. All that time and money provided almost no value.

The solution is to use indexing tools that encourage Googlebot to index your pillow links… or just build quality links and you won’t have this problem.

For indexing, we use our own proprietary tool but there are some online services that can do this for a fee.
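
Before paying for an indexing service, it’s worth a quick triage of the pages hosting your links: a link on a dead, noindexed, or unreachable page can never count. A minimal sketch, assuming a hypothetical backlinks.txt of referring pages:

```python
# A minimal sketch. backlinks.txt is a hypothetical file with one
# referring-page URL per line. Requires requests and beautifulsoup4.
import requests
from bs4 import BeautifulSoup

with open("backlinks.txt") as f:
    pages = [line.strip() for line in f if line.strip()]

for page in pages:
    try:
        resp = requests.get(page, timeout=10)
    except requests.RequestException as exc:
        print(f"UNREACHABLE {page}: {exc}")
        continue
    if resp.status_code != 200:
        print(f"{resp.status_code} {page}")  # dead or redirected-away page
        continue
    meta = BeautifulSoup(resp.text, "html.parser").find(
        "meta", attrs={"name": "robots"})
    if meta and "noindex" in meta.get("content", "").lower():
        print(f"NOINDEX {page}")  # this link can never be indexed
```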

What’s the #1 goal you have when it comes to your website?

  • Do you want to get your site out of the Sandbox and start ranking?
  • Do you want to optimize your online presence for your business?
  • Do you want to increase your sales by organic means?
  • Do you want to go full steam ahead to outrank your competition and steal their business?
  • Do you want to dominate search by getting placed in the sweet spot – the Local 3-Pack, where you rule and profit the most?

Our SEO expertise is aligned with your Google ranking domination, business success, and increased profits. We deliver it on a daily basis.

We accept only one client per niche industry/business because there are only three spots in the Local 3-Pack. That way we never compete against ourselves, and we never place two competing clients in the same 3-Pack.