
What Is An SEO Audit And What Tools To Use To Improve Your Google Ranking

If you’re typing the key phrase “SEO audit” into Google, it seems that something has gone wrong with your project. I bet your organic traffic has started to decline, and you are looking for an SEO expert or a company that might assist you in identifying the possible ranking issues of your website. However, before asking someone for help, I’d suggest you follow a series of steps to determine the flaws of your site by yourself.

In fact, there are only four possible types of issues:

  • Technical issues
  • Content issues
  • Backlinks
  • The impact of the events you can’t/couldn’t control

Depending on the reason for the penalty, there are different ways to improve your situation. Even if your final decision is to enlist the assistance of professionals, it would be useful, in any case, to evaluate the scale of the problem and estimate the amount of time needed to solve it.

Therefore, before performing an SEO audit, be aware that this process:

  • Takes a lot of time to recover from (depending on the scale of your project)
  • Depends not only on you and your SEO expert but on Google as well
  • Requires significant human resources

So, if you’re ready to deal with the aftermath, let’s get it started.

First, let’s take a look at the age of your website, since dealing with a young project is very different from dealing with an old one.

In the case of a young project, the recovery might not be so painful, since you haven’t had enough time to create thousands of pages. But if your website is vast and old, your primary issue could have ancient roots.

There are a lot of SEO tools we will be using during the audit process, but the first one to apply to your project is the so-called “Panguin SEO Tool,” created by Barracuda Digital.

Use the Panguin Tool to identify the type of penalty

This tool is free of charge, but to work with it, you will have to grant it access to your Google Analytics account.

The Panguin SEO Tool allows you to juxtapose the date when your organic traffic started to decline with the periods when Google rolled out its algorithm updates. This is the first thing every SEO expert should do before proceeding to an in-depth analysis of your site.

To be honest, Semrush has introduced a similar tool, but we’re used to working with the Panguin Tool instead.

There are various types of Google algorithm updates, such as Panda, Penguin, and Fred (all discussed below).

Depending on which filter has been applied to your website by Google, you need to follow different instructions.

Despite the fact that most SEO experts perform an SEO audit to deal with technical issues, I’m almost sure that, after all, the vital problem relates to your content. But before paying attention to the content of your website, let’s verify that everything is okay with your technical setup.

Dealing with the basic technical issues

In general, you probably know the following steps, but I’d still suggest you check the items presented below:

  1. Check the accessibility of your website
  2. Log into Google Search Console and check whether there are any new messages from Google on your dashboard
  3. Shortly afterward, visit the section titled Manual Actions
  4. Then go to the Security Issues section to make sure everything is okay
  5. Next, check your crawl stats & errors
  6. Double-check your robots.txt
  7. Pay attention to the Blocked Resources section
  8. Then check your Sitemaps
  9. Move to the Structured Data section
  10. Fix possible issues in your Accelerated Mobile Pages
  11. Finally, try the Fetch as Google tool

Check the accessibility of your website

This point may look obvious, but sometimes not everything is as clear as it seems.

For instance, the webmaster who looks after your site could use a Let’s Encrypt SSL certificate. This certificate must be renewed once every three months and may expire. As a result, Google Chrome will forbid visitors from accessing your website, since the browser considers it insecure.
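
If you want to monitor this yourself, here is a minimal Python sketch (Python 3 standard library only; “mywebsite.com” is a placeholder) that reports how many days remain before a certificate expires:

```python
# A minimal sketch: report how many days remain before an HTTPS
# certificate expires, so an expired Let's Encrypt certificate never
# takes you by surprise. "mywebsite.com" is a placeholder.
import socket
import ssl
from datetime import datetime, timezone

def days_until_cert_expires(host: str, port: int = 443) -> int:
    context = ssl.create_default_context()
    with socket.create_connection((host, port), timeout=10) as sock:
        with context.wrap_socket(sock, server_hostname=host) as tls:
            cert = tls.getpeercert()
    expires = datetime.fromtimestamp(
        ssl.cert_time_to_seconds(cert["notAfter"]), timezone.utc
    )
    return (expires - datetime.now(timezone.utc)).days

print(days_until_cert_expires("mywebsite.com"))
```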

Only one version of your website must be accessible (in my case, HTTPS without www)

Next, you will have to check which versions of your website are available to a visitor. If you own the website https://mywebsite.com, it’s necessary to have only one version of your site accessible to visitors. However, if your website administrator/webmaster is inexperienced, your project could be available in several ways, such as:

  • https://mywebsite.com
  • http://mywebsite.com
  • https://www.mywebsite.com
  • http://www.mywebsite.com
  • 44.221.25.3 (or another IP address)

Google might consider each such address a mirror or even a duplicate of your project. Therefore, you must be sure that there is only one accessible version of your website.
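
Here is a minimal Python sketch (assuming the `requests` library; the domain is a placeholder) that verifies each variant answers with a single 301 redirect to the canonical address:

```python
# A minimal sketch: confirm that every non-canonical variant of the
# domain 301-redirects to the one canonical version.
import requests

CANONICAL = "https://mywebsite.com/"  # placeholder canonical version
VARIANTS = [
    "http://mywebsite.com/",
    "https://www.mywebsite.com/",
    "http://www.mywebsite.com/",
]

for url in VARIANTS:
    try:
        r = requests.get(url, allow_redirects=False, timeout=10)
    except requests.RequestException as exc:
        print(f"{url} -> unreachable ({exc})")
        continue
    target = r.headers.get("Location", "")
    ok = r.status_code == 301 and target.rstrip("/") == CANONICAL.rstrip("/")
    print(f"{url} -> {r.status_code} {target} [{'OK' if ok else 'FIX ME'}]")
```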

Finally, you need to check whether your website is indexed by GoogleBot entirely. To do that, type “site:mywebsite.com” into the Google search bar. I bet you know approximately how many pages your project contains. If the number of indexed pages is smaller than you expected, it means that GoogleBot has considered some of your pages useless or thin and has excluded them from the index. By the way, these pages could still exist in the supplemental index.

To cope with this initial task easily, I suggest you use WooRank as your assistant. This SEO audit tool makes it easy to spot the possible initial issues of your project. Besides, its basic functions are free of charge.

Google Search Console

As a rule, Google warns you about possible critical issues with your site. The best way to check is to log in to your Search Console (WMT) and review the automated messages from Google.

Manual Actions

The next thing you need to care about is Manual Actions. Google applies this kind of penalty to sites that severely violate its rules. However, in that case, you have a chance to recover if you make all the necessary changes. Once all the mistakes are fixed, you will have to submit your website for a review.

Security Issues

It happens that an ill-wisher hacks your project and uploads a virus or intrusive ads to your website. When Google detects that your site has been compromised, the system automatically notifies the webmaster.

Crawl Errors

If some URLs of your website are not accessible to GoogleBot, you will see crawl errors in the WMT. Though this issue is not vital by itself, it might become crucial if there are too many errors. Make sure that you’ve fixed all of them.

Robots.txt

This item is obvious to the vast majority of webmasters. However, inexperienced ones frequently make a lot of common mistakes. For instance, a lot of websites use platforms such as WordPress or Joomla. During the installation of one of these PHP frameworks, the setup wizard offers the checkbox “Discourage search engines from indexing this site.” This option looks necessary, since the developers of WordPress and Joomla don’t want GoogleBot to start indexing an empty website. However, webmasters without enough experience might forget to uncheck this checkbox later. Even if you’re an experienced webmaster, you need to check your robots.txt file, since some plugins or other third-party instruments could have made changes to it.
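
A quick way to audit this is to fetch the live file and look for a site-wide Disallow. A minimal sketch (standard library only; placeholder domain):

```python
# A minimal sketch: fetch robots.txt and flag a bare "Disallow: /",
# which blocks the entire site (a "Disallow: /wp-admin/" line is fine).
import urllib.request

URL = "https://mywebsite.com/robots.txt"  # placeholder domain

body = urllib.request.urlopen(URL, timeout=10).read().decode("utf-8", "replace")
print(body)
for line in body.splitlines():
    if line.split("#")[0].strip().lower() == "disallow: /":
        print("WARNING: robots.txt blocks the entire site")
```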

Blocked Resources

You must be sure that GoogleBot has access to all the necessary resources of your web project; if Google can’t access them, a page might be indexed incorrectly. As a rule, this relates to CSS/JS files that you may have blocked in robots.txt.

Sitemaps

GoogleBot frequently checks your sitemap files. As a rule, it’s easy to tell if something is wrong. All you need to do is compare how many pages were submitted and how many of them were indexed by Google. If the indexing process is going well, you will see almost the same number of submitted and indexed pages (±10%). But if GoogleBot has indexed less than 50% of your website, given that your site is not brand new, it seems that something is wrong with your project.
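
If you want the submitted figure without opening Search Console, you can count the URLs in the sitemap yourself. A minimal sketch (standard library only; it assumes a plain urlset sitemap, not a sitemap index, and a placeholder domain):

```python
# A minimal sketch: count the URLs listed in a sitemap so you can
# compare the number against the "indexed" figure in Search Console.
import urllib.request
import xml.etree.ElementTree as ET

URL = "https://mywebsite.com/sitemap.xml"  # placeholder domain
NS = {"sm": "http://www.sitemaps.org/schemas/sitemap/0.9"}

root = ET.fromstring(urllib.request.urlopen(URL, timeout=10).read())
print(len(root.findall("sm:url/sm:loc", NS)), "URLs submitted in the sitemap")
```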

Structured Data

You need to be careful with your structured data markup. Otherwise, Google may punish your website, suspecting that you want to manipulate search results with this kind of markup. As a rule, the most severe Google penalty relates to star ratings, since most webmasters use third-party solutions to show the “stars” in their rich snippets. Warning! If your stars have disappeared, it’s a bad signal. Google wants to be sure that real, registered people rate your products/articles and write reviews. Bear this in mind.

Use the Structured Data Testing Tool by Google to solve this issue.
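
For reference, rating markup is usually embedded as JSON-LD. Here is a minimal Python sketch that generates such a block; the values are placeholders, and real markup must reflect genuine user ratings:

```python
# A minimal sketch: emit a schema.org AggregateRating block as JSON-LD.
# All values are placeholders; publishing invented ratings is exactly
# the manipulation Google penalizes.
import json

markup = {
    "@context": "https://schema.org",
    "@type": "Product",
    "name": "Example product",
    "aggregateRating": {
        "@type": "AggregateRating",
        "ratingValue": "4.6",
        "reviewCount": "87",
    },
}
print('<script type="application/ld+json">')
print(json.dumps(markup, indent=2))
print("</script>")
```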

Accelerated Mobile Pages

Accelerated Mobile Pages (AMP) is a new format introduced by Google to speed up your pages. This technology is essential for web projects that publish news content. But since the technology is new, it’s easy to make mistakes during implementation.

Use the AMP Test Tool by Google to eliminate such issues.

Fetch as Google

Finally, use the Fetch as Google tool to make sure that GoogleBot crawls your pages correctly. Unlike you, Google looks at your page as a bot, not a human. As a result, you might not detect some issues while looking only at the visible side of your site.
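
As a rough local substitute, you can at least request a page with GoogleBot’s User-Agent and compare the response with what your browser shows. A minimal sketch (standard library only; placeholder URL):

```python
# A minimal sketch: fetch a page the way GoogleBot identifies itself.
# This won't render JavaScript like Google does, but it catches crude
# cloaking or bot-blocking problems.
import urllib.request

URL = "https://mywebsite.com/"  # placeholder
UA = "Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)"

req = urllib.request.Request(URL, headers={"User-Agent": UA})
html = urllib.request.urlopen(req, timeout=10).read().decode("utf-8", "replace")
print(html[:500])  # eyeball what the "bot" receives
```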

Dealing with the advanced technical issues

If the basic functionality of your website is set up correctly, it’s time to talk about possible advanced issues.

As a rule, your advanced SEO audit will consist of the following steps:

  • Check your server setup
  • Check your DNS accessibility
  • Identify your website speed issues
  • Detect possible SEO issues in your platform/PHP framework
  • Double-check your plugins, especially SEO plugins
  • Determine the potential weak spots in usability and mobile-friendliness
  • Protect your website against various attacks (close the backdoors)

Server setup for SEO

Your server must send the correct HTTP headers to Google. Thus, you need to take care of the following:

  • The Last-Modified header
  • 301 redirects
  • The rel=”canonical” link element
  • hreflang tags
  • Shortlinks
  • The share image (e.g., og:image)
  • Encoding
  • Content length
  • File type
This SEO audit tool might be useful for you.
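
You can also check the basics from your own machine. A minimal Python sketch (assuming `requests`; placeholder domain) that prints the headers Google will see:

```python
# A minimal sketch: print the response headers that matter for SEO.
# On WordPress sites, "Link" may carry rel="canonical" / rel="shortlink".
import requests

r = requests.get("https://mywebsite.com/", timeout=10)  # placeholder
for name in ("Last-Modified", "Content-Type", "Content-Length",
             "Content-Encoding", "Link"):
    print(f"{name}: {r.headers.get(name, '-- missing --')}")
```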

DNS accessibility

Your next goal is to check your DNS health, since you need to be sure that your domain name has been set up correctly. To cope with this task, I’d suggest you use the DNS Check Tool by Pingdom.
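
A first-aid check you can run locally (standard library only; placeholder hostnames):

```python
# A minimal sketch: confirm that both the bare and the www hostnames
# resolve at all before digging into deeper DNS diagnostics.
import socket

for host in ("mywebsite.com", "www.mywebsite.com"):  # placeholders
    try:
        print(host, "->", socket.gethostbyname(host))
    except socket.gaierror as exc:
        print(host, "-> DNS lookup failed:", exc)
```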

Page Speed

This point is crucial in the new era of SEO: Google is concerned about your website speed. Do you know that the vast majority of people now access the Internet via 3G on their mobile phones instead of desktop computers such as Macs and PCs?

I wrote a guide that helps you increase your page speed.

After reading my guide, use the following tools to check your website speed:

The third tool on the list is the most vital, since Google recently rolled out the mobile-first index. Also, I’d recommend you double-check your AMP pages if you publish news content.

SEO issues of your platform

Your platform, be it WordPress, Joomla, Drupal, etc., is a universal solution created to fit more needs than just yours. As a result, there could be technical issues hidden inside your platform as well as your theme.

For instance, most of you use WordPress, since this platform is easy to install and tune up. But this platform has built-in category, tag, and author pages. If you are the sole author of your blog, you have at least three pages that duplicate your content:

  • Front page
  • Author’s page
  • Category page (if you use only one category)

Some inexperienced webmasters use dozens of tags for each article. As a result, each such tag archive contains only one post with the same content. GoogleBot considers these pages to have so-called “thin content.”

The same relates to your WordPress theme. If you’ve installed a layout from the WP repository, it doesn’t mean that this template is suitable for SEO, even if its author claims the theme is SEO-friendly. Also, some themes affect your rich snippets, microdata markup, etc. So you may suffer from the theme author’s improper implementation of this functionality. The same thing may happen if you built your layout yourself.

In any case, if any of Google’s algorithm updates have hit you, you’ll have to face this issue.

I would recommend using the following SEO audit tools, which perform an in-depth crawl of your website that simulates GoogleBot’s behavior.
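
To illustrate what such a crawl looks for, here is a minimal Python sketch (assuming `requests` and `beautifulsoup4`, a placeholder start URL, and a small crawl budget) that flags pages sharing the same <title>, a common symptom of duplicate taxonomy pages:

```python
# A minimal sketch: walk internal links and flag pages that share the
# same <title>, a common symptom of duplicate category/tag/author pages.
from collections import defaultdict
from urllib.parse import urljoin, urlparse

import requests
from bs4 import BeautifulSoup

START = "https://mywebsite.com/"  # placeholder
seen, queue, titles = set(), [START], defaultdict(list)

while queue and len(seen) < 200:          # small crawl budget
    url = queue.pop()
    if url in seen:
        continue
    seen.add(url)
    try:
        soup = BeautifulSoup(requests.get(url, timeout=10).text, "html.parser")
    except requests.RequestException:
        continue
    if soup.title and soup.title.string:
        titles[soup.title.string.strip()].append(url)
    for a in soup.find_all("a", href=True):
        link = urljoin(url, a["href"]).split("#")[0]
        if urlparse(link).netloc == urlparse(START).netloc:
            queue.append(link)

for title, urls in titles.items():
    if len(urls) > 1:
        print(f"Duplicate title '{title}' on {len(urls)} pages: {urls[:3]}")
```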

Plugins & SEO Plugins

YOAST SEO Plugin Report

Remember, any third-party solution you use to improve your ranking might hurt your SEO visibility instead. You must be confident in the author of a plugin before installing it. I strongly recommend using the Yoast SEO plugin, since its author possesses a firm reputation. Besides, this plugin will assist you with possible on-page SEO optimization issues.

As mentioned above, pay attention to any of your plugins that affect your microdata markup. Using such markup is dangerous if you’re not an experienced webmaster.

Usability & Mobile-friendliness

Both of these technical issues affect your SEO visibility, especially when it comes to a mobile-friendly layout. I know it looks obvious. However, it’s not.

For instance, let’s talk about your ads. Can your advertising negatively affect your SEO visibility? Of course it can! Despite the fact that Google now allows you to place a 300x250 ad block above the fold, I still don’t recommend you do that. The main issue is that you need to separate your content from advertising, and your layout, especially the mobile version of your website, must be set up correctly to follow this rule. You need to set up the navigation of your project correctly as well. Nothing must stand as an obstacle in a visitor’s way.

Rest assured: if GoogleBot detects that the primary content of your site is inaccessible because of poor navigation or due to an overuse of ad blocks, you’ll face a penalty from Google.

Of course, the Google PageSpeed Insights tool provides you with a usability report, but I’d recommend you run a series of blind tests with people you don’t know to gather the necessary data about your usability and mobile-friendliness. Though many SEO experts attribute these issues to your content, I claim that this is a 100% technical issue.

Secure your website

Even if you didn’t find any warnings about security issues in the Search Console, it doesn’t mean that everything is fine with your security.

For instance, there is a common WordPress security issue called the XML-RPC attack. While the Google PageSpeed Insights tool shows that everything is okay with your server, you may in fact be under attack. As a result, potential visitors can’t get access to your website. You thereby need to close all the possible vulnerabilities that might affect the security of your project.
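
A quick probe (assuming `requests`; placeholder domain) tells you whether the endpoint is exposed at all:

```python
# A minimal sketch: check whether xmlrpc.php is publicly reachable.
# A live WordPress endpoint answers 405 ("accepts POST requests only")
# or 200; both mean it can be abused and is worth blocking.
import requests

r = requests.get("https://mywebsite.com/xmlrpc.php", timeout=10)  # placeholder
if r.status_code in (200, 405):
    print("xmlrpc.php is exposed; consider blocking it at the server level")
else:
    print(f"xmlrpc.php returned {r.status_code}; it appears blocked")
```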

Content issues

As a rule, if you have never purchased backlinks and your technical setup is perfect, the only way to lose your SEO visibility is to create content of poor quality. Unfortunately, this issue is the most complicated one to deal with.

In general, Google considers your content inappropriate due to:

  • Weak or wrong on-page optimization
  • Low quality of your content
  • Similarity and duplication of your content
  • Thin content
  • Over-optimization
  • Poor behavior factors

On-page optimization

First, I strongly recommend you follow all 30 points described in my article devoted to correct on-page optimization. All the things described in that blog post are essential. In fact, on-page optimization is the only way to explain to GoogleBot what exactly your content is about, place the right accents, and emphasize the most critical paragraphs of your data. Never exclude on-page optimization from your SEO practice, since this aspect is vital.

I don’t want to duplicate my own content, but I’d like to emphasize some necessary points:

  • Your content must be readable
  • Show the structure of your data to GoogleBot through HTML markup and microdata (see the sketch after this list)
  • Your goal is to appear natural both to humans and to Google
  • In some niches, you need to be an expert
  • You must use only reliable sources and link to them as your references
  • Mix your data with viral content to attract an audience from the social networks
  • Add some diagrams, infographics, pictures, videos, etc. to make your content less boring, if possible

You will find all the detailed data on the subject in my article mentioned above.
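
One of those points, showing GoogleBot your structure through markup, is easy to verify yourself: print the heading outline of a page the way a crawler sees it. A minimal sketch (assuming `requests` and `beautifulsoup4`; placeholder URL):

```python
# A minimal sketch: print a page's H1-H4 outline as a crawler sees it,
# so you can check that the structure you intended actually exists.
import requests
from bs4 import BeautifulSoup

URL = "https://mywebsite.com/some-post/"  # placeholder
soup = BeautifulSoup(requests.get(URL, timeout=10).text, "html.parser")

for tag in soup.find_all(["h1", "h2", "h3", "h4"]):
    level = int(tag.name[1])
    print("  " * (level - 1) + f"{tag.name.upper()}: {tag.get_text(strip=True)}")
```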

Low-quality content

Google first smashed the SERP back in 2011 with its first quality-related algorithm update, known as Google Panda. Before this event, the Internet was full of poor-quality content. This filter applies to sites that contain useless or borrowed content. Later, Google Panda became part of the Google core algorithm, which is updated regularly. So, if you don’t want to be hit by this “animal,” it’s time to take care of the quality of your content.

The first thing you need to determine is whether your content is useful to your visitors. We will discuss later how Google detects poor content. For now, I want to draw your attention to your own impressions. In fact, it’s time to present your project to a focus group or, better, a target group. For instance, if your project is about recipes, invite 50 homemakers from across the country to your office and ask them what they really think about your recipes.

Perhaps your data is out of date, or your authors are not qualified enough. Your primary goal is to determine why your content is not suitable for your target audience.

Next, you need to be sure that your content is entirely original. I’ll give you a hint: start with your titles. If your headlines are not original, it is a bad signal for Google. And if your texts are not original, I bet Google Panda is coming for you.

Your website must be in the first position when you search for a pasted paragraph of your text

How do you determine whether your content is original? First, cut a piece of your text containing two or three lines and paste it into the Google search bar. If your page is not in the first position, it’s a bad signal. But if you see someone else’s website with your content instead of your own, it’s a disaster. We’ll discuss such a situation later. For now, remember: your content must appear only on your website and nowhere else. Don’t allow anyone to copy your texts without your permission. In fact, I strongly recommend not spreading your content at all, even with your approval.

Remember, if some popular website borrows your content, Google will rank it in first place. Don’t allow any famous resource to take even a small piece of your data. If you already did, request that your partners remove your property. And I think it’s needless to say that it’s prohibited to borrow someone else’s content, even if you link back to the source. This approach doesn’t work; Google will punish you for sure.

Thin content

You may publish quality but thin content. Remember, your data must cover the topic entirely if you want to succeed in the Google SERP. In March 2017, Google confirmed a series of mysterious updates that the company nicknamed “Fred.” Google Fred punishes websites with thin content.

What does Google call “thin content”? In a nutshell, you need to avoid these types of data:

  • Articles that consist of fewer than 350 words
  • Blog posts that don’t cover a topic fully
  • Content that is useless or out of date (especially news)
  • So-called rewritten content
  • Repetitive, duplicated, and poorly written content

It’s also worth noting that Google now pays attention to the reputation of the author who wrote the content, especially if your data relates to the so-called “YMYL” (Your Money or Your Life) genre, such as medicine, astrology, etc. Thus, if your content might affect someone’s health or fate, you must be sure that your authors are qualified specialists.

To avoid the “thin content” label, you need to publish the profiles of your authors on your project. Each author should present their certificates, degrees, diplomas, etc. In addition, you need to provide comprehensive information about your company, so take care of your “about” page. Take a look at how famous brands present their informational pages.

Over-optimization

Over-optimization is a term that dates back to the ’90s, when the Google algorithm was “naive.” Unfortunately, lots of webmasters still stuff dozens of keywords into their articles. Forget about it! GoogleBot is tricky and sophisticated in 2018. So if you want to avoid this penalty, you should take into account that you are writing for a human, not a robot.

What to do to avoid over-optimization:

  • Try to use a keyword only in your title
  • Stop using dozens of keywords in your blog posts and single pages
  • Don’t obsess over your meta description, since Google knows better what to show in the rich snippet
  • Forget about keywords in the ALT tags (don’t use long phrases)
  • Use subheads to split your content into blocks, but use no more than two keywords across your H2, H3, and H4 headers
  • Don’t avoid the usage of outbound links
  • Never mark all your links “nofollow,” since it looks suspicious
  • Use at least one dofollow link per page
  • Forget about dozens of Wikipedia-style internal links
  • Never use supplemental, useless texts set in small type
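
If you want a rough, automated sanity check for keyword stuffing, here is a Python sketch; the 2% threshold is my own heuristic, not a published Google rule:

```python
# A rough heuristic sketch: estimate how often a keyword appears
# relative to the total word count of a post. High densities read as
# "stuffed" to both humans and bots; the threshold is an assumption.
import re

def keyword_density(text: str, keyword: str) -> float:
    words = re.findall(r"[\w'-]+", text.lower())
    hits = len(re.findall(re.escape(keyword.lower()), text.lower()))
    return 100.0 * hits / max(len(words), 1)

post = open("post.txt", encoding="utf-8").read()  # your article as plain text
density = keyword_density(post, "seo audit")
print(f"keyword density: {density:.2f}%")
if density > 2.0:
    print("Looks over-optimized; rewrite for humans first.")
```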

Behavior factors

An excellent example of bad indicators

It’s time to explore your Google Analytics report for behavioral red flags.

You need to remember the golden rule: the more time a visitor spends on your website and the more pages they read, the better your project’s position in the Google SERP.

As a result, there are three vital factors you should pay attention to while studying your GA report:

  • Bounce rate
  • The average duration of the visit
  • Number of Sessions per User

This is how Google evaluates your website behavior factors.

Thus, if you discover a drop in one of these indicators, it’s time to determine why visitors have started to spend less time on your website. Ideally, you should identify the least visited pages on the list and either remove them or improve their quality.
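
Here is a minimal Python sketch of that triage, reading a CSV export from Google Analytics; the column names are assumptions, so adjust them to match your own export:

```python
# A minimal sketch: surface the least engaging pages from a Google
# Analytics CSV export. Column names ("Page", "Bounce Rate",
# "Avg. Time on Page") are assumptions; match them to your export.
import csv

with open("ga_pages.csv", encoding="utf-8") as f:
    rows = list(csv.DictReader(f))

rows.sort(key=lambda r: float(r["Bounce Rate"].rstrip("%")), reverse=True)
for r in rows[:10]:
    print(r["Page"], r["Bounce Rate"], r["Avg. Time on Page"])
```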

Backlinks

It’s time to invade the territory of SEO experts, since issues related to backlinks can’t be solved without specialized SEO audit instruments. Let’s talk about your backlink profile.

Back in April 2012, Google introduced its new algorithm, called “Penguin,” which targeted websites with suspicious backlink profiles. The fact is that, over time, webmasters using black-hat SEO practices had managed to discover a unique solution allowing them to cheat the Google algorithms. Given the fact that GoogleBot mostly relied on PageRank, these webmasters started manipulating backlinks to increase their SEO visibility in Google. To punish them for such inappropriate actions, Google rolled out another algorithm update. I don’t want to dip deeper into the history, since my goal is to assist you with the SEO audit. Therefore, let’s return to the subject.

I needed this lyrical digression to clarify that if Google Penguin has hit you, it doesn’t mean that you used the wrong links. Rather, it means that you have a suspicious backlink profile.

What kind of backlink profile does Google consider suspicious?

  • If there are unnatural jumps on your backlink graph
  • If Google detects an unusual increase of links in your backlink profile
  • If the domains linking to you don’t possess enough authority
  • If your DoFollow/NoFollow ratio looks doubtful
  • If you really did purchase backlinks

The last point is the worst scenario of all, since your recovery might take years. Unfortunately, this situation is also the most common. As a rule, dealing with this issue is what most SEO experts mean by the term “SEO audit.” They use a set of tools for an in-depth analysis of your backlink profile.

Among the SEO audit tools for backlinks, the most useful are the following:

I personally use Ahrefs, since this tool regularly updates its database. However, Semrush is ideal for a fast SEO audit in the first stage. Okay, let’s allow the magic to happen.

Unnatural spikes

Such a graph doesn’t look natural | Ahrefs
As a result, the organic traffic looks volatile | Ahrefs

Let’s first take a look at your graph. Does it look suspicious? Ideally, the shape of your backlink curve should resemble an exponential function, meaning that the growth of backlinks should be smooth and measured, with an uptrend. Of course, you will observe insignificant jumps. But the curve’s overall trend should be exponential.

Why does such a picture seem unnatural? The answer is simple. When other webmasters place natural links to your website, the ratio between new and lost backlinks is usually in favor of the new ones. At the same time, if you observe sudden jumps on the graph in both directions, such a trend looks very suspicious.
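
You can spot such jumps programmatically in a referring-domains export. A minimal Python sketch (the CSV column names “month” and “ref_domains” are assumptions, and the 3x month-over-month change is my own threshold):

```python
# A minimal sketch: flag suspicious month-over-month jumps in referring
# domains from a backlink-tool export.
import csv

with open("ref_domains.csv", encoding="utf-8") as f:
    rows = list(csv.DictReader(f))

prev = None
for r in rows:
    count = int(r["ref_domains"])
    if prev and (count > 3 * prev or count < prev / 3):
        print(f"{r['month']}: {prev} -> {count} looks like an unnatural jump")
    prev = count
```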

An unusual increase or decrease of backlinks

Why does it happen? When GoogleBot observes spikes or dips in your backlink profile, it’s a signal to the search engine that you bought a bunch of backlinks (even if you didn’t). If you did acquire backlinks for your website, you can’t control their natural growth. You would need to order new links regularly to maintain the appearance of natural gain. But in practice, it’s almost impossible to cope with such a task.

Only six referring domains in February VS. ~ 200 in March? Is it natural?

If you have chosen a black-hat SEO approach, you need at least to buy perpetual backlinks. Otherwise, if you stop purchasing links, the webmasters who sold them to you will drop them. As a result, both GoogleBot and you will see a disastrous decline on your backlink graph.

This situation readily illustrates what happens to your project when you decline the services of your previous SEO expert or the webmaster who took care of your website. Your solution, in that case, is to return to a natural state of affairs.

The grey-hat SEO practice, then, is to get a boost from purchased backlinks. Later, an experienced SEO expert removes such unnatural backlinks gradually, so that natural backlinks can outweigh the negative impact of the initial, unnatural ones. Do you understand the main idea? Okay, I’ll try to make it clear.

Here is how the “grey method” works:

  • You purchase a bunch of backlinks that give you a rapid initial start
  • Your positions start increasing
  • Some page of your website climbs to first place in the Google SERP
  • You start receiving natural backlinks because of this
  • You begin gradually removing the bad backlinks
  • Your natural backlinks start to supplant the bad links
  • Your backlink profile becomes closer to natural

Is it clear now?

I don’t recommend you use this method, since it’s harmful and not legitimate. My goal is to show you how it works so that you can avoid the mistakes. This is my hint at a possible method of recovery: you thereby need to try to dilute your backlink profile with natural backlinks and shift the ratio of good to bad links in your favor. How to do that exactly? Oh, this is a big secret which costs a lot of money, and I don’t want to discuss this topic in the current article.

But I’ll give you a hint. The first thing you need to do is use the Disavow Tool to get rid of your bad backlinks. To do that, you will have to download the list containing all your links from the Search Console and separate the bad links from the natural ones. Shortly afterward, you will see your real backlink profile. You have no chance of recovering immediately, but this is your hope for recovery. Next, you will have to find places where you can get natural-looking backlinks. This way is long and full of obstacles. In any case, you will have to be ready to spend a lot of money and human resources.
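
The disavow file itself is trivial to produce once the hard, manual judgment is done. A minimal Python sketch of the format Google’s Disavow Tool expects (the domains are placeholders):

```python
# A minimal sketch: write a disavow.txt in the format the Disavow Tool
# accepts: "#" comments and "domain:" entries. Deciding which domains
# are bad remains manual work.
bad_domains = ["spammy-forum.example", "link-farm.example"]  # placeholders

with open("disavow.txt", "w") as f:
    f.write("# Domains disavowed after manual backlink review\n")
    for domain in sorted(set(bad_domains)):
        f.write(f"domain:{domain}\n")
```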

However, if your project is fresh, there is no reason for panic. Use the Disavow Tool and keep working on your project; that’s all you need.

DA/PA

PA/DA (In Ahrefs UR/DR)

The quality of your backlink profile mainly depends on the authority of the domains and pages that link to you. You may have millions of useless backlinks from various spammy forums; however, one backlink from NYT, WSJ, or HuffPost means much more than all these spammy links. As a result, your goal is to get as many quality backlinks as you can. In other words, if Google Penguin has hit you, you need to “find” some influential friends capable of improving your backlink profile. Don’t have any? Try to negotiate. Is my hint clear?

DoFollow/NoFollow ratio

As a rule, every webmaster hunts for DoFollow backlinks, since it is believed that only this kind of link has a real effect. However, this statement is a delusion. First, I suggest you read my blog post on the subject.

Nonetheless, I’d like to explain some basics:

  • NoFollow backlinks are essential for your backlink profile
  • Your DoFollow/NoFollow ratio must be in favor of NoFollow links, since they look natural
  • Don’t mark all your outbound links with the rel=”nofollow” tag
  • NoFollow backlinks affect your DA/PA
  • GoogleBot tracks your NoFollow links, their quality, and their quantity

NoFollow backlinks are essential

As a rule, webmasters try to avoid giving out DoFollow links, as DoFollow links transfer the so-called “link juice” to other websites and decrease the authority of the donor. But in that case, who will link back to you? If you consider the content of a site you link to useful, I think you should give that resource a DoFollow link. Am I wrong? Otherwise, all webmasters would use only the rel=”nofollow” tag. In fact, NoFollow is just an instrument to prevent spam in the comments, etc.

DoFollow/NoFollow ratio

If we take the rule above into account, most of your links should be NoFollow. Yeah, I know that every SEO expert tells you that DoFollow must surpass NoFollow. But they live in delusion. Other experts say that the ratio should be 50/50. But I claim that this is another delusion. Remember: in a natural profile, you can’t control this factor. YOU CANNOT CONTROL your DoFollow/NoFollow ratio.
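
Although you can’t control the ratio, you can at least measure it. A minimal Python sketch over a backlink CSV export (the “nofollow” column name is an assumption; tools like Ahrefs and Semrush export similar tables):

```python
# A minimal sketch: measure the DoFollow/NoFollow split in a backlink
# export.
import csv

with open("backlinks.csv", encoding="utf-8") as f:
    rows = list(csv.DictReader(f))

nofollow = sum(1 for r in rows if r["nofollow"].strip().lower() == "true")
dofollow = len(rows) - nofollow
share = 100.0 * nofollow / max(len(rows), 1)
print(f"dofollow: {dofollow}, nofollow: {nofollow} ({share:.1f}% nofollow)")
```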

However, if you got into trouble due to another Google penalty, I’d recommend diluting your backlink profile with NoFollow backlinks. The reason is that such an increase looks natural. You need to think logically: if you prefer to mark your outbound links with the rel=”nofollow” tag, why should others act differently? Just think about it.

Don’t use the rel=”nofollow” tag for all your outbound links

Another detail that looks suspicious to GoogleBot is when all the outbound links you use on your website are NoFollow. Ask yourself why this seems doubtful. The reason is that it is not natural. Many years ago, it was right to follow such rules. But now, in 2018, the only thing you need to care about is being natural in GoogleBot’s eyes. Google treats any manipulation of links as an attempt to cheat it. So, my advice is to mix your DoFollow and NoFollow outbound links.

NoFollow backlinks & DA/PA

The public PageRank indicator no longer exists. Instead, SEO tools measure the authority of your domain or page. Generally speaking, Google still pays attention to your PageRank, but you can’t see it in your toolbar anymore. When any valuable website links to your project, it’s a signal for Google to increase your SEO visibility, despite the fact that your visible PageRank remains the same. Therefore, NoFollow backlinks do matter.

The events you can’t/couldn’t control

Finally, let’s talk about the scenarios you couldn’t predict. Sometimes, an SEO audit concludes without a result: everything is fine with your project, and there is no apparent reason for the unexpected reduction in your organic traffic. In that case, you may have faced events you couldn’t control. Unfortunately, in some cases, you can’t affect such circumstances at all.

Let’s take a look at some examples:

  • Some ill-wisher created a full copy of your website
  • You’re experiencing a DDoS attack (or another kind of attack)
  • You discovered thousands of unknown backlinks in your Search Console
  • The most recent Google core algorithm update re-evaluated previously underappreciated websites (but not yours)

Mirroring

Sometimes, during an in-depth SEO audit, you may discover that there is a copy of your entire project, or even a lot of clones. Since GoogleBot can’t always identify the primary source of the content (Why Do I Hate Google), Google might punish your project, decrease your SEO visibility, or even exclude your website from the search results.

What to do in such a case?

  • Write a DMCA complaint
  • Contact the domain registrar of the ill-wisher
  • Contact the hosting provider which hosts the domain

DMCA

WMT DMCA Dashboard

You can’t just paste a domain name into your DMCA request. You will have to list every single link to the infringing material and present your original work. Be ready for a long-term process. Even if your DMCA complaint is accepted, you may still observe the ill-wisher’s website in the Google SERP for a year or more.

Contact the domain registrar

Sometimes, it’s much easier to contact GoDaddy or another domain registrar. In my case, the bad guy used the .tk domain zone, which international law does not regulate. As a result, it took a long time to find any contact data. Eventually, I managed to reach them, and the ill-wisher’s domain was deactivated within a minute.

Contact the hosting provider

In such a case, you will have to determine the IP range of the offender to identify their hosting provider. Then you need to find the abuse email address. Depending on the hosting provider, it could take a lot of time to resolve your issue. However, in any case, it will be faster than going through Google’s DMCA process.
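
A minimal sketch of that lookup, assuming a Unix system with the standard `whois` utility installed (the domain is a placeholder):

```python
# A minimal sketch: resolve the offending host to an IP, then shell out
# to `whois` and grep for the abuse contact of the network owner.
import socket
import subprocess

ip = socket.gethostbyname("copy-of-your-site.example")  # placeholder
print("resolves to", ip)

out = subprocess.run(["whois", ip], capture_output=True, text=True).stdout
print("\n".join(line for line in out.splitlines() if "abuse" in line.lower()))
```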

DDoS attack

Unfortunately, DDoS attacks are a common phenomenon on the modern internet. Because of such an attack, your website might become slow, which affects the behavior of your visitors. Rest assured, Google will decrease your positions in the case of a DDoS attack. I can’t assist you much with this issue; there are a lot of commercial solutions on the market. However, I’d recommend turning on FastCGI caching of your PHP resources, which might reduce your losses.

Unrecognized backlinks in the WMT

If you discover tons of unknown backlinks in your Search Console, it is another type of attack. An ill-wisher could order a so-called “spam attack” on your project to damage your reputation. Unfortunately, the only way to avoid a severe penalty from Google is to use the Disavow Tool. However, there is no guarantee that the tool will help in your case. Personally, I haven’t seen an effect from this tool; nonetheless, it has been useful for other webmasters.

The recent Google core algorithm update

Recently, Google rolled out another core algorithm update. Unlike all the previous updates, these changes affected “white” projects that had never violated any rules. According to Google, it re-evaluated projects that were previously punished. As a result, you could suffer because of their rise. Unfortunately, there is no remedy. I doubt that there is even one SEO expert capable of dealing with such an issue.

Conclusion

Well, your first SEO audit is complete. I know it was a long article, and unfortunately, I’ve only shown you the basics of an SEO audit. Nonetheless, all the data described above is useful and current, at least in 2018-19. Now you know what to do first. If, as a result of your efforts, you’re still in trouble, it’s time to find an experienced SEO expert/company. In fact, these guys will take into account the same factors. The only difference between them and you is that they know the details, and, of course, their experience does matter. At least I’ve attempted to assist you for free; your later expenses will depend on your unique case.

Featured image is designed by Freepik
