How To Force Google To Index My Website?

Well, you’ve just created your first personal blog, but it seems Google doesn’t want to visit your website. You’re frustrated and have no idea what to do next. What’s your first move? I bet you go to Google in an attempt to find some data on the subject. Then you submit your sitemap to Google Webmaster Tools (GWT). After that, you try “Fetch as Google,” since everyone recommends this step. Congratulations! You are on the wrong track. You may object: “It’s obvious; Google itself recommends following these rules.” And yet this technique gives no result.

To be honest, all the steps described above are correct, yet for some reason they don’t work. I’ll give you a hint. The main reason Google doesn’t react to your actions is that the search engine expects exactly these moves from you. And when you’re in a hurry, you make mistakes. So what are you going to do to outsmart the algorithm? The right answer is: nothing. You just need to let Google do its job, without your help. Why? What? What am I talking about?

At every conference, Google’s spokespeople try to convince you that their search algorithm is perfect. Indeed, Google knows how to index content. Once you’ve purchased your new domain name, Google detects it. The search engine’s crawler immediately visits your domain to index your front page (which usually looks rough at that point). From then on, Google knows about your existence. You can type your domain name into the search bar to discover your first indexed page. But what is Google going to do next?
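The quickest way to see what Google has already indexed is the site: search operator, typed directly into the Google search bar (example.com below stands in for your own domain):

```
site:example.com
```

If your front page has been indexed, it will show up in the results; an empty result set means the crawler hasn’t picked up anything from your domain yet.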

Googlebot’s next step is to revisit your website in an attempt to discover some content on it. This is where the first mistake happens. During its first visit, the crawler intends to index all the data on your website. But, as a rule, you block this first visit in robots.txt. Or you publish only one or two pages, index.html/index.php and about.html/about.php. Perhaps you also managed to post your first article.
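For illustration, here is what the two robots.txt states look like; the “under construction” variant is what many site owners (and some CMS defaults) leave in place during that crucial first visit, and the example.com URL is a placeholder:

```
# Blocks every crawler from the entire site — the typical
# "under construction" setup that turns Googlebot away:
User-agent: *
Disallow: /

# What the file should say once the site is ready to be indexed
# (an empty Disallow means nothing is blocked):
User-agent: *
Disallow:
Sitemap: https://example.com/sitemap.xml
```

Only one of these blocks belongs in the live file at a time; the point is that a leftover `Disallow: /` quietly undoes everything else you do.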

All the actions described above are precisely what Google expects from you. So what do you get as a result? The search crawler will next visit your website in a week or two. Why? Because Google is waiting for some content to feed on. You can then submit your sitemap.xml, add your website to GWT, add Google Analytics, and so on, but Google has already concluded that your project is not ready to be indexed. So what should you do instead?
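For reference, a minimal sitemap.xml follows the Sitemaps protocol and looks like this (the URL and date are placeholders, not values from this blog):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <!-- One <url> entry per page you want crawled -->
    <loc>https://example.com/first-article/</loc>
    <lastmod>2017-01-15</lastmod>
  </url>
</urlset>
```

The file itself is harmless; the argument here is only about when you hand it to Google, not whether you should have one.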

You have to show Googlebot that your site is ready for indexation. To do that, I suggest you build your website offline. Once you have written at least 30-50 articles and created all the supplemental pages (about, contact, disclaimer, etc.), it’s time to publish your website. Moreover, over the following days you will have to add at least 3-5 articles every day. How does this approach differ from the previous one?

  1. Google determines during its first visit that your site is ready for indexation, since there is something to index;
  2. Since there is some data on your website, the crawler wonders whether the content will be updated, so Googlebot visits your website one more time;
  3. During the second visit, Google discovers at least 3-5 new articles plus tags, categories, etc. (maybe 7-10 additional pages);
  4. The crawler then tests you once more the next day, and that day is also productive for Google’s feeding;
  5. Eventually, the search engine realizes that your website is updated regularly, so there is a reason to visit your project every day.

What happens when you try to feed Google using the “Fetch as Google” feature? The indexation of your website becomes unnatural, since you’re forcing Google to index your articles and pages. The search engine will do it, of course, but not properly. You might even discover your pages in the SERP. However, this is a temporary effect. When you submit a URL through GWT, Google indexes it without first running its usual quality filters, so this approach is the surest way to end up in Google’s supplemental index or even the sandbox. On the contrary, when you let Googlebot visit your website naturally, the search engine appreciates it, because everything works without enforcement.


Googlebot will index your website in any case; it’s just a matter of time. But how long it takes depends on you. Maybe my approach looks wrong compared with the traditional way of submitting your website to Google. However, this method works. Remember, you will have to surprise Google with original actions every time. If you do everything like everyone else, why should Google consider your website the best one? Googlebot is a brilliant algorithm. Don’t be so naive as to think you’re assisting Google by showing it your new pages. Googlebot is like your favorite pet, which adores you when you feed it regularly! But please bear in mind: all these techniques work only if you follow the other instructions presented on this blog. If you borrowed someone else’s content, this strategy will give you no results.
