We all know that Google loves and ranks higher those websites that perform well on its parameters. Nowadays Google uses a machine learning system called RankBrain to help deliver relevant results for each query on the SERPs. In order to rank among billions of pages, it is important to check and improve your site's performance from time to time.
I've seen websites that start to perform poorly or suddenly vanish from the SERP results. This can be due to several potential issues which need to be checked and fixed.
The constant updates to Google's ranking algorithms can cause fluctuations in organic rankings. Generally, these fluctuations are also called flux or the Google dance. Google says that if your site has good-quality inbound links, there is a high chance your website will return to its position on the next crawl. Google's continuous updates have brought nightmares for many big and small brands, so keep earning quality links and your website will not be affected by them.
It is very important for webmasters to follow Google's Webmaster Quality Guidelines to maintain and improve organic rankings.
Google recommends the following steps if your site is not appearing in search results or is performing worse than before:
Check whether your site is indexed
Check for a manual spam penalty
Make sure Google can find and crawl your site
Make sure Google can index your site
Make sure your content is useful and relevant
The first and foremost step is to check whether your website is being indexed by Google. This can be done with the following search operator: site:[yourdomain].com. If your website's landing pages show up in the search results, the site is being indexed by Google. However, if your website used to be indexed and is no longer showing in the SERPs, it may have been removed for a violation of the webmaster guidelines. Review the guidelines, fix the issue, and submit a reconsideration request.
Checking whether your site ranks for your Domain Name
By searching for www.[yourdomain].com you can check whether your site ranks for its own domain name. If your website isn't showing in the search results, or is ranking poorly, there is a chance of a manual spam penalty for violating the webmaster guidelines. A manual action can be of two types:
Site-wide penalty
Partial match penalty
A site-wide penalty means your entire site has been manually penalized.
A partial match penalty means a particular page, or a set of pages, has been penalized.
A manual action can be applied for any of the following reasons:
Hacked site
User-generated spam
Spammy freehosts
Spammy structured markup
Unnatural links to your site
Thin content with little or no added value
Cloaking
Unnatural links from your site
Pure spam
Cloaked images
Making sure your Website is Crawled by Google
Generally, Googlebot crawls a website's pages with the help of the sitemap provided by the webmaster. Googlebot updates its data from time to time and notifies the webmaster about issues in Google Search Console.
In order to improve website performance, it is important to check the crawl errors in Google Search Console. The Crawl Errors report lists the URLs that Googlebot tried to crawl but couldn't access. The report is segmented into two parts: site errors and URL errors.
The site errors section highlights site-wide problems, such as DNS errors, server connectivity issues, and failures fetching the robots.txt file.
The URL errors section of the report is segmented into categories, and each category shows the top 1,000 URL errors. You don't need to give importance to every error you see in this section, but errors that can have a negative impact on organic rankings and website performance need to be monitored constantly.
The following factors need to be considered:
Fix 404 errors with 301 redirects
Update your sitemap
Keep redirects short and clean
Analyze your robots.txt file and make sure it is not blocking important landing pages
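As an illustration of the first factor, a 404 error can be fixed with a permanent redirect configured at the web server. Here is a sketch for Apache's .htaccess (the paths are hypothetical, and this assumes mod_alias is enabled):

```apache
# Permanently redirect a removed page (now returning 404)
# to its closest live equivalent. Both paths are made-up examples.
Redirect 301 /old-black-purse.html /products/black-leather-purse.html
```

Keeping the redirect a single hop, as above, also satisfies the "short and clean" factor: chains of redirects waste crawl budget and dilute link equity.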
Make sure you are not blocking landing pages through meta tags. Check that your internal links and pages are not blocked by the following code:
<meta name="robots" content="noindex,nofollow">
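A quick, self-contained way to check both blockers yourself, using only Python's standard library (the robots.txt rules and HTML below are made-up examples, not a real site's files):

```python
from urllib.robotparser import RobotFileParser
from html.parser import HTMLParser

# --- Check robots.txt rules (example rules for illustration) ---
robots_txt = """\
User-agent: *
Disallow: /private/
"""
parser = RobotFileParser()
parser.parse(robots_txt.splitlines())
print(parser.can_fetch("Googlebot", "/private/page.html"))   # blocked
print(parser.can_fetch("Googlebot", "/landing-page.html"))   # crawlable

# --- Scan a page's HTML for a blocking meta robots tag ---
class RobotsMetaFinder(HTMLParser):
    def __init__(self):
        super().__init__()
        self.directives = []

    def handle_starttag(self, tag, attrs):
        a = dict(attrs)
        if tag == "meta" and a.get("name", "").lower() == "robots":
            self.directives.append(a.get("content", "").lower())

html = '<html><head><meta name="robots" content="noindex,nofollow"></head></html>'
finder = RobotsMetaFinder()
finder.feed(html)
print(any("noindex" in d for d in finder.directives))  # page blocks indexing
```

In practice you would fetch the live robots.txt and page HTML for each important landing page and run the same two checks over them.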
Analyze your site structure and make sure it is easily accessible to Googlebot and other search engine crawlers. Generally, search engines can crawl any text available on a website or app, but the use of Flash, Silverlight, iframes, and other rich media formats can prevent crawlers from indexing the text. This doesn't mean you cannot use rich media; just make sure any content you embed in such files is also available in text format.
Make sure that you always create a sitemap and submit it in Google Search Console. A sitemap helps Google index new pages, and content that is not easily discoverable, faster. Sitemaps are especially useful if your site has dynamic content, content that Googlebot cannot easily discover, or if the site is new.
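For illustration, a minimal sitemap.xml looks like this (the URLs and dates are placeholders, following the sitemaps.org protocol):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>https://www.example.com/</loc>
    <lastmod>2017-01-15</lastmod>
    <changefreq>weekly</changefreq>
  </url>
  <url>
    <loc>https://www.example.com/products/black-leather-purse.html</loc>
    <lastmod>2017-01-10</lastmod>
  </url>
</urlset>
```

Besides submitting it in Search Console, you can also point crawlers at it with a `Sitemap:` line in your robots.txt file.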
Making Content Useful and Relevant
Content is king for Google. Google loves content that gives value to readers. To produce useful, high-quality content, it is important to analyze the Search Queries page in Google Search Console. The first column shows the queries for which your site most often appears in search results, and the report also lists the number of impressions, clicks, and CTR for each query.
All this information provides insight into what users are searching for, and into the queries for which users often click on your site. For example, your site may often appear for "kid toys and clothes" but have a low CTR because the website doesn't contain much information about those queries. In such a case, consider making the content more relevant and useful.
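CTR is simply clicks divided by impressions. A small sketch with made-up Search Queries numbers shows how you might spot queries that get many impressions but few clicks:

```python
# Hypothetical rows from a Search Queries export: (query, impressions, clicks)
rows = [
    ("kid toys", 12000, 120),
    ("kids clothes", 8000, 560),
    ("black leather purse", 3000, 240),
]

results = []
for query, impressions, clicks in rows:
    ctr = round(clicks / impressions * 100, 1)  # CTR as a percentage
    # Flag queries with lots of impressions but a CTR under 2%
    needs_work = impressions > 5000 and ctr < 2
    results.append((query, ctr, needs_work))

for query, ctr, needs_work in results:
    note = "  <- high impressions, low CTR" if needs_work else ""
    print(f"{query}: {ctr}% CTR{note}")
```

Queries flagged this way (here, the hypothetical "kid toys") are the ones whose landing pages most likely need more relevant content or better titles and descriptions.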
Also, avoid keyword stuffing, as it can have a negative impact on organic rankings.
Analyze the density and variation of keywords on the Content Keywords page in Google Search Console. The keywords and their variants are listed in order of frequency of appearance. By clicking on a keyword you can find out which pages it appears on.
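You can also eyeball keyword frequency on your own pages with a few lines of Python. This is a rough sketch (the sample text is made up, and on-page frequency is only a sanity check against stuffing, not a ranking formula):

```python
import re
from collections import Counter

# Hypothetical landing-page copy
page_text = """Black leather purses are durable. A leather purse
makes a great gift, and our purse collection has many leather styles."""

# Lowercase and split into words
words = re.findall(r"[a-z']+", page_text.lower())
counts = Counter(words)

# Show the most frequent words and their share of the page
for word, count in counts.most_common(3):
    print(f"{word}: {count} of {len(words)} words ({count / len(words):.1%})")
```

If one keyword dominates the counts far beyond its natural variants, that is a hint the copy may read as stuffed.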
Check the HTML Improvements page in Search Console. It highlights duplicate or missing page titles and descriptions, as well as non-indexable content such as some rich media files.
Check whether any of your content is being flagged as adult content by turning off SafeSearch in the search engine; the SafeSearch filter removes adult content from search results.
Images accompanied by good content generate traffic. Give your images detailed, informative filenames. For example, black-leather-purse.jpg is a lot more informative than 123345.jpg. Google sometimes uses the image filename as the snippet in search results if it can't find suitable text on the page.
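For instance, pairing a descriptive filename with descriptive alt text (the file and wording below are hypothetical) gives Google far more to work with than a numeric name:

```html
<!-- Descriptive: filename and alt text both describe the image -->
<img src="/images/black-leather-purse.jpg" alt="Black leather purse with gold clasp">

<!-- Uninformative: Google learns nothing from the filename or alt text -->
<img src="/images/123345.jpg" alt="">
```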