Soumya Roy, Founder & CEO of PromozSEO Web Marketing Academy | Seasoned Digital Marketer & Search Engine Specialist | Webmaster - Entrepreneur - Trainer - Mentor - Coach
Posted on Dec 03, 2018, 10:07 pm
How Using Google Search Console Tool can Make You More Profit
If you are an internet entrepreneur running a business and interested in digital marketing, then most likely you are aware of the Google Search Console tool (formerly known as Google Webmaster Tools). It is one of the most important SEO and internet marketing tools for webmasters, digital marketers, and website owners.
As an internet startup owner, you might already be doing digital marketing, search engine optimization, paid marketing, and so on. But have you considered that using the Google Search Console tool consistently can make your business website more search engine friendly and eventually make you more profit? It is true: if you start using Google Search Console regularly, you can significantly improve your website's user experience and overall digital performance.
How to Start with Google Search Console Tool
If your business website is not yet added to Search Console, then your first job is to integrate it with the tool.
To do this, you may create a fresh Gmail address and use it to set up a new Search Console account. Once the account is ready, add your website to the tool. This shouldn't take more than five minutes.
After the integration is complete, wait a few days for Search Console to start collecting important data about your website.
There are several crucial kinds of data you can get and analyze using the GSC tool.
Search Analytics:
The 'Search Analytics' section tells us all the keywords our website (or its webpages) is ranking for. Additionally, you will get the number of clicks, the CTR (click-through rate), and the average organic position for each keyword the website has impressions for.
Besides, there are several filters we can use to customize this data as per our requirements: device, country, and date filters are available to tailor the reports.
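CTR in the Search Analytics report is simply clicks divided by impressions, shown as a percentage. A minimal sketch of how you might compute it yourself from exported report rows (the sample data below is hypothetical):

```python
# Hypothetical rows exported from the Search Analytics report.
rows = [
    {"query": "seo course", "clicks": 40, "impressions": 1000, "position": 3.2},
    {"query": "digital marketing training", "clicks": 5, "impressions": 500, "position": 8.1},
]

for row in rows:
    # CTR = clicks / impressions, expressed as a percentage.
    ctr = 100.0 * row["clicks"] / row["impressions"]
    print(f'{row["query"]}: CTR {ctr:.1f}%, avg position {row["position"]}')
```

Sorting such rows by impressions while filtering for low CTR is a quick way to find keywords where a better title or meta description could win more clicks.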
Links to Your Site:
The 'Links to Your Site' section gives us the backlink data of the website. It lists all the backlinks the website has acquired that Google has indexed.
We can use this data to analyze our inbound link profile and take informed 'disavow' decisions.
Manual Actions:
The 'Manual Actions' section shows whether any manual penalty has been taken against the website, for example due to poor content or toxic backlinks.
Mobile Usability:
The 'Mobile Usability' section tells us whether the website is mobile responsive. It also lists the website's various mobile usability errors. Correcting these will improve not only the website's mobile ranking but also its user experience.
Index Status:
The 'Index Status' section shows an index graph along with the number of pages Google has indexed. Though it doesn't show the indexed URLs themselves, we can see those by using a search operator such as site:example.com on Google.
Blocked Resources:
The 'Blocked Resources' section lists all the resources that are blocked from Google, along with their paths. Blocking webpages or resources that Google needs to render your pages is considered bad practice, so such resources should not be blocked from Google or other major search engines. If resources appear in this section, check the website's robots.txt file and unblock them as soon as possible.
Crawl Errors:
The 'Crawl Errors' section may list five types of errors: DNS errors, server connectivity errors, robots.txt fetch errors, 404 errors, and soft 404 errors.
For DNS and server connectivity errors, consult your server or hosting company immediately.
A robots.txt fetch error happens due to a mistyped name, a wrong extension, wrong placement, or the file being missing altogether. The robots.txt file must be uploaded to the website's top-level root folder, the name must be robots, and the file extension must be .txt.
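Putting the requirements above together, a minimal robots.txt might look like the following (the domain and paths are hypothetical examples):

```txt
# Must be reachable at https://example.com/robots.txt (site root)
User-agent: *
# Block only genuinely private areas, if any:
Disallow: /admin/
# Do not block CSS/JS that Google needs to render your pages.

Sitemap: https://example.com/sitemap.xml
```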
404 and soft 404 errors mostly happen when a URL no longer exists. The usual solution is a '301 permanent redirect' to a relevant live page.
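On an Apache server, such a redirect can be set up with a single line in the .htaccess file (the paths here are hypothetical):

```apache
# .htaccess — permanently redirect a removed URL to its replacement
Redirect 301 /old-page.html /new-page.html
```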
Solving these crawl problems will not only help Google's bots crawl and index the website and its pages, but can also improve organic rankings substantially.
Fetch as Google:
The 'Fetch as Google' section should be used whenever there is a big change on the website. If a new page is created or an existing page's content has changed, fetching those URLs with 'Fetch as Google' prompts a quick crawl. Use this feature whenever you want a page to be crawled, and potentially ranked, faster.
Sitemaps:
The 'Sitemap' section gives us the option to add the website's sitemap files to the Google Search Console tool. Google may use sitemap files to discover and understand the site's internal URLs quickly.
Additionally, we can resubmit already added sitemap files to Search Console whenever we create new webpages on the website.
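For reference, a basic XML sitemap follows the sitemaps.org protocol; a minimal example (with hypothetical URLs) looks like this:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>https://example.com/</loc>
    <lastmod>2018-12-01</lastmod>
  </url>
  <url>
    <loc>https://example.com/services/</loc>
  </url>
</urlset>
```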
Preferred Domain:
Additionally, we can set the preferred domain version using the Google Search Console tool. We can select one of three options (the www version, the non-www version, or no preference) as the preferred domain. Ideally, the same choice should also be enforced on the website itself, for example via the .htaccess file.
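On Apache, enforcing the preferred version on the website itself is typically done with a 301 redirect via mod_rewrite. A sketch that redirects the www version to the non-www version (example.com is a placeholder for your own domain):

```apache
# .htaccess — enforce the non-www version with a 301 redirect
RewriteEngine On
RewriteCond %{HTTP_HOST} ^www\.example\.com$ [NC]
RewriteRule ^(.*)$ https://example.com/$1 [R=301,L]
```

Swap the condition and target if you prefer the www version instead.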
The Google Search Console tool offers many other reports and tools that should also be checked, explored, and analyzed periodically to understand the website, its performance, its user experience, and more.
Conclusion: Checking these reports regularly will surface error information that is otherwise hard to find. Correcting these errors leads to better rankings, more organic traffic, a better user experience, and, indeed, more sales.
Therefore, as an internet business owner, it is your job to add your website to the Google Search Console tool, check its data regularly, correct errors with priority, and improve the website's overall organic ranking performance.