The service includes tools that let webmasters:
- Submit and check a sitemap.
- Check the crawl rate, and view statistics about when Googlebot accesses a particular site.
- Write and test a robots.txt file, helping to discover pages that are accidentally blocked by it.
- List internal and external pages that link to the website.
- Get a list of URLs that Googlebot had difficulty crawling, including the error that Googlebot received when accessing each URL.
- See which keyword searches on Google led to the site being listed in the SERPs, along with the total clicks, total impressions, and average click-through rate of those listings. (Previously named 'Search Queries'; rebranded on May 20, 2015 as 'Search Analytics', with extended filtering by device, search type, and date period.)
- Set a preferred domain (e.g. prefer example.com over www.example.com or vice versa), which determines how the site URL is displayed in SERPs.
- Highlight structured data elements for Google Search, which are used to enrich search result entries (released in December 2012 as Google Data Highlighter).
- View site speed reports from the Chrome User Experience Report.
- Receive notifications from Google for manual penalties.
- Access an API to add, change, and delete listings and to list crawl errors.
- View the Rich Cards report, a newer section aimed at improving the mobile user experience.
- Check for security issues with the website, such as hacking or malware attacks.
- Add or remove owners and associates of the web property.
- Review enhancement reports for breadcrumbs and AMP (Accelerated Mobile Pages).
- Analyze on-site structured data (as of July 2020) in the Search Console dashboard, or with one of Google's standalone tools: the Rich Results Test or the Schema Markup Validator.
- See information on how Google crawls, indexes, and serves websites, which can help website owners monitor and optimize Search performance.
- Inspect the index status of individual URLs with the URL Inspection Tool.
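Likewise, the robots.txt item above can be grounded with a minimal example; the paths here are hypothetical. A file like this, placed at the site root, blocks crawlers from the listed directory, and Search Console's tester helps flag any page that such rules block unintentionally.

```text
# robots.txt at https://example.com/robots.txt (hypothetical paths)
User-agent: *
Disallow: /private/

Sitemap: https://example.com/sitemap.xml
```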
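To make the sitemap item above concrete, here is a minimal sketch of generating a sitemap file; the page URLs and the `build_sitemap` helper are hypothetical, and the output follows the sitemaps.org 0.9 schema that Search Console accepts.

```python
import xml.etree.ElementTree as ET

# Namespace defined by the sitemaps.org 0.9 protocol.
SITEMAP_NS = "http://www.sitemaps.org/schemas/sitemap/0.9"


def build_sitemap(urls):
    """Return a minimal sitemaps.org 0.9 XML document listing the given URLs."""
    # Register an empty prefix so elements serialize without a namespace prefix.
    ET.register_namespace("", SITEMAP_NS)
    urlset = ET.Element(f"{{{SITEMAP_NS}}}urlset")
    for loc in urls:
        url = ET.SubElement(urlset, f"{{{SITEMAP_NS}}}url")
        ET.SubElement(url, f"{{{SITEMAP_NS}}}loc").text = loc
    return (
        '<?xml version="1.0" encoding="UTF-8"?>\n'
        + ET.tostring(urlset, encoding="unicode")
    )


sitemap_xml = build_sitemap(["https://example.com/", "https://example.com/about"])
```

The resulting file would typically be uploaded to the site root (for example, as sitemap.xml) and then submitted in Search Console's Sitemaps report.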
Criticism and controversy
The list of inbound links in Google Webmaster Tools is generally much larger than the list of inbound links that can be discovered with a link:example.com search query on Google itself. The list in Google Webmaster Tools includes nofollow links, which do not convey search engine optimization authority to the linked site. On the other hand, Google deems the links returned by a link:example.com query to be "important" links, a controversial distinction. Both Google Webmaster Tools and the Google index appear to routinely ignore link spam.
Once a manual penalty has been removed, Google Webmaster Tools will still display the penalty for another 1–3 days. After the Google Search Console rebrand, analyses have shown that Search Console produces data points that do not reconcile with Google Analytics or ranking data, particularly within the local search market.
Due to a persistent bug in the new Google Search Console, many users cannot upload a valid sitemap ("Couldn't fetch sitemap" or "Sitemap could not be read" errors).