SEMrush, Ahrefs, Moz…there are some great SEO tools out there, and with them you can do all sorts of things. You can check backlink profiles. Do first-rate keyword research. Find unlinked mentions and guest posting opportunities. You can even run comprehensive SEO audits with the click of a button. But whether you’re an agency or in-house, a small business or an enterprise, there’s a certain area of functionality where those tools fall short. And where they fail, Google Search Console prevails.
While important, the garden-variety SEO tool is, or should be, complementary to your SEO strategy. If you’re in the business of optimizing for organic search, you should be living in Search Console and using other tools to help you complete additional tasks. Not totally comfortable with Search Console as a living space? Not to worry. Today, I’m going to teach you how to get nice and cozy with the most pivotal features Search Console has to offer.
Even better: after launching the new and improved Search Console in January, Google officially moved it out of beta last week. So today, while I’m going to be teaching you seven steps to making the most of the new Google Search Console, I’ll also discuss how the new interface and the old one differ.
Alrighty, then! Let’s hop in.
Step #1: Add and Verify Your Site
Before we get into the functionality, you’re going to want to add and verify your site within Search Console. Head to the dropdown at the top left of your dashboard and click “Add property.”
Make sure you enter your site’s URL exactly as it appears in your browser. If you support multiple protocols (http:// and https://) or multiple domains (example.com, m.example.com, and www.example.com), make sure you add each as a separate property. Once your site is added, Search Console will begin collecting data.
Just adding a property won’t give you access, though; you also have to verify that you own it. Head to the Manage Property tab for the property you added on the Search Console home page.
Select “verify property” in the dropdown and choose one of the recommended verification methods. These will vary depending on the makeup of the site you’re verifying. If you’re struggling to implement one of the verification methods, want to change your verification method, or simply want a more in-depth explanation of each process, this page is a nice resource on all things site verification.
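As one example, the meta-tag method has you paste a Google-supplied tag into the head of your homepage. A rough sketch of what that looks like (the content value below is a placeholder, not a real verification token):

```html
<head>
  <!-- Placeholder token: Search Console generates the real value for you -->
  <meta name="google-site-verification" content="YOUR-TOKEN-HERE" />
</head>
```

Once the tag is live, click “Verify” in Search Console and Google will look for it on your homepage.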
Step #2: Indicate a Preferred Domain
Indicating a preferred domain tells Google whether you want your site listed as https://www.example.com or https://example.com. Choosing one over the other isn’t going to give you any kind of advantage in organic search; however, you do want to make sure you choose one or the other.
Select your property from the Search Console home page (note: we’re doing this in the old Search Console). Once in, click the gear icon in the top right of your dashboard and select Site Settings:
In the Preferred Domain section, you’ll see the option to select between the example.com and https://example.com versions of your site. You’ll also see the option “Don’t set a preferred domain.” Select this option, and you leave yourself open to the possibility of Google treating “non-www” and “www” URLs as different URLs. Doing so could really fragment the link equity of those pages and hinder search visibility. By instead selecting one version of your site as “preferred,” you’re telling Google to treat any non-preferred domain it comes across as your preferred domain.
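Setting a preferred domain only tells Google how to treat the variants; many sites also enforce the choice server-side with 301 redirects. As a rough sketch of that normalization logic (hypothetical hosts, assuming the https://www version is preferred):

```python
from urllib.parse import urlsplit, urlunsplit

PREFERRED_HOST = "www.example.com"  # hypothetical preferred domain

def canonicalize(url: str) -> str:
    """Rewrite any variant of the site's host to the preferred domain."""
    parts = urlsplit(url)
    # Treat the bare domain and the "www" variant as the same site.
    if parts.netloc.lower() in {"example.com", "www.example.com"}:
        parts = parts._replace(scheme="https", netloc=PREFERRED_HOST)
    return urlunsplit(parts)

print(canonicalize("http://example.com/pricing"))
# https://www.example.com/pricing
```

In practice this lives in your web server or CDN configuration rather than application code; the point is simply that every variant collapses to one canonical host.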
Step #3: Integrate Search Console with Google Analytics
Analytics gives you traffic and conversion data; Search Console gives you a look at the causal search factors underlying that data. Linking the two gives you a huge boost in reporting.
To link Analytics and Search Console, head to the admin panel at the bottom left of your Analytics dashboard. From there, you’ll want to click into Property Settings at the Property level.
Note: If you don’t have access to Property Settings, as I don’t here, it means you don’t have edit permissions at the Property level. You’ll need to acquire those from another owner if you want to link Search Console yourself.
Next, scroll down to Search Console Settings. You’ll see your website’s URL, which confirms that the website is verified in Search Console and that you have permission to make changes. Under Search Console, select the reporting view in which you want to see data, click Save, and you’re ready to rock.
You’ll now see a Search Console report within the Acquisition tab of your Analytics dashboard.
Using that report, you now have the ability to correlate pre-click data, like impressions and queries, with post-click data, like bounce rate and goal completions. The Landing Pages report houses search data for every URL on your site that’s displayed in the search results. So if there’s a page you recently updated and you’re hoping better rankings will translate to more traffic for that page, or if there’s a page that’s tanking in traffic and you want to find out which specific search metrics are contributing to that tank-age, you can use the Landing Pages report to fully understand those correlations.
How has a change in click-through rate over time affected goal completions? How has average position in the SERP affected sessions or time-on-page? Linking Analytics and Search Console allows you to analyze all of these unique relationships. You can also use the Devices report, the Countries report, and the Queries report to analyze these same metric relationships when broken out by device, country, and search query.
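To make the correlation concrete, here’s a hypothetical sketch in Python with pandas: joining a Search Console export (pre-click) to an Analytics export (post-click) by landing page. All page paths and numbers are invented for illustration:

```python
import pandas as pd

# Hypothetical Search Console export: pre-click metrics per landing page.
search_console = pd.DataFrame({
    "landing_page": ["/pricing", "/blog/seo-tips", "/features"],
    "impressions":  [12000, 8500, 3000],
    "clicks":       [480, 170, 150],
    "avg_position": [4.2, 11.8, 6.5],
})

# Hypothetical Analytics export: post-click metrics for the same pages.
analytics = pd.DataFrame({
    "landing_page":     ["/pricing", "/blog/seo-tips", "/features"],
    "bounce_rate":      [0.38, 0.71, 0.45],
    "goal_completions": [52, 9, 21],
})

# Join the two views on landing page and derive CTR.
report = search_console.merge(analytics, on="landing_page")
report["ctr"] = report["clicks"] / report["impressions"]

# Pages with a healthy position but weak conversions may need
# on-page work rather than more ranking effort.
print(report[["landing_page", "ctr", "goal_completions"]])
```

This is the same pre-click/post-click pairing the linked reports give you in the Analytics UI, just in spreadsheet form.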
Another note: Unfortunately, the Search Console reports only show data as far back as Search Console has been collecting it for your site. Fortunately, while the old Search Console gave you just three months of search data, the new Search Console gives you 16 months. This is null and void if you’re just linking Analytics to Search Console now, but over time, it’s definitely helpful to have that extra data to analyze (think of how often you look back past three months at a page’s historical traffic/conversion numbers).
Step #4: Submit a Sitemap
Not sure if you have a sitemap? Head to example.com/sitemap.xml. If there’s nothing there, you don’t have one.
Naturally, you need to have a sitemap if you want to submit one to Search Console. Some sitemap generation best practices:
- File size: less than 50 MB
- Number of URLs: 50,000 or fewer
- If you have more than 50,000 URLs: generate multiple sitemaps
- Only include canonical URLs. Exclude URLs you’ve disallowed with robots.txt
- From Google: “XML sitemaps should be URLs of all pages on your site.” If you have a large site, you can paraphrase this as “…all valuable pages on your site.” That includes any pages with quality, original content. It excludes “utility pages”: pages that might be useful to a user but aren’t useful as search landing pages.
- Common content management systems (CMS) like WordPress and Drupal have plugins that help you generate sitemaps. Some, like Squarespace, generate and update them automatically.
- If all else fails, this article gives a rundown on how to build a dynamic sitemap. If you have a small site, this tool will build one for you.
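If you end up rolling your own, a minimal generator is only a few lines. This sketch (hypothetical URLs; Python standard library only) follows the practices above by splitting the URL list into files of at most 50,000 entries:

```python
import xml.etree.ElementTree as ET

MAX_URLS_PER_SITEMAP = 50_000  # Google's per-file URL limit

def build_sitemaps(urls):
    """Build one or more sitemap XML trees from a list of canonical URLs."""
    sitemaps = []
    for start in range(0, len(urls), MAX_URLS_PER_SITEMAP):
        urlset = ET.Element(
            "urlset", xmlns="http://www.sitemaps.org/schemas/sitemap/0.9"
        )
        for page_url in urls[start:start + MAX_URLS_PER_SITEMAP]:
            entry = ET.SubElement(urlset, "url")
            ET.SubElement(entry, "loc").text = page_url
        sitemaps.append(ET.ElementTree(urlset))
    return sitemaps

# Hypothetical canonical URLs for a tiny site.
trees = build_sitemaps(["https://example.com/", "https://example.com/pricing"])
trees[0].write("sitemap.xml", encoding="utf-8", xml_declaration=True)
```

A real generator would also filter out robots.txt-disallowed and non-canonical URLs before building the trees, per the checklist above.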
OK, so you have your sitemap! Now, to help Google understand the content your site consists of, you’re going to want to submit it. To do so, head to the Sitemaps tab in the new Search Console:
Enter your sitemap URL, click Submit, and bam! You’re in business.
Step #5: Leverage the Index Coverage Status Report to Fix Site Errors
The old Search Console housed the Index Status report in the Google Index tab; the new Search Console displays it right in the dashboard, where you can’t miss it.
They’ve also updated the name to the Index Coverage Status report. It looks like this:
Per Google, the new report provides all the same information as the old one, plus detailed crawl status information from the Index. What kind of glorious insights can you glean from this new (but ostensibly the same) report? Let’s run through each of the tabs.
1. Errors: Runs through all potential site errors so you can go through and make fixes. These could include server errors, robots.txt errors, redirect errors, 404s, and a variety of others.
2. Warnings: A warning means a page is indexed but blocked by robots.txt. If you want to block pages from the index, Google prefers the “noindex” tag over robots.txt. Pages blocked via robots.txt can still show up in the index if other pages link to them. These warnings give you the opportunity to go through and correctly de-index those pages.
3. Valid Pages: All of the pages that are in the index. If you see the “Indexed, not submitted in sitemap” status, you should make sure you add those URLs to your sitemap. “Indexed; consider marking as canonical” means a page has duplicate URLs, and you should mark one as canonical.
4. Excluded Pages: These are pages that have been blocked from the index by a “noindex” directive, a page removal tool, robots.txt, a crawl anomaly, duplicate content, etc.
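Since Google prefers the “noindex” tag over robots.txt for keeping pages out of the index, it can help to audit which of your pages actually carry that tag. A minimal sketch using Python’s standard-library HTML parser (the sample page here is hypothetical):

```python
from html.parser import HTMLParser

class NoindexDetector(HTMLParser):
    """Flags pages carrying a <meta name="robots" content="noindex"> tag."""

    def __init__(self):
        super().__init__()
        self.noindex = False

    def handle_starttag(self, tag, attrs):
        if tag != "meta":
            return
        attrs = dict(attrs)
        if (attrs.get("name", "").lower() == "robots"
                and "noindex" in attrs.get("content", "").lower()):
            self.noindex = True

# Hypothetical page source fetched from your site.
page = '<html><head><meta name="robots" content="noindex, follow"></head></html>'
detector = NoindexDetector()
detector.feed(page)
print(detector.noindex)  # True
```

Running this over the URLs listed under Warnings tells you which pages still rely on robots.txt alone.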
Google gives great insight into what each and every one of these statuses means and how you should go about fixing them. We don’t have room to get into all of them here, but generally speaking, you can get the 411 on each URL by clicking on the tab you want to investigate, then clicking on the description that populates the Details section of the report:
Click the URL in the Examples tab:
That will open up this nice panel, which gives you a few different ways to inspect the issue:
Here is what you can do with each of these functions:
- Inspect URL: Look at the referring pages, the last crawl time, whether or not crawling is allowed, whether or not indexing is allowed, whether or not you’ve declared the page as canonical, and whether or not Google views the page as canonical.
- Test robots.txt blocking: Head to your site’s robots.txt file (example.com/robots.txt) and you can see everything on your site that’s blocked from the Index. Naturally, you’re not always going to remember which elements appear on which pages. The robots.txt test highlights the parts of your page that may or may not be triggering robots.txt blocking.
- Fetch as Google: Allows you to see your page exactly as Google sees it. Googlebot heads immediately to your page and shows you the downloaded HTTP response it reads. Use “Fetch and Render” and you can also see the physical layout of your page as Google sees it.
- View as Search Result: Allows you to see what your page looks like in the Index.
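You can approximate the robots.txt blocking test locally, too: Python’s standard library ships a robots.txt parser. Here’s a sketch against a made-up rules file for a hypothetical site:

```python
from urllib.robotparser import RobotFileParser

# Hypothetical robots.txt contents for example.com.
rules = """
User-agent: *
Disallow: /admin/
Disallow: /cart
"""

parser = RobotFileParser()
parser.parse(rules.splitlines())

# Check whether specific URLs would be blocked for Googlebot.
print(parser.can_fetch("Googlebot", "https://example.com/admin/settings"))  # False
print(parser.can_fetch("Googlebot", "https://example.com/pricing"))         # True
```

This is handy for bulk-checking a URL list before you dig into Search Console’s per-URL panel.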
These functions are available for all of the URLs Search Console picks up in the Index, so you can use them to sort through the statuses associated with your warnings, errors, valid pages, and excluded pages. Make sure to validate your fixes so Google puts a rush on re-crawling the affected pages:
That’s it in a nutshell! The Index Coverage report can be used to detect and remedy every error associated with your site.
Step #6: Leverage the Performance Report to Update Content
Basically, all the metrics you see in Analytics when you link your account to Search Console come from the Performance report. The Performance report replaces the “Search Analytics” report in the old Search Console; like the Index Coverage report, there isn’t too much of a difference between the old report and the new one. But you can still do some pretty sweet stuff with it. Let’s take a look.
First, open the Performance Report. It is the first table you see in your overview.
You’re not limited to tracking these metrics in Analytics; you can use them to look for opportunities to improve performance. The best way to do this is to use the filter function:
Use the tabs to investigate your pre-click metrics at a page, country, query, or device level. You might want to see, for instance, which of your pages sit on page one of the search results but have a lower-than-site-average CTR over the past six months:
Or maybe you want to find queries for which you rank outside the top 10 but still get a solid amount of impressions, so you can then go back and optimize the corresponding pages in an attempt to gain rank:
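If you export the Queries table, that “outside the top 10 but plenty of impressions” filter is a one-liner. A hypothetical sketch with pandas (all queries and numbers invented, and the 1,000-impression threshold is an arbitrary choice):

```python
import pandas as pd

# Hypothetical Performance report export.
queries = pd.DataFrame({
    "query":       ["ppc tools", "seo audit", "ad copy tips", "cro checklist"],
    "impressions": [9400, 5200, 310, 4100],
    "clicks":      [260, 410, 12, 30],
    "position":    [12.3, 3.1, 28.0, 14.6],
})

# Queries ranked outside the top 10 that still earn solid impressions:
# prime candidates for refreshing the corresponding pages.
opportunities = queries[(queries["position"] > 10) & (queries["impressions"] > 1000)]
print(opportunities["query"].tolist())  # ['ppc tools', 'cro checklist']
```

The same pattern works for the page-one/low-CTR filter: swap the conditions for `position <= 10` and a CTR column compared against your site average.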
Most third-party SEO tools have similar functions by which you can go in and look for keyword opportunities, but it’s nice getting data straight from Google!
Step #7: Use the Links Report to Boost Specific Pages
The Links report is located at the bottom of your dashboard:
There are a few handy things you can do with it. Here are my two favourites:
1. Boost specific pages using your most linked-to pages.
The most linked-to content on your site is where the most link equity lies. Linking internally from those equitable pages to the pages you want to boost is a great way to increase rank. To find out where the most link juice resides on your site, click into either of the top linked pages sections of the Links report:
The external links section allows you to sort by number of referring domains, which is a big ranking factor for Google, so those pages are going to have a lot of inherent equity. Find pages that are on the cusp of driving serious business value, link to them (naturally) from these high-equity pages, and track the results!
2. Disavow links from spammy sites.
Head to “Top linking sites” in your Links report overview. Expand the listing and you can see all the domains linking to your site. Add any low-quality or spammy domains to the file you’ll upload to Google’s disavow links tool. Note: Per Google, you should only disavow links if you’re confident they’re doing harm to your site. Disavowing links that are boosting performance will harm your site. Still, it’s worth investigating whether or not there are domains pointing to your site that you should be concerned about.
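For reference, the disavow file is plain text: one URL or one domain (prefixed with “domain:”) per line, with “#” marking comments. A made-up example, with placeholder domains:

```
# Spammy directories found in the Top linking sites report
domain:spammy-directory.example
domain:link-farm.example
# Disavow a single page rather than a whole domain
http://low-quality-blog.example/widget-roundup.html
```

Save it as a .txt file and upload it through the disavow links tool for the matching property.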