Search engines use your site structure to find, crawl, and rank content on your website. Here’s how you can build an SEO-friendly website structure that Google (and your users) love.
Marketing | Nov 30
Jessica Greene on July 29, 2019 (last modified on July 31, 2019) • 25 minute read
Technical SEO can sound intimidating. After all, "technical" is right there in the name, and it's an adjective that many marketers would never use to describe themselves.
But technical SEO sounds more complex than it really is. In fact, with a little education on the topic, you may even be willing to dive in and conduct a technical SEO audit of your own website.
You may not be able to fix every issue you find (some will certainly require help from your development team), but conducting the audit will at least let you bring big problems to the attention of the people who can fix them.
To find out what you need to look for when conducting a technical SEO audit, we asked 54 marketers to share the technical SEO best practices they follow on their own—and their clients’—websites.
The result: a list of 17 questions you can ask—and answer with the right tools—to conduct a DIY technical SEO audit of your website.
Technical SEO Audit Checklist:
Editor’s note: Want an easy way to track technical SEO issues alongside other key SEO metrics like traffic and rankings? SEMrush users can grab the free SEMrush Keywords and Audits dashboard below to keep an eye out for new technical errors and warnings while monitoring the stats you check regularly.
SEO is a broad practice with three main facets: on-page SEO, off-page SEO, and technical SEO.
Technical SEO is important because if search engines can't crawl your content, they can't index it, and it won't appear in search results. And if your content can't be crawled and indexed, no amount of on-page or off-page SEO will move the needle.
But it's actually rare for technical SEO issues to make a website completely uncrawlable. More often, they lead to less efficient crawling, which can slow down how quickly your pages get indexed or leave only a portion of your pages indexed.
On-page, off-page, and technical SEO all work together to create a comprehensive strategy for optimizing your site for search. Unless you’re already focused on all three facets, there’s probably more you can do to improve your site’s SEO.
For many marketers, technical SEO is the missing piece in their SEO strategies. In fact, 55.6% of our respondents said that marketers tend to place too little importance on technical SEO:
If technical SEO is the missing piece of your SEO strategy, you can easily fix it by asking the right questions about your site—and answering those questions using the right tools.
Here are 17 questions to ask when conducting a technical SEO audit to identify issues that are impacting the crawlability and indexability of your website.
“When search engines scan websites to discover pages, they use crawlers (or bots),” says Holly Ozanne of Venta Marketing. “Even the smallest problems with crawlability can result in your site losing rankings.”
Because of this, Borislav Ivanov of Best Response Media says that “the most important technical problem that needs to be solved is crawlability errors.”
To identify crawl errors, Best Company’s McCall Robison says to “review the ‘Coverage’ report in Google Search Console.”
“This report shows you issues found in the last month that are either threatening to remove a page of your website from the index—or webpages that aren’t being indexed at all that most likely should be. These errors are also prioritized by importance,” Robison says.
“A technical SEO aspect that many people overlook is ensuring that your pages are indexed properly,” says Dan Christensen of Morningdove Marketing.
Quentin Aisbett of OnQ Marketing agrees: “Identify how many pages of your site are indexed. The results will very quickly indicate if there are serious technical issues you need to investigate further.”
“For example, if the site is not indexing the number of URLs you’d expect, then you’re likely blocking crawlers. If there are more URLs than expected, then there are likely duplicate content issues.”
Aisbett says that an easy way to identify the number of pages of your site that have been indexed is to enter “site:yoursite.com” into Google.
Then, compare that to the number of pages that should be indexed and look for discrepancies.
If you have fewer pages indexed than you believe should be indexed, it’s possible that you have a problem with your robots.txt file.
“The robots.txt file is one of the first things Google looks for when crawling a website,” says Alex Membrillo of Cardinal SEO Company. “It’s a file located in the root directory of a website.”
You can view your robots.txt file by navigating to the following URL: yoursite.com/robots.txt.
“Robots.txt tells search engines and other crawlers what they can crawl and index,” says Derek Hales of Modern Castle. “The risk and challenge of the robots.txt file is that you can accidentally de-index large portions of your site.”
“During my agency days, a Fortune 50 client mistakenly de-indexed their entire domain with mismanagement of their robots.txt file. It took a few days for someone to notice, but by that point, the damage had been done. It cost the company millions of dollars in lost revenue.”
“The simplest way to solve this problem is to actively monitor the robots.txt file. You can manually check it daily or weekly, or you can use a service like the Merj Robots.txt Checker Tool. This service monitors changes to your robots file that might impact usability,” Hales says.
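If you want to sanity-check your own rules, here's a minimal sketch using Python's standard-library `urllib.robotparser`. The robots.txt contents and paths below are hypothetical; substitute your own file and the URLs you care about.

```python
from urllib.robotparser import RobotFileParser

# Hypothetical robots.txt contents -- replace with your own site's file.
ROBOTS_TXT = """\
User-agent: *
Disallow: /admin/
Disallow: /cart/
"""

def is_crawlable(url_path, robots_lines, agent="Googlebot"):
    """Return True if the given path is crawlable under these robots.txt rules."""
    parser = RobotFileParser()
    parser.parse(robots_lines)
    return parser.can_fetch(agent, url_path)

lines = ROBOTS_TXT.splitlines()
print(is_crawlable("/blog/technical-seo/", lines))  # important pages should be allowed
print(is_crawlable("/admin/settings", lines))       # private sections should be blocked
```

Running a check like this against the pages you expect to rank is a quick way to catch an accidental `Disallow: /` before Google does.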
Another common cause of a discrepancy between the number of pages indexed and the number that should be indexed is duplicate content. If Google indexes multiple versions of the same content, it can inflate the number of indexed pages on your site.
The solution to this issue: canonical tags.
“A canonical tag is used to help search engines identify the original—or ‘master’—version of a page,” says Andrew Becks of 301 Digital Media. “Canonical tags are especially helpful in instances where content is repurposed in multiple places or syndicated to other sites.”
WordPress sites can be particularly problematic when it comes to duplicate content as the system often creates multiple URLs for a page when it’s assigned to multiple categories—or in the process of creating your content archives.
For that reason, Alistair Dodds of Ever Increasing Circles recommends adding a canonical URL to every page of your site. “Ensure it is self-referencing unless you have copied the work from another page on your own or another site. This ensures that Google always knows which pages should be indexed.”
If you use WordPress, Dodds recommends using Yoast SEO. With the free version of the plugin, you can add in canonical URLs manually. With the premium version, Yoast adds canonical tags to all of your pages automatically.
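If you want to verify canonical tags at a small scale without a plugin, a sketch like the following uses Python's standard-library `html.parser` to pull the canonical URL out of a page's markup. The example HTML is hypothetical; in practice you'd feed in each fetched page's source.

```python
from html.parser import HTMLParser

class CanonicalFinder(HTMLParser):
    """Collects the href of any <link rel="canonical"> tag in a page."""
    def __init__(self):
        super().__init__()
        self.canonical = None

    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        if tag == "link" and attrs.get("rel") == "canonical":
            self.canonical = attrs.get("href")

# Hypothetical page source -- replace with the HTML of a page you're auditing.
html = '<html><head><link rel="canonical" href="https://example.com/post/"></head></html>'
finder = CanonicalFinder()
finder.feed(html)
print(finder.canonical)  # the URL search engines should treat as the master version
```

A page whose canonical doesn't point to itself (and isn't intentionally syndicated content) is worth a closer look.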
A good way to make sure that all of the pages you want to have indexed are discoverable to search crawlers is to create a sitemap and submit it to Google Search Console.
"Create an XML sitemap and submit it to Google Search Console to help Google crawl your site and index your pages more easily," says Matt Tudge of WDA Branding Derby. "If your site is built in WordPress, there are many simple plugins that will create a sitemap for you automatically."
Yoast SEO—the same plugin mentioned above—is one WordPress plugin that creates sitemaps automatically.
“Alternatively, you can use a free XML sitemap generator and send it to your web hosts for implementation,” Tudge says.
To see if you have a sitemap already—or to submit your sitemap to Google—go to “Sitemaps” in Google Search Console. If you have a sitemap that’s already been submitted, you’ll see it displayed on the page. Otherwise, you can enter your sitemap’s URL to submit it.
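A sitemap is just an XML file listing your URLs. For illustration, here's a minimal sketch that builds one with Python's standard-library `xml.etree`; the URLs are hypothetical placeholders (real sitemaps can also include optional fields like `lastmod`).

```python
import xml.etree.ElementTree as ET

def build_sitemap(urls):
    """Build a minimal XML sitemap string from a list of page URLs."""
    ns = "http://www.sitemaps.org/schemas/sitemap/0.9"
    urlset = ET.Element("urlset", xmlns=ns)
    for url in urls:
        entry = ET.SubElement(urlset, "url")
        ET.SubElement(entry, "loc").text = url  # each page gets a <url><loc> entry
    return ET.tostring(urlset, encoding="unicode")

sitemap = build_sitemap([
    "https://example.com/",
    "https://example.com/pricing/",
])
print(sitemap)
```

Whether you generate the file by hand, with a script like this, or with a plugin, the end result is the same: one URL you can paste into Search Console's "Sitemaps" report.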
Another way to improve the crawlability of your website, as Loud Digital’s Daniel Young recommends, is to “analyze your site structure like search engine bots. Does the hierarchy make logical sense in terms of navigation?”
A logical site structure usually looks something like this:
Most sites have multiple levels of content. The homepage is level one at the top. Primary landing pages (like product/pricing pages and blog landing pages) are level two. Sub-pages of those pages (like individual blog posts) are level three.
The number of levels of your site also determines how many clicks it takes to get to those pages. And knowing how many clicks it takes to get to a page is important for technical SEO, as Revenue River’s Juliette Tholey explains:
“Google may discount content that’s more than four clicks away from the homepage, so my advice is to optimize the crawl depth of the website so crawlers are able to find all of your pages.”
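Crawl depth is just shortest-path distance from the homepage over your internal links, so you can estimate it with a breadth-first search. The site graph below is a hypothetical example; in practice you'd build it from a crawler's export.

```python
from collections import deque

def click_depths(links, home="/"):
    """BFS over an internal-link graph; returns each page's click depth from home."""
    depths = {home: 0}
    queue = deque([home])
    while queue:
        page = queue.popleft()
        for target in links.get(page, []):
            if target not in depths:  # first time we reach a page = shortest path
                depths[target] = depths[page] + 1
                queue.append(target)
    return depths

# Hypothetical site graph: each page maps to the pages it links to.
site = {
    "/": ["/pricing/", "/blog/"],
    "/blog/": ["/blog/post-1/", "/blog/post-2/"],
    "/blog/post-2/": ["/blog/deep-page/"],
}
depths = click_depths(site)
deep = [page for page, d in depths.items() if d > 3]
print(depths)  # e.g. "/blog/post-1/" is 2 clicks from the homepage
print(deep)    # pages more than 3 clicks deep -- candidates for better internal linking
```

Pages that show up in the "too deep" list are the ones to surface with internal links or navigation changes.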
9Sail’s Kyle Kasharian says that having a logical site structure also “makes it easy for search crawlers to understand how your site is organized.”
Of course, you may not have the ability to redo your site’s structure and main navigation, but there are still steps you can take to ensure your content is crawlable and not too many clicks deep.
Nate Masterson of Maple Holistics recommends "using internal links, calls-to-action, and landing pages to enable users to reach any page with the fewest number of clicks. This makes your content more accessible to crawlers and improves the user experience."
Masterson also recommends “avoiding pagination as much as possible. If your site has a lot of content, then pagination may seem like a necessary evil, but it can have a huge impact on your technical SEO efforts.”
And COFORGE’s Eric Melillo recommends using a “content cluster model where you link out to semantically related posts. This shows Google your content relationships.”
Editor’s note: Want an easy way to see how many clicks it takes to get to different pages on your site? SEMrush users can grab the free SEMrush Site Audit dashboard below to see exactly how many pages of your site are 1, 2, 3, and 4+ clicks deep.
“You must implement SSL on your website,” says Danny Peavey of One Week Website. “It’s so simple, yet it’s one of the most overlooked technical SEO points we have come across in our industry.”
How do you know if your site has an SSL certificate? SSL sites begin with HTTPS rather than HTTP:
You can get a free SSL certificate from a service like Let’s Encrypt. However, if you’re installing a new SSL certificate on a site that’s been indexed without one, it’s important to make sure that every page on your site that formerly had an HTTP URL is 301 redirected to its HTTPS counterpart.
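As a sketch of what that redirect looks like in practice, here's a hypothetical `.htaccess` fragment, assuming an Apache server with `mod_rewrite` enabled (Nginx and other servers use different syntax, and many hosts offer a one-click "force HTTPS" setting instead):

```apache
# Hypothetical .htaccess fragment (assumes Apache with mod_rewrite).
# Permanently redirect every HTTP request to its HTTPS counterpart.
RewriteEngine On
RewriteCond %{HTTPS} off
RewriteRule ^(.*)$ https://%{HTTP_HOST}/$1 [L,R=301]
```

Because the rule captures the full request path, every old HTTP URL maps 1:1 to its HTTPS version rather than funneling to the homepage.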
One issue that’s commonly caused by an HTTP to HTTPS migration is redirect chains (though other circumstances can lead to redirect chains, too).
“HTTP to HTTPS redirects—or WWW to non-WWW redirects—are often overlooked by developers and marketers,” says Darko Brzica of Walk Jog Run.
A redirect chain occurs when a redirected page points to another (or more than one) redirected page.
For example, you wrote a new post to replace an old one, so you redirected the old post to the new post. But then you installed an SSL certificate on your site, so you redirected the new post to an HTTPS URL. The old post now redirects to the HTTP version of the page, which redirects to the HTTPS version of the page.
“I’ve had cases where I improved rankings significantly simply by resolving chain 301 redirects,” Brzica says. “Therefore, the first thing you should check on a website is if the redirections are in place and if they are in order.”
“To resolve redirect chains, remove the redirected URL from your internal links and link directly to the live URL,” says Portent’s Kyle Freeman. “Most SEO crawler tools should help you identify redirect chains on your site.”
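The fix itself is mechanical: follow each chain to its final destination and point the original URL straight there. Here's a minimal sketch with a hypothetical redirect map; a real one would come from your server config or redirect plugin.

```python
def resolve_chains(redirects):
    """Flatten a {old_url: new_url} redirect map so every entry points
    straight at its final destination, skipping intermediate hops."""
    def final(url):
        seen = set()
        while url in redirects and url not in seen:  # seen guards against loops
            seen.add(url)
            url = redirects[url]
        return url

    return {old: final(new) for old, new in redirects.items()}

# Hypothetical chain: old post -> new post (HTTP) -> new post (HTTPS)
redirects = {
    "http://example.com/old-post/": "http://example.com/new-post/",
    "http://example.com/new-post/": "https://example.com/new-post/",
}
flat = resolve_chains(redirects)
print(flat["http://example.com/old-post/"])  # now one hop instead of two
```

After flattening, the old post 301s directly to the HTTPS version of the new post instead of bouncing through the HTTP URL.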
One tool that’s helpful for finding redirect chains (and other technical SEO issues) is Ahrefs’ Site Audit feature.
"A 404 is a common response code that's returned when a page can't be found on a site," says Ben Johnston of Sagefrog Marketing Group. "404 errors are not only bad for search engines, but they also break the user experience and erode users' trust while they're browsing your site."
Laura Duncan of Page 1 Solutions agrees: “There’s nothing more frustrating for a user than to arrive on a 404 page. A 404 page is essentially a dead-end for a user’s journey on your site.”
“When your website has 404 errors, it means that you have pages—and/or are linking to pages—that do not exist, which does not make for the best user experience and can negatively affect your search rankings,” says Fisher Unitech’s Jackie Tihanyi.
“To find 404 errors on your website, you can use a variety of website auditing tools such as Alexa, SEMrush, and Google Search Console. These tools will crawl your entire website and pull any URLs that are producing a 404 error,” Tihanyi says.
SyncShow’s Jasz Joseph recommends Screaming Frog for finding 404 errors but also says you should be proactive in avoiding them: “You can do this by ensuring that every time a page is sunsetted or archived, the URL is redirected.”
However, Terakeet’s Jonas Sickler cautions against redirecting everything just because a page no longer exists: “Redirects can cause serious issues if you use them improperly. Google states that redirects should be 1:1, meaning the content you combine should be very similar.”
“If you combine two very dissimilar pages, you risk changing how Google perceives the new page. The more keywords a page ranks for, the more impact it could have. And when you redirect URLs to your homepage, things get even worse.”
“One client’s homepage dropped from #1 to #5 for their most important keyword. After some digging, we realized they had redirected several high-traffic blog posts that ranked for several thousand keywords to their homepage.”
“We republished the content and removed the 301s, and within one day their homepage ranked #1 for that keyword again,” Sickler says.
Of course, sometimes a page just doesn’t exist anymore, and there’s no logical page to replace it with. And that’s not a bad thing. As long as you don’t have any internal links that point to that page, a 404 error is a perfectly reasonable response, and eventually, Google will remove that page from its index.
If you need to create 301 redirects on a WordPress site, you can use a plugin like Redirection. However, your developers may already have a redirect file, in which case you should be able to just send your redirect requests to your development team to have them implemented.
Broken images are similar to 404 errors: instead of links pointing to pages that no longer exist, it's content pointing to images that no longer exist.
“I’d say one important technical SEO tip I’d give is to periodically fix broken images,” says Chris Hornak of Blog Hands. “If Google crawls a page with obvious user experience errors like broken images, it’s likely to decrease the visibility of that page over pages that are functioning properly.”
“I’ve seen time and time again content drastically lose its search position, only to find out one of the links or images were broken. It’s an easy fix that can help you maintain higher search visibility.”
Screaming Frog is a great tool for finding broken images on your site.
Image alt text is one of those areas where on-page SEO and technical SEO overlap. But while adding alt text to images is something that should be done each time you add a new page/blog post to your site, it’s often overlooked. And when it’s overlooked, you can find those issues during a technical SEO audit.
“Alt text is an accessibility feature that allows screen reader software to describe images to vision-impaired users,” says John Donnachie of ClydeBank Media. “And search engines favor sites with high accessibility.”
"Many webmasters think that because visitors to their site can't see this text, it doesn't matter," says Max Robinson of Streaming Movies Right. "But keep in mind that Google is reading your site too, and it will analyze the alt text of your images to determine what your pages are about."
Again here, Screaming Frog is a great tool for finding all of the images on your site that don’t have alt text.
“Website speed is a very important technical SEO factor, and not enough marketers or webmasters focus on it,” says Jonathan Aufray of Growth Hackers.
"Load speed impacts the user experience. People don't want to wait 10 seconds for your website to load; they expect it to load in less than two. So search engines give better rankings to fast websites than to slow ones," Aufray says.
“Site speed should be one of the first things you address when trying to improve your organic traffic through search engines,” says 9Sail’s Bryan Pattman. “If your site struggles to serve users the content they want to see, your bounce rate is going to skyrocket.”
Our respondents recommended a number of tools for measuring the load speed of your website and its pages.
Gerard Westwood of e4k Digital Agency recommends using Google PageSpeed Insights: “It will provide suggestions for how to reduce your loading time, but if you’re not a developer, you may need help with this.”
If you do need help implementing the recommendations, consider doing what Alex Cascio of Vibrant Media Productions did: “We hired a company specializing in WordPress speed-ups (such as browser caching, CDN, compression, etc.), and it has paid off tenfold!”
And while measuring the overall load speed of your site is important, John Locke of Lockedown Design & SEO says to also “be sure to consider the time to first byte (TTFB).”
“While most SEOs obsess over getting a 100 score in PageSpeed Insights, that doesn’t always mean that the overall page loads faster than the competition.”
“Having a high score in PageSpeed Insights also doesn’t always correlate with better rankings. There are often sites with mediocre scores in GTmetrix and PageSpeed Insights that rank quite well. What seems to make a difference for competitive keyword queries is having a lower TTFB than your competitors.”
“You can find the TTFB by opening up Dev Tools in Chrome (right-click on your webpage and select ‘Inspect’), going to the ‘Network’ tab, and reloading the page. Click on the first asset loaded—the domain name—and then select the ‘Timing’ tab to see stats for TTFB and some other DNS speed stats.”
“TTFB is usually much higher on cheaper shared hosting and lower on managed hosting or dedicated hosting,” Locke says.
Our respondents also offered several tips for how to decrease the load time of your pages:
"Now that mobile searches make up the majority of search engine queries, website performance has become a top priority," says Tommy Landry of Return On Now. "Because of that, website performance and mobile-friendliness are intertwined."
“If I can only give one tip to follow for technical SEO, it’s this: optimize your page for mobile,” says Samantha Kohn of Mobials. “Not only will a mobile-optimized site make your brand more attractive and accessible to readers who are browsing on a mobile device, but Google will penalize your site for not being mobile-friendly.”
There are several ways to test the mobile-friendliness of your site:
And our respondents also offered a few tips on optimizing your site for mobile:
Another option that helps both with mobile-friendliness and load speeds is Accelerated Mobile Pages (AMP).
“AMP is an open-source initiative that aims to improve the web browsing experience by decreasing page load times,” says Venkatesh C R of Dot Com Infoway. “AMP Plugins for WordPress sites have improved recently: they’re easily added and are good enough to convert an entire website to AMP.”
“Mobile-first indexing is here and mobile-first SERPs will naturally follow,” says Tanya Wigmore of CRO:NYX Digital. “If you are not using AMP markup on your site, you are almost definitely missing out on potential rankings and traffic.”
“Structured data markup is a critical technical SEO tactic in 2019,” says Samuel Schmitt. “More and more, Google is adding rich snippets to its results pages.”
So what are rich snippets? Here are a few examples:
This is a recipe rich snippet (it features an image next to the meta description) with a rating rich snippet (an aggregate rating of recipe reviews).
This is an event rich snippet that lists upcoming events below the meta description.
This is a product rich snippet that features price above the meta description.
“They say you only have one chance to make a first impression, and structured data is the key to nailing that first impression in organic search,” says Matt Desilet of Lola.com.
“In terms of coding, structured data is on the easier side to create,” says Miva’s Luke Wester.
You can generate structured data for your site using Google’s Structured Data Markup Helper.
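Structured data is usually embedded as JSON-LD in a `<script>` tag in the page head. As a sketch, here's a hypothetical schema.org `Product` markup (the product name, price, and rating values are made up) built with Python's standard-library `json` module:

```python
import json

# Hypothetical product data using schema.org "Product" with an aggregate rating.
product_jsonld = {
    "@context": "https://schema.org",
    "@type": "Product",
    "name": "Example Running Shoe",
    "offers": {"@type": "Offer", "price": "89.99", "priceCurrency": "USD"},
    "aggregateRating": {
        "@type": "AggregateRating",
        "ratingValue": "4.6",
        "reviewCount": "213",
    },
}

# Embed the JSON-LD in a script tag of type application/ld+json in the page head.
snippet = '<script type="application/ld+json">{}</script>'.format(
    json.dumps(product_jsonld, indent=2)
)
print(snippet)
```

Markup like this is what enables the price and star-rating rich snippets shown in the examples above; Google's Rich Results Test will tell you whether your implementation is valid.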
But Desilet also recommends “making sure that your implementation doesn’t fight other implementations that you have on your site.”
“Using structured data means your organic result will contain more visual data and will attract more clicks,” says Julien Raby of Pet Approves. “Just look at this image:”
“The second listing looks way more appealing to me with the average star reviews and the price indicator. So even though it’s ranking in position two, it might get more clicks than the number one result,” Raby says.
“Your H1 header is very important,” says Celeste Huffman of Rocket Web. “Many developers sometimes forget to add the tag.”
Most of the time, your CMS will create your H1 for you automatically. For example, the main title of a blog post in WordPress is automatically coded as an H1. But H1 tags sometimes get neglected on landing pages that don't have a specific title.
To check whether a page has an H1 tag, right-click on the page, select "View Page Source," and open the find function with Command + F on a Mac or Control + F on Windows. Search for "&lt;h1" in the find bar (searching for "h1" alone can match other markup). If it finds zero results, your page is missing an H1 tag.
Or to find missing H1 tags at scale, you can use a tool like Screaming Frog.
It’s also important to note that unlike other heading tags (H2, H3, etc.), each page should have only one H1 tag.
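The "exactly one H1" rule is easy to automate. Here's a minimal sketch using Python's standard-library `html.parser`; the page snippets are hypothetical, and in practice you'd run this over every page a crawler finds.

```python
from html.parser import HTMLParser

class H1Counter(HTMLParser):
    """Counts <h1> tags so you can flag pages with zero or more than one."""
    def __init__(self):
        super().__init__()
        self.h1_count = 0

    def handle_starttag(self, tag, attrs):
        if tag == "h1":
            self.h1_count += 1

def audit_h1(html):
    """Return the number of H1 tags in a page's HTML; the target is exactly 1."""
    counter = H1Counter()
    counter.feed(html)
    return counter.h1_count

print(audit_h1("<h1>Title</h1><h2>Subheading</h2>"))  # exactly one H1: good
print(audit_h1("<h2>Landing page hero</h2>"))         # zero H1s: flag this page
```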
“It’s important to make sure that there are no duplicate meta tags such as meta descriptions or meta titles on a website,” says William Taylor of MintResume. “If there are lots of duplicate tags, it can send a low-quality signal to search engines.”
“A great solution to this is using software such as Screaming Frog to quickly crawl a website and find all pages with duplicate tags. This has saved me and my team hours of work compared to manually scouring the HTML of every page on a website.”
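If you already have a crawl export mapping URLs to their meta titles, finding duplicates is a simple grouping exercise. The URLs and titles below are hypothetical stand-ins for a crawler's output.

```python
from collections import defaultdict

def find_duplicate_titles(pages):
    """Given {url: meta_title}, return titles shared by more than one URL."""
    by_title = defaultdict(list)
    for url, title in pages.items():
        by_title[title].append(url)
    return {title: urls for title, urls in by_title.items() if len(urls) > 1}

# Hypothetical crawl output mapping URLs to their <title> text.
pages = {
    "/pricing/": "Pricing | Example Co",
    "/plans/": "Pricing | Example Co",
    "/blog/": "Blog | Example Co",
}
print(find_duplicate_titles(pages))  # groups of URLs that share one title
```

The same grouping works for meta descriptions; each group in the output is a set of pages that needs unique tags.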
“URL length has a surprising impact on search,” says Kenneth Burke of Text Request. “I’ve seen noticeable increases in search traffic whenever URLs are simpler and more succinctly convey what the page is supposed to be about.”
Crediful’s Chane Steiner agrees: “If your URLs have long, confusing parameters or IDs, then you are potentially harming your SEO. My advice? Change the URLs to reflect a simple primary keyword and leave it at that.”
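As a sketch of that cleanup, here's a hypothetical helper using Python's standard-library `urllib.parse` that swaps a parameter-laden URL for a short keyword slug (remember that changing live URLs also requires 301 redirects from the old versions):

```python
import re
from urllib.parse import urlsplit, urlunsplit

def simplify_url(url, slug):
    """Replace a parameter-laden URL path with a short keyword slug,
    dropping query strings and fragments."""
    parts = urlsplit(url)
    # Lowercase the keyword and collapse anything non-alphanumeric into hyphens.
    clean_slug = re.sub(r"[^a-z0-9]+", "-", slug.lower()).strip("-")
    return urlunsplit((parts.scheme, parts.netloc, "/" + clean_slug + "/", "", ""))

print(simplify_url(
    "https://example.com/index.php?id=3842&cat=27",
    "Technical SEO Audit",
))  # a short, keyword-based URL instead of opaque IDs
```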
If you do nothing else for technical SEO, you should at the very least follow this advice from The Blogsmith’s Maddy Osman: “Integrate free Google services—Google Analytics and Google Search Console—with your website.”
“Both can provide valuable feedback regarding website issues, both from a technical perspective and the perspective of actual users. Google Search Console can literally tell you if you have technical issues that need to be fixed, such as if you have broken links on your live site,” Osman says.
And Smartlook’s Nikola Kožuljević says, “Don’t be afraid to jump into technical tweaking, but complement that by making good backups. For starters, HTML or .htaccess tweaks may seem complex and dauntingly difficult. But they’re really not.”
“I’d suggest making backups and jumping into the files to tweak them accordingly,” Kožuljević says. “Repeating these processes over multiple iterations—and with ever-increasing complexity—will do wonders for your site’s SEO in the long run.”