Stop overcomplicating SEO

Marketing · Updated May 18, 2022 · Published Mar 2, 2016 · 7 minute read


By Peter Caputa


Search engine optimization has long been considered an area of digital marketing that requires deep expertise, especially on the technical side. The typical marketer’s thought process goes something like this:

“Ok. I really need to optimize organic search rankings. It’s time to hire an expert consultant. Or an agency. Or, wait – maybe a full-time employee? Hell, this is too important and too complicated to muck up – I’d better hire all three. Where can I cut budget?”

    It’s true that SEO is time-consuming and requires a degree of technical comfort. And I have no doubt that, as with every area of marketing, there are individuals and agencies out there that can make a huge difference.

But that’s no excuse not to tackle some of the basics yourself. It’s time to stop overcomplicating the technical aspects of search engine optimization and get comfortable with the fundamentals.

    Before we get started, I’d like to set the stage by reminding you that SEO is about content. Content for humans, and content for search engine spiders. Your job, as a digital marketer, is to make sure your website content:

    • Is desired by your audience
    • Uses terms familiar to your audience; and
    • Is coded cleanly (and thoroughly)

    It’s that last part that confuses everyone, but doesn’t have to. Search engine spiders are trying to understand your site. You’re trying to make it dead simple for spiders to read your site through relevant language and clean code.

    At the end of the day, there are two parts to technical SEO: getting your content read, and getting your content understood.

    Getting content “read” by search engines

The keys to getting your content read by search engines are making sure you’re speaking their language, giving them shortcuts, leaving specific instructions, and staying unique.

Speaking their language: JavaScript

The universal language of the web is HTML. That’s what search engines depend on to figure out what content you’re offering visitors. But a lot of the fancier things you see online – like infinite scroll pages or dynamic content – rely heavily on JavaScript. There’s nothing inherently wrong with JavaScript; it just hasn’t traditionally been something that search engine spiders are fluent in. The good news: spiders want to read content. That’s their sole purpose. And JavaScript is pretty popular. So Google has an incentive to get better at handling JavaScript. But to play it safe, if you rely on a lot of JavaScript to render your content, you may want to code fallback pages for your site.
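One simple safety net, for example, is the <noscript> tag, which serves plain HTML to any visitor (or crawler) that doesn’t run JavaScript. A minimal sketch, with hypothetical content:

<div id="latest-posts"></div>
<script>
  // Hypothetical: render the list dynamically with JavaScript
  document.getElementById("latest-posts").innerHTML =
    "<p>Read our latest posts on technical SEO.</p>";
</script>
<noscript>
  <!-- Plain HTML fallback for crawlers and visitors without JavaScript -->
  <p>Read our latest posts on technical SEO.</p>
</noscript>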

    Give a shortcut: The sitemap

A sitemap is a shortcut to seeing what’s on your site. Think of a sitemap as doing the work for the spiders – sort of a CliffsNotes for robots. You can use a sitemap to provide Google with metadata about specific types of content on your pages, including videos and images.

    Now, creating a sitemap does mean coding an XML file. There are tools that you can use to do it, but this is probably not something you want to attempt if you’re not comfortable with code. Instead, make a point to ask your nearest front-end dev to make sure you have one for your site.
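For reference, here’s a minimal sketch of what that XML file looks like, using example.com as a placeholder domain:

<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>https://www.example.com/</loc>
    <lastmod>2022-05-18</lastmod>
  </url>
  <url>
    <loc>https://www.example.com/blog/</loc>
  </url>
</urlset>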

    Leaving detailed instructions: Robots.txt

The purpose of the robots.txt file is to inform spiders which directories you do not want indexed – or to block an entire site. It’s a standard followed by all major search engines. Now, keep in mind – you can’t use it to make a site private; if someone has the link, they’ll be able to see it even if you’ve told search engines not to index your content. But you can use robots.txt to keep your blog that’s dedicated to embarrassing 80s pictures from showing up in Google search results.
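The file itself is just plain text at the root of your domain. A minimal sketch, with a hypothetical directory name:

# Applies to all spiders; keep this one directory out of search results
User-agent: *
Disallow: /embarrassing-80s-pictures/

# To block the entire site instead, you would use: Disallow: /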

    Stay unique: Duplicate content

    Search engine spiders want to make sure the pages they’re reading are really yours. They also don’t like having to arbitrarily choose which version of a page is the “right” one to display in search results. So having the same thing hosted on more than one URL can hurt you – and duplicate content is more common than you think:

    • A blog article posted to multiple sections.
    • Your system automatically copies your site and serves it on a secure (https) domain.
    • Your content is heavily syndicated.

    That doesn’t mean you can’t ever have the same content in two places – you just need to use canonical links to designate the preferred version of a page when multiple versions of the URL exist.

To set up a canonical link, just place the following code in the page header:
<link rel="canonical" href="ENTER YOUR PREFERRED URL" />

And if you have two entirely separate pages that host the same content, use a “noindex” meta tag in the page header of the duplicate so only the preferred version appears in search results.
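That tag is a single line (a sketch; confirm the exact directives with your dev team):

<meta name="robots" content="noindex" />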

You can also use redirects to avoid publishing content in two places for the sake of something like a vanity URL. There are a couple of types of redirects, but the ones you want to ask your development team for are most commonly 301, or permanent, redirects.
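The exact setup depends on your server. As one illustration, on an Apache server a permanent redirect can be declared in an .htaccess file (both paths here are hypothetical):

# Permanently redirect the vanity URL to the real page
Redirect 301 /promo https://www.example.com/spring-promotion/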

    Getting content “understood” by search engine spiders

There are a few key components of your web pages that you should pay attention to in order to ensure search engine spiders understand your content: introduce yourself, make your main point loud and clear, be recognizable, and don’t forget that a picture is worth a thousand words.

    Introduce yourself: Page titles

Your title tag is the first stop a search engine spider makes to understand what your page is all about. The HTML tag <title> specifies your page title. A good title tag is descriptive and unique. The first word is valued most, followed by the second word, then the third, and so on. But your title tag is not strictly for robots: it also controls the headline that appears when your site is returned in a search query. Typically, Google will display the first 50-60 characters. So remember: good title tags are keyword rich but still user friendly. You want someone clicking on it!
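In the page header, that looks something like this (the title text is just an illustration):

<head>
  <title>Stop Overcomplicating SEO: Technical Basics for Marketers</title>
</head>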

    Make your main point loud & clear: Headers

Search engine spiders consider your header tags the most important description of your page. Your primary header is indicated by an <h1> tag. A good h1 is relevant and consistent with the other content on the page. You do not always need to style it to look like your headline – you can use CSS and style overrides to make it visually look like a subhead, for example. And you can use more than one h1 tag per page, but that suggests to the search engine spider that you have two different things of equal importance – which isn’t always a good thing. Use your h1 tags wisely!
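Here’s a sketch of an h1 that keeps its place in the markup while being styled to read like a subhead (the class name is hypothetical):

<style>
  /* Visually downplay the h1 without demoting it in the markup */
  h1.quiet-heading { font-size: 1.1em; font-weight: normal; }
</style>

<h1 class="quiet-heading">Technical SEO Basics for Marketers</h1>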

    Be recognizable: URL structure

    Your URL structure is another indication of what your page is about. Spiders (and humans) like “user-friendly” URLs. A general best practice is to relate your URL structure to your page title.
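For example (both URLs hypothetical), the first tells spiders and humans exactly what to expect, while the second tells them nothing:

https://www.example.com/blog/stop-overcomplicating-seo
https://www.example.com/index.php?p=10293&cat=7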

A picture is worth a thousand words: Images

Remember, search engine spiders understand “text” – i.e., HTML – not images or videos. If you want your image to contribute to the description of your page – and you should! – use a descriptive file name and add alt text so crawlers know what they’re looking at.
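Something like this (the file name and alt text are made up for illustration):

<img src="golden-retriever-puppy.jpg" alt="Golden retriever puppy fetching a ball in the park" />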

    Bonus round: Meta descriptions

Meta descriptions have no direct effect on rankings, so they’re easy to overlook. But they do show up in search results as the description for the page returned – Google will typically display the first 120-150 characters. That means they play a large role in click-through rate (CTR). Think of your meta description as ad copy – you want to use it to optimize your CTR, because higher engagement will help you increase your search rank over time.
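It lives in the page header like the other tags above (the description copy is just a placeholder):

<meta name="description" content="Stop overcomplicating SEO. Learn the technical basics every marketer can handle, from sitemaps to title tags." />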

    Once you have all of these things in place, use this SEO dashboard to track the performance of your content in SERPs.

    Getting started with SEO

If I’ve convinced you that there are some basic aspects of search engine optimization you can get started on today, your next step is a little digging to figure out where to focus your SEO efforts. Start with some basic research on Google. Specifically, you should:

• Do a search for (1) your brand and (2) your top keyword.
• Use a site: search to see which pages on your site are indexed (see the examples after this list).
• Use related: to see which websites Google considers similar to yours.
• Use Google Trends to figure out common phrases.
• Use Google Keyword Planner for keyword suggestions.
• Use Google auto-complete for keyword suggestions.
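Typed straight into the Google search box, those operators look like this (example.com stands in for your domain):

site:example.com
related:example.com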

Once you’ve gotten a basic handle on where you stand, you can always do a more in-depth SEO audit.

At the end of the day, the technical side is probably the easier aspect of search engine optimization. The real power comes from creating lots of great content that people want. After all, Google’s purpose is to direct people to content they are actively looking for. Every change Google makes to how websites show up in search results is designed to support that purpose. Truly, the best SEO strategy is making sure you’re giving your website visitors a great experience and improving over time based on insights from your SEO dashboard software.
