SEO Tips in Ten Minutes
Back when the web was new, when the <img> tag was new and phones didn't have web browsers, finding what you wanted on the web meant sifting through a hand-curated list of links. It was possible to maintain a web directory solely by human effort because there were so few web sites that it was easy to find new ones.
Now finding the right web site probably starts with you typing a few words into your favorite search engine and hoping that one of the first results is good enough.
If you're publishing web pages, you want your site to show up early in the search results, hopefully earlier than your competition and certainly earlier than outdated or unmaintained resources that would otherwise lead your potential readers astray.
Because the web is far too large for humans to curate its directories, everyone relies on the tireless work of computer programs to search the web and figure out what each and every page is about. The good news is that these programs have rules, and following those rules can help your site rank better than other sites.
All right, most techies already know all of this. What you may not know is how easy it can be to manage a site so that spiders, crawlers, indexers, bots, and other programs can figure out what's on your site, show it to the best audience, and even suggest what you should write and how you should structure your information in the future.
Start with a Valid Site
Your first task is to start with a valid, working site. Your HTML should validate. HTML 5 is probably the best choice, because it has extra semantic elements such as the <footer> tag, but valid HTML of any version is better than invalid HTML.
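As a sketch, a minimal valid HTML 5 page using those semantic elements might look like this (the title and content are placeholders):

```html
<!DOCTYPE html>
<html lang="en">
<head>
  <meta charset="utf-8">
  <title>SEO Tips in Ten Minutes</title>
</head>
<body>
  <article>
    <h1>SEO Tips in Ten Minutes</h1>
    <p>Page content goes here.</p>
  </article>
  <footer>
    <p>Copyright notice, navigation links, and so on.</p>
  </footer>
</body>
</html>
```

Run your pages through the W3C validator to confirm they validate.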
Your hyperlinks should work, especially within your site. Broken links are dangerous. If you have to use server-side redirects to work around missing or moved pages, do it.
Your domain name should reflect the topic of your site. If you have one main keyword ("Modern Perl" suggests modernperl.net, for example), see if you can get it in your domain name. Good luck.
Google Webmaster Tools
Your second task is to sign up for Google Webmaster Tools. This will give you a lot of information about how people reach your site through the Google search engine, as well as about the health of your site. Better yet, it will give you ideas for how to create new pages and revise existing pages to draw more specific and useful traffic.
You can't sign up for the webmaster tools too early. From the point you verify your site to the point you start getting useful information, expect to wait at least a week and a half, maybe two. Sign up two weeks before you're ready to take much action.
Intelligent URL Management
Any semantic information encoded in your URLs will make it through to the spiders indexing your site. The more descriptive your URL—the better it works as a title or description for the contents of the page—the better. Cute or clever URLs probably work against you; URLs with a couple of relevant keywords work for you.
That advice also applies to the structure of your site. The names of subdirectories are important, and so are the parallelism and structure they imply across your entire site.
You may get your URL structure wrong at first. You can fix it if you configure your server to send permanent redirects (301 redirects; note, though, how awful Google's own URL on the subject is, from a semantic point of view) to tell search engines that the previous URL was wrong and the new one is right. (Keep in mind the W3C's advice that cool URIs don't change.)
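For example, if your server happens to be Apache, a permanent redirect is a single mod_alias directive in your configuration or .htaccess file (the paths here are hypothetical):

```apache
# send a 301 Permanent Redirect from the old URL to the new one
Redirect permanent /old-and-busted.html /seo-tips/
```

Other servers have equivalents; nginx, for example, uses `return 301` inside a `location` block.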
<meta> tags in the header of your HTML can be highly
influential; they exist for user agents to analyze, after all. You've seen the
<title> tag show up in search results. The meta
description tag helps you give hints to search engines about what to
display for the snippet of the site in search engine results. Instead of the
search engine picking several words out of context in the hope that the results
will entice users to click on your link, you can have a couple of complete
sentences that describe exactly what potential visitors will find.
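Both tags live in the <head> of your page. A sketch (the description text is only an illustration):

```html
<head>
  <title>SEO Tips in Ten Minutes</title>
  <meta name="description"
        content="Practical SEO tips: valid HTML, descriptive URLs,
                 sitemaps, and Google's webmaster tools.">
</head>
```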
One of the most powerful tools for describing your site's content is a sitemap. This is an XML file which describes every publicly accessible URL on your site, along with update frequency and relative priority.
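A minimal sitemap with a single entry looks something like this (the URL, date, and values are illustrative; the namespace comes from the sitemaps.org protocol):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <!-- one <url> element per publicly accessible page -->
  <url>
    <loc>http://www.example.com/seo-tips/</loc>
    <lastmod>2012-05-08</lastmod>
    <changefreq>monthly</changefreq>
    <priority>0.8</priority>
  </url>
</urlset>
```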
If you use a CMS or other means to generate your site, producing a sitemap should be trivial.
When you register for Google webmaster tools, submit a sitemap for each site. Google will verify the contents of your sitemap and, when it validates, will use the sitemap to crawl your site for indexing.
Be sure to add your sitemap to your robots.txt file too.
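The Sitemap directive takes the full URL of the sitemap and may appear anywhere in robots.txt; for example (with a hypothetical domain):

```text
User-agent: *
Disallow:

Sitemap: http://www.example.com/sitemap.xml
```

An empty Disallow line means nothing on the site is off-limits to crawlers.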
If you've taken the advice in this essay so far, you have set up the structure of your site to provide the best chance for search engines to find the right semantics of what your site is about. In other words, you're presenting your information in a way to make it easy for search engines to crawl and to index your site.
They're looking for keywords and key phrases. For example, this page uses the phrase "SEO tips", because that's what the page is about. That phrase is in the title of the page and in the meta description. It's also in the text of the page (how could you read about it otherwise?) as well as in the header just above this paragraph.
The only possible improvements would be to put the key phrase "SEO tips" in the URL of the page as well as the domain name. (The site's not about SEO though, so there's no reason to do that.)
Assuming your site is really about a topic and not just keyword spam, you'll tend to get better rankings for a keyword or key phrase if it appears in, from most to least effective:
- the domain name
- the page's URL
- headers in the page
- the meta description
- the body text of the page
This isn't an exact science, but you can experiment with it. Because there are so many web pages and so many criteria, the exact formula for how to rank one page over another is far too complex for one person to explain or even to understand. Yet all of these things contribute.
You can control these things. Choose a good domain name, if you're writing about a single subject. Choose descriptive URLs. Follow good semantic markup practices (and good editing and copyediting practices).
If you've followed the advice so far and you've waited a couple of weeks for the webmaster tools to track data on your site, it's time for the weekly review. This will take you a few minutes every week.
Go to the webmaster tools. Choose your site. In the left menu, choose "Traffic" and then "Search Queries".
This tool is invaluable. It shows you the search queries for which Google showed pages on your site, with their frequency, the number of clicks to those pages, and the average position within the results. You can change the time period and sort the results.
If you click on a search term, you can even see the resulting pages Google displayed.
You start to see the benefit of this. Of course you should start to see queries for which you hoped your pages would rank. You can track your progress over time and tweak the keywords and key phrases in your pages to capture more attention.
More importantly, you can see the search queries you didn't think of and the ways people reach your pages. If you're ranking decently for a search term that's tangentially related to something you've written about but could write about separately, you can make a new page for that subject and capture that traffic. Alternately, if you wrote about a subject but used key words or phrases that don't quite match what people are really searching for, you can change the headers and meta description and text of your page to rank more highly for those terms.
If you're really clever, you can even rename the URL of the page on your site and redirect traffic from one to the other.
Keep in mind that the query tools are usually a couple of days behind. If today's May 8, you will see results from May 6 at the latest. Also remember that Google's indexer bots won't immediately crawl your site after an update, so you may have to wait a couple of days for that too. (You can request an immediate update under the "Health" and "Fetch as Google" menu items.)
That's why reviewing this data at least once a week, but no more often than every couple of days, is valuable. Sure, the data's a little bit out of date, but if you're planning to keep your site up for years, a few tweaks every few days may help you present your work to the audience that really is searching for it.
The webmaster tools also help you verify your sitemap, see who's linking to you, and even track errors, broken links, and other oddities you didn't expect. Best of all, it's free, it's easy to use, and (it's difficult to emphasize this enough) the search queries tool is full of ideas for how to structure your site and what to write about next.