Wednesday, October 24, 2012

How to SEO Your Website

What is SEO?

SEO or Search Engine Optimization is the practice of making your website attractive to search engines such as Google, Bing, Yahoo, etc.

Search engines regularly read and archive websites so that people can find them easily. For example, a person may be searching for ways to cook salmon. If your website is about salmon and optimized properly, your site should appear within the first page or two of every search engine.

This is a comprehensive guide that explains how to SEO your website. This tutorial offers an overview of how to optimize your site's performance in search engine rankings. From how to set up your site and naming pages to creating conversation across the web, this page offers strategies, tips, and suggestions that can help make your site a success.

SEO is the process of making your website easier for search engines to understand. The goal of SEO is increased ranking for your website, which will result in more traffic. Engaging in search engine optimization requires a constantly evolving skill set. This guide contains basic practices that have remained relatively constant over time.

Many people who don't understand SEO or the goals of SEO consider it to be spam or manipulation. However if implemented within search engine guidelines, the practice is endorsed by Google and other search engines. Good SEO results in pages that are set up in a structured and orderly fashion. The pages will be filled with better information and more valuable content.

What SEO Is Not

  • SEO is not about manipulating search engines such as Google, Bing or Yahoo.
  • Search Engine Optimization is about creating clean and detailed web pages that can be easily read by automated robots.
  • By following a basic set of rules and ensuring that you have the correct information in your source code along with keywords and other detailed information throughout a page, a search engine like Google will be able to easily read and catalog (or index) your page.

How Search Engines Work

Search engines have programs called spiders which visit web pages to determine what the content of your site is and to find other links to scan at a later date.

  1. Spiders, or web crawlers, scan the content of web pages.
  2. They send the results of their scan back to an algorithm to be broken down and analyzed.
  3. If the spiders encounter a link to another page or website these links are stored.
  4. Eventually other spiders crawl the linked-to pages.
  5. Therefore, the more links from other websites and pages your website has, the more frequently your website is visited and crawled.
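The link-discovery step above can be sketched in a few lines of Python using only the standard library. The page markup here is invented for illustration; a real spider would fetch pages over HTTP and queue each discovered link for a later crawl.

```python
from html.parser import HTMLParser

class LinkSpider(HTMLParser):
    """Collects the href of every <a> tag, the way a crawler
    discovers links to scan at a later date."""
    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    self.links.append(value)

# Hypothetical page markup, invented for illustration.
page = ('<html><body>'
        '<a href="/salmon-recipes">Salmon recipes</a> '
        '<a href="https://example.com/grilling">Grilling tips</a>'
        '</body></html>')

spider = LinkSpider()
spider.feed(page)
print(spider.links)  # ['/salmon-recipes', 'https://example.com/grilling']
```

Real crawlers add politeness rules (robots.txt checks, crawl delays) on top of this basic extract-and-queue loop.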

Matt Cutts Explains Search

Google Engineer Matt Cutts explains what search is and the basic elements of SEO. While this video is centered around Google, the same base concepts apply to any search engine including Bing and Yahoo. Pages that succeed in achieving high rankings on Google tend to get a good number of hits.

SEO Basics

In this MSNBC news clip, Stephanie Leffler (who is the Senior Vice President of Network Solutions) defines the 3 most important SEO tips and what to look for when hiring an SEO company. Leffler's tips include how to have good content, the importance of links and the proper use of tags.

Data Analysis: what search engines look at

Search engines look at a combination of over 200 factors to determine what pages should rank for which queries. These factors include:

  1. On-page factors — information found on the web page itself, such as:
    1. Page content
    2. The title tag
  2. Off-page factors, which include:
    1. How reputable the pages linking to you are
    2. What words are used to link to you
    3. How long the links to you have existed

It's the combination of on-page and off-page factors that determines your search engine rankings.

Advantages of good site architecture

  • Having good site architecture offers benefits beyond aesthetic considerations, including:
    • Easy Expansion — Because your site is divided into manageable sections it is easy to add new sections and grow in the future.
    • Easy Navigation — Intermediate and advanced users can manually manipulate the site's URL to change sections.
    • Easy Maintenance — Because the website is divided up into manageable sections it is easier to maintain than a site with a flat structure.
    • Well-Defined Hierarchy — Pages with more generalized information are at the top of the tree. As you navigate deeper into the site, pages present more specialized information.
  • Good site architecture requires:
    • A good understanding of your website's subject matter.
    • Knowledge of how users are likely to search for information (you can gain this knowledge through keyword research).

Keyword research

Keyword research is valuable because it's a way to learn how your users search for information. Keyword research can also give you a better understanding of the subject your website will be about.

When doing keyword research there are a few basic ideas to understand:

  • Singular vs. Plural — Search engines and keyword research tools handle singular and plural terms differently.
    • Apple will return different search results than Apples.
    • Understand how your research tool displays and reports singular and plural terms.
  • Word Order and Prominence — The order of the words that are typed into a search box matters, as does the order of the words on a page.
    • Macaroni and cheese has slightly different results than cheese and macaroni.
    • Some keyword tools will always list words in alphabetical order, something you should be aware of when doing your research.
  • Head Keywords — Head keywords are usually short one or two word concepts that can have a wide range of meanings. They have a high volume of searches, but the variety of possible meanings makes it difficult to know what the user was actually searching for.
    • An example of a head keyword would be golf. Was the searcher looking for shoes, clubs or places to play golf?
  • Long Tail Keywords — These are multi-word phrases (at least four or five words). These keywords are very specific and signal a clear intent on the user's part.
    • An example of a long tail keyword would be men's black Nike golf shoes.
  • There is a wide range of keywords falling between the head and tail.

When doing keyword research it's important to:

  • Compile a list of all of your keywords.
  • Try to develop clusters around a particular topic or subject (these will become the high level directories in your website architecture).

Siloing and theme pyramids in site architecture

When building a website you should strive for simple and easy-to-understand site architecture. This is good for both humans (your visitors) and for search engines. To do this you must organize your website into clear sections, also known as theming or siloing.
Here's a basic example of good site architecture: a cooking site might be divided into top-level sections such as /recipes/, /techniques/ and /equipment/, with more specific pages such as /recipes/salmon/ nested one level below. Each section collects the pages on a single theme, and the URL path mirrors that hierarchy.
Keyword research basics

SEOBook owner Aaron Wall provides rare free advice about how to research keywords.

Site URLs and server technology

People who build and run websites have a wide array of technology to choose from. Some popular programming platforms include:

  1. Hypertext Preprocessor (PHP)
  2. Active Server Pages (ASP)
  3. JavaServer Pages (JSP)

Each of these platforms has its own language, but they all serve pages out in HTML format. From a search engine optimization perspective there is no advantage in choosing one over the other. You can choose among the platforms based on cost of operations: hiring developers, designers, programmers, and webmasters.

If you are building a new website or undergoing a redesign and restructuring, you should follow the W3C recommendations and build a website without revealing the server-side scripting language. This means that instead of:

http://www.example.com/products.php

your pages' URLs should look like:

http://www.example.com/products/

Hiding the scripting technology in this way allows you to move from one platform to another without altering your site's URLs or its underlying structure.


It's possible to serve your website under both http://example.com and http://www.example.com. However, many search engines will see both, consider it duplicate content (the same content under two URLs) and may penalize your site accordingly. To avoid this:

  1. Pick either the www or the non-www version of your domain and use it consistently.
  2. Configure your web server to 301 redirect all traffic from the version you are not using to the version you are using.
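On an Apache server, for example, the 301 redirect can be set up with mod_rewrite rules in an .htaccess file. This is a sketch; the domain is illustrative and your hosting setup may differ:

```apache
# Permanently redirect the non-www host to the www host
# so search engines only ever see one version of each URL.
RewriteEngine On
RewriteCond %{HTTP_HOST} ^example\.com$ [NC]
RewriteRule ^(.*)$ http://www.example.com/$1 [R=301,L]
```

The R=301 flag marks the redirect as permanent, which tells search engines to transfer any accumulated link value to the surviving URL.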

For more information on this issue see Matt Cutts' URL canonicalization advice.

Static URLs and dynamic URL parameters

In many cases programming implementations use parameters instead of static URLs. A URL with a parameter will look like http://www.example.com/products.php?id=123, while a static URL will look like http://www.example.com/products/blue-widgets/.

In most cases search engines have the ability to index and rank both formats. However, best practices advise the use of static URLs over dynamic ones, for reasons like:

  1. You produce cleaner, easier-to-understand URLs.
  2. You can work keywords into the URL, which a string of parameters cannot do.

As you start to add more than one parameter to a URL search engines have a harder time properly indexing the URL.

URLs which are not in the index will never rank or drive traffic from search engines.

Keywords in URLs

Generally speaking, it's beneficial to have keywords contained within your URL structure. Having the keyword in your URL helps search engines understand what your page is about and also helps users know what they are likely to find on the page. Consider these two examples and see which you find more useful:

http://www.example.com/index.php?page=132&cat=7
http://www.example.com/mens-golf-shoes/
Delimiters in URLs

Delimiters are used in URLs to separate words. The best practice is to use a hyphen to separate words. Search engines do have the ability to understand other characters, such as an underscore, but the hyphen is preferred over an underscore for human usability reasons.

For more information see Matt Cutts' discussion on dashes vs. underscores.
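The keyword-in-URL and hyphen-delimiter advice above is usually automated: sites generate a "slug" from each page title. A minimal sketch in Python (the titles are invented):

```python
import re

def slugify(title):
    """Turn a page title into a hyphen-delimited, keyword-bearing URL segment."""
    slug = title.lower().replace("'", "")    # drop apostrophes: "men's" -> "mens"
    slug = re.sub(r"[^a-z0-9]+", "-", slug)  # runs of other characters become one hyphen
    return slug.strip("-")                   # no leading or trailing hyphens

print(slugify("Men's Black Nike Golf Shoes"))  # mens-black-nike-golf-shoes
print(slugify("Macaroni & Cheese!"))           # macaroni-cheese
```

Most blog platforms and CMS systems build URLs this way, which is why hyphen-separated keywords are the de facto standard.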

Design page structure and CSS

Website layout and design usually have more of an impact on usability and marketing than on rankings. However, there are some SEO concerns to be aware of.

  1. While proper semantic markup and W3C code compliance never hurts, it's not a requirement if you want to have a high-ranking page in search engines.
  2. Care should be taken when building pages to eliminate as many errors as possible.
  3. A page with several hundred coding errors is much more likely to trip up a search engine spider than one with fewer errors.
  4. Using proper standards and markup usually means pages are laid out in a more logical fashion.
  5. Using CSS makes it possible to put the main body of a page's content first. Otherwise the top banner and any side navigation appear first.
  6. Search engines still place some weight on the text that comes first on your pages.

More and more website owners are using a Content Management System (CMS) to build their sites. These programs force you to separate content from presentation, which usually results in cleaner and more streamlined code. Additionally, a CMS makes it much easier to build and maintain mid- and large-sized websites.

On page SEO factors and considerations

On-page SEO factors deal with the elements that are on the actual web page. Links from other sites are off-page factors.

Most professional SEOs consider the title element the strongest on-page SEO factor, so it's important to pay attention to it.

  1. You want a title that is short and eye-catching, with as many keywords as possible.
  2. Make sure your title still reads cleanly; do not have an unintelligible keyword-stuffed title, as this will display in the search engine listing for your website.
  3. Include your site name in your title for branding purposes. Whether to place your website name at the front or the end of the title is largely personal preference, though smaller or less well-known companies should place their names near the end of the title so that the reader's focus goes to the keywords.

Meta keywords and descriptions

These factors are largely ignored by search engines due to abuse in the past. In some cases having identical keywords and descriptions across an entire website has been shown to be a slightly negative factor in ranking.

The meta description will appear under the title when your website shows up in a search engine result. Therefore, create a unique description that is well-written and eye-catching.
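Putting the title and meta description advice together, a well-formed head section might look like this sketch (the site name and wording are invented for illustration):

```html
<head>
  <!-- Short, readable, keyword-bearing title; brand name at the end. -->
  <title>How to Grill Salmon - Example Cooking Co.</title>
  <!-- Unique, well-written description; often shown under the title in results. -->
  <meta name="description" content="Step-by-step instructions for grilling salmon,
    with times, temperatures and seasoning tips.">
</head>
```

Every page on the site should get its own title and description; reusing one pair across the site works against you, as noted above.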

Headlines and page headings

Page headings (also known as H tags) are structural elements used to divide a page into meaningful sections. They number from H1 through H6, with H1 being the most important and H6 being the least.

  1. Your page should only have one H1 tag.
  2. You can use as many other H2-H6 tags as you want, as long as you don't abuse them by keyword stuffing.

Many people have their H1 match their title tag. You can make them different, which allows you to use a wider array of keywords and to create more compelling entries for humans.
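A sketch of the resulting heading hierarchy (the content is invented; indentation is only for readability):

```html
<h1>How to Grill Salmon</h1>     <!-- one H1 per page -->
  <h2>Choosing a Fillet</h2>     <!-- H2s divide the page into sections -->
  <h2>Grilling Times</h2>
    <h3>Gas Grills</h3>          <!-- deeper levels for sub-sections -->
    <h3>Charcoal Grills</h3>
```

The structure mirrors the well-defined hierarchy discussed under site architecture: general at the top, specialized underneath.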

Bold and italics

Bolding and italicizing fonts doesn't impact search engine rankings. Use these fonts for visual or other formatting reasons, not to affect your standing in search engines.

Internal anchor text and links

Anchor text refers to the words that are clickable in a link. Internal anchor text is the clickable text that links to other parts of your own site. Anchor text is one of the mechanisms search engines use to tell what a page is about.

If you link to a page with the words Blue widgets, search engines take that as a signal that the page on the other end of the link is about blue widgets.
By using consistent or similar anchor text every time you link to a page, search engines get a better understanding of what that page is about.

Avoid using anchor text that doesn't contain keywords (i.e., anchor text that reads "click here") whenever possible.
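As a sketch (the URL and wording are hypothetical), a descriptive internal link beats a generic one:

```html
<!-- Descriptive anchor text tells search engines what the target page is about. -->
<a href="/blue-widgets/">Blue widgets</a>

<!-- Generic anchor text wastes the keyword signal. -->
<a href="/blue-widgets/">Click here</a>
```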

Content considerations

Content refers to the pages and articles on your website, excluding your template. Creating content that is entertaining, interesting, educational, informative, funny, or compelling in some other way is the best way for you to encourage people to visit your site frequently, link to you and ultimately improve your rankings. The more unique and interesting your content is, the more value it has to your site's visitors. For example, there are millions of websites about iPods. There are very few websites about putting your iPod in a blender, smashing your iPod, or hacking your iPod to do new things. Think of your content as your point of differentiation.

Most content falls into three different categories: boilerplate, news and evergreen.

Boilerplate content

Boilerplate content is general information. Your "about us" page, "testimonials", "contact information", "privacy policy" and "terms of service" constitute boilerplate content.
These pages exist to help website visitors get to know you, learn to trust you and feel comfortable sharing information or making a purchase from you.

News content

News content is content that has a short-term lifespan. News pages can remain relevant for a few hours, or even a few months. Eventually people stop searching for the term and the page will get very little traffic after that.

Evergreen content

Evergreen content is content which has a long lifespan. An example of long term content is How to Paint a Room. Techniques for painting your interior rooms aren't going to change much for the foreseeable future. The number of people who are searching to learn how to paint a room will remain fairly constant from one year to the next.

Most websites have a blend of news and evergreen content. This will vary from one industry to the next. Gadgets and technology websites will have more news content, whereas websites about William Shakespeare will have more evergreen content.

Creating good content is only part of the equation. Once you have the good content, you have to make sure other people know about it, read it, link to it, or tell other people they know about it. For new websites you will have to engage in more proactive marketing.


Marketing a website is no different than marketing a business. You have to advertise, send out press releases, engage in viral or word of mouth campaigns, or visit other places and tell them about your website. Though a complete marketing plan demands its own guide, some of the key goals of any website marketing plan should involve:

  1. Getting people to visit your website.
  2. Getting people to link to your site.
  3. Convincing visitors to tell others about your website.
  4. Encouraging people to regularly come back to your pages.

Link building and link development

Links are the primary method a search engine uses to discover your website and a key factor in its rankings. Links help search engines determine how trustworthy and authoritative your website is and they also help search engines figure out what your website is about.
Links from trusted authoritative websites tell search engines that your website is more reliable and valuable.

Links from websites like CNN, The New York Times and The Wall Street Journal are more valuable than links from your local neighborhood garage or realtor.

Search engines also look at the anchor text (words that link to your website). When someone links to you with the words blue widget they are telling the search engines you are about the words blue, widget, and blue widget.

Links also increase in value over time. The longer a link has been in place, the more effective it is in passing along trust, authority and ranking power to your website.


Once your website is built you want to try and acquire links from as many trusted sources as possible in your particular industry. Getting links from websites that are related to your industry is usually more helpful than getting links from websites that are not related to your industry, though every link helps.

One of the first places many people start building links is from directories. Most directories have a fee for inclusion. Look for directories that are charging fees because they review each site before deciding whether to accept it.

Don't join a directory that lets in every site that applies; you want one that keeps out low-quality sites.
To see if a directory is worth the review fee, check to see how much traffic they are going to send you.

To evaluate potential traffic, check to see if the directory page is listed for its particular search term.
If the directory is listed, this is usually a good indicator it will send you traffic. If the directory does not rank well for its term, check to see if it's listed in the search engine index and how recently it was crawled.

You can check the last crawl date by clicking on the cache link on the search engine result page.
Pages that are in the index and have been crawled frequently are usually more trusted and will pass some of that value to you.
Pages that are not in the index or have not been crawled recently are usually not worth the review fee.

Press releases

Press releases are usually used to get the attention of journalists or industry news websites, magazines and periodicals. Many press release websites have relationships with search engine news feeds, so using them can be a very effective way to put your website in front of the right people.

Most press release websites do not pass along any link value; they simply act as pointers to your website. If a journalist, news website or blogger sees your press release and writes about you, you may get a link from them.
Consider press releases in light of how much traffic and how many secondary links they can bring; ignore the link from the press release service itself.

Content and article syndication

Content and article syndication websites allow you to publish your content on other sites. In exchange for the free content these sites are willing to provide you with a backlink.

Most of these article syndication sites are like press release sites in that they do not pass any link value, but instead act only as link pointers.

To decide if this strategy should be a part of your marketing and link-building plan, look at the most popular articles in your category and see how well they rank and how much traffic they are likely to drive. You can also use article syndication sites to identify third-party websites that would be interested in publishing other articles from you.

Link exchanges, reciprocal links and link directories

Exchanging links with other related websites is a good practice, if it makes sense for your users. Creating link directories with hundreds of links to other websites that are of very little or no use to the user is a bad practice and may cause search engines to penalize you.

If the link has value to visitors of your website and you would place the link if search engines didn't exist, then it makes sense to put up the link. If creating the link is part of a linking scheme where the primary intent is to influence search engines and their rankings then don't exchange the link.

Paid links and text link advertising

Paying for links and advertising can be valuable, as long as you follow search engine guidelines.
If a link is purchased for the advertising value and traffic it can deliver, search engines approve of the link.

If the link is purchased primarily to influence search engine rankings, it is in violation of Google guidelines and could result in a penalty. If you want to buy or sell text link advertising without violating Google guidelines, look for implementations that use a nofollow attribute, JavaScript, or an intermediate page that is blocked from search engine spiders.
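The nofollow option looks like this in practice (the advertiser URL is illustrative):

```html
<!-- rel="nofollow" tells search engines not to pass ranking
     credit through this purchased link; the visitor traffic
     and advertising value are unaffected. -->
<a href="http://www.example.com/" rel="nofollow">Example sponsor</a>
```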

Viral and word of mouth marketing

Creating content that is viral in nature and gets you word-of-mouth marketing can help you acquire links. This process is often called linkbaiting. Content created for this purpose is often marketed on social media sites like Digg and StumbleUpon.
As long as your content becomes popular naturally, without artificial or purchased votes, you will be within search engine guidelines.

Blogs and social media

Blogs are a relatively new form of website publishing. Their content is arranged, organized or published in a date or journal format. Blogs typically have a less formal, almost conversation-like style of writing and are designed to help website owners and publishers to interact more with their customers, users, or other publishers within their community.

The journal and conversational format of blogs usually makes it much easier to gain links from your community. You must create content that members of that community value and are willing to link to.

For a blog to be truly successful the authors must participate in the community and publish frequently. If this behavior doesn't mesh with your company culture, creating a blog is not going to be effective for you.

Social media

Social media and bookmarking sites like Digg and StumbleUpon have community members who function almost like editors. They find and vote on web pages, stories, articles, videos or other content that is interesting or engaging.

Most social media or bookmarking sites are looking for new content on a regular basis. The frequent publishing demands of blogs also require a constant flow of new material. To get the most out of social media you must become involved in the community and submit stories from other sources, not just from your website.

Each social media website has its own written and unwritten rules. Learn these before submitting stories. Every community frowns upon attempts to "game" the voting procedure; tactics such as voting rings and paid votes that artificially influence the voting mechanism are not tolerated.

Analytics and tools

Once your website is up and running you will want to know how many people are coming to your site, how they are getting there, what pages they are viewing when they arrive and how long they are staying. For this you will need a website analytics package.

There are a wide variety of analytics packages, ranging in cost from free to several hundred thousand dollars each month.
Each analytics package measures data in its own way, so it's not uncommon for two programs to have slightly different results from the same set of data.

Additionally, each package provides a different level of detail and granularity, so you should have some idea what you are looking for before purchasing a package.

The two main methods of implementation are log files and JavaScript tracking. The most commonly used analytics package is Google Analytics. See Mahalo's introductory guide to using Google Analytics for more information.

Linking strategy

A good overall linking strategy is to slowly acquire links from as many trusted sources as possible, with a wide variety of anchor text, to both your home page and sub-pages. If, over a short period of time, you gain too many links with similar anchor text, from a few low-trust websites, to a limited number of pages, you create an unnatural linking profile, and search engines may penalize or filter your website for such behavior.

Common SEO problems

You can build a website with great content and institute an effective marketing plan, yet still be foiled by technical issues. Here are some of the most common problems:

robots.txt file

A robots.txt file communicates what pages or sections of your website you want search engines to crawl.
A common mistake is blocking search engine spiders from a section or entire site you want indexed.

Google's Webmaster Central has a tool to let you verify that your robots.txt file is performing how you'd like.
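A minimal robots.txt sketch (the path is illustrative) shows how close a correct file is to a disastrous one:

```
# Allow all crawlers, but keep them out of the admin section.
User-agent: *
Disallow: /admin/

# The common accidental mistake -- a single slash blocks the whole site:
# Disallow: /
```

Because one character changes the meaning so drastically, it's worth re-checking this file after every site restructuring.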

Response and header codes

When your web server serves a page there is a special code that tells the browser or spider the status of the file served.

  • A 200 response code means the page was served normally. If not configured correctly, some web servers will return a 200 code even when a file is missing, which can lead search engines to index a lot of blank pages.
  • A 404 is the response code when a page or file doesn't exist. To improve usability, set up a custom 404 page with a message explaining what happened, a search box and links to popular pages from your website.
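A quick way to see these codes in action is to run a small local web server and request an existing and a missing file. This sketch uses only the Python standard library; the file names are invented:

```python
import functools
import http.server
import os
import tempfile
import threading
import urllib.error
import urllib.request

# Serve a temporary directory containing a single page.
docroot = tempfile.mkdtemp()
with open(os.path.join(docroot, "index.html"), "w") as f:
    f.write("<html><body>hello</body></html>")

handler = functools.partial(http.server.SimpleHTTPRequestHandler, directory=docroot)
server = http.server.ThreadingHTTPServer(("127.0.0.1", 0), handler)
port = server.server_address[1]
threading.Thread(target=server.serve_forever, daemon=True).start()

def status_of(path):
    """Return the HTTP status code the server sends for a given path."""
    try:
        with urllib.request.urlopen(f"http://127.0.0.1:{port}{path}") as response:
            return response.status
    except urllib.error.HTTPError as err:
        return err.code

ok_status = status_of("/index.html")         # 200: the file exists
missing_status = status_of("/missing.html")  # 404: the file does not
server.shutdown()
print(ok_status, missing_status)
```

Crawl tools and search engine webmaster consoles report these same codes for every URL they fetch, which makes misconfigured responses easy to spot.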

Duplicate content

The content from any page should only exist on one URL. If the same content exists under multiple URLs, search engines will interpret this as duplicate content.

Subsequently, the search engines will try to make a best guess as to the best URL for your content.
If this condition is true for a large amount of your pages, your website may be judged low quality and be filtered out of the search results.
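One common remedy, supported by the major search engines, is the canonical link element; the URL here is illustrative:

```html
<!-- Placed in the <head> of every duplicate variant of a page, this
     points search engines at the one URL you want indexed. -->
<link rel="canonical" href="http://www.example.com/blue-widgets/">
```

This complements the 301-redirect approach described earlier: redirects consolidate whole hostnames, while the canonical element handles page-level duplicates such as print versions or tracking-parameter URLs.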

Duplicate titles

Every page of your website should have a unique title. When a search engine sees duplicate titles it will try to judge the better page and eliminate the other from the index.

Duplicate meta descriptions

If a large number of pages have identical or very similar meta descriptions, these pages may be filtered for low quality and excluded from the index.

Poor or low quality content

In an attempt to create a large number of pages very quickly, many people will employ automated solutions that end up generating pages with fill-in-the-blank or gibberish content. Search engines are getting better at catching this condition and filtering these sites from the index.

Blackhat SEO and spamming

Some people engage in tactics or methods that violate search engine guidelines to achieve higher rankings.
They can employ a wide variety of tactics including (but not limited to):

  1. Keyword stuffing
  2. Link spamming
  3. Paid linking
  4. Artificial link schemes
  5. Sneaky or deceptive redirects

If you employ a tactic that seems to involve tricks or is done primarily to manipulate search engines and artificially inflate rankings, you can be considered to be engaged in blackhat SEO or spamming. This has repercussions for any site you work with and should be avoided. For more detailed information, review Google's guidelines.


Good SEO takes time, as you need to develop great content and a strong community voice. But this is just what the Internet needs: high-quality pages that provide a valuable service to users.

via Mahalo