Tuesday, May 21, 2013

SEO Glossary

  • 200 Status Code

    Status code 200 means OK: the client's request was successfully received, understood, and accepted. It indicates that the requested page or image was found and served properly.
  • .htaccess file

    A hypertext access file containing one or more configuration directives, placed in a website's document directory; it is supported by several web servers, most notably Apache. The directives apply to that particular directory and all of its subdirectories.
  • 203 Status Code

    Standard HTTP status code 203 means Non-Authoritative Information.
  • -30 Penalty

    A Google Minus-30 penalty drops a page's rankings by roughly three pages, or 30 positions. Every search for which a URL from the penalized domain previously ranked shows that URL at the top of page 4 (position #31), even for a search on the domain name itself. Similarly, a Minus-60 penalty lowers rankings by six pages, or 60 positions (to position #61), and a Minus-90 penalty drops rankings by exactly 90 places. These penalties are site-wide: for every keyword and every search involving that domain, the results are pushed down. Today you may observe little of this effect, as domains are not always penalized in this way.
  • 301 Status Code

    Status code 301 means Moved Permanently; it is used when a web page or an entire website has moved permanently to a new location. On Apache servers the redirect is configured in the .htaccess file, while on IIS servers it is set up through ASP/ASP.NET or the Internet Services Manager. (A short sketch for checking these codes programmatically follows this list.)
  • 302 Google Jacking / Page Hijacking

    A form of web page identity theft (virtual identity theft) in which hackers use a 302 redirect to show a falsely high Google PageRank; it is a form of cloaking. Initially your website comes up in the search results, and after the hacker sets up the 302 redirect the text displayed for your URL in Google's listing still looks valid. When a potential visitor clicks the link in the search results, however, a 302 redirect script runs (the bad website has tricked Google's spider into treating it as a legitimate 302 redirect), and both the visitor and the Google spider are taken to a junk link or banner farm page, never knowing your URL was the legitimate one. Eventually your page is de-listed, and neither the hacker nor you gains any benefit.
  • 302 Status Code

    Status code 302 means Found, but temporarily residing under a different URL. Temporary redirection is not considered good for SEO.
  • 400 Status Code

    Status code 400 means Bad Request: the server did not understand the request because of malformed syntax, and the request syntax needs to be corrected before retrying.
  • 401 Status Code

    Status code 401 is returned when the request requires user authentication.
  • 402 Status Code

    Status code 402 is returned by the server when payment is required.
  • 403 Status Code

    Status code 403 is returned when the server refuses to provide the requested information.
  • 404 Status Code

    Status code 404 tells the web user that the requested web page was Not Found. The custom error page option offered by some hosts is helpful for identifying pages with navigational issues. For a request to a page that does not exist on the site, the server should return a 404 error.
  • 408 Status Code

    Status code 408 means Request Timeout: the client did not produce a request within the time the server was prepared to wait. The client may repeat the request without modification at any later time.
  • 409 Status Code

    Status code 409 means Conflict: the request could not be completed due to a conflict with the current state of the resource.
  • 500 Status Code

    Status code 500 means Internal Server Error: the server encountered an unexpected condition which prevented it from fulfilling the request.
  • 502 Status Code

    Status code 502 means Bad Gateway: it is returned when the server, acting as a gateway or proxy, receives an invalid response from the upstream server.
  • 503 Status Code

    Status code 503 means Service Unavailable: a temporary state in which the server may be overloaded or down for maintenance.
  • 504 Status Code

    Status code 504 means Gateway Timeout: the server, acting as a gateway or proxy, did not receive a timely response from its upstream server.
  • 505 Status Code

    Status code 505 is returned when the HTTP protocol version used in the request is not supported by the server.
  • -950 Penalty

    This penalty is specific to certain searches and shows up as an "end-of-results" phenomenon: a URL that was previously top-ranked for a given search now appears on one of the last pages of the SERPs, not at any particular position. Most of the time this affects only specific searches; for other searches the same URL can still appear on the first page.
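The status codes above are easiest to learn by checking real pages. Below is a minimal sketch, assuming Python with the third-party requests library; the URLs are placeholders, not any real site.

```python
# Minimal sketch: print a URL's status code and, for a 301/302, the new location.
import requests

# allow_redirects=False lets us see the 301/302 itself instead of the final page.
response = requests.get("https://example.com/old-page", allow_redirects=False)
print("Status:", response.status_code)   # e.g. 200, 301, 302, 404, 500 ...

if response.status_code in (301, 302):
    # For a redirect, the new address is sent back in the Location header.
    print("Redirects to:", response.headers.get("Location"))
```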


  •          A
  • A/B Testing
  • A/B testing, or split testing, is an experimental approach to user experience or web design in which two different versions of a page are shown and the resulting user behavior is compared. Elements such as copy, layout, images, and colors are varied in search of the best-performing combination. A visitor is shown either the (A) version or the (B) version, and changes in behavior are tracked according to which version they saw. The (A) version is normally your existing design (the "control" in statistics lingo), and the (B) version is the "challenger".
  • Above the fold (ATF)
  • Above The Fold (ATF), a.k.a. "what you see in your browser window," is the part of a page that shows on screen when the browser window loads, without scrolling. Ads are placed at the top, in the most prominent and visible location, to gain the greatest visibility and the best placement. The concept comes from newspapers, which are folded in half: everything above the fold carries the highlights meant to grab readers' attention. Above-the-fold space is prized in advertising; it is reserved for premium content and may result in higher click-through and conversions. Google uses the idea to determine whether impressions on the Google Display Network will show ads on screen when a user's browser window loads; whether an ad displays above the fold depends on screen resolution, web browser, and other factors. Ads in a Google Display Network campaign targeted above the fold will only appear on placements that can display the ad when the page loads, without scrolling, so above-the-fold targeting can boost ad performance. Above The Fold Time (AFT) is a related web performance metric; for details see webperformancetoday.com.
  • Absolute Link/ Absolute Path
  • When creating URLs or writing links, an absolute path refers to one very specific location, including the domain name; it is the full URL of the file. It is used when referring to a web element on another domain. An absolute link includes the complete location of the document: the transfer protocol required to get the document, the server it lives on (the domain name), the directory it is located in, and the name of the document itself (the file name). Retrieving an absolute link might be slightly slower than a relative one, but absolute links have less chance of getting messed up because the complete path is defined.
  • Acquisition Strategy
  • The process of acquiring products and services based on supply and demand, specifications, acquisition methods, and risk. A search acquisition strategy is a structured approach to acquiring and retaining the right visitors to your website via organic results. It looks at the critical factors and criteria that business owners and managers need to address when initiating a search engine optimization program and a search engine marketing strategy.
  • ACRank
  • ACRank stands for A-Citation-Rank (by MajesticSEO). It measures how important a particular page is by assigning a number from 0 (lowest) to 15 (highest) based on the number of unique external referring root domains. For a page or link to have an ACRank of 1 or higher, it therefore needs at least one external backlink. You can look up the minimum number of referring domains required for each ACRank level.
  • Active Directory
  • Active Directory (AD) is a directory service created by Microsoft for Windows domain networks. Active Directory provides a central location for network administration and security. 
  • Ad
  • Advertising is a form of marketing communication used to encourage or persuade an audience (viewers, readers, or listeners; sometimes a specific group) to continue or take some new action. When a searcher submits a query, the search engine may display ads anywhere on the SERP. Generally they appear at the top, above the natural or organic listings, and on the right-hand side of the SERP, an area known as the "Right Rail".
  • Adjacency
  • A property of the relationship between words in a search engine (or directory) query. Search engines often allow users to specify that words should be next to one another or somewhere near one another in the web pages searched. 
  • Adsense
  • Google Inc. runs a program called Google AdSense. AdSense offers a simple, flexible way to earn revenue by showing relevant, engaging ads alongside your online content. AdSense ads can be displayed on your website, mobile sites, and site search results. The ads served are based on the site's content, the user's geographical location, and other factors. Publishers and advertisers earn and generate revenue through the Google Network, and all billing is managed by Google.
  • Adversarial Information Retrieval
  • Adversarial Information Retrieval (adversarial IR), also the subject of AIRWeb (Adversarial Information Retrieval on the Web), is a topic in information retrieval concerned with strategies for working with a data source where some portion has been manipulated maliciously. Tasks can include gathering, indexing, filtering, retrieving, and ranking information from such a data source. Adversarial IR is the study of methods to detect, isolate, and defeat such manipulation. Malicious manipulation that makes search results less useful to web surfers can take the form of link bombing, search engine spam and manipulative optimization, splogs, malicious tagging, referrer spam, and the reverse engineering of ad blocking, ranking algorithms, and web content filtering.
  • Adwords
  • Google AdWords is Google's advertising product and an important source of revenue. AdWords offers PPC (pay-per-click/CPC) ads, CPM (cost-per-thousand-impressions) ads, and site-targeted advertising for text, banner, and rich-media formats. Thousands of websites partner with Google to display ads through the Google Display Network, letting advertisers reach local, national, and international users and drive conversions with their message in ad formats such as text, image, and video. Google uses contextual targeting technology to display ads and may also use placement targeting. Results can be measured and optimized using the placement performance report.
  • Affiliate Marketing
  • A type of marketing in which one website markets products and services sold by another website or business in exchange for fees or commissions. Merchants expand their market reach using affiliate marketing programs; they pay independent agents on a cost-per-action (CPA) basis, so affiliates are paid only when visitors complete an action. The display of affiliate site ads in search results is policed through manual review, landing page quality scores on paid ads, and algorithms that check for thin affiliate sites and duplicate content.
  • Aggregator
  • A website that aggregates specific information from multiple sources, for example feeds gathered via RSS syndication. It may display the items to subscribers in a desktop popup or popularize the content in other ways (example: Technorati.com). Such a web application aggregates syndicated web content of different forms. Types of aggregator include: data aggregator, news aggregator, poll aggregator, review aggregator, search aggregator, social network aggregation, and video aggregator.
  • Alexa Internet
  • Alexa Internet is a company owned and controlled by Amazon. It was founded in 1996 by American web entrepreneurs Brewster Kahle and Bruce Gilliat and acquired by Amazon in 1999. The company offers the Alexa toolbar, available for different browsers, which once installed reports users' traffic patterns back to the Alexa community. Alexa provides traffic information, global rankings, traffic stats for the Internet's top websites, contact information and reviews of sites, free global web metrics, and search keywords.
  • Algorithm
  • Algorithms are also sometimes called the "secret sauce." A search engine's algorithm is the set of rules and formulas it uses to determine search relevancy and assemble the SERP it presents to users based on their search query.
  • All The Web ATW
  • ATW stands for All The Web (also read as Around The Web or All The Way). AllTheWeb was a search engine that no longer exists.
  • Allegra Update
  • Rolled out in February 2005, the Allegra update borrows its name from the popular allergy medication. It was said that Allegra could be the remedy to the "Sandbox Effect" that tens of thousands of websites experienced in 2004. It is also called the Super Bowl Update, a name that refers to the update's proximity to American football's Super Bowl game. Allegra appears to randomize the search results to a degree, so that someone searching for a particular keyword phrase does not see the same results each time; this is done by periodically switching data centers.
  • ALT tag
  • The alt attribute, used in HTML or XHTML documents, specifies alternate text for user agents that cannot display images, forms, or applets; it is meant for people who cannot see the image. The "title" attribute may be used alongside the alt attribute on images to offer advisory information about the element it is set on. Search engine spiders such as Googlebot do not see images, so they read the alt attribute's alternative text, and the title if specified. Some older browsers rendered the alt attribute as a tooltip, similar to the title attribute, which is not standards-compliant behavior. The alt attribute is used on tags such as img and area. The longdesc attribute can be used to provide detailed information about an image, and the list-style-image property in CSS can be used to display images as list markers.
  • Amazon
  • Amazon is the largest online retail website; its other important properties include IMDb and Alexa. It is an electronic commerce company founded by Jeff Bezos in 1994.
  • Analytics
  • Analytics is the discovery and communication of meaningful patterns in data, often through data visualization, used to quantify performance. Analytics for performance enhancement relies on operations research, statistics, and computer programming. Analytics differs from analysis in that it is a two-sided approach: it first gains valuable knowledge from data analysis and then uses that insight to recommend action as part of a complete methodology.
  • Anchor Text
  • Anchor text is the clickable, readable, visible label (link text or link title text) of a hyperlink. The hyperlink and its anchor text should be relevant to the landing page. In January 2013 Google began penalizing manipulative internal linking: if a site's home page has an abnormally high percentage of backlinks using a single, highly optimized anchor text pointing to it, the website can be penalized. Webmasters should not be manipulative and should stop over-using keyword-rich anchor text. Keywords used as anchor text help determine search engine rankings.
  • API
  • An application programming interface (API) is a protocol used as an interface by software components and applications to interact with another software component or application. It can take many forms. Example: Twitter API 
  • Applet
  • A small program, often written in Java, which usually runs in a web browser, as part of a web page. It is possible that the use of such a program may cause spiders and robots to stop indexing a page. 
  • Arbitrage
  • The practice of taking advantage of a price difference for the same commodity between different markets: buying in one market and immediately reselling in another in order to profit from the price discrepancy. Shopping search engines generally draw much of their traffic through arbitrage.
  • ArchitextSpider
  • The name of the Excite search engine's spider. 
  • ASP
  • Active Server Pages (ASP) is a server-side script engine for generating dynamic web pages. ASP pages are supported by all versions of Internet Information Services (IIS).
  • Astroturfing
  • The process of creating fake grassroots campaigns. Astroturfing is often discussed in relation to review sites like Google Places, Yelp, Judy's Book, and others: the fake reviews can be positive reviews of your own company or slander against your competitors. It is the reverse of full disclosure. Example: participating in a user forum with the secret purpose of branding, customer recruitment, or public relations.
  • Auction Model Bidding
  • It is a type of PPC bidding that is market or competition-driven bidding. First, an advertiser determines what maximum amount per click they are willing to pay per keyword. If no competition exists for that keyword, the advertiser pays their bid, or less, for every click. If there is competition for a particular keyword at auction, then the advertiser with the highest bid will pay more than their nearest competitor. 
  • Authority
  • The amount of trust or link juice in terms of SEO that a site is given or awarded with for a particular search query. Authority/trust is derived from related incoming links from other trusted sites. 
  • Authority Sites
  • A website with many incoming links from other related expert/hub sites. Because of this simultaneous citation from trusted hubs an authority site usually has high trust, page rank, and search results placement. Example: Wikipedia 
  • Automated Robot or Search Engine Spider
  • Search engine spiders are known by many names: web crawlers, ants, web scutters, automatic indexers, bots, web spiders, and web robots. A search engine robot is an automated computer program or software agent that browses the World Wide Web in a methodical, orderly, automated manner, indexing page content and following links.
  • Automatic Optimization
  • With automatic optimization, the search engine identifies which ad for an individual advertiser earns the highest CTR (click-through rate) as time progresses, and then optimizes ad serving by showing that ad more frequently than the other ads in the same Ad Group / Ad Order.
  • Autoresponder
  • An automatic email response program is called an autoresponder. 
  • AV AltaVista
  • One of the search engines that helps web surfers find relevant information, video, images, and answers from across the Web.
  • Avatar
  • An avatar is an image or username assigned to a person online within forums and social networks. 
  •          B
  • Backlink
  • In SEO, a backlink is also called an inlink, incoming link, inward link, or inbound link. In the search engine context it matters because it indicates the importance and popularity of a web page, which is a significant factor in determining search engine rankings. Any link received by a web node (web page, directory, website, or top-level domain) from another web node, that is, any link coming into a website, is considered a backlink.
  • Bad Neighborhood
  • Avoid linking to web spammers or "bad neighborhoods" on the web, as your own ranking may be affected adversely by those links.
  • Bandwidth
  • The amount of data transferred (sent to and from the server where the website is located) on a website or server within a specific amount of time, often measured in kilobytes; it determines how much data can be downloaded, and at what speed, over the Internet connection. By analyzing bandwidth you can estimate website traffic (the number of bits transferred over network connections). Increasing the network connection bandwidth between the website and the Internet decreases the time needed to download any file and avoids delays when the number of users downloading a file increases.
  • Beacon
  • A web beacon is also called a web bug. It is a tracking 1 x 1 pixel GIF, often invisible or transparent. This line of code is placed in an ad or on a web page to track the visitor's actions, such as registrations or purchases.
  • Bread Crumbs
  • Breadcrumbs are a website navigation aid, usually displayed as a horizontal trail above the main content, which helps users understand where they are on the site and how to get back to the root directory or document.
  • Broken Link
  • A broken link is a link that points to a web page that does not exist, has been moved to a new location, or has been deleted. Different reasons may lie behind a broken link. There are free online tools and applications to check for broken links and validate URLs, helping you find and remove bad or dead links. Broken links should be removed from the website or fixed so that search engines can index the site properly; a broken or dead link results in a 404 error. (A minimal link-check sketch follows this entry.)
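As a rough illustration of the broken-link check described above, here is a small sketch assuming Python with the third-party requests and beautifulsoup4 packages; the start URL is a placeholder.

```python
# Rough sketch of a broken-link check: fetch one page, collect its links,
# and flag any that return a 4xx/5xx status. The URL is a placeholder.
import requests
from bs4 import BeautifulSoup
from urllib.parse import urljoin

page_url = "https://example.com/"
html = requests.get(page_url, timeout=10).text

for a in BeautifulSoup(html, "html.parser").find_all("a", href=True):
    link = urljoin(page_url, a["href"])
    try:
        status = requests.head(link, allow_redirects=True, timeout=10).status_code
    except requests.RequestException:
        status = None
    if status is None or status >= 400:
        print("Broken:", link, status)
```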



  •          C
  • CheiRank
  • CheiRank is an eigenvector with the maximal real eigenvalue of the Google matrix G* constructed for a directed network with the directions of all links inverted. It is similar to the PageRank vector, the maximal eigenvector of the Google matrix G with the original link directions, which ranks the network nodes on average proportionally to their number of incoming links. Because of the inverted link directions, CheiRank ranks the network nodes on average proportionally to their number of outgoing links. Since each node has both a CheiRank and a PageRank, the ranking of information flow on a directed network becomes two-dimensional. (A toy power-iteration sketch follows this group of entries.)
  • Cloaking
  • Cloaking is the practice of showing different versions of a web page under different circumstances. In general, a different web page is displayed to human web surfers than the one shown to search engines. It is primarily used to show an optimized or content-rich page to the search engines and a different page to humans when the page consists of images or Flash with little text. There are legitimate ways to make website content accessible to search engines even when it uses Flash, images, or JavaScript. Hackers also use this technique after they compromise a website so that the website owner does not notice. Cloaking is not good SEO practice and is a violation of Google's webmaster guidelines.
  • De-listing
  • De-listing is the removal of pages from a search engine's index, temporarily or permanently; the affected web pages are de-indexed from a directory or search engine. There are several possible reasons. If your hosting provider puts your site on a shared IP, your website may fail a reverse DNS lookup and risk being removed from the search index; if a website lands on a spam blacklist, you may need to change hosting providers to avoid de-listing. Search crawlers move from link to link, so a lack of links can be another reason for de-indexing. Other causes include spoofing, cloaking, doorway or splash pages, dynamically generated on-the-fly pages, content farms, and buying links.
  • External Link
  • A link that references or points to a page on a different domain, outside the linking site's own domain.
  • Cognitive Pause
  • In relation to Google Instant, a cognitive pause is the time taken by a user to consider the results that were displayed after a letter is typed. If that pause is three seconds or longer, listings on that SERP earn an impression.
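To make the PageRank/CheiRank relationship above concrete, here is a toy power-iteration sketch in Python (assuming numpy is installed); the three-page graph and damping factor are illustrative, not any real dataset.

```python
# Toy sketch: PageRank via power iteration on a tiny directed graph, and
# CheiRank computed the same way after inverting all link directions.
import numpy as np

def google_matrix(adj, alpha=0.85):
    # adj[i, j] = 1 means page j links to page i, so columns hold out-links.
    n = adj.shape[0]
    out_degree = adj.sum(axis=0)
    stochastic = np.where(out_degree > 0,
                          adj / np.where(out_degree == 0, 1, out_degree),
                          1.0 / n)                      # dangling pages -> uniform
    return alpha * stochastic + (1 - alpha) / n         # damped Google matrix

def principal_vector(G, iterations=100):
    v = np.full(G.shape[0], 1.0 / G.shape[0])
    for _ in range(iterations):
        v = G @ v
        v /= v.sum()
    return v

adj = np.array([[0, 1, 1],     # a made-up three-page link graph
                [1, 0, 0],
                [1, 1, 0]], dtype=float)

pagerank = principal_vector(google_matrix(adj))
cheirank = principal_vector(google_matrix(adj.T))       # inverted link directions
print("PageRank:", pagerank.round(3))
print("CheiRank:", cheirank.round(3))
```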

  •    F
  • F.F.A
  • Stands for "Free For All" link pages, used to artificially inflate link popularity. URL submissions remain active only for a period of time: a new submission is placed at the top of the list and gradually moves down, eventually being pushed out as other submissions are made.



  • Googlewhack
  • A Google search query that consists of two words but returns only a single result.

  • Hallway page
  • A page that serves as an index to a group of pages that you would like the search engine spiders to find. Once a search engine spider indexes the hallway page, it should also follow all the links on that hallway page and in turn index those pages as well.

  • Inbound Links IBL

  • A link from one site/domain into another site/domain. A link from another site will improve your SEO, especially if that site has good authority or a high PageRank.



  •          J
  • JavaScript
  • A scripting language that allows website administrators to apply various effects or changes to the content of their website as users browse it. JavaScript can make content harder for search engines to read.

  •          K
  • Keyword Stemming
  • The process of generating variant forms of a keyword from its root or stem word. Variants may be built by adding a prefix or suffix, or by using plural forms, in order to build website traffic. (A small sketch follows this entry.)
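As a tiny illustration of the stemming idea above, here is a sketch in Python; the prefix and suffix lists are made-up examples, not a linguistic stemmer.

```python
# Tiny sketch: expand a root keyword into variants with common prefixes,
# suffixes, and a plural form. The affix lists are illustrative only.
def stem_variants(root):
    prefixes = ["", "re", "pre"]
    suffixes = ["", "s", "ed", "er", "ing"]
    return sorted({f"{p}{root}{s}" for p in prefixes for s in suffixes})

print(stem_variants("design"))
```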

  •          L
  • Link Condom
  • It is the rel="nofollow" attribute, which kills any link value that would otherwise be passed.

  •          M
  • Microsite
  • A microsite is a mini website designed specifically to promote a specific portion or brand of a larger corporate site. It is used for contests, as a landing page for a specific promotion, or to introduce the launch of a new product.

  •          N
  • Naked Links
  • A visible link posted in the text of a web page in which the URL itself is the anchor text; the URL is visible to the user, and clicking it takes them directly to that website.
  •          O
  • Outbound Link
  • Outbound links start from your site and lead to an external site, away from your site.

  •          P
  • Poison Word
  • A forbidden word that tends to create mistrust or trigger suspicion about a website in the eyes of a search engine.

  •          Q
  • Query
  • The keyword or keyword phrase, possibly combined with other syntax used to pass instructions to a search engine, that a searcher enters into a search field. It initiates a search and results in a SERP with organic and sponsored listings for finding a web page.

  •          R
  • Run of Site ROS
  • The scheduling of ads across an entire site, often at a lower cost than the purchase of specific pages or sub-sections of the site. A run-of-site ad campaign is rotated on all general, non-featured ad spaces on a site.

  •          S
  • Seed
  • The list of URLs from which a web crawler starts. The crawler visits these seeds and, as it visits each URL, identifies all the hyperlinks on the page and adds them to the list of URLs to visit, called the crawl frontier. URLs from the frontier are then visited in turn, according to a set of policies. (A minimal crawler sketch follows this entry.)
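A minimal sketch of the seed-and-frontier loop described above, assuming Python with the third-party requests and beautifulsoup4 packages; the seed URL and the page limit are placeholders.

```python
# Start from seed URLs, visit each page, and append newly found links to the
# crawl frontier. The seed URL and the 20-page cap are placeholders.
from collections import deque
from urllib.parse import urljoin
import requests
from bs4 import BeautifulSoup

frontier = deque(["https://example.com/"])   # the seeds
visited = set()

while frontier and len(visited) < 20:
    url = frontier.popleft()
    if url in visited:
        continue
    visited.add(url)
    try:
        html = requests.get(url, timeout=10).text
    except requests.RequestException:
        continue
    for a in BeautifulSoup(html, "html.parser").find_all("a", href=True):
        link = urljoin(url, a["href"])
        if link not in visited:
            frontier.append(link)            # grow the crawl frontier

print(len(visited), "pages visited")
```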

  •          T
  • The Fold
  • The “fold” is the point on your website where the page gets cut off by the bottom of a user’s monitor or browser window. Anything below the fold can be scrolled to, but isn't seen right away. Search engines place some priority on content above the fold, since it will be seen right away by new visitors. Having too many ads above the fold can be seen as a negative issue, too.
  •          U
  • URL
  • Uniform Resource Locator: the unique location of a file on the Internet.
  •          V
  • Vlog
  • A vlog is a video blog, that is, a blog whose entries are videos. The practice is also referred to as vlogging or vodcasting, by analogy with podcasting.
  •          W
  • Web Root
  • The root directory of a website on the web server (the document root), from which the site's files are served.


If you're like most SEOs, you spend a lot of time reading. Over the past several years, I've spent 100s of hours studying blogs, guides, and Google patents. Not long ago, I realized that 90% of what I read doesn't change what I actually do - that is, the basic work of ranking a web page higher on Google.
For newer SEOs, the process can be overwhelming.
To simplify this process, I created this SEO blueprint. It’s meant as a framework for newer SEOs to build their own work on top of. This basic blueprint has helped, in one form or another, 100s of pages and dozens of sites to gain higher rankings.
Think of it as an intermediate SEO instruction manual, for beginners.
Level: Beginner to Intermediate
Timeframe: 2 to 10 Weeks
What you need to know: The blueprint assumes you have basic SEO knowledge: you’re not scared of title tags, can implement a rel=canonical, and you’ve built a link or two. (If this is your first time to the rodeo, we suggest reading the Beginners Guide to SEO and browsing our Learn SEO section.)
How To Rank SEO Blueprint

Table of Contents


Keyword Research

1. Working Smarter, Not Harder

Keyword research can be simple or hard, but it should always be fun. For the sake of the Blueprint, let’s do keyword research the easy way.
The biggest mistakes people make with keyword research are:
  1. Choosing keywords that are too broad
  2. Keywords with too much competition
  3. Keywords without enough traffic
  4. Keywords that don’t convert
  5. Trying to rank for one keyword at a time
The biggest mistake people make is trying to rank for a single keyword at a time. This is the hard way. It’s much easier, and much more profitable, to rank for 100s or even 1,000s of long tail keywords with the same piece of content.
Instead of ranking for a single keyword, let’s aim our project around a keyword theme.

2. Dream Your Keyword Theme

Using keyword themes solves a whole lot of problems. Instead of ranking for one Holy Grail keyword, a better goal is to rank for lots of keywords focused around a single idea. Done right, the results are amazing.
Easy Keyword Research
I assume you know enough about your business to understand what type of visitor you're seeking and whether you're looking for traffic, conversions, or both. Regardless, one simple rule holds true: the more specifically you define your theme, the easier it is to rank.
This is basic stuff, but it bears repeating. If your topic is football, you'll find it hard to rank for "Super Bowl," but slightly easier to rank for "Super Bowl 2014" - and easier yet to rank for "Best Super Bowl Recipes of 2014."
Don't focus on specific words yet - all you need to know is your broad topic. The next step is to find the right keyword qualifiers.

3. Get Specific with Qualifiers

Qualifiers are words that add specificity to keywords and define intent. They take many different forms.
  • Time/Date: 2001, December, Morning
  • Price/Quality: Cheap, Best, Most Popular
  • Intent: Buy, Shop, Find
  • Location: Houston, Outdoors, Online
The idea is to find as many qualifiers as possible that fit your audience. Here's where keyword tools enter the picture. You can use any keyword tool you like, but favorites include Wordstream, Keyword Spy, SpyFu, the Bing Keyword Tool, and Übersuggest.
For speed and real-world insight, Übersuggest is an all-time SEO favorite. Run a simple query and export over 100 suggested keywords drawn from Google's own Autocomplete feature - that is, from actual Google searches.
Did I mention it’s free?
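As a quick illustration (my own sketch, not part of the original tool list), combining a theme with the qualifier types above gives you a starter list to feed into a keyword tool; the qualifier values below are placeholders.

```python
# Sketch: expand one broad theme with example qualifiers to seed keyword research.
theme = "super bowl recipes"
qualifiers = {
    "time/date": ["2014"],
    "price/quality": ["best", "easy"],
    "intent": ["make"],
}

seed_keywords = [theme]
for values in qualifiers.values():
    seed_keywords.extend(f"{q} {theme}" for q in values)

print("\n".join(seed_keywords))
```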

4. Finding Diamonds in the Google Rough

At this point you have a few dozen, or a few hundred, keywords to pull into the Google AdWords Keyword Tool.
Pro Tip #1: While it’s possible to run over a hundred keyword phrases at once in Google’s Keyword Tool, you get more variety if you limit your searches to 5-10 at a time.
Ubersuggest and Google Keyword Tool
Using "Exact" search types and "Local Monthly" search volume, we're looking for 10-15 closely related keyword phrases with decent search volume, but not too much competition.
Pro Tip #2: Be careful trusting the "Competition" column in the Google AdWords Keyword Tool. It refers to competition for bids on paid search terms, not organic search.

5. Get Strategic with the Competition

Now that we have a basic keyword set, you need to find out if you can actually rank for your phrases. You have two basic methods of ranking the competition:
  1. Automated tools like the Keyword Difficulty Tool
  2. Eyeballing the SERPs
If you have an SEOmoz PRO membership (or even a free trial) the Keyword Difficulty Tool calculates – on a 100 point scale – a difficulty score for each individual keyword phrase you enter.
Keyword Difficulty Tool
Keyword phrases in the 60-70+ range are typically competitive, while keywords in the 30-40 range might be considered low to moderately difficult.
To get a better idea of your own strengths, take the most competitive keyword you currently rank #1 or #2 for, and run it through the tool.
Even without automated tools, the best way to size up the competition is to eyeball the SERPs. Run a search query (non-personalized) for your keywords and ask yourself the following questions:
  • Are the first few results optimized for the keyword?
  • Is the keyword in the title tag? In the URL? On the page?
  • What’s the Page and/or Domain Authority of the URL?
  • Are the first few results authorities on the keyword subject?
  • What’s the inbound anchor text?
  • Can you deliver a higher quality resource for this keyword?
You don’t actually have to rank #1 for any of your chosen words to earn traffic, but you should be comfortable cracking the top five.
With keyword themes, the magic often happens from keywords you never even thought about.

Case Study: Google Algo Update

When SEOmoz launched the Google Algorithm Change History (run by Dr. Pete), we used a similar process for keyword research to explore the theme "Google Algorithm" and, more specifically, "Google Algorithm Change."
According to Google's search tool, we could expect no more than a couple thousand visits a month - best case - for these exact terms. Fortunately, because the project was well received and because we optimized around a broad keyword theme of "Google Algorithm," the Algo Update receives lots of traffic outside our pre-defined keywords.
This is where the long tail magic happens:
Long Tail Keywords
How can you improve your chances of ranking for more long tail keywords? Let’s talk about content, architecture, on-page optimization and link building.

Content

6. Creating Value

Want to know the truth? I hate the word content. It implies words on a page, a commodity to be produced, separated from the value it creates.
Content without value is spam.
In the Google Algorithm Update example above, we could have simply written 100 articles about Google’s Algorithm and hoped to rank. Instead, the conversation started by asking how we could create a valuable resource for webmasters.
For your keyword theme, ask first how you can create value.
Value is harder to produce than mere words, but value is rewarded 100x more. Value is future proof & algorithm proof. Value builds links by itself. Value creates loyal fans.
Value takes different forms. It’s a mix of:
  1. Utility
  2. Emotional response
  3. Point of view (positive or negative)
  4. Perceived value, including fame of the author
Your content doesn’t have to include all 4 of these characteristics, but it should excel in one or more to be successful.
A study of the New York Times found key characteristics of content to be influential in making the Most Emailed list.

7. Driving Your Content Vehicle

Here’s a preview: the Blueprint requires you create at least one type of link bait, so now is a good time to think about the structure of your content.
What’s the best way to deliver value given your theme? Perhaps it’s an
  • Infographic
  • Video series
  • A new tool
  • An interview series
  • Slide deck
  • How-to guide
  • Q&A
  • Webinar or simple blog post
Perhaps, it’s all of these combined.
The more ways you find to deliver your content and the more channels you take advantage of, the better off you’ll be.
Not all of your content has to go viral, but you want to create at least one “tent-pole” piece that’s better than anything else out there and you’re proud to hang your hat on.

8. Title – Most Important Work Goes Here

Spend two hours, minimum, writing your title.
Sound ridiculous? If you're an experienced title writer like Rand Fishkin, you can break this rule. For the rest of us, it's difficult to overstate the value delivered by a finely crafted title.
Write 50 titles or more before choosing one.
Study the successful titles on Inbound.org, Mashable, Wired, or your favorite publication.
Headline Formulas Work

9. Length vs. Depth - Why it Matters

How long should your content be? A better question is: How deep should it be? Word count by itself is a terrible metric to strive for, but depth of content helps you to rank in several ways.
  1. Adds uniqueness threshold to avoid duplicate content
  2. Deeper topic exploration makes your content “about” more
  3. Quality, longer content is correlated with more links and higher rankings
I. Uniqueness
At a minimum, your content needs to meet a minimum uniqueness threshold in order for it to rank. Google reps have gone on record to say a couple sentences is sometimes sufficient, but in reality a couple hundred words is much safer.
II. Long Tail Opportunities
Here’s where the real magic happens. The deeper your content and the more in-depth you can explore a particular topic, the more your content becomes “about.”
The more your content is “about”, the more search queries it can answer well.
The more search queries you can answer well, the more traffic you can earn.
Google's crawlers continuously read your content to determine how relevant it is to search queries. They evaluate paragraphs, subject headings, photographs, and more to try to understand your page. Longer, in-depth content usually sends more relevancy signals than a couple of short sentences.
III. Depth, Length, and Links
Numerous correlation studies have shown a positive relationship between rankings and number of words in a document.

“The length in HTML and the HTML within the <body> tag were the highest correlated factors, in fact with correlations of .12 they could be considered somewhat if not hugely significant.

While these factors probably are not implemented within the algorithm, they are good signs of what Google is looking for; quality content, which in many cases means long or at least sufficiently lengthy pages.”

 - Mark Collier The Open Algorithm
This could be attributed to longer, quality content earning more links. John Doherty examined the relationship between the length of blog posts on SEOmoz and the number of links each post earned, and found a strong relationship.

Links based on wordcount

10. Content Qualities You Can Bank On

If you don’t focus on word count, how do you add quality “depth” to your content?
SEOs have written volumes about how Google might define quality including metrics such as reading level, grammar, spelling, and even Author Rank. Most is speculation, but it’s clear Google does use guidelines to separate good content from bad.
My favorite source for clues comes from the set of questions Google published shortly after the first Panda update. Here are a few of my favorites.

Google Panda Questions

11. LDA, nTopic, and Words on the Page

Google is a machine. It can’t yet understand your page like a human can, but it’s getting close.
Search engines use sophisticated algorithms to model your sentences, paragraphs, blocks, and content sections. They want to understand not only your keywords, but also your topic, intent, and expertise.
How do you know if your content fits Google’s model of expectations?
For example, if your topic is “Super Bowl Recipes,” Google might expect to see content about grilling, appetizers, and guacamole. Content that addresses these topics will likely rank higher than pages that talk about what color socks you’re wearing today.
Words matter.
SEOs have discovered that using certain words around a topic, ones associated with concepts like LDA and nTopic, is correlated with higher rankings.
Virante offers an interesting stand-alone keyword suggestion tool called nTopic. The tool analyzes your keywords and suggests related keywords to improve your relevancy scores.
nTopic

12. Better than LDA - Poor Man's Topic Modeling

Since we don’t have access to Google’s computers for topic modeling, there’s a far simpler way to structure your content that I find far superior to worrying about individual words:
Use the keyword themes you created at the beginning of this blueprint.
You’ve already done the research using Google’s keyword tool to find closely related keyword groups. Incorporating these topics into your content may help increase your relevancy to your given topic.
Example: Using the Google Algorithm project cited above, we found during keyword research that certain keywords related to our theme show up repeatedly, time and time again. If we conducted this research today, we would find phrases like “Penguin SEO” and “Panda Updates” frequently in our results.
Google suggests these terms via the keyword tool because they consider them closely related. So any content that explored “Google Algorithm Change” might likely include a discussion of these ideas.
Poor Man's Topic Modeling
Note: This isn't real LDA, simply a way of adding relevant topics to your content that Google might associate with your subject matter.
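Here is a hedged sketch of that idea in Python: score a draft against the closely related phrases you found during keyword research and see which ones you haven't covered yet. The phrase list and draft text are placeholders.

```python
# Sketch: check which researched, closely related phrases a draft already covers.
related_phrases = ["panda update", "penguin seo", "ranking factors", "link building"]

draft = """Google algorithm change history, including notes on the Panda update
and the ranking factors it appeared to target..."""

text = draft.lower()
covered = [p for p in related_phrases if p in text]
missing = [p for p in related_phrases if p not in text]

print(f"Covered {len(covered)}/{len(related_phrases)} related phrases")
print("Consider working in:", ", ".join(missing) or "nothing - all covered")
```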

13. Design Is 50% of the Battle

If you have any money in your budget, spend it on design. A small investment with a designer typically pays outsized dividends down the road. Good design can:
  • Lower bounce rate
  • Increase page views
  • Increase time on site
  • Earn more links
  • Establish trust
… All of which can help earn higher rankings.

“Design doesn’t just matter, it’s 50% of the battle.”
-Rand Fishkin

Dribbble.com
Dribbble.com is one of our favorite sources of design inspiration.

Architecture

Here’s the special secret of the SEO Blueprint: you’re not making a single page to rank; you’re making several.

14. Content Hubs

Very few successful websites consist of a single page. Google determines context and relevancy not only by what’s on your page, but also by the pages around it and linking to it.
The truth is, it’s far easier to rank when you create Content Hubs exploring several topics in depth focused around a central theme.
Using our “Super Bowl Recipes” example, we might create a complete section of pages, each exploring a different recipe in depth.
Content Hub for SEO

15. Linking the Hub Together

Because your pages now explore different aspects of the same broad topic, it makes sense to link them together.
  • Your page about guacamole relates to your page about nachos.
  • Your page about link building relates to your page about infographics.
  • Your page about Winston Churchill relates to major figures of World War II.
Linking Your Content Hub
It also helps them to rank by distributing PageRank, anchor text, and other relevancy signals.

16. Find Your Center

Content Hubs work best with a “hub” or center. Think of the center as the master document that acts as an overview or gateway to all of your individual content pages.
The hub is the authority page. Often, the hub is a link bait page or a category-level page. It's typically the page with the most inbound links and often serves as a landing page for other sections of your site.
Center of the SEO  Content Hub
For great examples of Hub Pages, check out:

On-Page Optimization

17. Master the Basics

You could write an entire book about on-page optimization. If you're new to SEO, one of the best ways to learn is by using SEOmoz's On-Page Report Card (free, registration required). The tool grades 36 separate on-page SEO elements and gives you a report with suggestions on how to fix each one. Working your way through these issues is an excellent way to learn (and is often used by agencies and companies as a way to teach SEO principles).
On-Page Tool
Beyond the basics, let’s address a few slightly more advanced tactics to take advantage of your unique keyword themes and hub pages, in addition to areas where beginners often make mistakes.

18. Linking Internally for the Reasonable Surfer

Not all links are created equal (the subject of one of the greatest SEO blog posts ever written!). So when you interlink the internal pages within your content hub, keep in mind a few important points.
  1. Links from inside unique content pass more value than navigation links.
  2. Links higher up the page pass more value than links further down.
  3. Links in HTML text pass more weight than image links.
When interlinking your content, it’s best to keep links prominent and “editorial” – naturally link to your most important content pages higher up in the HTML text.

19. Diversify Your Anchor Text - Naturally

If Google’s Penguin update taught us anything, it’s that over-thinking anchor text is bound to get us in trouble.
When you link naturally and editorially to other places on the web, you naturally diversify your anchor text. The same should hold true when you link internally.
Don’t choose your anchor text to fit your keywords; choose your anchor text to fit the content around it.
Practically speaking, this means linking internally with a mix of partial-match keywords and related phrases. Don't be scared to link occasionally without good keywords in the anchor - the link can still pass relevancy signals. When it comes to linking, it's safer to under-do it than over-do it.
Choose Descriptive Anchor Text
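One practical way to keep yourself honest, sketched below in Python, is to tally the anchor text of the links pointing at a page and flag any single phrase that dominates; the anchor list and the 30% threshold are assumptions for illustration, not a rule from Google.

```python
# Sketch: tally anchor-text usage and flag phrases that make up too large a share.
from collections import Counter

anchor_texts = [                      # placeholder data pulled from a backlink report
    "seo blueprint", "read the full guide", "seo blueprint", "this post",
    "seo blueprint", "keyword research tips", "seo blueprint",
]

counts = Counter(anchor_texts)
total = len(anchor_texts)
for text, count in counts.most_common():
    share = count / total
    flag = "  <- consider diversifying" if share > 0.3 else ""
    print(f"{text!r}: {count} ({share:.0%}){flag}")
```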

20. Title Tags - Two Quick Tips

We assume you know how to write a compelling title tag. Even today, keyword usage in the title tag is one of the most highly correlated on-page ranking factors that we know.
That said, Google is getting strict about over-optimized title tags, and appears to be further cracking down on titles "written for SEO." Keep this in mind when crafting your title tags.
I. Avoid boilerplates
It used to be common to tack on your business phrase or main keywords to the end of every title tag, like so:
  • Plumbing Supplies – Chicago Plumbing and Fixtures
  • Pipes & Fittings – Chicago Plumbing and Fixtures
  • Toilet Seat Covers – Chicago Plumbing and Fixtures
While we don't have much solid data, many SEOs now assert that "boilerplate" titles tacked on to the end of every tag are no longer a good idea. Brand names and unique descriptive information are okay, but making every title as unique as possible is the rule of the day.
II. Avoid unnecessary repetition - Google also appears (at least to many SEOs) to be lowering the threshold for what counts as "keyword stuffing."
In years past, a common rule of thumb was never to repeat your keyword more than twice in the title. Today, to be on the safe side, it may be best not to repeat your keywords more than once.

21. Over-Optimization: Titles, URLs, and Links

Writing for humans not only gets you more clicks (which can lead to higher rankings), but hardly ever gets you in trouble with search engines.
As SEOs we're often tempted to get a "perfect score," which means exactly matching our title tags, URLs, inbound anchor text, and more. Unfortunately, this isn't natural in the real world, and Google recognizes this.
Diversify. Don’t over-optimize.

22. Structured Data

Short and simple: make structured data part of every webpage. While structured data hasn't yet proven to be a large ranking factor, its future-facing value can be seen today in rich-snippet SERPs and social media sharing. In some verticals, it's an absolute necessity.
rich snippets
There’s no rule of thumb about what structured data to include, but the essentials are:
  • Facebook Open Graph tags
  • Twitter Cards
  • Authorship
  • Publisher
  • Business information
  • Reviews
  • Events
To be honest, if you’re not creating pages with structured data, you’re probably behind the times.
For an excellent guide about Micro Data and Schema.org, check out this fantastic resource from SEOGadget.
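As one small, hedged example (JSON-LD is a common way to serialize schema.org data; the business details below are placeholders and not a complete or required set of properties), a page's structured data could be generated like this:

```python
# Sketch: emit schema.org LocalBusiness data as a JSON-LD <script> block.
import json

local_business = {
    "@context": "https://schema.org",
    "@type": "LocalBusiness",
    "name": "Chicago Plumbing and Fixtures",   # placeholder business details
    "url": "https://example.com/",
    "telephone": "+1-555-0100",
    "address": {
        "@type": "PostalAddress",
        "addressLocality": "Chicago",
        "addressRegion": "IL",
    },
}

print(f'<script type="application/ld+json">\n{json.dumps(local_business, indent=2)}\n</script>')
```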

Building Links

23. The 90/10 Rule of Link Building

This blueprint contains 25 steps to rank your content, but only the last three address link building. Why so few? Because 90% of your effort should go into creating great content, and 10% into link building.
If you have a hard time building links, it may be because you have these numbers reversed.
Creating great content first solves a ton of problems down the line:
  1. Good content makes link building easier
  2. Attracts higher quality links in less time
  3. Builds links on its own, even while you sleep or are on vacation
If you’re new to marketing or relatively unknown, you may need to spend more than 10% of your time building relationships, but don’t let that distract you from crafting the type of content that folks find so valuable they link to you without you even asking.
90-10 Rule of Link Building

24. All Link Building is Relationships - Good & Bad

This blueprint doesn't go into link building specifics, as there are 100's of ways to build quality links to every good project. That said, here are a few of my must-have link building resources:
  1. Jon Cooper's Complete List of Link Building Strategies
  2. StumbleUpon Paid Discovery
  3. Citation Labs
  4. Promoted Tweets
  5. Ontolo
  6. eReleases - Press releases, not for links, but for exposure
  7. BuzzStream
  8. Paddy Moogan's excellent Link Building Book
These resources give you the basic tools and tactics for a successful link building campaign, but keep in mind that all good link building is relationship building.
Successful link builders understand this and foster each relationship and connection. Even a simple outreach letter can be elevated to an advanced form of relationship building with a little effort, as this Whiteboard Friday by Rand so graciously illustrates.
 
  


25. Tier Your Link Building... Forever

The truth is, for professionals, link building never ends. Each content and link building campaign layers on top of previous content, and on the web as a whole, like layers of fine Greek baklava.
For example, this post could be considered linkbait for SEOmoz, but it also links generously to several other content pieces within the Moz family, and externally as well; spreading both the link love and the relationship building as far as possible at the same time.
SEOmoz links generously to other sites: the link building experience is not just about search engines, but the people experience, as well. We link to great resources, and build links for the best user experience possible. When done right, the search engines reward exactly this type of experience with higher rankings.
For an excellent explanation as to why you should link out to external sites when warranted, read AJ Kohn's excellent work, Time to Long Click.
One of my favorite posts on SEOmoz was 10 Ugly SEO Tools that Actually Rock. Not only was the first link on the page directed to our own SEO tools, but we linked and praised our competitors as well.
Linkbait at its finest.
About Cyrus Shepard — Cyrus Shepard works on content and audience development for SEOmoz. Follow him on Twitter and Google+