Tenured (Evergreen) SEO

Part Of: Full Stack SEO Best Practices
Published:
Updated:
by Wayne Smith

Tenured SEO refers to someone who's mastered multiple techniques to get web pages to rank. They have expertise in optimizing for different search algorithms and have seen the landscape change over time. They've likely faced setbacks, learned from failures, and refused to give in to the idea that SEO is dead. To stay on top, they likely run testing sites, try new strategies, and keep up with the latest search algorithm updates.

SEO efforts that are sustained and use more than a single factor for relevancy and authority are more likely to succeed and stand the test of time.

What are the best evergreen algorithms for ranking documents?



The history of SEO and the history of algorithms for online document retrieval systems are linked. The SEO point of view offers a front-row seat to who pioneered or proved an algorithm at scale, not who invented it.

Tenured SEO - Historic POV

The battle for companies to be the internet's number one search engine has produced many innovations and algorithms.

Yahoo started out as a curated directory of websites, where a team of editors manually reviewed and organized sites into categories. Yahoo allowed websites to be listed in their search engine for free. They would create a title and description for your site.

Later on, Yahoo introduced a paid service that could speed up the process of getting your site listed. In contrast, Google today has rules in place that prohibit paid links. Their rules are outlined in their terms of service.

Yahoo search was based on metadata only, but to scale they broke the database down into keyword indexes, then merged the keyword indexes with either an "all" or "any" match depending on the user's preference.

Yahoo's innovations to search include a data center where the indexed data was spread across systems on a keyword-by-keyword basis. In short, this innovation allows search engines to scale ... a general requirement for all internet search engines.

Yahoo pioneered innovations

  • Data center with indexes based on the keyword. Each index could exist on a different computer.

While the search and directory were alphabetical, they pioneered the data center for internet searching by breaking the database down into keyword indexes.

The structure of a search for "apple pie," where one would open the "apple" index and the "pie" index and merge the records that exist in both, could be and was expanded on by others. It allows for multiple ranking factors.
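As a rough illustration of that index-merge structure, here is a minimal sketch using a toy in-memory index; the site names and function are invented for illustration, not Yahoo's actual system.

# Minimal sketch of merging per-keyword indexes, assuming a toy
# in-memory index. Names and data are illustrative, not Yahoo's system.

index = {
    "apple": {"site-1", "site-4", "site-9"},
    "pie":   {"site-4", "site-7", "site-9"},
}

def search(keywords, mode="all"):
    """Open one index per keyword and merge the record sets."""
    sets = [index.get(k, set()) for k in keywords]
    if not sets:
        return set()
    if mode == "all":            # records present in every keyword index
        return set.intersection(*sets)
    return set.union(*sets)      # "any": records present in at least one index

print(search(["apple", "pie"], mode="all"))   # {'site-4', 'site-9'}
print(search(["apple", "pie"], mode="any"))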

  • Create site to pass the human qualifier at Yahoo
  • Use keywords in title and description

It can be said that stuffing metadata with keywords began with Yahoo, although the information was submitted via a form and did not use the on-page metadata.

Having the keyword in the metadata would result in the site showing up in a search for that keyword.

Infoseek pioneered using a raw (external) link count to rank web pages. It initially used Yahoo's links as a seed database, but crawled the web. It used a data center similar to what Yahoo pioneered, but instead of presenting sites alphabetically in the natural order of its database, it ordered and presented sites based on the number of links pointing toward the page.
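A minimal sketch of that ordering idea follows, assuming an invented link graph; it simply counts inbound links and sorts candidates by the raw count, which is an illustration rather than Infoseek's actual code.

# Illustrative sketch: order pages by raw inbound-link count.
# The link graph below is invented; it is not Infoseek data.

from collections import Counter

link_graph = {                      # page -> pages it links out to
    "siteA.com/page": ["target.com/pie", "other.com"],
    "siteB.com/page": ["target.com/pie"],
    "siteC.com/page": ["other.com"],
}

inbound = Counter(dst for links in link_graph.values() for dst in links)

def rank(candidates):
    """Return candidate pages ordered by how many links point to them."""
    return sorted(candidates, key=lambda url: inbound[url], reverse=True)

print(rank(["target.com/pie", "other.com"]))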

Infoseek was a major internet search engine but not a very popular one; the search results were not very volatile, since the raw count of links pointing towards a page does not change very much or very fast. Once somebody used Infoseek they had their answer, and trying to find new pages with additional information was difficult. It did not have the advanced search features of Altavista.

Links were King

Pages were qualified based on content, mostly keywords in the title, description, and on the page. However, I should stress that I did not optimize for the Infoseek search engine; it did not drive a lot of traffic. But once a page was in Infoseek, it could change content and rank for a different term. It was more like what is called "pay to play" today -- get links and the site was indexed (it was as simple as that) -- but the ROI often didn't work out for the traffic received.

Link Farms Became King

Link farms would be considered spam today. By setting up multiple sites with links pointing towards the page one wanted to rank, the page could be pushed up the search results pages.

Infoseek pioneered innovations

Using Yahoo's concept of a data center with an index for each keyword as a best practice, Infoseek pioneered doing the heavy number crunching to build the database into a natural order based on a relevancy factor -- links.

Infoseek was slow to change; a search in February might have the same sites in the same order as they were in January. SEO for Infoseek was a long game: a site that got links would eventually show up in Infoseek.

  • Links are the ranking factor - Get site listed in Yahoo - Wait.
  • Metadata and content are qualifiers, close to what worked in Altavista or Inktomi.

Due to the slowness of Infoseek in updating listings and its level of traffic, it did not attract nearly the level of SEO interest that Hotbot (Inktomi), Excite, or even Altavista did. The major difference Infoseek had was using external links. The ROI for creating pages specifically designed for search engines fueled the SEO industry, and pages were created specifically for Infoseek.

  • Link farms and link exchanges were the main method the emerging SEO industry used to add Infoseek to the portfolio of search engines listing a site.
  • CMS systems added a "powered by" link to themes to provide a link back to the author's site.

Altavista was a small search engine by today's standards. AV peaked in 2002 with 1,000 million web pages, but for most of its existence it was in the tens of millions of web pages.

AV was a full-page search engine (unlike Yahoo, which was a search of metadata), but usage was complicated.

Complex Search Queries

all:(pie, apple) + nearby:(pie, apple) + any:(recipe, "how to make")

This would present results having both apple and pie; apple and pie would be near each other in the document; and either "recipe" or "how to make" would be on the page. Just entering "apple, pie, recipe" would present pages that contain all the words. Symbols such as +, -, ~, (), and | could be used. "~" meant nearby. "|" meant either, or "or". The "-" would exclude pages containing the term. The "()" would be used to nest logical search queries. So the search could be simplified to:

(apple ~ pie) + (recipe | "how to make")

Nonetheless, with a small set of pages it was still a challenge to find the perfect apple pie recipe. But drilling into Yahoo, one would find a recipe site that may or may not have an apple pie recipe. Yahoo did not need a college course on how to find pages on the internet.
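As a rough sketch of the query semantics described above, the snippet below evaluates the simplified query against a single document; the ten-word "nearby" window is an assumption for illustration, not Altavista's actual rule.

# Rough sketch of the query semantics described above, applied to one
# document. The 10-word "nearby" window is an assumption, not AV's rule.

def words(text):
    return text.lower().split()

def has(doc, term):
    return term.lower() in doc.lower()

def nearby(doc, a, b, window=10):
    w = words(doc)
    pos_a = [i for i, t in enumerate(w) if t == a]
    pos_b = [i for i, t in enumerate(w) if t == b]
    return any(abs(i - j) <= window for i in pos_a for j in pos_b)

def matches(doc):
    # (apple ~ pie) + (recipe | "how to make")
    return nearby(doc, "apple", "pie") and (has(doc, "recipe") or has(doc, "how to make"))

doc = "A simple apple pie recipe with a flaky crust."
print(matches(doc))  # True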

Altavista's Algo - Content was King

Putting aside the complex search query, the algorithm favored the keyword appearing in the title, then the description, then the page. Anything beyond three appearances of the keyword seemed to have zero effect on the results. Using headline tags may have played a part in optimizing for the Altavista algo; however, keyword stuffing did not work.

No consideration was given to links pointing towards a page. A new page, long page, and short page were all on the same level -- Content was King.
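A minimal sketch of that weighting follows. The cap of three occurrences matches the observation above, but the numeric weights are assumptions for illustration only.

# Sketch of the weighting described above: title outranks description,
# which outranks body, and occurrences beyond three add nothing.
# The numeric weights are assumptions, not Altavista's actual values.

def count_capped(text, term, cap=3):
    return min(text.lower().count(term.lower()), cap)

def score(page, term):
    return (count_capped(page["title"], term) * 5
            + count_capped(page["description"], term) * 3
            + count_capped(page["body"], term) * 1)

page = {"title": "Apple Pie", "description": "An apple pie recipe.",
        "body": "Bake the apple pie for an hour ..."}
print(score(page, "apple pie"))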

Altavista's Innovations

Finding keywords that are near each other, although it may have had limited scalability for Altavista, presented results that often matched what the user was looking for. The "exact match" has also remained a critical element of the search engine user interface.

  • Advanced queries allowing people to find results based on keywords on the page.

Altavista White Hat SEO

  • Keyword research.
  • Content creation with keywords near each other, (within the same HTML tag), and exact match.
  • Using keywords in headlines was a factor and aided with bounce back.

By the time Inktomi (hotbot.com) became a major search engine, it drove more traffic to sites than Altavista. Inktomi provided web search for Yahoo, and the ROI for optimizing or creating pages for Inktomi made sense.

Altavista Gray Hat SEO

Few people created pages specifically for Altavista.

  • Title, Description, Headlines and using the word on the page were the main attributes for Altavista content pages.

Inktomi researched what people needed from an internet search engine before they began. They looked at internet search as a business opportunity. Cost efficiency and the need to scale were critical factors in the design.

  • Easy to use
  • Some of the content found on Altavista used the keyword only once and had no content beyond mentioning it.
  • Query until the results have enough sites (~2,000) and then stop going deeper in the index. One could not see the 5,046th site in Inktomi. This saves resources and allows more users to use the search.

To address the relevancy issue, Inktomi focused on using HTML tags like headlines. Keyword density or frequency indicated to Inktomi what the page was about. These factors could be indexed before the search (like Yahoo), and these indexes could now carry a type of schema for all the ranking data as part of the keyword index.
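A minimal sketch of a keyword index whose postings carry precomputed attributes follows. The field names, ordering, and the placement of the ~2,000-result cutoff are assumptions for illustration, not Inktomi's actual schema.

# Sketch of a keyword index whose entries carry precomputed on-page
# attributes (a "schema" per posting). Field names and ordering are
# assumptions for illustration, not Inktomi's actual design.

keyword_index = {
    "apple pie": [
        {"url": "bakery.example/pie", "density": 0.04, "in_title": True,  "in_headline": True},
        {"url": "blog.example/post",  "density": 0.01, "in_title": False, "in_headline": True},
    ],
}

def rank(term, limit=2000):
    postings = keyword_index.get(term, [])
    scored = sorted(postings,
                    key=lambda p: (p["in_title"], p["in_headline"], p["density"]),
                    reverse=True)
    return scored[:limit]          # stop at ~2,000 results, as described above

for p in rank("apple pie"):
    print(p["url"])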

Content was King

Inktomi was a level playing field for websites. It did not take into account (at least in the early days) links or the size of the site. If the page content was about a topic, it could rank.

Inktomi pioneered Innovations

  • Natural Order of database by Keyword for scalability - pioneered by Yahoo
  • Extended keyword index to include additional attributes -
    On-page content is more than qualifiers, becoming ranking factors
    Keyword density or keyword frequency becomes a ranking factor.
  • Above the fold content as a keyword zone
  • Used Semantic HTML tags (keyword zones)
    Headline tags became a ranking factor.
    Anchor text as a ranking factor.
  • Identified white text on a white background as a negative factor

Inktomi may have created the SEO industry

Inktomi's approach to the search engine as a business ... and contracts with other major sites allowing them to interface with Inktomi's search results. Inktomi's index became a major source of traffic for websites. The ROI for creating SEO content was measurable and better than other forms of advertising.

Inktomi Pale Hat SEO or $earch Engine Friendly Content

Creating content specifically for search engines is a little off-white.

  • Search Engine Friendly included proper meta title, description, and keyword tags.
  • Headline tags
  • On the page keyword density
  • The page needed to be about the keyword in the title and keywords tags

Inktomi Grey Hat SEO

  • Stuff keywords and place white keywords on a white background.

Inktomi Black Hat SEO

  • Supply one version of the page to Inktomi, which had more keywords, and supply another version of the page to browsers which did not have keyword stuffing.

Keyword Stuffing - Spam

Marketers discovered that by adding keywords to a page their site/page would rank higher in search. These could also be white lettering on a white background.

Reading CSS and rendering the page

For Inktomi to remove some types of spam pages, it needed to look at the CSS style sheets to determine if white text was appearing on a white background, which was a method of spamming.

More Inktomi pioneered Innovations

  • CSS to determine and/or qualify keyword zones.
    Inktomi was able to determine the size of text and content blocks and determine if the content was above the fold. Using this information they further refined their search results.
  • Normalized keyword densities so that more than a normal density of keywords did not increase ranking (sketched below)
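A minimal sketch of those two checks, assuming simplified CSS styles and an invented density ceiling; neither the threshold nor the data structures are Inktomi's actual implementation.

# Sketch of two of the checks listed above: flag text whose CSS color
# matches its background, and cap (normalize) keyword density so stuffing
# stops helping. The ceiling and structures are assumptions.

def hidden_text(style):
    """Flag white-on-white (or any same-on-same) text blocks."""
    return style.get("color") == style.get("background-color")

def normalized_density(term_count, word_count, ceiling=0.05):
    density = term_count / max(word_count, 1)
    return min(density, ceiling)   # occurrences beyond the ceiling add nothing

print(hidden_text({"color": "#fff", "background-color": "#fff"}))  # True
print(normalized_density(term_count=40, word_count=200))           # capped at 0.05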

Excite learned from Inktomi but saw the opportunity for a search engine to be a profitable business. Their vision, however, was not to index every page on the internet, just the pages that were useful for people to find the information they were looking for. People at that time were "surfing the internet" from one page, mostly Yahoo, to the next; in short, the Excite search engine favored both destination content authority sites and pages that led to many authority sites or content destinations. These silos, as they came to be known, meant Excite could index the curated silo page instead of the fifty or more other pages.

Excite's mission was to become the number one search engine for the internet. They ran advertisements, and traffic exceeded the majority of search engines, but Inktomi/Hotbot had business deals with AOL and Yahoo for web search. They would have made it to the number one search engine if not for the growth of Google.

I never tenured beyond Excite's silos ... there were algorithms for the authority pages ... The silos were a level playing field. A page could be created with sub-pages or as a topical theme, be submitted, and then appear at the top of the Excite search in a day or two. People would visit the page, click the link or banner, and then purchase the product. It was an ideal time to be an affiliate marketer.

Pages were dropped regularly, normally on a monthly schedule. The pages linked to from an index page would normally be indexed during this schedule and could promote or demote the page that had the links, depending on relevancy.

Excite pioneered Innovations

  • Natural order of the database by keywords.
  • Extended keyword index to include anchor text of links on the page, as a ranking factor
  • Used keyword density with an upper limit to avoid spam, but no upper limit for anchor text (see the sketch after this list)
  • Focused on the top portion of the HTML file as an above-the-fold keyword zone
  • Metadata title and description were qualifiers
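A minimal sketch of how the factors above could combine: capped on-page density, uncapped anchor-text mentions, and a bonus for the top of the HTML file. The weights and cap are assumptions for illustration, not Excite's actual algorithm.

# Sketch of the factors listed above. Weights, cap, and the 1,000-character
# "above the fold" cutoff are assumptions, not Excite's actual values.

DENSITY_CAP = 0.05

def score(page, term):
    body = page["body"].lower()
    word_count = max(len(body.split()), 1)
    density = min(body.count(term) / word_count, DENSITY_CAP)            # capped
    anchor_hits = sum(term in a.lower() for a in page["anchor_texts"])   # no upper limit
    above_fold = term in body[:1000]                                     # top of the HTML file
    return density * 100 + anchor_hits * 2 + (5 if above_fold else 0)

page = {
    "body": "Apple pie recipes and baking tips. " * 20,
    "anchor_texts": ["apple pie recipe", "pie crust basics", "apple pie photos"],
}
print(score(page, "apple pie"))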

SEO: While not tenured on all aspects of their algo, creating topical clusters or silos was a practice. Jump-to links (url#) were rejected (tested). All links on a page needed to go to a content page, which needed to exist and have relevance to the topic. Linking to 404 pages would cause pages to be dropped from the index. Duplicate content was dropped. Flooding the search results page with more than one page per site could cause the page or pages to be dropped. Obvious spam or keyword stuffing could cause the page to be dropped.

Cloaking source code was frowned on but not a violation of the terms of service as long as the content matched ... cloaking was used by major brands so that others could not copy their SEO content.

Direct Hit's vision of the ideal search was to list the sites it indexed first in the search results. They learned from Yahoo's, Inktomi's, and Infoseek's database structures, and used a natural order for indexed pages based on what pages were clicked on for what term.

Clearly, such a search requires the URL listed first to maintain a minimal click-through rate, and it requires checking for click spam -- an early tier of detecting fake clicks for advertising. Pages that could not maintain interest were demoted, and new pages would surface.
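A minimal sketch of that click-through ordering with a minimal-CTR check follows; the threshold and the click counts are invented for illustration, not Direct Hit's actual data or rules.

# Sketch of click-through ordering with a minimal-CTR check, as described
# above. The threshold and click log are invented for illustration.

MIN_CTR = 0.05   # assumed floor needed to hold a top spot

pages = [
    {"url": "recipes.example/pie", "impressions": 1200, "clicks": 300},
    {"url": "bait.example/pie",    "impressions": 1500, "clicks": 30},
]

def ctr(p):
    return p["clicks"] / max(p["impressions"], 1)

# Demote pages that cannot maintain the minimum click-through rate,
# then order the rest by CTR so new pages can surface.
kept    = [p for p in pages if ctr(p) >= MIN_CTR]
demoted = [p for p in pages if ctr(p) < MIN_CTR]
ranking = sorted(kept, key=ctr, reverse=True) + demoted

for p in ranking:
    print(p["url"], round(ctr(p), 3))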

Direct Hit pioneered Innovations

  • Natural order of the database by keyword clicks.
  • Methodologies to detect fake or robotic clicks.
  • Methodologies to detect high bounce back resulting from link-bait meta content
  • Metadata title and description were qualifiers

SEO was basic click-through rate optimization, on-page above-the-fold optimization for human visitors, and conversion optimization.



... Solution Smith tests SEO tactics so you don't have to ...

Full-stack SEO has a lot of moving parts, and it takes time and effort to understand the nuances involved. Solution Smith takes care of the overhead costs associated with digital marketing resulting in real savings in terms of time and effort.