Tenured Evergreen SEO Factors

by Wayne Smith

Tenured Evergreen SEO refers to someone who's mastered multiple techniques to get web pages to rank. They have expertise in optimizing for different search algorithms and have seen the landscape change over time. They've likely faced setbacks, learned from failures, and refused to give in to the idea that SEO is dead. To stay on top, they may have testing sites, use many strategies, and keep up with the latest search algorithm updates.

Tenured SEO professionals know document retrieval systems can work in many ways. They often employ an Evergreen SEO strategy to ensure that when (not if) SEO factors change, they have a page covering that factor. Based on their experience, they don't claim one SEO tactic to rule them all.

Evergreen SEO efforts that are sustained and use more than a single factor for relevancy and authority are more likely to be successful and endure.

What are the best methods for evergreen SEO?

Evergreen SEO factors are the ones search engines have proven, over time, to produce the best search results.

Evergreen SEO

Search engine algorithms have evolved to improve their results. Tenured SEO techniques evolve based on evergreen algorithms. Evergreen algorithms have proven their worth over time and remain fundamental aspects of SEO that endure updates year after year.

Exact keyword match domains are an evergreen SEO factor and have existed since Yahoo search ... They fall under navigational intent; according to a Penn State study, about 10% of searches are navigational ... people enter domain names into the query search form and expect to see the domain appear in the search results.

The title must accurately describe what the page is about; each page must have a unique title; the keyword should appear in the title. The title and description affect click-through rates ... indicate in the title and description what can be found on your page that sets it apart from the other pages on the search results page.
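
As a concrete illustration, here is a minimal Python sketch (not any engine's actual code) that audits a set of pages against these title rules; the URLs, titles, and keyword are hypothetical:

```python
# Minimal sketch: audit pages against the evergreen title rules above.
# The page data and keyword are illustrative, not any engine's code.

pages = {
    "/": "Acme Widgets - Handmade Copper Widgets",
    "/blue": "Blue Copper Widgets | Acme Widgets",
    "/red": "Blue Copper Widgets | Acme Widgets",  # duplicate title: flagged
}

def audit_titles(pages: dict[str, str], keyword: str) -> list[str]:
    """Return human-readable warnings for pages that break the title rules."""
    warnings = []
    seen: dict[str, str] = {}
    for url, title in pages.items():
        if title in seen:
            warnings.append(f"{url}: duplicate title (also on {seen[title]})")
        seen.setdefault(title, url)
        if keyword.lower() not in title.lower():
            warnings.append(f"{url}: keyword '{keyword}' missing from title")
    return warnings

for warning in audit_titles(pages, "widgets"):
    print(warning)
```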

Headlines, when used correctly, are an evergreen SEO factor that search engines use to determine a page's relevance to a search term. The factor is important enough that Google will use the element serving as the main headline as the title in search results if the title is not relevant. However, headline tags have been misused, and incorrectly using them will fail.

Conceptually, keyword proximity as a factor for ranking documents was pioneered by Altavista; however, Altavista's implementation of search did not scale with the size of the internet. Hotbot's (Inktomi's) internet search did, and proximity has been an enduring algorithmic factor in determining relevancy ever since.
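
A minimal sketch of the proximity idea, assuming a simple word-gap measure; the scoring and sample text are illustrative, not Altavista's or Inktomi's actual algorithm:

```python
# Minimal sketch of keyword proximity scoring: the smaller the gap between
# matched query terms in the document, the higher the score. Illustrative
# only; treats repeated occurrences of the same term as matches too.

def proximity_score(text: str, terms: list[str]) -> float:
    wanted = {t.lower() for t in terms}
    words = text.lower().split()
    positions = [i for i, w in enumerate(words) if w in wanted]
    if len(positions) < 2:
        return 0.0
    # Smallest gap between any two matched positions; closer scores higher.
    best_gap = min(b - a for a, b in zip(positions, positions[1:]))
    return 1.0 / best_gap

print(proximity_score("copper widgets for sale", ["copper", "widgets"]))       # 1.0
print(proximity_score("copper pots and iron widgets", ["copper", "widgets"]))  # 0.25
```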

Keyword density is an evergreen SEO factor in determining the subject of a text document. A paragraph written using standard grammar rules will use its subject at least once, and often more than once. The closely related factors of exact match and keyword proximity can easily be part of the same algorithm.
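
The density calculation itself is simple; this sketch assumes a basic tokenizer that strips punctuation, which is an illustrative simplification:

```python
# Minimal sketch of keyword density: the share of words in a document
# that match the keyword. Illustrative only.

import re

def keyword_density(text: str, keyword: str) -> float:
    words = re.findall(r"[a-z0-9']+", text.lower())
    if not words:
        return 0.0
    return words.count(keyword.lower()) / len(words)

doc = "Widgets are useful. Copper widgets resist rust, so buyers prefer widgets."
print(f"{keyword_density(doc, 'widgets'):.2%}")  # 3 of 11 words -> 27.27%
```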

Sitewide links were an aggressive SEO technique before 2005. Infoseek ranked pages based on the total number of links pointing toward a page; Google's PageRank patent instead based a link's value on the importance of the page containing the link.
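
To make the contrast concrete, here is a hedged sketch of both ideas on a hypothetical four-page link graph: a raw inbound-link count versus a PageRank-style iteration that weights each link by the importance of its source. The graph, damping factor, and iteration count are illustrative assumptions:

```python
# Infoseek-style raw link counts vs. a PageRank-style weighted score.
# The tiny link graph is hypothetical.

links = {  # page -> pages it links to
    "hub":  ["a", "b"],
    "a":    ["b"],
    "b":    ["a"],
    "spam": ["a"],
}

# Raw count: every inbound link is worth the same.
raw: dict[str, int] = {}
for src, outs in links.items():
    for dst in outs:
        raw[dst] = raw.get(dst, 0) + 1

# PageRank-style: iterate, splitting each page's score across its outlinks.
pages = set(links) | {p for outs in links.values() for p in outs}
rank = {p: 1.0 / len(pages) for p in pages}
for _ in range(20):  # a handful of power iterations is enough for a sketch
    new = {p: 0.15 / len(pages) for p in pages}
    for src, outs in links.items():
        for dst in outs:
            new[dst] += 0.85 * rank[src] / len(outs)
    rank = new

print("raw counts:", raw)  # {'a': 3, 'b': 2}
print("weighted:", {p: round(s, 3) for p, s in rank.items()})
```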


The history of SEO and the history of algorithms for online document retrieval systems are linked. The SEO point of view is a front-row seat for who pioneered or proved an algorithm at scale, not who invented it.

Tenured SEO - Historical POV

The battle among search engines to be the internet's number one has produced many search innovations and algorithms. These advancements can be laid out chronologically based on the innovations each search engine pioneered.

How Yahoo Shaped SEO

  • The evergreen SEO practice of using keywords in the title and description was established.
  • Home pages began using a brand name instead of "Home" for the title.
  • The tactic of exactly matching the domain name to keywords began.
  • Yahoo innovated document retrieval using a prestructured database.


Why do many in the SEO field not consider the limitations of infrastructure when discussing SEO theories? Some speculate that AI will replace search, as if it had the ability to look at the content of the entire internet in real time.

Yahoo's infrastructure started on 16- and 32-bit technology, using servers with about as much computing power as a smartphone ... to scale, the index had to be presorted. Yahoo did release the specs of their systems.

Altavista had a small index by today's standards. AV peaked in 2002 at about 1,000 million web pages; for most of its existence the index was in the tens of millions. But it was backed by DEC's massive resources.

  • Altavista's algorithms were not limited to simple, presorted indexes.
  • Altavista added concepts like keyword proximity as a relevancy factor.
  • Full-page relevancy became an evergreen factor for SEO -- then became tenured by Inktomi.


Using massive computer systems, Altavista's algorithms were not limited to simple presorted indexes. They could and did offer advanced search abilities that queried the full document, but the growth of the internet exceeded the growth of computing power. Ultimately, presorted data won.
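
The trade-off can be sketched in a few lines: a presorted (inverted) index is built once and answers a query with a lookup, while a full-document scan re-reads every page per query. The documents below are hypothetical:

```python
# Minimal sketch of the scaling trade-off described above.

docs = {
    1: "copper widgets for sale",
    2: "history of copper mining",
    3: "widgets and gadgets compared",
}

# Build once, ahead of any query: word -> doc ids (the presorted index).
index: dict[str, list[int]] = {}
for doc_id, text in docs.items():
    for word in set(text.split()):
        index.setdefault(word, []).append(doc_id)

# Query time: one dictionary lookup instead of scanning every document.
print(index.get("copper", []))  # [1, 2]

# The Altavista-style alternative: scan the full text of every document.
print([d for d, t in docs.items() if "copper" in t.split()])  # [1, 2]
```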

How Infoseek Shaped the Internet and SEO

  • Links became an evergreen factor for SEO -- then became tenured by Google.
  • Metadata title and description became evergreen SEO factors.
  • Used keyword density.
  • Flash could be used, but a site map was needed for Infoseek to find all pages on a site.


Infoseek pioneered using a raw (external) link count to rank web pages. Initially, Infoseek used Yahoo's links as a seed database but crawled the web itself. It used a data center similar to what Yahoo pioneered, but instead of presenting sites alphabetically in the natural order of its database, it ordered and presented sites based on the number of links pointing toward the page.

Inktomi Powered Search Engines

Inktomi researched what people needed from an internet search engine before they began; they looked at internet search as a business opportunity. Cost efficiency was a critical factor in the design, as was the need to scale.

Inktomi built a business model for search -- providing OEM-powered search for many sites, including Yahoo, AOL, et al. As Inktomi became the global power in search, marketers discovered that SEO had an excellent ROI.

  • Easy to use.
  • Some content found on Altavista used the keyword only once and had nothing beyond mentioning it.
  • Query until the results page has enough sites (~2,000), then stop going deeper in the index; one could not see the 5,046th site in Inktomi. This saved resources and allowed more users to use the search (see the sketch after this list).
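
Here is a minimal sketch of that early-termination idea, assuming a posting list already sorted at index time; the cap and URLs are illustrative:

```python
# Minimal sketch of early termination: walk a presorted posting list and
# stop once enough results are collected, never touching deeper entries.

def top_results(posting_list: list[str], cap: int = 2000) -> list[str]:
    """Return at most `cap` results; deeper entries are never examined."""
    results = []
    for url in posting_list:  # already sorted by relevance at index time
        results.append(url)
        if len(results) >= cap:
            break             # the 5,046th site is simply never reached
    return results

postings = [f"site-{n}.example" for n in range(10_000)]
print(len(top_results(postings)))  # 2000
```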

  • Full-page relevancy using headlines and keyword prominence.
  • Keyword nearness became a baked-in evergreen SEO factor.
  • Innovated using CSS to determine the predominant subject of a web page.
  • Inktomi developed evergreen keyword density algorithms.
  • The natural order of Inktomi's indexes was content relevance.

To address the relevancy issue, Inktomi focused on HTML tags like headlines. Keyword density, or frequency, indicated to Inktomi what the page was about. These factors could be indexed before the search (like Yahoo's presorted database), and the indexes could now carry a type of schema holding all the ranking data as part of the keyword index.
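
One way to picture a keyword index carrying a ranking-data schema is the sketch below; the posting fields and sort order are assumptions for illustration, not Inktomi's actual structures:

```python
# Minimal sketch of a keyword index whose postings carry precomputed
# ranking data (headline hit, keyword frequency), so entries can be
# presorted before any query arrives.

from dataclasses import dataclass

@dataclass
class Posting:
    doc_id: int
    in_headline: bool  # keyword appeared in a main-headline tag
    frequency: int     # raw keyword count on the page

index: dict[str, list[Posting]] = {
    "widgets": [
        Posting(doc_id=2, in_headline=False, frequency=9),
        Posting(doc_id=1, in_headline=True,  frequency=4),
    ],
}

# Presort at index time: headline hits first, then by on-page frequency.
for postings in index.values():
    postings.sort(key=lambda p: (p.in_headline, p.frequency), reverse=True)

print([p.doc_id for p in index["widgets"]])  # [1, 2]
```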

Inktomi may have created the SEO industry

Inktomi's approach to search as a business ... contracts with other major sites allowed them to interface with Inktomi's search results. Inktomi's index became a major source of traffic for websites. The ROI for creating SEO content was measurable and better than that of other forms of advertising.

Inktomi Pale Hat SEO or $earch Engine Friendly Content

Creating content specifically for search engines is a little off-white.

  • Search Engine Friendly included proper meta title, description, and keyword tags.
  • Headline tags
  • On-page keyword density
  • The page needed to be about the keywords used in the title and keywords tags

Inktomi Grey Hat SEO

  • Stuffing keywords and placing white keyword text on a white background.

Inktomi Black Hat SEO

  • Supplying one version of the page, with more keywords, to Inktomi, and another version, without the keyword stuffing, to browsers.

  • The natural order of Excite's index was content relevance, but content silos were also relevant.
  • Innovated the use of anchor text as a now-evergreen ranking factor.

Excite Search Engine

Excite learned from Inktomi but saw the opportunity for a search engine to be a profitable business. Their vision, however, was not to index every page on the internet, just the pages that were useful for people to find the information they were looking for. People at that time were "surfing the internet" from one page (mostly Yahoo) to the next; in short, the Excite search engine favored both destination content authority sites and pages that led to many authority sites or content destinations. These silos, as they came to be known, meant Excite could index one curated silo page instead of the fifty or more pages it pointed to.

Excite's mission was to become the number one search engine on the internet. They ran advertisements, and their traffic exceeded that of most search engines, but Inktomi/Hotbot had business deals with AOL and Yahoo for web search. They might have become the number one search engine if not for the growth of Google.

I never tenured beyond Excite's silos ... there were algorithms for the authority pages ... The silos were a level playing field. A page could be created with sub-pages or as a topical theme, be submitted, and then appear at the top of Excite's search in a day or two. People would visit the page, click the link or banner, and then purchase the product. It was an ideal time to be an affiliate marketer.

Pages were dropped regularly, normally on a monthly schedule. The pages linked to from an index page would normally be indexed during this cycle and could promote or demote the page containing the links, depending on relevancy.

Excite pioneered Innovations

  • Natural order of the database by keywords.
  • Extended the keyword index to include the anchor text of links on the page as a ranking factor.
  • Used keyword density with an upper limit to avoid spam, but no upper limit for anchor text (see the sketch after this list).
  • Focused on the top portion of the HTML file as an above-the-fold keyword zone.
  • Metadata title and description were qualifiers.
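
A hedged sketch of how the density cap and anchor-text factors could combine; the weights, cap, and tokenizer are hypothetical, not Excite's actual numbers:

```python
# Minimal sketch of the Excite-era factors listed above: on-page keyword
# density counts toward the score only up to a cap (to blunt spam), while
# anchor-text matches are not capped. Weights and cap are hypothetical.

import re

def score(page_text: str, anchor_texts: list[str], keyword: str,
          density_cap: float = 0.05) -> float:
    words = re.findall(r"[a-z0-9']+", page_text.lower())
    density = words.count(keyword.lower()) / len(words) if words else 0.0
    density = min(density, density_cap)           # upper limit to avoid spam
    anchor_hits = sum(keyword.lower() in a.lower() for a in anchor_texts)
    return density * 10 + anchor_hits             # no cap on anchor text

print(score("widgets " * 50, ["copper widgets"], "widgets"))  # 1.5 (density capped)
```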

SEO: While not tenured on all aspects of their algorithm, creating topical clusters or silos was a practice. Jump-to anchors (url#fragment) were rejected (tested). All links on a page needed to go to a content page, which needed to exist and have relevance to the topic. Linking to 404 pages would cause pages to be dropped from the index. Duplicate content was dropped. Flooding the search results page with more than one page per site could cause the page or pages to be dropped. Obvious spam or keyword stuffing could cause the page to be dropped.

Cloaking source code was frowned upon but not a violation of the terms of service as long as the content matched ... cloaking was used by major brands so that others could not copy their SEO content.


Direct Hit and others innovated using data from the SERPs to rank pages.

Direct Hit Search Engine

Direct Hit's vision of the ideal search was to list the most-clicked sites in its index first in the search results. They learned from Yahoo's, Inktomi's, and Infoseek's database structures and used a natural order for indexed pages based on which pages were clicked on for which term.

Such a scheme requires that the URL listed first maintain a minimal click-through rate, and it requires checking for click spam -- an early tier of detecting fake clicks, as with advertising. Pages that could not maintain interest were demoted, and new pages would surface.
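
A minimal sketch of that click-based reordering, assuming per-term click counts and a CTR floor for the top result; the data, threshold, and demotion rule are hypothetical, and real click-spam detection is omitted:

```python
# Direct Hit-style click ranking, as described above: order results by
# clicks for a term, and demote a top result whose click-through rate
# falls below a floor. Illustrative only, not Direct Hit's actual system.

def rank_by_clicks(clicks: dict[str, int], impressions: int,
                   min_ctr: float = 0.05) -> list[str]:
    ordered = sorted(clicks, key=clicks.get, reverse=True)
    # Demote the first result if it cannot maintain a minimal CTR.
    if ordered and clicks[ordered[0]] / impressions < min_ctr:
        ordered.append(ordered.pop(0))
    return ordered

clicks = {"a.example": 30, "b.example": 220, "c.example": 90}
print(rank_by_clicks(clicks, impressions=1000))
# ['b.example', 'c.example', 'a.example']
```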

Direct Hit pioneered Innovations

  • Natural order of the database by keyword clicks.
  • Methodologies to detect fake or robotic clicks.
  • Methodologies to detect high bounce-back rates resulting from link-bait meta content.
  • Metadata title and description were qualifiers.

SEO was basic click-through-rate optimization, on-page above-the-fold optimization for human visitors, and conversion optimization.



... Solution Smith tests SEO tactics so you don't have to ...

Full-stack SEO has a lot of moving parts, and it takes time and effort to understand the nuances involved. Solution Smith takes care of the overhead costs associated with digital marketing, resulting in real savings in time and effort.