Webmasters started optimizing their websites and content in the mid-1990s. In the beginning, optimization involved little more than writing topical content and submitting a web page’s URL (address) for the search engines to crawl and add to their index.
According to search engine industry analyst Danny Sullivan, the phrase “search engine optimization” was first used in 1997. Early search engines relied on webmaster-set parameters such as the keyword meta tag, which was ripe for abuse by unscrupulous webmasters. Webmasters could stuff irrelevant words and phrases into a page’s keyword tag, which resulted in the search engines returning irrelevant pages for many searches. Later refinements added factors such as keyword density in web page content as a clue to a page’s subject matter, but this too was easily manipulated by webmasters.
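For the curious, keyword density is just the share of a page’s words that match a given term. The engines never published their exact formulas or thresholds, so the following Python snippet is only a rough illustration of the idea, with made-up sample text:

import re

def keyword_density(text, keyword):
    # Fraction of words in `text` that equal `keyword`, case-insensitive.
    words = re.findall(r"[a-z0-9']+", text.lower())
    if not words:
        return 0.0
    return words.count(keyword.lower()) / len(words)

sample = "Cheap widgets! Our widgets are the best widgets online."
print(f"{keyword_density(sample, 'widgets'):.0%}")  # prints 33% - an obviously stuffed page

A page repeating its target phrase that heavily was exactly the kind of signal early engines rewarded, and exactly the kind of manipulation that made the signal unreliable.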
Two graduate students at Stanford University, Larry Page and Sergey Brin, developed a search engine, first called BackRub and later renamed Google, which used an algorithm to evaluate and rank web pages. The output of their algorithm was called PageRank, a scaled score from 0 to 10. Google’s ranking factored in values such as keyword frequency and placement on the page, keyword use in title tags, in the web page’s URL and page name, in internal links, and in the website’s architecture (internal linking structure). Perhaps most importantly, Page and Brin recognized that the number and strength of links from other websites pointing to a web page were essential. Their concept was based on the idea that the more natural links a web page received from around the web, the more easily a web surfer could find it by clicking through those links. The algorithm looked not only at the total count of links pointing at a web page, but also at each link’s anchor text (its clickable words), treating it as a human-written description of what the link actually pointed to.
Out of Google’s new search engine algorithm, and its factoring of links, a new phrase was born to describe the “link” portion of a website’s strength: Link Popularity. Link popularity describes the relative strength of one website compared to others; a website with more links pointing to it than another is said to have higher link popularity.
A simple analogy was often used to explain link popularity and Google’s search results: if a search were a democracy and links were votes, the website with the most links would rank highest, all other factors being equal.
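The vote-counting idea can be sketched in a few lines of Python. The toy implementation below illustrates only the simplified PageRank recurrence from Page and Brin’s published paper (with the conventional 0.85 damping factor); the page names and link structure are invented, and Google’s production algorithm is of course far more elaborate:

def pagerank(links, damping=0.85, iterations=50):
    # links: dict mapping each page to the list of pages it links out to.
    pages = list(links)
    n = len(pages)
    rank = {p: 1.0 / n for p in pages}           # start with equal rank
    for _ in range(iterations):
        new_rank = {p: (1 - damping) / n for p in pages}
        for page, outlinks in links.items():
            if not outlinks:                     # dangling page: share rank evenly
                for p in pages:
                    new_rank[p] += damping * rank[page] / n
            else:                                # each outlink casts an equal "vote"
                share = damping * rank[page] / len(outlinks)
                for target in outlinks:
                    new_rank[target] += share
        rank = new_rank
    return rank

web = {"PageA": ["PageB"], "PageB": ["PageC"], "PageC": ["PageB"]}
print(pagerank(web))  # PageB, with two incoming votes, scores highest

Notice that rank flows through links: a page scores highest not simply because it has the most incoming links, but because the pages voting for it have rank of their own to pass along.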
Because of the simplicity of Google’s home page and the relative quality of its search results, Google quickly gained wide use among web surfers. When search engines first came on the scene, there were hundreds, some claim thousands, of them scrambling to claim a portion of the search market. Today there are still many search engines; however, nearly all searches are handled by two or three main engines. According to comscore.com, in October 2012 search share was divided between Google.com at 66.9%, Bing.com (Microsoft) at 16%, and Yahoo.com at 12.2%. The remaining roughly 5% was divided between AOL and Ask.com. (Bing provides the search results for Yahoo.com.)
As Google’s popularity grew, so did the industry of search engine optimization specialists: webmasters, technicians, and engineers working to help their clients’ websites appear higher in Google’s results. From the beginning, SEO specialists have been classified as either Black Hat or White Hat SEO specialists.
Continued next week…