SAN MATEO (06/12/2000) - In the world of pop music, registering a hit in the top 10 remains a valid benchmark of success. The same benchmark applies to search engine rankings on the Internet, where a top 10 placement can result in increased site traffic, additional revenue opportunities, and enhanced customer relations -- all without spending additional advertising dollars.
Statistics reveal the need to take search engine placement seriously. According to a recent study by Zona Research Inc., a market research consultancy in Redwood City, California, people looking for information on the Web use search engines 77 percent of the time. Not surprisingly, this makes the search engine the No. 1 tool for information retrieval on the Net.
Unfortunately, with well over a billion pages on the Internet, it is difficult to establish and maintain top 10 staying power. To make matters worse, establishing a high search engine ranking is more of an art than a science.
Securing a place near the top of one search engine, such as AltaVista Co., does not guarantee a similarly high ranking with other engines, such as Google Inc. or HotBot.
Fortunately, there are some established techniques that can be used to increase your chances of breaking into the top 10. It also pays to understand the different types of search tools that are being used on the Internet and their various methodologies for cataloging the Web. The task of climbing the ranking ladder can also be partially automated using commercially available software products such as FirstPlace Software's WebPosition Gold and Woftam Jones' AddWeb Pro.
For the most part, Internet search tools fall into two camps: search engines, such as HotBot and AltaVista, and online directories, such as Yahoo and Lycos.
The difference between the two is related to how they compile their site listings. Of course, there are exceptions to every rule. Some search utilities, such as Ask Jeeves, combine the search engine and directory approaches into a single package, hoping to provide users with the best of both worlds.
In directory-based search services, the Web site listings are compiled manually. For example, the ever-popular Yahoo dedicates staff resources to accept site suggestions from users, review and categorize them, and add them to a specific directory on the Yahoo site.
You can usually submit your Web site simply by filling out an online form. On Yahoo, for example, you'll find submission information at www.yahoo.com/docs/info/include.html. Because human intervention is necessary to process, verify, and review submission requests, expect a delay before your site secures a spot in a directory-based search service.
On the flip side, search engines completely automate the compilation process, removing the human component entirely.
A software robot, called a spider or crawler, automatically fetches sites all over the Web, reading pages and following associated links. By design, a spider will return to a site periodically to check for new pages and changes to existing pages.
Results from spidering are recorded in the search engine's index or catalog.
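The spidering-and-indexing cycle just described can be sketched in miniature. The snippet below is a toy model, not a real crawler: it assumes an in-memory "Web" of invented pages rather than live HTTP fetches, visits each page breadth-first from a start page, follows its links, and records every word in an inverted index the way a search engine's catalog does.

```python
from collections import deque

# A toy "Web": each URL maps to (page text, outbound links).
# All URLs and page text here are invented for illustration.
WEB = {
    "/home":     ("acme network software downloads", ["/products", "/about"]),
    "/products": ("network monitoring software",     ["/home"]),
    "/about":    ("about acme corp",                 []),
}

def crawl(start):
    """Breadth-first 'spider': visit each page once, follow its links,
    and record every word in an inverted index (word -> set of URLs)."""
    index, seen, queue = {}, {start}, deque([start])
    while queue:
        url = queue.popleft()
        text, links = WEB[url]
        for word in text.split():
            index.setdefault(word, set()).add(url)
        for link in links:
            if link not in seen:
                seen.add(link)
                queue.append(link)
    return index

index = crawl("/home")
print(sorted(index["software"]))  # → ['/home', '/products']
```

A real spider adds politeness rules, revisit scheduling, and duplicate detection on top of this basic fetch-parse-follow loop, but the data structure it feeds -- the inverted index -- is the same.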
Given the wealth of information available on the Internet, it is not surprising that indexes grow to very large sizes. For example, the AltaVista index has recently been increased to top out at 350 million pages. This may seem like a mammoth number, but by all estimates it still represents less than 35 percent of all pages on the Web.
Because of the depth and breadth of information being indexed, there is usually a delay, sometimes up to several weeks, between the time a site has been "spidered" and when it appears in a search index. Until this two-step process has been completed, a site remains unavailable to search queries.
Finally, the heart of each search engine is an algorithm that matches keyword queries against the information in the index, ranking results in the order the algorithm deems most relevant.
Because the spiders, resulting indexes, and search algorithms of each search engine differ, so do the search results and rankings across the various search engines. This explains why a top 10 site in HotBot may not appear near the top of AltaVista when the same keyword search criterion is entered.
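A drastically simplified matching algorithm makes the point. The sketch below (page bodies invented for illustration) scores each page by how many times the query's keywords occur in its text; because every real engine weighs many additional signals on top of raw frequency, two engines scoring the same pages rarely agree on an ordering.

```python
def rank(pages, query):
    """Toy relevance ranking: score each page by the total number of
    times the query's keywords appear in its text, highest first.
    Real engines also weigh titles, metatags, link popularity, etc."""
    words = query.lower().split()
    scores = {url: sum(text.lower().split().count(w) for w in words)
              for url, text in pages.items()}
    return sorted(scores, key=scores.get, reverse=True)

# Hypothetical page bodies:
pages = {
    "/a": "network software for network monitoring",
    "/b": "software downloads",
}
print(rank(pages, "network software"))  # → ['/a', '/b']
```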
Because all search engines rely on keyword searching to generate results, you need to keep keywords in mind when creating your site's content.
To start, a combination of relevant page titles and body copy can influence site ranking. Determine which keywords prospective visitors would use to search for your site -- and for the sites of your competitors -- and include those terms in your pages.
Keyword placement and frequency throughout a page are prime factors in the ranking process. It's a good idea to use a combination of unique and common keywords, but don't overuse unique or proprietary terms at the expense of common or general terms that are more likely to be used in a search.
In addition, many, but not all, search utilities also reference metatags -- invisible HTML tags within documents that describe their content -- as a way to control how content is indexed. As a result, proper use of metatags throughout a site can also boost search engine ranking.
There are well over 50 metatags available for use within HTML pages, but those most commonly used by search engines for indexing and ranking purposes are the description and keywords tags. As the name indicates, the description tag includes a short description of a site that a search engine would display in a search results list.
For example, the following description tag would allow Web searchers to quickly ascertain the service offerings of Acme Corp. and would entice them to go to the site to download free evaluation copies of software:
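A tag along these lines, with wording invented here purely for illustration, would fit the bill:

```html
<!-- Hypothetical description metatag; the wording is illustrative only -->
<meta name="description"
      content="Acme Corp. network management software. Download free
               evaluation copies of all Acme products.">
```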
Keywords for a search engine to associate with a particular page are stored in the keywords tag. These are words that a user would type as part of a search engine query to bring up relevant Web pages from a search engine's index.
For example, Acme Corp. might include the following metatag in its home page:
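For instance (the keyword list below is invented for illustration):

```html
<!-- Hypothetical keywords metatag -->
<meta name="keywords"
      content="network management, network monitoring, Acme, software,
               free evaluation, download">
```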
As in the titles and body text of Web pages, you should try to combine both unique and common keywords, as well as specific and general ones, in the keywords tag. Some Web developers even include misspelled keywords. Again, the idea is to anticipate the keywords potential visitors would use to find your site.
For large, content-rich sites, maintaining metatags can quickly become tedious.
For this reason a number of products, such as Watchfire Metabot, have arrived on the scene to automate the process. Given the growing importance of XML and meta data for interaction across disparate applications and systems, the proper use of metatags will only become more important down the road.
Despite their proliferation, metatags are still ignored by some search engines. Lycos, for example, ignores them entirely in the ranking process. The use of metatag spamming, such as repeating a particular keyword several hundred times in a Web page, has also forced some search engines to reduce the weight of metatags in determining their rankings.
A final factor affecting ranking results in some engines, such as Excite, is page popularity, which is determined by how often a particular Web page is linked to by other pages in the index. The more popular a page is, the higher the ranking. This granular approach to Web page ranking is difficult to influence and is being used by search engines more frequently.
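The popularity signal can be approximated with a simple inbound-link count. The sketch below is a rough stand-in for that idea, using an invented link graph; engines such as Excite used far more refined variants of the same principle.

```python
def popularity_rank(links):
    """Toy 'link popularity' signal: rank pages by how many other
    pages in the index link to them, most-linked first."""
    inbound = {url: 0 for url in links}
    for source, targets in links.items():
        for target in targets:
            inbound[target] = inbound.get(target, 0) + 1
    return sorted(inbound, key=inbound.get, reverse=True)

# Hypothetical link graph: page -> pages it links to.
links = {"/a": ["/c"], "/b": ["/c", "/a"], "/c": []}
print(popularity_rank(links))  # → ['/c', '/a', '/b']
```

Because the score depends on what other sites do, not on your own page content, it is much harder for a site owner to manipulate than keywords or metatags -- which is exactly why engines like it.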
As more and more Web sites arrive on the scene, it will become increasingly challenging for search engines to rank pages effectively and fairly. You can expect search engines to continue to get smarter, employing linguistic analysis and other heuristic techniques to return more meaningful results. The continued use of XML and embedded meta data will also allow for better ranking and comparisons across information resources.
Naturally, Web site developers will have to continue to roll with the changes.
But considering how heavily Web surfers rely on search engines, your site developers should make search engine ranking an integral part of their site maintenance procedures, if it isn't already.
Technical Analyst Todd Coopee (email@example.com) covers Web-based analysis tools, Internet-based groupware, and application development tools for the InfoWorld Test Center.
The seven habits of effective web site ranking

Proper Web site promotion should include search engine optimization and registration. When integrated into the routine maintenance cycle of your site, the following seven tips can help maximize the visibility of your site.
1. Stay informed. Competition for the top spots causes the results of search engines to change weekly. Search engines also periodically change the manner in which they index sites and pages. To stay in the hunt, it pays to keep abreast of all changes in the search engine arena. A good place to start would be to subscribe to the Search Engine Watch newsletter at www.searchenginewatch.com.
2. Identify the competition. Using the keywords that users would enter to find your site, see who your competition is. Assess the top 10 URLs returned by each search engine. Look at the description and keyword metatags in their HTML source code to determine the secrets of their success.
3. Get the word out. Reciprocal links -- link exchanges with external Web sites -- are a great way to increase site traffic. Be on the lookout for new online directories and industry association sites that might be willing to include links to your site.
4. Make good use of your pages. Because some search engines ignore metatags, the titles of your HTML pages should be as descriptive as possible. Each should contain a few, but no more than 10, of your top keywords.
5. Register, register, and register again. Because you have no control over which Internet search tools people may use to find your site, be sure to register with all the major ones. And remember, site registration isn't a one-time event. Make a commitment to keep track of your ranking results and boost them whenever necessary.
6. Maintain your metatags. Keep your site's metatags up-to-date. Change the contents of your keyword metatags as the content of your pages changes. Also, reevaluate your keywords periodically to make sure they reflect current industry usage and buzzwords.
7. Remember that third-party software can help. A number of tools are available to assist you with site ranking and metatag maintenance. Don't be afraid to use them.
THE BOTTOM LINE
Search engines and site rankings
Business Case: The search engine is the No. 1 tool used by consumers to find what they are looking for on the Internet. Appearing near the top of a search engine results list is the equivalent of free advertising for your Web site, affording you the opportunity for increased traffic and additional revenue.
Technology Case: Improving the ranking of your Web site in any search results list is an inexact science at best. But the proper use of metatags, keywords, body text, and page titles, along with the assistance of third-party software, can help your cause considerably.
Pros:
+ Appearing near the top of a search results list can drive traffic to a site, increasing revenue opportunities.
+ Third-party software can ease metatag and keyword maintenance.
Cons:
- Maintaining a ranking can be labor-intensive.
- Achieving consistent rankings across multiple search engines can be difficult.