Sunday, February 13, 2005

Google filters: How to avoid them

Among the major concerns of many search engine optimization (SEO) professionals are the various alleged filters employed by Google.

It is thought that the search engine giant has placed many dampening and blocking devices in its algorithm, collectively referred to as filters.

The reasoning behind the alleged use of these various filtration systems is still unclear, and it is not universally agreed among SEO professionals that the filters even exist.

Most SEO experts agree, however, that some sort of dampening filters are built into the Google algorithm. The evidence is circumstantial, but it shows up fairly consistently across the search engine results pages (SERPs).

Since about one hundred factors and variables are programmed into the algorithm, it is entirely probable that a number of them act as filters. The questions then become: what are the various filters, and what are their purposes?

While some experts believe that no such filters exist, it is still a good idea to guard against the possibility. Whether or not you take the position that some filters and dampening effects exist, there are ways to avoid them entirely.

Let’s examine some search engine optimization techniques that help you steer clear of any actual or coincidental filters. Since no one knows precisely what’s in the Google algorithm, it’s better to be safe than sorry: taking a few precautions is a far wiser course of action than being caught in a possible filter or being assessed a Google penalty.

With a good SEO strategy, you can avoid any possible Google filters, and any penalties that might result from them.

What are some of the alleged filters?

There are a number of filters and dampening effects that SEOs have noticed. They range from the well-known sandbox effect to the so-called over-optimization filter.

This list is by no means comprehensive and doesn’t include every alleged filter. Note as well that the filters mentioned are purely speculative. Our main purpose is to show the widespread possibility of filtering and dampening that may result in penalties.

While many SEO experts disagree as to whether any, some, or all of the alleged filters actually exist, it makes good sense to be aware of them. Avoiding their potential triggers goes a long way toward preventing possible problems.

In fact, many of the techniques that help you avoid being trapped by a filter are also good methods of gaining higher search engine rankings. They provide what amounts to a double benefit.

Let’s consider some of the alleged filters, one at a time, along with some possible methods of avoiding them.

Possible link related filters

One fairly well documented filter involves the use of identical link anchor text on all of your site’s incoming links. Many SEO professionals have noticed that using the same anchor text everywhere raises a red flag that the links were not naturally created. Identical link text can be a sign of link manipulation aimed at higher Google PageRank and link popularity.

An easy solution to the possible identical anchor text filter is to mix up your link text wording. Using somewhat different keywords, and perhaps even your site title, will bypass that problem entirely. Keeping your regular link text to no more than about 80% of your incoming links, with at least 20% alternative text, should avoid the filter.
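As a purely illustrative sketch, here is one way to check the mix of anchor text across a set of incoming links. The data structure and the roughly 80% threshold are assumptions taken from the paragraph above, not anything Google has published.

```python
from collections import Counter

def anchor_text_shares(links):
    """Return each anchor phrase's share of the incoming links.

    `links` is a hypothetical list of (anchor_text, source_url) tuples,
    however you happen to collect your backlink data.
    """
    counts = Counter(anchor.lower().strip() for anchor, _ in links)
    total = sum(counts.values())
    return {anchor: count / total for anchor, count in counts.items()}

# Example data: flag any single phrase used on more than ~80% of links,
# the speculative threshold discussed above.
links = [
    ("blue widgets", "http://partner-one.example.com"),
    ("blue widgets", "http://partner-two.example.com"),
    ("Acme Widget Shop", "http://partner-three.example.com"),
    ("discount blue widgets", "http://partner-four.example.com"),
]
for anchor, share in anchor_text_shares(links).items():
    if share > 0.8:
        print(f"Warning: '{anchor}' appears in {share:.0%} of incoming links")
```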

The variety of anchor text wording could also provide benefits. The different texts suggest a wider, more natural range of incoming links, and that could show up well in your backlink totals and search rankings.

There is some evidence that links.htm pages, which place virtually all of a site’s outgoing links on a page labelled in that manner, trigger a filter. Many website owners have seen their links.htm pages denied credit, either for their own PageRank or for the PageRank and backlink credit passed to their linking partners. Other webmasters have seen no such loss of link credit.

To overcome even the possibility of being caught by a links.htm filter, simply name your links page something else entirely; in fact, don’t use the word links at all. Another idea that might work for you is to split links up by theme and place them over several pages. Writing a brief description of each link, and how it relates to your site’s theme, might also prevent tripping the possible filter.
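As a rough sketch of that theming idea (the link data, file names, and descriptions here are entirely hypothetical), you could group your outgoing links by theme and write one resource page per theme, each entry carrying a short description:

```python
from collections import defaultdict

# Hypothetical link data: (url, theme, how the link relates to your site).
partner_links = [
    ("http://garden-tools.example.com", "gardening", "Hand tool reviews we reference in our guides."),
    ("http://seed-shop.example.com", "gardening", "Seed supplier mentioned in our planting articles."),
    ("http://diy-forum.example.com", "home improvement", "Forum threads that discuss our how-to pieces."),
]

pages = defaultdict(list)
for url, theme, description in partner_links:
    pages[theme].append((url, description))

# Write one page per theme, avoiding the word "links" in the filename.
for theme, entries in pages.items():
    filename = f"{theme.replace(' ', '-')}-resources.html"
    with open(filename, "w") as page:
        page.write(f"<h2>{theme.title()} resources</h2>\n<ul>\n")
        for url, description in entries:
            page.write(f'<li><a href="{url}">{url}</a> &ndash; {description}</li>\n')
        page.write("</ul>\n")
```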

There is concern that reciprocal links may invite a Google filter. There is some evidence that reciprocal link pages may be discounted, but that may be a side effect of the possible links.htm filter as well. Blogs contain many reciprocal links, yet seem to have suffered no ill effects in either backlink totals or PageRank.

One method to combat the possible reciprocal link filter may be to maintain a healthy ratio of one-way incoming links. Blogs use that technique without thinking about it, given their heavy tendency to link to one another constantly.
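A minimal sketch, assuming you already keep simple lists of the sites you link out to and the sites linking in (the variable names are invented for illustration), could estimate the share of one-way inbound links like this:

```python
def one_way_share(inbound_sites, outbound_sites):
    """Estimate the fraction of inbound links that are one-way,
    i.e. sites that link to you without a reciprocal link back."""
    inbound = set(inbound_sites)
    outbound = set(outbound_sites)
    if not inbound:
        return 0.0
    one_way = inbound - outbound  # they link to you, you do not link back
    return len(one_way) / len(inbound)

# Hypothetical example data.
inbound = ["a.example.com", "b.example.com", "c.example.com", "d.example.com"]
outbound = ["b.example.com", "c.example.com"]
print(f"One-way inbound links: {one_way_share(inbound, outbound):.0%}")  # 50%
```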

Many observers have noticed a cross-linking filter triggered by linking together too many sites hosted on the same server, and especially within the same C-level block. The C-level block is the third section of a site’s IP address; for example, in 123.123.xxx.123, the xxx portion is the C block. There is some belief that the threshold trigger for that filter is about 20 linked sites.
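For illustration only (the twenty-site figure is the speculative threshold mentioned above, not a documented limit), you could group linked sites by the first three octets of their IP addresses, which is what two sites in the same C block share:

```python
from collections import Counter

def c_block(ip_address):
    """Return the first three octets of an IPv4 address,
    e.g. '123.123.45.67' -> '123.123.45'."""
    return ".".join(ip_address.split(".")[:3])

# Hypothetical IP addresses of sites that cross-link to one another.
linked_site_ips = ["123.123.45.10", "123.123.45.11", "123.123.45.12", "98.76.54.32"]

counts = Counter(c_block(ip) for ip in linked_site_ips)
for block, count in counts.items():
    if count >= 20:  # speculative threshold discussed above
        print(f"C block {block}.* hosts {count} cross-linked sites; consider spreading them out")
```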

To avoid this potential filter, it is a good idea to avoid reciprocally cross-linking many websites hosted in the same place. A good method is to link them in a triangle: A to B, B to C, and C back to A. Do not use any other combinations, as they could trip a filter in the algorithm. Being careful with cross-linking should avoid any such problems.

Potential multiple cause filters

The so-called over-optimization filter appears to result from a number of triggers rather than from a single cause. No one is certain whether an over-optimization filter ever existed, but strong evidence of it was seen during the infamous Florida update and the later Austin update.

The main culprits appear to have been the now-isolated anchor text filter, various on-page problems caused by overuse of h1 and h2 tags, and some keyword stuffing concerns.

The over-optimization filter appears to have been most active for the most competitive, and therefore most spam-laden, keywords and phrases. Website owners in the more competitive keyword areas should keep keyword levels in their content reasonable; certainly no more than 3% density.
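A quick sanity check on keyword density might look like the naive word-count sketch below; the 3% figure is the speculative ceiling mentioned above, and the function is an illustration rather than any official measure.

```python
import re

def keyword_density(text, keyword):
    """Rough keyword density: words belonging to the phrase divided by total words."""
    words = re.findall(r"[a-z0-9']+", text.lower())
    if not words:
        return 0.0
    phrase = keyword.lower().split()
    hits = sum(
        words[i:i + len(phrase)] == phrase
        for i in range(len(words) - len(phrase) + 1)
    )
    return hits * len(phrase) / len(words)

page_copy = "Blue widgets are our specialty. We review blue widgets and widget accessories."
density = keyword_density(page_copy, "blue widgets")
print(f"Keyword density: {density:.1%}")
if density > 0.03:
    print("Above the ~3% level discussed above; consider rewriting more naturally.")
```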

Writing naturally will work best for the search engines and for conversions to sales. Overuse of keywords may even trigger a filter all by itself. Be careful not to overuse h1 and h2 tags, and make certain each page has a unique title tag related to that page’s actual content.

Some people believe a damper is placed on new incoming links: freshly created links allegedly don’t give the same bang for the buck as older, more established links. This theory is usually discussed in conjunction with the famous sandbox filter.

The sandbox filter allegedly works this way: a new site receives a fresh-site bonus from Google and ranks highly. Following that initial brush with fame and fortune, the site drops in the search rankings, and drops, and then drops some more. That is where the alleged sandbox takes hold.

Once in the sandbox, the site will take anywhere from two to four months to rise in the SERPs to a respectable position. During that sandbox period, regardless of the number and quality of inbound links and its PageRank, the site will not rank well at all.

It is thought that gaining too many links too quickly might be part of the reason for the sandbox. On the other hand, building up a domain with incoming links prior to site launch may help avoid the sandbox entirely.

Some observers believe only certain search terms are filtered through the sandbox. If that is true, then the sandbox is tied in with the alleged most competitive search term filter as well.

In any case, it’s wise to take the ranking delay into consideration when launching a new site. Perhaps setting the site live slightly earlier than planned for business will allow time for any damper placed on the site to be removed.

Duplicate content may not only trigger a filter; sites that contain a large number of pages with similar content might be targeted as well. If that is indeed the case, then webmasters must be careful to differentiate their pages more completely. Of course, that is a great idea from an SEO point of view anyway: more distinct pages, each with its own title and main keywords, means more potentially high-ranking pages in the search engines.

Many of the larger sites that may be experiencing a possible large-site and duplicate-content filter are thought to be affiliate sites. If that’s true, then there may even be a bias against affiliate sites, enforced by clamping down more tightly on duplicate or very similar content.

As always, the best way to avoid the possibility of duplicating web page content is to make certain all pages contain unique and differentiated copy. By continually updating and adding different pages, you will also benefit from the search engine algorithm’s preference for fresh content.
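As a simple illustration of spotting near-duplicate copy (the 90% similarity cutoff is arbitrary, and Google’s actual criteria are unknown), Python’s standard difflib can compare page texts pairwise:

```python
from difflib import SequenceMatcher
from itertools import combinations

# Hypothetical page texts keyed by URL path.
pages = {
    "/widgets/blue.html": "Our blue widgets come in three sizes and ship worldwide.",
    "/widgets/red.html": "Our red widgets come in three sizes and ship worldwide.",
    "/about.html": "We are a family-run widget shop founded in 1998.",
}

# Flag any pair of pages whose text is more than ~90% similar (arbitrary cutoff).
for (path_a, text_a), (path_b, text_b) in combinations(pages.items(), 2):
    ratio = SequenceMatcher(None, text_a, text_b).ratio()
    if ratio > 0.9:
        print(f"{path_a} and {path_b} are {ratio:.0%} similar; consider rewriting one")
```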

Conclusion

As stated at the outset, many of these alleged filters are based on partial observation, and the filters and dampers are not definitely in place. On the other hand, where there is smoke, fire might be close at hand.

The best way to avoid any filters or dampers is to employ common-sense search engine optimization practices. Mix up your link anchor text. Write natural-sounding, unique content for each web page without keyword stuffing. Avoid duplicate content. Don’t employ any questionable practices designed only to fool the search engines.

Following SEO best practices will serve you well in the long run, and you will avoid any filters, real or imagined.
