Nobody wants to upset the 800
lb. gorilla in the room named Google.
Yet, that is precisely what many business owners do when they employ
serial sites to do their bidding. Google
hates serial sites with a passion.
That’s because these online clones attempt to generate rankings by the cookie-cutter method, where essentially the same site is used to target a number of individual keywords. If the Googlebots catch you using serial sites, all your sites could wind up sandboxed (or at the very least lose ranking).
The real question then becomes, “What is a serial site?” The answer has changed over the past year or so. In fact, landing pages that were once considered perfectly legitimate by Google have been deemed serial sites, and many website owners found this out only when their landing pages disappeared from the first page of Google. Other recent changes in Google’s algorithms can also kill a site’s ranking, including being mobile-unfriendly, using certain kinds of programming languages and methodologies, and where and how your site employs backlinks. On today’s Working the Web to Win blog, I am going to explore the ins and outs of technology that can kill your site stone cold dead in the eyes of the world’s most popular search engine.
A chart describing the search engine market (photo credit: Wikipedia)
Why all the Fuss?
If you have been working the web for any length of time, you have no doubt heard about Google’s algorithm changes that sport cute names like Panda, Penguin, Hummingbird and Pigeon. The majority of these sea changes were rolled out to deal with professional search engine optimizers who for years plied their trade by using a number of no-holds-barred techniques to advance their clients’ rankings regardless of the rules. This is what is known in the business as Black Hatting.
Up until around 2010, the search
engine spiders weren’t sophisticated when it came to understanding what they read
on websites. So Black Hat techniques like
keyword stuffing, using invisible text, cloaking, redirecting, content spamming
and link farms were employed with glee. Many
black hat SEO pros made a tidy sum by helping clients cheat their way to the top
of the search engines. Then a funny thing happened on the way to the bank. The search engines started programming their algorithms
to selectively search for black hat technology.
Rolled out in February 2011, Panda marked the first time the Googlebots were able to look at websites from a contextual standpoint. In other words, they were not only able to read the page, they could also make qualitative judgments about the validity of the text they were seeing. Among other things, they looked for nonsense statements stuffed with keywords of the kind typically used by black hatters. They also kept a weather eye out for factual errors, invisible or micro text, duplicate text, redirects and a number of other telltale hints that a site was employing black hat techniques.
While this didn’t exactly put all black hat operators
out of business overnight, it did put a dent in their nefarious business practices.
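To make the keyword-stuffing point concrete, here is a minimal Python sketch of the kind of density check a crawler might run. It is purely illustrative, not Google’s actual algorithm; the sample copy, the keyword_density helper and the 10% threshold are all assumptions made for the example.

```python
import re
from collections import Counter

def keyword_density(text: str, keyword: str) -> float:
    """Return the fraction of words in the text that match the keyword."""
    words = re.findall(r"[a-z']+", text.lower())
    if not words:
        return 0.0
    return Counter(words)[keyword.lower()] / len(words)

# A deliberately stuffed snippet of the sort Panda was built to catch.
page_copy = ("Buy hot dogs online. Hot dogs shipped fast. Best hot dogs. "
             "Hot dogs, hot dogs, hot dogs.")

density = keyword_density(page_copy, "dogs")
print(f"'dogs' accounts for {density:.0%} of the copy")
if density > 0.10:  # illustrative threshold, not a published Google figure
    print("That reads like keyword stuffing.")
```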
Penguin Takes Flight
Rolled out in April 2012, Penguin took aim at webspam, and in particular at manipulative link building. Sites that leaned on link farms, paid links and keyword-stuffed anchor text watched their rankings tumble, and many businesses had to spend months cleaning up their backlink profiles to recover.
Hummingbird Hums a Different Tune
Launched in the fall of 2013, Hummingbird was less a tweak than a rewrite of Google’s core search algorithm. It shifted the emphasis from matching individual keywords to understanding the intent behind entire queries, which rewarded pages written in plain, conversational language and further devalued pages built around strings of keywords.
Released on July 24, 2014, Pigeon was tasked with increasing the value of local search. What this algorithm tweak was supposed to do was make local searches more intuitive by providing results based upon the searcher’s geographic location and the proximity of the businesses being searched for. While this change benefited a number of local businesses, it also had the unsettling effect of diminishing the results of a number of businesses that worked on a national or even global scale. It also gave more weight to online directories and portals that aggregated local listings. Like most algorithm changes, this caused initial panic among those whose page 1 positions were usurped, followed by damage control to reclaim the lost territory.
The Geotargeted Faux Pas
This brings us back to the top
of our story since serial sites were often used to regain lost ground by creating
geotargeted sites. If you sold hotdogs online,
you might commission a number of sites that were targeting major cities, such as
PhiladelphiaHotDogs.com, DetroitHotDogs.com and DenverHotDogs.com. These sites would
be virtual clones of one another with the exception of their URL and the name of
the city in the content. While initially
successful, this technique was also deemed off limits and the Googlebots were once
again programmed to seek and destroy those who employed serial sites. The fallout meant that others who were using legitimate
landing pages were also scooped into the serial site net and tarred with the same
brush.
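To show why such clones are easy to spot, here is a minimal Python sketch of duplicate detection using word shingles and Jaccard similarity. It illustrates the general idea only; the shingle size, the 0.8 threshold and the sample hot-dog copy are assumptions, and Google’s real systems are far more sophisticated.

```python
import re

def shingles(text: str, k: int = 3) -> set:
    """Return the set of overlapping k-word sequences (shingles) in the text."""
    words = re.findall(r"[a-z]+", text.lower())
    return {" ".join(words[i:i + k]) for i in range(len(words) - k + 1)}

def jaccard(a: set, b: set) -> float:
    """Jaccard similarity: shared shingles divided by all distinct shingles."""
    return len(a & b) / len(a | b) if (a | b) else 0.0

philly = ("We ship premium all-beef hot dogs straight to your door, with classic "
          "toppings, fast delivery and a money-back guarantee in Philadelphia.")
denver = ("We ship premium all-beef hot dogs straight to your door, with classic "
          "toppings, fast delivery and a money-back guarantee in Denver.")

similarity = jaccard(shingles(philly), shingles(denver))
print(f"Similarity: {similarity:.2f}")
if similarity > 0.8:  # illustrative threshold
    print("These pages are near-clones, the classic serial-site fingerprint.")
```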
However, that does not mean that
landing pages have to be abandoned altogether.
Let’s say that you sell apples, bananas, oranges and grapes online. While you can set up a FruitsRUs website that
gathers all these elements under one umbrella, this isn’t the most efficient way
to please either the Googlebots or prospective website visitors. In the first place, by featuring four different fruits on one site, the Googlebots will not give priority to any single one of them. This means watered-down search results. It also means that a potential customer who happens upon your site has to hunt for the fruit they seek. This usually translates into a high bounce rate.
Here's what Google has to say about hiring an SEO vendor:
In order to reduce the bounce
rate and improve ranking, what many savvy business owners did was create four separate
landing pages, each of which gave priority to a single item. So ApplesRUs, BananasRUs, GrapesRUs and OrangesRUs
were created. While this was cost and time
efficient, it was not effective. If each of these sites were virtual clones of one
another, with the exception of the URL, they would soon be deemed serial sites and
sandboxed. To avoid the ire of the Googlebots, each site can retain the FruitsRUs brand (i.e., the look and feel), but the text, graphics, videos and even the offers on each of these pages must be unique and focused on that page's particular fruit.
The more your landing pages focus
on a single subject, the higher they will rank on those specific keywords (assuming
all other ranking factors are equal). So, for example, if a particular page focuses on grapes, with keywords targeting grapes and content, offers, pictures, videos and testimonials all about how great your grapes are, it will rank higher. It also needs to be well shared on the social nets and backlinked from many authoritative sites. A landing page created this way will outrank sites that are less focused (covering more than one fruit) or not as well connected.
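Here is a minimal Python sketch of that dilution argument: the same grape copy carries more topical weight on a grapes-only page than on a page that also sells three other fruits. The topic_focus metric and the sample copy are illustrative assumptions, not a real ranking factor.

```python
import re

def topic_focus(text: str, term: str) -> float:
    """Fraction of the page's words that are the target term."""
    words = re.findall(r"[a-z]+", text.lower())
    return words.count(term.lower()) / len(words) if words else 0.0

grapes_page = "Our grapes are hand picked. Seedless grapes and red grapes, shipped fresh."
fruits_page = (grapes_page + " We also sell apples, bananas and oranges, "
               "with apples and bananas on sale this week.")

print(f"GrapesRUs focus on 'grapes': {topic_focus(grapes_page, 'grapes'):.0%}")
print(f"FruitsRUs focus on 'grapes': {topic_focus(fruits_page, 'grapes'):.0%}")
```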
It is my opinion that the top ranking factors are the quality, timeliness and relevance of your content, how well connected it is, and whether your social engagement and page sharing are positive or negative. Make no mistake, content is king. A focused page with high-quality, relevant and timely content that is well shared and well connected will rank higher than any competing page that falls short in these respects.
Here is a list of Must Read Articles to help you achieve a
page one organic ranking.
As long as you put in the effort
to understand and avoid the speed bumps that Google has erected on the Information
Superhighway, there isn’t any reason you should be labeled a serial killer by the
world’s most popular search engine. At least not until their next algorithm “tweak”
rears its ugly head.
In this article, I have discussed how serial sites (duplicate sites) can cause your sites to be sandboxed by Google. I have also provided details on several of Google's other algorithms, including Penguin, Hummingbird and the latest one, Pigeon, with an overview of how they differ from each other and what a company can do to mitigate their effect on its search position. Finally, I have provided details to help businesses deal with the Google Pigeon update, which affects local directory listings and how a company appears in local Google searches.
If you found this article useful,
please share it with friends, family, co-workers and associates. To learn more about
this subject, read any of the articles listed above. You can also search this blog by typing "Google algorithms" or "SEO" into the search box to find other articles and podcasts on the same subject.
If you feel your business could use some help with its marketing, contact us at 904-410-2091. We will provide a free marketing analysis to help you get better results. Don't forget to Plus us on Google+ as well. If you have a comment related to this article, leave it in the comment section below. If you would like a free copy of our book, "Internet Marketing Tips for the 21st Century" (now in its 3rd edition), fill out the form in the right-hand sidebar and you will receive immediate access. Remember, your data is always protected and never sold.
Thanks for sharing your time with me.
Carl Weiss is president of Working
the Web to Win, a digital marketing agency based in Jacksonville, Florida.
You can listen to Carl live every Tuesday at 4pm Central on BlogTalkRadio.