Can Google Truly Make Quality Content King Of The Web?
For those webmasters and e-marketers who can remember a web without Google, life was much less complicated and a lot less tumultuous back then. Since Google came onto the scene and became the world's dominant search engine, things have changed drastically.
And not necessarily in a negative way: those same webmasters probably jumped for joy when they reached the top of Google for their keywords. Then they complained just as loudly when Google made one of its never-ending algorithm changes and they saw their rankings drop or, in some severe cases, disappear from the web altogether. In those early days, most of Google's major updates were kept secret until the fallout left webmasters fuming or rejoicing.
However, with recent algorithm updates, Google has openly broadcast the changes to anyone who was listening. The same openness applies to Google's recent changes dealing with "content farms" and "low quality content" in Google's SERPs. Matt Cutts stated in his blog, "we're evaluating multiple changes that should help drive spam levels even lower, including one change that primarily affects sites that copy others' content and sites with low levels of original content."
Basically, what Google is trying to do with these changes is increase the overall quality of its search results by lowering the rankings of sites it perceives as low quality or as containing little or no original content. These are typically sites that have scraped content from other websites and displayed it, usually alongside ads and/or links to affiliate products or other related sites.
At present, this only affects search traffic in the United States, but it is no small update, since 11% of queries are affected. And as some webmasters have noted, these changes are indeed improving search results.
One interesting example comes from Alexis Madrigal at www.theatlantic.com, who examined Google's new, improved search results for the keywords "drywall dust" and found that there were indeed fewer "content farm" listings in the new results.
However, it's Google's definition of "content farms" which has many long-time webmasters concerned. As an online marketer who attributes most of his success to article marketing, I am somewhat worried by Google's recent updates. I contribute articles on a regular basis to many online article directories, most of which are free for other webmasters to use as long as they keep my resource box and links attached. These articles get picked up and displayed on countless sites around the web. I also feature many of those same articles on my own site. I am sure there are thousands of webmasters who do the same thing and who are also worrying about how Google's new changes will affect all this duplicate content.
In most cases, my articles on EzineArticles get displayed at the top of the rankings in Google, sometimes even above the same article on my own main site. This is understandable, since EzineArticles is a much more respected authority site in the eyes of the search engines. Also, every site has its own unique keyword-ranking "DNA": it is already optimized for certain keywords, and any related content added to the site will rank higher in the search engines, especially Google.
Years ago, I tried on several occasions to use “spin software” to make all my articles unique, but I could never bring myself to accept the resulting spins or versions of my articles. They just didn’t seem right and didn’t have the right flow. For me, writing has always been more of a pleasure than a chore and corrupting it in any way is just not worth it. Besides, I have been horrified more than once by seeing fragments of my articles mutilated on some of the aforementioned low quality sites which have scraped my content from the web.
Instead, I started writing unique articles or content which I placed on other sites. One of the main directories for this was Buzzle, which switched over to accepting only unique content two or three years ago. I have monitored Buzzle over the years and noticed that its traffic stats have climbed steadily, probably due to all this constant, unique content being added. Other article directories show a more see-saw pattern in their traffic numbers if you compare them on sites like Alexa.
I believe this whole issue comes down to quality content and what the search engines perceive as quality. Just because something is unique doesn't mean it's quality content. Google has to judge the quality of the content it finds on the web, and it has over 200 ranking factors which it says can filter out the top content and present it to the searcher. The recent "content farm" update is difficult to evaluate: just because content is duplicated or appears on another site doesn't mean it lacks quality.
As a webmaster, I have always placed related videos and press releases on my sites to complement my own content. I also reference other sites and data in my articles to back up an opinion or to prove a point. Going forward, though, I will be very wary of placing content on my sites which is not unique.
Sometimes I find it ironic that Google, since day one, has not created any unique content… its robots crawl the web and compile that information into search results. The quality of those results largely depends on the quality of the scraped content and how well its algorithm can filter out the low quality stuff.
Supposedly, no human eyes judge this whole process, which I don't believe for a minute. Google's engineers are constantly monitoring the results and adjusting the algorithm to filter out what they don't like, which brings us back to the question at hand. Can Google really perfect a system where only the quality content on the web rises to the top?
Obviously, they can use such factors as bounce rates, time spent on-site, pageviews per visitor, direct access, backlinks from authority sites… and bookmarks on the social media/networking sites. Let's face it, if a piece of content has 2,000 re-tweets, it must contain something of interest or quality for a lot of people. Likewise, if a piece of content or video has 5,000 comments attached to it and 10,000 Diggs or Likes… chances are good that it is of high quality.
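To make the idea concrete, the kind of engagement-based weighting described above can be sketched as a toy scoring function. Everything here is an illustrative assumption on my part: the signal names, weights, and benchmarks are invented for the example, and Google's real ranking algorithm is not public.

```python
# Toy quality score built from engagement signals like those mentioned above.
# The weights and "excellent" benchmarks are illustrative assumptions only.

def engagement_score(signals: dict) -> float:
    """Combine engagement signals into a single 0-100 quality score."""
    # Hypothetical weights: social proof counts for more than raw browsing stats.
    weights = {
        "retweets": 0.30,
        "comments": 0.20,
        "likes": 0.20,
        "time_on_site_sec": 0.15,
        "pages_per_visit": 0.15,
    }
    # Normalize each signal against a rough benchmark for "excellent".
    benchmarks = {
        "retweets": 2000,
        "comments": 5000,
        "likes": 10000,
        "time_on_site_sec": 300,
        "pages_per_visit": 5,
    }
    score = 0.0
    for name, weight in weights.items():
        ratio = min(signals.get(name, 0) / benchmarks[name], 1.0)  # cap at 100%
        score += weight * ratio
    return round(score * 100, 1)

viral_page = {"retweets": 2000, "comments": 5000, "likes": 10000,
              "time_on_site_sec": 300, "pages_per_visit": 5}
thin_page = {"retweets": 3, "comments": 0, "likes": 12,
             "time_on_site_sec": 15, "pages_per_visit": 1}

print(engagement_score(viral_page))  # 100.0
print(engagement_score(thin_page))   # 3.8
```

The capping step matters: without it, one viral metric could mask thin content everywhere else, which is exactly the kind of gaming a search engine would want to resist.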
Of course, there is also the technological/mechanical side to site quality. If a site loads slowly and has countless dead links, then it can be easily ranked as low quality. Google has always maintained a user/surfer’s experience is important to what it lists in its results. Content farms and sites with little or no unique content would probably be high on Google’s list of what not to display.
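The mechanical factors above lend themselves to the same kind of sketch. This is a hypothetical classifier of my own: the cutoffs for load time and dead-link ratio are made-up thresholds, not anything Google has published.

```python
# Toy technical-quality check using the mechanical factors mentioned above:
# page load time and dead links. The cutoffs are illustrative assumptions.

def technical_quality(load_time_sec: float, total_links: int, dead_links: int) -> str:
    """Classify a page as 'high', 'medium', or 'low' technical quality."""
    dead_ratio = dead_links / total_links if total_links else 0.0
    # Hypothetical cutoffs: over 5s to load, or >20% dead links, reads as "low".
    if load_time_sec > 5.0 or dead_ratio > 0.20:
        return "low"
    if load_time_sec > 2.0 or dead_ratio > 0.05:
        return "medium"
    return "high"

print(technical_quality(1.2, 100, 2))   # high
print(technical_quality(3.5, 100, 2))   # medium
print(technical_quality(8.0, 50, 20))   # low
```

Unlike judging prose quality, these checks are fully mechanical, which is presumably why slow, link-rotted sites are the easy half of the problem for an algorithm.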
However, judging the quality of a piece of information or writing, without human eyes viewing it, is not so easy. Unless Google has a thousand little Watsons running quietly in the background, intelligently reading and rating all that content, making only quality content king of the web will be extremely difficult for Google to do. Time will tell.