rouletteguru

Will inaccurate information penalize your website?

by rouletteguru, 20 April 2016 at 7:47 pm (3224 Views)
This article is a follow-up to my previous post about Google's fact algorithm. Basically, it's designed to give a lower rank to websites with inaccurate information. After all, the truth is not always popular. Currently, even an unpopular website created by a scammer can rank highly, because it creates controversy and buzz on forums and therefore gets a lot of links. This is just one example of how a bad website can rank well.

Of course, only Google knows the exact algorithms used to rank websites, but this article covers what is likely and what we already know. Keep in mind this may be the next Panda or Penguin, so you need to prepare your sites by ensuring your information is accurate. And if you are one of the affiliates loading a website with trash like claims that the Martingale is the best system, chances are you'll see a drop in rankings.

One of the elements of Google's new algorithm is the ability to determine facts on a webpage with the least possible assistance from outside sources such as links. It is more of an artificial intelligence algorithm than one that refers to a database of facts. At least that's what Google's patents appear to indicate. Artificial intelligence is still a bit of a mystery to most of us, and if you've ever had a conversation with artificial intelligence, it's really hard to call it intelligent.

But replicating human emotion and responses is very different from using mathematical equations to understand the content of a website. One possible example: one part of your site may say one thing, while another part contradicts it. Contradiction is indicative of inaccurate information; after all, two opposing statements can't both be right. Technically it does depend on the context of the statements, but generally speaking, opposing statements indicate a discrepancy. And it's likely that Google will be able to detect that discrepancy from your own material alone.
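To make this concrete, here is a minimal sketch of the kind of cross-check I mean. It is purely illustrative and assumes the claims have already been pulled out of the pages as numbers; the page paths and values are made up, and this is certainly not Google's actual code.

```python
# Illustrative sketch only -- not Google's algorithm.
# Flags pages on the same site that make conflicting numeric claims
# about the same topic (here, the house edge of European roulette).

claims = {
    "/martingale-guide": {"house_edge_percent": 0.0},  # "the system beats the house"
    "/roulette-odds":    {"house_edge_percent": 2.7},  # the page stating the real odds
}

def find_contradictions(claims, tolerance=0.1):
    """Return pairs of pages whose stated values disagree beyond a tolerance."""
    pages = list(claims.items())
    conflicts = []
    for i in range(len(pages)):
        for j in range(i + 1, len(pages)):
            page_a, facts_a = pages[i]
            page_b, facts_b = pages[j]
            for key in facts_a.keys() & facts_b.keys():
                if abs(facts_a[key] - facts_b[key]) > tolerance:
                    conflicts.append((page_a, page_b, key))
    return conflicts

print(find_contradictions(claims))
# [('/martingale-guide', '/roulette-odds', 'house_edge_percent')]
```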
If you've ever followed a Wikipedia page, you'll notice that so-called facts tend to change; something as simple as the birthplace of an individual, for example. Obviously this is because Wikipedia is an open repository of information, and it relies on the conceptions and beliefs of its writers, who are often wrong. This is why Wikipedia encourages the use of citations and references. But references alone are not always sufficient, because the sources may not be credible. At the very least, though, information on Wikipedia is usually consistent, perhaps because of the internal links that point to related information to substantiate claims.

You would have seen the claim on many SEO websites that content is king, but not many people understand exactly what this means. Ultimately, with the present algorithms used by Google, it means quality and authority websites are more likely to link to good content. Additionally, metrics related to bounce rates will be lower, indicating to Google that your website gave readers what they were looking for. Matt Cutts once said that one of the keys to search engine optimisation is to answer questions and give visitors what they are looking for. This is more than a hint. To me it is like saying: just focus on making your content great, and let us develop the algorithms to help people find it.

One of the methods Google employs to test its algorithms is manually selecting some of the best and worst websites, according to human opinion. In other words, their staff manually choose great (and poor) websites, then the algorithms are tested against these websites to see how they rank them. If the resulting ranks closely match the manual selection by humans, the new algorithm is considered a success. This is basically what they did with Panda and Penguin.
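As a rough illustration of that evaluation step (my own sketch, not anything Google has published), you can measure how closely an algorithm's ordering agrees with a human-curated ordering using a rank correlation such as Spearman's rho. The site names and ranks below are invented.

```python
# Compare an algorithm's ranking against a human-curated ranking using
# Spearman rank correlation (valid for rankings with no ties).

def spearman(rank_a, rank_b):
    """Spearman's rho for two rankings of the same items (1.0 = identical order)."""
    n = len(rank_a)
    d_squared = sum((rank_a[s] - rank_b[s]) ** 2 for s in rank_a)
    return 1 - (6 * d_squared) / (n * (n ** 2 - 1))

human_rank = {"site_a": 1, "site_b": 2, "site_c": 3, "site_d": 4, "site_e": 5}
algo_rank  = {"site_a": 1, "site_b": 3, "site_c": 2, "site_d": 4, "site_e": 5}

print(round(spearman(human_rank, algo_rank), 2))  # 0.9 -- close agreement
```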
Getting back to my point about artificial intelligence: Google is heading towards ranking websites based on the accuracy of their information. It is not feasible for a staff member to research the accuracy of every claim on a website, and it's just not realistic for every new page to be reviewed by a real person - in fact, this is how the first versions of Yahoo worked, and things didn't turn out well for them. So the future is artificial intelligence closely guided by real humans. Of course, you may consider any mathematical algorithm to be artificial intelligence, but I'm referring to something much smarter, which can cross-reference information on your website. It may be an independent algorithm, or part of Panda, which assesses the overall quality of content on a domain.

What does this mean for webmasters? Basically, get your facts straight. The example I used before: if your website promotes the Martingale as one of the best gambling systems, but another page explains the odds and probabilities of the game, it is entirely possible that this information alone can be used to prove your content is inconsistent. Perhaps Google's initial release of the artificial intelligence won't do the mathematics in this sense, but Google tends to catch up and refine its algorithms, so I don't believe it's far behind. An extension of this: if you copy inaccurate information from another website, Google is likely to eventually detect this and give better rank to accurate websites. So make sure your content is accurate.
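For what it's worth, here is the arithmetic that makes the Martingale claim inconsistent with any honest odds page. It is my own worked example for European roulette, not something Google computes: every unit staked on an even-money bet carries the same negative expectation, and a doubling progression simply stakes more units, so a site cannot truthfully call it the best system while also stating the correct odds.

```python
# Expected value of an even-money bet on European roulette, and of one
# Martingale round (double after each loss, stop at the first win or
# after five straight losses).

p_win = 18 / 37                # red/black, odd/even, high/low
p_lose = 1 - p_win
ev_per_unit = p_win - p_lose   # about -0.027: the house edge
print(f"Expected value per unit staked: {ev_per_unit:.4f}")

# A win at any step of the doubling progression recovers prior losses plus
# one unit; losing every step costs the whole 31-unit progression.
stakes = [1, 2, 4, 8, 16]
p_bust = p_lose ** len(stakes)
ev_round = 1 * (1 - p_bust) - sum(stakes) * p_bust
print(f"Expected value of one Martingale round: {ev_round:.4f}")  # about -0.14
```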
Tags: google
Categories: Uncategorized