Google’s latest search research paper hints at major changes on the horizon, says Danielle Haley, Co-director at FSE Online Ltd
Earlier this year, Hal Hodson, the US technology reporter for New Scientist, wrote an article describing how Google is planning to rank websites based on facts, not links.
If you have been following the major Google search updates in the past few years, you will already be familiar with the Panda and Penguin updates, which have started to tackle the problem of low quality content and manipulative link building. It would seem that this is just a sign of things to come.
Hal Hodson cited a Google Inc. research paper, published in February 2015 in the Proceedings of the VLDB Endowment, called Knowledge-Based Trust: Estimating the Trustworthiness of Web Sources. This could well be one of the most ground-breaking pieces of search theory to come out of Google since Brin and Page published The Anatomy of a Large-Scale Hypertextual Web Search Engine in 1998. What does it mean for websites?
Google’s Panda and Penguin search algorithm updates, which started in 2011 and 2012 respectively, have shown us that Google is serious about stamping out low-quality and spammy sites. For too many years, ‘black hat’ SEOs have manipulated the search engines to place their websites at the top, leaving honest businesses floundering.
Four years on from the launch of Panda, we certainly approve of the changes – Google search results are now a fairer, more level playing field. However, although Google has been refining and adjusting these updates, along with multiple others designed to improve the visibility of the best websites, there is still spam in the system.
Google feels that its users want to discover only the best businesses online. Google, and most searchers, believe that the sites shown first and foremost should be those that provide the best service to the end user and also share accurate information.
Although the Panda update did tackle low-quality content, it has not succeeded in promoting high-quality content – there is a big difference between punishing the bad and promoting the good.
If Google can develop a system that can accurately identify web pages that are sharing factually correct, fresh and relevant content, they may well be able to further reduce the importance of PageRank in their search results.
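To make the idea concrete, the core concept behind Knowledge-Based Trust can be caricatured in a few lines of Python. This is a toy sketch only, not Google's actual algorithm: the knowledge base and the page's ‘facts’ below are invented for illustration, and the real paper uses large-scale fact extraction with probabilistic models of extractor and source accuracy.

```python
# Toy illustration of the Knowledge-Based Trust idea: score a page by
# the share of its extracted factual claims that agree with a
# reference knowledge base. (Invented data, not Google's system.)

KNOWLEDGE_BASE = {
    ("Eiffel Tower", "city"): "Paris",
    ("Mount Everest", "continent"): "Asia",
}

def kbt_score(extracted_facts):
    """Fraction of a page's (subject, attribute, value) claims that
    match the knowledge base; claims the base cannot verify are ignored."""
    checked = [(s, a, v) for (s, a, v) in extracted_facts
               if (s, a) in KNOWLEDGE_BASE]
    if not checked:
        return None  # nothing verifiable on this page
    correct = sum(1 for (s, a, v) in checked
                  if KNOWLEDGE_BASE[(s, a)] == v)
    return correct / len(checked)

page = [("Eiffel Tower", "city", "Paris"),        # correct
        ("Mount Everest", "continent", "Europe")]  # incorrect
print(kbt_score(page))  # 0.5 – half of this page's checkable facts are right
```

Under a scheme like this, a page full of verifiably accurate statements would score highly regardless of how many links point at it – which is exactly why the approach is seen as a potential complement, or alternative, to PageRank.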
It is impossible to know whether the original PageRank will eventually be decommissioned. However, seeing that many businesses are still managing to game Google through the manipulation of content and links, it is hardly surprising to learn that Google is at least researching feasible alternatives.
What does it mean for businesses? The answer is simple – if you are building a business website that is rich in high-quality information that is factually correct and properly referenced (where applicable), you should perform well in the future. Businesses need to ensure that their web copy is informative, engaging and optimised for their industry. To accomplish this, it is advisable to use professional copywriters who understand both the needs of business and the expectations of Google and other search engines.
However, for the time being at least, building good quality and natural links from authoritative websites is still one of the most important SEO strategies you can employ.