The question of whether a website's compression ratio affects its search rankings is almost as old as Google itself. Read about the research behind it and what level of content compression is tolerable for SEO.
Website compression benefits everyone
A basic example of compression that everyone is familiar with is shrinking documents into a zip file. Compressibility, from a search engine's perspective, refers to how effectively a page can be compressed. Search engines, and they all do this, compress pages to save storage space and to process them faster.
Compressing web pages is generally useful because it allows search engines to access the site faster, which signals to crawlers that the server is not overloaded and that even more pages can be crawled and indexed.
Compression and page acceleration provide a better user experience for website visitors. Compression is also automatically enabled by most web hosts because it saves bandwidth.
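Because most hosts enable compression automatically, it is easy to check from the outside whether a given site actually serves compressed responses. The snippet below is a minimal sketch using Python's requests library; the URL is a placeholder, and the check simply reports which Content-Encoding the server chose.

```python
import requests

def served_encoding(url):
    """Return the Content-Encoding header the server responded with, if any."""
    # Advertise support for the common encodings; most hosts will pick one.
    resp = requests.get(url, headers={"Accept-Encoding": "gzip, deflate, br"}, timeout=10)
    return resp.headers.get("Content-Encoding")

if __name__ == "__main__":
    encoding = served_encoding("https://example.com/")  # placeholder URL
    print(f"Content-Encoding: {encoding or 'none (served uncompressed)'}")
```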
High compression rates are associated with spam
Renowned researchers Marc Najork and Dennis Fetterly stated in their 2006 study that highly compressible pages tend to indicate poor-quality content.
Their analysis showed that 70% of pages with a compression ratio of 4.0 or higher were low-quality pages with a high level of redundant words, while the average compression ratio across websites was around 2.0 (a sketch of how such a ratio is computed follows the list below).
Summary statistics for “normal” websites included in the research:
- Compression ratio 2.0: the most common (modal) value in the dataset.
- Compression ratio 2.1: the median; half of the examined pages compress below 2.1 and half above it.
- Compression ratio 2.11: the mean across all analyzed pages.
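To make these numbers concrete, here is a minimal sketch assuming the conventional definition of compression ratio: uncompressed size divided by gzip-compressed size (the study's exact methodology may differ). Short snippets give only rough numbers, but repetitive, keyword-stuffed text clearly compresses far more readily than varied prose.

```python
import gzip

def compression_ratio(html):
    """Uncompressed byte length divided by gzip-compressed byte length."""
    raw = html.encode("utf-8")
    return len(raw) / len(gzip.compress(raw))

# Varied prose versus the kind of repetitive, keyword-stuffed text
# the study associates with spam. Real pages are much longer, so real
# ratios land closer to the 2.0 region reported in the research.
varied = ("Search engines compress crawled pages to save storage and to speed up "
          "processing, and most hosting providers enable gzip or brotli by default "
          "because it also reduces bandwidth costs for ordinary visitors.")
stuffed = "buy cheap widgets best cheap widgets cheap widgets online " * 50

print(f"varied text ratio:  {compression_ratio(varied):.2f}")
print(f"stuffed text ratio: {compression_ratio(stuffed):.2f}")
```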
Differences in compressibility could serve as an easy initial filter for search engines to detect and eliminate “heavy” spam content. But it's not that simple, as crawlers use much more sophisticated methods that combine a variety of spam signals to increase accuracy.
The research also found that spammy content was present on 70% of pages with a compression ratio of 4.0 or higher, which means the remaining 30% were not spam. Search engines therefore need to use more than one signal to identify spam.
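The sketch below illustrates that point. It is not the researchers' actual classifier: the second signal (the share of repeated words) and both thresholds are assumptions chosen purely for illustration, but it shows how requiring two signals to agree trims the false positives a compression-only filter would produce.

```python
import gzip

def compression_ratio(text):
    raw = text.encode("utf-8")
    return len(raw) / len(gzip.compress(raw))

def repeated_word_fraction(text):
    """Share of words that are repeats of a word already seen on the page."""
    words = text.lower().split()
    return (1 - len(set(words)) / len(words)) if words else 0.0

def looks_spammy(text, ratio_threshold=4.0, repeat_threshold=0.8):
    # The 4.0 ratio comes from the study; the 0.8 repeat threshold is an
    # illustrative assumption. Requiring both signals to fire avoids flagging
    # the ~30% of highly compressible pages that are not actually spam.
    return (compression_ratio(text) >= ratio_threshold
            and repeated_word_fraction(text) >= repeat_threshold)
```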
Do search engines take compressibility into account?
It would be logical to assume that search engines use compressibility to recognize obvious, aggressive spam on the web.
However, the reality is that they also use other signals to increase the accuracy of their spam metrics, so it remains unclear whether compressibility on its own is taken into account when ranking.