Googlebot analyzes no more than 15 MB of data per page


According to Search Engine Journal, Google evaluates no more than the first 15 MB of content per page.

As it turns out, Googlebot, the crawler that fetches pages for the search index, takes into account only the first 15 MB of data on a page. This is stated in the company's official documentation.

Google reported that this limit is firm: all content beyond it is ignored. Media files, JavaScript, and CSS are fetched and evaluated separately from the HTML. Once the 15 MB cutoff is reached, the bot proceeds directly to indexing and evaluating the resource.

This prompted many developers and SEO experts to ask whether the bot could encounter a couple of images at the top of a page and then ignore all the text that follows them. Google specialists clarified that images are not part of the HTML file: they are analyzed in parallel and do not count toward the 15 MB limit on HTML.

That said, it is worth noting that the recommended size of an HTML file is no more than about 100 KB, so in practice this limit will affect very few sites.
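For a site owner who wants a quick sanity check, the comparison described above can be sketched in a few lines. This is a minimal illustration, not an official tool: the two thresholds are taken from the article (the 15 MB crawl cutoff and the ~100 KB recommendation), and the function name is hypothetical.

```python
# Thresholds are assumptions drawn from the article, not an official API.
GOOGLEBOT_LIMIT_BYTES = 15 * 1024 * 1024   # Googlebot reads only the first 15 MB
RECOMMENDED_HTML_BYTES = 100 * 1024        # commonly recommended HTML size (~100 KB)


def html_size_report(html: str) -> dict:
    """Return the UTF-8 byte size of an HTML document and how it
    compares to the crawl cutoff and the recommended size."""
    size = len(html.encode("utf-8"))
    return {
        "bytes": size,
        "within_crawl_limit": size <= GOOGLEBOT_LIMIT_BYTES,
        "within_recommended": size <= RECOMMENDED_HTML_BYTES,
    }


if __name__ == "__main__":
    sample = "<html><body>" + "<p>hello</p>" * 1000 + "</body></html>"
    print(html_size_report(sample))
```

Note that the size is measured on the raw HTML bytes only, matching the article's point that images and other referenced resources do not count toward the HTML limit.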


© 2022 ZoNa365.ru - Theme by WPEnjoy · Powered by WordPress