The first step is to check that access to scripts is not blocked by the robots.txt file. If Googlebot cannot fetch the JavaScript and CSS files a page depends on, the page will not render properly. Googlebot does not behave like a real user and cannot perform certain interactions; additionally, its resources are limited. Googlebot is unable to perform interactions such as clicking a button, scrolling, or using cookies and local storage. If access to content requires user interaction (e.g. clicking), that content will not be indexed — this applies, for example, to content loaded only after an action such as clicking a button. Googlebot may also fail to download all of the page's resources.
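You can check whether a given resource is blocked for Googlebot programmatically. Below is a minimal sketch using Python's standard-library robots.txt parser; the robots.txt content and URLs are hypothetical examples, not taken from any real site:

```python
from urllib.robotparser import RobotFileParser

# Hypothetical robots.txt that blocks Googlebot from the /scripts/ directory
robots_txt = """
User-agent: Googlebot
Disallow: /scripts/
"""

parser = RobotFileParser()
parser.parse(robots_txt.splitlines())

# A script blocked like this cannot be fetched during rendering,
# so any content it injects will be missing from the indexed page.
print(parser.can_fetch("Googlebot", "https://example.com/scripts/app.js"))  # False
print(parser.can_fetch("Googlebot", "https://example.com/index.html"))      # True
```

In practice you would fetch the live robots.txt (e.g. with `parser.set_url(...)` and `parser.read()`) and test the URLs of the scripts your page actually loads.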
Site verification with the URL Inspection tool in Google Search Console

If you have access to the Google Search Console property for the analyzed page, you should first use the URL Inspection tool. Paste the address of the page into the search bar at the top of the GSC interface. After a short while, you should receive information about the analyzed URL; however, to check the current, live version of the page, you must also click the Test Live URL button. Once we have retrieved the address.
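The same inspection can also be triggered programmatically through the Search Console URL Inspection API. The sketch below only builds the request body; the URLs are placeholders, and a real call would additionally need an OAuth token authorized for the verified property — this is an illustrative assumption, not a complete client:

```python
import json

# Endpoint of the Search Console URL Inspection API (v1)
API_ENDPOINT = "https://searchconsole.googleapis.com/v1/urlInspection/index:inspect"

# Placeholder values: replace with the page you want to inspect
# and the verified property it belongs to in GSC.
payload = {
    "inspectionUrl": "https://example.com/some-page/",
    "siteUrl": "https://example.com/",
}

body = json.dumps(payload)
print(body)
```

Sending `body` as a POST request to `API_ENDPOINT` (with valid credentials) returns the index status of the inspected URL, similar to what the tool shows in the GSC interface.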