vic
New Member
Hi all,
For a WordPress website, I have disallowed access to /wp-content/plugins/ via robots.txt. I did this (along with a couple of other disallows) to maximise crawl budget.
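For reference, the relevant part of my robots.txt looks something like this (the other disallows omitted):

User-agent: *
Disallow: /wp-content/plugins/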
Now, the Sitebulb tool highlights the following critical errors:
"Disallowed Style Sheet
CSS files that are disallowed in robots.txt, which may affect how search engines render page content. If these page resource URLs are disallowed in robots.txt, it means that Googlebot may be unable to correctly render the page content. Google relies on rendering in a number of their algorithms - most notably the 'mobile friendly' one - so if content cannot be properly rendered, this could have a knock on effect in terms of search engine rankings.
Disallowed JavaScript file
JavaScript files that are disallowed in robots.txt, which may affect how search engines render page content. If these page resource URLs are disallowed in robots.txt, it means that Googlebot may be unable to correctly render the page content. Google relies on rendering in a number of their algorithms - most notably the 'mobile friendly' one - so if content cannot be properly rendered, this could have a knock on effect in terms of search engine rankings. "
So I'm planning to remove these disallows from robots.txt, but as I'm new to SEO, I wonder whether anyone has any thoughts or further information on this?
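If I do remove them, I assume I'd be left with roughly the stock WordPress rules, i.e. something like:

User-agent: *
Disallow: /wp-admin/
Allow: /wp-admin/admin-ajax.php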
Thanks in advance,
Vic