
Google’s John Mueller explains how blocking CSS in robots.txt can cause problems when Google crawls your page.
In a new video from the #AskGooglebot series, John addressed a question about blocking CSS files hosted on a CDN via the robots.txt file. The person asking the question pointed out that the Mobile Friendliness report in his Google Search Console was red (meaning the pages weren’t considered mobile-friendly). He suspected that blocking the CSS files was the culprit and was also curious whether this would affect site rankings.
Here’s the question:
“Mobile friendliness: when we block CSS files on the CDN through robots.txt, the report on mobile friendliness in GSC is red, would this affect the actual ranking?”
John Mueller explained that Googlebot has to be able to see a page in its entirety, including HTML, CSS, JavaScript, and all other elements. This helps Google better understand the page and verify that it is mobile-friendly for users. He went on to explain that blocking CSS files in robots.txt is bad practice and may cause issues.
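If you want to check whether a stylesheet URL would be blocked for Googlebot, Python’s standard library includes a robots.txt parser. The domain, paths, and rules below are purely illustrative:

```python
from urllib import robotparser

# Hypothetical robots.txt that blocks a CSS directory on a CDN
rules = """\
User-agent: *
Disallow: /assets/css/
""".splitlines()

rp = robotparser.RobotFileParser()
rp.parse(rules)

# A stylesheet under the blocked directory is not fetchable by Googlebot,
# while a regular page remains fetchable:
print(rp.can_fetch("Googlebot", "https://cdn.example.com/assets/css/site.css"))  # False
print(rp.can_fetch("Googlebot", "https://cdn.example.com/index.html"))           # True
```

Running this kind of check against your real robots.txt files (including the one on your CDN host) is a quick way to confirm that no CSS or JavaScript assets are accidentally disallowed.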
Here’s John Mueller’s explanation in the YouTube video:
“Yes, it can cause issues and you should avoid doing that. Being able to see a page completely helps us to better understand the page and confirm that it’s mobile-friendly.”
Key Takeaway
Do not use your robots.txt file to block CSS and JavaScript files. Doing so will not increase your PageSpeed scores, but it will prevent Google from validating your site as mobile-friendly. Make sure Google can crawl and index the entire page, including all of its elements.
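As an illustration, a robots.txt along these lines keeps asset directories crawlable (the paths here are hypothetical; adjust them to your own site structure):

```
User-agent: *
# Do NOT do this — it hides stylesheets and scripts from Googlebot:
# Disallow: /assets/css/
# Disallow: /assets/js/

# If broader Disallow rules exist, explicitly allow the asset directories:
Allow: /assets/css/
Allow: /assets/js/

# Blocking genuinely private areas remains fine:
Disallow: /private/
```

The same principle applies to a robots.txt served from a CDN domain: the CSS and JavaScript paths on that host must also be crawlable.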
More videos from the #AskGooglebot series can be found on the Google Search Central YouTube channel. To keep you informed, we will publish a news article as soon as each new video is released.
To see all of the other interesting questions and answers, you can watch the entire video here: