If you received an email from Google warning that Googlebot cannot access your CSS and JS files, the step-by-step instructions below should fix the issue.
Step 1: Check Your Robots.txt settings
Go to your website URL and add robots.txt after the trailing slash. It should look like this: www.yourwebsite.com/robots.txt.
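The robots.txt file always lives at the root of the domain, no matter which page you start from. A quick sketch in Python illustrates this (the site URL is a placeholder, not a real site):

```python
from urllib.parse import urljoin

# Hypothetical page somewhere on your site -- replace with your own domain.
page = "http://www.yourwebsite.com/blog/some-post/"

# Joining an absolute path ("/robots.txt") keeps only the host name,
# which is exactly where crawlers look for the file.
robots_url = urljoin(page, "/robots.txt")
print(robots_url)  # http://www.yourwebsite.com/robots.txt
```

In other words: however deep the page you are on, the robots.txt address is always just the domain plus /robots.txt.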
Step 2: Check to see if any CSS or JS files are disallowed.
The example below shows the robots.txt rules for a WordPress website. Notice the Disallow lines that block CSS & JS files; these lines need to be removed.
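To see why such a rule is a problem, here is a minimal sketch using Python's standard urllib.robotparser against the old WordPress-style rules (the exact rules in your own file may differ; the URLs are placeholders):

```python
from urllib import robotparser

# Old-style WordPress robots.txt rules; your file may look different.
rules = """\
User-agent: *
Disallow: /wp-admin/
Disallow: /wp-includes/
""".splitlines()

rp = robotparser.RobotFileParser()
rp.parse(rules)

# A script shipped inside wp-includes is blocked by the rule above...
print(rp.can_fetch("Googlebot",
                   "http://www.yourwebsite.com/wp-includes/js/jquery/jquery.js"))  # False
# ...while a normal page is still crawlable.
print(rp.can_fetch("Googlebot", "http://www.yourwebsite.com/about/"))  # True
```

The False result is exactly what triggers the Google warning: the page itself is crawlable, but the JS file it needs to render is not.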
Step 3: Download an FTP Program.
If you already have an FTP program, skip to Step 4. I use FileZilla, but you can use any FTP client; just do a search for FTP programs. FileZilla is free to download.
Step 4: Connect via FTP program.
Once you have downloaded it, open the program and enter your FTP credentials: host name, user name, and password. If you do not have them, ask your hosting company. Then connect.
Step 5: Find the public_html Folder
Once you are connected, find the public_html folder and double-click it.
Step 6: Find the robots.txt File
Inside the public_html folder, look for the robots.txt file. When you find it, right-click it and select View/Edit. This will open the file.
Step 7: Edit the robots.txt File
Find any Disallow lines blocking CSS & JS files and delete them in the editor. If you are using WordPress, remove WordPress-specific rules such as Disallow: /wp-includes/. Once you have removed them, save the file to your desktop.
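If you prefer to script the cleanup rather than edit by hand, here is a minimal sketch. The list of rules to strip is an assumption based on the old WordPress defaults; adjust it to whatever your own file actually blocks:

```python
# Disallow rules that commonly block CSS & JS on WordPress sites.
# Adjust this set to match the rules you found in your own file.
BLOCKING_RULES = {"Disallow: /wp-includes/"}

def strip_blocking_rules(robots_text: str) -> str:
    """Return robots_text with the CSS/JS-blocking Disallow lines removed."""
    kept = [line for line in robots_text.splitlines()
            if line.strip() not in BLOCKING_RULES]
    return "\n".join(kept) + "\n"

original = """\
User-agent: *
Disallow: /wp-admin/
Disallow: /wp-includes/
"""
print(strip_blocking_rules(original))
# User-agent: *
# Disallow: /wp-admin/
```

Note that the function only removes the rules you name; anything else in the file, such as the Disallow: /wp-admin/ line, is kept as-is.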
Step 8: Transfer the Saved Changes Back to the Website.
Once you have removed the disallow lines and saved the file to your desktop, upload it back to the same folder in the FTP program. It will ask if you wish to overwrite the existing file. Click Yes.
Step 9: Double Check.
Once you upload the changes, go to your website's robots.txt URL (www.mywebsite.com/robots.txt) and make sure the changes went through.
This should fix the problem. One more thing: if you are using WordPress, you can sometimes make these edits with the WP Robots.txt plugin. Depending on how your robots.txt file is set up, this will not always work.
Leave a comment below and let me know if it worked. If it did not work, feel free to contact me for assistance.
Since WP is vulnerable, wp-includes needs to be blocked in robots.txt for several legitimate reasons. Hence, I recommend keeping the icons, CSS, JS, and other files the website needs in order to load in a different folder, and leaving that folder unblocked in robots.txt.
Alex Miranda says
Thank you for your input. Keeping wp-includes blocked was the old way; Google now wants to see those files. This is one of the reasons why so many WordPress owners are receiving these emails. As a matter of fact, in the latest version of WordPress, Disallow: /wp-includes has been removed from the default robots.txt.