After not having checked my AdSense Site Diagnostics in some time, a week ago I had a look at them and noticed a list of blocked URLs, all beginning with "http://webcache.googleusercontent.com/search?q=cache:" with the "Reason Blocked" being "Robots.txt File". I then did some research and added googleusercontent.com to my robots.txt file's list of allowed spidering sites.
A few days later I checked my Site Diagnostics and saw more of the same thing, with the "Reason Blocked" still being "Robots.txt File", so I just deleted the robots.txt file. For some reason, that still hasn't stopped this from happening. Several days after deleting the robots.txt file, I'm still seeing new blocked URLs beginning with "http://webcache.googleusercontent.com/search?q=cache:" with the "Reason Blocked" being "Robots.txt File".
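For anyone wanting to check how their own robots.txt is actually being interpreted: a key point is that robots.txt rules only govern crawling of the host that serves the file, so nothing you put in your robots.txt can affect URLs on webcache.googleusercontent.com. Here's a minimal sketch using Python's standard `urllib.robotparser`, with a hypothetical rule set that explicitly allows the AdSense crawler (Mediapartners-Google) while blocking other bots from one directory:

```python
from urllib.robotparser import RobotFileParser

# Hypothetical robots.txt content for your own site: the AdSense
# crawler is allowed everywhere (an empty Disallow means "allow all"),
# while all other crawlers are kept out of /private/.
rules = """\
User-agent: Mediapartners-Google
Disallow:

User-agent: *
Disallow: /private/
"""

rp = RobotFileParser()
rp.parse(rules.splitlines())

# These checks apply only to paths on the host this robots.txt is
# served from -- they say nothing about Google's own cache domain.
print(rp.can_fetch("Mediapartners-Google", "/private/page.html"))  # True
print(rp.can_fetch("SomeOtherBot", "/private/page.html"))          # False
```

This is just a way to sanity-check your rules locally; in the situation described above, the "Robots.txt File" block is reported against webcache.googleusercontent.com, whose robots.txt is under Google's control, not yours.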
Thanks for the reply. Fortunately, doing nothing is something I'm pretty good at.
By the way, I forgot to mention that in my initial attempt to figure this out, I called Google's main phone number. I never did get to talk to a live human, and gave up after waiting on hold for 55 minutes.
I believe doing nothing is appropriate in this case. I vaguely remember reading something about this issue, which I've also noticed in the AdSense control panel for my site. I think the block is at Google's end, not ours...