Search Engine World recently crawled 75,000 robots.txt files. (A robots.txt file contains instructions for the search engine crawlers that index your site. You can use it to prevent search engines from indexing certain directories, to block specific search engines entirely, and so on.) They report their findings on the most common errors in those files.
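As a quick illustration of the two uses mentioned above, here is a minimal robots.txt sketch; the directory name and crawler name are hypothetical placeholders, not taken from the survey:

```
# Keep all crawlers out of one directory
User-agent: *
Disallow: /private/

# Block one specific crawler from the entire site
User-agent: BadBot
Disallow: /
```

Each `User-agent` line names a crawler (`*` matches all of them), and the `Disallow` lines beneath it list path prefixes that crawler should not fetch.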
The worst robots.txt error I ever saw was on a site whose owners complained that they never showed up in Google's search results. I took a peek at their robots.txt file, and sure enough, someone had set it to disallow all search engines. Oops! It was probably a leftover from when the site was in development. Have you checked your robots.txt file recently?
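The mistake described above takes only two lines, which is part of why it slips into production so easily:

```
# Blocks ALL crawlers from the ENTIRE site -- reasonable on a
# development server, disastrous if it ships with the live site
User-agent: *
Disallow: /
```

If you use a rule like this to hide a staging site, it belongs on your pre-launch checklist to remove it before going live.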