Technically, it's correct: you don't need a robots.txt for good SEO.
However, it's not that simple. Part of the problem I was having on my sites was that the search engine bots weren't just crawling my site, they were freaking pounding it into a fine powder. Oh sure, if you've got a big enough server, you can afford to let them run all over you, but I'm doing this crap on a budget.
If you're on a budget hosting service (in other words, the cheapest hosting you feel you can get away with), you have to make sure you're not throwing away bandwidth or CPU cycles.
Look at your access logs. Are you getting hammered every couple of seconds by Googlebot? Or the Yahoo bot? Or any bot for that matter?
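If you want a quick tally instead of eyeballing the raw log, something like the snippet below works against an Apache/nginx combined-format access log. The log path here is a stand-in (I'm building a tiny sample file so you can see the pipeline work); point the last line at your real log, wherever your host keeps it.

```shell
# Build a small sample log in combined format -- swap in your real access log.
cat > /tmp/access.log.sample <<'EOF'
66.249.66.1 - - [10/Oct/2008:13:55:36 -0700] "GET / HTTP/1.1" 200 2326 "-" "Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)"
66.249.66.1 - - [10/Oct/2008:13:55:38 -0700] "GET /feed HTTP/1.1" 200 912 "-" "Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)"
72.30.65.1 - - [10/Oct/2008:13:55:39 -0700] "GET / HTTP/1.1" 200 2326 "-" "Mozilla/5.0 (compatible; Yahoo! Slurp; http://help.yahoo.com/help/us/ysearch/slurp)"
EOF

# Count hits per bot by matching known crawler names in the user-agent field.
grep -oE 'Googlebot|Slurp|bingbot' /tmp/access.log.sample | sort | uniq -c | sort -rn
```

If one bot's count dwarfs everything else, or the timestamps show it hitting you every second or two, that's your bandwidth hog.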
If you have a robots.txt, are the bots reading it and heeding it?
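For what it's worth, here's the kind of minimal robots.txt I'm talking about. The disallowed paths are just examples for a WordPress-ish site; adjust for your own setup. One caveat: Crawl-delay is honored by some bots (Yahoo's Slurp and Bing's crawler, historically) but Googlebot ignores it, so for Google you'd throttle crawl rate through their webmaster tools instead.

```
# Example robots.txt -- paths below are placeholders for your own site
User-agent: *
# Seconds between requests; respected by some bots, not Googlebot
Crawl-delay: 10
Disallow: /wp-admin/
Disallow: /trackback/
```

And then actually watch your logs to see whether each bot fetched /robots.txt and whether its behavior changed afterward.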
It's one thing if you've got flat HTML pages for your site, but even with wp-cache running, WordPress can bog down if a bot is allowed to run rampant. And if your site is slow or can't be crawled properly because the bots have swamped it, then yeah, that can affect your SEO.
Now you know. And knowing is half the battle.