robots.txt is a great example of people misunderstanding and misusing a tool. The file was designed to help crawlers, by pointing them toward the content most worth indexing and steering them away from pages that would waste crawl resources.
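A minimal sketch of that cooperative use (the paths and sitemap URL here are hypothetical placeholders):

```text
User-agent: *
# Steer crawlers away from pages that waste crawl budget:
Disallow: /search/
Disallow: /cart/

# Point crawlers at the content worth indexing:
Sitemap: https://example.com/sitemap.xml
```

Note that every directive here is a hint to a cooperating crawler, not a form of access control.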
The people trying to use it to block or limit bots are uninformed and/or misinformed: the file is purely advisory, and a crawler that chooses not to comply can simply ignore it.