You can create and manage your robots.txt file via the Pages application. This makes everything editable in the AdminCP, which is quick and easy.
1. Create a new pages page.
2. Select the manual HTML method.
3. Name it robots.txt (exactly as written, all lowercase).
4. Select the Content tab and add your directives, such as a reduced crawl rate.
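As an example of the content for that step, a minimal robots.txt body that asks compliant crawlers to slow down (the 10-second delay is just an illustrative value; note that Bing and Yandex honor Crawl-delay, while Googlebot ignores it):

```
User-agent: *
Crawl-delay: 10
```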
Then, using the SQL Toolbox, you can view the sessions table to see real active sessions and block any bots as needed.
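A query along these lines can be pasted into the SQL Toolbox to inspect current sessions. This is a sketch: `core_sessions` is the default sessions table in Invision Community 4, but your table prefix and schema may differ, so check your own database first.

```sql
-- List current sessions so bot user agents can be spotted
SELECT * FROM core_sessions;
```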
To block a bot that honors the robots file, add a disallow rule for its user agent. Yandex is a good example: it respects robots.txt and is often an issue.
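For instance, appending the following to the robots.txt content blocks Yandex from the entire site (the `Yandex` token matches all Yandex crawlers):

```
User-agent: Yandex
Disallow: /
```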