Pretty general question - probably out of order, but I can't find a reliable answer.
I usually use the Rocket Launcher to install and then build in that temp folder. Is there a reliable way to make sure the site doesn't get indexed while it's still in that temp folder?
Thanks for reading. Sorry if it's a dumb question.
Basically, what you want is a custom robots.txt file. This file tells robots (web crawlers and the like) which parts of a site they may and may not visit.
Anyway, all responsible search engines should respect the robots.txt file, as long as it is properly set up.
If you want a more ironclad guarantee, use .htaccess to deny access from everything except your own IP(s), so that nobody else (including search engines) can reach the site. No access = no indexing.
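For example, a minimal .htaccess sketch along these lines would do it (assuming Apache 2.4, and that the server allows overrides in .htaccess; 203.0.113.42 is only a placeholder for your own IP):

# Apache 2.4 syntax -- replace 203.0.113.42 with your own IP(s)
Require ip 203.0.113.42

# Apache 2.2 equivalent:
# Order deny,allow
# Deny from all
# Allow from 203.0.113.42

Drop that into the development folder and everyone except the allowed IP(s) gets a 403, crawlers included.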
In theory you might think a robots.txt in the folder where you are doing the development work would be enough, but crawlers only look for robots.txt at the root of the site, so since you have access to the root folder, put it there, with something like (the User-agent line applies the rule to every crawler):

User-agent: *
Disallow: /test_site/
Which would make the entire thing off limits in one fell swoop.
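Once the file is in place, you can confirm the server is actually serving it by fetching it directly (www.example.com is just a placeholder for your own domain):

curl http://www.example.com/robots.txt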
Again, you are really at the mercy of the individual 'bot when it comes to robots.txt, so the simplest solution is often the best; if you need certainty, fall back on the .htaccess approach above.