Create a file in Notepad (or any plain-text editor) and name it robots.txt. The file should contain the following:
User-agent: *
Disallow: /joom
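Before uploading anything, you can sanity-check what those two lines actually block using Python's standard urllib.robotparser. This is just a local check of the rules above; www.yoursite.com stands in for your own domain:

```python
from urllib.robotparser import RobotFileParser

# The robots.txt content from above
rules = """\
User-agent: *
Disallow: /joom
"""

parser = RobotFileParser()
parser.parse(rules.splitlines())

# Any URL whose path starts with /joom is disallowed for every crawler;
# everything else remains fetchable.
print(parser.can_fetch("*", "http://www.yoursite.com/joom/page.html"))  # False
print(parser.can_fetch("*", "http://www.yoursite.com/index.html"))      # True
```

Note that Disallow matches by path prefix, so /joom also blocks /joomla and anything else beginning with that string.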
Upload it to your site's root directory.
To confirm it is in place, open this URL in your browser:
www.yoursite.com/robots.txt
Search engines will pick up the new rules the next time they crawl your site.