Last year I created a page that was a duplicate of my existing page
www.vip-wedding-hawaii.com
but without the price info, for a hotel that set its own prices. Since then my page has dropped in ranking from page one to page five, and I assume the duplicate content has something to do with it. Just to be safe, I want the duplicate page not to be crawled at all and entirely blocked from search engines. The hotel gives my link out to its guests, and that is all I need.
Do I modify the robots.txt file? If so, what should the code be? Or do I do this in Google Webmaster Tools, and if so, how?
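For example, I'm guessing the robots.txt entry would look something like the lines below, where /hotel-duplicate-page is just a placeholder for the actual path of my duplicate page (please correct me if the format is wrong):

# Block all crawlers from the duplicate hotel page
# (/hotel-duplicate-page is a placeholder for my real page path)
User-agent: *
Disallow: /hotel-duplicate-page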
Thanks. One more thing: I also noticed that under Global Configuration / Robots I have the option to change the setting to No Index, No Follow. Is there a benefit to using one method versus the other?
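As I understand it, that setting would add a robots meta tag to the page's HTML head, something like the line below, though I'm not sure that's exactly how it gets rendered:

<meta name="robots" content="noindex, nofollow">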