Google Webmaster Tools

Log in to your account and click on your site. Under ‘site configuration’ click ‘sitemaps’, and in the text box, enter the full address of your sitemap.

Add and verify a site to Google Webmaster Tools

http://support.google.com/webmasters/bin/answer.py?hl=en&answer=34592


Robots.txt


Another must-have for every site is a robots.txt file. This should sit in the same place as your sitemap.xml file: the root directory of your site. The address of this file should look like the example below:
http://www.yoursite.com/robots.txt
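Because crawlers always request robots.txt from the root of the host, its address can be derived from any page on the site. A quick sketch using Python’s standard library (the page URL is just a placeholder):

```python
from urllib.parse import urljoin

# Crawlers request robots.txt from the root of the host,
# regardless of which page they arrived at first.
page = "http://www.yoursite.com/products/widgets.html"  # placeholder URL
robots_url = urljoin(page, "/robots.txt")
print(robots_url)  # http://www.yoursite.com/robots.txt
```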
The robots.txt file is a simple file that exists so you can tell search engines which areas of your site you don’t want listed in the search results.
There is no ranking boost from simply having a robots.txt file on your site. What is essential is checking that you don’t have one blocking areas of your site you want search engines to find.
The robots.txt file is just a plain text document; its contents should look something like the example below:
# robots.txt good example
User-agent: *
Disallow: /admin
Disallow: /logs
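Rules like these can be sanity-checked with Python’s standard urllib.robotparser module; a short sketch, where the test paths are assumptions for illustration:

```python
from urllib.robotparser import RobotFileParser

# The rules from the good example above.
rules = [
    "User-agent: *",
    "Disallow: /admin",
    "Disallow: /logs",
]

parser = RobotFileParser()
parser.parse(rules)

# /admin and /logs are blocked; everything else stays crawlable.
print(parser.can_fetch("*", "/admin/users"))  # False
print(parser.can_fetch("*", "/products"))     # True
```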
If you want to tell search engines not to crawl your site at all, the file should look like the next example. If you do not want your entire site blocked, you must make sure your file does not look like this; it is always a good idea to double-check, just to be safe.
# robots.txt – blocking the entire site
User-agent: *
Disallow: /
The single forward slash in this example matches every path on the site, so it tells search engines their software should not visit anything, starting from the root directory.
To create your robots.txt file, simply create a plain text document with Notepad on Windows, or TextEdit on Mac OS. Make sure the file is saved as plain text, and use the ‘robots.txt good example’ as a guide for how it should look. Take care to list any directories you do not want search engines to visit, such as internal folders for staff, admin areas, CMS back-end areas, and so on.
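The file can also be generated programmatically rather than typed by hand; a minimal Python sketch, where the blocked directories are taken from the good example and the output filename assumes you run it from your site’s root folder:

```python
# Directories you do not want search engines to visit (example values).
blocked_dirs = ["/admin", "/logs"]

lines = ["# robots.txt good example", "User-agent: *"]
lines += [f"Disallow: {d}" for d in blocked_dirs]

# Write the file as plain ASCII text in the current directory.
with open("robots.txt", "w", encoding="ascii") as f:
    f.write("\n".join(lines) + "\n")

print(open("robots.txt").read())
```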
If there aren’t any areas you would like to block, you can skip the robots.txt file altogether; just double-check you don’t have one blocking important areas of the site, as in the blocking example above.