Submitting Updated URLs with the New Fetch as Googlebot Feature

August 12, 2011 by Dustin

Google’s Webmaster Tools now provides users with a new way to submit new or updated URLs for Google indexing. The Fetch as Googlebot feature allows users to identify individual URLs and submit them to Google’s crawl schedule for index consideration. Unlike the other methods of URL submission, Googlebot usually crawls the submitted pages within one day of submission. Although the response time is faster, Google still uses the same selection process when considering a URL’s inclusion in its index. Fetch as Googlebot helps Google discover a web page faster, but as with the natural discovery process, there is no guarantee that the page will be included in the index.

[Screenshot: the Fetch as Googlebot submission interface in Google Webmaster Tools, from http://googlewebmastercentral.blogspot.com/2011/08/submit-urls-to-google-with-fetch-as.html, August 2011]

In order for Google to crawl and index a page, it first needs to know that the page exists. The Fetch as Googlebot feature may be a new and convenient method of submitting a URL to Google, but it’s not the only option. Discovery can take place in a couple of different ways. Traditionally, discovery occurs through links, which is why it is extremely important to make sure that all your web pages are linked internally. Pages don’t always have external links pointing to them, making internal links necessary for bots to travel from page to page, as in the sketch below.
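To illustrate, here is a minimal sketch of internal linking; the page names are hypothetical. A site-wide navigation menu like this gives a crawler a path to every major page, even when no external site links to those pages directly:

    <!-- Hypothetical site-wide navigation menu; every major page is linked internally -->
    <ul>
      <li><a href="/index.html">Home</a></li>
      <li><a href="/services.html">Services</a></li>
      <li><a href="/blog/">Blog</a></li>
      <li><a href="/contact.html">Contact</a></li>
    </ul>

A bot that lands on any one of these pages can follow the menu links to reach the rest of the site.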

Google also uses RSS feeds, XML Sitemaps, and the Public URL Request to discover URLs that are not yet indexed. An XML Sitemap is a complete list of the URLs that you want Google to crawl for your website (see the example below). The Public URL Request is an “Add URL” form available to anyone who wants to request that a URL be added to the index. Recently, the Add URL form was renamed the Crawl URL Request Form, and to use it you must be signed in to a Google account. Like Fetch as Googlebot, the Crawl URL Request Form can only be used to submit up to 50 URLs a week.
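For reference, a minimal XML Sitemap following the sitemaps.org protocol looks like this; the URLs and dates below are placeholders:

    <?xml version="1.0" encoding="UTF-8"?>
    <urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
      <url>
        <loc>http://www.example.com/</loc>
        <lastmod>2011-08-12</lastmod>
        <changefreq>weekly</changefreq>
      </url>
      <url>
        <loc>http://www.example.com/new-page.html</loc>
        <lastmod>2011-08-10</lastmod>
      </url>
    </urlset>

Only the <loc> tag is required for each URL; <lastmod> and <changefreq> are optional hints. The finished file is typically saved as sitemap.xml in the site root and submitted through Webmaster Tools.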

It’s recommended that you submit your URLs using an XML Sitemap (especially for images and video) and maintain a solid internal and external link structure to encourage bot crawling. However, Fetch as Googlebot can come in handy when launching a site for the first time or adding new pages to an existing site. In addition to getting new pages discovered, the Fetch as Googlebot feature can quickly get fixed, updated, or refreshed pages recrawled. With this method, a website owner can ask Google to find their pages and skip the natural discovery process. It’s also suggested that site owners use Fetch as Googlebot rather than the Public URL Request, because more priority is given to the actual website owner.

While Fetch as Googlebot was officially launched last week, it has actually been around for quite some time. The feature was previously part of Google Labs, where products early in their development cycle are tested. With the Fetch as Googlebot option, users are able to see a page exactly as Google sees it, and once fetched, the URL can be submitted to Google’s crawl schedule. In addition to submitting individual URLs, you can also submit a URL and all of its linked pages up to 10 times per month. The page keeps track of your submissions and displays how many individual URLs and URLs with linked pages you can still submit.

How to submit a URL Using Fetch as Googlebot:

  • Use Diagnostics > Fetch as Googlebot to fetch the URL you want to submit to Google. If the URL is successfully fetched, you’ll see a new “Submit to index” link appear next to the fetched URL.
  • Once you click “Submit to index” you’ll see a dialog box that allows you to choose whether you want to submit only the one URL, or that URL and all its linked pages.

 

The Fetch as Googlebot feature provides users with an alternative method of submitting URLs, even though most of the time it may not be necessary, given Googlebot’s ability to crawl the web effectively. The method can certainly come in handy when adding new pages or making major updates without submitting a new XML Sitemap.

 

Thanks for reading,

Dustin

Internet Beacon

 





 