Last post Apr 26, 2012 03:31 AM by BU XI - MSFT
Apr 24, 2012 12:39 AM
I'd like to use WatiN to get HTML snapshots so that my single-page AJAX app is crawlable, per Google's AJAX-crawling specification.
Can somebody please show me a code example of how to return an HTML snapshot to Googlebot using WatiN?
This is what I have so far, but honestly I'm not even sure where I'm going with it; I haven't found a single example on the web that really clears this up:
public PartialViewResult Solutions()
{
    // Googlebot rewrites #! URLs to ?_escaped_fragment_=..., so its requests carry that parameter
    if (Request.QueryString["_escaped_fragment_"] != null)
        return PartialView("Solutions", (HtmlString)GetHtmlSnapshot(Request.Url.PathAndQuery));
    // Normal user request
    return PartialView("Solutions", null);
}

public PartialViewResult MarketData()
{
    return PartialView("MarketData", null);
}

public IHtmlString GetHtmlSnapshot(string uglyUrl)
{
    // Map the "ugly" crawler URL back to the pretty #! URL (note the trailing underscore in _escaped_fragment_)
    string prettyUrl = uglyUrl.Replace("?_escaped_fragment_=", "#!");
    string decodedUrl = HttpUtility.UrlDecode(prettyUrl);
    using (var firefox = new FireFox(decodedUrl)) // WatiN loads the page and runs its JavaScript
        return new HtmlString(firefox.Html);
}
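For context, the mapping that the Replace call is trying to undo can be sketched on its own. This is a minimal standalone example (the EscapedFragmentDemo class and the example URLs are hypothetical, not part of the thread's code), assuming Google's scheme of fetching #! URLs as ?_escaped_fragment_= URLs:

```csharp
// A user-facing URL like   http://example.com/#!market/data
// is fetched by Googlebot as   http://example.com/?_escaped_fragment_=market%2Fdata,
// and the server must reverse that mapping before rendering the snapshot.
using System;
using System.Web; // HttpUtility lives in the System.Web assembly

public static class EscapedFragmentDemo
{
    public static string ToPrettyUrl(string uglyUrl)
    {
        // Put the hash-bang back where the escaped-fragment marker is...
        string pretty = uglyUrl.Replace("?_escaped_fragment_=", "#!");
        // ...then undo the percent-encoding Googlebot applied to the fragment
        return HttpUtility.UrlDecode(pretty);
    }

    public static void Main()
    {
        Console.WriteLine(ToPrettyUrl("http://example.com/?_escaped_fragment_=market%2Fdata"));
        // prints http://example.com/#!market/data
    }
}
```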
Apr 26, 2012 03:31 AM
Whether AJAX-loaded content gets indexed is ultimately up to the search engine.
Normally, if you have content that is retrieved by AJAX, you also provide a normal page with the same content. That page can be reached by a plain GET request and is linked from somewhere on your site, so the next time the search engine crawls, it follows that link and indexes the page.
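A minimal sketch of that setup (the URLs and paths here are hypothetical): the site exposes both the hash-bang link the AJAX app uses and a plain link that a crawler doing normal GET requests can follow:

```html
<!-- AJAX navigation for users; crawlers that support Google's scheme
     rewrite #!solutions to ?_escaped_fragment_=solutions -->
<a href="#!solutions">Solutions</a>

<!-- Plain fallback link (hypothetical path) reachable by a normal GET,
     so search engines that ignore AJAX can still index the content -->
<a href="/Home/Solutions">Solutions</a>
```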