May 23, 2008 05:23 AM | gtk
I'm developing a site that contains content in three languages (en, de, gr). I used LinkButton controls for the flags and resource files for the content, so all users always see the same URL (I mean every user sees www.example.com/test.aspx, not www.example.com/de/test.aspx or www.example.com/en/test.aspx).
I'm wondering how Google's robots will index my site once it is published. In fact, there are no links for the two extra languages (de, gr), just some flag buttons that trigger a click event and store the requested language as the new culture in a session variable. How can the content in these two languages be crawled and parsed by the search engine robots?
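Here is a minimal sketch of what I mean (the control name lnkGerman and the session key "Culture" are just illustrative, not my exact code):

using System;
using System.Globalization;
using System.Threading;

public partial class Test : System.Web.UI.Page
{
    // Flag button click handler: store the requested culture in session,
    // then reload so the new culture takes effect on the next request.
    protected void lnkGerman_Click(object sender, EventArgs e)
    {
        Session["Culture"] = "de-DE";
        Response.Redirect(Request.RawUrl);
    }

    // Runs early in the page life cycle; applies the stored culture so the
    // matching resource files are used when the page renders.
    protected override void InitializeCulture()
    {
        string culture = Session["Culture"] as string;
        if (!string.IsNullOrEmpty(culture))
        {
            Thread.CurrentThread.CurrentUICulture = new CultureInfo(culture);
            Thread.CurrentThread.CurrentCulture =
                CultureInfo.CreateSpecificCulture(culture);
        }
        base.InitializeCulture();
    }
}

So the language switch only ever happens through a postback and a session variable; there is no separate URL per language.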
May 23, 2008 11:24 AM | Denis Chiochiu
Did you try to crawl your site and it didn't work?
I think it will work, because the crawlers access your compiled (rendered) site, so if there is a path to the new language, it will be crawled.
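For example (just a sketch; the lang parameter name and the culture mappings are only an illustration, not anything from your code), if the flags were plain hyperlinks like test.aspx?lang=de and test.aspx?lang=gr, each language would have its own URL for the crawler to follow:

using System.Globalization;
using System.Threading;

public partial class Test : System.Web.UI.Page
{
    // Read the language from the query string instead of the session, so
    // every language version of the page has a distinct, crawlable URL.
    protected override void InitializeCulture()
    {
        string lang = Request.QueryString["lang"];
        string culture = (lang == "de") ? "de-DE"
                       : (lang == "gr") ? "el-GR"
                       : "en-US"; // default to English
        Thread.CurrentThread.CurrentUICulture = new CultureInfo(culture);
        Thread.CurrentThread.CurrentCulture =
            CultureInfo.CreateSpecificCulture(culture);
        base.InitializeCulture();
    }
}

Plain links like these give the robot a path it can follow without having to post the form back.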
May 23, 2008 12:44 PM | gtk
Thanks for your answer.
Actually, I have not published the site yet; I'm just wondering about the Google indexing process.
What do you mean by the compiled site? Generally, I know that crawlers follow a site's links and spread across its pages. I would like the crawler to index the content in all of the site's languages, not just the default (English). And with the new ASP.NET 2.0 localization functionality, there are no visible subfolders or links (for example, there is no www.example.com/en/test.aspx or www.example.com/de/test.aspx).
Does anyone know what really happens? Will I face problems with this? I'm thinking of changing the structure of the site, dropping localization and resources, and going back to the old standard way, with subfolders and multiple copies of the web pages [8-)]