Last post Oct 03, 2011 05:44 PM by hiserve
Sep 07, 2011 10:18 AM|hiserve|LINK
We have an asp.net site written in VS and deployed as compiled code. It uses a master page and has dynamic content, but it is so secure it seems invisible to search engines and bots. How can we improve the SEO?
Sep 07, 2011 08:04 PM|IgorB|LINK
Open any of your pages in the browser and then choose "View Source". If you can see the complete HTML with your text content, then your page is visible to any search engine!
Do you have secure login?
How old is your web site?
Sep 08, 2011 08:28 AM|hiserve|LINK
Hi. When I load a page in the browser I can view the source, but obviously that page doesn't exist until it is loaded. Do search engines actually load the pages? In the directory there are only placeholder .aspx pages, so how do bots find the info?
The secure login is only for part of the site.
The site was first uploaded a couple of years ago and has been updated a number of times since then. The content is changed on a daily basis.
Thanks for any help.
Sep 08, 2011 10:24 PM|IgorB|LINK
Search engines have no credentials or passwords to browse the files on your server; they can only fetch pages that are reachable on the internet. A search engine doesn't know or care how a page is constructed, or whether it is dynamic or static: it treats every URL as a static page, a snapshot of the HTML the server returned for that request. Each page has its own URL. When you click VIEW SOURCE, you see what the Google bot sees.
Sep 09, 2011 09:38 AM|hiserve|LINK
Most pages don't show anything until you click on something. For example, there is a news.aspx page which is just a placeholder and has no content until someone selects a section, article, etc. So although there are hundreds of possible 'pages', none of them exists without user interaction; surely this would make the site appear to a bot as if it hadn't been updated for ages.
Sep 09, 2011 09:39 AM|LeroyGerrits|LINK
In addition to what IgorB says, be careful with conditional content/pages which only show up if certain requirements are met. Last week I fixed a website and optimized it for search engines because it wasn't getting indexed at all. The cause was the start page of the website, which was a simple page with no markup, just a little code in the code-behind that read the Request.UserLanguages object and redirected the visitor to the next page, depending on the visitor's language, using Response.Redirect().
The problem is that most search engine crawlers don't send an Accept-Language header, so Request.UserLanguages was null and the code threw an exception before the Response.Redirect() was ever reached; the bot never saw any content at all.
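A crawler-safe version of that kind of redirect just needs a null check before touching Request.UserLanguages. A minimal code-behind sketch (the English fallback and the Default.aspx target name are my own illustrative choices, not from the original site):

```csharp
// Code-behind of the start page (sketch; page and folder names are hypothetical).
// Crawlers often send no Accept-Language header, so Request.UserLanguages
// can be null -- guard against that before reading it.
protected void Page_Load(object sender, EventArgs e)
{
    string lang = "en"; // crawler-friendly fallback language
    string[] userLanguages = Request.UserLanguages;
    if (userLanguages != null && userLanguages.Length > 0)
    {
        // Entries look like "en-GB;q=0.8" -- strip the quality value
        // and the region to get the bare language code.
        lang = userLanguages[0].Split(';')[0].Split('-')[0];
    }
    Response.Redirect("~/" + lang + "/Default.aspx");
}
```

Better still, serve the fallback-language content directly on the start page instead of redirecting, so a bot that lands there has something to index.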
Take a look at Google's Webmaster Tools: its "Fetch as Googlebot" feature lets you see a given page exactly as Googlebot does, and it is a great SEO tool overall.
Sep 09, 2011 11:06 AM|IgorB|LINK
When someone clicks on something - does it change the URL? Post the link to your website, so I can see how it works...
Sep 09, 2011 11:31 AM|hiserve|LINK
www.signlink.co.uk. Thanks for any advice.
Sep 27, 2011 01:32 AM|IgorB|LINK
One thing I noticed is that your page URLs look like this: http://www.signlink.co.uk/Sections.aspx?i=13 . This is not very SEO friendly: search engines treat query-string URLs with suspicion, some index them poorly, and people are hesitant to click them. I would recommend using URL Rewrite so your URLs look meaningful and understandable, like http://www.signlink.co.uk/Applications. This by itself should improve your SEO tremendously.
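If the site runs on IIS 7 with the URL Rewrite module installed, a rule like the following sketch in web.config would map the friendly URL back to the real page (the section id 13 for /Applications is taken from the example above; the rule name is my own):

```xml
<!-- web.config sketch; requires the IIS URL Rewrite module -->
<system.webServer>
  <rewrite>
    <rules>
      <!-- Serve /Sections.aspx?i=13 when /Applications is requested,
           without changing the URL the visitor (or bot) sees. -->
      <rule name="FriendlySections" stopProcessing="true">
        <match url="^Applications$" />
        <action type="Rewrite" url="Sections.aspx?i=13" />
      </rule>
    </rules>
  </rewrite>
</system.webServer>
```

You would add one rule (or one parameterized rule with a capture group) per section, and then link to the friendly URLs from your own pages so bots discover those instead of the query-string versions.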
The second big problem I noticed is that your page meta description is exactly the same on every page. This doesn't help any search engine, especially ones like Yahoo that show the meta description in search results, and it is also a sign of low quality. Even if your pages are dynamically generated, there is a way to set the meta description per page. It is very easy in ASP.NET 4.0, but possible in earlier versions as well.
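For pre-4.0 versions you can add the tag from code-behind. A sketch (GetArticleSummary is a hypothetical helper standing in for however the page builds its content):

```csharp
// Code-behind sketch for ASP.NET 2.0/3.5: emit a per-page meta description.
// Requires <head runat="server"> in the master page so Page.Header is available,
// and a using directive for System.Web.UI.HtmlControls.
protected void Page_Load(object sender, EventArgs e)
{
    HtmlMeta description = new HtmlMeta();
    description.Name = "description";
    description.Content = GetArticleSummary(); // unique text per page
    Page.Header.Controls.Add(description);
}
// In ASP.NET 4.0 this becomes a one-liner: Page.MetaDescription = "...";
```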
These are the first things to take care of...
Oct 03, 2011 05:44 PM|hiserve|LINK
Thanks for your suggestions - I'll have to look into it. We're using ASP.NET 3.5.
I'll let you know how it goes!