16 September 2008

Three Rules That ASP.NET Developers Should Know About SEO


Search engine optimisation (SEO) is an evolving ‘science’ and it keeps changing on purpose. Most articles I read that involve both SEO and ASP.NET focus on how to programmatically set the meta keywords tag, and they tend to make it look very important when, as of today, it has minimal effect on optimisation.

Generally, web developers tend to turn a blind eye when it comes to SEO, while a great part of SEO should be done by developers. Here are three rules for .NET developers to follow when building a site:

1 – Favour XHTML with DIV and CSS Design Over Tabular Design

Thankfully, the era of developing table-based websites is about to end. Today most sites follow the DIV-and-CSS style of design and are table free, except for tabular data. However, some designers and developers have not yet made the leap. The DIV-with-CSS design will:

  1. Generate less code, which will improve your ‘code to content’ ratio, which is favoured by search engines.
  2. Improve your page load time, as your pages tend to be smaller with fewer tags, and smaller still with the CSS moved to a separate stylesheet. Load time is another factor in SEO.
  3. Promote better web semantics by marking up your content with the correct XHTML elements. This will enable the spider (the search engine robot which crawls your pages’ content) to better understand the structure of your site and your page content.

Always opt for the XHTML with DIV and CSS type of design, and think about rewriting your current table-based site in a table-free format.
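To make the contrast concrete, here is a tiny illustrative example (the markup is my own, not from a real site) of the same content in both styles:

```html
<!-- Table-based layout: extra markup that carries no semantic meaning -->
<table>
  <tr><td><b>Latest News</b></td></tr>
  <tr><td>We have moved to a new office.</td></tr>
</table>

<!-- DIV and CSS layout: less code, and the heading is semantically marked up -->
<div class="news">
  <h2>Latest News</h2>
  <p>We have moved to a new office.</p>
</div>
```

The second version is shorter, and the `<h2>` tells the spider that “Latest News” is a heading rather than just bold text.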

2 – Allow For Meta Description and Meta Keywords Tags

The meta description tag, from an SEO point of view, suggests to the search engine what to display in the search results page. The meta keywords tag is mostly ignored by search engines, as spammers gave it a bad reputation; today it is only used to add misspellings and different culture spellings, e.g. ‘optimisation’ and ‘optimization’.

ASP.NET does not provide an out-of-the-box solution for adding these tags, so you have two options:

  1. Add a ContentPlaceHolder in the head of your MasterPage and fill it with the appropriate meta keywords and meta description tags from within the page. In the MasterPage:
    <head runat="server">
        <!-- other xhtml code... -->
        <asp:ContentPlaceHolder id="head" runat="server">
        </asp:ContentPlaceHolder>
        <!-- other xhtml code... -->
    </head>

    In the ASPX Page:

    <asp:Content ID="Content1" ContentPlaceHolderID="head" Runat="Server">
        <meta name="keywords" content="ASPNET, optimisation, optimization" />
        <meta name="description" content="Three SEO rules for ASP.NET developers that will improve your site optimisation" />
    </asp:Content>
  2. Use a base page for your ASP.NET pages (also known as the Page Controller enterprise design pattern) in which you create a definition for generating the meta tags. The article Using Meta Tags with Master Pages in ASP.NET is well written, and the code is in both C# and VB.NET.

Make adding a Meta Description part of your development procedure for site pages where relevant. You could also add a Meta Keywords tag where relevant.
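As an illustration of the base-page approach in option 2 (the property names here are my own, not taken from the linked article), something along these lines works in ASP.NET 2.0 and later:

```csharp
using System;
using System.Web.UI;
using System.Web.UI.HtmlControls;

// Sketch of a Page Controller base page: every ASPX page inherits from it
// and sets the two properties; the meta tags are injected at PreRender.
// Requires <head runat="server"> so that Page.Header is available.
public class BasePage : Page
{
    public string MetaDescription { get; set; }
    public string MetaKeywords { get; set; }

    protected override void OnPreRender(EventArgs e)
    {
        base.OnPreRender(e);

        if (!string.IsNullOrEmpty(MetaDescription))
            Header.Controls.Add(
                new HtmlMeta { Name = "description", Content = MetaDescription });

        if (!string.IsNullOrEmpty(MetaKeywords))
            Header.Controls.Add(
                new HtmlMeta { Name = "keywords", Content = MetaKeywords });
    }
}
```

A page would then inherit from `BasePage` and set `MetaDescription = "..."` in its `Page_Load`.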

3 – Do Permanent 301 HTTP Redirects to Mark Moved Pages

Usually, when developers change the structure of a website, say by upgrading a site from ASP to ASP.NET, they tend to forget that the old pages have accumulated SEO ranking over time, and it would be unwise to throw that away!

To point the search engine from the old page location to the new page location, you will need to issue a 301 HTTP redirect to the new URL when somebody, including the spider, requests your old page.

A 301 redirect in ASP.NET can be done with the following code in the old page’s Init event:

Response.StatusCode = 301;                  // Moved Permanently
Response.RedirectLocation = "new-url.aspx"; // the page's new address
Response.End();                             // stop rendering the old page

However, the old page most probably will not exist, or the old site might be in classic ASP, which is no longer supported on the new site. There is an easy solution for this: try URL Rewriter for ASP.NET. You might also consider implementing your own database-driven redirection using an HttpModule or HttpHandler.
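A minimal sketch of such an HttpModule follows; the hard-coded dictionary stands in for the database lookup, and the URLs are made up for illustration:

```csharp
using System;
using System.Collections.Generic;
using System.Web;

// Sketch: intercept every request, and if the path matches a moved page,
// answer with a 301 pointing at the new URL instead of serving anything.
public class RedirectModule : IHttpModule
{
    // In practice this table would be loaded from a database.
    private static readonly Dictionary<string, string> Redirects =
        new Dictionary<string, string>(StringComparer.OrdinalIgnoreCase)
        {
            { "/old-page.asp", "/new-page.aspx" }
        };

    public void Init(HttpApplication app)
    {
        app.BeginRequest += delegate(object sender, EventArgs e)
        {
            HttpContext context = ((HttpApplication)sender).Context;
            string newUrl;
            if (Redirects.TryGetValue(context.Request.Path, out newUrl))
            {
                context.Response.StatusCode = 301;          // Moved Permanently
                context.Response.RedirectLocation = newUrl;
                context.Response.End();                     // short-circuit the request
            }
        };
    }

    public void Dispose() { }
}
```

The module is then registered in web.config under `<httpModules>` so it runs for every request, including requests for pages that no longer exist on disk.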

Doing 301 HTTP redirects will maintain the ranking that the old URL has accumulated, and will update the URL shown on the search engine results page to the new one.

  • Thomas Hansen September 16th, 2008

    Quite nice actually 🙂
    Though isn’t the meta tip about adding “popular typos” you’re supplying technically “gaming the search engines”…?
    If you look at my page (follow my link) you’ll see that in fact we’re doing stuff like adding the scripts at the bottom, CSS at the top and so on. We’ve even created an HTTP Handler which sets Far Future Expires and so on on CSS, though we haven’t been able to get it to work yet in IIS. And the site is pretty extremely optimized for SEO in mostly all regards. Though unfortunately it’s a new site, and Google actually penalized us a week ago, for what I don’t know, though I suspect it’s because we added a lot of content in the beginning, which made them suspicious, in addition to a couple of “circular links” and so on. But as a “template” for how to create SEO-optimized pages I still think it rules big time 😉
    Especially the thing we’re doing with the scripts (appending them to the bottom, even though they’re resource scripts) is pretty awesome 😉
    This is a feature of the Ra-Ajax library itself 🙂

  • Adam Tibi September 16th, 2008


    Keyword misspellings in the Meta Keywords are fine as long as you are not including unrelated keywords, so why would this be cheating?

    Pushing the JavaScript files to the end will generate an error if the user triggers a JavaScript-related action while the page is still loading; this will happen more often for users on slow connections.

    This approach is no longer a factor in today’s SEO, even though there are some resources online that encourage it, so I recommend returning your scripts to the top and giving your users a better experience.

    Your site is not currently ranking in Google as the registration date is 2008-06-26 and you currently fall into what is called the Google Sandbox, so you need to wait and hope for the best 🙂

  • Ercüment ESER September 17th, 2008

    Fourth rule for SEO: use ASP.NET MVC!

  • Adam Tibi September 17th, 2008


    I totally agree, as with MVC you have full control over the XHTML output, unlike traditional ASP.NET programming.

    However, at the moment, the MVC framework is at Preview 5 and not yet released.

  • Anthony Grace September 20th, 2008

    Hi Adam,

    Very interesting post; I didn’t realize we could set the meta tags declaratively in a master page setup. In my own blog posting:

    I went about it the long way around! However, I was able to use the following to get my JavaScript to the top of the page:

    string myScript = "/js/footerFix.js";
    Page.ClientScript.RegisterClientScriptInclude("myKey", myScript);

    I’m not exactly hot at JavaScript, but I believe that with jQuery we can use the “ready” function to wait until the document and images have loaded before calling a function. What do you recommend as a best practice here? Also, could you expand on the Google Sandbox issue?

    Keep up the good work!
    Anthony 🙂

  • Adam Tibi September 20th, 2008


    Thank you for the comment, I am glad that this helped.

    Pushing the JavaScript to the end of the page to promote the more important content is not an SEO factor; however, I have found a lot of tutorials that promote it.

    However, if you insist on moving the JS to the end, I suggest the following: inherit all your pages from a BasePage and override the Render method. In the override, search for the JavaScript references in the rendered HTML, remove them, then insert them before the closing </body> tag.
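    A rough sketch of that Render override is below; the regex is deliberately simplistic and assumes external scripts render as <script ... src="..."></script> tags:

```csharp
using System;
using System.IO;
using System.Text;
using System.Text.RegularExpressions;
using System.Web.UI;

// Sketch of a base page that captures the rendered HTML, strips external
// <script src="..."> references and re-inserts them just before </body>.
public class ScriptsAtBottomPage : Page
{
    protected override void Render(HtmlTextWriter writer)
    {
        // Render the page into a buffer instead of straight to the response.
        StringWriter buffer = new StringWriter();
        base.Render(new HtmlTextWriter(buffer));
        string html = buffer.ToString();

        // Collect the external script tags, then remove them in place.
        Regex scriptTag = new Regex(@"<script[^>]*src=[^>]*>\s*</script>",
                                    RegexOptions.IgnoreCase);
        StringBuilder scripts = new StringBuilder();
        foreach (Match m in scriptTag.Matches(html))
            scripts.Append(m.Value);
        html = scriptTag.Replace(html, string.Empty);

        // Re-insert the collected scripts just before the closing body tag.
        html = html.Replace("</body>", scripts.ToString() + "</body>");
        writer.Write(html);
    }
}
```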

    Regarding the Google Sandbox, I have just posted to answer this.


  • September 20th, 2008

    Thanks, great article.


  • Raymond October 5th, 2008

    Thanks for the info, it really helps.
    Just a thought: when we do the suggested 301 HTTP redirect, would search engines penalize us for duplicate content?

  • John Enderson December 24th, 2008


    Thanks for your information.
    I have one query about CSS. I think external CSS is more useful for an SEO-friendly website.

    But is it OK if a website uses internal CSS? I still want the same result as with external CSS.

    There are many factors in web development, and I want to know the best method for this.
