Security note on ASP.NET beta 5

With the release of Visual Studio 2015, Microsoft has included the beta 5 release of ASP.NET 5. Unfortunately, the default project template comes with a deployment and security drawback caused by its default Bower setup.

The project comes with a file named .bowerrc which redirects the default bower_components folder to the wwwroot/lib folder.

That means every file in each Bower package you reference in bower.json will be included in your wwwroot folder during deployment.

If you have a default setup for an ASP.NET Web App on Microsoft Azure, that means any .php file located within any of the Bower components will execute on your server if accessed by anyone through a web browser.
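One way to reduce that risk, if you keep the default layout, is to make sure such files are never served at all. As a sketch, an IIS request-filtering rule in web.config can deny the requests outright (the extension list here is an assumption; extend it for other executable types you find in the packages):

```xml
<configuration>
  <system.webServer>
    <security>
      <requestFiltering>
        <fileExtensions>
          <!-- Deny requests for .php files that Bower packages may have
               dropped into wwwroot/lib -->
          <add fileExtension=".php" allowed="false" />
        </fileExtensions>
      </requestFiltering>
    </security>
  </system.webServer>
</configuration>
```

This only hides the files from the web server; the cleaner fix is still to keep them out of wwwroot in the first place.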

Additionally, there are hundreds of JavaScript (and other) source files (471 in the few default packages that come with the standard web project) and unit tests included with some of the Bower packages. These are deployed as well by default, unless you filter the files manually. This exposes you to potential Cross-Site Scripting issues.

Fixing the issues

The easiest way to fix the issue is to delete the .bowerrc file and let Bower install components at the root of your web project, where they are not part of your deployed assets. Then use gulpfile.js for tasks that copy only those files your project actually needs into wwwroot.

I'm working on an improved ASP.NET 5 template that should be released here in a few days. It will be an example of a Single Page Application built on ASP.NET 5 that uses only NPM for packages and removes Bower completely.

This issue relates to ASP.NET 5 beta 5; hopefully Microsoft will change the project templates in the future so this is no longer an issue.


Microsoft Azure: Secure Site with SSL certificate (revisited)

Back in January 2011 I wrote the first instructions on how to secure your site with SSL certificate on Windows Azure. Since then, both Azure and IIS have been updated, so I’m revising these instructions here.

Learn how you can create the CSR (Certificate Signing Request) for Windows Azure, using Internet Information Services on Windows Server. The CSR is used by any certificate provider to generate the proper SSL certificate. You will then learn how to go through the process of securing your Windows Azure hosts and enabling users to access your services over HTTPS.

Create a new Windows Server

As it is now possible to create virtual machines on Windows Azure, you can easily create a new VM on Azure if you don't have an on-premises Windows Server.


After the machine is provisioned, you can connect using the Remote Desktop Client. You will find the public TCP port on the Endpoints page of the virtual machine. Connect to your Windows Server, whether it's on Windows Azure, another provider or any on-premises server.

Install Internet Information Services

Choose the Add roles and features option in the Server Manager. Go through the wizard and select the Web Server (IIS) option on the Server Roles step. Accept the dialog that adds the required feature, the IIS Management Console.


Certificate Signing Request

First open IIS Manager and navigate to the root element for the web server. Open the Server Certificates by double-clicking on the icon, as seen in the screenshot.


On the right you will see the Actions options. Click Create Certificate Request to start the wizard.


Fill out the fields in the wizard; in the Common name field you fill out your domain name.


The next step is choosing the bit length (strength) of the certificate. Choose a minimum of 2048; in this example I have chosen 4096, which is more secure but requires more computation and can be slower on high-traffic sites.


Choose where to store the certificate signing request on your local computer.


Open the file in a text editor and copy everything. You will need this content when applying for the SSL certificate.


Copy and paste the certificate request to your selected SSL provider. There are many providers available, with different processes and levels of verification. Make sure you research which type of certificate and verification fits your requirements.

Installing and exporting SSL certificate

After you have supplied the request to your SSL provider, and have completed the other verification steps, you will receive one or multiple .crt files, often packed in a .zip.

You normally don’t need the extra certificates, such as the CA (Certificate Authority) certificates that are included. These certificates are normally already installed on your server.

Copy the www_domain_com.crt or similar named file to your Windows Server.

Next step is to install the SSL certificate on your local web site in IIS. We will install the certificate and later export it for use on Windows Azure.

Go back to IIS Manager and the Server Certificates window. Below the link we used earlier there is another one named Complete Certificate Request. Click this and complete the wizard. Note that IIS normally looks for files with the .cer extension, so you might have to choose the *.* option in the Open dialog, if your certificate is in the .crt format.


It's OK to install the certificate in the Personal certificate store; you might get a permission error if you try another.

Locate the installed certificate in the Server Certificates view inside IIS. Right-click on the certificate and choose Export.


Pick a location to store the .pfx file, and enter a password. Make sure it's a decent quality password; if you ever lose the .pfx you don't want anyone to be able to easily brute-force the password. If you lose the .pfx and the password, others will have access to the private key of your certificate and can use it for malicious purposes.
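To put "decent quality password" into perspective, here is a back-of-the-envelope sketch (the guesses-per-second figure is an assumption; offline attacks on a stolen .pfx can be very fast):

```javascript
// Rough average time to brute-force a password of a given length drawn
// from a character set, at a given guessing rate. Illustrative only.
function bruteForceYears(length, charsetSize, guessesPerSecond) {
  const combinations = Math.pow(charsetSize, length);
  // On average the attacker succeeds after trying half the space.
  return combinations / 2 / guessesPerSecond / (3600 * 24 * 365);
}

// 8 lowercase letters: gone in seconds at a billion guesses per second.
console.log(bruteForceYears(8, 26, 1e9));
// 16 characters from a 72-symbol set: astronomically long.
console.log(bruteForceYears(16, 72, 1e9));
```

The point: a long password from a large character set is the only thing standing between a lost .pfx and your private key.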



Important: Keep your PFX file safe and keep its password safe. It contains the private key and shouldn't be distributed widely.


Configure Certificate for Azure Web Role

The next step is to configure the web roles in your cloud project within Visual Studio to use the new certificate. First, copy the .pfx file to your development machine and double-click it to open it; choose Local Machine as the store location and fill out the password you entered earlier. As you already have the private key exported in the .pfx file, you don't need to check Mark this key as exportable.



Now you can open your Visual Studio solution with the cloud project. Expand the Roles folder and double-click your Web Role. Find the Certificates tab and click Add Certificate. Fill out an identifier name (it can be anything), choose LocalMachine as the Store Location and My as the Store Name. In the Thumbprint column, click the "…" button to open the certificate selection dialog.


If you can't find the certificate in the dialog, experiment with the various stores to see if you can find it. If you are still unable to find it, you can install it manually using the Certificates snap-in for the Management Console (MMC).

Navigate over to the Endpoints tab and add a new endpoint with HTTPS as the protocol, and select the certificate to be active for that endpoint.
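Behind those two dialogs, Visual Studio records the certificate and endpoint in your service definition. The result looks roughly like this fragment of ServiceDefinition.csdef (a sketch; "MyCert" and "HttpsIn" stand in for whatever names you chose):

```xml
<WebRole name="WebRole1">
  <Certificates>
    <!-- Matches the Store Location / Store Name chosen in the Certificates tab -->
    <Certificate name="MyCert" storeLocation="LocalMachine" storeName="My" />
  </Certificates>
  <Endpoints>
    <!-- The HTTPS endpoint references the certificate by name -->
    <InputEndpoint name="HttpsIn" protocol="https" port="443" certificate="MyCert" />
  </Endpoints>
</WebRole>
```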


Now you can launch your web project from Visual Studio, and the local Azure emulator will open two instances, one for HTTP and one for HTTPS. Don't be afraid of the certificate warnings; these are normal. Your certificate is only valid for the production URL that you specified while ordering it, meaning you will get a warning when re-using the certificate for localhost and other host names. There are also wildcard certificates, which cover all subdomains of a domain and can be used for many purposes. If you are building a big cloud solution where you want custom domains for Azure Storage, etc., then you should apply for a wildcard certificate. Beware though, it comes at a premium price.


Simply choose to skip/ignore/accept the certificate for your localhost debugging and developing needs.

Adding Certificate to Windows Azure hosts

The last and final step before you deploy your updated web role is to ensure that Azure has a copy of the certificate.

Log in to the Azure Management Portal, find your Azure instance and navigate to the Certificates option. Choose the Upload a certificate link and find your .pfx file.




After the process is complete, you can deploy the updated version of your cloud project. Your site should now be fully functional with the ability to run over HTTPS for secure communication.


Securing your services with HTTPS is important to ensure the privacy and safety of your customers and users. Never allow anyone to authenticate their credentials with your site unless it’s with HTTPS. When you don’t use HTTPS, all the information the user enters on your web site can be sniffed and logged by third parties at various steps in the network from the client computer to your hosted server. In many cases, this data travels across multiple country borders.

Installing and configuring HTTPS certificates is sometimes hard, but I hope this walk-through makes you aware of the importance of using it, and of how quickly and easily you can get up and running with a valid SSL certificate.

If there are any questions, please leave a comment.


Trying to understand Microsoft.Data.dll

Here is my analysis of the recently "released" (embedded) Microsoft.Data.dll assembly, the namespace and the types it includes. It's been the topic of a lot of heated debate recently, with viewpoints I was unable to relate to and understand just from reading, so I needed to investigate for myself.

The debate is stemming from a blog post by David Fowler and his example that shows how some data-related tasks have a simpler syntax with Microsoft.Data and the ASP.NET WebPages with Razor Syntax.

What is inside the Microsoft.Data namespace?

There is very little code inside the namespace and the assembly. It's simply some helper types that make life a little bit easier. It's not a new data access framework like LINQ to SQL or Entity Framework.

It contains the following classes: ConfigurationManagerWrapper, ConfigurationConfiguration, Database, DbProviderFactoryWrapper, DynamicRecord, IConfigurationManager, IDbFileHandler, IDbProviderFactory, SqlCeDBFileHandler and SqlServerDbFileHandler. Only Database and DynamicRecord are publicly available; the others are internal.

All data access inside the Microsoft.Data types uses the common ADO.NET types, not providers specific to any SQL platform. This means it's not restricted to SQL Server Compact Edition or SQL Server. It relies on DbConnection, DbTransaction, DataTable, etc.

Microsoft.Data on ASP.NET Web Forms

While Microsoft.Data.dll is currently not accessible in the Add References dialog, you can find it on your computer in the Global Assembly Cache (GAC). Microsoft probably doesn't want us to use it outside of WebMatrix in the current release, but if you just take a copy of the assembly out of the GAC, you can reference it in any .NET project and it will still load from the GAC (you just need the file so you can add a reference).

In my project I added a database to my App_Data folder (which you would normally never do, unless you are working with a local read-only cache in a large distributed system or with SQL Compact Edition) and added the following code to my web form to make it render the Name column of my Users table.

	var db = Database.OpenFile("Database1.mdf");
	var users = db.Query("SELECT Id, Name FROM Users");
	foreach (var user in users)
	{
	    Response.Write(user.Name + "<br />");
	}

Take notice of the OpenFile parameter: it's simply the filename on disk. I don't have to care about any specific details of the connection string, nor figure out where the App_Data folder is.
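For comparison, opening the same file via a classic ADO.NET connection string would look roughly like this (a sketch based on the EF-generated string shown below; exact provider defaults may differ):

```xml
<add name="Database1"
     connectionString="Data Source=.\SQLEXPRESS;AttachDbFilename=|DataDirectory|\Database1.mdf;Integrated Security=True;User Instance=True"
     providerName="System.Data.SqlClient" />
```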

Obviously, if you added an Entity Framework (EF) model of your database, you would have a very similar example achieving the same, and you wouldn't have to care about the connection string, at least not in theory.

	using (var db = new Database1Entities())
	{
	    foreach (var user in db.Users)
	        Response.Write(user.Name + "<br />");
	}

The first big distinction between these examples is that the first one is dynamic: I can modify the database schema whenever I want and it won't (necessarily) break my web app, while the EF example needs the entity types to be refreshed based on the database model.

The other distinction is that the first example doesn't require a connection string, while the latter generates one for you automatically, a rather cryptic-looking one.

<add name="Database1Entities" connectionString="metadata=
provider connection string=&quot;
Data Source=.\SQLEXPRESS;AttachDbFilename=|DataDirectory|\Database1.mdf;
Integrated Security=True;
User Instance=True;
MultipleActiveResultSets=True&quot;" providerName="System.Data.EntityClient" />


While all of this is peanuts for me and anyone who's been developing on .NET for a while, I think that making things simple where possible is positive rather than negative. It doesn't mean we will stop using NHibernate, or stop doing proper n-tier and layered architectures, just because Microsoft makes some tasks simpler for beginners. It also means some of us will probably eventually have to maintain, and possibly migrate, solutions built on Microsoft WebMatrix, but does that give us any right to deny beginners the privilege of building their own solutions and feeling the immense joy of realizing their dreams?

Others' feedback and comments

Ayende Rahien comments on the example on his blog, where he mentions the use of Response.Write within the loop. This is probably not the best way to do it, but it's understandable given the sample in question, which was already using Response.Write; there are slightly better examples available out there. He also points out that having the SQL queries directly in the view code is an open invitation for SQL injection. Using proper parameterized queries reduces this potential security problem, and it looks like David updated the sample to use parameters after the initial reactions. After the security push at Microsoft some years back, they really cleaned up their practices with examples in the MSDN documentation; I think we should expect the same level of security thinking on employee blogs as well.
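To make the injection point concrete, here is a small language-neutral sketch (plain JavaScript, purely for illustration; the table and placeholder syntax are assumptions) of why concatenation is dangerous while a fixed statement with bound parameters is not:

```javascript
// Building SQL by concatenation: user input becomes part of the statement.
function unsafeQuery(name) {
  return "SELECT Id, Name FROM Users WHERE Name = '" + name + "'";
}

// Parameterized style: the statement text is fixed and the values travel
// separately, so input can never change the shape of the query.
function safeQuery(name) {
  return { sql: "SELECT Id, Name FROM Users WHERE Name = @0", params: [name] };
}

const evil = "' OR '1'='1";
console.log(unsafeQuery(evil));
// -> SELECT Id, Name FROM Users WHERE Name = '' OR '1'='1'
//    The WHERE clause now matches every row in the table.
console.log(safeQuery(evil).sql);
// -> SELECT Id, Name FROM Users WHERE Name = @0  (unchanged)
```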

Ayende quotes David, who made the assumption that Microsoft.Data is tied to SQL Server, which my investigation has shown is not correct.

David tried to respond to some of the feedback on embedding SQL directly in the view by hacking around to get the new dynamic keyword to work properly with LINQ. To me, this defeats the whole purpose of the simplicity of Microsoft WebMatrix, Razor and Microsoft.Data.

KristoferA comments on the post and suggests generating a LINQ to SQL data context using a page-level directive, which would essentially give the developer entity objects to work with (and query against). This again defeats the purpose of simplicity, and now you can no longer change the database schema without "recompiling" your web app.

The namespace naming is another sour point for some, and I can agree that there is little point in "abusing" the Microsoft.Data namespace for such a trivial little helper; perhaps Microsoft.WebMatrix.Data or Microsoft.Data.Connection?

Who is this for?

Microsoft WebMatrix (and ASP.NET WebPages) is not a tool built for "professional" programmers, nor is it a fully generic framework for building everything. It's a domain-specific tool optimized for building simple web applications without too many resources.

It is not meant for enterprise applications that handle millions of transactions. Will it be used for that? Yes, it probably will! We've seen plenty of classic examples of websites that start with simple web frameworks and find themselves in big trouble when their services become popular. Services like Twitter and Facebook were not built to scale to their current levels, yet they started as simple concepts and services and have grown to become important services that affect global policies and real-life social interactions.

It's Not Rocket Science, But It's Our Work:

And obviously, it’s for those of us who still miss the old days with ASP Classic, here is a pretty good (and funny) read, 8 Reasons to Stick with ASP 3.0 in 2006 (and 2007).

Final thoughts

It's very clear that Microsoft WebMatrix (and related technologies) is primarily focused towards beginners, and it's a great tool for building simple websites. I wouldn't advise anyone to use it if you already know ASP.NET MVC and want to build complex web solutions; ASP.NET Web Forms, MVC or other more general-purpose frameworks would probably be a better fit.

Additionally, I think it's important to remember that WebMatrix is primarily focused on SQL Compact Edition for data access; the built-in editor doesn't allow you to modify a SQL Server database. So the question (and my response to some of the comments) is: how many layers do you want to wrap your data access logic in for a SQL CE database?

It's been a while since Microsoft did a push towards simplifying development for beginners. When we went from VB6 to VB.NET, everything became more complex, and the entry level for VB.NET is on par with C#. With the release of .NET Framework 4, the complexity and total amount of features is mind-blowing. I for one welcome tools, languages and frameworks that simplify how we develop software.

Simplicity is hard and it's something we should strive towards in all that we do.


MSDN Live: Solution Architecture Slides

Here are the slides from my talk on Solution Architecture at MSDN Live in the spring of 2010. The slide deck alone isn't enough to appreciate the presentation, so I have included all the notes that were written for it. This means you can read through the presentation and the points I made when delivering it in Stavanger, Bergen, Trondheim and Oslo. Download the full presentation or watch below.

For more background on the presentation, also read the blog post I wrote during the preparations. The final result is very different from what I initially planned, and I didn't deliver what was promised in the agenda. I still hope the presentation gave enough value to those who attended, and that it inspired change and sparked a move towards simpler solutions with reduced complexity.



MSDN Live: Solution Architecture

At the next MSDN Live tour in Norway (in April), I'm doing talks about Solution Architecture and SharePoint 2010 for Developers.

I would like to air some ideas I have for the Solution Architecture talk and hopefully get some feedback, perhaps some tips and hints that can improve my talk.

What’s in a name?

There is no way I'm going to even start trying to define the term architecture or the architect role. It means something different to every single individual, in the same way that I'm never going to define what a developer truly is.

Though we can talk about the distinctions between what it means to be a developer and what, in comparison, the role of an architect could potentially be.

Architecture is primarily about the bigger view of things and the spider web of interactions between humans and systems within and across organizations. There are many forms of architects, from functional architects, enterprise architects and software architects to what I'm going to talk about: solution architects and solution architecture.

Architect? You make diagrams, right?

Well sure, architects often use tools to draw their ideas and conclusions, even if it's just pen and paper. Source code is the primary language of a developer, and diagrams are the primary language of an architect. Beyond that, I'm not going to talk about diagrams, other than to say they are a good tool for communicating intent, ideas, thoughts and meaning. Architecture is not about diagrams; it's about everything else.

The Solution Architect Role

When I talk about the solution architect role, think about the role from a technical perspective, not a functional one. Here is a diagram that tries to illustrate some of the interactions that the architect has with other roles in a project.


Depending on the scale and form of a project, the architect is often involved early in the process, and hopefully stays part of the project until the final delivery date. Unfortunately, the architect role carries some negative weight. Some people see the architect as someone distant from the project, someone who makes decisions that developers feel the pain of. This can be true for some projects, and that is a bad position to be in, both for the developers and for the project's chances of success.

It's important that the solution architect is closely involved with the project all the way. Initially they work with the client to gather requirements; depending on the type of architect and his or her responsibilities, these might be both functional and non-functional. Early on they often work with project leaders and members on the client side, and upper management often has a stake in the project and unfortunately sometimes makes technical decisions before involving others, often after reading a report by Gartner… So the architect and developers often have to work with pre-existing decisions; most of the time this works out fine though.

The green person in the illustration is the client's network and system administrator, who often has requirements and demands regarding security and deployment. If you're lucky enough to be on a project with a designer, the typical black-suit guy using a Mac, they often have insane demands on the interface. I say this with a sense of humor, as usability experts and designers are very important for the success of a project.

Then you have all the others, which are different individuals from inside and outside the organization. Computer security experts might be utilized to do reviews of the architecture and eventually the complete solution.

Users of the final solution are very important; they are the reason we do what we do. If we can't satisfy them, there is little point in going forward with a solution.

After a project has been planned, contracts have been agreed upon and signed, the project starts with the project team. Depending on the size, the project team could include advisers, project leaders, developers and others.

The architect often has interactions with all of these roles in a project, and their focus and responsibility is often the quality of the overall delivery. Architects are not the individuals who manage the project and its resources, which is a whole different and challenging arena that, as a solution architect, you luckily normally avoid. Though it's a constant battle to ensure the developers get the time, knowledge and tools they need to ensure the quality of a delivery, which is not always compatible with the goal of a project leader who first and foremost wants to deliver on time.

Topics for the talk

These are some of my other potential topics on the agenda for my talk. There is so much to talk about on the subject of solution architecture, but I only have an hour, and I'm interested in finding the topics that give the most value to my audience.

Topics: Security, Infrastructure, Products or Custom, Cloud Computing, Frameworks, Scalability, Tools, Why you should care about architecture, Becoming a solution architect.

What do you want to hear about?

Come to MSDN Live!

If you haven't signed up for MSDN Live yet, it's about time! The tour starts in Stavanger on the 16th of April and ends on the 26th of April in Oslo.

I work as a senior solutions architect at Steria, which is one of the partners for MSDN Live. Check out our stand at MSDN Live!


Windows Azure: Secure Site with SSL certificate

This post has been updated for newer versions of Azure and IIS. View Microsoft Azure: Secure Site with SSL certificate (revisited) or continue reading below.

Learn how you can create the CSR (Certificate Signing Request) for Windows Azure, using Internet Information Services 7.0 on Windows Server 2008 R2. The CSR is used by any certificate provider to generate the proper SSL certificate. You will then learn how to go through the process of securing your Windows Azure hosts and enabling users to access your services over HTTPS.

Certificate Signing Request

First you open IIS Manager and navigate to the root element for the web server. Open the Server Certificates by double-clicking on the icon, as seen in the screenshot.


On the right side, there is an action by the name Create Certificate Request. Click this link and a new dialog will appear. Fill out this dialog with your information. Make sure you don't type the wrong URL for the website you want to secure. Depending on the type of certificate you're applying for, you might use a wildcard or not.


The next step in the wizard allows you to choose the cryptographic service provider. Choose the Microsoft RSA one and set the bit length to whatever your requirements are. I set mine to 2048. The last step is to save the request to your local machine. Save the file somewhere and you’re done with the request.

Purchasing SSL Certificate

Now you’re ready to purchase an SSL certificate. There are many providers of certificates and there are a wide array of available certificate types. They can be used for secure email, signing software code, securing web server communication and more. There are additionally multiple levels of security, some certificates include a green URL bar which improves the visibility of the secure communication channel.

Open the .txt file you created with the request. Copy everything inside the file; it should begin with "-----BEGIN NEW CERTIFICATE REQUEST-----".

Paste the content into the website where you have purchased the SSL certificate. See the screenshot below for an example.


Complete the procedures required by your certificate provider. When you're done, you should be able to download or receive the certificate by email. The certificate comes in the same form as the request; if you receive it by email, copy the content and save it to a new file with the extension .cer. The certificate starts with "-----BEGIN CERTIFICATE-----".

Installing and exporting SSL certificate

Next step is to install the SSL certificate on your local web site in IIS. We will install the certificate and later export it for use on Windows Azure.

Go back to IIS Manager and the Server Certificates window. Below the link we used earlier there is another one named Complete Certificate Request. Click this and complete the wizard.

If everything is correct, the certificate should appear in the list; when you open it, validate that it's valid. See the example below.


Exporting to .pfx file

The next important step is to export the newly installed certificate to a .pfx file. You can do this by clicking the Export link on the same page where you imported the .cer file. It's advised that you enter a password for additional security; if the .pfx file is lost or stolen, it can't be used without your password. Export the file to the local machine; we will use this file later on to upload it to Windows Azure.


If you're developing a web site using Windows Azure, I suggest mapping a virtual directory on the local IIS to your web application. This makes it possible to test your site through IIS, as well as through the built-in web server in Visual Studio and through the Windows Azure Fabric that runs locally.

Go into the web site and modify the bindings. Add a new binding which uses HTTPS protocol and pick the newly installed SSL certificate.


If you want to, you can now verify that the SSL certificate is working properly on your local machine. Note that you will see a security warning that the certificate is invalid (wrong URL). This is to be expected.

Adding Certificate to Windows Azure project

Now that the certificate is installed on IIS, we need to install it for the local user that is running Visual Studio with the Windows Azure project. The certificate has to be installed in the local store, for Visual Studio to find it in the properties dialog on your project.

Open Windows Explorer and locate your certificate; this is the .cer file that we saved earlier. You need to open it from Windows Explorer and not from IIS Manager for the "Install Certificate" button to appear. In the installation wizard, choose to install the certificate in the Personal store.


Open the settings of your role by double-clicking the icon below the Roles folder under your Windows Azure project in Visual Studio. Navigate to the Certificates tab and choose Add Certificate. Give it a name; I like to use the domain name for simplicity. Choose LocalMachine as the store location and My as the store name. Click the button to open a dialog where you can choose the newly installed certificate.


The last step before we move on to the next one, uploading our private key to Windows Azure, is to configure the HTTPS endpoint. Navigate to the Endpoints tab in the same window and enable HTTPS. Pick the correct certificate from the dropdown.


Adding Certificate to Windows Azure hosts

Navigate to the Windows Azure portal, log in and open your project. At the bottom of the screen, where you see the current status of your virtual machines, there is a section named Certificates. Click the Manage link to continue.

Upload the .pfx file you exported earlier. Enter the password you chose while exporting. Click Upload to continue.


Go ahead and deploy your project again; you might be required to delete the existing deployment before you can upgrade. This is the case if you've moved from a temporary self-signed SSL certificate to an official one. When the new deployment is complete, your web site works with HTTPS! Below is a screenshot of Boks running on Windows Azure with HTTPS.



Securing your services with HTTPS is important to ensure the privacy and safety of your customers and users. Never allow anyone to authenticate their credentials with your site unless it’s with HTTPS. When you don’t use HTTPS, all the information the user enters on your web site can be sniffed and logged by third parties at various steps in the network from the client computer to your hosted server. In many cases, this data travels across multiple country borders.

Installing and configuring HTTPS certificates is sometimes hard, but I hope this walkthrough makes you aware of the importance, and of how quickly and easily you can get up and running with a valid SSL certificate.

Additional Walkthroughs

There are many examples of the first part of this walkthrough, namely requesting certificates and getting them approved. How do I… Request and install SSL certificates in IIS 7.0? by Mark Kaelin is one which explains the various steps in more detail. Do a search to find more walkthroughs if you're having problems.


Technology Predictions for 2010

As we enter a new decade, one which will be called The Tens or Twenty Tens, we are ready for a fresh start: a decade which will bring hugely important developments in all areas where humankind puts its efforts and energy.

Let me first give a general introduction to what others have predicted for this decade, which is a favorite of many books and Hollywood movies. Then I will give my own predictions for the decade and for what will happen in the year 2010. At the end, I will summarize my predictions for 2008 and 2009 and see what I got right or wrong.

The Tens

As 2001: A Space Odyssey was instrumental for the previous decade, The Aughties, there is another film adaptation of the second book in the series by Arthur C. Clarke. 2010: The Year We Make Contact is a movie largely set in the context of the Cold War between the USA and the Soviet Union. While Russia (previously the Soviet Union) is not on par with its previous empire, it will be an important partner in the global society, for both environmental and technological reasons.

Other movies that depict The Tens are Terminator Salvation, Blade Runner, Akira, I Am Legend, Back to the Future Part II, 2012 and The Island, along with games such as Call of Duty 4: Modern Warfare 1 and 2, Metal Gear Solid 4 and S.T.A.L.K.E.R. (which happens to take place in Chernobyl).

A lot of movie and game predictions for The Tens revolve around misery of all forms: world wars, famine, terror, death and global disasters, from natural catastrophes to the creation of intelligent machines and whatnot. I predict that very little of what the movies and games mentioned above depict will come true 😉

Decade of Subscriptions

My main prediction for The Tens is that it will represent a dramatic shift towards subscription-based models for everything. This isn’t exactly a revolutionary prediction, as almost everything today is already moving towards subscriptions. There are a few industries which still hold on to the concept of charging per item; these will fail in The Tens unless they are willing to change. We can’t truly say that the previous decade had subscription models for everything, because in many instances they were only available in certain parts of the globe. Microsoft’s Zune Pass, a subscription to the (almost) entire music library on Zune, is available only to subscribers in the USA. Spotify, another similar service and an innovation from Sweden, is still only available in a few select countries. My prediction is that the subscription model will become more global, span country borders and become available to anyone, anytime, from anywhere – and from any status in society.

Those who make it big in The Tens are those who are early out with good subscription-based services. If we look at Amazon and their newly (internationally) released Kindle – the most gifted item ever on Amazon – and the increase in sales of digital eBooks, we see that people are ready for a change in an area which has long been held back by a lack of innovation in reading devices and in the usability and experience of delivery. Anyone can use the Kindle and buy new books with ease from whatever chair, bed, taxi or bus they’re currently in.

Amazon Kindle books are today packed with copy protection, and the items you purchase are not yours; you only pay for a license to them. There are two issues with this. The first, copy protection (DRM), is one that hopefully will go away, in the same manner as Apple iTunes stopped protecting the music they sold. The other issue, that consumers are not actually able to purchase a product, will be a major hurdle going forward. We will see lawsuits and protests against the players in this field. It’s important for us as consumers to realize that our rights are virtually nonexistent when it comes to digital content purchases. We need to demand the same rights to digital products as we have with physical products. When I purchase a physical CD album or a book, it is something I can later sell, rent or give away to a friend or relative, or that can be inherited by my (future) children. With digital content, this is not legal or physically possible. Amazon has another big problem: the cost of eBooks is not worth the dramatic reduction in usage rights and value for money. I’ve seen countless examples of books which are cheaper in paperback than as eBooks; this will hopefully change moving forward.

We consume more content than we ever did before, and no one is able to pay for every single song, movie or TV series episode they watch. This is why many still listen to the radio and watch television. I’d really like to see a television which charges per watched episode of reruns of Friends and Seinfeld. I pay a relatively small price (less than one CD) every month for access to the full music library on Spotify, and I can listen to my favorite playlists on all my computers and my mobile phone. I don’t have to care about the hassle of synchronizing content and connecting cables to the computer.

From my 2009 predictions, here is one that became a widespread reality for many:

Here is a scenario you will experience many times in 2009: You go to a party, and connected to the stereo is a laptop. It’s running Spotify, and music is streamed out through the speakers. You step up to the laptop, log in with your own account and start playing your private playlist of the best party music in history.

Legal methods of video distribution will have similar appeal to Spotify and will have a good chance of competing against less-legal means of distribution, which by all counts today represent a large share of the video distribution market. I’m currently a beta tester for an upcoming video-subscription service called Voddler, another innovation from Sweden, and the potential is great.

Here is my wish list:

  • Amazon Kindle Subscription: read any book you want in the entire library for $9.99 a month. Download copies without restrictions for $2.99.
  • Zune Pass International: Microsoft is planning an international release of Zune in 2010; let’s hope it happens. Until then, I’m more than happy with Spotify.
  • Movie Subscriptions: services such as Voddler and others will become more widespread. Television sets will include support for streaming videos over the Internet.
  • YouTube Premium: a per-movie cost and/or subscription model for premium content delivered by YouTube. This will work on most of today’s TV sets which support YouTube.

This section focused on media content, but the same is true for everything in the free market, including computer rental (through Cloud Computing models), robot rental (this will be huge) and car rental (inner-city electric vehicles).

What is the potential for a subscription provider? A massive number of customers. The last 20 years brought 1 billion users to the Internet; the next 5 years will bring another billion. Think about that!

Decade of Simplicity

Another big area of development will be towards simplicity, in all its forms and shapes. While the previous decade had a lot of focus on functionality and features, the next decade will focus on streamlining existing functionality and features – into simple and intuitive consumer solutions.

At the end of the decade, mobile phones had more features than users knew what to do with. Microsoft Word had more functions than any single human could learn. This spawned new products, such as the Apple iPhone, the Ribbon bar in Microsoft Office 2007 and Twitter, the micro-blogging service. These developments will in the 2010s spawn new services and products where the focus is on building simple and user-friendly solutions.


2010

For the next year we will see smooth progress in all areas of technology. As our computing power and abstraction models have become more advanced and established, we are better positioned for a continued evolution of progress in all areas of research and development.

Computers will have six- and possibly eight-core CPUs, and most will come standard with a minimum of 4 GB of RAM by the end of 2010. We will see the first demos of Windows “8”. Visual Studio 2010 will be released, which will further improve developer productivity and team collaboration. More software projects will succeed than ever before in recorded history, thanks to improved practices, tools and patterns. Cloud Computing will give us an additional abstraction layer that makes it faster and easier to build online services that can scale to millions of users.

Avatar made movie history at the end of 2009 with its innovative use of 3D. It will spawn an era of a lot of crappy, and some good, 3D movies. Hollywood will yet again make record numbers in 2010, though in some ways this is the last year of growth if they can’t move to an online distribution model that can compete with “free” alternatives.

We’ll get news about the next-generation gaming consoles from Nintendo, Microsoft and Sony. HALO: Reach will be played by a record number of simultaneous online gamers, beating the previous record set in 2009 by Modern Warfare 2.

Microsoft will release more online services which are only available to individuals in the USA. They will release at least one new software product which is completely unknown to the public before it is announced. Big things will happen with Windows Live Services; details will probably be known during the MIX10 conference. Apple will release an incredible, amazing, awesome tablet computing device. Google will continue to develop Google Wave, and their OS will be largely ignored by the market, as there are better alternatives already available for instant-on OS experiences.

Software that previously was restricted to local desktop installation will be built using Silverlight and use online cloud storage to persist data in a secure and scalable manner. All your documents, books, music and other content will be stored in the cloud, except for movies. Movies will come later, as network bandwidth and storage capacity become cheaper.

One new and important service in the online cloud world is Boks, a service that helps you organize all your stuff (books, movies, games, CDs, etc.). It will be available in 2010 and is developed by me.

Previous Predictions

In 2008 (Technology in 2008 and Beyond) I focused on more general predictions: the problems with global warming, the free market’s inability to innovate in the face of regulations, and that CPU development would continue according to Moore’s Law. Other predictions I made in 2008 came true for many, such as the ever-increasing social networking services and our tendency to be drawn into them. Facebook, Twitter and blogging are still growing in popularity, and the trend will continue. We have become accustomed to spending more time online, and our need for content is growing. Watching videos on YouTube is the new way we entertain ourselves and our friends (watching videos in the living room on the big TV), educate ourselves, involve ourselves in debates and discussions, and communicate our thoughts, ideas, opinions and imaginations.

In 2009 (Technology Predictions for 2009) I tried to be more concrete with my predictions, and one of the major ones was of course social networks, which has more or less come true. Social networks are today an important part of the global society. I predicted laptops with 16 and 32 GB of RAM; today you can buy a Dell M6500 with 16 GB of RAM and a quad-core CPU.

Another prediction I made was the $199 netbook, which happened in November. You could get an Acer Aspire One with a 160 GB HDD, 1 GB of RAM, wireless networking, a webcam and more for $199.

Microsoft had a very successful release of Windows 7, while computer security is still a major problem. Phishing was big in 2009, as I predicted, with millions of usernames and passwords lost for services such as Google GMail, Hotmail and Yahoo. Cloud Computing is off to a relatively slow start; those who are willing to bet on the cloud are having success, and more will come in 2010.

We have not seen the development in emotional software that I predicted, though I’m confident that this will become a very important industry in The Tens.

To read more predictions for 2010, check out my fellow Microsoft Regional Directors at

(Photo by Tim in Sydney)


I know your passwords

Computer security is one of the hardest things in computer science and engineering. It’s easy to make software today; anyone can do it. But not everyone knows how to build security into their software. Every week I come across insecure solutions, and it frightens me; it gives me the willies.

I was looking for a provocative title for this post, as I want people to read it. I hope it worked; please keep on reading.

Today I only want to touch upon one issue: passwords. This is an area that affects every one of us and is pretty easy to explain. If you’re a software developer reading this, make sure you don’t make the same mistakes. If you’re a consumer, make sure you tell your service provider that they need to change their practices. This is a major industry-wide issue, so please raise your voice. If you have little time, please skip forward to the “Learn by Example” section.

Stubbornness or Cluelessness?

Whenever I come across a web site that has a potential security issue, I contact the offenders and try to explain the problems I’m seeing.

A lot of the time, I’m met with ignorant support personnel who don’t understand what I’m saying.

That’s OK; I’m a pretty technical guy and I don’t expect everyone to understand this, and there’s no reason they should. But when they for some reason argue with my request to forward my message to someone technical and responsible for security, I’m baffled.

Many don’t seem to take their customers’ privacy seriously, and they are reluctant to react to issues.

Next time you come across a web site with problems like those I’m about to elaborate on, I hope you take the time to let them know you won’t use their service until they improve their systems. What does all of this have to do with Tom Cruise in the photo? I found him when I searched for a tech support photo, and he looks just like tech support :-)

Username and Password

In the beginning of the computer industry, we rarely cared much about security on our local machines. We shared the same user accounts, and we mainly used different usernames to individualize the computer. We were disconnected, and the way we distributed software was on diskettes and later on CD-ROMs.

The information we stored on our computers was often school- and work-related; it didn’t contain many personal details or much communication. No matter what you put on the computer’s hard drive, it required someone to physically steal it to peek at your data.

Then came the local network, where we hooked up computers in offices and with our friends for a LAN party. Information spread freely on the networks: games, videos, music. Just as we previously burnt CDs and recorded tapes with music and videos on VHS, we could now share our stuff much more quickly and cheaply than ever before.

Enter the Internet.

Suddenly our insecure local computers are connected to the online digital world. A myriad of software and services was created, in a global mess of information that makes it impossible for anyone to really know who or what they can trust. And everyone wants your username and password; it’s their way of distinguishing You from Me.

We’ve all heard the lesson that you should make sure your password is a hard one to guess, yet many of us have a hard time coming up with any sensible password that we’ll remember easily. It’s also important not to reuse the same password everywhere, as you will understand if you read on…

Please Enter…

Please enter your username and password, and we’ll open the door for you and let you into our fine establishment. That’s how it starts: if you’re not already registered on the web site, you’re required to fill out an often extensive form that tries to capture some personal details from you. Part of this process is filling out your username of choice, password and email address.

This is where the problems start…

Let’s start with Google’s GMail as our first example. Creating a new account involves filling out a first name, last name, desired login name and password. Additionally, Google wants you to pick a “Security Question”. What’s the purpose of this, you might wonder? Does it make you more secure? No, it doesn’t.


There are only four default security questions proposed by Google, and they have a help page that explains what types of questions and information you should avoid – things like your mother’s maiden name and other information that is easily discoverable about you. You can write your own question, but my advice is to completely forget about the security question; it’s way too easy to pick something that someone can guess or figure out.

Then we have a field called secondary email. This is a very nice way to restore access to a new email account, and it’s better than the security question.

If we look at how we humans work, you’ll quickly see that most of the time we fill out all fields in a registration form, even though we probably don’t need to.

So the issue with this Secondary email field is the following: People without existing email addresses might fill out something in this field, just because they intuitively think it’s required information.

Important: Always make sure you enter the correct email address.

Let me give you a very scary example of what might happen if you enter the wrong email address when registering a new GMail account (please excuse the screenshot being in Norwegian).


As the above screenshot is in Norwegian, I will just quickly explain it.

It’s a confirmation email you receive from Google with a confirmation code to be used if you have any problems with your account in the future, for example if you lose the password. I have received several of these emails. With this information, I could take over someone else’s email account and read all their communications.


You should be scared, and this is only the beginning… I receive invoices, usernames, passwords, photos, personal messages and what not…


Phone subscription invoices…


Lego account activation… what if your kid filled out personal details, like their full name, address, birth date and more? That information will be accessible to the person who receives this email.


Online Game registrations that sends passwords in clear text…


Property descriptions… that probably were supposed to go to someone, somewhere…


I could be a Gladiator… I loved the movie, I already hate the online game… and you can see why I hate it.


Love to watch photos… especially the dull and boring family photos from last Christmas.


Guess he won’t see that flat after all…

I’ve received invitations to board meetings, mobile MMS messages sent by mail, photos, responses to job applications – all kinds of crazy stuff. Let me give an example where, for the purpose of this article, I actually clicked the activation link just to see what kind of information I could stumble upon.

Learn by Example

Disclaimer: I would never try to hack or steal anything from anyone. My intention with this example is only to show how vulnerable you can be when a service provider doesn’t care about the safety of your personal information. This is the first and only so-called activation link I’ve clicked that did not belong to me. When I went through with this example, I was scared by how easy it was, and it was only one of potentially many examples I could have done. I have censored the names, details and URLs to protect the innocent.

1. You register on a website by filling out your personal details: potentially information like your full name, home address, phone number and finally your password – which you probably have used before on another website as well.

2. This is where things get problematic: I own the email address that the user supplied. If I were an evil system administrator, I could potentially steal this email as it hits the servers. There are many ways I could potentially get hold of the specific email or the user’s email account. Never presume that your emails are secure.

3. Someone receives your confirmation email about your account. Sometimes this email contains the original password in clear text. Sometimes it requires you to activate the account to “prove” that you are the owner of the email account.


4. After clicking the activation link, I arrive at the website. Some services actually log you in automatically at this step. This service did not, so I had to use the “recover my password” functionality.


5. I then receive an email with a password. Some services will NEVER expose your original password, which is how it should be: when you forget your password, a service should send you an auto-generated one. The service in question returned the original password that the other person had used.


6. I log in to the website and check the user’s profile to see if there is any interesting information. What I got from this service was a full name, birth date and phone number, and at the bottom there is an empty field for a bank account number.


7. I was surprised to see password and confirm password text fields on the user profile page. It made me think that the website possibly renders its users’ passwords in the HTML source. And sure enough, it did.


8. I now have this individual’s full personal details. Since I have the person’s phone number, I can validate that everything is correct, and it is. There are so many ways one can exploit this type of information. The person had an income of approx. $53,000 in 2007, according to the public Norwegian tax lists. I know what interests he has and what he looks like, from his Facebook profile photo.

9. I’m not going to take this any further; what I potentially could do is log in to the individual’s Facebook account, as he is probably using the same password there…

Example Conclusions

The scary part of this whole example is that it was done on an online auction website, which probably has a lot of traffic and users. There are so many security mistakes made in this example that I can hardly believe it. They handle VISA and MasterCard transactions, yet they don’t use HTTPS/SSL for anything. They have probably outsourced the VISA/MasterCard transactions – I hope.

Can you imagine what would happen if their database was stolen, with all this information about all their customers?

Clear Text Passwords

This is the most common mistake made by developers, and it amazes me that there are services out there that still rely on storing your password in clear text. Let me illustrate how this works.

1. User enters a web service and registers with the credentials.

2. The credentials are sent over the Internet, often over a secure HTTPS (SSL) connection. Never fill out important information over a plain HTTP connection.

3. Credentials are stored in the database.

4. The user comes back to the website to authenticate; the password is again sent to the web service, and it’s validated against the value stored in the database.

When you have trouble remembering your password, those services that store your password in clear text often allow you to retrieve it insecurely by email. But just because you can’t retrieve the password by email doesn’t mean it’s stored securely; it can still be clear text in a database somewhere.

Secure Password Communication

With the above example in mind, I want to quickly give you an example on how the web service should handle your passwords securely.

1. User enters a web service and registers with the credentials.

2. All data is sent over a secure HTTPS connection.

3. The web service generates a non-reversible hash based upon your password and a random value (a salt), possibly combined with a hidden server-side secret.

4. The hash of your password, which is not reversible except with an awfully powerful computer and a lot of time, is then stored in the database.

5. The user comes back to the website to authenticate; the password is again sent to the web service, but this time it generates the hash all over again, retrieves the existing hash from the database, and compares the two values. If they are the same, you are authenticated.

There is absolutely no reason why a service provider should need to store your password in clear text. If they have a reason, it had better be a very good one.
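To make the steps above concrete, here is a minimal sketch in Python using only the standard library. The function names, the PBKDF2 parameters and the iteration count are my own choices for illustration; a real service would typically reach for a dedicated password-hashing scheme such as bcrypt or scrypt.

```python
import hashlib
import hmac
import secrets

ITERATIONS = 100_000  # slows down brute forcing; illustrative value


def hash_password(password, salt=None):
    """Derive a non-reversible hash from the password and a random salt."""
    if salt is None:
        salt = secrets.token_bytes(16)  # fresh random salt per user
    digest = hashlib.pbkdf2_hmac(
        "sha256", password.encode("utf-8"), salt, ITERATIONS
    )
    return salt, digest  # store only these, never the password itself


def verify_password(password, salt, stored_digest):
    """Re-hash the supplied password and compare in constant time."""
    candidate = hashlib.pbkdf2_hmac(
        "sha256", password.encode("utf-8"), salt, ITERATIONS
    )
    return hmac.compare_digest(candidate, stored_digest)


# Registration: the database row holds (salt, digest).
salt, digest = hash_password("correct horse battery staple")

# Login: the same password verifies, anything else fails.
print(verify_password("correct horse battery staple", salt, digest))  # True
print(verify_password("wrong guess", salt, digest))                   # False
```

Note that nothing stored here can be emailed back to a forgetful user, which is exactly the point: a service built this way is physically unable to leak your original password.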

Simple Passwords

A lot of web services demand a fixed password length, sometimes between 4 and 12 characters, and American Express has limited your password to 6–8 characters. Letters and numbers are required; I’m not sure if they allow non-ASCII characters. You don’t need to be a mathematician to understand that a brute-force attack on American Express passwords is easy, considering these requirements.
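To see why such a policy is weak, a quick back-of-the-envelope calculation helps. The alphabet size and guess rate below are my own assumptions for illustration, not American Express’s actual figures.

```python
# Rough search-space arithmetic for short alphanumeric passwords.
# Assumed alphabet: 26 case-insensitive letters + 10 digits = 36 symbols.
ALPHABET = 36

# Total number of passwords of length 6 through 8.
total = sum(ALPHABET ** n for n in range(6, 9))
print(f"{total:,} possible passwords")  # 2,901,650,853,888

# At an assumed billion guesses per second (plausible for offline
# attacks against stolen hashes), the whole space falls quickly.
hours = total / 1e9 / 3600
print(f"exhausted in about {hours:.1f} hours")  # about 0.8 hours
```

Under these assumptions, the entire 6–8 character alphanumeric space is under three trillion combinations and can be exhausted in well under a day, which is why longer passwords and unrestricted character sets matter.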



You’ve been Hacked!

How do you know that your service has not been hacked or has not leaked customer details? Every month there are news stories about information that has been lost and systems taken down by hackers. I promise you that we’re only seeing the tip of the iceberg in this regard. Do you really think hackers will tell anyone that they’ve gained access to your information?

Spotify was recently hacked, and they published a letter to all their subscribers. Luckily for us users, they follow best practices and did not store passwords in clear text, only as cryptographic hashes. This minimized the consequences of Spotify being hacked. There are today more than a million users on Spotify; consider the consequences if they hadn’t done security properly.

If you uncover a service that has a potential to leak any personal information, please inform those in charge and make sure they change their practices. I do it all the time, and it does make a difference.

That’s it. Make sure you follow best practices regarding your passwords.


Copyright disclaimer: “Passwords are like pants” photo by Richard Parmiter, licensed under Creative Commons. Photo of Tom Cruise by banky177, licensed under Creative Commons.