Thursday, December 16, 2010

Travels through Europe with Windows Azure

I've been traveling through Europe this week spreading the good word about Windows Azure and have had a great time (some photos below). Although I've been to Europe many times, it had been almost ten years since my last trip and I was going alone, so I planned it carefully.

My schedule was tight: immediately after delivering Azure training in Los Angeles I was off to the airport to catch a flight to Germany. The weather reports for Europe had not been encouraging: blizzard-level snow storms, roads and borders closed, hundreds of flights cancelled, thousands of passengers stranded, and (my favorite) a shortage of de-icing fluid for aircraft. Not sure what to expect on the other side, I boarded my Lufthansa flight.

Service on board the Airbus A340 was terrific, though the flight attendants at first kept assuming I spoke German because I look German (my German, however, is limited to the dining menu!). If there was bad weather the pilot handled it so well no one noticed. My business class seat was so full of features it would not have looked out of place in a science-fiction movie, and it came with a 10-page, full-color user guide. It had power, a table, sophisticated positioning controls, a reading lamp, an eyeglass holder, a shoe holder, a fantastic video system, and it converted into a full bed (perhaps the most important feature). My only disappointment was that the AC power was limited to 75 watts, not enough for my laptop, which is admittedly the largest laptop in the world. I'd hoped to work on my Azure book but was limited to just a few hours of battery life. Next time I'll bring a small netbook with long battery life.

Looking down on the blanket of clouds from 39,000 feet reminded me of how global and all-encompassing cloud computing is. The plane had a monitor showing a visual display of our flight path, and in our approach to Europe I was delighted to see we flew directly over Dublin, Ireland and then Amsterdam, Netherlands—home of 2 Windows Azure data centers! I waved…

Arriving at Munich airport, I wondered whether my recently acquired Windows Phone would work in Europe. Alas, my phone refused to recognize a signal. My backup plan was to buy a local pre-paid phone for the week at the airport, but to my surprise none was available. So I went through the week without phone service, relying on email when I was online in hotel rooms or on-site at training centers.

Munich is the third largest city in Germany and one of the most influential. It is also the capital of the state of Bavaria. If you recall the 1968 film "Chitty Chitty Bang Bang", the castle in the movie is an actual castle in Bavaria. I have a personal connection to the area as my grandfather was from Bavaria.

I stayed at the Sheraton Arabellapark Hotel, which I really enjoyed. Although I didn't have time to check it out, the hotel is well-known for having a pool and spa up on the 22nd floor. The 66 Grill Restaurant in the hotel was out of this world: I enjoyed every meal. If you're ever staying in the Arabellapark area of Munich this is a good place to stay, especially if you are a foodie like me. The lavish breakfast buffet, included with your room, offers delicious breads, pastries, eggs, sausages, cheeses, fruits, and of course great coffee. I probably did in my diet with all the eating, but everything was too good to pass up. Lunch and dinner were equally tasty. I particularly enjoyed the cream of pumpkin soup with curry, local sausages and mustards, and various veal and beef dishes. If you haven't experienced them, it is difficult to convey how good even basic things like the bread and coffee from this area are, not to mention specialty dishes like goulash and spaetzle--don't miss them if you ever have the opportunity.

The trip to Germany would not have been complete without apple strudel, which I found at Paulaner's Wirtshaus and Biergarten. This is in the Westin Grand, a very high-class-looking hotel across the street from the Sheraton.


Apple strudel to die for

The Windows Azure training class I taught had close to 20 people. In introducing myself I shared my personal connection to the area and apologized for the language barrier and "for being an ignorant foreigner" (this drew laughter).


Introducing myself as an ignorant foreigner

The attendees were very friendly and we enjoyed ourselves as you can see from this picture—see how happy Windows Azure makes you? Most of the attendees were new to Windows Azure but we had an expert in attendance, fellow Windows Azure MVP Christian Weyer, who was very helpful. This area of Germany is known for its hospitality and I certainly felt it!


Munich Azure training

This was an expensive trip: the dollar isn't worth much right now when exchanging currency. My $600 became only 400 euros. I made sure to use credit cards for everything including taxis (no problem as long as you confirm this before starting your trip). Speaking of taxis, I had a fun experience on my first day trying to get to the Munich training center from my hotel. Getting into the cab, I showed my driver the address, telling him "I think this is nearby". He looked at the address and told me to get out of the cab--because the location was right around the corner. Ignorant foreigner, indeed!

After completing the event in Germany I flew mid-week to Amsterdam for a second training session in the Netherlands (also known as Holland). Whereas it had been snowy and cold in Munich, I was surprised to land in sunny, almost balmy conditions in the Netherlands. I was cheered to see many windmills (including some modern ones) on my taxi ride to the hotel. The nice weather didn't last long though: cold, rain, and snow were soon upon us. The snow was amazingly thick: it looked more like feathers than snowflakes.


Heavy snow beginning to fall in Nieuwegein, Netherlands

The Netherlands Azure class, held in the city of Nieuwegein, also went very well, with close to 20 in attendance. A number of the attendees had traveled from other countries including Belgium, Portugal, and Austria. You can see from these photos that this group was also friendly, fun-loving, and fond of Windows Azure. Among my new friends is ASP.NET MVP Kris van der Mast.




Netherlands Azure training

Since I live in California, I naturally host many of my Windows Azure demos in the San Antonio, Texas data center and I'd been a little worried about performance and reliability accessing them from Europe. I was pleased to see everything worked great and performed well. I also needed to take care of a production problem with one of my Windows Azure applications while in Europe, and my being abroad presented no problems: when your applications are in the cloud, troubleshooting problems and updating deployments is equally easy no matter where you happen to be.

In Europe they have these really cool, fully automatic coffee machines everywhere that can make just about any kind of coffee drink you want. Just punch in the drink--espresso? cappuccino? something else?--and additives such as cream or sugar, and it is instantly prepared for you. Many of these are vending-machine size and serve more than one person at a time. If you're used to frequenting Starbucks like I am, these machines--often free to use at a business location--are addictive. There was even one at the Munich airport that was free to passengers.



Just about everyone I met in Europe spoke English so I had no trouble getting around and communicating. I really enjoyed my visit and the opportunity to make new friends and get them started on cloud computing with Windows Azure. However, I’ll also be very glad to come home this weekend—the family and I have been missing each other and it’s almost Christmas!

Tuesday, November 30, 2010

New Windows Azure Features Webcasts

Last month at PDC2010 Microsoft announced the imminent availability of many exciting new features and services, some long awaited. These capabilities have just come online for use this week--some as released features and some as community previews you have to sign up for. This includes a completely new management portal and an updated SDK, so there's a lot to get used to.

I'll be covering (and demoing) the new features in a 3-part webcast series, the first of which is this Friday 12/3. Below are the dates, topics covered, and registration links.

What's New from PDC, Part 1 - Windows Azure
Friday 12/03/10
https://www.clicktoattend.com/invitation.aspx?code=152107

What's New from PDC, Part 2 - SQL Azure and Windows Azure VM Role
Wednesday 12/15/2010
https://www.clicktoattend.com/invitation.aspx?code=152108

What's New from PDC, Part 3 - AppFabric and Windows Azure Connect
Tuesday 01/11/2011
https://www.clicktoattend.com/invitation.aspx?code=152109

Hope you can join us!

Monday, November 15, 2010

AzureDesignPatterns.com Re-Launched

AzureDesignPatterns.com has been re-launched after a major overhaul. This site catalogues the design patterns of the Windows Azure Platform. These patterns will be covered in extended detail in my upcoming book, The Azure Handbook.



This site was originally created back in 2008 to catalog the design patterns for the just-announced Windows Azure platform. An overhaul has been long overdue: Azure has certainly come a long way since then and now contains many more features and services--and accordingly many more patterns. Originally there were about a dozen primitive patterns; now there are over 70 catalogued. There are additional patterns to add, but I believe this initial effort decently covers the platform, including the new features announced at PDC 2010.

The first category of patterns is Compute Patterns. This includes the Windows Azure Compute Service (Web Role, Worker Role, etc.) and the new AppFabric Cache Service.



The second category of patterns is Storage Patterns. This includes the Windows Azure Storage Service (Blobs, Queues, Tables) and the Content Delivery Network.



The third category of patterns is Communication Patterns. This covers the Windows Azure AppFabric Service Bus.



The fourth category of patterns is Security Patterns. This covers the Windows Azure AppFabric Access Control Service. More patterns certainly need to be added in this area and will be over time.



The fifth category of patterns is Relational Data Patterns. This covers the SQL Azure Database Service, the new SQL Azure Reporting Service, and the DataMarket Service (formerly called Project Dallas).



The sixth category of patterns is Network Patterns. This covers the new Windows Azure Connect virtual networking feature (formerly called Project Sydney).



The original site also contained an Application Patterns section which described composite patterns created out of the primitive patterns. These are coming in the next installment.

I’d very much like to hear feedback on the pattern catalog. Are key patterns missing? Are the pattern names and descriptions and icons clear? Is the organization easy to navigate? Let me know your thoughts.

Webcast: Microsoft Cloud Computing Assessments: Determining TCO and ROI

On Tuesday 11/16/10 my Neudesic colleague Rinat Shagisultanov will deliver Part 2 in our webcast series on cloud computing assessments, this time dealing with how to make the business case by computing TCO and ROI.

Assessments: Determining TCO and ROI

Event Type: Webcast - Pacific Time
Event Start Date: 11/16/2010 10:00 AM
Event End Date: 11/16/2010 11:00 AM
Presenter: Rinat Shagisultanov, Principal Consultant II, Neudesic
Registration: https://www.clicktoattend.com/invitation.aspx?code=151052

Cloud computing can benefit the bottom line of nearly any company, but how do you determine the specific ROI for your applications? In this webcast you'll see how to compute the Total Cost of Ownership (TCO) for your on-premise applications and estimate what the TCO in the cloud will be to gauge your savings. You'll see how your Return on Investment (ROI) can be calculated by considering TCO, migration costs, and application lifetime. Knowing the ROI helps you make informed decisions about risk vs. reward and which opportunities will bring you the greatest value.
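To make the arithmetic concrete, here is a minimal sketch of the TCO and ROI comparison described above, in Python. All figures and the cost breakdown are hypothetical placeholders for illustration, not Neudesic's actual assessment model:

```python
# Hypothetical figures for illustration only -- plug in your own numbers.

def tco(hardware, licenses, ops_per_year, years):
    """Total cost of ownership over an application's lifetime."""
    return hardware + licenses + ops_per_year * years

def roi(on_prem_tco, cloud_tco, migration_cost):
    """ROI of migrating: savings net of migration cost, as a fraction
    of what you spent to get there."""
    savings = on_prem_tco - cloud_tco
    return (savings - migration_cost) / migration_cost

# On-premise: $40k hardware, $10k licenses, $25k/yr operations, 4-year lifetime
on_prem = tco(40_000, 10_000, 25_000, 4)   # 150,000
# Cloud: no hardware or license outlay, $18k/yr consumption-based bill
cloud = tco(0, 0, 18_000, 4)               # 72,000

print(roi(on_prem, cloud, migration_cost=30_000))  # 1.6, i.e. 160% ROI
```

The same skeleton extends naturally to risk adjustments and shorter or longer application lifetimes, which is where the interesting assessment discussions tend to happen.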

Sunday, October 31, 2010

First Impressions Moving from an iPhone to Windows Phone 7

Along with the other attendees of Microsoft’s Professional Developer Conference last week I received a Windows Phone, an LG E900 to be precise. It’s now been several days since I switched over from the iPhone I’d been using for the last couple of years to a Windows Phone and I thought I would share my first impressions. I’ll tell you up-front, I really like it.



iPhone Experience

First off, why was I on an iPhone to begin with? When I was ready for a new phone a few years back there really wasn't anything else in the iPhone's class of user experience and innovation, so it was a simple decision. Moreover, coverage where I live is poor for many of the main carriers but AT&T comes in strong, which was another point in the iPhone's favor.

I've enjoyed using my iPhone, but I became disenchanted with Apple earlier this year when their version 4 Phone OS came out. The iTunes sync tool offered to upgrade my iPhone 3 to the version 4 OS so I accepted--and boy, was I sorry! The phone became a near-useless piece of junk, with sporadic fits of slowness or outright unresponsiveness. It took a lot of research to find the right painstaking process to get back to the original OS. Apple was no help whatsoever, all but denying there was a problem. They also made some less than encouraging admissions, such as this one confessing that the algorithm for displaying bars of signal strength had been misleading. Moreover, owners of the iPhone 4 seemed to be having a lot of pain. I wasn't sure I wanted to stay with Apple, which put me in the mood to look around at other options. I knew Windows Phone was coming, and of course Android has been making quite a splash. I was waiting for something to push me in a particular direction.

The LG E900

That push came at PDC when attendees were given Windows Phones. The evening after receiving it I unwrapped the phone from its box with anticipation--noting with some surprise the packaging's claim that this phone was intended for use in Europe. I presume that's just an artifact of this being a giveaway. The "European" claim was reinforced by a Frankenstein's monster of an adapter, and by the fact that I wasn't able to find an online user guide except on LG's UK site. Things I can overlook in a free phone.



I immediately noticed the form factor difference—the LG E900 is longer and thinner than an iPhone and also noticeably heavier which gave me some concern at first; however I’ve since found the weight doesn’t bother me. I have to say, there wasn’t much in the way of instructions. The online user manual was of some help, once I tracked it down, but apparently you’re supposed to just figure out most things by exploring. For the most part this worked but it was a little frustrating not knowing how to do some basic things at first.

Making the Switch

I’m not a phone expert, and wondered to myself what I would have to do to get service on the phone. Then it dawned on me that perhaps I could just move the SIM card from my iPhone over to the Windows Phone. Would that work? I decided to find out.

An hour later, I still hadn't gotten the SIM card out of my iPhone. That's because it sits behind a door that isn't easy to open unless you have a special tool or a paper clip. While this sounds simple, good luck doing it if you don't have a paper clip handy: I tried a ballpoint pen, a plastic toothpick, a twist tie I scraped the insulation off of, and anything else I could find in my hotel room. After many tries with no success, I did what I should have done originally and went down to the hotel front desk to ask for a paper clip. Five minutes later I had my SIM card out. It wasn't obvious which way it should be oriented in the LG phone, but I eventually figured it out through trial and error. A phone call and a text to my wife confirmed my phone service was working.

As I started to play with the phone I realized that my voice and text service was functional but not my data service: I could not browse the web or get email working. Fortunately I stumbled on this post, which explains the setup steps to get your data working if you move from an iPhone to an LG E900 Windows Phone. After I'd followed these instructions to define an "APN", data worked like a charm.

The User Interface

I hadn’t expected to like the UI of the Windows Phone all that much, for several reasons. First, it’s hard not to admire the iPhone’s user interface—the layout, size, and appearance of application icons as well as the design of many of the screens just feels right. Secondly, I’d seen the Windows Phone UI in conferences, online videos and television ads. I knew Microsoft had to strike out in a unique direction but it didn’t look all that appealing to me.

However, actually using one is a different story. The interface really works well. Three buttons on the bottom of the phone let you navigate back, go to the Start page, or search contextually within whatever app you're in. There's a cool feeling you get sliding and flipping around that you can't appreciate when watching someone else do it. Those television ads that say we need "a phone to save us from our phones" are right on the money: you can get in and do your work really quickly on a Windows Phone. I like it.



I will say I wasn't thrilled with the default red theme on my Start screen, which I promptly changed to an appealing blue. Also, whatever theme color you choose (there are ten to choose from, shown below), that color dominates the user interface: most of your Start screen tiles will be that color. In my opinion the UI could benefit from a two-color theme over a single-color theme. Speaking of the Start screen, those tiles are "live tiles" that contain data, and some of them animate. This lets you know at a glance whether you have voice mail, text messages, email, new apps in the marketplace, and so on.

Email

Setting up Outlook for my corporate mail could not have gone easier. Hotmail, on the other hand, was a different story. It wouldn't sync: after a few seconds it would give a mysterious error code and a message that it was having trouble connecting to the Hotmail service. The fix turned out to be changing the mail account definition's sync setting from All Messages to Last 3 Days. This was discovered after a lot of trial and error: the error message certainly didn't give a hint about it, nor is it clear to this day why All Messages doesn't work (for me, anyway).

A few things took me some time to figure out. I couldn’t find a battery indicator at first; this turns out to be on the cover screen that you normally slide away right off the bat when you go to use the phone. Vibrate/ring also eluded me until I realized hitting the volume controls pops up an area for setting that.

Video and Audio

The video and audio quality are quite good. Streaming a movie from Netflix over a wifi connection was excellent, though rather blocky and dissatisfying without wifi. I watched Iron Man on the phone via Netflix and the picture was truly outstanding.

There's a 5 megapixel camera on the LG E900. It's terrific: it takes amazingly good photos and video.

Office

You get Office on the Windows Phone which is pretty cool. On my flight from Seattle back to Southern California I used Word on the phone to write a document and it was usable. It’s a little slow for me because I’m still getting used to this phone’s keyboard but that’s temporary.

App Store and Zune

The App Store experience is good and similar to the iPhone's App Store. The number one phone app I use is email, and that's already there. Other apps I use that are already available for Windows Phone include Netflix and Twitter. Another app I like to use on the iPhone is Kindle: fortunately, at PDC it was announced that Kindle is coming soon for Windows Phone, and it was demonstrated in the keynote talk. Another favorite iPhone app of mine is Scrabble; we'll have to see if that becomes available on Windows Phone. Of course there aren't as many apps yet for Windows Phone as for longer-established phones; we developers need to get busy creating great Windows Phone apps.

Just as iTunes is both the music/video store and sync tool for iPhone, Zune serves that purpose for Windows Phone. Once I installed Zune on my PC and connected the phone to it, I was able to download music (on a 14-day trial) and move music and pictures over to the phone.

Development

Another reason to get excited about Windows Phone is that you can program for it in Silverlight, which is a technology I use quite often. Moreover, Windows Phone and Windows Azure cloud computing go very well together. At PDC, Steve Marx gave a really great session on joint phone-and-cloud development which I highly recommend. There’s definitely some phone+cloud development in my future.

Conclusions

It’s not fair to compare a phone you’ve been using for a couple of days against one you’ve been using for a couple of years, but these are my first impressions. So far, I haven’t looked back!

Wednesday, October 27, 2010

Azure Storage Explorer 4 Beta Refresh

Last week we released Azure Storage Explorer 4. We have released several refreshes since then, in order to respond to feedback from the first several hundred downloaders. Some of these refreshes are important because they fix a bug or add a valuable feature. If you get the latest refresh (Beta 1 Refresh 4 or later), Azure Storage Explorer 4 will now notify you when there is a new version so you won't have to remember to check back on the CodePlex site for updates. If you're on Azure Storage Explorer 4, please update to the refresh so that you'll have the best experience.

Here are some of the features added in the refreshes:

• Blob uploads now automatically set ContentType based on file type
• CSV download/upload now preserves column types
• New download/upload formats supported: Plain XML and AtomPub XML
• UI improvements
• Private/public folder icons indicate whether blob containers are public or not
• Automatically checks for a new software version
• Checks for and corrects blob containers with old (outdated) permissions attributes and updates to current Azure standard
• Preserves window position and size between sessions
• Allows culture to be set
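The ContentType feature in the first bullet boils down to inferring a MIME type from the file name at upload time. Azure Storage Explorer itself is a .NET application, but the idea can be sketched in a few lines of Python using the standard mimetypes module (an illustration of the technique, not the tool's actual code):

```python
import mimetypes

def content_type_for(filename, default="application/octet-stream"):
    """Guess the ContentType to set on an uploaded blob from its file name."""
    guessed, _encoding = mimetypes.guess_type(filename)
    return guessed or default

print(content_type_for("photo.jpg"))         # image/jpeg
print(content_type_for("notes.unknownext"))  # application/octet-stream
```

Setting ContentType correctly matters because browsers use it to decide whether to display a blob inline (an image, say) or prompt for a download.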

Friday, October 22, 2010

Azure Storage Explorer 4 Beta Now Available

I'm pleased to announce the public beta of Azure Storage Explorer version 4 is now available. Beta 1 can be downloaded from CodePlex.

Azure Storage Explorer allows you to view and edit all 3 types of cloud storage: blobs, queues, and tables. If you're not already familiar with it, Azure Storage Explorer was one of the first (if not the first) GUI tools for viewing and working with Windows Azure storage. This utility was written in the very early days of Windows Azure, and its first 3 major versions all pre-dated the commercial release of Windows Azure in early 2010. Altogether there have been over 13,000 downloads.

It's been a year since version 3 was published, and in that time the Windows Azure platform has moved forward at a rapid pace. Many users have been hungry for an update that supports newer features such as blob root containers and better handles nuances such as blob prefix paths and property editing.

Highlights of version 4

• Better code. Versions 1-3 of Azure Storage Explorer didn't have the .NET StorageClient library to use and were based on an SDK sample with a voluminous number of classes, leading to code that was vast and complex. Version 4 uses the .NET StorageClient library and the code is compact and well-organized. The source code is open and is part of the CodePlex project.
• Newer storage feature support. Support has been added for newer features such as blob root containers, blob path prefixes, and page blobs.
• Copy and rename containers, queues, and tables.
• Direct data entry and editing of blobs, messages, and entities.
• Improved UI. The new WPF-based UI is cleaner and supports opening multiple storage accounts at the same time in tab views. The Model-View-ViewModel pattern is used.

Containers & Blobs

• Create, View, Copy, Rename, Delete Containers
• Create, View, Copy, Rename, Delete, Upload, Download Blobs



Blobs can be viewed as images, video, or text. Blob properties can be viewed/edited.



Queues & Messages

• Create, View, Copy, Rename, Delete Queues
• Create, View, Pop, Clear, Upload, Download Messages



Message content and properties can be viewed.



Tables & Entities

• Create, View, Copy, Rename, Delete Tables
• Create, Edit, Copy, Rename, Delete, Upload, Download Entities



Entities can be viewed and edited.



Remember, it's a beta

With any beta software, you should exercise caution. Keep in mind that both people and programs can make mistakes, and it's always a good idea to keep safe backups of your data.

Azure Storage Explorer is a community donation of Neudesic. As with previous versions, Azure Storage Explorer remains free. Full source code is on CodePlex and we invite the community to help us keep it up to date and make improvements.

Wednesday, October 20, 2010

Upcoming Webcast: Cloud Computing Assessments, Part 1

Tomorrow (10/21) at 10 AM Pacific time I'll be giving the first in a series of webcasts on Microsoft Cloud Computing Assessments. Register here. These webcasts go hand in hand with my cloud computing assessment article series.

Microsoft Cloud Computing Assessments: The Right Way to Evaluate and Adopt Cloud Computing

Event Code: 151050
10/21/2010
10:00 AM - 11:00 AM
Welcome Time: 09:55 AM
Time Zone: Pacific
Event Language: Not Specified
Connection information for this Webcast will be sent in your event confirmation.
Registration: https://www.clicktoattend.com/invitation.aspx?code=151050

--------------------------------------------------------------------------------

Featured Product/Topic: Windows Azure platform

Recommended Audiences: Technology Executives, IT Managers, IT Professionals, Business Executives, CIO, CTO, IT Directors, Business Decision Maker, Technical Decision Makers, Developers

Cloud computing offers so much promise, but it is new and confusing to many. You may be wondering whether cloud computing is right for your business, what the financial return might be, and how to go about getting started with it without making a mistake. In this webcast you'll be introduced to an assessment process for Microsoft cloud computing that puts the value proposition of the cloud into sharp focus for your business. You'll see how an assessment sheds light on risk vs. reward, identifies promising opportunities, analyzes applications financially and technically, and helps you figure out your cloud computing strategy. With the clarity and plan that comes out of an assessment you will be able to evaluate and adopt cloud computing responsibly, maximizing the benefits while managing risk.

--------------------------------------------------------------------------------

Wednesday, October 13, 2010

Cloud Computing Assessments, Part 2: Envisioning Benefits & Risks

In Part 1 of this series we discussed the ideal rhythm for exploring cloud computing responsibly and the critical role a cloud computing assessment plays. An assessment allows you to get specific about what the cloud can mean to your company. Here in Part 2 we will consider assessment activities for envisioning, in which you take a look at where the cloud could take you.

Envisioning: Finding the Cloud’s Synergies with your Business

Cloud computing has so many value propositions that it’s almost problematic! When you hear general cloud messaging you’re exposed to many potential benefits that include cost reduction, faster time to market, and simplified IT among others. Different organizations care about different parts of the value proposition. In an assessment, we want to find out which benefits have strong synergy with your company—and focus on them.

A good exercise to evaluate where the value proposition resonates is to gather business and technical decision makers, provide some education on benefits, and have a discussion to see where there is interest. Your people may be enthusiastic about some of these benefits but neutral or even negative about others. Here are some of the benefits to consider:

Elasticity
cca_2_01

In the cloud you can change your footprint anytime, quickly and easily. Think of the cloud as a big rubber band.

No Commitment
cca_2_02

In the cloud you have easy entry and easy exit. You can stay in the cloud as long as you wish, but you can walk away any time, with no financial or legal commitments beyond your current month’s bill.

Reduced Cost
cca_2_03

In the cloud you are likely to see reduced costs, in some cases dramatically reduced costs. These reduced costs derive from the use of shared resources, the economy of scale in the cloud, and your ability to use and pay for resources only as long as you need them.

Consumption-based Pricing
cca_2_04

In the cloud you only pay for what you use, and you only use what you need.
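A quick sketch of what consumption-based pricing means for a monthly bill (the rates here are hypothetical placeholders; check the current Windows Azure pricing for real numbers):

```python
# Hypothetical rates for illustration; consult current Windows Azure pricing.
COMPUTE_PER_HOUR = 0.12       # per small instance-hour
STORAGE_PER_GB_MONTH = 0.15   # per GB stored per month

def monthly_bill(instances, hours, storage_gb):
    """Pay only for what you use: instance-hours plus storage actually held."""
    return instances * hours * COMPUTE_PER_HOUR + storage_gb * STORAGE_PER_GB_MONTH

# Two instances running all month (720 hours) plus 50 GB of storage
print(round(monthly_bill(2, 720, 50), 2))  # 180.3
```

Scale the instance count down to one for half the month and the compute portion of the bill drops with it, which is the whole point of the model.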

Extra Capacity
cca_2_05

In the cloud you can expand capacity whenever you need to, even if a surge in demand is sudden and unexpected. You have the comfort of knowing extra capacity is there for you, but you only pay for it when you actually need it.

Faster Time to Market
cca_2_06

In the cloud you can deploy new and updated applications very quickly. On Windows Azure for example you can deploy an application in 20 minutes or less.

Self-Service IT
cca_2_07

In the cloud some IT tasks become so simple anyone can do them. Company IT cultures differ on this, but for some companies the ability to let more individuals and departments directly control their own deployments and level of scale is attractive.

 

SLA
cca_2_08

In the cloud you have a Service Level Agreement that boils down to 3 9's (99.9%), or roughly 8.8 hours of unavailability in a year. For some companies and applications that's an improvement over their current SLA; for others it may be a downgrade.
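The downtime allowance implied by an SLA percentage is simple arithmetic, sketched here in Python:

```python
def max_downtime_hours_per_year(availability_pct):
    """Hours of allowed unavailability per year under an SLA percentage."""
    return (100.0 - availability_pct) / 100.0 * 365 * 24

print(round(max_downtime_hours_per_year(99.9), 2))   # 8.76  ("three nines")
print(round(max_downtime_hours_per_year(99.99), 2))  # 0.88  ("four nines")
```

Run the same calculation against your current data center's measured availability to see whether the cloud SLA is a step up or a step down for you.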

Simplify IT
cca_2_09

In the cloud certain IT tasks become very simple, just a click or two in a web portal. This includes provisioning, software deployment, and upgrades.

Management
cca_2_10

In the cloud you have automated management working on your behalf. In the case of Windows Azure, patches are applied to your servers automatically; server health is monitored; and your availability and data integrity are protected through managed redundancy.

Convert CapEx to OpEx
cca_2_11

In the cloud you do away with a lot of capital expenditures such as buying server hardware. This is replaced with operating expenditures, your pay-as-you-go monthly bill. For many companies this means a healthier balance sheet, but not all companies and managers see this as positive. Some people have easier access to capital budget than operating budget.

New Capabilities
cca_2_12

In the cloud you have new capabilities. For example, Windows Azure provides new capabilities for business-to-business communication and federated security. These new capabilities can allow you to innovate and realize a competitive edge. The cloud also enables some new business models such as Software-as-a-Service that you may have interest in.


It can be useful to have separate envisioning meetings with business and technical people; you'll likely find different audiences have different interests and concerns. For example, a CIO could be gung-ho about the cloud while the IT department below them is apprehensive.

Risks & Concerns

A benefits discussion must be complemented with a risk discussion. Anything new like cloud computing will naturally lead to concerns, real or imagined. Each concern needs to be mitigated to the stakeholders' satisfaction. Examples of concerns frequently raised are security, performance, availability, disaster recovery, vendor lock-in, in-house capability, and runaway billing.

Security

Security comes up in nearly all cloud discussions. Sometimes there will be a specific risk in mind, but often the concern is just a general expression of “I’m concerned about security in the cloud”. The first step toward confidence is understanding how seriously cloud providers take security: they invest massively in it. Next, start getting specific about areas of concern: only then can remedies be designed.

For specific security concerns, the consulting firm performing your assessment should have a knowledge base of commonly-raised concerns—such as data falling into the wrong hands—and standard mitigations for them. In addition, there should be a defined approach for threat modeling risks and planning defenses.

Security in the cloud is best viewed as a partnership between you and the cloud provider. There are certain things the cloud environment will do to protect you, and there are complementary things you can do yourself. An example of something you can do is encrypting all of the data you transmit and store. An assessment should capture your concerns and record the plan for dealing with them.

Performance & Availability

Since the cloud is a different environment from your enterprise, you can’t assume the dynamics are the same. You may find performance to be stellar, about the same, or disappointing depending on what you’re used to. An assessment should consider the performance requirements of applications and plan to validate them in a proof-of-concept.

Availability is more straightforward to predict because there is a published SLA, but the Internet path between the cloud computing data center and your users is outside the cloud provider’s control. If your users are in an area with poor or unreliable Internet service, availability expectations should be revised accordingly.

Vendor Lock-in

Some organizations have a fear of vendor lock-in: if you move something to the cloud, are you stuck there? There’s an interesting discussion to be had here. On the one hand, it’s perfectly possible to write applications that can run on-premise or in the cloud, preserving your ability to move back and forth. On the other hand, if you take advantage of new, only-in-the-cloud features such as Windows Azure AppFabric, you’ll lose some portability (but it may be worth doing so for the benefits). An assessment is an occasion to weigh these tensions and pick a lane.

Disaster Recovery

Cloud providers have many mechanisms to protect your data, such as redundancy, but much of this is automatic and neither visible nor controllable by you. You may require a level above this where you can for example make time-stamped snapshots of your data and be able to restore them on demand. An assessment should map out your DR requirements, including RTO & RPO, and determine how you and the cloud platform will collaborate to meet them.
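As a back-of-the-envelope aid, a snapshot-based plan can be checked against RTO and RPO targets. The figures below are illustrative, not drawn from any real platform:

```python
def meets_dr_targets(snapshot_interval_hours, restore_hours,
                     rpo_hours, rto_hours):
    """Check a snapshot-based DR plan against stated targets.

    With time-stamped snapshots on a fixed schedule, the worst-case
    data loss (RPO) is one full interval, and the recovery time (RTO)
    is bounded by how long a restore takes.
    """
    worst_case_data_loss = snapshot_interval_hours
    return worst_case_data_loss <= rpo_hours and restore_hours <= rto_hours

# Snapshots every 4 hours and a 2-hour restore, against RPO=6h / RTO=4h:
print(meets_dr_targets(4, 2, rpo_hours=6, rto_hours=4))  # True
# The same plan cannot meet an hourly-RPO requirement:
print(meets_dr_targets(4, 2, rpo_hours=1, rto_hours=4))  # False
```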

In-house Capability & Process

If you are going to adopt cloud computing your developers and IT department will need the appropriate skills. An assessment should include an analysis of where people skills are today and where they need to be for cloud computing adoption. It’s not only skills that need updating but process as well: the cloud will surely impact your development and deployment processes. Your cloud computing plans should budget for this training and process refinement.

Billing Concerns

Some find the “just like electricity” metering aspect of the cloud unnerving: what if your billing runs out of control? An assessment should identify procedures for measuring billing and monitoring applications proactively, identifying disturbing trends early so they can be investigated before large charges accrue. In the case of Windows Azure, for example, billing can be inspected daily and it’s not necessary to wait till the end of the month to learn how charges are trending.
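The daily-inspection idea can be mechanized with a simple run-rate projection; the daily charge figures below are invented for illustration:

```python
def projected_month_end(daily_charges, days_in_month=30):
    """Project the month-end bill from the charges observed so far.

    daily_charges: dollar amounts, one per elapsed day (hypothetical
    figures -- in practice these come from your provider's billing data).
    """
    run_rate = sum(daily_charges) / len(daily_charges)
    return run_rate * days_in_month

def billing_alert(daily_charges, budget, days_in_month=30):
    """Flag the bill early if the current run rate will blow the budget."""
    return projected_month_end(daily_charges, days_in_month) > budget

# Ten days in, spending is trending well above a $500 budget:
charges = [12, 14, 15, 18, 22, 25, 30, 34, 40, 45]
print(projected_month_end(charges))      # 765.0 projected
print(billing_alert(charges, budget=500))  # True: investigate now, not at month end
```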

Trust

By and large, trust is at the root of most cloud computing concerns. Trust is something that needs to be earned, and in cloud computing it can and should be earned in degrees. If you’ve had a good experience with a proof-of-concept in the cloud, that will bolster your confidence to put something in production in the cloud. Your assessment should produce a roadmap that promotes measured, increasing use of the cloud with validation that expectations were met at every step.

Alignment

Since the purpose of a cloud computing assessment is to find the fit for your organization, it’s very important to understand what is already going on in the company. Any cloud computing plans should align with this backdrop.

Business Alignment

Your company’s business plan likely has significant events on the calendar, for example launch of a new product line or service. Annual planning and budgeting are another example. The flow of business initiatives may suggest that certain cloud opportunities make sense sooner or later on the timeline.

IT Alignment

Your IT department is also likely to have events on the calendar that should be taken into consideration. Is a server refresh cycle scheduled? Consider that using the cloud might allow you to avoid or reduce buying that hardware. Are there plans to overhaul the data center? A cloud strategy might allow you to drastically alter the size and cost of your data center, using the cloud for overflow at peak times.

Envisioning Provides Business Context

Much of a cloud computing assessment will involve identifying and analyzing specific opportunities (applications), but this initial envisioning activity is important. It gives you the business context for your technical decisions. In envisioning you capture both the areas of traction and disconnect between cloud computing and your organization. This information will help you in forming your cloud computing strategy and it will color the suitability scoring of potential cloud opportunities. Timing of cloud initiatives should take business and IT initiatives into account.

In subsequent installments we’ll look at more activities that are performed in a cloud computing assessment. If you’d like to see how we do it at Neudesic, visit http://cloud-assessment.com/.

Sunday, October 3, 2010

Cloud Computing Assessments, Part 1: The Right Way to Adopt Cloud Computing

There is a right way and a wrong way to get involved with cloud computing. This article series is about doing it the right way and focuses on the use of cloud computing assessments to properly evaluate, plan for, and adopt cloud computing. Here in Part 1 we’ll be looking at the technology hype cycle, the best rhythm for adopting the cloud, and the critical role an assessment plays. In subsequent parts of the series we’ll look more deeply at various aspects of an assessment such as computing ROI, dealing with security, technical considerations, and the impact on IT. Since I work with the Windows Azure platform that’s where I’ll be focusing.

The Technology Hype Cycle

When major technology waves are unleashed, there’s a lot of buzz and also a lot of uncertainty. The well-known Gartner Hype Cycle explains why this is so. A new technology triggers many expectations, some of them unrealistic or not immediately attainable. As the market’s understanding gets sorted out, these inflated expectations give way to disillusionment, which is also exaggerated. As time passes and technology and best practices mature, enlightenment occurs and there’s a general understanding of what’s really possible and advisable. It’s at that point we have widespread, productive use of the technology.


Technology Hype Cycle


The hype cycle can be scary, and you might be thinking right now that the best thing to do is sit back and wait for cloud computing to mature. On the other hand, early adopters who leverage new technology sooner than others can gain a competitive edge. The hype cycle doesn’t mean you can’t benefit from a technology in its early years, but it’s essential to take some careful precautions to avoid getting burned.

In the case of cloud computing it’s particularly confusing for businesses to know what to do and when to do it. What’s real today, and what’s hype? Where are developments headed? Should you be doing something now or is it best to wait? There’s a real tension between the wish to join the party and realize the benefits and savings vs. the fear you might be jumping the gun or making a costly mistake.

Rhythm for Cloud Computing Adoption

Fortunately, there’s a way to move from uncertainty to certainty about cloud computing, and that’s to follow the rhythm shown below, which has 4 stages: awareness, assessment, experimentation, and adoption. The missing link between thinking about the cloud (the awareness phase) and using the cloud happily (the adoption phase) is the pair of inner activities: an assessment and an experiment.



Let’s look at each phase.

Phase 1: Awareness

In the awareness phase, you’re starting to learn about cloud computing and are forming an initial impression of it. You’re likely getting information from many sources which might include the media, discussions with peers and colleagues, webcasts, conferences, vendor presentations, and the like. You’re getting pummeled with information, partial information, and misinformation. There’s a large buzz, but everything you’re hearing is either generalized or is someone else’s experience.

You wonder, what would cloud computing mean for us specifically? When you ask that question, you are ready for the next phase, an assessment.

Phase 2: Assessment

A cloud computing assessment has one purpose, and that is to bring the cloud into focus for your organization. In the cloud computing assessments we practice at Neudesic we seek to answer these questions in an assessment:

• Can I believe the claims of cloud computing?
• What is Microsoft doing in the cloud?
• What are the benefits?
• Is the cloud a good fit for my business?
• Where are the opportunities and what ROI will they bring?
• How do I avoid risk?
• What does it cost?
• What belongs in the cloud and what doesn’t?
• When is the right time to engage?

An assessment is very much like having a suit tailored to fit you perfectly. We move from the general to the specific. You’ll exit the assessment with a clear understanding of how the cloud can benefit your company; a strategy that fits your business plans; a roadmap of opportunities; and a full view of risk/reward considerations. The roadmap your assessment produces will typically recommend a proof-of-concept and a prioritized timetable for cloud adoption. Some opportunities may make sense immediately but others may be more appropriate further out. Business and IT events on your calendar as well as upcoming cloud computing platform features will help determine the best timing for moving applications into the cloud.

Armed with the clarity and plan that comes out of an assessment, you are ready for the next phase, an experiment.

Phase 3: Experiment

Although an assessment plays an important role in planning for the cloud, there’s no substitute for some actual experience. A proof-of-concept experiment is recommended before you start adopting the cloud for production purposes. The experiment serves several purposes. First, it gives you an opportunity to test the claims of the cloud personally. In addition, the experience will either confirm the results of your assessment or cause you to revise your conclusions and cloud adoption plan.

Once you’ve concluded both an assessment and an experiment, you can proceed to cloud adoption with confidence.

Phase 4: Adoption

The final phase is actual adoption of the cloud. Your earlier assessment should have produced a roadmap for adoption, where some opportunities make sense in a “do them now” first wave and others in a potential second wave. After each migration to the cloud or new project in the cloud, you should reflect on the most recent experience and refine your cloud plans if warranted.

It’s important to set up monitoring and management of your production applications in the cloud, adjusting deployment size in response to changes in demand. Failure to do this could undermine the ROI you expect to get from the cloud.
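The adjustment loop can be as simple as a capacity rule evaluated against a demand metric. A minimal sketch with illustrative thresholds (a real policy would also smooth the metric to avoid thrashing):

```python
import math

def target_instances(queue_depth, per_instance_capacity, min_instances=2):
    """Size the deployment to the observed backlog, never dropping
    below the two instances the Windows Azure SLA requires."""
    needed = math.ceil(queue_depth / per_instance_capacity)
    return max(min_instances, needed)

print(target_instances(950, per_instance_capacity=200))  # 5: scale out
print(target_instances(100, per_instance_capacity=200))  # 2: the floor holds
```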

Now or Later?

Having said all this, what should your timing be for cloud computing? My belief is you won’t be in a position to really answer that question until you’ve had an assessment done. Regardless of your ultimate conclusions about where to use the cloud and when, there are some good reasons to take a serious look at cloud computing right now and put an initial plan in place. Since assessments are often free, there’s a lot to be gained by having one sooner than later. It should be clear after an assessment what opportunities there are for your business to leverage the cloud and what the best timing is. Getting an assessment now doesn’t mean you have to start adopting the cloud now.

What considerations should affect your timing decisions? There are several. One is the importance of getting in front of the cloud. Since cloud computing is a self-serve technology, it may come into your organization all by itself as one department or one individual starts to use it. It’s better to proactively have taken a look at cloud computing and have a strategy and guidance in place for acceptable use--and you can only do that if you start looking at it sooner than later.

Another key consideration for timing is your company’s culture of risk vs. reward. Using Gartner’s definitions, we recognize that not all companies balance risk vs. reward equally. There’s the Aggressive Innovator who values reward over risk and has gotten good at managing risk in order to get the brass ring. There’s the Pragmatic Adopter who looks at risk and reward equally. And there’s the Risk-Averse company who is hesitant to consider risks of any kind. If you’re in the first two categories you’re more likely to take an early look at cloud computing.

Cloud computing may offer you significant savings and agility, and if that’s the case the sooner you start using it the sooner you’ll realize the financial and institutional benefits. In particular you might find it useful to consider business and IT activities on the horizon and align your cloud plans with them. Are you launching a new product line or renewing budgets? Are you nearing a server refresh cycle or re-planning your data center? Aligning your cloud computing plans with existing business and IT plans will maximize opportunities and minimize disruption.

In subsequent installments we’ll look at the various activities that are performed in a cloud computing assessment. If you’d like to see how we do them at Neudesic, visit http://cloud-assessment.com.

Friday, October 1, 2010

Azure MVP

I'm very honored to have been selected as a Windows Azure MVP by Microsoft. I've been told I was the first selected, so I'm going to consider that a double honor.

To quote the MVP web site, "The Microsoft Most Valuable Professional (MVP) Award is an annual award given to outstanding members of Microsoft's technical communities based on contributions made during the previous 12 months to offline and online Microsoft-related technical communities." I'll take that to mean I'm doing something useful to help adoption of this great technology platform.

Well, enough tooting my own horn. I'm happy to have received this valuable credential, and I know I need to keep up participation to have it renewed next year.

Windows Azure certainly has come a long way since I got my first advance look at it 2 years ago. It's growing up rapidly, and I love working with it. The best is yet to come. Go Cloud!

Thursday, September 23, 2010

My Windows Azure Wish List – The Future Cloud I Hope to See by 2012

What will cloud computing be like in a couple of years? I got my first look at Windows Azure 2 years ago, and the rate of progress has been nothing short of amazing--and shows no sign of slowing down. Where should the cloud go from here? Here’s where I’d like to see it go over the next couple of years:

1. Auto-Sizing: Out-of-box Governance

Many people don’t seem to be aware that cloud computing brings with it a new management responsibility. A big selling point for the cloud is its elasticity and subsequent cost efficiency—but you only get that if you monitor activity and manage the size of your assets in the cloud. That is not by any means automatic today, so you must elect to do it yourself or through a third-party, either through automated means or human oversight.

We could debate whether this is the cloud provider’s responsibility or the customer’s, and in fact it needs to be a partnership between the two. Since this is something everyone needs to do, however, it seems fair to expect the cloud provider to more than meet us halfway. In the Future Cloud, I’d like to be able to easily set technical or financial thresholds and have the cloud monitor them for me—notifying me about changes and trends and taking action as per my marching orders.

We may get some of these capabilities as cloud integrations become available to operations monitoring software such as System Center—but that’s not a full realization of this idea. The modern start-up may run 100% in the cloud with no on-premise IT. Those companies need a completely in-cloud way to do governance.

Human beings shouldn’t have to babysit the cloud, at least not beyond an oversight/approval level of involvement. It should watch itself for us, and governance should be an out-of-box cloud service.

2. Auto Shut-off: App and Data Lifetime Management

I don’t know about you, but my house probably would have burned down long ago and my electric bills gone through the roof if it were not for the auto shut-off feature of many household appliances such as irons and coffee-makers. You only have to browse the forums to see the daily postings of people who are in shock because they left the faucet running or didn’t anticipate the so-called hidden costs of the cloud.

It’s human nature to be forgetful, and in the cloud forgetfulness costs you money. Every application put in the cloud starts a run of monthly charges that will continue perpetually until you step in and remove it someday. Every datum put in the cloud is in the same boat: ongoing charges until you remove it. It’s extremely unwise to do either without thinking about the endgame: when will this application need to come out of the cloud? What is the lifetime for this data? You might think you won’t forget about such things, but think about what it will be like when you are using the cloud regularly and have many applications and data stores online.

What we need to solve this problem is lifetime management for assets in the cloud. In the Future Cloud, I’d like to see lifetime policies you can specify up-front when putting applications and data into the cloud—with automated enforcement. You can imagine this including ‘keep it until I delete it’ and ‘keep until [time]’—similar to the options you get on your DVR at home. Auto delete could be dangerous, of course, so we will want more sophisticated options such as an ‘archive’ option, where we take something offline but don’t lose it altogether. Perhaps the best choice we could be given is a lease option, where the app or data’s expiration period gets renewed whenever they are used. This is how auto-shutoff works for many appliances: the shut-off timer gets reset whenever we use them, and only after a certain period of inactivity does deactivation take place.
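The lease idea sketches naturally in code. Nothing like this existed in Windows Azure at the time of writing; the class below is purely hypothetical:

```python
import time

class LeasedAsset:
    """Sketch of the 'lease' lifetime policy described above: every
    access renews the expiration, and an asset only becomes a
    candidate for archiving after a period of inactivity."""

    def __init__(self, name, lease_seconds):
        self.name = name
        self.lease_seconds = lease_seconds
        self.last_access = time.time()

    def touch(self):
        """Any use of the asset resets the shut-off timer."""
        self.last_access = time.time()

    def expired(self, now=None):
        now = time.time() if now is None else now
        return now - self.last_access > self.lease_seconds

asset = LeasedAsset("enrollment-portal", lease_seconds=90 * 24 * 3600)
print(asset.expired())  # False: just created
# Ninety-one idle days later it would be eligible for mothballing:
print(asset.expired(now=asset.last_access + 91 * 24 * 3600))  # True
```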

As with the previous wish list item, this is something everyone needs and is therefore a valid ask of cloud providers. Let us set lifetime policies for our apps and data when we put them in the cloud, and enforce them for us.

3. Mothballing & Auto-Activation: Dehydrate & Rehydrate Apps and Data

As described in the previous wish list item, an ideal implementation of lifetime management for applications and data would include decommissioning and archiving. That is, apps and data that become inactive should be mothballed automatically where they cost us far less than when they are formally deployed.

Along with mothballing comes the need for reactivation. Here I think we can take an idea from workflow technologies such as WF and BizTalk Server, where long-running workflows are dehydrated so that they do not consume finite resources such as threads and memory. They get persisted, and the workflow engine knows what events to look for in order to rehydrate them back into running, active entities.

In the Future Cloud, I’d like apps and data to be dehydrated when inactive and rehydrated when needed again—with greatly reduced costs during the inactive period. We can thus imagine an app that people start to use less and less, and eventually stop using altogether. An example of this might be a health care plan enrollment portal, only used once or twice a year. As the app moves to an inactive state, an expiration policy would cause the cloud to remove all of the server instances. However, the “light would be on”: a future access to the application would bring it back online. We can similarly imagine account data that moves into archive mode when inactive: kept around, but not at the premium rate.

The best realization of this concept would be that mothballed apps and data cost us nothing until they are re-activated. That might be a little unrealistic since the cloud provider is keeping the light on for us, but a mothballed asset should certainly cost a small fraction of an activated one.

4. Automatic Encryption

Most customers go through a period of considering risks and concerns (real or imagined) before they start using the cloud. A common concern that surfaces is the use of shared resources in the cloud and the specter of your critical data somehow falling into the wrong hands. The best way to feel okay about that is to encrypt all data transmitted and stored by your application. That way, if data does fall into the wrong hands—remote as that may be—it won’t be intelligible to them. In the Future Cloud, I’d like all data I store—database and non-database—to be automatically encrypted.

This is another example of something I believe we will all be doing: encryption of data will become a standard practice for all data we put into the cloud. As previously mentioned, whenever there is something everyone wants to do in the cloud it’s fair to ask the cloud provider to provide a service rather than each of us having to separately implement the capability. Naturally, the customer should remain in control of keys and strong encryption methods should be used.

5. Get Closer to True Consumption-based Pricing

Cloud computing has great appeal because of the consumption-based pricing model and the analogies we can make to electricity and other utilities. However, the implementation of that idea today leaves room for improvement. While we do have consumption-based pricing it’s very coarse-grained.

For example, let’s consider Windows Azure hosting. For each VM you allocate, you are reserving that ‘machine’ and are paying $0.12/hour or more for wall clock time. The actual usage of each VM has nothing to do with your charges. Is this really consumption-based pricing? Yes, but at a coarse level of granularity: you add or remove servers to match your load. Can we imagine something more ideal? Yes, charging for the machine hours used to service actual activity. This would work well in combination with an auto-sizing feature as previously discussed.

We can make the same observation about SQL Azure. Today, you buy a database bucket in a certain size, such as 1GB or 10GB or 50GB. Whether that database is full, half full, or even completely empty does not affect the price you pay. Is this really consumption-based pricing? Yes, but again at a very coarse level. We can imagine a future where the amount of database storage in use drives the price, and we don’t have to choose a size bucket at all.
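To make the gap concrete, here’s a back-of-the-envelope comparison using the small-instance rate quoted above; the utilization figure is made up:

```python
RATE_PER_VM_HOUR = 0.12  # the small-instance rate cited above

def reserved_cost(vm_count, hours):
    """Today's model: you pay wall-clock time for every reserved VM,
    busy or idle."""
    return vm_count * hours * RATE_PER_VM_HOUR

def usage_cost(busy_vm_hours):
    """The finer-grained model wished for here: pay only for
    machine-hours that serviced actual activity (hypothetical)."""
    return busy_vm_hours * RATE_PER_VM_HOUR

# Two reserved VMs for a 720-hour month, but only 25% utilized:
print(reserved_cost(2, 720))       # about $172.80 billed today
print(usage_cost(2 * 720 * 0.25))  # about $43.20 under true consumption pricing
```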

In the Future Cloud, I’d like to see more granular consumption-based pricing that more naturally lines up with usage and activities the way the customer thinks about them. It’s when the pricing model is at a distance from actual activity that surprises and disappointments come in using the cloud. We’ve already sold the ‘metering’ concept: now we need to give the customer the kind of meter they are expecting and can relate to.

6. Public-Private Portability: Doing Things the Same Way On-Prem or in the Cloud

I’m convinced many, many more businesses would be exploring the cloud right now if they could move portable workloads between cloud and on-premise effortlessly. Today, the cloud is a bit of a big step that requires you to change some things about your application. The cloud would be far more approachable if instead of that one big step, an enterprise could take several small, reversible steps.

In the Future Cloud, I’d like to be able to host things the same way in the cloud and on-premise so that I can effortlessly shuttle portable workloads between cloud and on-prem. Portable workloads would be huge. It doesn’t seem realistic that existing enterprise apps are going to just work in the cloud unchanged, because they weren’t designed to take advantage of a cloud environment. What does seem realistic is that you can update your apps to work “the cloud way” but be able to host identical VMs locally or in the cloud, giving you the ability to change your workload split anytime. The advent of private cloud will play a big role in making this possible.

7. Hybrid Clouds: Joining My Network to the Cloud

Today, on-premise and in-cloud are two very separate places separated by big walls. IT assets are either “over here” or “over there”, and special activities are needed to move applications, data, or messages between them. This makes certain scenarios a poor fit for the cloud today. Consider what I call the “Molar” pattern: an application with so many internal integrations that its deep roots make it impractical to extract out of the enterprise and move into the cloud.

In the Future Cloud, I’d like to be able to bridge parts of my local network to my assets in the cloud. The picture of what makes sense in the cloud changes radically if we can make connections between the cloud and our local network. That molar pattern, for example, might now be a suitable thing for the cloud because the in-cloud application now has a direct way to get to the internal systems it needs to talk to.

We know this is coming for Windows Azure. “Project Sydney”, announced at PDC 2009, will provide us with a gateway between our local networks and our assets in the cloud. What we can expect from this is that in addition to the “first wave” of applications that make sense in the cloud now, there will be a second wave.

8. Effortless Data Movement

Moving data to and from the cloud is not particularly hard—if it’s small, and of the type where you have a convenient tool at hand. When working with large amounts of data, your options are reduced and you may find yourself doing a lot of manual work or even creating your own tools out of necessity.

It’s not just moving data into the cloud and out that’s at issue: you may want to copy or move data between projects in the data center; or you may want to copy or move data to a different data center. In the Future Cloud, I’d like to be able to easily move data between on-premise and cloud data centers around the world, regardless of the amount of data.

9. A Simpler Pricing Model

If you look at Azure ROI Calculators and TCO tools, you’ll see that there are many dimensions to the pricing model. As we continue to get more and more services in the cloud, these will only increase. Although there’s something to be said for the transparency of separately accounting for bandwidth, storage, and so on, it certainly puts a burden on customers to estimate their costs correctly. It’s very easy to get the wrong idea about costs by overlooking even one dimension of the pricing model. In the Future Cloud, I’d like to see a simpler, more approachable pricing model. This might mean a less itemized version of the pricing model where you consume at a simple rate, with the ability to reduce your costs slightly if you are willing to go the itemized route. This would be similar to tax returns, where you can choose between easy and itemized forms.

10. Provide SaaS Services

Software-as-a-Service providers are ISVs who face a common set of challenges: they need to provide multi-tenancy and engineer their solutions in a way that protects tenants well. This includes protection and isolation of data, and may involve customer-controlled encryption keys. SaaS providers also have to deal with provisioning of new accounts, which they would like to be as automated as possible. Change management is another consideration, where there is a tension between the ability to provide customizations and the use of a common deployment to serve all customers.

In the Future Cloud, I’d like to see services and a framework for SaaS functionality. Microsoft themselves are solving this for SaaS offerings such as SharePoint Online and CRM Online. Why not offer provisioning, multi-tenancy, and data isolation services for SaaS ISVs as a general cloud service?

11. E-Commerce Services in the Cloud

In line with the BizSpark program and other Microsoft initiatives to support emerging business, e-commerce services in the cloud would be highly useful. A cloud-based shopping cart and payment service would be an excellent beginning, best implemented perhaps in conjunction with a well-known payment service such as PayPal. For more established businesses, we could imagine a deeper set of services that might include common ERP and commerce engine features. In the Future Cloud, I’d like to see shopping, payment, and commerce services.

12. Basic IT Services in the Cloud

It may be unrealistic to expect enterprises will put everything they have in the cloud, but start-ups are another matter altogether. For many start-ups, all of their IT will be in the cloud. They won’t have any local IT assets whatsoever beyond laptops. That means the basics, such as email, conferencing, Active Directory, domain management, and backup/restore will need to be in the cloud. We have a start on that today with Exchange Online, Office Communications Online, and Live Meeting in BPOS, but more is needed to complete the picture. In the Future Cloud, I’d like to see basic IT services provided by the cloud to support the fully-in-the-cloud customer.

Well, there’s my wish list. What do you think needs to be in the future cloud? Send me your comments.

Friday, September 17, 2010

Stupid Cloud Tricks #1: Hosting a Web Site Completely from Windows Azure Storage

Can you host a web site in Windows Azure without using Windows Azure Compute? Sure you can: you can ‘host’ an entire web site in Windows Azure Storage, 100% of it, if the web site is static. I myself am currently running several web sites using this approach. Whether this is a good idea is a separate discussion. Welcome to “Stupid Cloud Tricks” #1. Articles in this series will share interesting things you can do with the Windows Azure cloud that may be non-obvious and whose value may range from “stupid” to “insightful” depending on the context in which you use them.

If you host a web site in Windows Azure the standard way, you’re making use of Compute Services to host a web role that runs on a server farm of VM instances. It’s not uncommon in this scenario to also make use of Windows Azure blob storage to hold your web site assets such as images or videos. The reason you’re able to do this is that blob storage containers can be marked public or private, and public blobs are accessible as Internet URLs. You can thus have HTML <IMG> tags or Silverlight <Image> tags in your application that reference images in blob storage by specifying their public URLs.

Let’s imagine we put all of the files making up a web site in blob storage, not just media files. The fact that Windows Azure Storage is able to serve up blob content means there is inherent web serving in Windows Azure Storage, and this in turn means you can put your entire web site there, provided it’s the right kind: a static or generated site that serves up content but doesn’t require server-side logic. You can, however, make use of browser-side logic with JavaScript, Ajax, or Silverlight.
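One practical wrinkle when you upload a whole site to blob storage is that each blob needs to be stored with the right Content-Type header, or browsers will treat your HTML pages as downloads instead of rendering them. Here’s a sketch of that mapping step in Python using only the standard library; content_types_for_site is my own illustrative helper name, and the actual upload of each blob (for example via the Azure Storage REST API) is not shown:

```python
# Map each file in a static site to the Content-Type header it needs
# when uploaded as a blob. Blobs default to application/octet-stream,
# which would make browsers download HTML pages rather than render them.
import mimetypes
import os

def content_types_for_site(root):
    """Walk a site directory and return {relative_blob_name: content_type}."""
    result = {}
    for dirpath, _dirs, files in os.walk(root):
        for name in files:
            path = os.path.join(dirpath, name)
            # Blob names use forward slashes and double as URL paths:
            #   http://<account>.blob.core.windows.net/<container>/<rel>
            rel = os.path.relpath(path, root).replace(os.sep, "/")
            ctype, _encoding = mimetypes.guess_type(name)
            # Fall back to a generic type for unknown extensions.
            result[rel] = ctype or "application/octet-stream"
    return result
```

With the mapping in hand, the upload loop simply sets each blob’s content type to the mapped value when it stores the file.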

How does ‘hosting’ a static web site out of Windows Azure Storage compare to hosting it through Windows Azure Compute?
  • With the standard Windows Azure Compute approach, a single VM instance of the smallest size at $0.12/hour will cost you about $88/month, and you need at least 2 instances if you want the three-nines (99.9%) SLA. In addition you’ll pay storage fees for the media files you keep in Windows Azure storage, as well as bandwidth fees.
  • If you put your entire site in Windows Azure storage, you avoid the Compute Services charge altogether, but you will now have more storage to pay for. As a reminder, storage charges include a fee for the amount of storage at $0.15/GB/month as well as a transaction fee of $0.01 per 10,000 transactions. Bandwidth charges also apply but should be about the same in either scenario.
So which costs more? It depends on the size of your web site files. In the Compute Services scenario the biggest chunk of your bill is likely the hosting charge, which is a fixed cost. In the storage-hosted scenario you convert that part of your bill into a storage charge, which is not fixed: it’s based on how much storage you use. It’s thus possible for your storage-hosted web site charges to be higher or lower than the Compute-hosted approach depending on the size of the site, but in most cases the storage scenario is going to cost less than the Compute Services scenario.
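To make the comparison concrete, here’s a back-of-the-envelope sketch in Python. The prices ($0.12/hour per small instance, $0.15/GB/month, $0.01 per 10,000 transactions) are the figures quoted above; the 730-hour month and the sample site size and request volume are illustrative assumptions of mine:

```python
# Back-of-the-envelope monthly cost comparison using the 2010 prices
# quoted above. HOURS_PER_MONTH and the sample workload are assumptions.
HOURS_PER_MONTH = 730

def compute_hosted_cost(instances=1, rate_per_hour=0.12):
    """Fixed cost of running VM instances (excludes storage and bandwidth)."""
    return instances * rate_per_hour * HOURS_PER_MONTH

def storage_hosted_cost(site_gb, requests, gb_rate=0.15, txn_rate=0.01):
    """Variable cost of serving the site straight from blob storage."""
    return site_gb * gb_rate + (requests / 10_000) * txn_rate

# One small instance runs about $88/month before storage fees...
assert round(compute_hosted_cost()) == 88

# ...while a 2 GB static site taking a million hits a month
# costs about $1.30 in storage and transaction fees.
small_site = storage_hosted_cost(site_gb=2, requests=1_000_000)
```

Even a fairly large static site would need hundreds of gigabytes of content before the storage charges approached the fixed cost of a two-instance compute deployment.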

As noted, this is only useful for a limited set of scenarios. It’s not clear what this technique might cost you in terms of SLA or denial-of-service protection, for example. Still, it’s interesting to consider the possibilities given that Windows Azure Storage is inherently a web server. The reverse is also true: Windows Azure Compute inherently comes with storage, but that’s another article.

Saturday, August 28, 2010

Threat Modeling the Cloud


If there’s one issue in cloud computing you have to revisit regularly, it’s security. Security concerns, real or imagined, must be squarely addressed in order to convince an organization to use cloud computing. One highly useful technique for analyzing security issues and designing defenses is threat modeling, a security analysis technique long used at Microsoft. Threat modeling is useful in any software context, but is particularly valuable in cloud computing due to the widespread preoccupation with security. It’s also useful because technical and non-technical people alike can follow the diagrams easily. Michael Howard provides a very good walk-through of threat modeling here. At some level this modeling is useful for general cloud scenarios, but as you start to get specific you will need to have your cloud platform in view, which in my case is Windows Azure.

To illustrate how threat modeling works in a cloud computing context, let’s address a specific threat. A common concern is that the use of shared resources in the cloud might compromise the security of your data by allowing it to fall into the wrong hands, a risk we’ll call Data Isolation Failure. It’s one of the primary risks organizations considering cloud computing worry about.

To create our threat model, we’ll start with the end result we’re trying to avoid: data in the wrong hands.


Next we need to think about what can lead to this end result that we don’t want. How could data of yours in the cloud end up in the wrong hands? It seems this could happen deliberately or by accident. We can draw two nodes, one for deliberate compromise and one for accidental compromise; we number the nodes so that we can reference them in discussions. Either one of these conditions is sufficient to cause data to be in the wrong hands, so this is an OR condition. We’ll see later on how to show an AND condition.

Let’s identify the causes of accidental data compromise (1.1). One would be human failure to set the proper restrictions in the first place: for example, leaving a commonly used or easily-guessed database password in place. Another might be a failure on the part of the cloud infrastructure to enforce security properly. Yet another cause might be hardware failure, where a failed drive is taken out of the data center for repair. These and other causes are added to the tree, which now looks like this:


We can now do the same for the deliberately compromised branch (1.2). Some causes include an inside job, which could happen within your business but could also happen at the cloud provider. Another deliberate compromise would be a hacker observing data in transmission. These and other causes could be developed further, but we’ll stop here for now.


If we consider these causes sufficiently developed, we can explore mitigations to the root causes, the bottom leaves of the tree. These mitigations are shown in circles in the diagram below (no mitigation is shown for the “data in transmission observed” node because it needs to be developed further). For cloud threat modeling I like to color code my mitigations to show the responsible party: green for the business, yellow for the cloud provider, red for a third party.


You should not start identifying mitigations until your threat tree is fully developed, or you’ll go down rabbit trails thinking about mitigations rather than threats. Stay focused on the threats. I have deliberately violated this rule just now in order to show why it’s important. At the start of this article we identified the threat we were trying to model as “data in the wrong hands”. That was an insufficiently described threat: we left out an important consideration, namely whether the data is intelligible to the party that obtains it. While we don’t want data falling into the wrong hands under any circumstances, we’re certainly better off if the data is unintelligible to the recipient. The threat tree we have just developed, then, is really a subtree of a threat we can state more completely as: Other parties obtain intelligible data in the cloud. The top of our tree now looks like this, with two conditions that must both be true; the arc connecting the branches indicates an AND relationship.


The addition of this second condition is crucial for two reasons. First, failing to consider all aspects of a threat may give you a false sense of security because you haven’t examined all of the angles. More importantly, though, this second condition is something we can directly do something about: our application can encrypt the data it stores and transmits. In contrast, we didn’t have direct control over all of the first branch’s mitigations. Let’s develop the data-intelligible side of the tree a bit more. For brevity we’ll just go one more level, then stop and add mitigations.

Mitigation is much easier in this subtree because data encryption is in the control of the business. The business merely needs to decide to encrypt, do it well, and protect and rotate its keys. Whenever you can directly mitigate rather than depending on another party to do the right thing you’re in a much better position. The full tree that we've developed so far now looks like this.


Since the data intelligible and data in the wrong hands conditions must both be true for this threat to be material, mitigating just one of the branches mitigates the entire threat. That doesn’t mean you should ignore the other branch, but it does mean one of the branches is likely superior in terms of your ability to defend against it. This may enable you to identify a branch and its mitigation(s) as the critical mitigation path to focus on.
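The AND/OR logic of the finished tree is simple enough to capture in a few lines of code, which makes the point about critical mitigation paths mechanical. Here’s a sketch in Python; the leaf/any_of/all_of helpers are my own illustrative names, not part of any threat-modeling tool:

```python
# A threat tree node is either a leaf (a root cause, possibly mitigated)
# or an interior node combining children with AND/OR semantics.
# A threat is "live" if it can still occur despite the mitigations in place.

def leaf(mitigated=False):
    return {"kind": "leaf", "mitigated": mitigated}

def any_of(*children):   # OR: the threat is live if ANY child is live
    return {"kind": "or", "children": children}

def all_of(*children):   # AND: the threat is live only if ALL children are live
    return {"kind": "and", "children": children}

def is_live(node):
    if node["kind"] == "leaf":
        return not node["mitigated"]
    live = [is_live(c) for c in node["children"]]
    return any(live) if node["kind"] == "or" else all(live)

# "Other parties obtain intelligible data in cloud" requires BOTH that
# data falls into the wrong hands AND that the data is intelligible.
wrong_hands = any_of(leaf(), leaf())   # causes not yet mitigated
intelligible = leaf(mitigated=True)    # we encrypt, so this branch is mitigated
threat = all_of(wrong_hands, intelligible)

assert is_live(threat) is False   # encrypting alone defuses the whole threat
```

Notice that under an OR node every branch must be mitigated, while under an AND node mitigating any single branch is enough, which is exactly why the encryption branch is such an attractive mitigation path.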

While this example is not completely developed, I hope it illustrates the spirit of the technique; you can find plenty of reference material on threat modeling on MSDN. Cloud security will continue to be a hot topic, and the best way to make headway is to get specific about concerns and defenses. Threat modeling is a good way to do exactly that.