
Windows Azure for Noobs

Windows Azure is not the silver bullet for all applications going forward


OK, so I admit I’ve been busy on projects, and of course I’ve been focusing a ton on SharePoint 2010.

In the meantime, I hadn’t been paying much attention to what’s been developing with cloud computing and more specifically in this case Windows Azure.  I was, in fact, a noob. :-)

This week I had the opportunity to attend a Windows Azure Boot Camp, so that now makes me an expert. At least that is what my boss will claim. :-)

So this post today is for those of you who haven’t been keeping up and want to know about some of the basics.  It’s not to teach you the ins and outs of developing with Windows Azure.  Although, getting started isn’t too difficult and the boot camp site has all the materials you need to get you started quickly.

If you’re not familiar yet, Windows Azure is part of Microsoft’s cloud computing platform.  Specifically, it is built for developing ASP.NET applications and WCF services in a cloud environment.  It shouldn’t be confused with Microsoft’s other online product offerings such as BPOS (which includes things like Exchange Online, SharePoint Online, and OCS Online).

I’ll start by clarifying some of the basics that I wondered about when I first started.  The first thing to know is that you need to get a handle on pricing.  I’m not going to attempt to explain it all, but the gist is that you pay for the number of instances you have of a given application (an instance roughly equates to a VM — but not necessarily), the number of transactions you make against online storage, the amount of storage you use, and the amount of bandwidth you use.  This may sound like it’s going to cost a lot, but keep in mind that most of these costs are measured in cents (although they can add up).

One important thing to note about pricing is that you are paying for “compute time,” as they call it, whether your application is being used or not.  Even if it is suspended, you are paying for that application until you go and delete it.

This is a hosted environment, so there are restrictions on what you can do in it.  Unlike a traditional hosting provider, you don’t have any direct access to the machine.  That means no remote desktop, no IIS access, no file system access (with a few exceptions), and no registry access.  So how do you get your application to the portal?  We’ll go into it more in a bit, but essentially you package it up in a .cspkg file (in a very SharePoint-esque style) and upload it through the administration portal.

One more thing you’ll want to know before starting.  The DevFabric (basically a local version of the cloud) assumes you are using a SQL Express database.  If you are using a regular SQL Server, then follow the information I found in this blog on how to set up your database.  As for developing in Azure, it’s just like developing in ASP.NET; you can use most of the things you are used to.  However, some things aren’t allowed.  For example, code that tries to access machine-specific resources or writes to the registry is going to throw a SecurityException.


When you create a new Windows Azure project, you are asked to pick a role.


Effectively there are two types of roles: Web and Worker.  The Web Role will create an ASP.NET project.  It has some variations (MVC 2, WCF Service, etc.), but those really just affect the type of ASP.NET project that is created.  The Worker Role is very similar to a Windows Service.  It typically starts and then runs in a loop.
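To make the Windows Service comparison concrete, here is a minimal sketch of a Worker Role’s entry point, assuming the Windows Azure SDK 1.x (the Microsoft.WindowsAzure.ServiceRuntime assembly); the ten-second sleep is just an illustrative polling interval:

```csharp
using System.Threading;
using Microsoft.WindowsAzure.ServiceRuntime;

public class WorkerRole : RoleEntryPoint
{
    public override bool OnStart()
    {
        // One-time initialization before Run() is called.
        return base.OnStart();
    }

    public override void Run()
    {
        // Like a Windows Service: start once, then loop until the role is stopped.
        while (true)
        {
            // Do background work here (e.g., process queue messages).
            Thread.Sleep(10000); // wait 10 seconds between passes
        }
    }
}
```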

Once you create your project, you’ll get a pair of projects in Visual Studio: the cloud service project and a project for each role.


The roles you have added to your project show up in the Roles folder.  Opening a role’s settings lets you configure storage settings and endpoints for the service.  You can also choose Full or Partial Trust here.  When this project is active and you hit F5 to debug, it actually launches something called the DevFabric, which is basically a local emulation of the “cloud.”  In training, they typically said that the DevFabric covers 90% of what the actual cloud does, but I never saw an example of anything that wouldn’t work.  You can fully debug your ASP.NET application in the DevFabric, but that’s not possible once your code is in the cloud.

When you’re ready to go to the cloud, you use the Publish menu option.  This packages your solution into a .cspkg file.  You then take it and the configuration settings file (.cscfg) and upload them on the deployment page.  At this point, you have to wait a few minutes while it does the deployment.


I kind of like the way you stage deployments to production.  You start by uploading your package file to a staging instance of your application.  Effectively this spins up another instance (i.e., server).  Your application in staging gets a separate public-facing address for you to do your QA.  Once you’re happy with it, you click the swap button, it simply swaps what you had in production with staging, and you’re done.  Of course, the drawback is that while a staging instance is up and running you’re paying compute hours on it.  That means you only want to leave staging up as long as you need it.


Additional Development Information

You can’t upload your own COM objects since you can’t write to the registry.  However, you are allowed to use DllImport to call unmanaged code — from the Windows API, for example.  Honestly, I am surprised this is allowed, but they claim it is.  You can also launch executables, which seems like something you could use to circumvent some of the restrictions that keep you from knowing much about the underlying operating system.  You’re also allowed to spawn background threads.
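As a sketch of the DllImport (P/Invoke) technique mentioned above — GetTickCount is a standard kernel32 export, though I haven’t verified this particular call inside Azure:

```csharp
using System;
using System.Runtime.InteropServices;

public static class NativeCalls
{
    // Declares an unmanaged Windows API function for managed code to call.
    [DllImport("kernel32.dll")]
    private static extern uint GetTickCount(); // milliseconds since system start

    public static void Demo()
    {
        Console.WriteLine("Uptime (ms): " + GetTickCount());
    }
}
```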

The AppFabric brings a few more advanced features you can take advantage of.  Windows Azure has a service bus which you can use to connect services together.  It also supports a security mechanism for securing REST services.  It’s based upon OAuth and it sounds a lot like SharePoint 2010’s Claims-Based security, but honestly I don’t know if they are one and the same.

There are a few things that Windows Azure adds to ASP.NET, and there are three assemblies you can reference to take advantage of these extra features.  Windows Azure brings the concepts of Tables, Queues, and BLOB storage.  All of these start with the concept of a Storage Account, which is simply a billing and grouping construct that you create through the Azure administration portal.  The storage account uses a pair of keys to authenticate any application that accesses it.  Keep these keys secure.  You can create the items mentioned above inside that storage account.  Any time you write something to a storage account it is written in triplicate, meaning it is written to three separate locations to help prevent data loss.
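Wiring up a storage account and the three client types looks roughly like this — a hedged sketch assuming the SDK 1.x StorageClient library, using the local development storage shortcut (in the cloud you would supply your account name and one of the keys instead); the “Contacts” table name is made up for illustration:

```csharp
using Microsoft.WindowsAzure;
using Microsoft.WindowsAzure.StorageClient;

// Local DevFabric storage; in the cloud, parse a connection string
// containing your account name and key instead.
CloudStorageAccount account =
    CloudStorageAccount.Parse("UseDevelopmentStorage=true");

// One client per storage flavor.
CloudTableClient tables = account.CreateCloudTableClient();
CloudQueueClient queues = account.CreateCloudQueueClient();
CloudBlobClient blobs = account.CreateCloudBlobClient();

// Tables have no design-time tools; they are created entirely in code.
tables.CreateTableIfNotExist("Contacts");
```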

We’ll talk about Tables first.  Before we start, forget everything you know about SQL tables because it does not apply here.  These are not relational tables and have nothing to do with SQL Server in any way.  They are highly scalable and capable of containing extremely large amounts of data.  Your storage account can contain multiple tables, and inside a table you store entities.  The interesting thing is that the entity schema can vary inside the same table.  Since it’s not relational, you can’t do server-side joins and there are no foreign key relationships.  You get data out using LINQ, so you can do some joins in code running on your web server, but remember it’s not a true server-side join.  There is a 30-second query timeout, so if your query returns too much, it will get killed.

As for the entity going into the table, you simply create a class and inherit from TableServiceEntity.  In the past people have not liked inheriting from some class in their ORM, so this may be an issue for some.  Tables are partitioned.  The concept of a partition is kind of hard to grasp, but you effectively split a single table into different groups of things.  I’m not really sure why you do this or what the advantage is, but you do have to provide a partition key such as Products or Contacts.  You also have to provide a RowKey in your entity class.  This serves as a primary key.  In a lot of code I have seen, they auto-generate this in the constructor.  To query the data, you create a DataContext class in a very similar manner to how you use one with LINQ to SQL.
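Put together, the entity and context look something like this — a hedged sketch against SDK 1.x, with the Contact class, the “Contacts” partition key, and the table name all invented for illustration:

```csharp
using System;
using System.Linq;
using Microsoft.WindowsAzure;
using Microsoft.WindowsAzure.StorageClient;

// An entity is just a class inheriting from TableServiceEntity.
public class Contact : TableServiceEntity
{
    public Contact()
    {
        PartitionKey = "Contacts";          // groups related entities together
        RowKey = Guid.NewGuid().ToString(); // the auto-generated "primary key"
    }

    // Schema can vary per entity, even within the same table.
    public string Name { get; set; }
    public string Email { get; set; }
}

// The context plays the same role a DataContext does in LINQ to SQL.
public class ContactContext : TableServiceContext
{
    public ContactContext(string baseAddress, StorageCredentials credentials)
        : base(baseAddress, credentials) { }

    public IQueryable<Contact> Contacts
    {
        get { return CreateQuery<Contact>("Contacts"); }
    }
}
```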

How do you create a table?  Well, there are no design-time tools for it.  It’s created completely in code.  There also aren’t any included tools to view, edit, or delete the data in your tables.  A third-party tool called Cloud Storage Studio does exist from a company called Cerebrata, but keep in mind that any time you interact with the data in your tables you are paying for storage transactions and bandwidth.

The last thing to note is that although your data is relatively safe from hardware loss since it’s written in triplicate, there is no concept of a backup.  This means if data is accidentally deleted, you have no way to get it back.  If you do want a backup, you are going to have to write some code to export the data to something local.

Windows Azure also has the concept of queues, which sit on top of your storage account.  They work in many ways like other queuing architectures you might be familiar with.  Queues can hold an unlimited amount of data (or at least that’s what they claim).  They work by storing XML-serializable messages, but each message can’t be larger than 8KB.  The way you would typically use a queue in Azure is by pushing a message from your Web Role.  A Worker Role then periodically pops the messages off and acts on them.
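The push/pop pattern described above can be sketched like this — again assuming SDK 1.x and development storage, with the queue name and message text made up for illustration:

```csharp
using Microsoft.WindowsAzure;
using Microsoft.WindowsAzure.StorageClient;

CloudStorageAccount account =
    CloudStorageAccount.Parse("UseDevelopmentStorage=true");
CloudQueue queue =
    account.CreateCloudQueueClient().GetQueueReference("orders");
queue.CreateIfNotExist();

// Web Role side: push a (small, under 8KB) message.
queue.AddMessage(new CloudQueueMessage("process-order:42"));

// Worker Role side: pop a message, act on it, then delete it
// so it isn't redelivered after the visibility timeout.
CloudQueueMessage msg = queue.GetMessage();
if (msg != null)
{
    // ... act on msg.AsString ...
    queue.DeleteMessage(msg);
}
```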

I’m sure you all know what BLOBs are already.  The purpose of these is to give you a place to store files in Azure.  BLOBs are stored in containers, and each BLOB can be up to 1TB in size.  Containers have an unlimited capacity (at least that is what is claimed) and act very much like a folder.  Containers are private by default, but you can also make them available to the public.  BLOBs can also have metadata attached.  From what I gathered from the training, though, I don’t believe there is a way to query on that metadata.  I hope I’m wrong, so if someone finds out how, please let me know.  One thing you can do is upload a .VHD file as a BLOB and mount it as a legacy file system.  This is called Windows Azure Drive.  It’s mainly for legacy use though; the impression I got was that they really want you to use the new storage mechanisms instead.  As with everything else, there is no included way to back up your files.  If you need backups, you will need to turn to a third-party tool or write some code.
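Here is what uploading a BLOB with metadata looks like — a hedged SDK 1.x sketch, with the container name, file name, metadata key, and local path all placeholders:

```csharp
using Microsoft.WindowsAzure;
using Microsoft.WindowsAzure.StorageClient;

CloudStorageAccount account =
    CloudStorageAccount.Parse("UseDevelopmentStorage=true");
CloudBlobContainer container =
    account.CreateCloudBlobClient().GetContainerReference("documents");
container.CreateIfNotExist();

// Containers are private by default; this opts the blobs into public read.
container.SetPermissions(new BlobContainerPermissions
{
    PublicAccess = BlobContainerPublicAccessType.Blob
});

CloudBlob blob = container.GetBlobReference("report.pdf");
blob.Metadata["department"] = "sales"; // metadata is stored with the blob
blob.UploadFile(@"C:\temp\report.pdf");
```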

SQL Azure
So if Azure Tables aren’t for you, you can choose to make use of SQL Azure (for an additional fee, of course).  However, there are a number of limitations.  First, currently you can only create databases that are 1GB or 10GB.  I’ve recently been told that 50GB databases will be available in June.  As far as what you can do in SQL, just stick to the basics.  Things like cross-database joins, database mirroring, SQL CLR, replication, extended stored procedures, spatial data, and backup and restore are simply not supported.  That’s right, I said no backup and restore.  Although your data is automatically written to multiple servers, you have no built-in mechanism to do backups or restores when accidental data deletion occurs.  This is obviously going to be a deal breaker for a lot of people, but it is supposed to be a very high priority for Microsoft, so we may see it in the future.  If this is going to be an issue for you, you can write your own backup routines, or maybe SQL Azure is just not right for you right now.

As for authentication, SSPI is out; only SQL authentication is supported.  When you create a SQL Azure account, it spins up a server (rather quickly, even), prompts you for the name and password of an administrator, and gives you the address of your server.  You then create the database through the web interface.  At this point, you can actually connect to your SQL server at the address it provides using the latest version of SQL Server Management Studio.  Unfortunately, I was never able to get the authentication to work.  Also, if you are curious about security, you can configure the firewall to allow only specific IPs access to your SQL server.
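Since SSPI is out, the connection string has to carry SQL credentials; a sketch, with the server name, database, user, and password all placeholders:

```csharp
using System.Data.SqlClient;

// SQL authentication only -- no Trusted_Connection / integrated security.
var conn = new SqlConnection(
    "Server=tcp:myserver.database.windows.net;" +
    "Database=MyDb;" +
    "User ID=admin@myserver;" +   // user@server form for SQL Azure
    "Password={your-password};" +
    "Encrypt=True;");
conn.Open();
```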

It probably goes without saying at this point, but SQL Reporting Services, Analysis Services, and Integration Services are not supported either.

Non .NET in the Cloud?
Sure — provided it runs via FastCGI.  This makes things like PHP possible in the cloud.  You can actually upload the executable, such as php.exe, and then set up pages of the appropriate extension to run through CGI.  It’s still basically getting routed through IIS, but it does open up your options somewhat.

Windows Azure is not the silver bullet for all applications going forward.

We all know that’s SharePoint, of course. :-)  No, but really, you have to look at your application and see if it makes sense.  My biggest concern at this point is the development story.  There are a number of offers for getting cloud access.  Some are paid, some are trials, and you get some hours for being a partner or with MSDN.  If you are a partner, you are only entitled to a few hours a month.  There are a few reasonably priced paid options, but you’re going to have to get someone to pony up a credit card number for you to use anything.  Also, as a partner, only the people who already have download permissions have access to the Azure accounts.  This could be an issue with some partners that rule over those accounts with an iron fist.

This is already turning into a long post and I’ve far from covered everything in any real detail, but hopefully it’s a good start if you just wanted some basic information.  Plus, I’ve provided plenty of links with more information and how to get started.  I hope I got all of the facts right, but if I missed something, please leave me a comment.  If you have a VM with Windows Server 2008 R2 already on it, you can get started pretty fast.  Windows Azure has a relatively low startup cost and great scalability.  If your application turns out to be a good fit, then I definitely recommend taking a look.

If you are interested in learning more about Windows Azure, I have to recommend finding a boot camp.  The one I attended was great and very informative.  There are still many more being scheduled.  What’s cool is that if there is not one in your area, they will help you throw one.  I’m seriously considering this for Oklahoma as I think there would be plenty of people interested.


More Stories By Corey Roth

Corey Roth, a SharePoint Server MVP, is a consultant at Hitachi Consulting specializing in SharePoint and Office 365 for clients in the energy sector. He has more than ten years of experience delivering solutions in the energy, travel, advertising and consumer electronics verticals.

Corey specializes in delivering ECM and search solutions to clients using SharePoint. Corey has always focused on rapid adoption of new Microsoft technologies including Visual Studio 2013, Office 365, and SharePoint.

He is a member of the .NET Mafia (www.dotnetmafia.com) where he blogs about the latest technology and SharePoint. He is dedicated to the community and speaks regularly at user groups and SharePoint Saturdays.
