Developing Situational Applications with Web 2.0 Mashups

Wring out the advantage of Web 2.0

ENCRYPTION
SSL encryption is perhaps the most common and straightforward service that lends itself to specialization. If your site includes pages and services that need to be secured, you can run those pages separately, either through a third-party SSL processor or through dedicated Web servers configured to serve your SSL pages.

Like image handling, building a specialized SSL service means pointing to a different group of servers in your code, adding development complexity and requiring additional management effort. Alternatively, as with affinity, there are hardware solutions, such as application acceleration appliances, that can reduce the coding requirement for SSL by handling the handshaking and encryption that would otherwise take place at the server. (See Figure 2.)
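One way to picture the "point to a different group of servers in your code" requirement is a small URL helper that routes secure sections of the site to a dedicated SSL farm. This is an illustrative sketch only; the host names and path prefixes are made up, not taken from the article.

```python
# Hypothetical sketch: route requests for secure sections of the site to
# a dedicated SSL server group, leaving everything else on the regular
# web farm. Hosts and prefixes below are illustrative assumptions.
SECURE_PREFIXES = ("/checkout", "/account")

def absolute_url(path: str) -> str:
    """Return a full URL, sending secure paths to the SSL farm."""
    if path.startswith(SECURE_PREFIXES):
        # These pages need encryption: serve them from the SSL servers.
        return "https://secure.example.com" + path
    # Everything else stays on the plain HTTP web farm.
    return "http://www.example.com" + path
```

In practice this helper would be called everywhere a link or redirect is generated, which is exactly the extra development effort the paragraph above describes; an acceleration appliance removes that burden by terminating SSL in front of unmodified servers.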

In general, like all forms of specialization, distributing SSL and image handling comes at a cost in development complexity and management effort. But these strategies let you distribute each specialized task independently of the others and utilize your hardware more effectively, ultimately making your application more agile.

BROWSER & OUTPUT CACHING
One specialization that can have a huge impact on your ability to scale for growth, or for sudden changes in demand, is caching of both static and dynamic content. By incorporating browser caching, you can significantly reduce the load on your Web servers. You can also use appliance-based caching solutions that adjust HTTP headers to manage browser caching and provide a more scalable substitute for ASP.NET output caching.
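The "adjust HTTP headers to manage browser caching" idea can be sketched as a simple policy function that picks `Cache-Control` values by content type. This is not any specific appliance's API, just a minimal illustration of the header choices involved; the suffix list and max-age are assumptions.

```python
# Illustrative sketch: choose HTTP caching headers so browsers can
# reuse static assets but must revalidate dynamic pages. The suffixes
# and the one-day max-age are assumptions, not values from the article.
def cache_headers(path: str) -> dict:
    static_suffixes = (".css", ".js", ".png", ".jpg", ".gif")
    if path.endswith(static_suffixes):
        # Static content: safe for browsers to cache for a day.
        return {"Cache-Control": "public, max-age=86400"}
    # Dynamic content: browsers must check with the server each time.
    return {"Cache-Control": "no-cache"}
```

An appliance sitting in front of the web farm can apply a policy like this uniformly, which is what lets it substitute for per-page output-caching code.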

Of course, caching has its own costs. Both browser caching and output caching are complicated and time-consuming to code, which means you don't want to use them everywhere. Finding the places in your application where caching makes the most sense and delivers the most value can be difficult. More significant still, when you add caching you also take on new requirements for keeping the cache current; the last thing you want is to serve data that's no longer accurate. Knowing when the cache is stale and having a strategy for handling cache expiry is critical. The complexity of addressing these issues leads many developers to avoid caching despite its obvious benefits.
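The two staleness mechanisms the paragraph describes, time-based expiry and explicit invalidation when the underlying data changes, can be shown in a minimal cache sketch. The class and its names are illustrative, not part of any framework mentioned in the article.

```python
import time

# Minimal sketch of a TTL cache with explicit invalidation, showing the
# two ways to avoid serving data that's no longer accurate. Names are
# illustrative assumptions, not an API from the article.
class TtlCache:
    def __init__(self, ttl_seconds):
        self.ttl = ttl_seconds
        self._store = {}  # key -> (value, stored_at)

    def put(self, key, value, now=None):
        self._store[key] = (value, time.time() if now is None else now)

    def get(self, key, now=None):
        now = time.time() if now is None else now
        entry = self._store.get(key)
        if entry is None:
            return None
        value, stored_at = entry
        if now - stored_at > self.ttl:
            # Entry has expired: drop it and treat this as a miss.
            del self._store[key]
            return None
        return value

    def invalidate(self, key):
        # Called when the underlying data changes, so stale values are
        # never served even before their TTL runs out.
        self._store.pop(key, None)
```

The hard part in a real application is not this mechanism but wiring `invalidate` into every code path that mutates the underlying data, which is the complexity the text warns about.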

DATA CACHING
The most significant affinity in any ASP.NET application is the data. Introducing data caching (separating data into read-only and read/write specializations) is ultimately the highest-impact technique you can use to improve the scalability and agility of your application. Developers with the biggest Web 2.0 sites such as MySpace and Facebook recognize this, and it's the data caching strategies they've employed that have allowed these sites to become hugely successful Web 2.0 applications.

Of course, data caching is probably the most complicated thing you can do in an application. Data, by its nature, wants to live in one place and doesn't distribute well. Using a scheme such as multi-master replication, you code the "write" components of the application against different databases than the "read" components, while the read/write databases continually feed data to the read-only ones. It's incredibly difficult to do. (As a consultant, if a client wants to use data caching, I know I'll have a job for years.) But when you really need to scale to millions of simultaneous users, data caching is the best way to do it.
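The read/write separation described above can be sketched as a small router that sends writes to a primary database and spreads reads across read-only replicas. This is a deliberately simplified illustration under assumed names; it says nothing about the replication feed itself, which is where the real difficulty lies.

```python
import itertools

# Hedged sketch of read/write splitting: writes go to the read/write
# primary, reads rotate across read-only replicas. Connection names
# are made up for illustration.
class DataRouter:
    def __init__(self, primary, replicas):
        self.primary = primary
        self._replicas = itertools.cycle(replicas)

    def connection_for(self, operation):
        if operation == "write":
            # All mutations go to the read/write database.
            return self.primary
        # Reads round-robin over the read-only copies.
        return next(self._replicas)
```

Routing is the easy half; keeping the replicas continuously fed from the primary, and coping with replication lag, is what makes the overall technique so difficult.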

Building the Agile Web 2.0 Application
Actually implementing all of these strategies isn't trivial. But if your application is going to be agile enough to truly embrace Web 2.0 capabilities at scale, intelligent distribution and specialization are essential.

Of course, there's one other vital piece of the puzzle: instrumentation. How do you know which parts of your site are ideal targets for distribution and specialization? Even after you've architected a sound distribution strategy, how do you measure whether it's continuing to meet your needs as they change? You need proper instrumentation in place to tell you when sudden changes are occurring and when your various distribution and specialization strategies should come into play. Without hard information about what's actually happening in the environment, even the most advanced specialization and distribution strategies are operating in the dark.
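The instrumentation the paragraph calls for can start very simply: record per-endpoint latencies and surface the slowest parts of the site as candidates for distribution or specialization. The class below is an illustrative sketch with assumed names, not a real monitoring product's API.

```python
from collections import defaultdict

# Illustrative instrumentation sketch: collect per-endpoint latency
# samples so the slowest parts of the site stand out as candidates
# for caching or specialization. Names are assumptions.
class RequestStats:
    def __init__(self):
        self._samples = defaultdict(list)

    def record(self, endpoint, ms):
        """Record one request's latency in milliseconds."""
        self._samples[endpoint].append(ms)

    def average(self, endpoint):
        samples = self._samples[endpoint]
        return sum(samples) / len(samples) if samples else 0.0

    def hottest(self):
        # The endpoint with the highest average latency is a likely
        # first target for specialization.
        return max(self._samples, key=self.average)
```

Even this crude view answers the questions above: it shows where specialization would pay off, and re-checking it after a change shows whether the strategy is still meeting your needs.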

When you combine more intelligent distribution and specialization with up-to-the-minute knowledge of your environment, you can build true agility into your application. More important, you can take full advantage of all of the potential that Web 2.0 has to offer, knowing that no matter what demands the future may hold, your application and environment are prepared to meet them.

More Stories By Kent Alstad

Kent Alstad, CTO of Strangeloop Networks, is principal or contributing author on all of Strangeloop's pending patents. Before helping create Strangeloop, he served as CTO at IronPoint Technology. Kent also founded Eclipse Software, a Microsoft Certified Solution Provider, which he sold to Discovery Software in 2001. In more than 20 years of professional development experience, Kent has served as architect and lead developer for successful production solutions with The Active Network, ADP, Lucent, Microsoft, and NCS. Kent holds a bachelor of science in psychology from the University of Calgary.


