Ensuring Website Performance

Why measure, what to measure, and how to measure accurately and automatically

End user response time is critical to business success. The faster web pages are perceived to load, the longer users tend to stay on a site, and the more money they tend to spend.

To ensure that end user response times remain acceptable at all times, it is necessary to measure time the way the end user perceives it. Measuring and monitoring your live system is important to identify problems early, before they affect too many end users. To make sure that web pages are fast from the start, it is equally important to measure web page performance constantly and continuously throughout development and testing. Two questions need to be answered:

  • What is the time the user actually perceives as web response time?
  • How can it be measured accurately and automatically?

What Time to Measure? Technical Response Time vs. Perceived Response Time
Technically, the response time of a web page is the time from the first byte sent by the browser to request the initial document until the last byte of all embedded objects (images, JavaScript files, style sheets, ...) has been received. Network analysis tools like HttpWatch or Fiddler visualize the individual downloads in a timeline view. The following illustration shows the network timeline when accessing Google Maps (http://maps.google.com) with an empty browser cache, captured with Fiddler:

Network Timeline showing Network Requests but no Browser Activities

The initial document request returned after 1.6 seconds. Embedded objects are downloaded after the initial document has been retrieved: two additional HTML documents, a list of images, and some JavaScript files. After 5 seconds (when main.js was downloaded) there is a small gap before the remaining requests are downloaded. We can assume that this gap represents JavaScript execution time that delay-loaded some other objects, but from the network view alone we cannot be sure about that.

From this analysis it is hard to tell what the perceived end user response time really is. Is it 1.6 seconds, because that is when the browser could start rendering the initial content of the HTML document? Is it roughly 5 seconds, when the first batch of embedded objects was fully downloaded? Or is it 8 seconds, because that is when the last request completed? Or is the truth somewhere in between?
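The ambiguity becomes obvious when you derive the candidate readings from a captured request list. A minimal sketch (the request timings below are illustrative, loosely modeled on the Google Maps trace above; all values are seconds from navigation start):

```javascript
// Illustrative request list, as a network analysis tool might export it.
const requests = [
  { url: '/maps', start: 0.0, end: 1.6 },        // initial document
  { url: '/main.css', start: 1.7, end: 2.4 },
  { url: '/main.js', start: 1.8, end: 5.0 },
  { url: '/tiles/1.png', start: 5.4, end: 8.0 }, // delay-loaded batch
];

// Candidate "response time" readings:
// 1) when the initial document arrived, 2) when the last byte of the
// last embedded object was received.
const firstDocument = requests[0].end;
const lastRequest = Math.max(...requests.map(r => r.end));

console.log(`initial document: ${firstDocument}s`); // initial document: 1.6s
console.log(`last request done: ${lastRequest}s`);  // last request done: 8s
```

Both numbers are "correct" by some technical definition, which is exactly why neither one alone answers the question of perceived response time.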

There is more than meets the "HTTP Traffic" Eye

The browser does much more than just download resources from the server. The DOM (Document Object Model) is built and maintained for the downloaded document. Styles are applied to DOM elements based on the definitions in style sheets. JavaScript is executed at different points in time, triggered by certain events, e.g. onload or onclick. Finally, the DOM and all its contained images are rendered to the screen.
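While JavaScript executes, the browser's main thread is busy and cannot respond to input or render. A minimal sketch of timing such a blocking task (the busy loop is just a stand-in for expensive DOM manipulation work):

```javascript
// Measure how long a JavaScript task blocks the main thread.
function measureBlockingTime(task) {
  const start = Date.now();
  task();                       // runs synchronously, blocking the thread
  return Date.now() - start;    // milliseconds the thread was blocked
}

const blockedMs = measureBlockingTime(() => {
  let sum = 0;
  for (let i = 0; i < 5e6; i++) sum += i; // stand-in for DOM-heavy work
});
console.log(`main thread blocked for ~${blockedMs}ms`);
```

During that interval the page neither repaints nor reacts to mouse or keyboard input, which is why long JavaScript blocks show up as unresponsive gaps in a browser timeline.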

Using a tool like dynaTrace AJAX Edition we get all this additional activity information showing us where and when additional time is spent in the browser for JavaScript execution, Rendering or waiting for asynchronous network requests. We also see page events like onLoad or onError:

Timeline of all Browser Activities

Looking at this timeline view of the same Google Maps request tells us that the browser started rendering the initial HTML document after 2 seconds. Throughout the download of the embedded objects, the browser rendered additional content. The onLoad event was triggered after 4.8 seconds; this is when the browser had finished building the initial DOM of the web page, including all referenced objects (images, CSS, ...). The execution of main.js - the last JavaScript file downloaded - caused roughly 2 seconds of JavaScript execution time, high CPU usage in the browser, additional network downloads, and DOM manipulations. The high CPU utilization indicates that the browser was not very responsive to mouse or keyboard input, as JavaScript almost exclusively consumed the processor. The DOM manipulations executed by JavaScript were rendered after JavaScript execution completed (after 7.5 and 8 seconds).

So what is the perceived end user performance?

I believe there are different stages of perceived performance and perceived response time.

The First Impression of speed is the time it takes until something is visible in the browser's window (Time To First Visual). We can measure this by looking at the first rendering (drawing) activity. A detailed description of browser rendering and the inner workings of the rendering engine can be found in Alois's blog entry about Understanding Browser Rendering.

The Second Impression is when the initial page is fully loaded (Time To OnLoad). This can be measured via the onLoad event, which the browser triggers when the DOM is fully loaded, meaning the initial document and all embedded objects have been retrieved.

The Third Impression is when the web site actually becomes interactive for the user (Time To Interactivity). Heavy JavaScript execution that manipulates the DOM causes the web page to become non-interactive for the end user. This can often be observed when expensive CSS selector lookups are used (check out the blogs about jQuery and Prototype CSS Selector Performance) or when using dynamic elements like JavaScript menus (check out the blog about dynamic JavaScript menus).
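Given a recorded list of browser activities, the three impression times can be derived mechanically. A sketch under illustrative assumptions (the event shape below is made up for the example, not a real dynaTrace AJAX Edition export format; timings in seconds are modeled on the Google Maps trace above):

```javascript
// Derive the three impression times from recorded browser activities.
function impressionTimes(activities) {
  const firstRender = activities.find(a => a.type === 'render');
  const onLoad = activities.find(a => a.type === 'onload');
  // The page is interactive once the last long JavaScript block ends.
  const jsBlocks = activities.filter(a => a.type === 'script');
  const lastJsEnd = jsBlocks.length
    ? Math.max(...jsBlocks.map(a => a.end))
    : 0;
  return {
    timeToFirstVisual: firstRender ? firstRender.start : null,
    timeToOnLoad: onLoad ? onLoad.start : null,
    timeToInteractivity: Math.max(lastJsEnd, onLoad ? onLoad.start : 0),
  };
}

const t = impressionTimes([
  { type: 'render', start: 2.0, end: 2.1 },
  { type: 'onload', start: 4.8, end: 4.8 },
  { type: 'script', start: 5.0, end: 7.0 }, // main.js execution
]);
console.log(t);
// { timeToFirstVisual: 2, timeToOnLoad: 4.8, timeToInteractivity: 7 }
```

The point of the sketch: once the raw activities are captured by a tool, all three numbers fall out of the same recording, with no subjective judgment involved.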

Let's look at a second example and identify the different impression stages. The following image shows a page request to a product page on a very popular online retail store:

Three Impression Phases

The initial page content is downloaded rather quickly and rendered to the screen within the first second (First Impression). It takes a total of about 3 seconds for some of the initial images that make up the page's initial content to load (Second Impression). Heavy JavaScript that manipulates the DOM then makes the page unresponsive for about 10 seconds, also delaying the onLoad event, at which point the page delay-loads most of its images. The user therefore sees some content early on - mostly text from the initial HTML - but then has to wait another 10 seconds until the remaining images are delay-loaded and rendered by the browser (Third Impression). Due to the high CPU usage and DOM manipulations, the page is also not very interactive during that time, causing a bad end user perception of the page's performance.

How to Measure? Stop Watch Measuring vs. Tool Supported Measuring
The idea for this blog post came from talking with performance testing engineers at one of our clients. I introduced them to the dynaTrace AJAX Edition and wondered about a little gadget they had on their desk: a stopwatch.

Their task was to measure end-user response time for every build of their new web site in order to verify that the times stayed within defined performance thresholds and to identify regressions from build to build. They used the stopwatch to measure the time it took to load each page and the time until the page became responsive. The "manually" measured numbers were put into a spreadsheet, which allowed them to verify their performance values.

Do you see the problems in this approach?

Not only is this method of measuring time very inaccurate - especially when we talk about timings precise to tenths of a second - but every performance engineer also has a slightly different perception of what it means for the site to be interactive. It also involves additional manual effort, as the timings can only be taken during manual tests.

Automate measuring and measure accurately

The solution to this problem is rather simple. With tools like dynaTrace AJAX Edition we can capture performance measures such as JavaScript execution, rendering time, CPU utilization, asynchronous requests, and network requests. This is possible not only for manual tests but also in an automated test environment. Letting a tool do the job eliminates both the inaccuracy of manual time-taking and the subjective perception of performance.

When using the dynaTrace AJAX Edition, as in the examples above, all performance-relevant browser activities are captured automatically, enabling us to determine the times of the three Impression Stages. The blog article "Automate Testing with Watir" shows how to use dynaTrace AJAX Edition in combination with automated testing tools. The tool also provides the ability to export captured data to XML or to spreadsheet applications like Excel - supporting the use case of automated regression analysis across different web site versions/builds.
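Once the measurements are exported, the regression check itself is trivial to automate. A minimal sketch (the threshold values, metric names, and build label are illustrative, not taken from any real project):

```javascript
// Illustrative per-metric thresholds, in seconds.
const thresholds = {
  timeToFirstVisual: 2.5,
  timeToOnLoad: 5.0,
  timeToInteractivity: 8.0,
};

// Compare one build's measured impression times against the thresholds.
function checkBuild(build, measured) {
  const failures = Object.keys(thresholds)
    .filter(metric => measured[metric] > thresholds[metric]);
  return { build, passed: failures.length === 0, failures };
}

const result = checkBuild('build-42', {
  timeToFirstVisual: 2.0,
  timeToOnLoad: 4.8,
  timeToInteractivity: 9.1, // regression: exceeds the 8s threshold
});
console.log(result);
// { build: 'build-42', passed: false, failures: [ 'timeToInteractivity' ] }
```

A check like this, run against the exported data of every build, replaces both the stopwatch and the manual spreadsheet comparison.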

Conclusion
Using tools like dynaTrace AJAX Edition for Internet Explorer, YSlow or Page Speed for Firefox, or the DevTools for Chrome enables automated web site performance measurement in both manual and automated test environments. Continuously measuring web site performance in the browser allows you to always focus on end user performance - which, in the end, determines how successful your web site will be.


More Stories By Andreas Grabner

Andreas Grabner has been helping companies improve their application performance for 15+ years. He is a regular contributor within Web Performance and DevOps communities and a prolific speaker at user groups and conferences around the world. Reach him at @grabnerandi
