Ensuring Website Performance

Why measure, what to measure, and how to measure accurately and automatically

End user response time is critical for business success. The faster web pages feel, the longer users tend to stay on them, and the more money they spend, driving business.

To ensure that end user response times are acceptable at all times, it is necessary to measure time the way the end user perceives performance. Measuring and monitoring your live system is important to identify problems early, before they affect too many end users. To make sure that web pages are fast from the start, it is equally important to measure web page performance constantly and continuously throughout development and testing. Two questions need to be answered:

  • What is the time the user actually perceives as web response time?
  • How to measure it accurately and in an automated way?

What Time to Measure? Technical Response Time vs. Perceived Response Time
Technically, the response time of a web page is the time from the first byte the browser sends to request the initial document until the last byte of all embedded objects (images, JavaScript files, style sheets, ...) is received. Network analysis tools like HttpWatch or Fiddler visualize the individual downloads in a timeline view. The following illustration shows the network timeline when accessing Google Maps (http://maps.google.com) with an empty browser cache, captured with Fiddler:

Network Timeline showing Network Requests but no Browser Activities

The initial document request returned after 1.6s. The embedded objects were downloaded after the initial document was retrieved: two additional HTML documents, a set of images, and some JavaScript files. After 5 seconds (when main.js was downloaded) there is a small gap before the remaining requests start. We can assume that the gap represents JavaScript execution time that delay-loaded the other objects, but from network traffic alone we cannot be sure.

From this analysis it is hard to tell what the perceived end user response time really is. Is it 1.6 seconds, because that is when the browser could start rendering the initial content of the HTML document? Is it roughly 5 seconds, when the first batch of embedded objects was fully downloaded? Or is it 8 seconds, the time until the last request completed? Or is the truth somewhere in between?
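To make the ambiguity concrete, here is a small sketch that computes the three candidate "response times" from a list of network requests. The request data is hypothetical, loosely modeled on the timeline discussed above:

```javascript
// Sketch: three candidate "response times" derived from a network timeline.
// The request records below are made up, loosely modeled on the Google Maps
// timeline above (times in seconds, relative to the first request).
const requests = [
  { url: "/maps",       start: 0.0, end: 1.6 }, // initial HTML document
  { url: "/main.js",    start: 1.7, end: 5.0 }, // last file of the first batch
  { url: "/tile_1.png", start: 5.4, end: 8.0 }  // delay-loaded by JavaScript
];

const times = {
  initialDocument: requests[0].end,                  // browser can start rendering
  firstBatch:      requests[1].end,                  // first wave of embedded objects done
  lastRequest:     Math.max(...requests.map(r => r.end)) // everything downloaded
};
console.log(times);
```

Each of the three values is a defensible answer to "how fast was the page?", which is exactly why network traffic alone is not enough.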

There is more than meets the "HTTP Traffic" Eye

The browser does much more than just download resources from the server. It builds and maintains the DOM (Document Object Model) for the downloaded document, applies styles to DOM elements based on the style sheet definitions, executes JavaScript at different points in time triggered by events such as onload or onclick, and renders the DOM and all its contained images to the screen.

A tool like dynaTrace AJAX Edition captures all this additional activity, showing where and when time is spent in the browser on JavaScript execution, rendering, or waiting for asynchronous network requests. We also see page events like onLoad or onError:

Timeline of all Browser Activities

Looking at this timeline view of the same Google Maps request now tells us that the browser started rendering the initial HTML document after 2 seconds. Throughout the download of the embedded objects the browser rendered additional content. The onLoad event was triggered after 4.8 seconds - the time when the browser had finished building the initial DOM of the web page, including all referenced objects (images, CSS, ...). The execution of main.js - the last JavaScript file downloaded - caused roughly 2 seconds of JavaScript execution time, driving CPU usage up and triggering additional network downloads and DOM manipulations. The high CPU utilization indicates that the browser was not very responsive to mouse or keyboard input, since JavaScript almost exclusively consumed the processor. The DOM manipulations performed by that JavaScript were rendered once execution completed (at 7.5s and 8s).

So what is the perceived end user performance?

I believe there are different stages of perceived performance and perceived response time.

The First Impression of speed is the time it takes to see something in the browser's window (Time To First Visual). We can measure it by looking at the first rendering (drawing) activity. A detailed description of browser rendering and the inner workings of the rendering engine can be found in Alois's blog entry about Understanding Browser Rendering.
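In browsers that support the Paint Timing API, the same idea can be approximated without a dedicated tool. The helper below is a minimal sketch: it extracts the first-paint time from a list of performance entries. The entries here are faked so the snippet runs outside a browser; in a real page you would pass `performance.getEntriesByType("paint")`:

```javascript
// Sketch: Time To First Visual from paint timing entries.
// In a browser you would call performance.getEntriesByType("paint");
// the entries below are faked so the example is self-contained.
function timeToFirstVisual(paintEntries) {
  const first = paintEntries.find(e => e.name === "first-paint");
  return first ? first.startTime : null; // ms since navigation start
}

const fakeEntries = [
  { name: "first-paint", startTime: 2000 },
  { name: "first-contentful-paint", startTime: 2150 }
];
console.log(timeToFirstVisual(fakeEntries)); // 2000 (≈ the 2s first rendering above)
```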

The Second Impression is when the initial page is fully loaded (Time To OnLoad). This can be measured via the onLoad event, which the browser triggers when the DOM is fully loaded - meaning the initial document and all embedded objects have been loaded.
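As a sketch, this time can be derived from navigation timestamps. The helper works on plain numbers, so in a browser you could feed it `performance.timing`; the values below are invented epoch milliseconds:

```javascript
// Sketch: Time To OnLoad = loadEventEnd - navigationStart.
// In a browser: timeToOnLoad(performance.timing).
// The timestamps below are hypothetical epoch milliseconds.
function timeToOnLoad(timing) {
  return timing.loadEventEnd - timing.navigationStart; // ms
}

const fakeTiming = { navigationStart: 1000000, loadEventEnd: 1004800 };
console.log(timeToOnLoad(fakeTiming)); // 4800 (≈ the 4.8s onLoad above)
```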

The Third Impression is when the web site actually becomes interactive for the user (Time To Interactivity). Heavy JavaScript execution that manipulates the DOM can make the web page non-interactive for the end user. This is often caused by expensive CSS selector lookups (check out the blogs about jQuery and Prototype CSS Selector Performance) or by dynamic elements like JavaScript menus (check out the blog about dynamic JavaScript menus).
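One hedged way to approximate Time To Interactivity without tool support is to watch for main-thread stalls: schedule a frequent timer and record how late each tick fires, since long-running JavaScript delays the timer. The sketch below contains only the pure analysis step so it runs anywhere; the tick timestamps are invented:

```javascript
// Sketch: detect main-thread stalls from timer tick timestamps.
// A timer scheduled every `interval` ms fires late whenever long-running
// JavaScript blocks the main thread; the largest delay approximates the
// longest non-interactive stretch. Tick times below are hypothetical.
function longestStall(tickTimes, interval) {
  let worst = 0;
  for (let i = 1; i < tickTimes.length; i++) {
    const delay = (tickTimes[i] - tickTimes[i - 1]) - interval;
    if (delay > worst) worst = delay;
  }
  return worst; // worst-case ms the main thread was blocked
}

// Ticks every 100ms, with one 2s JavaScript block between the 3rd and 4th tick:
const ticks = [0, 100, 200, 2300, 2400];
console.log(longestStall(ticks, 100)); // 2000
```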

Let's look at a second example and identify the different impression stages. The following image shows a page request to a product page on a very popular online retail store:

Three Impression Phases

The initial page content is downloaded rather quickly and rendered to the screen within the first second (First Impression). It takes a total of about 3 seconds for some of the initial images that make up the page's initial content to load (Second Impression). Heavy JavaScript that manipulates the DOM then makes the page unresponsive for about 10 seconds, also delaying the onLoad event, after which the page delay-loads most of its images. The user therefore sees some content early on (mostly text from the initial HTML) but must wait another 10 seconds until the remaining images are delay-loaded and rendered by the browser (Third Impression). Due to the high CPU usage and DOM manipulations the page is also not very interactive during that time, leaving the end user with a poor perception of the page's performance.

How to Measure? Stopwatch Measuring vs. Tool-Supported Measuring
The idea for this blog post came from talking with performance testing engineers at one of our clients. I introduced them to the dynaTrace AJAX Edition and wondered about a little gadget they had on their table: a stopwatch.

Their task was to measure end-user response time for every build of their new website, in order to verify that the times stayed within defined performance thresholds and to identify regressions from build to build. They used the stopwatch to measure the time it took to load each page and the time until the page became responsive. The manually measured numbers were put into a spreadsheet, which allowed them to verify their performance values.

Do you see the problems in this approach?

Not only is this method of measuring time very inaccurate - especially when precise timings in tenths of a second matter - but every performance engineer also has a slightly different perception of what it means for the site to be interactive. It also adds manual effort, since the timings can only be taken during manual tests.

Automate measuring and measure accurately

The solution to this problem is rather simple. Tools like dynaTrace AJAX Edition capture performance measures such as JavaScript execution, rendering time, CPU utilization, asynchronous requests, and network requests. This works not only for manual tests but also in an automated test environment. Letting a tool do the job eliminates the inaccuracy of manual timing and the subjective perception of performance.

When using dynaTrace AJAX Edition as in the examples above, all performance-relevant browser activities are automatically captured, enabling us to determine the times of the three impression stages. The blog article "Automate Testing with Watir" shows how to use dynaTrace AJAX Edition in combination with automated testing tools. The tool can also export captured data to XML or to spreadsheet applications like Excel, supporting automated regression analysis across different web site versions and builds.
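Once the measures are exported per build, the regression check itself is simple. Here is a sketch - metric names and threshold values are invented - that flags builds whose measured stage times exceed defined thresholds:

```javascript
// Sketch: automated regression check over exported per-build measurements.
// Metric names and threshold values below are hypothetical.
const thresholds = {
  timeToFirstVisual: 1500,    // ms
  timeToOnLoad: 4000,         // ms
  timeToInteractivity: 6000   // ms
};

function findRegressions(build, limits) {
  return Object.keys(limits)
    .filter(metric => build.metrics[metric] > limits[metric])
    .map(metric =>
      `${build.name}: ${metric} = ${build.metrics[metric]}ms (limit ${limits[metric]}ms)`);
}

const build42 = {
  name: "build-42",
  metrics: { timeToFirstVisual: 1200, timeToOnLoad: 4800, timeToInteractivity: 5900 }
};
console.log(findRegressions(build42, thresholds));
// [ 'build-42: timeToOnLoad = 4800ms (limit 4000ms)' ]
```

Run against every build's exported numbers, a check like this turns the stopwatch-and-spreadsheet process into an automated pass/fail gate.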

Conclusion
Using tools like dynaTrace AJAX Edition for Internet Explorer, YSlow or PageSpeed for Firefox, or DevTools for Chrome enables automated web site performance measurement in both manual and automated test environments. Continuously measuring web site performance in the browser lets you always focus on end user performance, which in the end determines how successful your website will be.

More Stories By Andreas Grabner

Andreas Grabner has been helping companies improve their application performance for 15+ years. He is a regular contributor within Web Performance and DevOps communities and a prolific speaker at user groups and conferences around the world. Reach him at @grabnerandi
