US HealthCare.gov demonstrates lack of basic website optimisation

The problematic US HealthCare.gov website appears to fail the most fundamental of performance optimisation tests, an analysis has found.


HealthCare.gov, a key piece of the Affordable Care Act, often referred to as ‘Obamacare’, is a website from the US Department of Health and Human Services (HHS) that allows uninsured US citizens to shop for new health insurance plans. However, the $630 million website has been plagued with problems since its launch on 1 October 2013, including outages, slow page loads and users not being able to complete applications.

Over a month since the website’s launch, an analysis by application performance company Compuware has found that standard website optimisation practices that have been around for years have still not been applied. As a result, the site forces users to download far more content than necessary, which continues to slow it down.

In the simplest example, the homepage hosts a large background image that Compuware was able to compress by more than 70 percent without affecting its quality.

“It is this ‘blurry’ background image that is 350KB in size. I downloaded it, opened and re-saved without losing any quality with my free Paint.NET program. Now the size is 99KB - that’s a reduction of more than 70 percent,” Andreas Grabner, technology strategist for Compuware and lead of the Compuware APM Centre of Excellence team, wrote in a blog.
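The lossless saving Grabner achieved by simply re-saving the file shows how much headroom a poorly compressed image leaves. PNG, for instance, uses the same DEFLATE algorithm as Python’s standard-library zlib, so the principle can be sketched illustratively (this operates on arbitrary bytes, not the actual HealthCare.gov image):

```python
import zlib

def recompress(deflated: bytes, level: int = 9) -> bytes:
    """Losslessly re-compress a DEFLATE stream at a higher effort level.

    Decompresses the input back to its raw bytes, then deflates again at
    the requested level. The data is unchanged; only the encoding improves,
    which is essentially what re-saving an image in a better encoder does.
    """
    raw = zlib.decompress(deflated)
    return zlib.compress(raw, level)

# Simulate a hastily encoded asset: compressed at the fastest, weakest level.
payload = b"HealthCare.gov background image bytes " * 500
poorly_encoded = zlib.compress(payload, 1)
re_encoded = recompress(poorly_encoded, 9)
```

The round trip is exact: decompressing `re_encoded` yields the original payload, typically in fewer bytes than the level-1 encoding.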

Not enough testing

Grabner told ComputerworldUK that HealthCare.gov is, and should be able to act like, any other e-commerce website. The only difference is that the customer - the US citizen - has no competitor to turn to when they experience problems on the site.

“It should have been done the same way as other companies do as they prepare for the holiday season. They should have done the load balancing. They should have taken the time to do the testing,” he said.

“Proper testing was not done, I assume because of time constraints.”

Systems integrator QSSI, the contractor hired to oversee the entire website, and one of the primary contractors, CGI Federal, have admitted that the site was inadequately tested before launch.

The team spent about two weeks testing the site, when CGI “would have liked to have months” to do so, according to Cheryl Campbell, a senior vice-president for the company.

Missed merging and minification opportunities

Another observation Grabner made was that there are a “huge” number of JavaScript and CSS (Cascading Style Sheets) files on the site, which could have easily been merged to reduce the number of files being downloaded.

For example, Grabner found 65 JavaScript files on the myAccount page of the site alone. Moreover, the registration page loaded 55 individual JavaScript files and 11 individual CSS files - which, under common web performance optimisation practice, would have been merged.

“Instead of 55, they could have had two or three,” Grabner said, adding that the jQuery JavaScript library that the website uses even comes with prepackaged merged files.
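The merging Grabner describes is, at its simplest, concatenating the files so the browser makes one request instead of dozens. A minimal Python sketch (file names are hypothetical, and real bundlers additionally handle dependency order and source maps):

```python
from pathlib import Path

def merge_assets(paths: list[Path], out_path: Path) -> int:
    """Concatenate several JS or CSS files into one bundle.

    Each source is prefixed with a comment naming its origin, and a
    trailing semicolon guards against automatic-semicolon-insertion
    problems when JavaScript files are joined. Returns the file count.
    """
    with out_path.open("w", encoding="utf-8") as bundle:
        for p in paths:
            bundle.write(f"/* --- {p.name} --- */\n")
            bundle.write(p.read_text(encoding="utf-8"))
            bundle.write("\n;\n")
    return len(paths)
```

Fifty-five requests collapsing to one is the whole point: each extra file costs a round trip, and on a high-latency connection those round trips dominate the page load.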

He suggested that a lack of oversight of the overall website development, of automation processes and of time may have led developers to download what they needed without first checking whether it had already been downloaded elsewhere, missing opportunities to merge files.

As well as merging, files could have been ‘minified’, Grabner said.

“Let’s take a look at one particular file - jQuery.DataTables.js - with a size of 440KB,” he illustrated in his blog.

“Putting it through a publicly available minifier reduces the file size to 83KB - that’s a reduction of more than 80 percent!”
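A minifier’s core job can be sketched naively in Python. This is an illustration only: real tools, such as the one Grabner used, parse the JavaScript properly, handle comment-like sequences inside strings, and also shorten identifiers.

```python
import re

def naive_minify_js(source: str) -> str:
    """Very naive JavaScript minification for illustration.

    Strips block comments, full-line comments and blank lines, then
    collapses runs of spaces and tabs. Assumes no string literals
    contain comment markers - a real minifier parses the source.
    """
    source = re.sub(r"/\*.*?\*/", "", source, flags=re.S)      # block comments
    source = re.sub(r"^\s*//[^\n]*$", "", source, flags=re.M)  # full-line comments
    source = re.sub(r"\n\s*\n", "\n", source)                  # blank lines
    source = re.sub(r"[ \t]+", " ", source)                    # collapse spaces
    return source.strip()
```

Even this crude pass shrinks commented, indented source noticeably; production minifiers get reductions like the 80 percent Grabner saw by rewriting the code itself.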

Continued performance monitoring

Grabner is continuing to monitor the performance of HealthCare.gov and has created a web page to map the results.

The performance map highlights in red the US states experiencing an average response time of more than eight seconds, and in green those seeing four seconds or less.

There are two possible explanations for the longer response times, Grabner said. One is that broadband connectivity in those states may simply be poorer.

However, Grabner suggested HealthCare.gov itself might be responsible for the second.

“It could be they don’t leverage CDN (content delivery network) services,” he said. “[CDN would allow them to] distribute the content geographically local to the end user [rather than sending the data from servers based thousands of miles away].”
