Never mind the Olympics. You don't have to be a 'tween' to know that the Disney original movie Camp Rock was a sensation this summer, drawing some nine million viewers.
Wanting to capitalise on the success, the network decided to put the full-length movie on Disney.com for one day, along with interactive features like the ability to chat with other viewers online, take polls and answer trivia questions.
With a window of 60 days to get the movie on the site, Disney's Interactive Media Group relied on a combination of virtualisation, load balancing and content delivery networks (CDNs). About 25 servers were provisioned for different parts of the architecture to balance the load of the anticipated increase in traffic, says Bud Albers, CTO of the Interactive Media Group, in Seattle.
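The article doesn't detail how traffic was spread across those servers, but the basic pattern — rotating requests through a pool of virtual machines that can grow with demand — can be sketched as follows. The server names and pool size here are illustrative, not Disney's actual configuration:

```python
import itertools

class RoundRobinBalancer:
    """Distribute incoming requests across a pool of (virtual) servers."""

    def __init__(self, servers):
        self._servers = list(servers)
        self._cycle = itertools.cycle(self._servers)

    def next_server(self):
        # Each request is handed to the next server in rotation.
        return next(self._cycle)

    def add_server(self, server):
        # Scaling up: a newly provisioned virtual machine joins the rotation,
        # with no purchasing or physical deployment cycle required.
        self._servers.append(server)
        self._cycle = itertools.cycle(self._servers)

# Hypothetical pool of three virtual machines.
pool = RoundRobinBalancer([f"vm-{n:02d}" for n in range(1, 4)])
assignments = [pool.next_server() for _ in range(6)]
```

The point of the sketch is the `add_server` step: capacity changes by editing the rotation, not by racking hardware, which is the agility Albers and Fritz describe.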
The group had done virtualisation projects before, but never of this magnitude, Albers says. The strategy was to be able to scale server capacity up and down, depending on demand, he says. Deploying a physical infrastructure was not a viable alternative. "There wasn't time to do it any other way," Albers says, since Disney had to gather requirements, features and content and then come up with a production schedule.
The goal, adds Adam Fritz, principal software engineer for the Interactive Media Group, was to ensure capital and operating efficiencies as well as the ability to remain agile by relying upon virtual machines. "By taking a pool of equipment and dedicating it to the event, and moving it around instead of having to go through a deployment and purchasing cycle … makes us more agile," says Fritz, also in Seattle.
Other sites, including ABC News and ESPN, are hosted out of the same facility, "so we were able to spread our load and use 25 different machines that weren't at a peak time. Basically by doing that, we were able to hold the peak load and there were no incremental capital costs," says Albers.
Disney.com also used its "XD" features, a dynamically integrated environment incorporating video, games, images and community elements that is part of the site. To provide a good user experience, the group relied upon two CDNs, Akamai Technologies and Limelight Networks, to help meet the volume across multiple types of content delivery. Instead of having user requests come in to Disney's group, they were sent to nearby CDN nodes, says Fritz.
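The article doesn't specify how requests were steered to CDN nodes, but the general idea — answering each request from the lowest-latency edge node rather than the origin — can be sketched with a hypothetical latency table. The region names, node names and latency figures below are invented for illustration, not Akamai's or Limelight's actual topology:

```python
# Hypothetical round-trip latencies (ms) from user regions to candidate nodes.
# "origin" stands in for Disney's own hosting facility.
EDGE_LATENCY = {
    "us-west": {"akamai-sea": 12, "limelight-lax": 25, "origin": 80},
    "us-east": {"akamai-nyc": 10, "limelight-mia": 30, "origin": 75},
}

def route_request(region):
    """Pick the lowest-latency node for a user's region instead of the origin."""
    nodes = EDGE_LATENCY[region]
    return min(nodes, key=nodes.get)
```

Because the origin is never the cheapest choice in the table, heavy assets such as the film's video streams stay on the edge nodes, which is the offloading effect Fritz describes.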
"Using the CDN for large-scale assets like video delivery is the stated practice today no matter who you're talking to,' says Albers. "CDN is the key component in that value chain. What we're adding to that is ability to measure and optimise the CDN as that market and service commoditise."
According to Disney's internal tracking, the site reached a daily record with 3.17 million visitors, increasing traffic to Disney.com by 37 percent on June 23. It received 860,000 video plays for the one-day event.
Albers says the day-long event allowed the group to prove "beyond a shadow of a doubt" the scalability of a virtualisation scheme, which will continue to be a big advantage for future events where huge spikes in online traffic are anticipated, such as the presidential election. "Going forward we're now very well positioned to leverage growing this environment."
Melanie Posey, a research director at consultancy IDC, agrees. Major events with mass appeal require a flexible architecture and the ability to reallocate existing server capacity. "That's the advantage of virtualisation technology," she says, pointing to the combination of load balancing, CDNs and corralling underused servers. "Having the ability to reallocate server capacity that already exists is a lot easier and more time efficient for the company that's providing the content," says Posey, "than going out and getting a physical server and installing it and configuring it."