WAN optimisation or a 'real' upgrade?

WAN optimisation seems to make sense, but IT managers are faced with the burning question: Is it really worth the investment?

Solutia is mostly optimising HTTP traffic for its SAP deployment, although it's also using de-duplication techniques for file transfers, caching data from files transmitted in a previous session so it doesn't have to cross the WAN again. De-duplication is the same technology that companies such as NetApp use when backing up virtualised servers to speed operations and use less storage.

With WAN optimisation, if a business user sends a large PowerPoint file to a branch office and the recipient changes just one slide, the entire file is not retransmitted from the branch office back to the user -- only the changed data is.
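A rough way to picture what the appliances are doing: each end keeps a store of data chunks it has already seen, keyed by hash, and only the chunks the far side doesn't recognise travel across the WAN; everything else goes as a short reference. The Python sketch below is a minimal illustration of that idea under simplifying assumptions (fixed 64KB chunks, an in-memory cache); it is not any vendor's actual implementation.

```python
import hashlib

CHUNK_SIZE = 64 * 1024  # fixed-size chunks for simplicity; real appliances often use variable-size chunking


def chunks(data: bytes):
    """Split a byte string into fixed-size chunks."""
    return [data[i:i + CHUNK_SIZE] for i in range(0, len(data), CHUNK_SIZE)]


def plan_transfer(data: bytes, remote_hashes: set):
    """Build a transfer plan: a hash reference for every chunk the far side
    already holds, raw bytes only for chunks it has never seen."""
    plan = []
    for c in chunks(data):
        digest = hashlib.sha256(c).hexdigest()
        if digest in remote_hashes:
            plan.append(("ref", digest))        # already cached remotely
        else:
            plan.append(("raw", digest, c))     # genuinely new data
            remote_hashes.add(digest)           # the far side caches it after this transfer
    return plan


# First transfer: every chunk is new. Sending back a lightly edited copy: mostly references.
hashes: set = set()
original = b"slide data " * 200_000              # ~2.2MB toy "presentation"
plan_transfer(original, hashes)                  # first transfer, all chunks go as raw data
edited = original + b" one changed slide"        # small edit before sending it back
plan = plan_transfer(edited, hashes)
print(sum(e[0] == "raw" for e in plan), "of", len(plan), "chunks retransmitted")  # 1 of 34
```

In this toy version, re-sending a presentation in which only one slide changed costs mostly short hash references plus the handful of chunks that actually differ, which is the behaviour described above.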

In some ways, the Solutia example reveals how the term "WAN optimisation" is a misnomer, because optimisation often implies acceleration: If you optimise a sports car, it goes faster. Even though the appliance makers usually tout performance statistics -- increases of up to 32 times, in some cases -- the products aren't actually changing the speed of the network.
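The distinction is easy to see with arithmetic. The link's bit rate is fixed; what the optimiser changes is how many bytes need to cross it. A back-of-the-envelope sketch, with made-up figures rather than anything measured at Solutia or quoted by the vendors:

```python
def transfer_seconds(payload_mbytes: float, link_mbit_per_s: float, reduction: float = 1.0) -> float:
    """Seconds to move a payload over a link, assuming the optimiser shrinks
    the traffic on the wire by `reduction` (e.g. 4.0 means a 4:1 reduction)."""
    megabits_on_wire = payload_mbytes * 8 / reduction
    return megabits_on_wire / link_mbit_per_s


# Made-up figures: a 100MB file over a 5Mbit/sec. link.
print(transfer_seconds(100, 5))       # 160.0 seconds with no optimisation
print(transfer_seconds(100, 5, 4.0))  # 40.0 seconds with a 4:1 reduction; the link itself is no faster
```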

Yet the compression alone is often worth the investment, which typically runs just a few thousand dollars for an appliance that optimises a 3Mbit/sec. to 5Mbit/sec. connection. (Optimising a 100Mbit/sec. pipeline can cost well over $10,000 for just one appliance, however.) That was the case at Activision, a well-known game publisher, which has three district offices, 13 development studios (some in countries such as Japan and Australia), several sales and marketing offices ranging in size from six to 200 people each, and a corporate office in Santa Monica, Calif.

The issue Activision faced wasn't directly related to application performance -- although the company recently conducted its first WAN optimisation test from the US to London for an Oracle 11i application. Instead, the company that made Guitar Hero 3 and Call of Duty 4 was struggling with network latency.

According to Thomas Fenady, a senior IT director, the process of developing games was hampered by some harsh realities of WAN networking. With game builds ranging in size from 4GB to 12GB, employees in remote offices had to wait about eight hours to receive the latest files, sometimes watching movies or heading home for the day until they could start working on the latest version of the game.

"We blamed the problem on two issues," says Fenady. "One is just the speed of light, which we could do nothing about. Even over a 35Mbit/sec. or 45Mbit/sec. connection from Santa Monica to Dublin, we saw latency go from 100ms to 180-200ms to 250ms or higher.

"The other issue was TCP inefficiencies [where the nature of the Transmission Control Protocol causes the transfer speed to throttle down as congestion occurs]. TCP throttling makes a connection slow down when you lose packets. With WAN optimisation, those same game builds now transfer in about 15 minutes instead of eight hours." The transfer times were reduced so dramatically because WAN optimisation mitigates the congestion and latency problems that trigger TCP throttling.
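The two culprits Fenady names can be put into rough numbers. Propagation delay sets the round-trip time, and a single TCP flow's throughput is bounded, per the well-known Mathis approximation, by MSS / (RTT × √loss). The sketch below plugs in illustrative figures only (they are not Activision's measurements) to show how a 45Mbit/sec. link can still deliver a build at only a few megabits per second:

```python
from math import sqrt


def mathis_throughput_mbit(mss_bytes: int, rtt_s: float, loss_rate: float) -> float:
    """Rough upper bound on a single TCP flow's rate (Mathis et al. approximation):
    throughput ≈ MSS / (RTT · sqrt(loss))."""
    return (mss_bytes * 8 / (rtt_s * sqrt(loss_rate))) / 1e6


# Illustrative figures only: 1460-byte segments, 200ms round trip, 0.03% packet loss.
per_flow = mathis_throughput_mbit(1460, 0.200, 0.0003)
print(f"~{per_flow:.1f} Mbit/sec per flow")    # ~3.4 Mbit/sec, far below the 45Mbit/sec. link rate

# At that rate a 12GB game build takes roughly the eight hours described above.
hours = 12 * 8 * 1024 / per_flow / 3600
print(f"~{hours:.1f} hours for a 12GB build")  # ~8.1 hours
```

Cutting the effective round-trip time and the loss-induced throttling, rather than making the line itself faster, is what turns those hours into minutes.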

Fenady noted that although network latency wouldn't ultimately have derailed game development (the company would have found a work-around to make its release dates), the optimisation makes the development process more fluid and less of a waiting game.

He says the big challenge with optimisation -- and the one that separates one vendor from another -- is that it's relatively easy to optimise a T1 or T3 connection, but a line running at 45Mbit/sec. or higher is harder to handle. The faster networks move data so rapidly that it's difficult to analyse on the fly which traffic can be compressed, which is encrypted and which can be de-duplicated.
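One general technique for making that triage cheap at high line rates (a common approach, not something Fenady or any particular vendor attributes to their products) is to estimate the entropy of a small sample of each flow: encrypted or already-compressed payloads look nearly random and aren't worth compressing again, while text-like payloads are good candidates. A minimal sketch:

```python
import math
import os
from collections import Counter


def entropy_bits_per_byte(sample: bytes) -> float:
    """Shannon entropy of a byte sample; 8.0 means indistinguishable from random."""
    if not sample:
        return 0.0
    total = len(sample)
    return -sum((n / total) * math.log2(n / total) for n in Counter(sample).values())


def worth_compressing(sample: bytes, threshold: float = 7.5) -> bool:
    """Skip compression for high-entropy payloads, which are likely encrypted
    or already compressed and would waste CPU time at high line rates."""
    return entropy_bits_per_byte(sample) < threshold


print(worth_compressing(b"GET /reports HTTP/1.1\r\nHost: erp.example.com\r\n" * 50))  # True: text-like
print(worth_compressing(os.urandom(4096)))  # False: statistically random, like encrypted traffic
```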

"Recommended For You"

WAN optimisation takes off Silver Peak to offer free trials of WAN optimisation products on a virtual marketplace