Driven by a strong belief in the future of software-defined data center technology, Bank of America is steering its IT toward almost total virtualization, from the data center to the desktop.
The technology does for the entire data center what virtualization did for servers: It decouples computing resources from the underlying hardware. Its goal is to let users create, expand and contract computing capability virtually, quickly and efficiently.
The software-defined data center is not yet a reality. But there are enough parts of the technology in place to convince David Reilly, Bank of America's global infrastructure executive, that it is the future.
"The software-defined data center is going to dramatically change how we provide services to our organizations," said Reilly. "It provides an opportunity for, in effect, the hardware to disappear.
"We think it's irresistible, this trend," said Reilly.
The direction Bank of America sets for IT is important. The financial services industry, as a whole, is a fast adopter and can build markets for new technologies. It also tends to have very large IT operations.
Goldman Sachs, for instance, employs about 10,000 IT professionals, representing more than 25% of its workforce. Bank of America won't disclose the size of its IT staff, but it employs around 250,000 people overall.
The financial services industry is expected to spend $430 billion globally on IT this year, or more than 20% of the expected worldwide IT spend of $2.14 trillion, according to IDC.
A shift to software-defined IT operations could have multiple impacts.
Bank of America has been a heavy user of proprietary and special-purpose hardware, but, "we see that, increasingly, there will be no differentiation in the hardware," said Reilly.
The bank is using more commodity hardware, which means x86-based systems, and decreasing its use of proprietary systems, said Reilly. "You can imagine for many hardware partners that's a little bit of a frightening moment. The key for infrastructure is going to be the software that defines it, not the physical hardware layers."
The components that will make up the software-defined data center are arriving.
VMware, for instance, this month announced general availability of its Virtual SAN (vSAN) product, which delivers a major part of what will make up a software-defined data center. The software, which is part of VMware's vSphere kernel, "decouples dependencies" that exist between the application and the underlying infrastructure, said Alberto Farronato, VMware director of product marketing for storage and availability.
That decoupling lets IT administrators treat storage as a resource pool instead of something attached to a specific device. It should help prevent over-provisioning, or buying more storage capacity than an application needs, as well as reduce the need for specialized skill sets to manage various systems.
Farronato says it enables "policy-based" management of resources, where you create a virtual machine and associate a policy to it. That policy captures the requirements and automatically brings the resources needed to meet service levels.
Reilly, who isn't disclosing which vendor products the bank is using or testing, explains the policy approach this way: An application will arrive with a "manifest" that details the needs of the application, such as how much storage and compute capacity it requires. Those resources will be returned to the "pool" when the application no longer needs them.
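The manifest-and-pool model Reilly describes can be sketched in a few lines of code. This is a minimal, hypothetical illustration only: the `ResourcePool` class, the manifest fields and the resource names are assumptions for the sake of the example, not any vendor's actual API.

```python
# Hypothetical sketch of manifest-driven provisioning from a shared
# resource pool, as described in the article: an application arrives
# with a "manifest" of its needs, draws resources from the pool, and
# returns them when it no longer needs them.

class ResourcePool:
    """Tracks a shared pool of compute and storage capacity."""

    def __init__(self, cpu_cores: int, storage_gb: int):
        self.cpu_cores = cpu_cores
        self.storage_gb = storage_gb

    def allocate(self, manifest: dict) -> None:
        """Reserve the resources an application's manifest requests."""
        if (manifest["cpu_cores"] > self.cpu_cores
                or manifest["storage_gb"] > self.storage_gb):
            raise RuntimeError("pool cannot satisfy manifest")
        self.cpu_cores -= manifest["cpu_cores"]
        self.storage_gb -= manifest["storage_gb"]

    def release(self, manifest: dict) -> None:
        """Return resources to the pool when the application is retired."""
        self.cpu_cores += manifest["cpu_cores"]
        self.storage_gb += manifest["storage_gb"]


pool = ResourcePool(cpu_cores=64, storage_gb=10_000)

# The application "arrives with a manifest" detailing its needs.
app_manifest = {"name": "example-app", "cpu_cores": 8, "storage_gb": 500}

pool.allocate(app_manifest)
print(pool.cpu_cores, pool.storage_gb)   # 56 9500

pool.release(app_manifest)
print(pool.cpu_cores, pool.storage_gb)   # 64 10000
```

In a real software-defined data center the "policy" would also capture service levels such as availability and performance, and the platform, not the application owner, would enforce it; the sketch only shows the bookkeeping idea of drawing from and returning to a pool.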
Building a software-defined data center that treats networks, storage and servers as computing resource pools will allow the bank to quickly change its computing environment to meet business needs, whether it's in response to growth, shrinkage, new markets or regulatory changes, said Reilly.
This virtual model also extends through the bank's enterprise. While it still has a fair number of thick clients in place, Bank of America is increasingly moving to a virtual desktop model. Offering virtual sessions also supports the bank's BYOD policies.
There's still work ahead, however.
VMware's vSAN manages all the storage inside a server, but does not manage external storage arrays in the same way. Eventually, that capability will arrive as well, said Farronato.
There is a belief that software-defined technologies will significantly reduce IT costs, but cost reduction has always been a relative concept in IT. While mainframe administration costs, for instance, have declined over the years thanks to management improvements, IT costs in other areas, such as mobile management, have increased.
Some believe that a software-defined data center "will do for data centers what robots have done for a lot of manufacturing processes," said Charles King, an analyst at Pund-IT.
King says the second half of this year may reveal new products that can do just that.
The software-defined data center "could portend changes in the enterprise and IT industry that are tectonic," said King.
Patrick Thibodeau covers cloud computing and enterprise applications, outsourcing, government IT policies, data centers and IT workforce issues for Computerworld. Follow Patrick on Twitter at @DCgov or subscribe to Patrick's RSS feed. His e-mail address is [email protected]