Another shift in the technology landscape appears to be underway which has the potential to dramatically alter the way data is created and processed.
Edge computing is, in essence, tied to the evolution of the internet of things (IoT). As various industries push to connect previously dumb objects to the internet, the way in which these objects talk to one another will change.
For some uses, low latency is crucial – think of a connected car that needs to decide to avoid an object in the road – and so computing will need to take place at the outer reaches, or the 'edge', of the network, nearer the objects themselves.
An edge device could be anything that provides an entry point to a network – routers, switches and WAN access devices, for example. They will act as miniature data centres, able to communicate with one another – to form a 'fog' – and typically will be used for handling urgent data.
For example, think of applications in automotive, manufacturing, fleet management, emergency and disaster response, where it is simply not prudent to transmit data back to a central data centre.
This will enable 'fog computing', a relatively recent term used to describe decentralised computing happening at the level of the object which requires data processing, rather than pinging complex requests to data centres that can be hundreds of miles or more away.
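The routing decision at the heart of fog computing can be sketched in a few lines. This is a minimal, hypothetical example – the gateway name, threshold and message labels are invented for illustration – showing an edge node acting on urgent readings locally while batching everything else for the cloud:

```python
# A minimal sketch of the fog idea: a hypothetical edge gateway inspects
# each sensor reading locally and only acts immediately on what is urgent,
# instead of shipping every reading to a distant data centre.

URGENT_THRESHOLD = 90.0  # illustrative temperature limit, in degrees C

def handle_reading(sensor_id: str, value: float) -> str:
    """Decide at the edge whether a reading needs immediate local action."""
    if value >= URGENT_THRESHOLD:
        # Latency-critical path: act here, at the edge, in milliseconds.
        return f"edge-alert:{sensor_id}"
    # Non-urgent path: queue the reading for later upload to the cloud.
    return f"cloud-batch:{sensor_id}"

print(handle_reading("pump-7", 95.2))  # acted on locally
print(handle_reading("pump-7", 40.1))  # deferred to the central cloud
```

The point is not the threshold logic itself but where it runs: the decision happens next to the sensor, so the round trip to a data centre hundreds of miles away is skipped entirely for the cases that matter most.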
What the fog?
In November 2015, a coalition of vendors and academics active in the internet of things field joined forces to create the OpenFog Consortium. The founding members are all heavyweights in tech: Cisco, ARM, Dell, Intel, Microsoft, and Princeton University.
The group's stated aim is to "accelerate the development of fog technologies through the development of an open architecture, core technologies including the capabilities of distributed computing, network, and storage as well as the leadership needed to realise the full potential of IoT."
Analyst group 451 Research estimates that the market for fog computing will exceed $18 billion by 2022, driven by the utilities, transport, healthcare and industrial markets in particular. A working group created by the IEEE Standards Association is currently working with OpenFog to hash out standards for fog computing relevant to advanced IoT, 5G, AI and more.
Semiconductor and IP business Rambus recently published a succinct blog that charts the boom in data demands from telcos in running mobile networks, the upcoming transition from 4G to 5G, and how this all bleeds into the edge.
The leap from 4G to 5G connectivity is going to be a dramatic one for edge computing. It could enable technologies such as microcells with very low latency rates, for example, and 5G will also create opportunities to build new networks – not just the public networks for mobile use we are familiar with, but also private sub-nets that can be run within a single organisation.
And all of these things will help push forward more efficient consumer technologies. AT&T, which is investing heavily in edge, describes it as having the cloud come to you. It will, AT&T says, move "the computation into the cloud in a way that feels seamless – it's like having a wireless supercomputer wherever you go."
An example AT&T uses is running a VR cloud experience on your phone – being able to send and receive commands from thousands of locations near the user should make that experience more seamless than if it were powered by a data centre far away.
In this way, telcos will become even more powerful stakeholders in networking and cloud technologies because they'll be running much of the underlying infrastructure required for edge and fog computing.
A more urgent example is set out by Intel's GM for IoT, Kumar Balasubramanian. If a smart car detects that a driver is about to suffer from a stroke, that kind of data needs immediate attention and can't wait to go to the cloud to be analysed.
"Data has an expiration date, you can't afford to devalue the data before you apply analytics," writes SAS CTO Oliver Schabenberger in a recent blog post.
The industrial edge
General Electric, which is heavily invested in predictive maintenance and industrial applications for the internet, says that edge is taking form now for a number of reasons, including, but not limited to: the lowering cost of compute, cheaper sensors, more power in small footprint devices like gateways and sensor hubs, the enormous swell of big data, and improvements to modern machine learning and analytics.
Imagine a single vehicle in an autonomous fleet, estimated by Intel to generate 40 terabytes of data for every eight hours on the road.
GE says sending all that data to a far-away cloud would be "unsafe, unnecessary and impractical" – this is because it is data that holds most of its value in the short term, and it requires extremely low latency for quick decision-making. The difference could literally be life and death.
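A quick back-of-the-envelope calculation makes GE's point concrete. Taking Intel's figure at face value, here is the sustained data rate a single vehicle would have to push to a remote cloud:

```python
# Rough arithmetic behind Intel's figure of 40 terabytes per eight hours
# of driving, to show why streaming it all to a remote cloud is impractical.

tb_per_shift = 40    # terabytes generated in one eight-hour stint
hours = 8

gb_per_hour = tb_per_shift * 1000 / hours     # 5,000 GB per hour
mb_per_second = gb_per_hour * 1000 / 3600     # roughly 1,389 MB/s sustained

print(f"{gb_per_hour:.0f} GB/hour, ~{mb_per_second:.0f} MB/s sustained")
```

A sustained uplink of well over a gigabyte per second, per vehicle, is far beyond what mobile networks can offer a whole fleet – hence the case for processing most of it at the edge.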
All of this power at the edge should help businesses use IoT to do digital better.
Microsoft expects edge computing to enable, among other things, interoperability between new and legacy devices, better operation under intermittent connectivity, quicker response times, and improved security and compliance.
To prepare for all this change, vendors are starting to dress their product suites as end-to-end options that deliver the whole package for edge.
Dell EMC, for example, has bet big on IoT and edge computing.
At a New York event in October this year, the vendor announced a new IoT division flush with $1 billion in investment over three years.
The company, which recently completed a mega-merger with EMC, will hope to offer almost the full infrastructure for IoT, edge and fog: Dell sells the gateways, Dell EMC covers the analytics and on-premises appliances, and Pivotal could be touted for cloud and app integration.
Old Dell rival HPE also acknowledged that edge computing will be important for its future, with the recently departed Meg Whitman orienting an October analyst meeting in the direction of IoT and edge.
IBM points out that edge computing will likely also lead to the creation of new mesh networks that could be of use in developing neural networks and machine learning. A mesh network is essentially a kind of decentralised network that enables transmission from one device to another without requiring a cell tower.
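The device-to-device relaying that defines a mesh network can be modelled as a simple graph search. In this toy sketch – the device names and links are invented for illustration – a message finds the gateway by hopping between peers, with no central tower involved:

```python
# A toy model of a mesh network: devices relay a message peer-to-peer.
# Device names and link topology here are purely illustrative.
from collections import deque

links = {
    "sensor-a": ["sensor-b"],
    "sensor-b": ["sensor-a", "sensor-c", "gateway"],
    "sensor-c": ["sensor-b"],
    "gateway":  ["sensor-b"],
}

def hops(src: str, dst: str) -> int:
    """Breadth-first search: fewest device-to-device relays from src to dst."""
    seen, queue = {src}, deque([(src, 0)])
    while queue:
        node, dist = queue.popleft()
        if node == dst:
            return dist
        for peer in links[node]:
            if peer not in seen:
                seen.add(peer)
                queue.append((peer, dist + 1))
    return -1  # unreachable

print(hops("sensor-a", "gateway"))  # relayed via sensor-b: 2 hops
```

Real mesh protocols add routing tables, retries and encryption on top, but the underlying property is the same: the network keeps working as long as some chain of neighbouring devices connects sender and receiver.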
Meanwhile, chip company Nvidia recently unveiled plans for its Jetson TX2 platform, designed to provide powerful AI closer to smart objects at the edge, with the company citing robots, drones and security cameras as examples.
There are other more subtle strands of edge computing that could snowball into benefits for the bottom line: low-energy but highly powered sensors will require less bandwidth, and less energy to ping data about than would be required for relying on a central cloud. That could mean a significant reduction in energy footprints across a whole field of industries, where big data is currently consuming a high amount of energy, despite efforts to create more green data centres.
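The bandwidth saving comes largely from aggregation: an edge node can condense a window of raw readings into one summary, forwarding the details only for anomalies. A minimal sketch, with made-up readings and an invented anomaly threshold:

```python
# A sketch of why edge aggregation saves bandwidth: rather than sending
# every raw reading to the cloud, a hypothetical edge node forwards one
# summary per window plus any anomalous readings.
import statistics

raw_readings = [20.1, 20.3, 19.8, 20.0, 20.2, 35.5, 20.1, 19.9]  # made-up data

# Naive approach: ship every reading upstream.
raw_messages = len(raw_readings)

# Edge approach: one summary message, with anomalies flagged individually.
summary = {
    "mean": round(statistics.mean(raw_readings), 2),
    "max": max(raw_readings),
    "anomalies": [r for r in raw_readings if r > 30],  # illustrative cut-off
}
edge_messages = 1 + len(summary["anomalies"])

print(f"raw: {raw_messages} messages, edge: {edge_messages} messages")
```

Even in this tiny window the message count drops from eight to two; scaled across millions of sensors reporting around the clock, that difference is where the energy and bandwidth savings accumulate.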
The internet of things has understandably carried a perception that it was insecure: some sensor manufacturers had been thought to have raced to get products out to market without baking in security from the get-go.
Now that is less the elephant in the room than it once was. And actually, as edge systems are developed, they might be able to better provide traffic visibility for organisations.
So there are two sides to the coin, comments Keiron Shepherd, senior systems engineer for F5 Networks: "Crucially, the security implications of edge computing must be factored into an organisation's plans to use it," Shepherd says. "What some may not realise is that it has the potential to simplify security management, because the organisation has a clear idea of where the data is coming from and where it's going.
"If everything is being sent to a central data centre or a cloud system, the high volumes of traffic can be hard to monitor for a business that doesn't have enough resource to do so. Cybercriminals can take advantage of this by intercepting data unnoticed."
But, he explains, the nature of edge means that there will also be a wider spread of possible weak links that need to be secured.
"Businesses need to have strong patch management in place that can be replicated quickly and rolled out to the different sensors collecting and sending data," Shepherd says. "There's a misconception that hackers won't go after these specific sensors, but hackers can use holes that haven't been patched as a way of getting into an organisation's network under the radar. What's more, once they can do this for one device, they can often use the same flaw to access data on hundreds, thousands or potentially even millions of others."
There are also potential privacy advantages to keeping data near the device rather than sending it to the cloud.
On the precipice
Analyst house Gartner expects that by 2022, an astonishing 50 percent of enterprise-generated data will be created and processed outside of a traditional data centre or cloud – a stark contrast to the roughly 10 percent mark where it currently sits.
IDC, meanwhile, expects IoT spending to hit $1.35 trillion by 2021, with much of that depending on the underlying infrastructure being developed and rolled out. GE believes the industrial internet market in isolation will be worth $225 billion by 2020, and Accenture expects industrial IoT to heap on $14.2 trillion to the global economy by 2030.
IBM describes edge as the 'next frontier' for cloud, so come what may, it seems that a new way forward for compute is likely to bring with it new ecosystems, fresh strategic thinking, and perhaps entirely new ways of doing business.