As improbable as it sounds, it’s going to take some fog to clear up the cloud.
The cloud is great. An amazing place, in fact, to store and process enormous amounts of data, and to extend those capabilities beyond a user’s on-premises borders. About 10 years into the cloud era, however, it’s painfully clear that explosive growth in data volumes and device connections is pushing the cloud to its limit.
“Connecting more and different kinds of things directly to the cloud is impractical,” according to Cisco’s definitive whitepaper on fog computing.
How fog works
As Cisco’s paper explains, the traditional cloud model adds latency because data is sent all the way back to the cloud for analysis.
But in fog computing, algorithms prioritize data. The most time-sensitive data — requiring immediate processing and action — is analyzed much closer to the edge of the network. Data that is less time-sensitive is sent “to the cloud for historical analysis and longer term storage,” the Cisco report states.
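The triage Cisco describes can be sketched in a few lines. This is an illustrative Python sketch, not code from the Cisco paper; the `Reading` type, field names, and the one-second cutoff are assumptions chosen to mirror the prioritization the report outlines.

```python
from dataclasses import dataclass

@dataclass
class Reading:
    source: str      # e.g. a sensor or camera ID (hypothetical field)
    value: float
    deadline_s: float  # how soon this data must be acted on, in seconds

def route(reading: Reading) -> str:
    """Send time-critical data to a nearby fog node, the rest to the cloud."""
    if reading.deadline_s < 1.0:
        return "fog-node"  # analyzed close to the network edge
    return "cloud"         # historical analysis and longer-term storage

print(route(Reading("pipeline-sensor", 880.0, 0.2)))  # fog-node
print(route(Reading("thermostat", 21.5, 3600.0)))     # cloud
```

The point of the sketch is the fork itself: everything hinges on how quickly the data must be acted on, not on where it happened to be produced.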
In a blog post (would that be a fog blog?), Lynne Canavan argues fog computing is a necessary tool “for those involved in IoT, 5G, artificial intelligence and virtual reality” because it bridges “the continuum from cloud to things.”
“It distributes compute, communication, control, storage and decision making closer to where the data is originated, enabling dramatically faster processing time and lowering network costs,” wrote Canavan, executive director of the industry group OpenFog Consortium.
Fog for IoT
Besides freeing up bandwidth and reducing latency, fog’s data segmentation enables real-time analysis and response — which is exactly what IoT demands.
In fact, Cisco suggests fog computing is ideal in situations where “it is necessary to analyze and act on the data in less than a second.”
That could be a sensor gauging dangerous pressure levels in oil and gas pipelines, or a surveillance camera detecting an intruder. Cisco researchers write that without fog computing in place, “by the time [IoT] data makes its way to the cloud for analysis, the opportunity to act on it might be gone.”
Fog computing closes that opportunity gap with nodes. Fog nodes can be physical (like switches, servers or routers) or virtual (like virtual machines or virtualized switches).
“Fog nodes … can be deployed anywhere with a network connection: on a factory floor, on top of a power pole, alongside a railway track, in a vehicle or on an oil rig,” according to Cisco.
These nodes can be strategically placed closer to the network edge so data from remote devices can be analyzed quickly, and necessary action can be triggered instantly.
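Returning to the pipeline example above, a fog node sitting next to a pressure sensor might behave something like the following sketch. The threshold, units, and handler names are hypothetical, invented here for illustration; only the division of labor (act locally now, ship history to the cloud later) comes from the source.

```python
PRESSURE_LIMIT = 900.0  # hypothetical safe limit, in psi

def on_sensor_reading(pressure_psi: float, cloud_batch: list) -> str:
    """Act locally on dangerous readings; batch the rest for the cloud."""
    if pressure_psi > PRESSURE_LIMIT:
        return "close-valve"          # immediate local action, no cloud round trip
    cloud_batch.append(pressure_psi)  # uploaded later for historical analysis
    return "logged"

batch = []
print(on_sensor_reading(950.0, batch))  # close-valve
print(on_sensor_reading(400.0, batch))  # logged
```

Because the decision is made on the node, the valve can close in milliseconds; the cloud still sees every routine reading, just not on the critical path.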
Where fog is headed
In a report released about six months ago, the OpenFog Consortium predicts the global fog computing market will grow from $3.7 billion in 2019 to $18.2 billion in 2022. The group expects fog to be adopted earliest in the energy/utilities, transportation and healthcare verticals.
The authors also estimate that by 2022, about one-third of total fog deployments will be Fog-as-a-Service, whereby customers lease fog hardware, software and services from vendors on a subscription basis.
Fog computing development and innovation should get a boost from the new fog technical standard, IEEE 1934, recently released by the IEEE Standards Association (part of the Institute of Electrical and Electronics Engineers).
Yes, the fog is rolling in. Not to obscure the cloud but to make it work that much better.