Edge Computing Is Reshaping the Cloud and How We Compute
Talk to anyone about the biggest trends in computing over the last few years and you’ll inevitably hear about “the cloud.” Though it sounds a bit amorphous, most people have come to realize that cloud computing consists of lots of big data centers, filled with thousands of powerful servers, scattered throughout the world. More importantly, even though they may not understand how it all works, they know that these cloud-based data centers power all the web-based applications and services upon which we’ve become so dependent.
From streaming movies to social media sites to online gaming services, as well as millions of business-focused applications, the cloud has become an immense force in our digital lives.
The Rise of Edge Computing
But despite all that influence, we’re starting to see the rise of a new computing model called edge computing. The basic idea behind edge computing is to take some of the power of cloud computing and bring it closer to the devices located near the end, or edge, of the network. It’s not that cloud computing is going away—far from it. In fact, it’s precisely because the capabilities that cloud computing has enabled are so powerful and so important that there is now a pressing need to spread the cloud’s centralized intelligence out to more places.
In addition, there’s been tremendous growth in the number and variety of smart connected devices connecting to the internet, well beyond traditional PCs and smartphones. These new IoT (Internet of Things) devices range from connected consumer appliances to smart industrial machines and even connected cars. The expanded usage of these devices is placing even more demands on the current cloud infrastructure.
As a result, there’s been a recognition that a new type of computing architecture is necessary to supplement the traditional cloud: hence, edge computing. The fundamental idea behind edge computing is to distribute computing and storage resources across millions, or even billions, of different devices and locations to provide the web of computing support necessary to meet our growing need for digital applications and services.
Edge computing devices can range from drones that are used to inspect difficult-to-reach places, such as large power lines or wind turbines, to autonomous cars to devices in your home, such as smart security cameras. In each of these cases, the devices include powerful compute engines and the requisite storage to support the applications and data being run on (and through) them. In the case of an intelligent surveillance camera, for example, video is stored directly on the device and new types of chips are being used to analyze that video footage to look for objects—without having to send the entire video stream up to the cloud.
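The smart camera example can be sketched in a few lines of Python. This is a hypothetical illustration, not a real camera API: the `Frame` type and the stand-in `analyze_frame` verdict are invented for the sketch, and a real device would run an actual vision model on a local accelerator rather than reading a prelabeled flag.

```python
# Hypothetical sketch of edge-style video analysis: frames are analyzed
# locally, and only the frames flagged as interesting would be uploaded,
# instead of streaming all footage to the cloud.
from dataclasses import dataclass

@dataclass
class Frame:
    frame_id: int
    contains_person: bool  # stand-in for a real on-device model's verdict

def analyze_frame(frame: Frame) -> bool:
    """Placeholder for an on-device vision model (e.g. running on an NPU)."""
    return frame.contains_person

def frames_to_upload(frames):
    """Keep only frames worth sending upstream; the rest stay in local storage."""
    return [f.frame_id for f in frames if analyze_frame(f)]

stream = [Frame(1, False), Frame(2, True), Frame(3, False), Frame(4, True)]
print(frames_to_upload(stream))  # → [2, 4]: only 2 of 4 frames leave the device
```

The design point is the filter itself: the bandwidth-heavy decision of what to send upstream is made on the device, which is exactly the workload shift the edge model describes.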
Why the Move to Edge Computing?
The reasons for the move to distributed edge computing are many, but they can be grouped into a few strategic buckets: speed, capacity, efficiency, cost, and responsiveness. Another example from the home helps illustrate the challenges. Popular voice-based smart speakers are IoT devices that currently depend completely on cloud-based data centers to translate your voice requests into actions. Every time you make a request, the device sends your audio to a cloud-based data center, which interprets the request, decides what to do or what information to send back to you, and then responds. Thankfully, this all happens very quickly, but it’s still not as fast or efficient as having a conversation with another person in the room (or even on the phone).
As the usage of these types of devices grows, there’s a strong possibility that response times could slow, impacting their usability. More importantly, most people would like to start having more natural, multi-part conversations with these devices, rather than making single requests, and that places even more strain on the system. In fact, some have estimated that if all the owners of these devices increased the amount of voice-based interaction they have with them by just a few minutes a day, the resulting impact on data center requirements would be enormous.
That’s part of the reason why there has been so much excitement and momentum building around edge computing. By locating more high-powered computing resources throughout the network, you can reduce latency (delays) in responding to requests, use network bandwidth more efficiently, and respond to more requests at a faster rate.
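The latency argument can be made concrete with a rough back-of-the-envelope comparison of a cloud round trip versus local processing. All the millisecond figures below are made-up assumptions for illustration, not measurements of any real service.

```python
# Hypothetical stage timings (milliseconds) for one voice request.
# These numbers are invented for illustration only.
CLOUD_MS = {"uplink": 40, "speech_recognition": 120, "intent": 30, "downlink": 40}
EDGE_MS = {"speech_recognition": 150, "intent": 30}  # no network round trip

cloud_total = sum(CLOUD_MS.values())  # 230 ms end to end
edge_total = sum(EDGE_MS.values())    # 180 ms end to end

print(f"cloud: {cloud_total} ms, edge: {edge_total} ms")
```

Note that even if on-device recognition is slower per stage (150 ms vs. 120 ms here), removing the network hops can still win overall; and, just as importantly, no audio has to leave the device, which saves bandwidth.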
From the Edge to the Cloud
The implications of the move from centralized, cloud-based computing to a more distributed edge computing model are profound, particularly when it comes to the computing and storage demands that are going to be placed on an enormous range of different devices that live out near the edge. Just as the transition from mainframes to PC-based client-server architectures had an enormous impact on the tech industry (and society overall), so too will the transition from a cloud-based model to an IoT-driven edge computing environment. The effect will be particularly strong on the components necessary to power these edge computing devices.
Instead of relying on the shared resources of large data centers in a cloud-based world, the edge computing world will place more demands on both individual endpoint devices and an entire range of new intermediary devices, such as gateways, edge servers, and other new computing elements being developed to enable a full edge computing environment.
These devices will all need significant built-in computing and storage capabilities to handle some of the applications and workloads that traditionally went straight to the cloud. To be sure, some applications running on these edge computing resources will continue to pass workloads along to the cloud, but the long-term goal is to distribute workloads across many different elements in an edge computing environment.
Companies designing and building these new edge devices need to be cognizant of the kinds of computing and storage requirements that will be necessary to make them effective tools in this new computing world. Speedy flash-based storage will be one of the critical elements to make them as useful as possible, by providing the kinds of throughput speeds that will be needed to function in these demanding new edge computing environments.
Ultimately, the world is moving to a more distributed computing model where cloud-based computing plays a critical but not solitary role in driving forward the capabilities of both connected consumer devices and critical devices for enterprise and industry. This, in turn, will drive increased interest in devices with compute and storage capabilities at the edge of the network.
Changes this big certainly aren’t going to happen overnight, but there’s no question that we’re at the dawn of an exciting new era in the tech industry that’s going to be defined by the concept of edge computing.