What Is Edge Computing, and How Will It Affect the Enterprise?
The answer to the question "What is edge computing?" can help companies better serve both their customers and themselves.
The old saying “everything old is new again” can hold true even in the realm of cutting-edge technologies. There was once a time--a time before the internet was open to every company, educational institution and grandmother with a cell phone--when all processing happened within the computer tower that sat on your desk at the office or at home. A time when “transmitting data” meant copying files from your office workstation to a disk, then copying them onto your home computer. While the days of shuttling data around on floppies and other disks are (hopefully) gone, we are starting to see the return of processing and storing data on the fringes--or the edge--of the internet. In short, we’re seeing the rise of what has come to be called edge computing. But many companies are still asking themselves: What is edge computing?
In the early days of the public-facing internet, we got used to waiting--waiting for pages to load, waiting for responses from websites, waiting just to get onto the internet at all. Today, we rely on the internet for everything from social media and grocery ordering to national defense. Indeed, the internet has grown from novelty to necessity, and it serves as the backbone for cloud computing.
The Edge of the Cloud
The term “edge” refers to the outer boundary of the network--the geographic fringe where distributed computing takes place in today’s cloud technology landscape. In general terms, “cloud” computing comprises arrays of connected servers, clustered to protect against server-level failure. These server clusters are owned by providers such as Microsoft (Azure), Amazon (AWS) and Google (GCP), which operate globally dispersed networks of gargantuan data centers. Spreading these data centers across the globe protects against regional disasters, but it also helps alleviate what is turning into a major problem in data processing and end-user experience: latency.
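To make the latency problem concrete, here is a minimal sketch in Python that times round trips to two endpoints. The hostnames are hypothetical placeholders--one standing in for a nearby edge node, one for a distant data center--so substitute hosts from your own environment to compare.

```python
import time
import urllib.request

# Hypothetical endpoints -- substitute real hosts in your own regions.
ENDPOINTS = {
    "nearby-edge": "https://edge.example.com/health",
    "far-region": "https://far-region.example.com/health",
}

def round_trip_ms(url: str, attempts: int = 5) -> float:
    """Average round-trip time for a simple GET request, in milliseconds."""
    total = 0.0
    for _ in range(attempts):
        start = time.perf_counter()
        urllib.request.urlopen(url, timeout=10).read()
        total += time.perf_counter() - start
    return (total / attempts) * 1000

for name, url in ENDPOINTS.items():
    print(f"{name}: {round_trip_ms(url):.1f} ms")
```

The farther the request has to travel, the higher the number--and that gap is exactly what edge computing sets out to close.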
Not Just About the Public Cloud
Public clouds are not the only source of centralized compute and storage. Almost every business in the developed world has some form of computerized footprint. It may be as simple as a server tucked under the cash register or in a storeroom, or it may be a private cloud composed of server arrays in your company’s office, a dedicated corporate data center, or a co-located facility. Latency is an issue for any company that must ingest significant amounts of data, process it, and return results quickly--whether for split-second decisions or for presentation back to end users. That holds whether you’re talking about large public clouds running on rented hardware, private clouds hosted in your own data center, or a mom-and-pop shop with a single server in the storefront taking orders over the internet.
Edge Computing Defined
Edge computing attempts to reduce latency by moving compute closer to the source of data collection: to the edges of the cloud. That compute may happen on an IoT device itself, as with the systems Nvidia is developing for autonomous cars, or it may back a specific service, as with Apple keeping authentication on-device. For Nvidia and other companies building technology for self-driving cars, overcoming latency is literally a matter of life and death. For Apple, reducing latency not only delivers a better end-user experience--faster unlocking--but also addresses concerns about data privacy violations. By shortening the distance network traffic must travel to return a calculated result, edge computing addresses latency, security, data privacy, and health and safety concerns.
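To illustrate why on-device processing matters for safety-critical work, here is a toy Python sketch of a control loop that makes its braking decision locally rather than waiting on a network round trip. The sensor read and threshold are invented for illustration, not drawn from any vendor’s actual system.

```python
import random
import time

BRAKE_THRESHOLD_CM = 150.0  # invented threshold, for illustration only

def read_distance_cm() -> float:
    """Toy stand-in for an obstacle sensor; a real device would poll hardware."""
    return random.uniform(0.0, 500.0)

def apply_brakes() -> None:
    print("braking: obstacle inside threshold")

def on_device_control_loop(cycles: int = 10) -> None:
    """Make the safety-critical decision locally -- never wait on the network."""
    for _ in range(cycles):
        if read_distance_cm() < BRAKE_THRESHOLD_CM:
            apply_brakes()   # local decision, no round trip to a data center
        time.sleep(0.01)     # roughly a 100 Hz control loop

on_device_control_loop()
```

A round trip to a distant data center can cost tens or hundreds of milliseconds; at highway speeds, that can be the difference between stopping in time and not.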
Processing and storage may still qualify as edge computing even when they occur off-device, so long as they happen close enough to the edge of the cloud--something made possible by the storage and compute devices now proliferating at the cloud’s perimeter. For example, telecom companies, fearing they will be shut out of the cloud economy as providers, are rushing to deploy edge computing hardware near network access points such as cell towers.
How Edge Computing Helps
By placing compute and storage closer to the devices collecting IoT data, you reduce the latency of time-sensitive calculations delivered to end users. In many cases, this data--or at least data at this level of granularity--matters on the edge only for a limited time. Many devices collect information in high-frequency bursts: think telemetry and systems data from autonomous vehicles, or biometric data from exercise and health devices. Each reading may matter to the local device--be it a car or a spin bike in your local fitness center--but the full-frequency stream is rarely worth storing, aggregating, or analyzing at the centralized server tier. Edge computing lets companies deliver near-real-time processing and decision management where it’s needed, without overwhelming central databases or web servers with high volumes of unnecessary data.
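As a sketch of that pattern, the following Python example (with invented function names and a toy sensor) reacts to every raw reading locally but forwards only windowed summaries upstream, keeping the high-frequency stream off the central servers.

```python
import json
import random
import statistics
import time

def read_heart_rate() -> int:
    """Toy stand-in for a high-frequency biometric sensor."""
    return random.randint(60, 180)

def summarize(window: list) -> dict:
    """Reduce a burst of raw readings to the aggregates worth keeping."""
    return {
        "count": len(window),
        "min": min(window),
        "max": max(window),
        "mean": round(statistics.mean(window), 1),
        "ts": time.time(),
    }

def send_upstream(summary: dict) -> None:
    # Placeholder for an HTTPS POST to a central service.
    print("upload:", json.dumps(summary))

def edge_loop(window_size: int = 100, windows: int = 3) -> None:
    for _ in range(windows):
        window = [read_heart_rate() for _ in range(window_size)]
        # React locally to every reading...
        if max(window) > 170:
            print("local alert: heart rate spike")
        # ...but ship only the summary upstream.
        send_upstream(summarize(window))

edge_loop()
```

The central tier receives a handful of compact records instead of hundreds of raw readings, while the edge still reacts to every data point in near-real time.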
Is Edge Computing Right for You?
If your enterprise collects data from sensors on the periphery of the cloud and finds itself drowning in extraneous data, or if you’re unable to deliver results to end users at the speeds they expect or need, then exploring the “edge” may be in your best interest.