From Cloud to Edge: Edge Computing

Rehan Meer
2 min read · Sep 3, 2023

Edge Computing is a computing paradigm in which data is processed on-site or near its source, reducing the need to ship it to a remote data center. By minimizing long-distance client-server communication, it cuts both latency and bandwidth usage.

Fig. 1. IoT & Computing

Data is the lifeblood of every modern business, yet challenges such as limited bandwidth, latency, and unexpected network interruptions can hinder operations. Edge Computing is how businesses are coping with these challenges.

To understand Edge Computing and how it works, it’s important to explore its three key components.

1. Edge devices

These are the devices that sit at the edge of the network and generate data. They are equipped with an array of technologies to provide streamlined data handling: built-in storage, processing capabilities, and connectivity options. Examples include IoT sensors, smart cameras, and smart wearable devices.
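To make the idea concrete, here is a minimal sketch, in Python with hypothetical sensor values, of how an edge device might process readings locally and forward only a compact summary upstream instead of every raw sample:

```python
# Minimal sketch: an edge device aggregates raw sensor readings locally
# and transmits only a summary record, cutting bandwidth use.
# All names and values here are illustrative, not a real device API.

def summarize_readings(readings):
    """Reduce a batch of raw readings to a compact summary."""
    return {
        "count": len(readings),
        "min": min(readings),
        "max": max(readings),
        "mean": sum(readings) / len(readings),
    }

# Raw temperature samples collected on the device (hypothetical values)
raw = [21.4, 21.6, 22.1, 35.0, 21.5]

# Instead of transmitting five values, the device sends one summary record
summary = summarize_readings(raw)
print(summary)
```

The same pattern scales: the more aggregation happens on the device, the less data crosses the network.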

2. Edge networks

An Edge network is a network configuration tailored to offer both connectivity and computational resources for Edge devices. Its primary purpose is to facilitate data processing and storage at the Edge rather than relying on a centralized data center or cloud infrastructure. Typically, an Edge network comprises multiple Edge devices interlinked with one another.
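A simple way to picture why this configuration helps: a client request can be served by whichever node answers fastest. The sketch below, with made-up latency figures, shows the routing decision in a few lines of Python:

```python
# Minimal sketch: route a request to the lowest-latency node in an
# edge network instead of a distant central data center.
# Node names and latencies are illustrative assumptions.

EDGE_NODES = {"edge-eu": 12, "edge-us": 85, "central-dc": 140}  # round-trip ms

def pick_node(latencies):
    """Pick the node with the lowest round-trip latency."""
    return min(latencies, key=latencies.get)

print(pick_node(EDGE_NODES))  # the nearby edge node wins
```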

3. Edge infrastructure

Edge infrastructure encompasses the hardware and software components essential for Edge Computing, which include Edge devices, Edge networks, and other supporting systems. This covers physical components such as servers, storage, and networking gear, along with the software and platforms that run on them. Examples of Edge infrastructure include gateway devices, Edge data centers, and specialized Edge AI chips.
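As an illustration of how a gateway device fits into this picture, here is a hedged Python sketch (device names and the anomaly threshold are assumptions) of a gateway that handles routine events on-site and escalates only anomalies to the cloud:

```python
# Minimal sketch of an edge gateway: it receives events from local devices,
# handles routine ones on-site, and forwards only anomalies to the cloud.
# Device IDs and the threshold are illustrative assumptions.

CLOUD_QUEUE = []   # stands in for an uplink to a central data center
LOCAL_LOG = []     # stands in for on-site storage

def gateway_handle(device_id, temperature, threshold=30.0):
    """Process an event at the edge; escalate only when it looks anomalous."""
    event = {"device": device_id, "temp": temperature}
    if temperature > threshold:
        CLOUD_QUEUE.append(event)   # rare case: send upstream
    else:
        LOCAL_LOG.append(event)     # common case: keep at the edge
    return event

for dev, t in [("sensor-1", 21.9), ("sensor-2", 34.5), ("sensor-1", 22.3)]:
    gateway_handle(dev, t)

print(len(LOCAL_LOG), len(CLOUD_QUEUE))  # most traffic never leaves the site
```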

Edge Computing, then, is a decentralized method of data processing that brings computing resources closer to where data originates. It is rapidly emerging as the future of computing technology, with a projected 75% growth in its current usage anticipated by 2025.
