FAQs
What is edge computing?
Edge computing brings computation and data storage closer to where data is produced and consumed. In traditional cloud computing, data is sent to large, centralised data centres for processing. With edge computing, the processing happens on or near the device, at the ‘edge’ of the network where the data is created.
What does edge computing do?
By processing data closer to its source, edge computing reduces latency, improves responsiveness, and uses network bandwidth more efficiently. This matters most for workloads that need real-time or near-real-time processing, such as Internet of Things (IoT) devices, self-driving cars, augmented reality, and other latency-sensitive applications.
Edge computing vs cloud computing: what are the differences?
Edge computing processes data near its source, minimising latency and reducing round trips to centralised cloud servers. Cloud computing, in contrast, relies on centralised data centres for processing, which introduces latency from data transmission.
Cloud computing is better suited to applications with less stringent latency requirements and extensive storage needs. Edge computing excels where bandwidth must be conserved: it processes data locally and transmits only the essential information to the cloud, as the sketch below illustrates.
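To make the pattern concrete, here is a minimal Python sketch of edge-side aggregation. Everything in it is illustrative rather than taken from any real system: the simulated sensor readings, the alert threshold, and the send_to_cloud placeholder are all hypothetical stand-ins.

```python
# A minimal sketch of edge-side aggregation. The sensor values, the
# alert threshold, and the cloud upload are hypothetical placeholders.
import json
import random
import statistics

TEMP_ALERT_C = 80.0  # hypothetical alert threshold, for illustration only


def read_sensor_batch(n: int = 100) -> list[float]:
    """Stand-in for reading n raw samples from a local sensor."""
    return [random.gauss(65.0, 5.0) for _ in range(n)]


def summarise(samples: list[float]) -> dict:
    """Reduce raw samples to the 'essential information' sent upstream."""
    return {
        "count": len(samples),
        "mean_c": round(statistics.mean(samples), 2),
        "max_c": round(max(samples), 2),
        "alerts": sum(1 for s in samples if s > TEMP_ALERT_C),
    }


def send_to_cloud(summary: dict) -> None:
    """Placeholder for an upload to a cloud ingestion endpoint."""
    print("uploading:", json.dumps(summary))


if __name__ == "__main__":
    raw = read_sensor_batch()      # the 100 raw samples stay on the device
    send_to_cloud(summarise(raw))  # only a small summary crosses the network
```

The design point is the shape of the data flow, not the specific statistics: raw readings never leave the device, so the network carries a few bytes of summary instead of the full sample stream, which is the bandwidth saving described above.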