The basic difference between edge computing and cloud computing lies in where the data processing takes place.
Cloud computing uses remote servers hosted on the Internet, rather than a local server or a personal computer, to store and process data.
The data is sent, stored, and processed at centralized data centers.
Hence statement 1 is correct.
Edge computing is computing that is done at or near the source of the data, instead of relying on the cloud at a handful of centralized data centers to do all the work.
Hence statement 2 is correct.
It doesn’t mean the cloud will disappear.
It means the cloud is coming to you.
Edge computing enables data to be analyzed, processed and transferred at the edge of a network.
The idea is to analyze data locally, close to where it is generated and stored, in real time and with minimal latency, rather than send it far away to a centralized data center.
So whether you are streaming a video on Netflix or accessing a library of video games in the cloud, edge computing allows for quicker data processing and content delivery than cloud computing.
Hence statement 3 is not correct.
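To make "quicker" concrete, here is a rough, assumed comparison of round-trip times in Python; the figures (roughly 100 ms to a distant data center versus roughly 10 ms to a nearby edge node) are illustrative assumptions, not measurements.

```python
# Rough, assumed round-trip times; real figures vary with distance and network conditions.
CLOUD_RTT_MS = 100.0  # request to a distant centralized data center (assumed)
EDGE_RTT_MS = 10.0    # request to a nearby edge node (assumed)

saving_per_request_ms = CLOUD_RTT_MS - EDGE_RTT_MS
speedup = CLOUD_RTT_MS / EDGE_RTT_MS

print(f"network delay saved per request: {saving_per_request_ms:.0f} ms")
print(f"edge round trip is roughly {speedup:.0f}x shorter than the cloud round trip")
```

Under these assumptions, every request served from the edge shaves about 90 ms of network delay per round trip, which is why edge delivery feels quicker for latency-sensitive workloads such as streaming.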
At the moment, existing Internet of Things (IoT) systems perform all of their computations in the cloud, using data centers.
Hence statement 4 is not correct.
Edge computing, on the other hand, manages the massive amounts of data generated by IoT devices by storing and processing it locally.
That data doesn’t need to be sent over a network as soon as it is processed; only the important data is sent. An edge computing network therefore reduces the amount of data that travels over the network.
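As an illustration of that filtering pattern, here is a minimal Python sketch of an edge node, assuming a hypothetical temperature sensor and an illustrative alert threshold; only readings flagged as important are forwarded, while the rest stay on the device.

```python
import json
import random

ALERT_THRESHOLD_C = 80.0  # illustrative threshold: only "important" readings are forwarded


def read_sensor() -> float:
    """Stand-in for a real sensor driver; returns a temperature in Celsius."""
    return random.uniform(20.0, 100.0)


def send_to_cloud(record: dict) -> None:
    """Placeholder for the uplink; a real edge node would upload this to a data center."""
    print("forwarding to cloud:", json.dumps(record))


def edge_loop(samples: int = 20) -> None:
    forwarded = 0
    for _ in range(samples):
        reading = read_sensor()
        # Processed locally at the edge: classify the reading before deciding to send it.
        record = {"temperature_c": round(reading, 1), "alert": reading > ALERT_THRESHOLD_C}
        if record["alert"]:        # most readings never leave the device,
            send_to_cloud(record)  # so far less data travels over the network
            forwarded += 1
    print(f"forwarded {forwarded} of {samples} readings")


if __name__ == "__main__":
    edge_loop()
```

In a real deployment the send_to_cloud placeholder would post the record to a data-center endpoint, but the decision about what is worth sending happens entirely at the edge.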