Edge computing has grown over the past several years to become one of the most important trends in IT. It is increasingly viewed as part of digital transformation, and is linked with other trends such as the internet of things (IoT), analytics and cloud computing. But, as with those trends, there is no precise definition – and often much hype – about what edge computing is.
A simple definition of edge computing is that some processing and decision-making takes place at the edge of the network, close to where data is generated, rather than everything being centralised in the datacentre or the cloud.
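To make that definition concrete, here is a minimal sketch of the pattern in Python. All the names, thresholds and functions are hypothetical stand-ins, not from any particular product: a device reads data locally, makes a simple decision at the edge, and forwards only the readings that matter to a central cloud endpoint, rather than streaming everything upstream.

```python
# Illustrative sketch of edge-side processing; names and thresholds
# are hypothetical, not taken from any specific platform.

import json
import random
import time

TEMP_THRESHOLD_C = 80.0  # only readings above this are sent upstream


def read_sensor() -> float:
    """Stand-in for a real local sensor read (e.g. a temperature probe)."""
    return random.uniform(60.0, 100.0)


def send_to_cloud(event: dict) -> None:
    """Stand-in for an upload to a central datacentre or cloud endpoint."""
    print("UPLOAD:", json.dumps(event))


def edge_loop(samples: int = 10) -> None:
    for _ in range(samples):
        reading = read_sensor()
        # The decision is made locally, at the edge: routine readings
        # are dropped (or could be aggregated) instead of being shipped
        # to the cloud, saving bandwidth and reducing latency.
        if reading > TEMP_THRESHOLD_C:
            send_to_cloud({
                "event": "over_temperature",
                "value_c": round(reading, 1),
                "ts": time.time(),
            })


if __name__ == "__main__":
    edge_loop()
```

The design choice this illustrates is the essence of the definition above: the filtering logic lives on the device, so only exceptional events cross the network, while a purely centralised design would upload every reading for the cloud to sift through.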