40% increase in edge deployment of network resources forecast by 2022

New research paper from IDC and Limelight Networks highlights the importance of preparing content delivery for edge compute, enabling business leaders to leverage the agility of network resources.

  • Wednesday, 17th February 2021, by Phil Alsop

A 40% increase in edge deployment of network resources is forecast by 2022, according to the research paper ‘Outlook for Edge Services’ from IDC and Limelight Networks. By next year, 60% of all network resources will be deployed at remote edge or service provider locations, up from 20% in 2020, allowing business leaders to leverage the agility of their network resources.


The report reinforces the trend for content processing to move increasingly to the edge, which can deliver high-quality video experiences with minimal buffering while reducing costs.

The report also explores the benefits that industry professionals expect edge to add to their business:

  • 45% believe edge will bring increased productivity or efficiency
  • 44% believe edge will bring increased security and compliance
  • 42% believe edge will bring faster decision making
  • 40% believe edge will bring improved customer relations or customer experience

Steve Miller-Jones, VP of Strategy, Industry & Partnership at Limelight Networks, commented:

“In the last few years, we have seen advances in both the range of edge services and their adoption within a variety of content and enterprise workflows. In 2021, we can expect the variety and scope of customization capabilities to grow, helping enterprises meet high end-user expectations for accessing and consuming content.

“The network edge makes it possible to affect data as it flows towards end users and devices, and to control the flow of data from those devices. Shipping large quantities of raw data towards cloud providers is expensive and can introduce significant latency.

“This is why the network edge will be used more for data processing and decision making. It allows access to multiple compute locations close to end users and devices, therefore removing the potential for latency without incurring additional operational overheads. The net result is improved content performance in a cost-efficient way, with built-in security and scale.”