

AI has undoubtedly taken the spotlight this year, with its popularity steadily on the rise. However, rapid advancements in AI pose new requirements for edge computing infrastructure, driving demand for dense, compute-intensive hardware at the edge.
Following the introduction of ChatGPT, AI is now centre stage. AI's versatility has been a crucial factor in its success, from simplifying everyday manual tasks to serving industrial applications and beyond. Some AI applications come with specific requirements, including ultra-low latency, substantial compute power for model training, and large-scale data collection. Edge computing, with its ability to serve these needs, offers a compelling way to meet the infrastructure requirements of such AI workloads.
What is the role of edge computing in AI?
Edge AI – a term for the use of edge infrastructure for AI development and deployment – can deliver substantial speed and reliability improvements, low latency for mission-critical applications, and cost-effective solutions. These factors can be crucial for the delivery and training of AI models.
The future of edge AI infrastructure: GPUs versus CPUs
The debate over GPUs versus CPUs is one crucial aspect of this landscape. CPUs are the most commonly used processors in edge infrastructure deployments today, known for their suitability across a multitude of use cases and compute requirements. However, CPUs may struggle to handle heavier workloads, especially in the demanding field of AI, prompting discussions about a potential shift towards greater GPU usage.
GPUs are gaining traction because of their ability to handle intensive workloads, especially in applications like computer vision-driven video analytics. They also excel at the deep training of AI models, a crucial capability for AI development. However, GPUs come at a higher cost than general-purpose CPUs, making them challenging to use at scale. Careful, strategic planning will be required from those looking to deploy GPUs at the edge to ensure that these expensive assets do not end up under-utilised.
Figure 1: Advantages and disadvantages of CPUs versus GPUs
With CPUs and GPUs each offering different advantages, the choice between them will depend on the specific application requirements in the context of edge AI. For tasks that prioritise lower power consumption and versatility, CPUs may be the better choice thanks to their cost-effectiveness. However, if an application demands high-performance AI processing, especially for deep learning models, GPUs are likely the preferred option.
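To make this trade-off concrete, the short Python sketch below shows one common pattern for edge inference deployments: prefer a GPU when the node has one, and fall back to the CPU otherwise. It assumes PyTorch is installed, and the model is a hypothetical stand-in for a real vision workload rather than any specific production system.

import torch
import torch.nn as nn

def select_device(prefer_gpu: bool = True) -> torch.device:
    # Prefer a CUDA-capable GPU for compute-heavy workloads such as
    # deep learning inference; fall back to the CPU, which remains the
    # versatile, cost-effective default on most edge hardware.
    if prefer_gpu and torch.cuda.is_available():
        return torch.device("cuda")
    return torch.device("cpu")

# Hypothetical stand-in for a vision model deployed at the edge.
model = nn.Sequential(
    nn.Conv2d(3, 16, kernel_size=3, padding=1),
    nn.ReLU(),
    nn.AdaptiveAvgPool2d(1),
    nn.Flatten(),
    nn.Linear(16, 10),
)

device = select_device()
model = model.to(device).eval()

frame = torch.randn(1, 3, 224, 224, device=device)  # stand-in for a camera frame
with torch.no_grad():
    logits = model(frame)
print(f"Inference ran on {device}; output shape: {tuple(logits.shape)}")

A pattern like this lets a single software image serve both hardware profiles, so an operator can mix GPU-equipped and CPU-only edge nodes without maintaining separate deployments.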
The question of whether GPUs will replace CPUs as the preferred infrastructure at the edge was raised in a recent survey by STL Partners. In the survey, 43% of respondents thought GPUs would be used for AI/ML workloads at the edge, while 39% chose CPUs, showing that the market remains split.
This should perhaps come as no surprise, partly because of the advancements made by large chipset players, notably Intel, who have produced AI-specific CPUs that can operate at the edge and handle high compute demands. These developments begin to blur the fundamental differences between CPUs and GPUs.
To read more about STL’s analysis of the role of edge computing in enabling AI workloads, see Edge AI – How AI is sparking the adoption of edge computing.
Edge computing market overview
This 33-page document provides a summary of insights from our edge computing research and consulting work.
50 edge computing companies to watch in 2025
As always, this list includes a range of companies, from start-ups to established players in the ecosystem. This year, we’ve asked companies to provide more information on their product development roadmaps, with a particular focus on AI-enablement features.
An insight into the future of AI-RAN
Discover insights from our recent interview with Dr. Alex Jinsung Choi on the AI-RAN Alliance’s vision, the drivers of RAN evolution, and how AI-RAN tackles key industry challenges.
Edge computing at MWC 2025: AI is the trigger
Edge computing was present across the Fira this year, though not as the headline act. Instead, it appeared in its rightful place as a key enabler, deeply woven into the discourse surrounding AI monetisation.