How can telcos become more relevant enablers of AI?


The rise of AI presents a major opportunity for service providers. However, they need to proactively understand and address customers’ long-term needs. This means getting involved early in their customers’ thinking and planning around AI – and this article looks at how providers can do that effectively.

Telcos should proactively understand and address long-term customer AI needs

Artificial intelligence (AI) is not a new technology. However, recent developments in foundation models and generative AI (GenAI), coupled with significant infrastructure investment, have rapidly advanced its capabilities and visibility. Every major tech vendor now positions AI at the centre of its roadmap. AI will undoubtedly impact every aspect of society and enterprises in all sectors.

STL Partners’ Telco generative AI adoption tracker indicates that telcos are keen and early adopters of AI. Initially, this has been focused on internal efficiencies, such as automating network operations, deploying customer service chatbots and supporting customer-facing functions. More recently, some are beginning to explore adjacent, non-connectivity services – AI-as-a-service (AIaaS) and GPU-as-a-service (GPUaaS) – such as tailored and integrated AI services for enterprises (e.g., Orange’s Live Intelligence). However, the use of AI by end users, even by enterprises, has not yet significantly changed their connectivity needs (no surge in traffic, no need for higher speeds, no tightening of SLA requirements, no demand for more diverse physical resilience); therefore, service providers (SPs) have not yet needed to adapt their offerings. Until now.

Figure 1: 37% of telcos’ AI implementations concern AIaaS to enterprises

Source: STL Partners, Telco generative AI adoption tracker, June 2025

Service providers are increasingly asking what AI’s long-term impact will be in terms of customer needs and how their services will meet the associated customer expectations and requirements. The indications are that AI adoption will soon be far more disruptive for SP services than it has been to date, centred on three main elements:

• The emergence of agentic AI
• The opportunity to offer AI inferencing at the edge, in particular for small language models (SLMs)
• A surge in sensor data associated with new AI analytics capabilities

A key driver is the emergence of agentic AI, whereby autonomous systems pursue goals and act independently based on intent. Unlike humans, these agents interact with systems continuously and at machine speed, generating dynamic and unpredictable data flows. They do not require humans to initiate a process but can respond to external triggers. Depending on where inference and reasoning occur, these systems place very different demands on network infrastructure. They also interact with one another (ideally without human intervention or control), which means that their actions must be highly dependable to prevent erratic and uncontrollable system behaviour.

• Even centralised agent architectures may require data aggregation from distributed sources, significantly increasing upstream and interconnect traffic, and elevating security risk.
• In contrast, decentralised agents operating on device or at the edge shift load onto access networks and increase east-west traffic within local domains. In both cases, inference may be triggered dynamically or batched, causing unpredictable traffic spikes.
• Multi-agent environments introduce an additional layer of complex, real-time coordination. These systems rely on ultra-low latency not just for data transfer but also for the high-frequency control signals underpinning distributed decision-making. Ensuring resilience, observability and dynamic prioritisation of these lightweight flows is critical to prevent systemic failure (see the sketch after this list).
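To make the contrast concrete, the short Python sketch below simulates how event-triggered, machine-speed agent activity produces far spikier load than steady, human-paced demand. All rates, trigger probabilities and burst sizes are illustrative assumptions chosen for the example, not measurements of real agentic workloads.

```python
# Toy simulation: trigger-driven agentic traffic vs steady human-paced requests.
# All rates, probabilities and burst sizes are illustrative assumptions, not measurements.
import random

random.seed(42)
SECONDS = 3600  # simulate one hour, one value per second


def human_series(rate_per_s: float = 2.0) -> list[int]:
    """Human-initiated requests: roughly Poisson-like, with a stable average rate."""
    return [sum(random.random() < rate_per_s / 100 for _ in range(100)) for _ in range(SECONDS)]


def agent_series(base_rate_per_s: float = 0.5,
                 trigger_prob: float = 0.01,
                 burst_size: int = 200) -> list[int]:
    """Agent traffic: low background chatter, plus external triggers that fan out
    into bursts of machine-speed calls (inference, tool use, agent-to-agent coordination)."""
    series = []
    for _ in range(SECONDS):
        load = sum(random.random() < base_rate_per_s / 100 for _ in range(100))
        if random.random() < trigger_prob:          # an external event fires
            load += random.randint(burst_size // 2, burst_size)
        series.append(load)
    return series


for name, series in (("human-paced", human_series()), ("agentic", agent_series())):
    mean, peak = sum(series) / len(series), max(series)
    print(f"{name:12s} mean {mean:5.1f} req/s | peak {peak:4d} req/s | peak/mean {peak / mean:5.1f}x")
```

In this toy setup the two series have similar average load, but the peak-to-mean ratio of the agentic series is far higher – precisely the property that complicates capacity planning and makes dynamic prioritisation of these flows so important.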

Additionally, the rise of SLMs as a smaller-footprint (and more specialised) alternative to large, centralised foundation models introduces new architectural dynamics. SLMs can be deployed at the edge or on premises to minimise latency, meet regulatory or data sovereignty requirements, and sidestep the power supply and space constraints of centralised data centres. This architectural shift creates a potential opportunity for telcos to offer distributed compute infrastructure for SLM inference, especially at sites with an existing edge presence or at network aggregation points. It also places an additional constraint on telcos’ network architecture choices: they need to act decisively or risk missing out. Other players with a sufficiently distributed footprint could offer inferencing at the edge: hyperscalers, content delivery networks (CDNs), or emerging AI-native infrastructure providers with a nimbler go-to-market model.
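As a purely illustrative sketch of this trade-off, the Python below encodes a hypothetical placement policy that chooses between on-premises, telco edge and central cloud inference based on a workload’s latency budget, data-residency constraint and model size. The site tiers, round-trip times and capacity limits are assumptions invented for the example, not real product parameters.

```python
# Hypothetical inference placement policy: pick the lowest-latency-capable tier
# that satisfies residency and model-size constraints. All thresholds are assumptions.
from dataclasses import dataclass


@dataclass
class Workload:
    name: str
    latency_budget_ms: float         # end-to-end budget the application can tolerate
    data_must_stay_in_country: bool
    model_params_b: float            # model size in billions of parameters


SITES = [
    # (tier, typical RTT to user in ms, in-country?, max model size it can host in B params)
    ("on_prem_gpu",    2,  True,    8),
    ("telco_edge",    10,  True,   30),
    ("central_cloud", 60,  False, 1000),
]


def place(workload: Workload) -> str:
    for tier, rtt_ms, in_country, max_params_b in SITES:
        if rtt_ms > workload.latency_budget_ms:
            continue
        if workload.data_must_stay_in_country and not in_country:
            continue
        if workload.model_params_b > max_params_b:
            continue
        return tier
    return "no_fit"  # would need re-architecting, e.g. model distillation or a new edge site


print(place(Workload("factory visual inspection", 15, True, 7)))    # -> on_prem_gpu
print(place(Workload("contact-centre assistant", 80, False, 70)))   # -> central_cloud
print(place(Workload("regulated document triage", 40, True, 20)))   # -> telco_edge
```

The point of the sketch is simply that whoever owns the “telco_edge” tier in such a policy captures the workloads that cannot go fully on premises or fully central – which is why the timing of telcos’ architecture choices matters.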

Finally, we foresee that AI (and GenAI in particular) will become a data monster with an insatiable appetite for real-time streamed sensor data. Due to its versatility and the installed base of networked cameras, video data will be captured in ever-larger volumes to support learning and inference. However, compute infrastructure on device or locally on premises will undertake much of the processing, minimising the data flows beyond this.
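A back-of-envelope calculation illustrates the scale involved: even a modest camera estate generates raw volumes that dwarf what would realistically cross the wide-area network once local inference filters it down. All figures below are assumptions chosen purely for illustration.

```python
# Back-of-envelope arithmetic (all inputs are illustrative assumptions) showing why
# local processing keeps most raw video volume off the wide-area network.
CAMERAS = 500            # networked cameras on one enterprise site
BITRATE_MBPS = 4         # per-camera encoded stream
SECONDS_PER_DAY = 24 * 3600

raw_tb_per_day = CAMERAS * BITRATE_MBPS * SECONDS_PER_DAY / 8 / 1e6   # Mbit -> TB
print(f"Raw video generated on site: {raw_tb_per_day:,.1f} TB/day")

# If on-prem/edge inference forwards only metadata, alerts and short clips,
# assume ~0.5% of the raw volume leaves the site (an assumption, not a measurement).
WAN_SHARE = 0.005
print(f"Traffic crossing the SP network: {raw_tb_per_day * WAN_SHARE:,.2f} TB/day")
```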

The impact of these emerging AI capabilities on customer requirements is inherently uncertain and will depend on where AI workloads are processed, the size and architecture of models, data formats, nature of traffic and the broader topology of AI delivery systems. For this reason, telcos must stay embedded in the AI conversation to shape evolving requirements, guide service evolution and remain positioned as critical enablers of next-generation AI systems, not mere observers.

Telcos are missing out by not being plugged into enterprises’ internal AI conversations

Many telcos are not part of today’s AI dialogue for two main reasons:

• Firstly, enterprises are grappling with a host of big challenges when it comes to AI adoption: legal, ethical, organisational, security, skills and culture. Networking is not high on this list and enterprises don’t see telcos as critical AI enablers. With limited visible engagement, telcos are not top of mind for enterprises shaping AI strategies.

• Secondly, telcos themselves are not proactively engaging customers, because their networks aren’t yet feeling the impact. Most of the data flows associated with AI – model training, inference and coordination – occur within data centre LANs or between data centres, bypassing service provider networks entirely (or limiting operators’ contribution to dark fibre and wavelengths). As a result, telcos see only the ‘tip of the iceberg’: the minority of traffic associated with user and sensor connections to AI systems, such as uploading data or receiving output (which, as explained above, has not changed much yet). Some telcos missed out on the data centre interconnect (DCI) market, both in terms of new business and in terms of learning and service evolution.

The risk is missing out on key growth opportunities

Failing to anticipate long-term AI-driven customer needs risks locking telcos out of three major growth opportunities:

1. Elevating the value of core connectivity: Without aligning infrastructure to AI workloads, telcos miss the chance to make communication services smarter and more valuable – e.g., through network-as-a-service (NaaS) models. The result is continued commoditisation of core services.
2. Expanding into adjacent high-growth connectivity domains: If telcos don’t capture opportunities such as DCI and AI-specific networking, others will. Telcos will struggle to grow their share of spend with application providers and enterprise customers if they can’t deliver differentiated, AI-enabling connectivity.
3. Moving beyond connectivity: Without a proactive repositioning, telcos won’t be seen as relevant partners in AI delivery, shutting them out of higher-margin, non-connectivity roles, such as distributed compute, orchestration, AI application lifecycle support, observability or data services.

Therefore, the key question is: what must telcos do, inside and beyond the network, to seize AI’s potential and stay relevant partners in customers’ and application providers’ AI adoption journey?

Telcos must become insightful AI enablers

Operators themselves are early adopters of AI, with first-hand experience deploying AI across operations, customer service and network optimisation.

Figure 2: Operators have deployed GenAI capabilities across a variety of functions

Source: STL Partners, Telco generative AI adoption tracker, June 2025

Many operators also serve customers that are themselves leading in AI adoption and deployment. Even if telcos are not directly involved in such AI projects, they can still extract learnings about evolving AI workload patterns and real-world integration challenges. This could position operators not just as observers but as credible, experience-backed partners – if they actively learn from AI-native customers and translate those insights into differentiated offers. Without this, proximity alone means little.

However, to remain relevant, telcos must do more than adopt AI internally and passively observe external shifts. They have to become essential enablers of AI, underpinning adoption by both enterprise and consumer platforms. This means anticipating customer needs and adapting infrastructure and offerings, including redefining their role within key ecosystems, to meet those needs.

Telcos must:

Anticipate customer AI needs, don’t wait to be asked

Telcos must lead with insight, not react to requests. Expecting customers to articulate their future AI requirements (and to translate these into service provider requirements) is flawed for three reasons:

• Customer uncertainty: Enterprises are still experimenting and their AI roadmaps are fluid. They are unclear about how AI will impact a host of capabilities, and connectivity is far down that list. Customers are, therefore, not reaching out to their existing connectivity providers to explore these impacts.

• Other players are setting the agenda: Hyperscalers, AI-native platforms and integrators are already engaging with customers, shaping architectures and capturing mindshare.

• Adoption is happening in stealth: AI use cases are increasingly embedded within broader platforms – e.g., customer relationship management (CRM), enterprise resource planning (ERP), marketing stacks – offered by independent software vendors (ISVs) and app providers. Connectivity is abstracted away, meaning telcos are bypassed entirely, often without the buyer even realising there’s a telco role to consider.

Telcos need to start owning this conversation – something they’ve rarely done in the past. Historically, a byte was a byte, and the nature of traffic rarely shaped the connectivity offer. But AI changes that: understanding the application and its architecture now matters. Telcos must lead with perspective, not wait for RFPs.

This requires not only deeply understanding AI workload characteristics, latency, compute locality/topology and data movement, but also identifying the true, enterprise-specific pain points and the blockers to solving them. As mentioned, telcos are already among the fastest adopters of AI, applying it to optimise operations, automate customer support and improve network performance. In doing so, many have already confronted and resolved the very challenges that often stall enterprise AI adoption, particularly around compliance, governance and security. This hands-on experience gives telcos a credible voice in the enterprise AI conversation.

The only way for telcos to understand future connectivity needs is to observe enterprise behaviour more closely and track how it changes as AI adoption ramps up. For instance, telcos could segment enterprises by geography, vertical and size, and assess their respective levels of AI maturity. For many businesses, the network is an afterthought; they expect fast, stable and secure connectivity that ‘just works’ to support AI-enabled applications. However, as the technology becomes more embedded in business-critical workflows, connectivity will quietly become a performance differentiator. Telcos must respond with verticalised go-to-market strategies that bundle tailored infrastructure, integration support and compliance-aware capabilities. While some enterprises may evolve into AI-native operators, many will require simple, reliable enablers – and telcos must be ready to support both. Operators’ understanding of traffic flows, device distribution and latency requirements should be leveraged to deliver differentiated and valuable propositions, including capabilities around content provenance, authentication and validation.
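As one hypothetical illustration of that segmentation exercise, the sketch below scores accounts on a simple AI-maturity scale and maps each tier to a go-to-market posture. The signals, weights and tier labels are invented for the example; a real model would be built from survey and account data rather than hard-coded rules.

```python
# Hypothetical AI-maturity scoring for account segmentation.
# Signals, weights and tier labels are illustrative assumptions only.
MATURITY_SIGNALS = {
    "runs_production_ai_workloads": 3,
    "has_dedicated_ai_team":        2,
    "uses_edge_or_private_compute": 2,
    "ai_embedded_in_core_workflow": 3,
}


def maturity_tier(account: dict) -> str:
    """Sum the weights of the signals an account exhibits and map to a GTM posture."""
    score = sum(weight for signal, weight in MATURITY_SIGNALS.items() if account.get(signal))
    if score >= 7:
        return "AI-native: co-create NaaS / edge inference propositions"
    if score >= 3:
        return "Scaling: bundle connectivity with integration and compliance support"
    return "Exploring: lead with reliable, secure 'it just works' connectivity"


example = {
    "vertical": "logistics", "size": "mid-market",
    "runs_production_ai_workloads": True, "has_dedicated_ai_team": False,
    "uses_edge_or_private_compute": True, "ai_embedded_in_core_workflow": False,
}
print(maturity_tier(example))   # -> "Scaling: ..."
```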

In relation to the consumer market and smaller organisations, AI’s impact is indirect but profound – surfacing through AI-optimised content delivery, gaming experiences, caching and cloud applications that use this technology to personalise and streamline user interactions, and increasingly, intelligent assistants. To capitalise on this trend, telcos must embed themselves within these value chains by proactively engaging with platform providers, developing solutions optimised for AI content delivery and device-specific performance, and focusing on monetising quality of experience (QoE) rather than just data volume.

Re-architect and complement connectivity for AI enablement

Crucially, enabling AI also demands re-architecting networks to support lower latency, higher resiliency, edge breakout and dynamic service exposure, enabling telcos to offer programmable infrastructure tailored to AI workloads. This is not a marginal change – it’s a business model shift. Telcos must think in terms of comprehensive enablement:

• AI-specific services, not generic bandwidth (e.g., NaaS, slice-as-a-service tuned to inference needs, optical routing that minimises IP ‘hops’) – a hypothetical slice request is sketched after this list
• Integration capability, including orchestration and edge-cloud coordination
• Co-creation, especially for high-value verticals (e.g., manufacturing, logistics, healthcare).
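To give a flavour of the first point, the sketch below shows what an intent-based, inference-tuned slice request might look like if expressed as a simple payload. The field names, values and billing model are hypothetical assumptions, not an existing standard; in practice such a request would be expressed through an operator’s API exposure layer (for example, CAMARA- or TM Forum-style APIs), and the exact schema would differ.

```python
# Hypothetical, intent-based slice request tuned to edge-inference traffic.
# Field names, values and the billing model are illustrative assumptions only.
import json

slice_request = {
    "intent": "edge-inference-transport",
    "sla": {
        "one_way_latency_ms": 10,        # device-to-edge-inference-site budget
        "availability_pct": 99.99,
        "jitter_ms": 2,
    },
    "traffic_profile": {
        "pattern": "bursty",             # agent-triggered rather than steady streaming
        "peak_uplink_mbps": 500,
        "sustained_uplink_mbps": 50,
    },
    "constraints": {
        "data_residency": "in-country",
        "breakout": "local-edge",        # keep inference traffic off the central core
    },
    "billing": "per-inference-session",  # value-based rather than per-gigabyte
}

print(json.dumps(slice_request, indent=2))
```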

Lessons from cloud: Act before AI leaves you behind

The cloud transition offers a clear cautionary tale for telcos. When cloud infrastructure first emerged, many operators viewed it as just another demand driver for their existing core connectivity business but not directly transformative for their services. However, by fundamentally changing enterprise IT architectures, shifting application hosting models, data locality and security boundaries, cloud also redefined the connectivity services required to support the new IT architecture. During this shift, some telcos missed several critical opportunities.

Some operators were slow to respond as enterprise workloads shifted to the cloud. Instead of adapting their offerings, they persisted with legacy MPLS-based services, even as this model became increasingly mismatched to the new traffic patterns and performance needs of cloud-centric architectures. Enterprises responded by sourcing internet access from ISPs and overlaying it with SD-WAN solutions from non-telco vendors. In failing to reframe their role from bandwidth providers to enablers of secure, performant cloud access, telcos ceded both relevance and revenue to more agile players.

The outcome was that telcos lost control over value chains they once anchored, with pricing power eroded and strategic influence diminished. Today, cloud architectures shape application performance, data sovereignty and even network design, often without telco involvement. If operators repeat these mistakes in the AI era – underestimating its impact, failing to act early and clinging to legacy commercial models, outdated product portfolios and inflexible network architectures – they risk further marginalisation.

Early signs are already evident. One example is the DCI market, where most telcos were slow to act. Companies such as Megaport and Lightstorm quickly capitalised, especially in fast-growing markets such as India, bypassing established telcos entirely in building high-performance, AI-ready infrastructure. Telcos’ inability to capture this emergent value pool exemplifies the risk of delayed action.

From signals to substance: STL Partners’ next step in decoding AI’s impact for SPs

AI will reshape all digital infrastructure – and telcos should not wait to be impacted. They need clarity now on what’s real, what’s hype and where the genuine inflection points will be. That’s what STL Partners aims to uncover next.

To support telcos, we are launching a dedicated research programme focused on translating AI hype into actionable insight, building on our ongoing research into how AI can deliver meaningful outcomes for the telecoms industry. Starting from the challenges outlined above, we aim to quantify how AI will impact telco networks in practice – translating the order-of-magnitude growth anticipated in AI workloads into its material impact on telco infrastructure. Even if only a tiny percentage of AI data flows extends beyond the device, LAN or DCI and travels across telco networks, a small share of a very large number is still a large number.
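The arithmetic behind that statement is simple, as the sketch below shows; both inputs are placeholders chosen to illustrate the shape of the calculation, not forecasts.

```python
# Illustrative arithmetic only: both inputs are placeholder assumptions, not forecasts.
total_ai_data_movement_eb_per_month = 100   # assumed overall AI-related data movement
share_crossing_sp_networks = 0.02           # assume just 2% leaves the device/LAN/DCI domain

sp_network_eb_per_month = total_ai_data_movement_eb_per_month * share_crossing_sp_networks
print(f"AI traffic carried on SP networks: {sp_network_eb_per_month:.0f} EB/month")
# -> 2 EB/month in this toy example: a small share of a very large number is still large.
```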

To do this, we are conducting an interview programme with enterprise application providers, connectivity infrastructure specialists and telecom thought leaders to build a deeper picture of AI’s true impact on SPs. This will be complemented by a global enterprise survey to quantify the scale, variation and commercial implications of AI adoption.

The findings will be published in a comprehensive whitepaper, supported by interactive webinars, videos and visual content to help telcos and their partners navigate the transition from connectivity providers to AI enablers.

Kindly supported by: Cisco

Kuba Smolorz

Senior Consultant

Kuba is a Senior Consultant at STL Partners, specialising in AI while bringing broad expertise across next-generation connectivity and infrastructure to assess its impact on telco operations and B2B revenue growth. He has led projects for a diverse range of companies, from major Tier-1 operators to technology startups, delivering market forecasting to prioritise opportunities, shaping product and GTM strategies, and facilitating customer workshops. Kuba holds a BSc in Biochemistry from the University of Bristol.

Jonas Topp-Mugglestone

Consultant

Jonas is a Consultant at STL Partners, specialising in data centres and M&A.
