This year’s TM Forum Accelerate event highlighted significant advancements in AI-driven automation within the telecommunications sector. Over 200 business leaders, architects, and technology experts gathered for four days of collaboration focused on implementing the Open Digital Architecture (ODA) roadmap to enhance operations across IT and networks. Participants shared insights on aligning strategy, architecture, governance, and execution to facilitate scalable AI deployments, marking a shift from AI experimentation to the development of enterprise-grade systems that deliver tangible business value.
A key initiative showcased during the event was Project Foundation, designed to establish the world’s first AI-Native ODA Canvas Sandbox. This secure, Kubernetes-based environment allows communications service providers (CSPs), hyperscalers, and technology partners to co-develop and test interoperable AI agents based on the TM Forum AI-Native Blueprint. Participants reported substantial progress in defining shared terminology and detailed requirements for Canvas use cases, setting the stage for a reference implementation aimed at the upcoming DTW Ignite event in June 2026 in Copenhagen.
Advancing the AI-Native Blueprint
In parallel with Project Foundation, TM Forum members advanced discussions around the AI-Native Blueprint, which addresses the essential issues of trust, security, governance, lifecycle management, and accountability in moving AI systems into production. Workstreams focused on Model-as-a-Service (MODaaS), Data Product Lifecycle Management (DPLM), and secure agent interactions underscored the necessity of establishing robust security frameworks for widespread AI deployment.
During the sessions, a consensus emerged around the need for practical, actionable AI-native security frameworks embedded in industry standards. Members explored strategies for operationalizing trust in agentic AI, aligning the security, data, and AI domains while using collaboration tools to streamline security approvals. The DPLM discussions examined how data could be treated as a product within an ODA environment, emphasizing ownership, lifecycle controls, and consumer context.
Building Foundations for Autonomous Networks
The event also featured extensive work on Autonomous Networks (AN), emphasizing their operational value. Attendees agreed that achieving scalable Level 4 autonomy requires grounding AN scenarios in ODA components, APIs, and value streams. Benefits of such alignment include reduced service loss, improved mean time to repair (MTTR), and fewer manual interventions, leading to more predictable service performance.
Measurement and benchmarking tools were identified as critical for capturing the business value of autonomy. Participants focused on enhancing AN level assessments through key effectiveness indicators (KEIs) and key capability indicators (KCIs). The goal is to provide objective measures of autonomy’s business impact, such as operational savings and energy efficiency.
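To make the measurement discussion concrete, the sketch below shows how two commonly cited autonomy metrics, mean time to repair and manual intervention rate, might be computed from incident records. This is an illustrative assumption only: the actual KEI and KCI definitions come from TM Forum's AN level assessment deliverables, and the incident-record fields used here (`detected_at`, `resolved_at`, `manual_intervention`) are hypothetical names chosen for demonstration.

```python
from datetime import timedelta

# Illustrative sketch, not the TM Forum KEI/KCI specification.
# Incident records are assumed to be dicts with 'detected_at' and
# 'resolved_at' datetimes plus a 'manual_intervention' flag.

def mean_time_to_repair(incidents: list[dict]) -> timedelta:
    """Average detect-to-resolve duration across incident records (MTTR)."""
    durations = [i["resolved_at"] - i["detected_at"] for i in incidents]
    return sum(durations, timedelta()) / len(durations)

def manual_intervention_rate(incidents: list[dict]) -> float:
    """Fraction of incidents that required a human in the loop."""
    manual = sum(1 for i in incidents if i.get("manual_intervention"))
    return manual / len(incidents)
```

Tracking such figures before and after introducing autonomous operations is one way to express autonomy's business impact in the objective terms the sessions called for.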
Discussions also highlighted the role of digital twins in predictive and optimization efforts within autonomous networks. Participants acknowledged, however, that no single "universal twin" exists; instead, multiple domain-specific twins are necessary and must be managed carefully to address cost, accuracy, and security concerns.
Emphasizing Open APIs and Composable Architectures
Throughout the event, teams worked to balance a rapidly expanding Open API portfolio with practical deployment strategies. Achievements included refining Gen5 Open API rules and formalizing API specialization, as well as developing Open APIs to support wholesale broadband and Open Gateway business models.
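TM Forum Open APIs are REST/JSON interfaces, and their resources share a common envelope of identifying fields. The sketch below illustrates that pattern by building and checking a minimal resource representation. The specific field values, base URL, and the `ProductOffering` example are assumptions for illustration, not a reproduction of any particular Gen5 specification.

```python
# Hypothetical sketch of the common TMF-style resource envelope
# (id / href / "@type" discriminator). Field values are illustrative.

def make_product_offering(offering_id: str, name: str, base_url: str) -> dict:
    """Build a minimal TMF-style ProductOffering representation."""
    return {
        "id": offering_id,
        "href": f"{base_url}/productOffering/{offering_id}",
        "@type": "ProductOffering",
        "name": name,
        "lifecycleStatus": "Active",
    }

def validate_resource(resource: dict) -> bool:
    """Check the envelope fields a TMF-style resource is expected to carry."""
    return all(key in resource for key in ("id", "href", "@type"))
```

Keeping this envelope consistent across the portfolio is part of what makes API specialization tractable: a specialized resource can extend the base shape without breaking generic clients that only rely on the common fields.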
In conjunction with these efforts, the commitment to modular, composable architectures was reinforced through ongoing work on ODA Components, Canvas, and Conformance. Participants noted that the transformation to AI-native operations is not only a technical challenge but also an organizational one. Skills and culture remain significant hurdles, prompting sessions focused on validating new AI components against TM Forum’s Digital Talent Maturity Model (DTMM) to identify gaps and overlaps.
The Autonomous Networks Upskilling Hub addressed the industry-wide challenge of rapidly scaling skills to realize the value of autonomous networks. Proposals included establishing a shared, vendor-agnostic foundation complemented by differentiated offerings to support professional development.
Looking Ahead
As the TM Forum community progresses toward a future where AI is seamlessly integrated into telecommunications operations, the focus remains on embedding high-impact industry use cases directly into the Project Foundation pipeline. This approach ensures that TM Forum standards are shaped by practical implementation architecture and execution, enabling members to transition from AI vision to operational reality.
The continued collaboration within the TM Forum, particularly in the realms of AI and autonomous networks, promises to drive efficiency and innovation across the telecommunications landscape, paving the way for enhanced service delivery and customer experiences. Stakeholders are encouraged to share their insights and experiences as this transformative journey unfolds.