2 Sources
[1]
ServiceNow says it's 'AI-enabling' its entire product suite to turbocharge enterprise automation - SiliconANGLE
ServiceNow Inc. today announced a sweeping overhaul of its entire product lineup, saying that every single one of its services, platforms and products has been "AI-enabled" to enhance agentic automation in the enterprise.

The company said it's trying to push enterprises beyond the experimental stage of artificial intelligence add-ons. It wants to transition them toward an "AI-native architecture," where every system they use has agentic capabilities, data connectivity and governance built in. To do this, ServiceNow said, its AI Control Tower and Workflow Data Fabric offerings have now become core components of its entire product lineup. It's trying to create a unified experience, accessible through its EmployeeWorks interface, that allows every worker to start automating work with autonomous AI agents.

According to ServiceNow, most enterprises' AI efforts have been held back by extreme fragmentation. When companies grow to a certain size, they inevitably end up dealing with a chaotic sprawl, using hundreds of disconnected applications, each one with its own data silos and security protocols. It's a nightmare for anyone trying to integrate AI, and the result is that most automation efforts fail. Though most organizations have adopted AI tools that can chat and summarize, getting them to do actual work is a whole different ball game, because the AI agents meant to do it lack the deep business context required to automate work accurately.

The solution is a new offering called the Context Engine, which will be made available in preview to select customers soon. It's designed to link together all of the fragmented tools and applications spread across organizations, so it can inform AI agents of what's happening across the entire business. It uses the company's Service Graph and Knowledge Graph tools to understand identity relationships, asset dependencies and policy controls across each app. What this means is that when an AI agent needs to make a decision, it's not just trying to predict the next word in a sentence. Rather, it will check whether assets are tied to regulated processes or whether a specific approval chain applies before taking any action.

At the same time, ServiceNow is opening up its ecosystem to developers with the ServiceNow SDK and a new Build Agent Skills offering that will launch later this month. Starting next week, developers will be able to use third-party tools such as Claude Code, OpenAI Codex and Cursor to build applications and deploy them directly on ServiceNow, so that these will also inherit its core security and governance.

The company also made it clear that it's not only targeting the largest organizations. With the launch of its new Enterprise Service Management Foundation, it's looking to cater to small and medium-sized firms, helping them to launch AI agents that can automate their information technology, human resources and legal services.

President and Chief Product Officer Amit Zavery said organizations typically spend months trying to assemble the pieces needed to get enterprise AI up and running, and very often they fail in those endeavors. "ServiceNow brings it all together, so customers start with a complete AI-native experience across all products and packages, not a procurement project," he explained. "From Context Engine's enterprise intelligence to data connectivity, governance and execution, everything is included by default."
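The policy gate described above - an agent checking whether an asset is tied to a regulated process or an approval chain before acting - can be sketched conceptually. Everything here (the `AssetContext` type, the `agent_may_act` helper, the field names) is a hypothetical illustration of the idea, not ServiceNow's actual API.

```python
from dataclasses import dataclass, field

@dataclass
class AssetContext:
    """Hypothetical slice of graph context attached to one asset."""
    regulated: bool = False                                   # tied to a regulated process?
    approval_chain: list[str] = field(default_factory=list)   # required human approvers

def agent_may_act(ctx: AssetContext) -> bool:
    """An agent acts autonomously only when the asset is unregulated
    and no approval chain applies; otherwise it must defer to humans."""
    return not ctx.regulated and not ctx.approval_chain

payroll_db = AssetContext(regulated=True, approval_chain=["finance-lead"])
dev_laptop = AssetContext()

print(agent_may_act(payroll_db))  # False: regulated, approver required
print(agent_may_act(dev_laptop))  # True: no policy constraints found
```

The point of the pattern is that the check happens before any action is taken, so the model's output alone never triggers a change to a governed asset.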
[2]
ServiceNow ends the AI add-on era and defines its new platform approach
For most of its existence, ServiceNow was built around the CMDB: the Configuration Management Database, a single system of record that maps every IT asset, service, and relationship across an organization's infrastructure. Workflow products - ITSM, HR Service Delivery, Customer Service Management, Security Operations - were built on top of that foundation, each serving a different department, each sold as a distinct product. The CMDB gave ServiceNow its cross-functional coherence, which has been key to much of its success. The 'one platform, one data model' architecture it has spoken about for years has allowed it to go across the enterprise in a way that other SaaS vendors have struggled to match.

Today, that structure is shifting somewhat. ServiceNow announced that its entire product portfolio is now AI-enabled, with every offering including AI, data connectivity, workflow execution, security, and governance as standard. Not as an add-on. Not as a separate tier. Built in (more on this from a competitive/pricing standpoint, shortly).

The new platform architecture ServiceNow is describing looks like this: EmployeeWorks as the conversational front door, Workflow Data Fabric as the connected data layer, AI Control Tower for visibility and governance, and autonomous workflows that move from assisting people to acting on their behalf. Sitting underneath all of it, and now explicitly named as such, is Context Engine - a new enterprise context solution that connects the relationships, policies, and decision history behind every AI agent action. The way to think about this, I think, is that the CMDB was about knowing what you have, whereas the Context Engine is about knowing how your business actually works - and giving AI agents the real-time intelligence to act on that knowledge.

Amit Zavery, president, chief product officer and chief operating officer, said:

"ServiceNow is redefining how companies realize value from AI, with the capabilities required for enterprise scale. From Context Engine's enterprise intelligence to data connectivity, governance, and execution, everything is included by default, all operating inside the flow of work."

The "not a separate purchase" framing in today's announcement, I'd argue, is a bit pointed. It's likely a direct response to something that's been frustrating enterprise buyers for the past two years. The standard vendor playbook has been to build AI capabilities and then charge extra for them - new tiers, consumption credits, add-on licences, each requiring its own procurement conversation. In some cases the bill arrives before anyone has figured out whether the capability actually works. In the diginomica network's SaaS vs AI micro-pulse survey in February, one CIO described their Google Workspace spend jumping because Gemini - which they don't actually use - had been bundled in, forcing cuts elsewhere to compensate. It's just one example, but it likely reflects a real budget consequence playing out right now in many enterprise environments.

ServiceNow is saying that rather than layering AI on top as a billable extra, it's baking it into the base. The new tiered model spans AI assistance, agentic automation, and fully autonomous operations across the entire portfolio. Every customer starts with the full package. As such, the question of whether AI is included stops being a procurement decision. For mid-size organizations in particular, the new Enterprise Service Management Foundation is notable, bringing IT, HR, legal, finance, procurement and workplace services onto the platform in weeks rather than months.

Robinhood's Jay Hammonds, head of Technology Operations, said:

"ServiceNow AI deflects 70 percent of our employee requests before human intervention is needed - across IT, HR, and Legal. We reduced manual effort by 2,200 hours across 1,300 tickets monthly with AI embedded directly into our workflows. And with ServiceNow's new AI-driven offerings, we can bring new teams and acquired entities live in weeks, not months."

Time to value is the thing CIOs in the diginomica network keep returning to as the deciding factor in AI investment. Our January 2026 AI Projects pulse survey found that nearly a quarter of organizations have no AI projects running in production at all. A vendor that removes the barriers between buying and getting value is something buyers will respond positively to, because it's a real issue on the ground.

The CMDB gave ServiceNow two decades of structured knowledge about enterprise IT environments. Context Engine is the attempt to turn that into something AI agents can actually use in real time. With 85 billion workflows and seven trillion transactions running on the platform annually, ServiceNow claims it can ground LLM decisions in the specific strategy, approval chains, asset dependencies, and vendor history that define how your organization works - not language in general, but your business specifically. Built on the Service Graph, Knowledge Graph, and data inventory, Context Engine draws from identity relationships, business intelligence, and data lineage in real time, with the stated intention that it compounds intelligence with every human and agent decision made.

This is a direct answer to something our CIO network has consistently identified as a primary barrier to AI success. Our November 2025 research with 35 CIOs and CTOs found that poor data quality and disconnected systems - not technical capability - are the primary blockers to AI delivering returns. Our latest data report found that 94 percent of organizations still have siloed data despite years of integration investment, with 30 to 70 percent of professional time consumed by manual reconciliation. Context Engine is ServiceNow's argument that it has the enterprise context to prevent exactly that. The caveat is that Context Engine is currently available for preview with select customers only, with full availability to be confirmed later.

The Build Agent Skills announcement extends the platform logic outward. From April 15, developers can build using Claude Code, Cursor, OpenAI Codex, Windsurf and others and deploy directly to ServiceNow without leaving their preferred environment. This reinforces the hub-and-spokes architecture Paul Fipps described when I spoke with him last month. ServiceNow is the governed, deterministic execution hub. The models - Claude, Gemini, GPT, whatever comes next - are interchangeable spokes. Fipps said:

"We fully want to take advantage of Large Language Models - these are amazing innovations - but we do it in a way where we abstract the intelligence layer from ServiceNow. We use LLMs for decisioning and reasoning. That part is probabilistic. But the action part in ServiceNow - the workflow part - is deterministic. No guessing."

Opening the developer tooling to every major AI environment deepens that position without requiring developers to change how they work. It's a pragmatic move. ServiceNow stays in the middle, whilst the tools around it become a matter of preference.

However, it's worth being cautious when it comes to pricing, because "not a separate purchase" doesn't end the pricing conversation - it just shifts it in the right direction. What ServiceNow has done is remove procurement friction at the entry point. AI governance, data connectivity and workflow execution are now in the base package. But heavier agentic usage - autonomous workflows running at scale, AI specialists handling thousands of cases - will likely still draw on consumption. The model remains seat-based plus consumption. The industry as a whole hasn't decided whether outcome-based pricing actually works, but it's something I think ServiceNow is considering.

When I pressed Fipps on this directly, he was candid about how unsettled it remains:

"On the SaaS versus AI question - honestly, I can't tell the difference anymore. The same players you'd describe as AI companies now have seat-based models. OpenAI and Anthropic both have seat-based models. So what are they?"

His point was that customers need predictability, and that consumption models create the kind of unpredictable bills that procurement teams resist. ServiceNow's current answer is seat-based plus consumption - predictability upfront, additional cost linked to additional value. But Fipps was explicit that this is provisional:

"We'll probably have to think about more innovative pricing models as our customers pull us in that direction."

I'd argue that today's packaging simplification, welcome as it is, is likely an intermediate position. CIOs want transparent pricing models that help control ongoing IT costs. That's a higher bar than "AI included in base package." Outcome-based pricing - paying for the work AI actually completes rather than for access to the capability - has been the theoretical destination for years, but making it workable is still a pipe dream in many respects.

What I keep coming back to is the architectural significance of today's announcement relative to where ServiceNow has come from. The shift from a CMDB-centric platform with workflow products wrapped around it, to a platform defined by a conversational front door, connected data, governance, and autonomous execution, isn't just a product refresh. ServiceNow is laying out a platform redesign built around how AI agents work rather than how human users work. Context Engine, if it delivers on its preview promise, is the piece that makes the whole thing possible - enterprise context as the foundation rather than configuration data.

The "not a separate purchase" positioning is smart commercial framing for the moment we're in. CIOs are consolidating vendors, scrutinizing AI spend, and demanding shorter runways to demonstrable value. Removing procurement friction while compressing deployment time addresses at least two of those pressures directly. The open question, as Fipps himself acknowledged, is pricing at scale. Knowledge 2026 in Vegas in a couple of months will be the moment to press on whether the packaging holds when customers start running serious agentic workloads. We'll be on the ground to find out.
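The hub-and-spokes split Fipps describes - probabilistic reasoning in the model, deterministic execution in the platform - can be sketched in a few lines. This is a conceptual illustration only: the toy `llm_decide` classifier stands in for a real LLM call, and the `WORKFLOWS` registry stands in for governed platform workflows; none of these names come from ServiceNow.

```python
# Stand-in for the probabilistic reasoning layer. A real system would call an
# LLM (Claude, GPT, Gemini); here a toy keyword classifier plays that role.
def llm_decide(ticket: str) -> str:
    text = ticket.lower()
    if "password" in text:
        return "reset_password"
    if "laptop" in text:
        return "order_hardware"
    return "escalate"

# Deterministic execution layer: each decision maps to exactly one workflow.
# The model chooses among these; it never improvises a new action.
WORKFLOWS = {
    "reset_password": lambda: "password reset link sent",
    "order_hardware": lambda: "hardware request raised",
    "escalate":       lambda: "routed to a human agent",
}

def handle(ticket: str) -> str:
    decision = llm_decide(ticket)   # probabilistic in a real system
    return WORKFLOWS[decision]()    # deterministic: no guessing at this step

print(handle("I forgot my password"))  # password reset link sent
```

The design choice is that the model's output is constrained to a fixed vocabulary of decisions, so the "action part" stays auditable and repeatable even when the reasoning layer is swapped for a different model.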
ServiceNow announced a comprehensive overhaul of its product lineup, embedding AI capabilities into every service, platform, and product by default. The company is introducing Context Engine to connect fragmented enterprise tools and enable AI agents to automate work with full business context. This shift moves beyond AI add-ons toward an AI-native architecture that includes governance and data connectivity as standard features.
ServiceNow has announced a sweeping transformation of its product lineup, embedding AI capabilities into every single service, platform, and product it offers [1]. The move signals a decisive shift away from the AI add-on era that has characterized enterprise software for the past two years. Instead of charging extra for AI features through separate tiers or consumption credits, ServiceNow is building these capabilities directly into its base offerings, making AI Control Tower and Workflow Data Fabric core components across its entire portfolio [1].

The company aims to push enterprises beyond experimental AI implementations toward an AI-native architecture where agentic capabilities, data connectivity, and governance are built into every system by default [1]. This unified approach becomes accessible through the EmployeeWorks interface, allowing workers to start automating tasks with autonomous AI agents without navigating procurement hurdles or integration challenges.

At the heart of ServiceNow's strategy sits Context Engine, a new enterprise context solution launching in preview to select customers soon [1]. According to ServiceNow, most enterprise automation efforts fail because organizations struggle with extreme fragmentation: hundreds of disconnected applications, each with its own data silos and security protocols [1]. While many companies have adopted AI tools that can chat and summarize, getting AI agents to perform actual work requires deep business context that most systems lack.

Source: SiliconANGLE

Context Engine addresses this by linking fragmented tools and applications across organizations, leveraging Service Graph and Knowledge Graph to understand identity relationships, asset dependencies, and policy controls [1]. When AI agents need to make decisions, they can verify whether assets are tied to regulated processes or whether specific approval chains exist before taking action. This represents a fundamental shift from ServiceNow's traditional CMDB foundation, which mapped IT assets and relationships. The CMDB was about knowing what you have, whereas Context Engine focuses on understanding how your business actually works [2].

With 85 billion workflows and seven trillion transactions running on the platform annually, ServiceNow claims it can ground AI decisions in specific organizational strategy, approval chains, and vendor history [2]. This means AI agents operate with knowledge of your business specifically, not just language patterns in general.

ServiceNow is expanding access for developers through the ServiceNow SDK and a new Build Agent Skills offering launching later this month [1]. Starting next week, developers can use third-party tools including Claude Code, OpenAI Codex, and Cursor to build applications and deploy them directly on ServiceNow [1]. These applications will inherit the platform's core security and governance by default, addressing a critical concern for enterprise AI investments.

Amit Zavery, president and chief product officer, emphasized that organizations typically spend months assembling the pieces needed for enterprise AI, and very often fail. "ServiceNow brings it all together, so customers start with a complete AI-native experience across all products and packages, not a procurement project," he explained [1]. "From Context Engine's enterprise intelligence to data connectivity, governance and execution, everything is included by default."

ServiceNow is extending its reach beyond large enterprises with the launch of Enterprise Service Management Foundation, designed to help small and medium-sized firms automate IT, HR, legal, finance, procurement, and workplace services [1][2]. The offering promises to bring teams and acquired entities live in weeks rather than months, addressing a critical pain point around time to value.

Robinhood's Jay Hammonds, head of Technology Operations, reported that "ServiceNow AI deflects 70 percent of our employee requests before human intervention is needed - across IT, HR, and Legal. We reduced manual effort by 2,200 hours across 1,300 tickets monthly with AI embedded directly into our workflows" [2]. These concrete results demonstrate how autonomous workflows can move from assisting people to acting on their behalf.

Source: diginomica

The new platform architecture positions EmployeeWorks as the conversational front door, with Workflow Data Fabric serving as the connected data layer and AI Control Tower providing visibility and governance [2]. This structure represents a pointed response to the standard vendor playbook of charging extra for AI capabilities through new tiers and add-on licenses. By making AI capabilities standard rather than optional purchases, ServiceNow removes procurement barriers that have slowed enterprise AI adoption. For organizations where nearly a quarter have no AI projects running in production at all, a vendor that streamlines the path from buying to getting value addresses a real issue affecting technology leaders today.

Summarized by Navi