3 Sources
[1]
How ServiceNow gets customers to gorge at the AI trough
'AI is now infused in every package that we offer to our addressable market,' SVP John Aisien told us

ServiceNow's latest product announcements show how hardcore the company has become about embedding AI across its go-to-market strategy. ServiceNow senior vice president John Aisien, who oversees forward-deployed engineering and strategic partnerships with companies like Anthropic and Microsoft, outlined the major moves in a conversation with The Register. "AI is now infused in every package that we offer to our addressable market," he said.

For starters, ServiceNow has reorganized its pricing around three levels of AI capability, so customers can choose the level of pricing that corresponds to their AI maturity. Assistive AI summarizes data or generates content. Task Automation handles discrete jobs from start to finish. With Full Role Automation, AI workflows operate autonomously with minimal human oversight. "Whether the organization is ready for assistive AI, AI automation, or full-blown role automation using autonomous AI, we've introduced a new packaging model that gives them access to the entirety of the ServiceNow price list organized into those three categories," Aisien said.

Aisien described the second announcement, Build Agent SDK, as "a real, real game changer." The tool is designed to let any developer create or modify ServiceNow applications from whatever coding environment they already use, whether that's GitHub Copilot, Cursor, Codex, or various natural-language "vibe coding" tools. "We want the maximum number of developers in the world - regardless of the development form factor and the development experience that they got - we want those developers to have the ability to build ServiceNow workloads super quickly, and run those workloads," he said. "Build Agent SDK is the technical manifestation of making that promise real."

Aisien described a life sciences company in Pennsylvania with about 8,000 developers, 650 of whom build on ServiceNow Studio.
That leaves more than 7,000 who don't - not because they lack interest, but because developers tend to be loyal to their preferred tools. The Build Agent SDK is meant to meet all of those developers where they already work.

ServiceNow's Context Engine, another new offering, is an enterprise spin on the kind of personalization users see in consumer apps like Netflix and Amazon - systems that learn from every click, hover, and pause to serve tailored experiences. Enterprise software, Aisien said, has lagged behind in adopting that approach. "That historical context and continuous learning, that's been the norm in consumer apps," he told us. "Enterprise software really hasn't learned that well-worn pattern. AI is now forcing this construct to become real. Why? Because AI is software and ultimately needs data with which to make autonomous decisions."

He described the Context Engine as essential infrastructure for making autonomous AI work at enterprise scale, calling it "one of the last-mile technologies needed for enterprise AI to get real." "The context engine in the back-end says, 'I'm going to determine the intent of what this human being wants to do with the digital asset, depending on the questions they ask, depending on the interaction paradigm, and then depending on their identities and permissions in the back end,'" Aisien said.

ServiceNow also introduced the Enterprise Service Management Suite, a new bundled package targeting companies with roughly 1,000 to 5,000 employees. It combines IT, HR, supply chain, and finance service workflows into a single offering and pairs that with an AI-powered implementation agent that dramatically simplifies deployment. Where traditional ServiceNow implementations could take six months or more, Aisien said the new approach can compress that to about 30 days. He said most of that elapsed time comes from organizational logistics, not configuration. "Actual time in the UI is maybe a day or two," he said.
The implementation agent provides a conversational interface that guides administrators through setup without requiring deep ServiceNow proficiency. Aisien tried not to oversell it - "You don't have to be my mom, who's 80 years old and only uses an Android device" - but he said anyone comfortable with a chat interface should be able to handle it. ®
[2]
ServiceNow says it's 'AI-enabling' its entire product suite to turbocharge enterprise automation
ServiceNow Inc. today announced a sweeping overhaul of its entire product lineup, saying that every single one of its services, platforms and products has been "AI-enabled" to enhance agentic automation in the enterprise. The company said it's trying to push enterprises beyond the experimental stage of artificial intelligence add-ons. It wants to transition them toward an "AI-native architecture," where every system they use has agentic capabilities, data connectivity and governance built in.

To do this, ServiceNow said, its AI Control Tower and Workflow Data Fabric offerings have now become core components of its entire product lineup. It's trying to create a unified experience, accessible through its EmployeeWorks interface, that allows every worker to start automating work with autonomous AI agents.

According to ServiceNow, most enterprises' AI efforts have been held back by extreme fragmentation. When companies grow to a certain size, they inevitably end up dealing with a chaotic sprawl, using hundreds of disconnected applications - each one with its own data silos and security protocols. It's a nightmare for anyone trying to integrate AI, and the result is that most automation efforts fail. Though most organizations have adopted AI tools that can chat and summarize, getting them to do actual work is a whole different ball game, because the AI agents meant to do it lack the deep business context that's required to automate work accurately.

The solution to this is a new offering called the Context Engine, which will soon be made available in preview to select customers. It's designed to link together all of the fragmented tools and applications spread across organizations, so it can inform AI agents of what's happening across the entire business.
It uses the company's Service Graph and Knowledge Graph tools to understand identity relationships, asset dependencies and policy controls across each app. What this means is that when an AI agent needs to make a decision, it's not just trying to predict the next word in a sentence. Rather, it will check to see if assets are tied to regulated processes or if there's a specific approval chain before taking any action.

At the same time, ServiceNow is opening up its ecosystem to developers with the ServiceNow SDK and a new Build Agent Skills offering that will launch later this month. Starting next week, developers will be able to use third-party tools such as Claude Code, OpenAI Codex and Cursor to build applications and deploy them directly on ServiceNow, so that these will also inherit its core security and governance.

The company also made it clear that it's not only targeting the largest organizations. With the launch of its new Enterprise Service Management Foundation, it's looking to cater to small and medium-sized firms, helping them launch AI agents that can automate their information technology, human resources and legal services.

President and Chief Product Officer Amit Zavery said organizations typically spend months trying to assemble the pieces needed to get enterprise AI up and running, and very often they fail in those endeavors. "ServiceNow brings it all together, so customers start with a complete AI-native experience across all products and packages, not a procurement project," he explained. "From Context Engine's enterprise intelligence to data connectivity, governance and execution, everything is included by default."
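The pre-action check described above - consulting asset and policy context before an agent acts, rather than letting a model guess - can be illustrated with a minimal sketch. All class, field, and function names here are invented for illustration; this is not ServiceNow's actual Context Engine API, just a toy version of the pattern under stated assumptions.

```python
from dataclasses import dataclass, field

@dataclass
class Asset:
    """Toy stand-in for a node in an asset/service graph."""
    name: str
    regulated: bool = False                              # tied to a regulated process?
    approval_chain: list = field(default_factory=list)   # roles that must sign off

@dataclass
class ContextGraph:
    """Stand-in for a Service Graph / Knowledge Graph lookup layer."""
    assets: dict

    def check(self, asset_name: str, actor_roles: set):
        # Consult policy context: regulated assets require the full
        # approval chain to be represented among the actor's roles.
        asset = self.assets[asset_name]
        if asset.regulated and not set(asset.approval_chain) <= actor_roles:
            missing = sorted(set(asset.approval_chain) - actor_roles)
            return False, "needs approval from: " + ", ".join(missing)
        return True, "allowed"

def agent_act(graph: ContextGraph, asset_name: str, actor_roles: set, action: str) -> str:
    # The agent checks business context *before* taking any action.
    allowed, reason = graph.check(asset_name, actor_roles)
    if not allowed:
        return f"blocked '{action}' on {asset_name}: {reason}"
    return f"executed '{action}' on {asset_name}"

graph = ContextGraph(assets={
    "payroll-db": Asset("payroll-db", regulated=True,
                        approval_chain=["finance-lead", "security"]),
    "wiki": Asset("wiki"),
})

print(agent_act(graph, "wiki", {"employee"}, "update page"))
print(agent_act(graph, "payroll-db", {"employee"}, "restart service"))
```

The point of the sketch is only the ordering: the graph lookup gates the action, so an agent operating on a regulated asset without the required approval chain is refused before anything runs.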
[3]
ServiceNow ends the AI add-on era and defines its new platform approach
For most of its existence, ServiceNow was built around the CMDB: the Configuration Management Database, a single system of record that maps every IT asset, service, and relationship across an organization's infrastructure. Workflow products - ITSM, HR Service Delivery, Customer Service Management, Security Operations - were built on top of that foundation, each serving a different department, each sold as a distinct product. The CMDB gave ServiceNow its cross-functional coherence, which has been key to much of its success. The 'one platform, one data model' architecture it has spoken about for years has allowed it to span the enterprise in a way that other SaaS vendors have struggled to match.

Today, that structure is shifting somewhat. ServiceNow announced that its entire product portfolio is now AI-enabled, with every offering including AI, data connectivity, workflow execution, security, and governance as standard. Not as an add-on. Not as a separate tier. Built in (more on this from a competitive/pricing standpoint, shortly).

The new platform architecture ServiceNow is describing looks like this: EmployeeWorks as the conversational front door, Workflow Data Fabric as the connected data layer, AI Control Tower for visibility and governance, and autonomous workflows that move from assisting people to acting on their behalf. Sitting underneath all of it, and now explicitly named as such, is Context Engine - a new enterprise context solution that connects the relationships, policies, and decision history behind every AI agent action.

The way to think about this, I think, is that the CMDB was about knowing what you have, whereas the Context Engine is about knowing how your business actually works - and giving AI agents the real-time intelligence to act on that knowledge.

Amit Zavery, president, chief product officer and chief operating officer, said:

ServiceNow is redefining how companies realize value from AI, with the capabilities required for enterprise scale.
From Context Engine's enterprise intelligence to data connectivity, governance, and execution, everything is included by default, all operating inside the flow of work.

The "not a separate purchase" framing in today's announcement, I'd argue, is a bit pointed. It's likely a direct response to something that's been frustrating enterprise buyers for the past two years. The standard vendor playbook has been to build AI capabilities and then charge extra for them - new tiers, consumption credits, add-on licences, each requiring their own procurement conversation. In some cases the bill arrives before anyone has figured out whether the capability actually works. In the diginomica network's SaaS vs AI micro-pulse survey in February, one CIO described their Google Workspace spend jumping because Gemini - which they don't actually use - had been bundled in, forcing cuts elsewhere to compensate. It's just one example, but it likely reflects a real budget consequence playing out right now in many enterprise environments.

ServiceNow is saying that rather than layering AI on top as a billable extra, it's baking it into the base. The new tiered model spans AI assistance, agentic automation, and fully autonomous operations across the entire portfolio. Every customer starts with the full package. As such, the question of whether AI is included stops being a procurement decision.

For mid-size organizations in particular, the new Enterprise Service Management Foundation is notable, bringing IT, HR, legal, finance, procurement and workplace services onto the platform in weeks rather than months. Robinhood's Jay Hammonds, head of Technology Operations, said:

ServiceNow AI deflects 70 percent of our employee requests before human intervention is needed - across IT, HR, and Legal. We reduced manual effort by 2,200 hours across 1,300 tickets monthly with AI embedded directly into our workflows.
And with ServiceNow's new AI-driven offerings, we can bring new teams and acquired entities live in weeks, not months.

Time to value is the thing CIOs in the diginomica network keep returning to as the deciding factor in AI investment. Our January 2026 AI Projects pulse survey found that nearly a quarter of organizations have no AI projects running in production at all. Buyers will respond positively to a vendor that removes the barriers between buying and getting value, because it's a real issue on the ground.

The CMDB gave ServiceNow two decades of structured knowledge about enterprise IT environments. Context Engine is the attempt to turn that into something AI agents can actually use in real time. With 85 billion workflows and seven trillion transactions running on the platform annually, ServiceNow claims it can ground LLM decisions in the specific strategy, approval chains, asset dependencies, and vendor history that define how your organization works - not language in general, but your business specifically. Built on the Service Graph, Knowledge Graph, and data inventory, Context Engine draws from identity relationships, business intelligence, and data lineage in real time, with the stated intention that it compounds intelligence with every human and agent decision made.

This is a direct answer to something our CIO network has consistently identified as a primary barrier to AI success. Our November 2025 research with 35 CIOs and CTOs found that poor data quality and disconnected systems - not technical capability - are the primary blockers to AI delivering returns. Our latest data report found that 94 percent of organizations still have siloed data despite years of integration investment, with 30 to 70 percent of professional time consumed by manual reconciliation. Context Engine is ServiceNow's argument that it has the enterprise context to prevent exactly that.
The caveat is that Context Engine is currently available for preview with select customers only, with full availability to be confirmed later.

The Build Agent Skills announcement extends the platform logic outward. From April 15, developers can build using Claude Code, Cursor, OpenAI Codex, Windsurf and others and deploy directly to ServiceNow without leaving their preferred environment. This reinforces the hub-and-spokes architecture Paul Fipps described when I spoke with him last month. ServiceNow is the governed, deterministic execution hub. The models - Claude, Gemini, GPT, whatever comes next - are interchangeable spokes. Fipps said:

We fully want to take advantage of Large Language Models - these are amazing innovations - but we do it in a way where we abstract the intelligence layer from ServiceNow. We use LLMs for decisioning and reasoning. That part is probabilistic. But the action part in ServiceNow - the workflow part - is deterministic. No guessing.

Opening the developer tooling to every major AI environment deepens that position without requiring developers to change how they work. It's a pragmatic move. ServiceNow stays in the middle, whilst the tools around it become a matter of preference.

However, it's worth being cautious when it comes to pricing, because "not a separate purchase" doesn't end the pricing conversation - it just shifts it in the right direction. What ServiceNow has done is remove procurement friction at the entry point. AI governance, data connectivity and workflow execution are now in the base package. But heavier agentic usage - autonomous workflows running at scale, AI specialists handling thousands of cases - will likely still draw on consumption. The model remains seat-based plus consumption. The industry as a whole hasn't decided whether outcome-based pricing actually works, but it's something I think ServiceNow is considering.
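The probabilistic/deterministic split Fipps describes can be sketched in a few lines. Everything here is invented for illustration - the workflow names, the keyword-matching "LLM" stand-in - but the shape matches his description: a model proposes an action, and only actions from a fixed, governed catalogue ever execute.

```python
# Deterministic side: a fixed catalogue of workflows the platform will run.
# Anything outside this catalogue is refused - "no guessing" at execution time.
WORKFLOWS = {
    "reset_password": lambda user: f"password reset issued for {user}",
    "grant_vpn":      lambda user: f"VPN access granted to {user}",
}

def llm_decide(ticket: str) -> str:
    """Probabilistic layer: stand-in for an LLM choosing an action.
    A real system would call a model; keyword matching keeps this runnable."""
    if "password" in ticket.lower():
        return "reset_password"
    if "vpn" in ticket.lower():
        return "grant_vpn"
    return "unknown_action"

def execute(action: str, user: str) -> str:
    """Deterministic layer: validates the proposed action against the
    catalogue before running it, so a bad model guess cannot execute."""
    if action not in WORKFLOWS:
        return f"rejected: '{action}' is not a defined workflow"
    return WORKFLOWS[action](user)

print(execute(llm_decide("I forgot my password"), "dana"))
print(execute(llm_decide("please summon a unicorn"), "dana"))
```

The design point is that the model's output is treated as a proposal, not a command: swapping Claude for Gemini or GPT changes only `llm_decide`, while the governed execution path stays identical.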
When I pressed Fipps on this directly, he was candid about how unsettled it remains:

On the SaaS versus AI question - honestly, I can't tell the difference anymore. The same players you'd describe as AI companies now have seat-based models. OpenAI and Anthropic both have seat-based models. So what are they?

His point was that customers need predictability, and that consumption models create the kind of unpredictable bills that procurement teams resist. ServiceNow's current answer is seat-based plus consumption - predictability upfront, additional cost linked to additional value. But Fipps was explicit that this is provisional:

We'll probably have to think about more innovative pricing models as our customers pull us in that direction.

I'd argue that today's packaging simplification, welcome as it is, is likely an intermediate position. CIOs want transparent pricing models that help control ongoing IT costs. That's a higher bar than "AI included in base package." Outcome-based pricing - paying for the work AI actually completes rather than for access to the capability - has been the theoretical destination for years, but making it workable is still a pipe dream in many respects.

What I keep coming back to is the architectural significance of today's announcement relative to where ServiceNow has come from. The shift from a CMDB-centric platform with workflow products wrapped around it, to a platform defined by a conversational front door, connected data, governance, and autonomous execution isn't just a product refresh. ServiceNow is laying out a platform redesign built around how AI agents work rather than how human users work. Context Engine, if it delivers on its preview promise, is the piece that makes the whole thing possible - enterprise context as the foundation rather than configuration data.

The "not a separate purchase" positioning is smart commercial framing for the moment we're in.
CIOs are consolidating vendors, scrutinizing AI spend, and demanding shorter runways to demonstrable value. Removing procurement friction while compressing deployment time addresses at least two of those pressures directly. The open question, as Fipps himself acknowledged, is pricing at scale. Knowledge 2026 in Vegas in a couple of months will be the moment to press on whether the packaging holds when customers start running serious agentic workloads. We'll be on the ground to find out.
ServiceNow has announced a major shift in its go-to-market strategy, embedding AI capabilities across every product it offers rather than selling them as separate add-ons. The company introduced Context Engine, a new solution that provides AI agents with enterprise-wide business intelligence, and reorganized pricing around three AI maturity levels. This move targets the widespread frustration with vendors charging extra for AI features.
ServiceNow has fundamentally restructured its approach to enterprise automation by embedding AI capabilities into every product, platform, and service it offers. "AI is now infused in every package that we offer to our addressable market," John Aisien, senior vice president overseeing forward-deployed engineering and strategic partnerships, told The Register [1]. This shift marks the end of the AI add-on era, where vendors typically charge extra for AI features through separate tiers or consumption credits.

The company announced that its AI Control Tower and Workflow Data Fabric offerings have become core components of its entire product lineup [2]. Every customer now starts with integrated AI capabilities including data connectivity, workflow execution, security, and AI governance built in by default. This represents a direct response to enterprise buyers' frustration with procurement processes that layer AI costs on top of existing subscriptions.

ServiceNow has reorganized its pricing model around three levels of AI capability that correspond to different stages of organizational AI maturity. Assistive AI summarizes data or generates content. Task Automation handles discrete jobs from start to finish. Full Role Automation enables autonomous workflows to operate with minimal human oversight [1].
Source: diginomica
"Whether the organization is ready for assistive AI, AI automation, or full-blown role automation using autonomous AI, we've introduced a new packaging model that gives them access to the entirety of the ServiceNow price list organized into those three categories," Aisien explained [1]. This tiered approach allows organizations to adopt AI at their own pace while accessing the full platform capabilities.

The newly introduced Context Engine represents what Aisien called "one of the last-mile technologies needed for enterprise AI to get real" [1]. This solution tackles a critical challenge: most enterprises struggle with extreme fragmentation, using hundreds of disconnected applications with separate data silos and security protocols [2].
Source: SiliconANGLE
The Context Engine uses ServiceNow's Service Graph and Knowledge Graph to understand identity relationships, asset dependencies, and policy controls across applications. When AI agents need to make decisions, they can check whether assets are tied to regulated processes or specific approval chains before taking action [2]. Built on a foundation of 85 billion workflows and seven trillion transactions running on the platform annually, it grounds large language model (LLM) decisions in specific organizational context rather than general language patterns [3].

The Build Agent SDK, which Aisien described as "a real, real game changer," enables any developer to create or modify ServiceNow applications from their preferred coding environment, whether that's GitHub Copilot, Cursor, Codex, or various natural-language coding tools [1]. Starting later this month, developers can use third-party tools like Claude Code and OpenAI Codex to build applications that inherit ServiceNow's core security and governance [2].

Aisien cited a life sciences company in Pennsylvania with about 8,000 developers, only 650 of whom build on ServiceNow Studio. The Build Agent SDK targets the remaining 7,000-plus developers who prefer their existing tools, meeting them where they already work [1].
For most of its existence, ServiceNow was built around the CMDB (Configuration Management Database), a single system of record mapping IT assets and relationships. Workflow products for IT Service Management (ITSM), HR Service Delivery, and Customer Service Management were layered on top [3]. That structure is now shifting toward an AI-native architecture where EmployeeWorks serves as the conversational front door, with the Context Engine providing the intelligence layer beneath.

Amit Zavery, president and chief product officer, explained: "ServiceNow brings it all together, so customers start with a complete AI-native experience across all products and packages, not a procurement project. From Context Engine's enterprise intelligence to data connectivity, governance and execution, everything is included by default" [2].

The new Enterprise Service Management Foundation targets companies with roughly 1,000 to 5,000 employees, combining IT, HR, supply chain, and finance service workflows into a single offering [1]. An AI-powered implementation agent compresses traditional six-month deployments to about 30 days, with actual configuration time in the user interface taking just a day or two [1].

Robinhood's Jay Hammonds, head of Technology Operations, reported that "ServiceNow AI deflects 70 percent of our employee requests before human intervention is needed - across IT, HR, and Legal. We reduced manual effort by 2,200 hours across 1,300 tickets monthly with AI embedded directly into our workflows" [3]. This focus on time-to-value addresses a critical concern, as a January 2026 survey found nearly a quarter of organizations have no AI projects running in production [3].

Summarized by Navi