The $236 Billion Agent Economy: Why Private Data Is the Missing Infrastructure Layer
AI agent adoption is accelerating across enterprises. Market data from IBM, PwC, Gartner, and KPMG shows why private data infrastructure is the critical bottleneck — and opportunity.
By ipto.ai Research
The market signal is clear
Enterprise AI agent adoption is no longer speculative. Multiple independent sources paint a consistent picture of rapid, budget-backed adoption.
According to IBM’s 2025 CEO Study, 61% of surveyed CEOs were actively adopting AI agents and preparing to implement them at scale. The same study found 72% said proprietary data is key to unlocking generative AI value.
PwC’s 2025 survey reinforced this: 79% of executives said AI agents were already being adopted in their companies, 88% planned to increase AI-related budgets because of agentic AI, and among adopters, 66% said agents were already delivering measurable productivity value.
These are not forward-looking projections. They describe current enterprise behavior.
The spending is real
The budget commitments behind agent adoption are substantial and durable.
KPMG reported in January 2026 that 67% of business leaders would maintain AI spending even in a recession scenario, with surveyed organizations projecting $124 million each in AI spending over the coming year. A further 59% expected measurable ROI within that same timeframe.
Gartner’s August 2025 analysis projected that agentic AI could drive approximately 30% of enterprise application software revenue by 2035, exceeding $450 billion. The firm predicted 40% of enterprise applications would feature task-specific AI agents by end of 2026, up from less than 5% in 2025.
Even applying conservative discounts to these projections, the conclusion holds: enterprise AI agent spending is growing faster than most adjacent technology categories.
Where the bottleneck sits
If agent adoption is this strong and budgets are this committed, why haven’t agents delivered more value?
The answer is not model capability. Modern foundation models can reason, plan, and execute complex tasks. The bottleneck is data access.
IBM’s study revealed that 50% of executives said rapid AI investment had left them with disconnected technology. Enterprises have agents, but those agents cannot reach the private data they need.
The data exists. It lives in:
- Internal knowledge bases and wikis
- Legal and compliance document repositories
- Financial models and market research
- Procurement systems and vendor databases
- Engineering specifications and operational runbooks
- Customer interaction histories and support records
None of this is accessible to agents through existing infrastructure. Vector databases and RAG pipelines address part of the problem, but they lack pricing, permissions, provenance, and audit — the layers enterprises require before they’ll connect agents to their most sensitive data.
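To make those missing layers concrete, here is a minimal, hypothetical sketch of what a permission-, pricing-, provenance-, and audit-aware retrieval wrapper could look like. Every class, field, and method name below is an illustrative assumption, not a real API.

```python
from dataclasses import dataclass, field
from datetime import datetime, timezone


@dataclass
class AgentIdentity:
    agent_id: str
    scopes: set  # least-privilege: the data sources this agent may read


@dataclass
class GovernedRetriever:
    price_per_query: dict                          # pricing: source -> cost in cents
    audit_log: list = field(default_factory=list)  # audit: every request is recorded

    def retrieve(self, agent: AgentIdentity, source: str, query: str) -> dict:
        # Permissions: deny anything outside the agent's granted scopes.
        if source not in agent.scopes:
            self._audit(agent, source, query, allowed=False)
            raise PermissionError(f"{agent.agent_id} lacks scope '{source}'")
        # Pricing: every retrieval is a billable transaction.
        cost = self.price_per_query.get(source, 0)
        self._audit(agent, source, query, allowed=True, cost=cost)
        # Provenance: the answer travels with its source.
        return {"source": source, "query": query, "cost_cents": cost}

    def _audit(self, agent, source, query, allowed, cost=0):
        # Audit: who asked what, when, whether it was allowed, at what cost.
        self.audit_log.append({
            "ts": datetime.now(timezone.utc).isoformat(),
            "agent": agent.agent_id,
            "source": source,
            "query": query,
            "allowed": allowed,
            "cost_cents": cost,
        })
```

The point of the sketch is the gap it highlights: a vector database answers the `retrieve` call, but nothing in a standard RAG stack supplies the scope check, the per-query price, or the audit record around it.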
The governance requirement
The more autonomous the agent, the more governance matters.
TechRadar’s reporting on Microsoft, KPMG, and CSA-backed findings shows enterprises are increasingly concerned about visibility, governance, least-privilege access, and auditability for agents. This is not a theoretical concern: it reflects real security incidents and regulatory pressure.
KPMG’s AI governance framework for the agentic era specifically calls for traceable inter-agent handoffs, explainability, confidence thresholds, guardrails, and human oversight, alongside strict access controls and privacy protections.
PwC noted that companies now need to orchestrate and integrate multiple agents across applications and workflows rather than use isolated point tools. This orchestration requires a consistent trust and audit layer underneath.
The implication for infrastructure: any system that connects agents to private data must have governance as a core capability, not a bolt-on.
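One of those controls can be sketched in a few lines: a confidence threshold that routes low-confidence agent actions to human review. This is a hedged illustration only; the threshold value and function names are assumptions, not part of KPMG's framework.

```python
# Assumed policy value for illustration; a real deployment would tune this.
CONFIDENCE_THRESHOLD = 0.85


def route_action(action: str, confidence: float, review_queue: list) -> str:
    """Auto-execute high-confidence actions; escalate the rest to a human.

    Hypothetical sketch of a confidence-threshold guardrail with
    human-oversight escalation.
    """
    if confidence >= CONFIDENCE_THRESHOLD:
        return "executed"
    # Below the threshold, the action waits for human review instead of running.
    review_queue.append({"action": action, "confidence": confidence})
    return "escalated"
```

The design choice this illustrates: autonomy is not all-or-nothing. The threshold turns "human oversight" from a policy statement into an enforceable branch in the execution path.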
Where the opportunity sits
The opportunity is not in “generic AI search.” That market will get crowded and margin-compressed.
The opportunity is in becoming the trusted private data infrastructure for agent execution and monetization. Specifically:
For data owners: A way to make proprietary data available to agents with full control over access, pricing, and usage terms. A new revenue stream from existing knowledge assets.
For agent builders: A retrieval API that returns structured, provenance-tracked, permission-aware data at agent-grade latency. Not text chunks — actionable retrieval units with confidence scores and citation terms.
For the platform: Transaction economics on every retrieval. Network effects where more data supply attracts more agent demand, which attracts more data supply.
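What an "actionable retrieval unit" might look like in practice can be sketched as a data structure. Every field name here is a hypothetical assumption about the shape such a unit could take, not a specification:

```python
from dataclasses import dataclass


@dataclass(frozen=True)
class RetrievalUnit:
    """Illustrative shape of a retrieval unit, versus a raw text chunk."""
    content: str       # the structured answer itself
    source_id: str     # provenance: which document or system it came from
    owner: str         # the data owner compensated for this retrieval
    confidence: float  # retrieval confidence score, 0.0 to 1.0
    citation: str      # the citation an agent must surface with the answer
    price_cents: int   # the per-retrieval transaction price


def is_usable(unit: RetrievalUnit, min_confidence: float = 0.8) -> bool:
    """Agents act only on units above a confidence floor."""
    return unit.confidence >= min_confidence
```

The contrast with a plain RAG response is the metadata: a chunk of text carries none of the provenance, pricing, or confidence signals that let an agent decide whether, and at what cost, to act on what it retrieved.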
Deloitte’s 2026 enterprise AI report identified search and knowledge management as one of the most impactful GenAI areas, alongside customer support, supply chain, R&D, and cybersecurity. The report noted that agentic AI is expected to have high impact across knowledge-heavy workflows in regulated industries.
The timing
Three forces are converging to create this opportunity now:
LLM capability matured. Foundation models can reason and plan well enough for real business workflows. The models aren’t the constraint.
Agent frameworks shipped. MCP, tool use protocols, and orchestration frameworks are production-ready. Agents can reliably call external services.
Enterprise budgets moved. As KPMG’s data shows, the spending is committed and recession-resistant. Enterprises are not experimenting — they are deploying.
The window is open because the infrastructure layer between agents and private data does not yet exist at scale. The agents are deployed. The data is locked. The layer that connects them safely, economically, and auditably is the most important unsolved infrastructure problem in enterprise AI.
Key takeaways
- Enterprise AI agent adoption is accelerating: 79% of executives say agents are already deployed (PwC)
- AI budgets are substantial and durable: $124M projected per organization, 67% recession-resistant (KPMG)
- The bottleneck is private data access, not model capability: 72% of CEOs see proprietary data as key (IBM)
- Governance is a requirement, not a feature: enterprises demand auditability, permissions, and provenance
- The opportunity is in private data infrastructure for agents — not generic search
- The market for this infrastructure could participate in $450B+ in agentic AI revenue by 2035 (Gartner)
Frequently Asked Questions
How big is the AI agent market?
The AI agent market is growing rapidly across multiple metrics. Gartner predicts 40% of enterprise applications will feature task-specific AI agents by end of 2026, up from less than 5% in 2025. In a best-case scenario, Gartner says agentic AI could drive approximately 30% of enterprise application software revenue by 2035, exceeding $450 billion. PwC found 88% of executives plan to increase AI budgets specifically because of agentic AI.
What are enterprises spending on AI agents?
According to KPMG's January 2026 AI Pulse Survey, surveyed organizations are projected to deploy $124 million each in AI spending over the coming year, 67% of business leaders said they would maintain AI spending even in a recession scenario, and 59% expected measurable ROI within that same timeframe. Separately, PwC found 88% of executives planned to increase AI-related budgets because of agentic AI.
Why is private data infrastructure needed for AI agents?
AI agents executing enterprise workflows need access to proprietary data — internal documents, operational procedures, domain knowledge — to make reliable decisions. IBM found 72% of CEOs view proprietary data as key to AI value, while 50% said rapid investment left them with disconnected technology. The infrastructure to make private data safely retrievable, priced, and auditable for agents is the critical missing layer.
Which industries will adopt AI agents fastest?
Deloitte's 2026 enterprise AI report identifies financial services, manufacturing, healthcare, and public sector as leading verticals. The common factors are: high-value proprietary data, repeat workflows where agents can create measurable value, regulatory requirements that demand auditability, and established IT budgets. Financial services and legal/compliance are particularly strong early adopters due to the high cost of errors and existing data governance practices.
ipto.ai is building the private data infrastructure layer for the agent economy.