PVML is operationalizing artificial intelligence for enterprise impact, bridging AI innovation with real operational execution.
AI is only as powerful as the data it can reach.
Yet inside most enterprises, the most valuable data is also the hardest to use: fragmented across systems, locked behind governance, and burdened by slow, manual processes that weren’t designed for autonomous, real-time workloads.
PVML exists to change that - by doing for data what VMware did for hardware.
Just as VMware virtualized physical servers into scalable virtual machines and helped define the modern cloud era, PVML virtualizes enterprise databases into AI-ready Virtual Databases - live, scoped, and secure. Instead of copying data into endless pipelines and duplicates, PVML creates a virtual layer on top of existing data sources that makes them instantly consumable by analytics tools and GenAI agents - without moving the underlying data.
The missing piece in the future AI stack isn’t a better model - it’s a better way for models and agents to reach their source of truth, enterprise data: fast, secure, and production-ready.
PVML began with a familiar frustration - one that most enterprise teams face: the organization has the data, but teams can’t use it quickly enough, or safely enough, to build what they need.
Rina Galperin, PVML’s CTO, spent years in advanced AI engineering environments at Microsoft. She repeatedly saw how the gap between “what’s possible” and “what actually ships” had little to do with the models themselves. The real blockers were everything surrounding them: data duplication, brittle pipelines, and the slow, manual work required to make new access patterns safe and compliant.
Dr. Shachar Schnapp, PVML’s CEO, came from deep research and engineering in data protection - focused on enabling real-world access to sensitive data with enterprise-grade performance, reliability, and control.
When they combined perspectives, a powerful insight emerged: enterprises were treating data access as a series of manual exceptions, not as a scalable architectural layer. And in a future shaped by AI agents - where systems need to reason, act, and adapt in real time - that model wouldn’t scale.
To unlock the full potential of AI in the enterprise, they saw the need for a new primitive: an infrastructure layer that makes access fast, secure, and reusable by default, without the hidden tax of duplication, re-platforming, or performance bottlenecks.
That vision led to the creation of PVML’s Virtual Databases — a foundational technology designed to make enterprise data usable by the next generation of AI systems.
In the traditional enterprise stack, “getting data into AI” often means copying it. Data is extracted into staging environments, duplicated across teams, replicated into new tools, and repeatedly transformed to satisfy conflicting security and operational requirements.
This creates compounding consequences: duplication inflates storage and infrastructure costs, every copy widens the security and compliance surface, bespoke pipelines stretch time-to-production into months, and performance suffers as workloads drift further from the live source.
PVML’s answer is a new infrastructure layer: AI-ready Virtual Databases.
Instead of moving data, PVML virtualizes it. Enterprises connect PVML directly to their existing databases using live, zero-copy connectivity - with no duplication, no new pipelines, and no changes to existing workflows. From there, PVML creates virtual, scoped database interfaces, consumable by analytics tools and GenAI agents via MCP or any other AI-native protocol.
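PVML’s virtual layer is proprietary, but the core idea of a scoped, zero-copy interface can be sketched with a plain SQL view: the view exposes only permitted rows and columns and resolves against the live table at query time, with nothing duplicated. The schema, names, and scoping rule below are hypothetical, chosen only to illustrate the principle.

```python
import sqlite3

# In-memory stand-in for an existing enterprise database (hypothetical schema).
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE customers (id INTEGER, name TEXT, region TEXT, ssn TEXT)")
conn.executemany(
    "INSERT INTO customers VALUES (?, ?, ?, ?)",
    [(1, "Ada", "EU", "111-11-1111"),
     (2, "Ben", "US", "222-22-2222"),
     (3, "Cyd", "EU", "333-33-3333")],
)

# A scoped, virtual interface: a SQL view that exposes only the columns and
# rows an EU analytics consumer may see. No data is copied or moved - the
# view resolves against the live table each time it is queried.
conn.execute("""
    CREATE VIEW eu_customers_scoped AS
    SELECT id, name, region FROM customers WHERE region = 'EU'
""")

rows = conn.execute("SELECT * FROM eu_customers_scoped ORDER BY id").fetchall()
print(rows)  # [(1, 'Ada', 'EU'), (3, 'Cyd', 'EU')]
```

A production Virtual Database layers query-time policy decisions and AI-native protocols such as MCP on top, but the zero-copy principle is the same: consumers see a live, scoped interface, never a duplicate.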
At the heart of this system is PVML’s proprietary security engine, which enforces dynamic, context-aware access policies at query time. A high-performance Golang-based compiler enables real-time connectivity between any data source and any AI framework - with low latency and enterprise-grade performance.
This combination of deep security algorithms and high-efficiency engineering forms PVML’s core differentiation: a platform that is secure by design, scalable by architecture, and plug-and-play for any AI ecosystem.
With PVML, teams no longer need to create new datasets just to support new users, environments, or agent use cases. Innovation is unlocked with no added overhead and no compromise on security or performance.
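The dynamic, context-aware enforcement described above can be sketched in miniature: an access decision is derived from the caller’s context and applied when the query runs, rather than by pre-cutting a dataset per consumer. The roles, policy table, and helper below are hypothetical illustrations, not PVML’s actual engine.

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class Context:
    role: str
    purpose: str

# Hypothetical policy table: which columns each role may read. The decision
# is made per request, at query time, from the caller's context.
POLICIES = {
    "analyst":   {"id", "name", "region"},
    "marketing": {"id", "region"},
}

def enforce(ctx: Context, rows: list[dict]) -> list[dict]:
    """Project each row down to the columns the caller's context allows."""
    allowed = POLICIES.get(ctx.role, set())
    return [{k: v for k, v in row.items() if k in allowed} for row in rows]

live_rows = [{"id": 1, "name": "Ada", "region": "EU", "ssn": "111-11-1111"}]
print(enforce(Context("marketing", "campaign"), live_rows))
# [{'id': 1, 'region': 'EU'}]
```

Because the policy is evaluated per query, adding a new consumer means adding a policy entry, not provisioning a new copy of the data.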
In one real-world deployment with a global Fortune-listed fintech, PVML replaced a complex, multi-team workflow that had previously taken months. Within a week, the company launched secure, agentic access to live production data - an 80% acceleration in time-to-production - while eliminating millions of dollars in unnecessary duplication and infrastructure costs, all while preserving full security, compliance, and governance by default.
PVML’s leadership is grounded in a belief that the shift to AI-driven work will be defined less by model breakthroughs and more by infrastructure readiness. As agents become more capable, organizations will need systems that reliably translate enterprise reality - data, policies, and operational context - into fuel agents can use safely and efficiently.
That’s why PVML focuses on infrastructure, not point solutions.
The company’s philosophy is to reduce friction at the deepest layer of the stack: remove the need for manual dataset duplication, minimize operational overhead, and make performance optimal - especially as workloads become increasingly real-time and agent-driven.
Equally important is how PVML approaches enterprise adoption: it is designed to integrate into the current stack rather than replace it. Organizations do not need to redesign their data architecture or slow down transformation programs to benefit - PVML is intended to be a layer that modernizes and optimizes what is already in the stack.
VMware didn’t replace physical servers; it made them more usable by virtualizing the interface to compute. PVML applies the same leap to data: virtualizing access creates elasticity, speed, and efficiency without moving or multiplying the underlying resource.
The next era of enterprise software will be agentic - systems that reason, interact, and act across workflows. But agents introduce new demands: they need lower latency, broader reach across systems, and higher confidence in the “truth” they operate on. The legacy approach - copying data into isolated environments and building one-off pipelines - won’t scale.
PVML’s Virtual Databases are built to enable an optimal future: one where enterprise data access becomes modular, reusable, and optimized by default. A future where teams no longer spend months plumbing data for every new AI initiative. Where organizations stop paying the hidden tax of duplication. Where security is built-in by design and organizations can scale with confidence instead of caution. And where performance is optimized by operating directly on live, scoped and governed data.
Just like VMware powered the modern cloud era by virtualizing compute, PVML aims to define the modern AI era by virtualizing data access.