Hitachi Vantara has expanded its Hitachi iQ portfolio with new software for agent orchestration, additional NVIDIA-based infrastructure options, and tighter integration with distributed data environments for organisations that want to run agentic AI on premises.

The update focuses on production deployments in on-premises and virtualised environments, where data sensitivity, regulatory requirements, and governance rules can limit the use of public cloud services. Hitachi iQ combines storage, networking, and accelerated computing in a pre-validated stack, with software to manage AI workflows and data access.

Agentic AI refers to systems that can plan and take action through software agents. Companies are testing these approaches for customer service, IT operations, and document processing. Many are also exploring multi-agent designs, where specialised agents handle different steps under a coordinating function.

Hitachi Vantara framed the changes as a response to data readiness gaps and rising oversight expectations as AI moves from pilots to operational roll-outs. It cited research showing 42% of organisations in the US and Canada are considered data-mature. Among those organisations, 84% reported measurable AI return on investment, compared with 48% of organisations with weaker data foundations.

“AI is moving into production faster than many organizations’ data foundations are ready to support,” said Octavian Tanase, Chief Product Officer, Hitachi Vantara.

Infrastructure options

The infrastructure layer of Hitachi iQ is built on the Virtual Storage Platform One (VSP One) data platform. The product line now supports air-cooled NVIDIA Blackwell GPUs, as well as Blackwell Ultra GPUs in both air-cooled and liquid-cooled configurations.

Hitachi iQ also supports a 2U NVIDIA MGX-based system with up to four NVIDIA RTX PRO 6000 Blackwell Server Edition GPUs. Support for the NVIDIA RTX PRO 4500 Blackwell Server Edition GPU is planned.

These additions expand configuration options for different AI use cases and for data centres facing power, cooling, and space constraints. In on-premises deployments, those constraints often determine what accelerated computing hardware teams can deploy at scale.

Hitachi iQ integrates compute, networking, and storage in a single stack designed to keep data close to compute. For data-intensive AI workloads, this approach can reduce bottlenecks created by moving data across tiers or systems.

Studio changes

The software layer, Hitachi iQ Studio, now includes expanded AI blueprints and multi-agent coordination functions. Hitachi Vantara says the framework is intended to help enterprises design, deploy, and govern AI agents.

The blueprints define agent roles, including supervisor and worker models. Worker agents execute tasks, while supervisor agents coordinate workflows across multiple agents and adapt based on outcomes.
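The supervisor/worker split described above can be sketched in a few lines of code. The sketch below is purely illustrative, assuming nothing about Hitachi iQ Studio's actual API: a supervisor dispatches tasks to named workers and records each outcome so it can adapt the workflow. All class and role names here are hypothetical.

```python
from dataclasses import dataclass, field

# Hypothetical sketch of the supervisor/worker agent pattern.
# Names and logic are illustrative, not Hitachi iQ Studio's API.

@dataclass
class WorkerAgent:
    name: str

    def execute(self, task: str) -> dict:
        # A real worker would invoke a model or tool; we simulate success.
        return {"task": task, "worker": self.name, "status": "done"}

@dataclass
class SupervisorAgent:
    workers: dict                              # role -> WorkerAgent
    log: list = field(default_factory=list)    # outcomes the supervisor adapts on

    def run(self, plan: list) -> list:
        # Dispatch each (role, task) step; flag steps with no matching worker.
        results = []
        for role, task in plan:
            worker = self.workers.get(role)
            if worker is None:
                results.append({"task": task, "status": "unassigned"})
                continue
            outcome = worker.execute(task)
            self.log.append(outcome)
            results.append(outcome)
        return results

supervisor = SupervisorAgent(workers={
    "retrieval": WorkerAgent("doc-retriever"),
    "summary": WorkerAgent("summariser"),
})
results = supervisor.run([
    ("retrieval", "fetch policy documents"),
    ("summary", "summarise findings"),
])
```

The key design point is that only the supervisor holds the workflow state; workers stay stateless and single-purpose, which is what makes specialised agents composable.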

Hitachi iQ Studio also adds support for NVIDIA Nemotron models, large language models designed for agentic use cases such as tool use and structured task execution. Hitachi Vantara also introduced what it calls “time machine capabilities,” described as a way for AI systems to work with historical datasets with additional context.

In regulated sectors such as financial services, healthcare, and parts of the public sector, long retention periods and audit requirements can shape AI system design. They can also affect where data resides and how models and agents can access it.

“As enterprises continue to scale AI, the ability to combine accelerated computing with consistent software and trusted data becomes essential,” said Jason Hardy, Vice President of Storage Technologies, NVIDIA.

Data integration

Hitachi iQ also deepens integration with Hammerspace, a data orchestration platform for managing distributed data estates. The companies already partner, and the update extends the connection between Hammerspace-managed data and Hitachi iQ Studio.

Under the expanded integration, data managed by Hammerspace can be accessed within Hitachi iQ Studio using the Model Context Protocol (MCP), an open standard for connecting AI systems to external data sources under controlled conditions. The move reflects a broader industry shift toward standardised connectors that reduce custom integration work.
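At the wire level, MCP messages follow JSON-RPC 2.0, with standard methods such as `resources/read` for fetching data a server exposes. The snippet below is a minimal sketch of that message shape only; the resource URI is hypothetical and is not a real Hammerspace endpoint.

```python
import json

# Minimal sketch of an MCP request envelope (MCP is built on JSON-RPC 2.0).
# The URI below is a hypothetical placeholder, not a real endpoint.

def mcp_request(request_id: int, method: str, params: dict) -> str:
    """Serialise a JSON-RPC 2.0 request as used by MCP."""
    return json.dumps({
        "jsonrpc": "2.0",
        "id": request_id,
        "method": method,
        "params": params,
    })

# An AI client asking an MCP server to read one managed dataset in place.
msg = mcp_request(1, "resources/read", {"uri": "file:///datasets/claims.parquet"})
decoded = json.loads(msg)
```

Because the agent reads the resource through the server rather than copying it, the data itself can stay where the orchestration layer governs it.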

According to Hitachi Vantara, the integration lets organisations build AI agents that work with distributed datasets without relocating data. This is often necessary in large enterprises where data spans sites and platforms, and where copying sensitive information into new repositories can create security and compliance risks.

Hitachi Vantara said data remains governed and protected within VSP One, and that availability and performance are maintained as agents operate across environments. It pointed to VSP One Block infrastructure as the basis for that consistent performance and availability.

Storage roadmap

Separately, Hitachi Vantara said it will support the newly announced NVIDIA STX reference architecture as part of its work on AI-focused storage systems. It described the architecture as incorporating NVIDIA Vera Rubin, BlueField-4, Spectrum-X networking, and NVIDIA AI software.

The announcement ties Hitachi Vantara’s storage roadmap to NVIDIA’s next wave of infrastructure designs and signals continued collaboration as demand grows for on-premises AI stacks aligned with vendor reference architectures.

“With these latest enhancements to the Hitachi iQ portfolio, we are expanding across software innovation, high-performance infrastructure and intelligent data integration to give customers greater flexibility and control as they move agentic AI from pilot to production,” said Tanase.