Hitachi Data Systems Delivers Next-Generation Hyper-Converged, Scale-Out Platform For Big Data, Powered By Pentaho
New Turnkey HSP Appliance Delivers Native Integration With Pentaho for Robust Data Integration and Analytics; Simplifies Deployment, Operations and Scaling of Enterprise Big Data Projects
Hitachi Data Systems Corporation (HDS) (www.HDS.com), a wholly owned subsidiary of Hitachi, Ltd., unveiled its next-generation Hitachi Hyper Scale-Out Platform (HSP), offering native integration with the Pentaho Enterprise Platform to deliver a sophisticated, software-defined, hyper-converged platform for big data deployments.
Modern enterprises increasingly need to derive value from the massive volumes of data generated by information technology (IT), operational technology (OT), the Internet of Things (IoT) and other machine sources in their environments. HSP offers a software-defined architecture that centralises the storage and processing of these large datasets with high availability, simplified management and a pay-as-you-grow model.
“Many enterprises don’t possess the internal expertise to perform big data analytics at scale with complex data sources in production environments. Most want to avoid the pitfalls of experimentation with still-nascent technologies, seeking a clear path to deriving real value from their data without the risk and complexity,” said Stuart Cheverton, Business Development Manager, Emerging Technologies.
Delivered as a fully configured, turnkey appliance, HSP takes hours instead of months to install and begin supporting production workloads. It simplifies the creation of an elastic data lake, helping customers easily integrate disparate datasets and run advanced analytic workloads.
“Enterprise customers stand to benefit from turnkey systems like the Hitachi Hyper Scale-Out Platform, which address primary adoption barriers to big data deployments by delivering faster time to insight and value, accelerating the path to digital transformation,” added Cheverton.
HSP’s scale-out architecture provides simplified, scalable and enterprise-ready infrastructure for big data. The architecture also includes a centralized, easy-to-use user interface to automate the deployment and management of virtualized environments for leading open source big data frameworks, including Apache Hadoop, Apache Spark, and commercial open source stacks like the Hortonworks Data Platform (HDP).
The next-generation HSP system now offers native integration with the Pentaho Enterprise Platform to give customers complete control of the analytic data pipeline, along with enterprise-grade features such as big data lineage, lifecycle management and enhanced information security. This combination of technologies in the next-generation HSP appliance was designed to accelerate time to business insight and deliver rapid return on investment (ROI), while simplifying the integration of IT and OT—a strategic imperative for modern, data-driven enterprises.
“Modern enterprises must merge their IT and OT environments to extend the value of their investments. HSP is a perfect solution to accelerate and simplify IT/OT integration and reduce the time to insight while increasing the business value of their big data deployments,” said Michael Haddad, BI Architect at Praxis, the South African Pentaho partner and systems integrator. “The HSP-Pentaho appliance gives customers an affordable, enterprise-class option to unify all their disparate datasets and workloads—including legacy applications and data warehouses—via a modern, scalable and hyper-converged platform that eliminates complexity. We’re pleased to be working with HDS to deliver a simplified solution that combines compute, analytics and data management functions in a plug-and-play, future-ready architecture.”
With HSP, Hitachi continues to deliver on the promise of the software-defined datacenter to simplify the delivery of IT services. While its initial focus is on big data analytics use cases, the company’s long-term direction for HSP is to deliver best-in-class total cost of ownership (TCO) for a variety of IT workloads.