Data Architecture

Vertica Integrates with H3C ONEStor to Bring Cloud-Scale Analytics

The combined offering delivers unified analytics and machine learning on a reliable platform.

Vertica and H3C ONEStor have announced a partnership to bring cloud-native analytics to business data centers. Working together, Vertica and H3C enable analytically driven businesses to elastically increase capacity and performance as data volumes rise and machine learning becomes a business necessity – all from inside hybrid environments.

"With this integration, data-driven leaders in the APAC region will benefit from a powerful combination of industry-leading platforms that accommodate any present and future strategic analytical and machine learning initiatives. H3C has a solid presence in this region, enabling our joint customers to run Vertica's cloud-optimized architecture with H3C's ONEStor to meet the most demanding performance and financial requirements – from enterprise data centers or private clouds."

Scott Richards, vice president and general manager of Vertica

Businesses may use Vertica and H3C ONEStor to leverage cloud innovation for analytics wherever their data is stored, even if cloud migration isn't viable due to scheduling or cost. When these two technologies are combined, they provide fast analytics while also simplifying data protection with features like backup and replication. Additionally, because the solution uses cloud technology for data storage, it delivers 99.9999999% (nine nines) reliability for mission-critical data.
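To put the "nine nines" reliability figure into perspective, the following sketch estimates the expected number of lost objects per year. This is an illustrative assumption only – it treats reliability as an independent annual per-object survival probability, and the object count is hypothetical, not a figure from Vertica or H3C.

```python
# Illustrative sketch of what 99.9999999% ("nine nines") data reliability implies.
# Assumption: reliability is an independent per-object annual survival probability.

durability = 0.999999999          # nine nines, as cited in the announcement
annual_loss_prob = 1 - durability # probability any single object is lost in a year

objects = 1_000_000_000           # hypothetical: one billion stored objects
expected_losses = objects * annual_loss_prob

print(round(expected_losses))     # roughly 1 object lost per year, on average
```

In other words, even at a billion stored objects, this reliability level implies on the order of a single lost object per year under these assumptions.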

Yili Liu, VP of the Cloud and Intelligence Product Line at H3C, stated: "We're delighted to offer Vertica analytics and machine learning on top of our ONEStor from H3C. The analytical performance offered by Vertica, combined with the data reliability of ONEStor, delivers ultra-large scale and capacity, unmatched analytical performance, and high data reliability."

The integrated solution combines high-performance analytics and machine learning with enterprise-grade object storage, allowing businesses to:

Address present and future scalability requirements - As your analytical and machine learning requirements change, elastically scale out to accommodate terabytes to petabytes of data and thousands of users.

Utilize the separation of compute and storage - Administrators can scale computation and data storage resources independently to meet dynamic workload needs.

Simplify database operations - The solution is highly dependable, with multiple built-in data-protection measures. Data loss is extremely rare, with nine nines (99.9999999%) of data reliability.

Address all data consumer demands - Separate analytical workloads so that different types of data consumers – from business analysts to data scientists – can get what they need without fighting for resources.
