
CockroachDB Dedicated for Microsoft Azure Realizes Company's Vision for a Single Distributed Database that Runs Everywhere and Anywhere, Effortlessly


Cockroach Labs, the company behind the leading independent cloud-native distributed SQL database CockroachDB, today announced the realization of its vision: enabling customers to run a best-of-breed distributed database everywhere and anywhere, effortlessly. With the launch of CockroachDB-as-a-service on Microsoft Azure in limited access, CockroachDB is now available on-demand across all three major cloud providers, joining its existing availability on Amazon Web Services and Google Cloud Platform.

In addition, CockroachDB's serverless, consumption-based data platform now allows users to read and write data across multiple geographically distributed regions. Because customers pay only for the exact amount of data stored and the usage of that data, CockroachDB serverless dramatically reduces the cost of operating a global business and makes data-intensive, multi-region applications accessible to companies of any size. These additions give customers even greater flexibility, capability, and control in their cloud strategies. They build on CockroachDB's existing benefits: considerably lower operating costs than traditional scale-up systems, a flawless customer experience backed by bulletproof resilience and elastic scale, and easy expansion into new markets.

CockroachDB as-a-Service on Microsoft Azure

Cockroach Labs' database-as-a-service on Azure (now available in limited access) unlocks effortless resilience and scalability, operational efficiency, and easy multi-region deployments to support local performance and data compliance, all while operating on and integrating with the Microsoft ecosystem. With CockroachDB Dedicated now offered on all three major clouds, users can run data-intensive applications anywhere and everywhere, effortlessly. Organizations can run on a single cloud provider or across multiple providers, and can easily mix workloads between their own data centers and the public cloud.

"The idea of our platform is to be able to survive a cloud vendor outage, so we're not going to be dependent on any cloud vendor," said Kevin Holditch, Head of Platform, Form3. "We're going to have a Kubernetes cluster in each cloud vendor — so Azure, AWS, GCP – and run CockroachDB across the three."

According to Gartner® research, "By 2023, 40% of all enterprise workloads will be deployed in cloud infrastructure and platform services, which is an increase from 20% in 2020."1 And yet several surveys note that up to 80% of transactional workloads have not yet moved to the cloud. That is expected to change as organizations re-evaluate their tech stacks and CIOs prioritize moving their most critical data into the cloud. This movement is indicative of a massive market shift that could signal the end of market dominance by legacy databases like Oracle and IBM. CockroachDB was built from the ground up to operate across any public cloud and/or private infrastructure while staying true to the consistency and durability of a traditional relational database. CockroachDB is the only cloud-independent, distributed SQL database that offers this level of flexibility and operational control today.

Gartner notes, "data and analytics leaders can use this research to plan against their operational use cases for relational and nonrelational cloud DBMSs, which increasingly require features for augmented operations via machine learning, multicloud scenarios and effective financial governance to achieve leadership." Further, Gartner recommends, "select your cloud DBMS independent of the strategic cloud provider. Independent software vendors (ISVs) with multi-cloud and intercloud capability are more likely to fit best when using multiple cloud service providers (CSPs)."2

Introducing CockroachDB serverless for global and multi-region deployment

Today, Cockroach Labs also released multi-region capabilities for its consumption-based, auto-scaling offering, CockroachDB serverless. The update allows customers to distribute rows of data across multiple cloud regions while still operating a single logical database and paying only for the exact storage and compute they use. With both legacy and existing cloud database solutions, the complexity and cost of spinning up a new region add up very quickly. CockroachDB serverless now enables any organization to build applications that serve a globally dispersed user base at very low cost and with simpler operations, opening a global audience to companies of any size.
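Multi-region behavior in CockroachDB is expressed in ordinary SQL over the PostgreSQL wire protocol. The following is a minimal sketch, assuming a hypothetical multi-region serverless cluster whose regions were chosen at creation time; the connection string, database name, regions, and orders table are placeholders for illustration, not part of the announcement.

```python
# Minimal sketch: homing rows by region in a multi-region CockroachDB
# serverless database. The connection string, database, regions, and table
# are hypothetical placeholders; the regions must already be attached
# to the cluster.
import psycopg2

conn = psycopg2.connect(
    "postgresql://app_user:secret@example.cockroachlabs.cloud:26257/"
    "orders_db?sslmode=verify-full"
)
conn.autocommit = True

with conn.cursor() as cur:
    # Declare the database's primary and additional regions.
    cur.execute('ALTER DATABASE orders_db SET PRIMARY REGION "us-east1"')
    cur.execute('ALTER DATABASE orders_db ADD REGION "europe-west1"')

    # A table whose rows are homed in a specific region, row by row.
    cur.execute("""
        CREATE TABLE IF NOT EXISTS orders (
            id UUID PRIMARY KEY DEFAULT gen_random_uuid(),
            customer STRING NOT NULL,
            total DECIMAL NOT NULL
        )
    """)
    cur.execute("ALTER TABLE orders SET LOCALITY REGIONAL BY ROW")

conn.close()
```

With REGIONAL BY ROW locality, each row is homed in one of the database's regions (by default, the region of the node that wrote it), which keeps reads and writes fast for nearby users while the table remains part of a single logical database.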

Enhanced capabilities in CockroachDB MOLT (Migrate Off Legacy Technology)

Migrations require extensive technical and logistical preparation, time, energy, troubleshooting, and optimization. All the while, there is business-level pressure to keep services online during the migration so that customers never notice anything changed. CockroachDB MOLT now enables easier migrations from legacy databases such as Oracle, Postgres, MySQL, and Microsoft SQL Server.

Additionally, several enhancements have been made to CockroachDB MOLT, including MOLT Verify, which validates data migrated from Postgres and MySQL to ensure correct replication; smoother syntax conversion for bulk changes; authentication for Postgres and MySQL clusters; and more intuitive workflows.
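MOLT Verify is Cockroach Labs' own tooling, so the snippet below is only a simplified illustration of the kind of consistency check it automates: comparing row counts and a client-side checksum of ordered rows between a source Postgres table and its migrated copy in CockroachDB. The connection strings and the accounts table are hypothetical.

```python
# Simplified post-migration check (an illustration, not MOLT Verify itself):
# compare row counts and a checksum of rows, ordered by primary key, between
# a source Postgres table and its CockroachDB copy. DSNs and table names are
# hypothetical placeholders and are treated as trusted inputs in this sketch.
import hashlib
import psycopg2

SOURCE_DSN = "postgresql://app:secret@legacy-pg:5432/bank"
TARGET_DSN = "postgresql://app:secret@crdb-host:26257/bank?sslmode=verify-full"

def table_fingerprint(dsn: str, table: str, key: str) -> tuple:
    """Return (row_count, sha256 hex digest) for rows ordered by `key`."""
    digest = hashlib.sha256()
    count = 0
    with psycopg2.connect(dsn) as conn, conn.cursor() as cur:
        cur.execute(f"SELECT * FROM {table} ORDER BY {key}")
        for row in cur:
            digest.update(repr(row).encode())
            count += 1
    return count, digest.hexdigest()

src = table_fingerprint(SOURCE_DSN, "accounts", "id")
dst = table_fingerprint(TARGET_DSN, "accounts", "id")
print("match" if src == dst else f"mismatch: source={src}, target={dst}")
```

A production-grade check also has to reconcile type and encoding differences between the two systems, which is exactly the drudgery a tool like MOLT Verify is designed to take off a migration team's plate.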

"While the move of transactional data to the cloud is accelerating, many data leaders are still at the beginning of their cloud journey and are finding that legacy solutions simply do not meet their needs, especially for their mission-critical applications," said Spencer Kimball, CEO and co-founder at Cockroach Labs. "This release embodies the fulfillment of the vision we set out to accomplish eight years ago. We offer true flexibility and resilience and will meet you where you are in your cloud journey–now and in the future."

Additional feature updates released today include:

  • Distributed User-Defined Functions (UDFs): Allow developers to build UDFs into the database without running into scaling bottlenecks. Distributed UDFs increase developer and application efficiency and enable easier migrations from legacy databases (a minimal sketch follows this list).
  • Terraform provider in General Availability: Use the Terraform provider to automate provisioning and management of CockroachDB dedicated and serverless clusters.
  • FIPS 140-2 readiness: Addresses the Federal Information Processing Standard (FIPS) 140-2, a cryptography standard required by many government agencies and the organizations that work with them, with a new FIPS-ready binary for CockroachDB self-hosted.
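As a companion to the UDF item above, here is a minimal sketch of creating and calling a SQL-language user-defined function over the PostgreSQL wire protocol. The connection string and the order_total function are hypothetical examples chosen for illustration, not ones shipped with the release.

```python
# Minimal sketch: define and call a SQL user-defined function in CockroachDB.
# The connection string and the order_total function are hypothetical examples.
import psycopg2

conn = psycopg2.connect(
    "postgresql://app_user:secret@example.cockroachlabs.cloud:26257/"
    "orders_db?sslmode=verify-full"
)
conn.autocommit = True

with conn.cursor() as cur:
    # A simple immutable SQL-language UDF; it is planned and executed
    # inside the database alongside the queries that call it.
    cur.execute("""
        CREATE FUNCTION order_total(subtotal DECIMAL, tax_rate DECIMAL)
        RETURNS DECIMAL
        IMMUTABLE LEAKPROOF
        LANGUAGE SQL
        AS 'SELECT subtotal * (1 + tax_rate)'
    """)
    cur.execute("SELECT order_total(100.00, 0.08875)")
    print(cur.fetchone()[0])  # computed in the database: 108.875

conn.close()
```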

For more information on CockroachDB on Azure, click here. To test CockroachDB serverless, click here.

GARTNER is a registered trademark and service mark of Gartner, Inc. and/or its affiliates in the U.S. and internationally and is used herein with permission. All rights reserved.

About Cockroach Labs

Cockroach Labs is the creator of CockroachDB, the most highly evolved, cloud-native, distributed SQL database on the planet. CockroachDB helps companies of all sizes — and the apps they develop — scale fast, survive failures, and thrive everywhere. It is in use at some of the world's most successful companies across all industries, including leading companies in financial services, technology, media & entertainment, and retail. Headquartered in New York City, Cockroach Labs is backed by Altimeter, Benchmark, Greenoaks, GV, Firstmark, Index Ventures, Lone Pine, Redpoint Ventures, Sequoia Capital, Tiger Global, and Workbench. For more information, please visit cockroachlabs.com.


Other News
Big Data Management

data.world Integrates with Snowflake Data Quality Metrics to Bolster Data Trust

data.world | January 24, 2024

data.world, the data catalog platform company, today announced an integration with Snowflake, the Data Cloud company, that brings new data quality metrics and measurement capabilities to enterprises. The data.world Snowflake Collector now empowers enterprise data teams to measure data quality across their organization on-demand, unifying data quality and analytics. Customers can now achieve greater trust in their data quality and downstream analytics to support mission-critical applications, confident data-driven decision-making, and AI initiatives.

Data quality remains one of the top concerns for chief data officers and a critical barrier to creating a data-driven culture. Traditionally, data quality assurance has relied on manual oversight – a process that’s tedious and fraught with inefficacy. The data.world Data Catalog Platform now delivers Snowflake data quality metrics directly to customers, streamlining quality assurance timelines and accelerating data-first initiatives. Data consumers can access contextual information in the catalog or directly within tools such as Tableau and Power BI via Hoots – data.world’s embedded trust badges – that broadcast data health status and catalog context, bolstering transparency and trust. Additionally, teams can link certification and DataOps workflows to Snowflake's data quality metrics to automate manual workflows and quality alerts. Backed by a knowledge graph architecture, data.world provides greater insight into data quality scores via intelligence on data provenance, usage, and context – all of which support DataOps and governance workflows.

“Data trust is increasingly crucial to every facet of business and data teams are struggling to verify the quality of their data, facing increased scrutiny from developers and decision-makers alike on the downstream impacts of their work, including analytics – and soon enough, AI applications,” said Jeff Hollan, Director, Product Management at Snowflake. “Our collaboration with data.world enables data teams and decision-makers to verify and trust their data’s quality to use in mission-critical applications and analytics across their business.”

“High-quality data has always been a priority among enterprise data teams and decision-makers. As enterprise AI ambitions grow, the number one priority is ensuring the data powering generative AI is clean, consistent, and contextual,” said Bryon Jacob, CTO at data.world. “Alongside Snowflake, we’re taking steps to ensure data scientists, analysts, and leaders can confidently feed AI and analytics applications data that delivers high-quality insights, and supports the type of decision-making that drives their business forward.”

The integration builds on the robust collaboration between data.world and Snowflake. Most recently, the companies announced an exclusive offering for joint customers, streamlining adoption timelines and offering a new attractive price point. data.world's knowledge graph-powered data catalog already offers unique benefits for Snowflake customers, including support for Snowpark. This offering is now available to all data.world enterprise customers using the Snowflake Collector, as well as customers taking advantage of the Snowflake-only offering. To learn more about the data quality integration or the data.world data catalog platform, visit data.world.

About data.world

data.world is the data catalog platform built for your AI future. Its cloud-native SaaS (software-as-a-service) platform combines a consumer-grade user experience with a powerful Knowledge Graph to deliver enhanced data discovery, agile data governance, and actionable insights. data.world is a Certified B Corporation and public benefit corporation and home to the world’s largest collaborative open data community with more than two million members, including ninety percent of the Fortune 500. Our company has 76 patents and has been named one of Austin’s Best Places to Work seven years in a row.


Big Data

Airbyte Racks Up Awards from InfoWorld, BigDATAwire, Built In; Builds Largest and Fastest-Growing User Community

Airbyte | January 30, 2024

Airbyte, creators of the leading open-source data movement infrastructure, today announced a series of accomplishments and awards reinforcing its standing as the largest and fastest-growing data movement community. With a focus on innovation, community engagement, and performance enhancement, Airbyte continues to revolutionize the way data is handled and processed across industries.

“Airbyte proudly stands as the front-runner in the data movement landscape with the largest community of more than 5,000 daily users and over 125,000 deployments, with monthly data synchronizations of over 2 petabytes,” said Michel Tricot, co-founder and CEO, Airbyte. “This unparalleled growth is a testament to Airbyte's widespread adoption by users and the trust placed in its capabilities.”

The Airbyte community has more than 800 code contributors and 12,000 stars on GitHub. Recently, the company held its second annual virtual conference, called move(data), which attracted over 5,000 attendees.

Airbyte was named an InfoWorld Technology of the Year Award finalist in Data Management – Integration (in October) for cutting-edge products that are changing how IT organizations work and how companies do business. At the start of this year, it was named to the Built In 2024 Best Places To Work Award in San Francisco – Best Startups to Work For, recognizing the company's commitment to fostering a positive work environment, remote and flexible work opportunities, and programs for diversity, equity, and inclusion. Today, the company received the BigDATAwire Readers/Editors Choice Award – Big Data and AI Startup, which recognizes companies and products that have made a difference.

Other key milestones in 2023 include the following:

  • Availability of more than 350 data connectors, making Airbyte the platform with the most connectors in the industry. The company aims to increase that to 500 high-quality connectors supported by the end of this year.
  • More than 2,000 custom connectors were created with the Airbyte No-Code Connector Builder, which enables data connectors to be made in minutes.
  • Significant performance improvement, with database replication speed increased by 10 times to support larger datasets.
  • Added support for five vector databases, in addition to unstructured data sources, as the first company to build a bridge between data movement platforms and artificial intelligence (AI).

Looking ahead, Airbyte will introduce data lakehouse destinations, as well as a new Publish feature to push data to API destinations.

About Airbyte

Airbyte is the open-source data movement infrastructure leader, running in the safety of your cloud and syncing data from applications, APIs, and databases to data warehouses, lakes, and other destinations. Airbyte offers four products: Airbyte Open Source, Airbyte Self-Managed, Airbyte Cloud, and Powered by Airbyte. Airbyte was co-founded by Michel Tricot (former director of engineering and head of integrations at LiveRamp and RideOS) and John Lafleur (serial entrepreneur of dev tools and B2B). The company is headquartered in San Francisco with a distributed team around the world. To learn more, visit airbyte.com.


Data Architecture

SingleStore Announces Real-time Data Platform to Further Accelerate AI, Analytics and Application Development

SingleStore | January 25, 2024

SingleStore, the database that allows you to transact, analyze and contextualize data, today announced powerful new capabilities — making it the industry’s only real-time data platform. With its latest release, dubbed SingleStore Pro Max, the company announced ground-breaking features like indexed vector search, an on-demand compute service for GPUs/CPUs and a new free shared tier, among several other innovative new products. Together, these capabilities shrink development cycles while providing the performance and scale that customers need for building applications.

In an explosive generative AI landscape, companies are looking for a modern data platform that’s ready for enterprise AI use cases — one with best-available tooling to accelerate development, simultaneously allowing them to marry structured or semi-structured data residing in enterprise systems with unstructured data lying in data lakes.

“We believe that a data platform should both create new revenue streams while also decreasing technological costs and complexity for customers. And this can only happen with simplicity at the core,” said Raj Verma, CEO, SingleStore. “This isn’t just a product update, it’s a quantum leap… SingleStore is offering truly transformative capabilities in a single platform for customers to build all kinds of real-time applications, AI or otherwise.”

“At Adobe, we aim to change the world through digital experiences,” said Matt Newman, Principal Data Architect, Adobe. “SingleStore’s latest release is exciting as it pushes what is possible when it comes to database technology, real-time analytics and building modern applications that support AI workloads. We’re looking forward to these new features as more and more of our customers are seeking ways to take full advantage of generative Al capabilities.”

Key new features launched include:

  • Indexed vector search: SingleStore has announced support for vector search using Approximate Nearest Neighbor (ANN) vector indexing algorithms, leading to 800-1,000x faster vector search performance than precise methods (KNN). With both full-text and indexed vector search capabilities, SingleStore offers developers true hybrid search that takes advantage of the full power of SQL for queries, joins, filters and aggregations. These capabilities firmly place SingleStore above vector-only databases that require niche query languages and are not designed to meet enterprise security and resiliency needs.
  • Free shared tier: SingleStore has announced a new cloud-based Free Shared Tier that’s designed for startups and developers to quickly bring their ideas to life — without the need to commit to a paid plan.
  • On-demand compute service for GPUs and CPUs: SingleStore announced a compute service that works alongside SingleStore’s native Notebooks to let developers spin up GPUs and CPUs to run database-adjacent workloads including data preparation, ETL, third-party native application frameworks, etc. This capability brings compute to algorithms, rather than the other way around, enabling developers to build highly performant AI applications safely and securely using SingleStore — without unnecessary data movement.
  • New CDC capabilities for data ingest and egress: To ease the burden and costs of moving data in and out of SingleStore, SingleStore is adding native capabilities for real-time Change Data Capture (CDC) in from MongoDB® and MySQL, and ingestion from Apache Iceberg, without requiring other third-party CDC tools. SingleStore will also support CDC out capabilities that ease migrations and enable the use of SingleStore as a source for other applications and databases like data warehouses and lakehouses.
  • SingleStore Kai™: Now generally available, and ready for both analytical and transactional processing for apps originally built on MongoDB. Announced in public preview in early 2023, SingleStore Kai is an API that delivers over 100x faster analytics on MongoDB® with no query changes or data transformations required. Today, SingleStore Kai supports the BSON data format natively, has improved transactional performance, increased performance for arrays and offers industry-leading compatibility with the MongoDB query language.
  • Projections: To further advance as the world’s fastest HTAP database, SingleStore has added Projections. Projections allow developers to greatly speed up range filters and group-by operations by introducing secondary sort and shard keys. Query performance improvements range from 2-3x or more, depending on the size of the table.

With this latest release, SingleStore becomes the industry’s first and only real-time data platform designed for all applications, analytics and AI. SingleStore supports high-throughput ingest performance, ACID transactions and low-latency analytics; and structured, semi-structured (JSON, BSON, text) and unstructured data (vector embeddings of audio, video, images, PDFs, etc.). Finally, SingleStore’s data platform is designed not just with developers in mind, but also ML engineers, data engineers and data scientists.

“Our new features and capabilities advance SingleStore’s mission of offering a real-time data platform for the next wave of gen AI and data applications,” said Nadeem Asghar, SVP, Product Management + Strategy at SingleStore. “New features, including vector search, Projections, Apache Iceberg, Scheduled Notebooks, autoscaling, GPU compute services, SingleStore Kai™, and the Free Shared Tier allow startups — as well as global enterprises — to quickly build and scale enterprise-grade real-time AI applications. We make data integration with third-party databases easy with both CDC in and CDC out support.”

"Although generative AI, LLM, and vector search capabilities are early stage, they promise to deliver a richer data experience with translytical architecture," states the 2023 report, “Translytical Architecture 2.0 Evolves To Support Distributed, Multimodel, And AI Capabilities,” authored by Noel Yuhanna, Vice President and Principal Analyst at Forrester Research. "Generative AI and LLM can help democratize data through natural language query (NLQ), offering a ChatGPT-like interface. Also, vector storage and index can be leveraged to perform similarity searches to support data intelligence."

SingleStore has been on a fast track leading innovation around generative AI. The company’s product evolution has been accompanied by high-momentum growth in customers, surpassing $100M in ARR late last year. SingleStore also recently ranked #2 in the emerging category of vector databases, and was recognized by TrustRadius as a top vector database in 2023. Finally, SingleStore was a winner of InfoWorld’s Technology of the Year award in the database category. To learn more about SingleStore, visit here.

About SingleStore

SingleStore empowers the world’s leading organizations to build and scale modern applications using the only database that allows you to transact, analyze and contextualize data in real time. With streaming data ingestion, support for both transactions and analytics, horizontal scalability and hybrid vector search capabilities, SingleStore helps deliver 10-100x better performance at 1/3 the costs compared to legacy architectures. Hundreds of customers worldwide — including Fortune 500 companies and global data leaders — use SingleStore to power real-time applications and analytics. Learn more at singlestore.com. Follow us @SingleStoreDB on Twitter or visit www.singlestore.com.


Big Data Management

The Modern Data Company Recognized in Gartner's Magic Quadrant for Data Integration

The Modern Data Company | January 23, 2024

The Modern Data Company, recognized for its expertise in developing and managing advanced data products, is delighted to announce its distinction as an honorable mention in Gartner's 'Magic Quadrant for Data Integration Tools,' powered by our leading product, DataOS.

“This accolade underscores our commitment to productizing data and revolutionizing data management technologies. Our focus extends beyond traditional data management, guiding companies on their journey to effectively utilize data, realize tangible ROI on their data investments, and harness advanced technologies such as AI, ML, and Large Language Models (LLMs). This recognition is a testament to Modern Data’s alignment with the latest industry trends and our dedication to setting new standards in data integration and utilization.” – Srujan Akula, CEO of The Modern Data Company

The inclusion in the Gartner report highlights The Modern Data Company's pivotal role in shaping the future of data integration. Our innovative approach, embodied in DataOS, enables businesses to navigate the complexities of data management, transforming data into a strategic asset. By simplifying data access and integration, we empower organizations to unlock the full potential of their data, driving insights and innovation without disruption.

"Modern Data's recognition as an Honorable Mention in the Gartner MQ for Data Integration is a testament to the transformative impact their solutions have on businesses like ours. DataOS has been pivotal in allowing us to integrate multiple data sources, enabling our teams to have access to the data needed to make data driven decisions." – Emma Spight, SVP Technology, MIND 24-7

The Modern Data Company simplifies how organizations manage, access, and interact with data using its DataOS (data operating system) that unifies data silos, at scale. It provides ontology support, graph modeling, and a virtual data tier (e.g. a customer 360 model). From a technical point of view, it closes the gap from conceptual to physical data model. Users can define conceptually what they want and its software traverses and integrates data.

DataOS provides a structured, repeatable approach to data integration that enhances agility and ensures high-quality outputs. This shift from traditional pipeline management to data products allows for more efficient data operations, as each 'product' is designed with a specific purpose and standardized interfaces, ensuring consistency across different uses and applications. With DataOS, businesses can expect a transformative impact on their data strategies, marked by increased efficiency and a robust framework for handling complex data ecosystems, allowing for more and faster iterations of conceptual models.

About The Modern Data Company

The Modern Data Company, with its flagship product DataOS, revolutionizes the creation of data products. DataOS® is engineered to build and manage comprehensive data products to foster data mesh adoption, propelling organizations towards a data-driven future. DataOS directly addresses key AI/ML and LLM challenges: ensuring quality data, scaling computational resources, and integrating seamlessly into business processes. In our commitment to provide open systems, we have created an open data developer platform specification that is gaining wide industry support.
