Q&A with Charles Southwood, Vice President, N. Europe and MEA at Denodo

Charles Southwood, Regional VP at Denodo Technologies, is responsible for the company's business revenues in Northern Europe, the Middle East and Africa. He is passionate about working in rapidly moving, innovative markets, supporting customer success and aligning IT solutions with changing business needs. With a degree in engineering from Imperial College London, Charles has over 20 years of experience in data integration, big data, IT infrastructure/IT operations and business analytics.

There is a huge buzz around AI and, for about 10% of businesses, it is already being used to drive better, faster business decision-making.



MEDIA 7: Could you please tell us a little bit about yourself and what made you choose this career path?
CHARLES SOUTHWOOD:
Like several friends and colleagues who now work in IT, I started my career as a civil engineer and moved into IT as the industry matured. Over the years I have found that many attributes of that early engineering background lend themselves well to the logical, analytical approach needed to solve customer business problems with data and IT solutions, and to the efficient operation of a sales and marketing business. After graduating from Imperial College, I spent several years in the industry with Taylor Woodrow, both on-site and in the design office. This included a unique opportunity to spend several months in the Amazon jungle designing and building bridges to connect remote villages that would otherwise be cut off during the rainy season; both an unusual and very rewarding experience for someone in their mid-twenties!

Upon returning to the UK, I moved into the design office, designing dams, oil rigs, power stations and river diversion schemes. Much of the design involved the use of CAD/CAM systems and finite difference and finite element analysis (using large matrix manipulation) to model loadings and stresses in structures and to predict fluid flow patterns. Although business-related computing was in its infancy, I could see enormous potential and found the prospects for much wider use of IT to be very enticing. An opportunity presented itself to move to a company that sold solutions and consultancy services both for engineering and for general business applications. I joined the company in a sales position and, although I still had my links to engineering, the faster growth in the use of IT for business inevitably led to my focus on this area. This in turn led to other sales positions, into sales management and then to wider business leadership roles.

Over the last 25 years, I have therefore had the pleasure of working in a number of software houses, driving the sales and marketing operations of companies where the predominant focus is on innovative, often disruptive, market solutions. I am privileged to have had the opportunity to discuss business issues and strategies with a vast number of large and mid-sized organisations, across different industries and at varying stages of IT and data maturity. It is great to be able to discuss the 'big-picture' perspectives on the trends and needs of those client businesses and to subsequently see the impact that can be achieved by supporting them with the right solutions. For the last 5 years, as Regional Vice President of Denodo Technologies, I've been responsible for leading the company's business growth across northern Europe, the Middle East and Africa.

With ever-increasing data volumes, a greater variety of data formats and more disparate locations, coupled with the mounting desire for real-time data insights, we've seen rapid growth in demand for improved data integration. As the world's market leader in data virtualisation, we're seeing strong adoption across all sectors, addressing data and application integration needs for a wide array of use cases, from self-service and agile BI to Customer 360, cloud migration and compliance/risk. It is an exciting time to be at the heart of these industry initiatives!


M7: Your award-winning data virtualization platform is regularly featured in the world's leading business and IT publications. What are the core values that drive the company to be a leader in data virtualization?
CS:
Denodo was founded in 1999 and has focused on data virtualisation throughout that time. In fact, many look upon Denodo as the 'founding father' of data virtualisation technology and, as you say, the capabilities of the Denodo Platform see it consistently placed in the leadership categories of the various analyst reports on data and application integration. For example, in Gartner's latest Magic Quadrant for Data Integration (June 2021), Denodo is listed as the fastest-growing vendor in the Leaders quadrant. The company is privately owned, headquartered in Palo Alto and led by its founder and CEO, Angel Vina.

Financially, Denodo is in great shape, with 50-60% year-on-year revenue growth and a business that is profitable and carries no long-term debt. The combination of consistent executive leadership and strong financials means we've been able to maintain an unrivalled focus on excellence: excellence in technical capability, excellence in innovation and excellence in customer satisfaction. This is reflected in Denodo's vibrant and enthusiastic user community, which can be seen in the many references and case studies available on Denodo's website, as well as in the customer satisfaction ratings in the various vendor surveys.


It is no longer practical to move and copy all the data that might be needed into one large curated data warehouse or data lake. It is too slow, too complex and too costly.



M7: Given the changes in today’s rapidly evolving marketplace, what are the key 'climatic' changes taking place in Data Virtualization today? What systematic approach are you using to address these changes?
CS:
When talking to businesses about their wider data integration needs, the challenges often include siloed data, legacy applications (often designed with no integration in mind), cloud migration, hybrid cloud (on-premise and cloud combinations) or multi-cloud combinations, ungoverned 'shadow IT' and a wide variety of formats and protocols. Combine this with rapidly growing data volumes, the disparate nature of modern data stores (on-premise, SaaS and various cloud locations, plus perhaps external sources) and the business demand for real-time or near real-time data insights, and you have some big challenges! If you were to refer to these as 'climatic changes', then most businesses are encountering a heat-wave! This is typically when Denodo gets invited to the party. It is no longer practical to move and copy all the data that might be needed into one large curated data warehouse or data lake. It is too slow, too complex and too costly. The industry is littered with stories of the difficulties encountered with the move-and-copy approach and big data lake projects that never end.

Instead, connecting a virtual layer to all potential sources (without moving and copying the data) means that data consumers can be connected to just the virtual layer and get secure access to everything they might need, regardless of location or format. The integration can be provided in real time inside the virtual layer. It saves on the complexity and cost of all the 'heavy lifting' and duplication of vast amounts of data, and users get agility and real-time data (rather than the overnight batch versions they were getting from physical data warehouses). Of course, IT also benefits from a single layer for audit, security and governance, ensuring compliance and much faster data access for the business. We all do something similar at home when we watch movies or listen to music with services like Netflix or Spotify. We get what we want, when we want it, and don't have to worry about where it is stored. We also get an almost infinite choice, as there is no need to hold all the DVD or CD collections locally, on our shelves. Data virtualisation provides the same kind of paradigm, but for business. When it comes to data: 'Connect, don't collect'!
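To make the 'connect, don't collect' pattern concrete, here is a minimal Python sketch of the idea: two live sources (a simulated on-premise database and a simulated SaaS feed, both hypothetical) are joined inside a virtual view at query time, with nothing copied in advance. It illustrates the concept only and in no way represents how the Denodo Platform is implemented.

```python
# A minimal sketch of the "connect, don't collect" idea -- illustrative
# only, not how the Denodo Platform is implemented. Both sources are
# hypothetical: an "on-premise" warehouse simulated with in-memory SQLite
# and a "cloud SaaS" feed simulated with a plain function.
import sqlite3

warehouse = sqlite3.connect(":memory:")
warehouse.execute("CREATE TABLE orders (customer_id INTEGER, amount REAL)")
warehouse.executemany("INSERT INTO orders VALUES (?, ?)",
                      [(1, 120.0), (1, 80.0), (2, 42.5)])

def fetch_crm_customers():
    # Stand-in for a live SaaS/API source such as a CRM.
    return [{"id": 1, "name": "Acme Ltd"}, {"id": 2, "name": "Globex"}]

def customer_spend_view():
    """Virtual view: joins both sources at query time.

    Nothing is moved or copied in advance; each call reads the live
    sources, so consumers always see current data.
    """
    totals = dict(warehouse.execute(
        "SELECT customer_id, SUM(amount) FROM orders GROUP BY customer_id"))
    for customer in fetch_crm_customers():
        yield {"customer": customer["name"],
               "total_spend": totals.get(customer["id"], 0.0)}

for row in customer_spend_view():
    print(row)  # e.g. {'customer': 'Acme Ltd', 'total_spend': 200.0}
```

Because the join runs at query time, every consumer of the view sees current data from both sources; a real platform applies the same idea declaratively, with security, caching and query optimisation layered on top.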


M7: Could you please help our readers understand the relationship between data virtualization and digital transformation?
CS:
This is a very good question, as it is easy to mix up the terms data virtualisation and digital transformation! Digital transformation is the strategic adoption of digital technology to meet changing market and business needs. It often involves the creation of new business processes or the modification of existing ones. As organisations reduce or eliminate paper-based processes, we see the switch from analogue to digital. It brings with it new opportunities and new customer interactions that would previously have been impossible. Interestingly, data virtualisation is often used in digital transformation projects as a key element in the new digitised architecture, giving greater agility to the business, with new products, processes and services made available through the combination of data in new and innovative ways. Take, for example, a new omnichannel approach to customer interaction, offering a mobile app, web-based access and a call-centre service, as well as perhaps internal management reports and real-time executive dashboards.

These different data consumers require different technologies and therefore different formats and data delivery protocols. Without data virtualisation, each different flavour of consumer would need specific interfaces and manipulation of the data to provide the right exposure. This is time-consuming, requires coding and data movement, and is both costly and potentially error-prone. It also makes it hard to make changes in a quick and agile manner. However, by using a data virtualisation layer, all the data combinations and transformations can be defined just once, linking the published data to the different data consumers from the same single data model. Now you have a very agile, easy-to-maintain system delivering consistent data to all consumers irrespective of the technology. Oh, and let's not forget it is now also real-time or near real-time from the source! For many businesses, this can be truly transformational!
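As a rough illustration of 'define once, publish many times', the hypothetical Python sketch below keeps a single logical view and hangs thin delivery adapters off it. A real data virtualisation layer does this declaratively and at scale, but the maintenance benefit is the same: change the view once and every consumer stays consistent.

```python
# A hedged sketch of "define once, publish to many consumers" -- the view
# and its contents are hypothetical, and a real data virtualisation layer
# does this declaratively rather than in hand-written code.
import csv
import io
import json

def unified_customer_view():
    # Stand-in for the single data model defined once in the virtual layer.
    return [{"customer": "Acme Ltd", "channel": "mobile", "open_tickets": 2},
            {"customer": "Globex", "channel": "web", "open_tickets": 0}]

def as_json():
    # Delivery adapter for the mobile app or web front end.
    return json.dumps(unified_customer_view())

def as_csv():
    # Delivery adapter for management reporting tools.
    buffer = io.StringIO()
    writer = csv.DictWriter(buffer,
                            fieldnames=["customer", "channel", "open_tickets"])
    writer.writeheader()
    writer.writerows(unified_customer_view())
    return buffer.getvalue()

# Every consumer reads the same definition, so a change made once to
# unified_customer_view() propagates consistently to all formats.
print(as_json())
print(as_csv())
```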


Many AI applications have obvious and direct benefits which most people would accept as reasonable, but as the level of sophistication grows, the issues will inevitably become more complex.



M7: How has COVID-19 affected the market? What was Denodo’s strategy during the pandemic?
CS:
The demand for data virtualisation is driven by projects and strategies requiring faster, more digital business models, where data is recognised as having the potential to be a key differentiator: new insights, faster interactions, new products and services, greater customer intelligence, propensity models, improved governance and compliance, reduced risk and the adoption of AI/ML are all typical business drivers for data virtualisation. The solution is very much industry-agnostic and is therefore used widely across most major business sectors, including finance, telco, pharma, manufacturing, government, retail and utilities. For this reason, we saw some sectors accelerating their adoption and others holding back. In banking, for example, branch closures drove greater digitisation and the adoption of online banking; likewise, the move to a virtually cashless economy provided far more comprehensive data sets for the analysis of customer spending patterns than could ever have been achieved when cash was in play.

For our part, Denodo was keen to help combat the global challenges we all faced. We therefore spun up a data portal with the sole aim of integrating disparate global data about COVID-19, curating it, and providing it free of charge to data consumers such as data scientists, analysts and researchers, who could use it to look at the impact of, and potential solutions to, this disease. The portal operated for some 15 months, with categories of content from several hundred official data sources around the world (such as NHS England, the ONS and the WHO) for everyone to consume. It was set up and running in just 21 days, which demonstrates the agility available from using data virtualisation for analytics, and some views were accessed tens of thousands of times over its period of operation. The objective was to use data insights to help anyone, anywhere in the world, fight COVID-19. It wasn't just medical data; other examples include the ability to highlight the percentage change in visits to places like grocery stores and parks within a geographic area, or to look at changes in air quality across regions over time, based on changing citizen movements. Collaboration was needed to maximise the value of this initiative to society, and the portal statistics indicate that it was put to good use over the last 18 months. It's a great example of the use of data virtualisation for high-velocity, high-volume analytics.

We've also had feedback from customers on their own application of the Denodo Platform during the pandemic. NHS National Services Scotland, for example, used Denodo's technology to solve challenges such as:

- The provisioning of data for COVID reporting to the Scottish Government and the First Minister of Scotland
- Helping to monitor oxygen availability at various hospitals
- Enabling data for Test and Trace of COVID patients and the general public
- Implementing the vaccination programme, including venue and appointment management, eligibility, uptake, etc.
- The planning and availability of required PPE and other supplies

Overall, given the huge impact on some industry sectors, we were fortunate to have been in IT as our business was able to work remotely, largely without difficulty, and we were able to continue to support our customers’ needs.


M7: How do you prepare for an AI-centric world as a Business Leader?
CS:
One of the fundamental technical requirements for good artificial intelligence is the availability of high-quality, comprehensive and timely data. For this reason, we are often asked to help by using data virtualisation to abstract data from different sources in real time and present the transformed results to AI models. Certainly, there is a huge buzz around AI and, for about 10% of businesses, it is already being used to drive better, faster business decision-making. Based on what we see, there are three common strategies in the market: business-led, technology-led and a combined, iterative approach. The business-led strategy is driven by business demand for a specific capability. Initially, this seems to be largely for personalisation and automated customer service (chatbots), but now we're also seeing growing sophistication in real-time fraud detection and other 'detection and prevention' models, as well as improvements in operational efficiencies (e.g. customer onboarding).

Where we see technology-led strategies, AI is being applied by data science and business analytics teams to understand the data itself and its potential for insights – availability, timeliness, quality, veracity and other attributes. What is where? Can it be used? What combinations have value? This approach is much more of a data-led initiative. The third strategy combines business-led demand with technology-driven insights into what might be possible to solve certain business needs, as an iterative process. In many companies, it is the work of the data scientists that shows the greatest potential for value in the use of AI/ML, with new insights and new data combinations. At Denodo we see growing demand for data virtualisation because it substantially accelerates access to data and therefore the surfacing of new, meaningful insights that give the business a competitive edge.
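A hedged sketch of the pattern described above: the model reads its features through a (hypothetical) virtual-layer query at inference time, rather than from a stale batch extract. The feature names, the fetch function and the scoring rule are all invented for illustration.

```python
# Hedged sketch: a model reads its features through a (hypothetical)
# virtual-layer query at inference time instead of from a stale batch
# extract. Feature names, fetch function and scoring rule are invented.
from datetime import datetime, timezone

def fetch_live_features(customer_id: int) -> dict:
    # Stand-in for a real-time query against the virtual layer, which
    # would federate CRM, transaction and support-ticket sources.
    return {"recent_spend": 310.0,
            "support_tickets_30d": 1,
            "as_of": datetime.now(timezone.utc)}

def churn_score(features: dict) -> float:
    # Toy scoring rule standing in for a trained ML model.
    score = 0.2 + 0.1 * features["support_tickets_30d"]
    score -= 0.0001 * features["recent_spend"]
    return max(0.0, min(1.0, score))

features = fetch_live_features(customer_id=42)
print(features["as_of"], "churn risk:", round(churn_score(features), 3))
```

The design point is the data path, not the model: because features are fetched at scoring time, the prediction reflects the customer's current state rather than last night's extract.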

The value of AI may be compelling (for example, fraud prevention on payments, or more advertising revenue from customised adverts), but we have to balance this against the need for privacy compliance with GDPR, the Data Protection Act 2018, etc. The largest cost of non-compliance on privacy is often not the fine itself but the brand damage, so we often hear of priority being given to privacy compliance over analytics needs. There are also a host of ethical issues arising from the use of AI. In the short term, enterprises have to show value to their own business and to society. Many AI applications have obvious and direct benefits which most people would accept as reasonable, but as the level of sophistication grows, the issues will inevitably become more complex.

ABOUT DENODO

Denodo is the leader in data virtualization providing agile, high performance data integration, data abstraction, and real-time data services across the broadest range of enterprise, cloud, big data, and unstructured data sources at half the cost of traditional approaches. Denodo’s customers across every major industry have gained significant business agility and ROI by enabling faster and easier access to unified business information for agile BI, big data analytics, Web, cloud integration, single-view applications, and enterprise data services.

More THOUGHT LEADERS

Q&A with Gil Eyal, Founder at HYPR & Managing Partner at Starfund

Media 7 | February 24, 2021

Gil Eyal, Founder at HYPR & Managing Partner at Starfund, has revolutionized the way many of the world's biggest agencies and brands are running influencer marketing by focusing on the same data, analytics, and audience demographic information relevant to traditional digital marketing. He was recently selected as #30 on the list of the most influential people in Influencer Marketing. Gil is an accomplished public speaker and has delivered keynotes at notable influencer marketing conferences, including Influencer Marketing Days in New York and Influencer Marketing Hub in London. He was selected as the 2017 recipient of the Digiday Top Boss Award in the technology industry, as one of 10 Israelis impacting the New York Tech Scene, as well as one of 40 must-follow digital media influencers. Gil is also a two-time winner of the MarCom Awards for Excellence in Marketing and Communications....


Q&A with Vishal Srivastava, Vice President (Model Validation) at Citi

Media 7 | September 8, 2021

Vishal Srivastava, Vice President (Model Validation) at Citi was invited as a keynote speaker to present on Fraud Analytics using Machine Learning at the International Automation in Banking Summit in New York in November 2019. Vishal has experience in quantitative risk modeling using advanced engineering, statistical, and machine learning technologies. His academic qualifications in combination with a Ph.D. in Chemical Engineering and an MBA in Finance have enabled him to challenge quantitative risk models with scientific rigor. Vishal’s doctoral thesis included the development of statistical and machine learning-based risk models—some of which are currently being used commercially. Vishal has 120+ peer-reviewed citations in areas such as risk management, quantitative modeling, machine learning, and predictive analytics....


Q&A with James Lee, Managing Director and Head of Financial Services, Analytics and Cloud Transformation at PwC

Media 7 | August 16, 2021

James Lee, Managing Director and Head of Financial Services, Analytics and Cloud Transformation at PwC, is a well-recognized management consulting leader and senior technology executive specializing in advising global financial services organizations on “cloud-first, data-driven” digital transformation with data and analytics, AI, and intelligent automation. He has over 20 years of strategy consulting and technology operation experience in North America, Asia, and Europe that spanned across various industries including insurance, banking, asset and wealth management, private equity, and telecommunications....



Related News

Data Architecture

SingleStore Announces Real-time Data Platform to Further Accelerate AI, Analytics and Application Development

SingleStore | January 25, 2024

SingleStore, the database that allows you to transact, analyze and contextualize data, today announced powerful new capabilities — making it the industry's only real-time data platform. With its latest release, dubbed SingleStore Pro Max, the company announced ground-breaking features like indexed vector search, an on-demand compute service for GPUs/CPUs and a new free shared tier, among several other innovative new products. Together, these capabilities shrink development cycles while providing the performance and scale that customers need for building applications. In an explosive generative AI landscape, companies are looking for a modern data platform that's ready for enterprise AI use cases — one with best-available tooling to accelerate development, simultaneously allowing them to marry structured or semi-structured data residing in enterprise systems with unstructured data lying in data lakes.

"We believe that a data platform should both create new revenue streams while also decreasing technological costs and complexity for customers. And this can only happen with simplicity at the core," said Raj Verma, CEO, SingleStore. "This isn't just a product update, it's a quantum leap… SingleStore is offering truly transformative capabilities in a single platform for customers to build all kinds of real-time applications, AI or otherwise."

"At Adobe, we aim to change the world through digital experiences," said Matt Newman, Principal Data Architect, Adobe. "SingleStore's latest release is exciting as it pushes what is possible when it comes to database technology, real-time analytics and building modern applications that support AI workloads. We're looking forward to these new features as more and more of our customers are seeking ways to take full advantage of generative AI capabilities."

Key new features launched include:

- Indexed vector search. SingleStore has announced support for vector search using Approximate Nearest Neighbor (ANN) vector indexing algorithms, leading to 800-1,000x faster vector search performance than precise methods (KNN). With both full-text and indexed vector search capabilities, SingleStore offers developers true hybrid search that takes advantage of the full power of SQL for queries, joins, filters and aggregations. These capabilities firmly place SingleStore above vector-only databases that require niche query languages and are not designed to meet enterprise security and resiliency needs.

- Free shared tier. SingleStore has announced a new cloud-based Free Shared Tier that's designed for startups and developers to quickly bring their ideas to life — without the need to commit to a paid plan.

- On-demand compute service for GPUs and CPUs. SingleStore announces a compute service that works alongside SingleStore's native Notebooks to let developers spin up GPUs and CPUs to run database-adjacent workloads, including data preparation, ETL, third-party native application frameworks, etc. This capability brings compute to algorithms, rather than the other way around, enabling developers to build highly performant AI applications safely and securely using SingleStore — without unnecessary data movement.

- New CDC capabilities for data ingest and egress. To ease the burden and costs of moving data in and out of SingleStore, SingleStore is adding native capabilities for real-time Change Data Capture (CDC) in for MongoDB®, MySQL and ingestion from Apache Iceberg, without requiring other third-party CDC tools. SingleStore will also support CDC out capabilities that ease migrations and enable the use of SingleStore as a source for other applications and databases like data warehouses and lakehouses.

- SingleStore Kai™. Now generally available, and ready for both analytical and transactional processing for apps originally built on MongoDB. Announced in public preview in early 2023, SingleStore Kai is an API to deliver over 100x faster analytics on MongoDB® with no query changes or data transformations required. Today, SingleStore Kai supports the BSON data format natively, has improved transactional performance, increased performance for arrays and offers industry-leading compatibility with the MongoDB query language.

- Projections. To further advance as the world's fastest HTAP database, SingleStore has added Projections, which allow developers to greatly speed up range filters and group-by operations by introducing secondary sort and shard keys. Query performance improvements range from 2-3x or more, depending on the size of the table.

With this latest release, SingleStore becomes the industry's first and only real-time data platform designed for all applications, analytics and AI. SingleStore supports high-throughput ingest performance, ACID transactions and low-latency analytics; and structured, semi-structured (JSON, BSON, text) and unstructured data (vector embeddings of audio, video, images, PDFs, etc.). Finally, SingleStore's data platform is designed not just with developers in mind, but also ML engineers, data engineers and data scientists.

"Our new features and capabilities advance SingleStore's mission of offering a real-time data platform for the next wave of gen AI and data applications," said Nadeem Asghar, SVP, Product Management + Strategy at SingleStore. "New features, including vector search, Projections, Apache Iceberg, Scheduled Notebooks, autoscaling, GPU compute services, SingleStore Kai™, and the Free Shared Tier allow startups — as well as global enterprises — to quickly build and scale enterprise-grade real-time AI applications. We make data integration with third-party databases easy with both CDC in and CDC out support."

"Although generative AI, LLM, and vector search capabilities are early stage, they promise to deliver a richer data experience with translytical architecture," states the 2023 report, "Translytical Architecture 2.0 Evolves To Support Distributed, Multimodel, And AI Capabilities," authored by Noel Yuhanna, Vice President and Principal Analyst at Forrester Research. "Generative AI and LLM can help democratize data through natural language query (NLQ), offering a ChatGPT-like interface. Also, vector storage and index can be leveraged to perform similarity searches to support data intelligence."

SingleStore has been on a fast track leading innovation around generative AI. The company's product evolution has been accompanied by high-momentum growth in customers, surpassing $100M in ARR late last year. SingleStore also recently ranked #2 in the emerging category of vector databases, and was recognized by TrustRadius as a top vector database in 2023. Finally, SingleStore was a winner of InfoWorld's Technology of the Year award in the database category. To learn more about SingleStore visit here.

About SingleStore

SingleStore empowers the world's leading organizations to build and scale modern applications using the only database that allows you to transact, analyze and contextualize data in real time. With streaming data ingestion, support for both transactions and analytics, horizontal scalability and hybrid vector search capabilities, SingleStore helps deliver 10-100x better performance at 1/3 the costs compared to legacy architectures. Hundreds of customers worldwide — including Fortune 500 companies and global data leaders — use SingleStore to power real-time applications and analytics. Learn more at singlestore.com. Follow us @SingleStoreDB on Twitter or visit www.singlestore.com.
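The headline capability above is approximate nearest-neighbour (ANN) indexing. As a rough intuition for why an ANN index can be orders of magnitude faster than exact KNN, here is a toy Python sketch using random-hyperplane hashing (LSH), which scores only one hash bucket instead of every stored vector. It is purely illustrative: it is not SingleStore's implementation (whose internals the release does not describe), and production ANN indexes such as HNSW or IVF are far more sophisticated.

```python
# Toy contrast between exact KNN (scan everything) and approximate search
# (scan one LSH bucket). Illustrative only -- not SingleStore's index.
import numpy as np

rng = np.random.default_rng(0)
dim, n = 64, 100_000
vectors = rng.normal(size=(n, dim)).astype(np.float32)

def knn_exact(query, k=5):
    # Exact (KNN) search: score every stored vector, O(n) per query.
    sims = vectors @ query
    return np.argsort(-sims)[:k]

# Build a toy ANN index: 8 random hyperplanes give a 256-bucket hash;
# similar vectors tend to land in the same bucket.
planes = rng.normal(size=(8, dim)).astype(np.float32)
codes = np.packbits((vectors @ planes.T) > 0, axis=1).ravel()
buckets = {}
for i, code in enumerate(codes):
    buckets.setdefault(int(code), []).append(i)

def knn_approx(query, k=5):
    # Approximate search: only score the query's own bucket (~n/256 vectors).
    code = int(np.packbits((planes @ query) > 0)[0])
    candidates = np.asarray(buckets.get(code, []))
    sims = vectors[candidates] @ query
    return candidates[np.argsort(-sims)[:k]]

query = rng.normal(size=dim).astype(np.float32)
print("exact: ", knn_exact(query))
print("approx:", knn_approx(query))
```

The approximate search trades a little recall for a large reduction in vectors scored per query, which is the general bargain behind the speedups vendors quote for ANN over exact KNN.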


Business Strategy

Devo Security Data Platform Attains FedRAMP Authorization

Devo | January 09, 2024

Devo Technology, the security data analytics company, today announced that the Devo Security Data Platform received Authorization to Operate (ATO) at the Moderate level under the Federal Risk and Authorization Management Program (FedRAMP). The Devo Security Data Platform successfully completed FedRAMP's rigorous accreditation process, enabling federal agencies to secure their environments with a market-leading security information and event management (SIEM). Agencies and their partners can now leverage Devo to solve their toughest IT and security challenges with unparalleled visibility and a unified view of risk posture, security operations and the threat landscape.

The demand to keep pace with rapidly evolving cyber threats at cloud speed and scale has never been higher for the U.S. government. New Office of Management and Budget (OMB) regulations require federal agencies to collect and retain logs for long time periods. These requirements strain legacy SIEM and logging solutions, resulting in higher license and maintenance costs and slower query times. The Devo Security Data Platform's massive ingestion capabilities overcome these challenges and enable agencies to manage petabytes of data—from any device or application—cost-effectively and performantly in the cloud.

Kayla Williams, CISO, Devo, said: "Devo relentlessly maintains the highest standards of internal security controls to ensure customers can protect themselves from security threats with peace of mind. Commercial customers have used the Devo Security Data Platform in the cloud for years, and this milestone enables us to continue to extend the same seamless experience to federal agencies and their partners."

The Small Business Administration sponsored Devo's authorization. FedRAMP was established to provide a cost-effective, risk-based approach for the adoption and use of cloud services by the federal government. FedRAMP empowers agencies to use modern cloud technologies with an emphasis on the security and protection of federal information. The Devo Security Data Platform is also available in the AWS GovCloud Marketplace, an isolated AWS Region designed to host sensitive data and regulated workloads in the cloud, assisting customers with U.S. federal, state and local government compliance requirements.

About Devo

Devo unleashes the power of the SOC. The Devo Security Data Platform, powered by our HyperStream technology, is purpose-built to provide the speed and scale, real-time analytics, and actionable intelligence global enterprises need to defend expanding attack surfaces. An ally in keeping your organization secure, Devo combines the power of people and AI to augment security teams, leading to better insights and faster outcomes. Headquartered in Cambridge, Massachusetts, with operations in North America, Europe and Asia Pacific, Devo is backed by Insight Partners, Georgian, TCV, General Atlantic, Bessemer Venture Partners, Kibo Ventures and Eurazeo. Learn more at www.devo.com.


Big Data Management

data.world Integrates with Snowflake Data Quality Metrics to Bolster Data Trust

data.world | January 24, 2024

data.world, the data catalog platform company, today announced an integration with Snowflake, the Data Cloud company, that brings new data quality metrics and measurement capabilities to enterprises. The data.world Snowflake Collector now empowers enterprise data teams to measure data quality across their organization on demand, unifying data quality and analytics. Customers can now achieve greater trust in their data quality and downstream analytics to support mission-critical applications, confident data-driven decision-making, and AI initiatives.

Data quality remains one of the top concerns for chief data officers and a critical barrier to creating a data-driven culture. Traditionally, data quality assurance has relied on manual oversight – a process that's tedious and fraught with inefficacy. The data.world Data Catalog Platform now delivers Snowflake data quality metrics directly to customers, streamlining quality assurance timelines and accelerating data-first initiatives. Data consumers can access contextual information in the catalog or directly within tools such as Tableau and PowerBI via Hoots – data.world's embedded trust badges – that broadcast data health status and catalog context, bolstering transparency and trust. Additionally, teams can link certification and DataOps workflows to Snowflake's data quality metrics to automate manual workflows and quality alerts. Backed by a knowledge graph architecture, data.world provides greater insight into data quality scores via intelligence on data provenance, usage, and context – all of which support DataOps and governance workflows.

"Data trust is increasingly crucial to every facet of business and data teams are struggling to verify the quality of their data, facing increased scrutiny from developers and decision-makers alike on the downstream impacts of their work, including analytics – and soon enough, AI applications," said Jeff Hollan, Director, Product Management at Snowflake. "Our collaboration with data.world enables data teams and decision-makers to verify and trust their data's quality to use in mission-critical applications and analytics across their business."

"High-quality data has always been a priority among enterprise data teams and decision-makers. As enterprise AI ambitions grow, the number one priority is ensuring the data powering generative AI is clean, consistent, and contextual," said Bryon Jacob, CTO at data.world. "Alongside Snowflake, we're taking steps to ensure data scientists, analysts, and leaders can confidently feed AI and analytics applications data that delivers high-quality insights, and supports the type of decision-making that drives their business forward."

The integration builds on the robust collaboration between data.world and Snowflake. Most recently, the companies announced an exclusive offering for joint customers, streamlining adoption timelines and offering an attractive new price point. data.world's knowledge graph-powered data catalog already offers unique benefits for Snowflake customers, including support for Snowpark. This offering is now available to all data.world enterprise customers using the Snowflake Collector, as well as customers taking advantage of the Snowflake-only offering. To learn more about the data quality integration or the data.world data catalog platform, visit data.world.

About data.world

data.world is the data catalog platform built for your AI future. Its cloud-native SaaS (software-as-a-service) platform combines a consumer-grade user experience with a powerful Knowledge Graph to deliver enhanced data discovery, agile data governance, and actionable insights. data.world is a Certified B Corporation and public benefit corporation and home to the world's largest collaborative open data community, with more than two million members, including ninety percent of the Fortune 500. The company has 76 patents and has been named one of Austin's Best Places to Work seven years in a row.

