Q&A with Charles Southwood, Vice President, N. Europe and MEA at Denodo

Charles Southwood, Regional VP at Denodo Technologies, is responsible for the company’s business revenues in Northern Europe, the Middle East and Africa. He is passionate about working in rapidly moving and innovative markets to support customer success and to align IT solutions with changing business needs. With a degree in engineering from Imperial College London, Charles has over 20 years of experience in data integration, big data, IT infrastructure/IT operations and business analytics.

There is a huge buzz around AI and, for about 10% of businesses, it’s already being used to enable better, faster business decision-making.



MEDIA 7: Could you please tell us a little bit about yourself and what made you choose this career path?
CHARLES SOUTHWOOD:
Like several friends and colleagues who now work in IT, I started my career as a civil engineer and moved into IT as the industry matured. Over the years I have found that many of the attributes that come from that early engineering background are well suited to the logical and analytical approach needed to solve customer business problems with data and IT solutions, and to the efficient operation of a sales and marketing business. After graduating from Imperial College, I spent several years in the industry with Taylor Woodrow, both on site and in the design office. This included a unique opportunity to spend several months in the Amazon jungle designing and building bridges to connect remote villages that would otherwise be cut off during the rainy season; both an unusual and a very rewarding experience for someone in their mid-twenties!

Upon returning to the UK I moved into the design office, designing dams, oil rigs, power stations and river diversion schemes. Much of the design involved the use of CAD/CAM systems and finite difference and finite element analysis (using large matrix manipulation) to model loadings and stresses in structures and to predict fluid flow patterns. Although business-related computing was in its infancy, I could see enormous potential and found the prospects for much wider use of IT very enticing. An opportunity presented itself to move to a company that sold solutions and consultancy services both for engineering and for general business applications. I joined the company in a sales position and, although I still had my links to engineering, the faster growth in the use of IT for business inevitably led to my focus on this area. This in turn led to other sales positions, then into sales management and wider business leadership roles.

Over the last 25 years, I have therefore had the pleasure of working in a number of software houses, driving the sales and marketing operations of companies where the predominant focus is on innovative market solutions, often disruptive in their nature. I am privileged to have had the opportunity to discuss business issues and strategies with a vast number of large and mid-sized organisations, across different industries and with varying stages of IT and data maturity. It is great to be able to discuss the ‘big-picture’ perspectives on the trends and needs of those client businesses and to subsequently see the impact that can be achieved by supporting them with the right solutions. For the last 5 years, as Regional Vice President of Denodo Technologies, I’ve been responsible for leading the company’s business growth across northern Europe, the Middle East and Africa.

With ever-increasing data volumes, a greater variety of data formats and more disparate locations, coupled with the mounting desire for real-time data insights, we’ve seen rapid growth in demand for improved data integration. As the world’s market leader in data virtualisation, we’re seeing strong adoption from all sectors, addressing data and applications integration needs for a wide array of use cases, from self-service and agile BI to Customer360, cloud migration and compliance/risk. It is an exciting time to be at the heart of these industry initiatives!


M7: Your award-winning data virtualization platform is regularly featured in the world's leading business and IT publications. What are the core values that drive the company to be a leader in data virtualization?
CS:
Denodo was founded in 1999 and has been focused on data virtualisation throughout that time. In fact, many look upon Denodo as the ‘founding father’ of data virtualisation technology, and the Denodo Platform’s capabilities, as you say, place it consistently in the leadership categories of the various analyst reports on data and applications integration. For example, in Gartner’s latest Magic Quadrant for Data Integration (June 2021), Denodo is listed as the fastest-growing vendor in the Leaders Quadrant. The company is privately owned, headquartered in Palo Alto and led by its founder and CEO, Angel Vina.

Financially, Denodo is in great shape, with some 50-60% year-on-year revenue growth and a business that is both profitable and free of long-term debt. The combination of consistent executive leadership and strong financials means that we’ve been able to maintain an unrivalled focus on excellence: excellence in technical capability, excellence in innovation and excellence in customer satisfaction. This is reflected in Denodo’s vibrant and enthusiastic user community, which can be seen in the many references and case studies available on Denodo’s website, as well as in the customer satisfaction ratings in the various vendor surveys.


It is no longer practical to move and copy all the data that might be needed into one large curated data warehouse or data lake. It is too slow, too complex and too costly.



M7: Given the changes in today’s rapidly evolving marketplace, what are the key 'climatic' changes taking place in Data Virtualization today? What systematic approach are you using to address these changes?
CS:
When talking to businesses about their wider needs in data integration, the challenges often include siloed data, legacy applications (often designed with no integration in mind), cloud migration, hybrid cloud (on-premise and cloud combinations) or multi-cloud combinations, ungoverned ‘shadow IT’ and a wide variety of formats and protocols. Combine this with the difficulties of rapidly growing data volumes, the disparate nature of modern data stores (on-premise, SaaS and various cloud locations, plus perhaps external sources), as well as the business demand for real-time or near real-time data insights, and you have some big challenges! If you were to refer to these as ‘climatic changes’, then most businesses are encountering a heat-wave! This is typically when Denodo gets invited to the party. It is no longer practical to move and copy all the data that might be needed into one large curated data warehouse or data lake. It is too slow, too complex and too costly. The industry is littered with stories of the difficulties encountered with the move-and-copy approach and big data lake projects that never end.

Instead, connecting a virtual layer to all potential sources (without moving and copying the data) means that data consumers can be connected to just the virtual layer and get secure access to all they might need regardless of the location or format. The integration can be provided in real-time inside the virtual layer. It saves on the complexity and cost of all the ‘heavy lifting’ and duplication of vast amounts of data and the users get the agility and real-time data (rather than the batch overnight versions they were getting from physical data warehouses). Of course, IT also benefits from a single layer for audit, security and governance, ensuring compliance and much faster data access for the business. We all do something similar in the home when we watch movies or listen to music with services like Netflix or Spotify. We get what we want when we want it and don’t have to worry about where it is stored. We also get an almost infinite choice as there is no need to hold all the DVDs or CD collections locally, on our shelves. Data virtualisation provides the same kind of paradigm but for business. When it comes to data, ‘Connect, don’t collect’!
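The ‘connect, don’t collect’ idea can be sketched in a few lines of code. The sketch below is purely illustrative (the source names, schemas and functions are hypothetical, not Denodo’s actual API): a virtual view joins two live sources at query time, so consumers see integrated, current data without any of it being moved or copied into the layer.

```python
# Minimal sketch of "connect, don't collect": a virtual view that
# federates two live sources at query time instead of copying data
# into a warehouse. All names and schemas here are hypothetical.

def crm_source():
    # Stand-in for, e.g., a SaaS REST API queried live.
    yield from [
        {"customer_id": 1, "name": "Acme Ltd"},
        {"customer_id": 2, "name": "Globex"},
    ]

def billing_source():
    # Stand-in for, e.g., an on-premise database queried live.
    yield from [
        {"customer_id": 1, "balance": 1200.0},
        {"customer_id": 2, "balance": 350.5},
    ]

def customer360_view():
    """Virtual view: integrates the sources in real time.

    Nothing is persisted in the layer itself; each call reflects
    whatever the underlying sources currently hold.
    """
    balances = {row["customer_id"]: row["balance"] for row in billing_source()}
    for row in crm_source():
        yield {**row, "balance": balances.get(row["customer_id"])}

if __name__ == "__main__":
    for record in customer360_view():
        print(record)
```

In a real platform the join, security and caching policies would be declared rather than hand-coded, but the shape is the same: consumers query the view, and integration happens inside the virtual layer on demand.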


M7: Could you please help our readers understand the relationship between data virtualization and digital transformation?
CS:
This is a very good question, as it is easy to mix up the terms data virtualisation and digital transformation! Digital transformation is the strategic adoption of digital technology to meet changing market and business needs. It often involves the creation of new business processes or the modification of existing ones. As organisations reduce or eliminate paper-based processes, we see the switch from analogue to digital. It brings with it new opportunities and new customer interactions that would previously have been impossible. Interestingly, data virtualisation is often used in digital transformation projects as a key element in the new digitised architecture, giving greater agility to the business, with new products, processes and services made available by combining data in new and innovative ways. Take, for example, a new omnichannel approach to customer interaction, offering a mobile app, web-based access and a call-centre service, as well as perhaps internal management reports and real-time executive dashboards.

These different data consumers require different technologies, and therefore different formats and data delivery protocols. Without data virtualisation, each flavour of consumer would need specific interfaces and manipulation of the data to provide the right exposure. This is time-consuming, requires coding and data movement, and is both costly and potentially error-prone. It also makes it hard to make changes in a quick and agile manner. However, by using a data virtualisation layer, all the data combinations and transformations can be defined just once, linking the published data to the different data consumers from the same single data model. Now you have a very agile, easy-to-maintain system delivering consistent data to all consumers irrespective of the technology. Oh, and let’s not forget it is now also real-time or near real-time from the source! For many businesses, this can be truly transformational!
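The ‘define once, publish to many consumers’ pattern described above might be sketched as follows. This is a hypothetical illustration (real platforms generate such endpoints declaratively rather than in hand-written code): a single canonical model feeds both a JSON endpoint for an app and a CSV export for a report, so a change to the model propagates to every consumer automatically.

```python
import csv
import io
import json

# One canonical model, defined once. In a real deployment this would
# be the virtual view over live sources; the data here is illustrative.
def canonical_orders():
    return [
        {"order_id": 101, "customer": "Acme Ltd", "total": 250.0},
        {"order_id": 102, "customer": "Globex", "total": 99.9},
    ]

# Different delivery formats for different consumers, all derived from
# the same single model -- no per-consumer re-integration of the data.
def as_json():
    # e.g. for a mobile app or web front end
    return json.dumps(canonical_orders())

def as_csv():
    # e.g. for a management report
    buf = io.StringIO()
    writer = csv.DictWriter(buf, fieldnames=["order_id", "customer", "total"])
    writer.writeheader()
    writer.writerows(canonical_orders())
    return buf.getvalue()
```

Because both publishers read from `canonical_orders()`, a transformation added there reaches every consumer at once; that is the maintainability win the answer describes.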


Many AI applications have obvious and direct benefits which most people would accept as reasonable but as the level of sophistication grows the issues will inevitably be more complex.



M7: How has COVID-19 affected the market? What was Denodo’s strategy during the pandemic?
CS:
The demand for data virtualisation is driven by projects and strategies requiring faster, more digital business models, where data is recognised as having the potential to be a key differentiator for the business; new insights, faster interactions, new products and services, greater customer intelligence, propensity models, improved governance and compliance, reduced risk and the adoption of AI/ML are all typical business drivers for data virtualisation. The solution is very much industry-agnostic and is therefore used widely across most major business sectors, including finance, telco, pharma, manufacturing, government, retail and utilities. For this reason, we saw some sectors accelerating their adoption and others holding back. In banking, for example, branch closures drove greater digitisation and the adoption of online banking; likewise, the move to a virtually cashless economy provided far more comprehensive data sets for the analysis of customer spending patterns than could ever have been achieved when cash was in play.

For our part, Denodo was keen to help combat the global challenges we all faced. We therefore spun up a data portal with the sole aim of integrating disparate global data about COVID-19, curating it, and then providing it free of charge to data consumers such as data scientists, analysts and researchers, who could use it to look at the impact of, and potential solutions to, this disease. The portal operated for some 15 months, with content from several hundred official data sources from around the world (such as NHS England, the ONS, the WHO etc.) for everyone to consume. It was set up and running in just 21 days, which demonstrates the agility available from using data virtualisation for analytics. Some views were accessed tens of thousands of times over its period of operation. The objective was to use data insights to help whoever, wherever in the world, to fight COVID-19. It wasn’t just medical data; other examples include the ability to highlight the percentage change in visits to places like grocery stores and parks within a geographic area, and the ability to look at changes in air quality across regions over time, based on changing citizen movements. Collaboration was needed to maximise the value of this initiative to society, and the portal statistics indicate that it was put to good use over the last 18 months. It’s a great example of the use of data virtualisation for high-velocity, high-volume analytics.

We’ve also had feedback from customers on their own application of the Denodo Platform during the pandemic. NHS National Services Scotland for example used Denodo’s technology to solve challenges, such as:

-The provisioning of data for COVID reporting to the Scottish Government and the First Minister in Scotland

-Helping to monitor oxygen availability at various hospitals

-Enabling data for Test and Trace of COVID patients and the general public

-Implementing the vaccination programme, including venue and appointment management, eligibility, uptake, etc.

-The planning and availability of required PPE and other supplies

Overall, given the huge impact on some industry sectors, we were fortunate to have been in IT as our business was able to work remotely, largely without difficulty, and we were able to continue to support our customers’ needs.


M7: How do you prepare for an AI-centric world as a Business Leader?
CS:
One of the fundamental technical requirements for good Artificial Intelligence is the availability of high-quality, comprehensive and timely data. For this reason, we are often asked to help by using data virtualisation to abstract data from different sources in real time and present the transformed results to AI models. Certainly, there is a huge buzz around AI and, for about 10% of businesses, it’s already being used to enable better, faster business decision-making. Based on what we see, there are three common strategies in the market: business-led, technology-led, and a combined iterative approach. The business-led strategy is driven by business demand for a specific capability. Initially, this seems to be largely for personalisation and automated customer service (chatbots), but now we’re also seeing growing sophistication in use for real-time fraud detection and other “detection and prevention” models, as well as improvements in operational efficiencies (e.g. customer onboarding).

Where we see technology-led strategies, businesses use data science and business analytics to understand their data and its potential for insights – availability, timeliness, quality, veracity and other attributes. What is where? Can it be used? What combinations have value? This approach is much more of a data-led initiative. The third strategy combines business-led demand with technology-driven insights into what might be possible to solve certain business needs, as an iterative process. In many companies, it is the work of the data scientists that shows the greatest potential for value in the use of AI/ML, with new insights and new data combinations. At Denodo we see growing demand for data virtualisation, as this substantially accelerates access to data and therefore the surfacing of new, meaningful insights that give a competitive edge to the business.

The value of AI may be compelling (for example, fraud prevention on payments, or more advertising revenue from customised adverts), but we have to balance this against the need for privacy compliance with the GDPR, the Data Protection Act 2018, etc. The largest cost of non-compliance on privacy is often not the fine itself but the brand damage, so we often hear of priority being given to privacy compliance over analytics needs. There are also a host of ethical issues arising from the use of AI. In the short term, enterprises have to show value to their own business and to society. Many AI applications have obvious and direct benefits which most people would accept as reasonable, but as the level of sophistication grows, the issues will inevitably become more complex.

ABOUT DENODO

Denodo is the leader in data virtualization providing agile, high performance data integration, data abstraction, and real-time data services across the broadest range of enterprise, cloud, big data, and unstructured data sources at half the cost of traditional approaches. Denodo’s customers across every major industry have gained significant business agility and ROI by enabling faster and easier access to unified business information for agile BI, big data analytics, Web, cloud integration, single-view applications, and enterprise data services.


Related News

Big Data Management

data.world Integrates with Snowflake Data Quality Metrics to Bolster Data Trust

data.world | January 24, 2024

data.world, the data catalog platform company, today announced an integration with Snowflake, the Data Cloud company, that brings new data quality metrics and measurement capabilities to enterprises. The data.world Snowflake Collector now empowers enterprise data teams to measure data quality across their organization on-demand, unifying data quality and analytics. Customers can now achieve greater trust in their data quality and downstream analytics to support mission-critical applications, confident data-driven decision-making, and AI initiatives. Data quality remains one of the top concerns for chief data officers and a critical barrier to creating a data-driven culture. Traditionally, data quality assurance has relied on manual oversight – a process that’s tedious and fraught with inefficacy. The data.world Data Catalog Platform now delivers Snowflake data quality metrics directly to customers, streamlining quality assurance timelines and accelerating data-first initiatives. Data consumers can access contextual information in the catalog or directly within tools such as Tableau and PowerBI via Hoots – data.world’s embedded trust badges – that broadcast data health status and catalog context, bolstering transparency and trust. Additionally, teams can link certification and DataOps workflows to Snowflake's data quality metrics to automate manual workflows and quality alerts. Backed by a knowledge graph architecture, data.world provides greater insight into data quality scores via intelligence on data provenance, usage, and context – all of which support DataOps and governance workflows. “Data trust is increasingly crucial to every facet of business and data teams are struggling to verify the quality of their data, facing increased scrutiny from developers and decision-makers alike on the downstream impacts of their work, including analytics – and soon enough, AI applications,” said Jeff Hollan, Director, Product Management at Snowflake. 
“Our collaboration with data.world enables data teams and decision-makers to verify and trust their data’s quality to use in mission-critical applications and analytics across their business.” “High-quality data has always been a priority among enterprise data teams and decision-makers. As enterprise AI ambitions grow, the number one priority is ensuring the data powering generative AI is clean, consistent, and contextual,” said Bryon Jacob, CTO at data.world. “Alongside Snowflake, we’re taking steps to ensure data scientists, analysts, and leaders can confidently feed AI and analytics applications data that delivers high-quality insights, and supports the type of decision-making that drives their business forward.” The integration builds on the robust collaboration between data.world and Snowflake. Most recently, the companies announced an exclusive offering for joint customers, streamlining adoption timelines and offering a new attractive price point. The data.world's knowledge graph-powered data catalog already offers unique benefits for Snowflake customers, including support for Snowpark. This offering is now available to all data.world enterprise customers using the Snowflake Collector, as well as customers taking advantage of the Snowflake-only offering. To learn more about the data quality integration or the data.world data catalog platform, visit data.world. About data.world data.world is the data catalog platform built for your AI future. Its cloud-native SaaS (software-as-a-service) platform combines a consumer-grade user experience with a powerful Knowledge Graph to deliver enhanced data discovery, agile data governance, and actionable insights. data.world is a Certified B Corporation and public benefit corporation and home to the world’s largest collaborative open data community with more than two million members, including ninety percent of the Fortune 500. 
Our company has 76 patents and has been named one of Austin’s Best Places to Work seven years in a row.

Read More

Big Data Management

The Modern Data Company Recognized in Gartner's Magic Quadrant for Data Integration

The Modern Data Company | January 23, 2024

The Modern Data Company, recognized for its expertise in developing and managing advanced data products, is delighted to announce its distinction as an honorable mention in Gartner's 'Magic Quadrant for Data Integration Tools,' powered by our leading product, DataOS. “This accolade underscores our commitment to productizing data and revolutionizing data management technologies. Our focus extends beyond traditional data management, guiding companies on their journey to effectively utilize data, realize tangible ROI on their data investments, and harness advanced technologies such as AI, ML, and Large Language Models (LLMs). This recognition is a testament to Modern Data’s alignment with the latest industry trends and our dedication to setting new standards in data integration and utilization.” – Srujan Akula, CEO of The Modern Data Company The inclusion in the Gartner report highlights The Modern Data Company's pivotal role in shaping the future of data integration. Our innovative approach, embodied in DataOS, enables businesses to navigate the complexities of data management, transforming data into a strategic asset. By simplifying data access and integration, we empower organizations to unlock the full potential of their data, driving insights and innovation without disruption. "Modern Data's recognition as an Honorable Mention in the Gartner MQ for Data Integration is a testament to the transformative impact their solutions have on businesses like ours. DataOS has been pivotal in allowing us to integrate multiple data sources, enabling our teams to have access to the data needed to make data driven decisions." – Emma Spight, SVP Technology, MIND 24-7 The Modern Data Company simplifies how organizations manage, access, and interact with data using its DataOS (data operating system) that unifies data silos, at scale. It provides ontology support, graph modeling, and a virtual data tier (e.g. a customer 360 model). 
From a technical point of view, it closes the gap from conceptual to physical data model. Users can define conceptually what they want and its software traverses and integrates data. DataOS provides a structured, repeatable approach to data integration that enhances agility and ensures high-quality outputs. This shift from traditional pipeline management to data products allows for more efficient data operations, as each 'product' is designed with a specific purpose and standardized interfaces, ensuring consistency across different uses and applications. With DataOS, businesses can expect a transformative impact on their data strategies, marked by increased efficiency and a robust framework for handling complex data ecosystems, allowing for more and faster iterations of conceptual models. About The Modern Data Company The Modern Data Company, with its flagship product DataOS, revolutionizes the creation of data products. DataOS® is engineered to build and manage comprehensive data products to foster data mesh adoption, propelling organizations towards a data-driven future. DataOS directly addresses key AI/ML and LLM challenges: ensuring quality data, scaling computational resources, and integrating seamlessly into business processes. In our commitment to provide open systems, we have created an open data developer platform specification that is gaining wide industry support.

Read More

Business Strategy

Devo Security Data Platform Attains FedRAMP Authorization

Devo | January 09, 2024

Devo Technology, the security data analytics company, today announced that the Devo Security Data Platform received Authorization to Operate (ATO) at the Moderate level under the Federal Risk and Authorization Management Program (FedRAMP). The Devo Security Data Platform successfully completed FedRAMP's rigorous accreditation process, enabling federal agencies to secure their environments with a market-leading security information and event management (SIEM). Agencies and their partners can now leverage Devo to solve their toughest IT and security challenges with unparalleled visibility and a unified view of risk posture, security operations and the threat landscape. The demand to keep pace with rapidly evolving cyber threats at cloud speed and scale has never been higher for the U.S. government. New Office of Management and Budget (OMB) regulations require federal agencies to collect and retain logs for long time periods. These requirements strain legacy SIEM and logging solutions, resulting in higher license and maintenance costs and slower query times. The Devo Security Data Platform's massive ingestion capabilities overcome these challenges and enable agencies to manage petabytes of data—from any device or application—cost-effectively and performantly in the cloud. Kayla Williams, CISO, Devo, said: "Devo relentlessly maintains the highest standards of internal security controls to ensure customers can protect themselves from security threats with peace of mind. Commercial customers have used the Devo Security Data Platform in the cloud for years, and this milestone enables us to continue to extend the same seamless experience to federal agencies and their partners." The Small Business Administration sponsored Devo's authorization. FedRAMP was established to provide a cost-effective, risk-based approach for the adoption and use of cloud services by the federal government. 
FedRAMP empowers agencies to use modern cloud technologies with an emphasis on the security and protection of federal information. The Devo Security Data Platform is also available in the AWS GovCloud Marketplace; AWS GovCloud is an isolated AWS Region designed to host sensitive data and regulated workloads in the cloud, assisting customers with U.S. federal, state and local government compliance requirements.

About Devo

Devo unleashes the power of the SOC. The Devo Security Data Platform, powered by Devo's HyperStream technology, is purpose-built to provide the speed and scale, real-time analytics, and actionable intelligence global enterprises need to defend expanding attack surfaces. An ally in keeping your organization secure, Devo combines the power of people and AI to augment security teams, leading to better insights and faster outcomes. Headquartered in Cambridge, Massachusetts, with operations in North America, Europe and Asia Pacific, Devo is backed by Insight Partners, Georgian, TCV, General Atlantic, Bessemer Venture Partners, Kibo Ventures and Eurazeo. Learn more at www.devo.com.
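The OMB log requirement referenced above comes from memorandum M-21-31, which for the highest-criticality log categories calls for roughly 12 months of active storage plus 18 months of cold storage. As a rough, hypothetical illustration of why that strains logging platforms, here is a back-of-envelope storage estimate; the 1 TB/day ingest rate is an assumed figure, not one from the announcement:

```python
# Rough storage estimate for M-21-31-style log retention.
# Assumptions: 12 months active / 18 months cold (per M-21-31 guidance),
# an illustrative ingest rate, and an average month of 30.44 days.

def retention_storage_tb(ingest_tb_per_day: float,
                         active_months: int = 12,
                         cold_months: int = 18,
                         days_per_month: float = 30.44) -> dict:
    """Return required active, cold, and total log storage in TB."""
    active = ingest_tb_per_day * active_months * days_per_month
    cold = ingest_tb_per_day * cold_months * days_per_month
    return {"active_tb": active, "cold_tb": cold, "total_tb": active + cold}

est = retention_storage_tb(1.0)  # a single assumed 1 TB/day source
# total is ~913 TB — approaching a petabyte from one modest log stream
```

Scaled across the hundreds of devices and applications a large agency runs, multi-petabyte retention follows quickly, which is the cost pressure the article attributes to legacy SIEMs.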

Read More

Big Data Management

data.world Integrates with Snowflake Data Quality Metrics to Bolster Data Trust

data.world | January 24, 2024

data.world, the data catalog platform company, today announced an integration with Snowflake, the Data Cloud company, that brings new data quality metrics and measurement capabilities to enterprises. The data.world Snowflake Collector now empowers enterprise data teams to measure data quality across their organization on demand, unifying data quality and analytics. Customers can now achieve greater trust in their data quality and downstream analytics to support mission-critical applications, confident data-driven decision-making, and AI initiatives.

Data quality remains one of the top concerns for chief data officers and a critical barrier to creating a data-driven culture. Traditionally, data quality assurance has relied on manual oversight – a process that is tedious and prone to error. The data.world Data Catalog Platform now delivers Snowflake data quality metrics directly to customers, streamlining quality assurance timelines and accelerating data-first initiatives. Data consumers can access contextual information in the catalog or directly within tools such as Tableau and Power BI via Hoots – data.world’s embedded trust badges – that broadcast data health status and catalog context, bolstering transparency and trust. Additionally, teams can link certification and DataOps workflows to Snowflake's data quality metrics to automate manual workflows and quality alerts. Backed by a knowledge graph architecture, data.world provides greater insight into data quality scores via intelligence on data provenance, usage, and context – all of which support DataOps and governance workflows.

“Data trust is increasingly crucial to every facet of business, and data teams are struggling to verify the quality of their data, facing increased scrutiny from developers and decision-makers alike on the downstream impacts of their work, including analytics – and soon enough, AI applications,” said Jeff Hollan, Director, Product Management at Snowflake.
“Our collaboration with data.world enables data teams and decision-makers to verify and trust their data’s quality to use in mission-critical applications and analytics across their business.”

“High-quality data has always been a priority among enterprise data teams and decision-makers. As enterprise AI ambitions grow, the number one priority is ensuring the data powering generative AI is clean, consistent, and contextual,” said Bryon Jacob, CTO at data.world. “Alongside Snowflake, we’re taking steps to ensure data scientists, analysts, and leaders can confidently feed AI and analytics applications data that delivers high-quality insights and supports the type of decision-making that drives their business forward.”

The integration builds on the robust collaboration between data.world and Snowflake. Most recently, the companies announced an exclusive offering for joint customers, streamlining adoption timelines and offering an attractive new price point. data.world’s knowledge graph-powered data catalog already offers unique benefits for Snowflake customers, including support for Snowpark. This offering is now available to all data.world enterprise customers using the Snowflake Collector, as well as customers taking advantage of the Snowflake-only offering. To learn more about the data quality integration or the data.world data catalog platform, visit data.world.

About data.world

data.world is the data catalog platform built for your AI future. Its cloud-native SaaS (software-as-a-service) platform combines a consumer-grade user experience with a powerful knowledge graph to deliver enhanced data discovery, agile data governance, and actionable insights. data.world is a Certified B Corporation and public benefit corporation and home to the world’s largest collaborative open data community, with more than two million members, including ninety percent of the Fortune 500. data.world has 76 patents and has been named one of Austin’s Best Places to Work seven years in a row.
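The trust-badge mechanism the article describes (Hoots) boils a set of underlying quality metrics down to a single health status a data consumer can read at a glance. A minimal, hypothetical sketch of that rollup pattern follows; the metric names, thresholds, and status labels are invented for illustration and are not data.world's or Snowflake's actual API:

```python
# Hypothetical sketch: rolling per-check data quality results up into one
# badge status, in the spirit of data.world's Hoots trust badges.
# All names and thresholds here are illustrative assumptions.

from dataclasses import dataclass

@dataclass
class QualityCheck:
    name: str         # e.g. "null_fraction", "freshness_hours" (invented)
    value: float      # measured value from the warehouse
    threshold: float  # maximum acceptable value

    @property
    def passed(self) -> bool:
        return self.value <= self.threshold

def badge_status(checks: list[QualityCheck]) -> str:
    """All checks pass -> 'healthy'; at least half -> 'warning'; else 'unhealthy'."""
    if not checks:
        return "unknown"
    passed = sum(c.passed for c in checks)
    if passed == len(checks):
        return "healthy"
    if passed / len(checks) >= 0.5:
        return "warning"
    return "unhealthy"

checks = [
    QualityCheck("null_fraction", 0.01, 0.05),    # passes
    QualityCheck("duplicate_rows", 0.0, 0.0),     # passes
    QualityCheck("freshness_hours", 30.0, 24.0),  # stale: fails
]
# badge_status(checks) -> "warning" (2 of 3 checks pass)
```

In the real integration the inputs would come from Snowflake's data quality metrics via the data.world Collector rather than hand-built objects; the sketch only shows the broadcast-a-single-status idea.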

Read More

Big Data Management

The Modern Data Company Recognized in Gartner's Magic Quadrant for Data Integration

The Modern Data Company | January 23, 2024

The Modern Data Company, recognized for its expertise in developing and managing advanced data products, is delighted to announce that it has received an Honorable Mention in Gartner's 'Magic Quadrant for Data Integration Tools,' on the strength of its flagship product, DataOS.

“This accolade underscores our commitment to productizing data and revolutionizing data management technologies. Our focus extends beyond traditional data management, guiding companies on their journey to effectively utilize data, realize tangible ROI on their data investments, and harness advanced technologies such as AI, ML, and Large Language Models (LLMs). This recognition is a testament to Modern Data’s alignment with the latest industry trends and our dedication to setting new standards in data integration and utilization.” – Srujan Akula, CEO of The Modern Data Company

The inclusion in the Gartner report highlights The Modern Data Company's pivotal role in shaping the future of data integration. Its innovative approach, embodied in DataOS, enables businesses to navigate the complexities of data management, transforming data into a strategic asset. By simplifying data access and integration, DataOS empowers organizations to unlock the full potential of their data, driving insights and innovation without disruption.

"Modern Data's recognition as an Honorable Mention in the Gartner MQ for Data Integration is a testament to the transformative impact their solutions have on businesses like ours. DataOS has been pivotal in allowing us to integrate multiple data sources, enabling our teams to have access to the data needed to make data-driven decisions." – Emma Spight, SVP Technology, MIND 24-7

The Modern Data Company simplifies how organizations manage, access, and interact with data using its DataOS (data operating system), which unifies data silos at scale. It provides ontology support, graph modeling, and a virtual data tier (e.g. a customer 360 model).
From a technical point of view, DataOS closes the gap between the conceptual and the physical data model: users define conceptually what they want, and the software traverses and integrates the underlying data. DataOS provides a structured, repeatable approach to data integration that enhances agility and ensures high-quality outputs. This shift from traditional pipeline management to data products allows for more efficient data operations, as each 'product' is designed with a specific purpose and standardized interfaces, ensuring consistency across different uses and applications. With DataOS, businesses can expect a transformative impact on their data strategies, marked by increased efficiency and a robust framework for handling complex data ecosystems, allowing for more and faster iterations of conceptual models.

About The Modern Data Company

The Modern Data Company, with its flagship product DataOS, revolutionizes the creation of data products. DataOS® is engineered to build and manage comprehensive data products to foster data mesh adoption, propelling organizations towards a data-driven future. DataOS directly addresses key AI/ML and LLM challenges: ensuring quality data, scaling computational resources, and integrating seamlessly into business processes. In its commitment to open systems, The Modern Data Company has created an open data developer platform specification that is gaining wide industry support.
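The "data product" idea described above — a dataset packaged with a declared purpose and a standardized interface its consumers can rely on — can be sketched generically. The following is a hypothetical illustration of that pattern only; the class, field names, and the customer-360 example are invented and are not the DataOS API:

```python
# Hypothetical sketch of a data product: a dataset with a declared purpose
# and an output contract enforced at read time. Not the DataOS API.

from dataclasses import dataclass
from typing import Callable

@dataclass
class DataProduct:
    name: str
    purpose: str                      # why this product exists
    output_schema: dict[str, type]    # the contract consumers rely on
    fetch: Callable[[], list[dict]]   # source-agnostic accessor

    def read(self) -> list[dict]:
        """Return rows, validating each against the declared schema."""
        rows = self.fetch()
        for row in rows:
            for col, typ in self.output_schema.items():
                if not isinstance(row.get(col), typ):
                    raise TypeError(f"{self.name}: column {col!r} violates contract")
        return rows

# Invented example in the spirit of the "customer 360" model mentioned above.
customer_360 = DataProduct(
    name="customer_360",
    purpose="Unified customer view for marketing and support",
    output_schema={"customer_id": int, "lifetime_value": float},
    fetch=lambda: [{"customer_id": 1, "lifetime_value": 120.5}],
)
# customer_360.read() -> [{"customer_id": 1, "lifetime_value": 120.5}]
```

The point of the pattern is the one the release makes: because every product exposes the same kind of contract, consumers stay consistent even when the underlying sources or pipelines change.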

Read More

Spotlight

Denodo

Denodo is the leader in data virtualization providing agile, high performance data integration, data abstraction, and real-time data services across the broadest range of enterprise, cloud, big data, and unstructured data sources at half the cost of traditional approaches. Denodo’s customers across ...
