Q&A with Charles Southwood, Vice President, N. Europe and MEA at Denodo

Media 7 | September 15, 2021

Charles Southwood, Regional Vice President at Denodo Technologies, is responsible for the company’s business revenues in Northern Europe, the Middle East and Africa. He is passionate about working in rapidly moving and innovative markets to support customer success and to align IT solutions with changing business needs. With a degree in engineering from Imperial College London, Charles has over 20 years of experience in data integration, big data, IT infrastructure/IT operations and business analytics.

There is a huge buzz around AI and, for about 10% of businesses, it’s already being used to deliver better, faster business decision-making.



MEDIA 7: Could you please tell us a little bit about yourself and what made you choose this career path?
CHARLES SOUTHWOOD:
Like several friends and colleagues who now work in IT, I started my career as a civil engineer and moved into IT as the industry matured. Over the years I have found that many of the attributes that come from that early engineering background are also well-suited to the logical and analytical approaches needed for solving customer business problems with data and IT solutions, and to the efficient operation of a sales and marketing business. After graduating from Imperial College, I spent several years in the industry with Taylor Woodrow, both on-site and in the design office. This included a unique opportunity to spend several months in the Amazon jungle designing and building bridges to connect remote villages that would otherwise be cut off during the rainy season – an unusual and very rewarding experience for someone in their mid-twenties!

Upon returning to the UK, I moved into the design office, designing dams, oil rigs, power stations and river diversion schemes. Much of the design involved the use of CAD/CAM systems and finite difference and finite element analysis (using large matrix manipulation) to model loadings and stresses in structures and to predict fluid flow patterns. Although business-related computing was in its infancy, I could see enormous potential and found the prospects for much wider use of IT to be very enticing. An opportunity presented itself to move to a company that sold solutions and consultancy services both for engineering and for general business applications. I joined the company in a sales position and, although I still had my links to engineering, the faster growth in the use of IT for business inevitably led to my focus on this area. This in turn led to other sales positions, then into sales management and wider business leadership roles.

Over the last 25 years, I have therefore had the pleasure of working in a number of software houses, driving the sales and marketing operations of companies whose predominant focus is on innovative, often disruptive, market solutions. I am privileged to have had the opportunity to discuss business issues and strategies with a vast number of large and mid-sized organisations, across different industries and at varying stages of IT and data maturity. It is great to be able to discuss the ‘big-picture’ perspectives on the trends and needs of those client businesses and subsequently to see the impact that can be achieved by supporting them with the right solutions. For the last five years, as Regional Vice President of Denodo Technologies, I’ve been responsible for leading the company’s business growth across northern Europe, the Middle East and Africa.

With ever-increasing data volumes, a greater variety of data formats and more disparate locations, coupled with a mounting desire for real-time data insights, we’ve seen rapid growth in demand for improved data integration. As the world’s market leader in data virtualisation, we’re seeing strong adoption from all sectors, addressing data and application integration needs for a wide array of use cases, from self-service and agile BI to Customer 360, cloud migration, application integration and compliance/risk. It is an exciting time to be at the heart of these industry initiatives!


M7: Your award-winning data virtualization platform is regularly featured in the world's leading business and IT publications. What are the core values that drive the company to be a leader in data virtualization?
CS:
Denodo was founded in 1999 and has been focused on data virtualisation throughout that time. In fact, many look upon Denodo as the ‘founding father’ of data virtualisation technology, and the Denodo Platform’s capabilities, as you say, place it consistently in the leadership categories of the various analyst reports on data and application integration. For example, in Gartner’s latest Magic Quadrant for Data Integration Tools (June 2021), Denodo is listed as the fastest-growing vendor in the Leaders quadrant. The company is privately owned, headquartered in Palo Alto and led by its founder and CEO, Angel Vina.

Financially, Denodo is in great shape, with some 50-60% year-on-year revenue growth and a business that is profitable and carries no long-term debt. The combination of consistent executive leadership and strong financials means that we’ve been able to enjoy an unrivalled focus on ‘excellence’: excellence in technical capability, excellence in innovation and excellence in customer satisfaction. This is reflected in Denodo’s vibrant and enthusiastic user community, which can be seen in the many references and case studies available on Denodo’s website, as well as in the customer satisfaction ratings in the various vendor surveys.


It is no longer practical to move and copy all the data that might be needed into one large curated data warehouse or data lake. It is too slow, too complex and too costly.



M7: Given the changes in today’s rapidly evolving marketplace, what are the key 'climatic' changes taking place in Data Virtualization today? What systematic approach are you using to address these changes?
CS:
When talking to businesses about their wider needs from data integration, the challenges often include siloed data, legacy applications (often designed with no integration in mind), cloud migration, hybrid cloud (combinations of on-premise and cloud) or multi-cloud deployments, ungoverned ‘shadow IT’ and a wide variety of formats and protocols. Combine this with rapidly growing data volumes, the disparate nature of modern data stores (on-premise, SaaS and various cloud locations, plus perhaps external sources) and the business demand for real-time or near real-time data insights, and you have some big challenges! If you were to refer to these as ‘climatic changes’, then most businesses are encountering a heat-wave! This is typically when Denodo gets invited to the party.

It is no longer practical to move and copy all the data that might be needed into one large curated data warehouse or data lake. It is too slow, too complex and too costly. The industry is littered with stories of the difficulties encountered with the move-and-copy approach and big data lake projects that never end.

Instead, connecting a virtual layer to all potential sources (without moving and copying the data) means that data consumers can connect to just the virtual layer and get secure access to everything they might need, regardless of location or format. The integration is performed in real time inside the virtual layer. This saves the complexity and cost of all the ‘heavy lifting’ and the duplication of vast amounts of data, and users gain agility and real-time data (rather than the overnight batch versions they were getting from physical data warehouses). Of course, IT also benefits from a single layer for audit, security and governance, ensuring compliance and much faster data access for the business. We all do something similar at home when we watch movies or listen to music with services like Netflix or Spotify: we get what we want, when we want it, and don’t have to worry about where it is stored. We also get an almost infinite choice, as there is no need to hold all the DVD or CD collections locally on our shelves. Data virtualisation provides the same kind of paradigm, but for business. When it comes to data, ‘Connect, don’t collect’!
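To make ‘Connect, don’t collect’ concrete, here is a minimal, hypothetical Python sketch of the idea. It is not Denodo’s actual API; every name in it is invented for illustration. The virtual layer holds only connections, and data is read from each source at query time, so consumers see current values without any copies being made:

```python
# A minimal, hypothetical sketch of "connect, don't collect".
# Not Denodo's real API: every name here is invented for illustration.
from dataclasses import dataclass
from typing import Callable, Dict, List


@dataclass
class Source:
    """A registered connection to a data source; no data is copied."""
    name: str
    fetch: Callable[[], List[dict]]  # pulls current rows on demand


class VirtualLayer:
    def __init__(self) -> None:
        self._sources: Dict[str, Source] = {}

    def connect(self, source: Source) -> None:
        # Register the connection only -- the data stays where it is.
        self._sources[source.name] = source

    def query(self, name: str, predicate: Callable[[dict], bool]) -> List[dict]:
        # Rows are read from the source at request time, so consumers
        # see current values rather than last night's batch copy.
        return [row for row in self._sources[name].fetch() if predicate(row)]


if __name__ == "__main__":
    # In reality these would be a warehouse, a SaaS API, a lake, etc.
    crm = Source("crm", lambda: [{"id": 1, "country": "UK"},
                                 {"id": 2, "country": "DE"}])
    layer = VirtualLayer()
    layer.connect(crm)
    print(layer.query("crm", lambda row: row["country"] == "UK"))
    # -> [{'id': 1, 'country': 'UK'}]
```

Adding a new source in this sketch is a single connect() call, which mirrors the agility argument above: no pipelines to rebuild and no data to reload.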


M7: Could you please help our readers understand the relationship between data virtualization and digital transformation?
CS:
This is a very good question, as it is easy to mix up the terms data virtualisation and digital transformation! Digital transformation is the strategic adoption of digital technology to meet changing market and business needs. It often involves the creation of new business processes or the modification of existing ones. As organisations reduce or eliminate paper-based processes, we see the switch from analogue to digital. It brings with it new opportunities and new customer interactions that would previously have been impossible.

Interestingly, data virtualisation is often used in digital transformation projects as a key element of the new digitised architecture, giving greater agility to the business, with new products, processes and services made available through the combination of data in new and innovative ways. Take, for example, a new omnichannel approach to customer interaction, offering a mobile app, web-based access and a call-centre service, as well as perhaps internal management reports and real-time executive dashboards.

These different data consumers require different technologies and therefore different formats and data delivery protocols. Without data virtualisation, each different flavour of consumer would need specific interfaces and manipulation of the data to provide the right exposure. This is time-consuming, requires coding and data movement, and is both costly and potentially error-prone. It also makes it hard to make changes quickly and in an agile manner. However, by using a data virtualisation layer, all the data combinations and transformations can be defined just once, linking the published data to the different data consumers from the same single data model. Now you have a very agile, easy-to-maintain system delivering consistent data to all consumers irrespective of the technology. Oh, and let’s not forget it is now also real-time or near real-time from the source! For many businesses, this can be truly transformational!
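As a small, hypothetical illustration of defining the combinations once and serving every consumer from the same model (the function names and formats below are assumptions, not a real product API), one logical view is rendered as JSON for an app and as CSV for a reporting tool, so a change to the view flows to both channels automatically:

```python
# Hypothetical sketch: one logical view, many delivery formats.
# Function names and formats are assumptions, not a real product API.
import csv
import io
import json
from typing import Dict, List


def customer_view() -> List[Dict]:
    # Stand-in for the virtualised combination of several sources,
    # defined once; stubbed here with static rows for illustration.
    return [{"id": 1, "name": "Ada", "balance": 120.0},
            {"id": 2, "name": "Grace", "balance": 80.5}]


def publish_json(rows: List[Dict]) -> str:
    # e.g. consumed by the mobile app or web front end
    return json.dumps(rows)


def publish_csv(rows: List[Dict]) -> str:
    # e.g. consumed by a reporting or dashboard tool
    buf = io.StringIO()
    writer = csv.DictWriter(buf, fieldnames=list(rows[0]))
    writer.writeheader()
    writer.writerows(rows)
    return buf.getvalue()


if __name__ == "__main__":
    rows = customer_view()  # the single shared data model
    print(publish_json(rows))
    print(publish_csv(rows))
```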


Many AI applications have obvious and direct benefits which most people would accept as reasonable, but as the level of sophistication grows, the issues will inevitably become more complex.



M7: How has COVID-19 affected the market? What was Denodo’s strategy during the pandemic?
CS:
The demand for data virtualisation is driven by projects and strategies requiring faster, more digital business models, where data is recognised as a potential key differentiator for the business: new insights, faster interactions, new products and services, greater customer intelligence, propensity models, improved governance and compliance, reduced risk and the adoption of AI/ML are all typical business drivers. The solution is very much industry-agnostic and is therefore used widely across most major business sectors, including finance, telco, pharma, manufacturing, government, retail and utilities. For this reason, we saw some sectors accelerating their adoption during the pandemic and others holding back. In banking, for example, branch closures drove greater digitisation and the adoption of online banking; likewise, the move to a virtually cashless economy provided far more comprehensive data sets for analysing customer spending patterns than could ever have been achieved when cash was in play.

For our part, Denodo was keen to help combat the global challenges we all faced. We therefore spun up a data portal with the sole aim of integrating disparate global data about COVID-19, curating it, and then providing it free of charge to data consumers such as data scientists, analysts and researchers, who could use it to look at the impact of, and potential solutions to, the disease. The portal operated for some 15 months, with categories of content from several hundred official data sources around the world (such as NHS England, the ONS and the WHO) for everyone to consume. It was set up and running in just 21 days, which demonstrates the agility available from using data virtualisation for analytics. Some views were accessed tens of thousands of times over its period of operation. The objective was to use data insights to help whoever, wherever in the world, to fight COVID-19. It wasn’t just medical data; other examples include the ability to highlight the percentage change in visits to places like grocery stores and parks within a geographic area, and the ability to look at changes in air quality across regions over time, based on changing citizen movements. Collaboration was needed to maximise the value of this initiative to society, and the portal statistics indicate that it was put to good use over the last 18 months. It’s a great example of the use of data virtualisation for high-velocity, high-volume analytics.

We’ve also had feedback from customers on their own use of the Denodo Platform during the pandemic. NHS National Services Scotland, for example, used Denodo’s technology to address challenges such as:

-The provisioning of data for COVID reporting to the Scottish Government and the First Minister in Scotland

-Helping to monitor oxygen availability at various hospitals

-Enabling data for Test and Trace of COVID patients and the general public

-Implementing the vaccination programme, including venue and appointment management, eligibility, uptake, etc.

-The planning and availability of required PPE and other supplies

Overall, given the huge impact on some industry sectors, we were fortunate to have been in IT as our business was able to work remotely, largely without difficulty, and we were able to continue to support our customers’ needs.


M7: How do you prepare for an AI-centric world as a Business Leader?
CS:
One of the fundamental technical requirements for good artificial intelligence is the availability of high-quality, comprehensive and timely data. For this reason, we are often asked to help by using data virtualisation to abstract data from different sources in real time and present the transformed results to AI models. Certainly, there is a huge buzz around AI and, for about 10% of businesses, it’s already being used to deliver better, faster business decision-making. Based on what we see, there are three common strategies in the market: business-led, technology-led, and a combined, iterative approach. The business-led strategy is driven by business demand for a specific capability. Initially this seems to be largely for personalisation and automated customer service (chatbots), but now we’re also seeing growing sophistication in its use for real-time fraud detection and other “detection and prevention” models, as well as for improvements in operational efficiency (e.g. customer onboarding).

Where we see technology-led strategies, businesses are using data science and business analytics to understand their data and its potential for insights – availability, timeliness, quality, veracity and other attributes. What is where? Can it be used? What combinations have value? This approach is much more of a data-led initiative. The third strategy combines business-led demand with technology-driven insights into what might be possible to solve certain business needs, as an iterative process. In many companies, it is the work of the data scientists that shows the greatest potential for value in the use of AI/ML, with new insights and new data combinations. At Denodo we see growing demand for data virtualisation because it substantially accelerates access to data, and therefore the surfacing of new, meaningful insights that give the business a competitive edge.
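To illustrate the earlier point about abstracting data from different sources in real time and presenting the transformed results to an AI model, here is a minimal, hypothetical Python sketch. The stub sources, field names and toy scoring rule are all invented for illustration; in practice the feature row would be assembled by querying a virtual layer rather than these stubs:

```python
# Hypothetical sketch: real-time feature assembly for an AI model.
# Stubbed "sources" stand in for systems reached through a virtual
# layer; nothing here is Denodo's real API.
from typing import Dict


def fetch_payment(payment_id: int) -> Dict:
    # Stub for a payments system queried at scoring time.
    return {"payment_id": payment_id, "amount": 950.0, "country": "FR"}


def fetch_customer_profile(payment_id: int) -> Dict:
    # Stub for a CRM/profile source queried at scoring time.
    return {"home_country": "UK", "avg_amount": 120.0}


def build_features(payment_id: int) -> Dict:
    # The "virtual layer" step: combine live data from both sources
    # into one feature row, with no prior batch copy of either source.
    payment = fetch_payment(payment_id)
    profile = fetch_customer_profile(payment_id)
    return {
        "amount_ratio": payment["amount"] / profile["avg_amount"],
        "foreign": payment["country"] != profile["home_country"],
    }


def fraud_score(features: Dict) -> float:
    # Toy stand-in for a trained model (e.g. logistic regression).
    score = 0.0
    if features["amount_ratio"] > 5:
        score += 0.6
    if features["foreign"]:
        score += 0.3
    return min(score, 1.0)


if __name__ == "__main__":
    print(fraud_score(build_features(payment_id=42)))  # -> 0.9
```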

The value of AI may be compelling – for example, fraud prevention on payments, or more advertising revenue from customised adverts – but we have to balance this against the need for privacy compliance with GDPR, the Data Protection Act 2018, etc. The largest cost of non-compliance on privacy is often not the fine itself but the brand damage, so we often hear of priority being given to privacy compliance over analytics needs. There are also a host of ethical issues arising from the use of AI. In the short term, enterprises have to show value to their own business and to society. Many AI applications have obvious and direct benefits which most people would accept as reasonable, but as the level of sophistication grows, the issues will inevitably become more complex.

ABOUT DENODO

Denodo is the leader in data virtualization, providing agile, high-performance data integration, data abstraction and real-time data services across the broadest range of enterprise, cloud, big data and unstructured data sources at half the cost of traditional approaches. Denodo’s customers across every major industry have gained significant business agility and ROI by enabling faster and easier access to unified business information for agile BI, big data analytics, web, cloud integration, single-view applications and enterprise data services.
