Q&A with Charles Southwood, Vice President, N. Europe and MEA at Denodo

Media 7 | September 15, 2021

Charles Southwood, Regional VP at Denodo Technologies, is responsible for the company’s business revenues in Northern Europe, the Middle East and Africa. He is passionate about working in rapidly moving and innovative markets to support customer success and to align IT solutions with changing business needs. With a degree in engineering from Imperial College London, Charles has over 20 years of experience in data integration, big data, IT infrastructure/IT operations and business analytics.

There is a huge buzz around AI and, for about 10% of businesses, it’s already being used to drive better, faster business decision-making.



MEDIA 7: Could you please tell us a little bit about yourself and what made you choose this career path?
CHARLES SOUTHWOOD:
Like several friends and colleagues who now work in IT, I started my career as a civil engineer and moved into IT as the industry matured. Over the years I have found that many of the attributes that come from that early engineering background are also well suited to the logical, analytical approach needed to solve customer business problems with data and IT solutions, and to run a sales and marketing business efficiently. After graduating from Imperial College, I spent several years in the industry with Taylor Woodrow, both on-site and in the design office. This included a unique opportunity to spend several months in the Amazon jungle designing and building bridges to connect remote villages that would otherwise be cut off during the rainy season; both an unusual and very rewarding experience for someone in their mid-twenties!

Upon returning to the UK I moved into the design office, designing dams, oil rigs, power stations and river diversion schemes. Much of the design involved CAD/CAM systems and finite difference and finite element analysis (using large matrix manipulation) to model loadings and stresses in structures and to predict fluid flow patterns. Although business-related computing was in its infancy, I could see enormous potential and found the prospects for much wider use of IT very enticing. An opportunity presented itself to move to a company that sold solutions and consultancy services both for engineering and for general business applications. I joined the company in a sales position and, although I still had my links to engineering, the faster growth in the use of IT for business inevitably led me to focus on this area. This in turn led to other sales positions, then into sales management and wider business leadership roles.

Over the last 25 years, I have therefore had the pleasure of working in a number of software houses, driving the sales and marketing operations of companies where the predominant focus is on innovative market solutions, often disruptive in their nature. I am privileged to have had the opportunity to discuss business issues and strategies with a vast number of large and mid-sized organisations, across different industries and with varying stages of IT and data maturity. It is great to be able to discuss the ‘big-picture’ perspectives on the trends and needs of those client businesses and to subsequently see the impact that can be achieved by supporting them with the right solutions. For the last 5 years, as Regional Vice President of Denodo Technologies, I’ve been responsible for leading the company’s business growth across northern Europe, the Middle East and Africa.

With ever-increasing data volumes, a greater variety of data formats and more disparate locations, coupled with the mounting desire for real-time data insights, we’ve seen rapid growth in demand for improved data integration. As the world’s market leader in data virtualisation, we’re seeing strong adoption across all sectors, addressing application and data integration needs for a wide array of use cases, from self-service and agile BI to Customer 360, cloud migration and compliance/risk. It is an exciting time to be at the heart of these industry initiatives!


M7: Your award-winning data virtualization platform is regularly featured in the world's leading business and IT publications. What are the core values that drive the company to be a leader in data virtualization?
CS:
Denodo was founded in 1999 and has been focused on data virtualisation throughout that time. In fact, many look upon Denodo as the ‘founding father’ of data virtualisation technology and, as you say, the capabilities of the Denodo Platform place it consistently in the leadership categories of the various analyst reports on data and application integration. For example, in Gartner’s latest Magic Quadrant for Data Integration (June 2021), Denodo is listed as the fastest-growing vendor in the Leadership Quadrant. The company is privately owned, headquartered in Palo Alto and led by its founder and CEO, Angel Vina.

Financially, Denodo is in great shape, with 50-60% year-on-year revenue growth and a business that is both profitable and free of long-term debt. The combination of consistent executive leadership and strong financials means that we’ve been able to maintain an unrivalled focus on ‘excellence’: excellence in technical capability, excellence in innovation and excellence in customer satisfaction. This is reflected in Denodo’s vibrant and enthusiastic user community, which can be seen in the many references and case studies available on Denodo’s website, as well as in the customer satisfaction ratings in the various vendor surveys.


It is no longer practical to move and copy all the data that might be needed into one large curated data warehouse or data lake. It is too slow, too complex and too costly.



M7: Given the changes in today’s rapidly evolving marketplace, what are the key 'climatic' changes taking place in Data Virtualization today? What systematic approach are you using to address these changes?
CS:
When talking to businesses about their wider data integration needs, the challenges often include siloed data, legacy applications (often designed with no integration in mind), cloud migration, hybrid cloud (on-premise and cloud combinations) or multi-cloud combinations, ungoverned ‘shadow IT’ and the wide variety of formats and protocols. Combine this with rapidly growing data volumes, the disparate nature of modern data stores (on-premise, SaaS and various cloud locations, plus perhaps external sources) and the business demand for real-time or near real-time data insights, and you have some big challenges! If you are to refer to these as ‘climatic changes’ then most businesses are encountering a heat-wave! This is typically when Denodo gets invited to the party. It is no longer practical to move and copy all the data that might be needed into one large curated data warehouse or data lake. It is too slow, too complex and too costly. The industry is littered with stories of the difficulties encountered with the move-and-copy approach and big data lake projects that never end.

Instead, connecting a virtual layer to all potential sources (without moving and copying the data) means that data consumers can be connected to just the virtual layer and get secure access to everything they might need, regardless of location or format. The integration can be provided in real time inside the virtual layer. It saves on the complexity and cost of all the ‘heavy lifting’ and duplication of vast amounts of data, and the users get agility and real-time data (rather than the overnight batch versions they were getting from physical data warehouses). Of course, IT also benefits from a single layer for audit, security and governance, ensuring compliance and much faster data access for the business. We all do something similar at home when we watch movies or listen to music with services like Netflix or Spotify. We get what we want when we want it and don’t have to worry about where it is stored. We also get an almost infinite choice, as there is no need to hold all the DVD or CD collections locally, on our shelves. Data virtualisation provides the same kind of paradigm, but for business. When it comes to data, ‘Connect, don’t collect’!
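To make the "connect, don't collect" idea concrete, here is a minimal, illustrative Python sketch of a virtual layer that federates two sources at query time rather than copying them into a warehouse. It is not Denodo's API or product behaviour; the source names, fields and the customer_360 view are hypothetical stand-ins (an in-memory SQLite table plays the operational database and a stub function plays a real-time SaaS call).

```python
# Illustrative sketch only: a toy "virtual layer" that federates two sources
# at query time instead of copying everything into a central warehouse.
# Source names, fields and the view are hypothetical, not Denodo's actual API.
import sqlite3

# Source 1: an operational database (in-memory SQLite standing in for it)
crm = sqlite3.connect(":memory:")
crm.execute("CREATE TABLE customers (id INTEGER, name TEXT, country TEXT)")
crm.executemany("INSERT INTO customers VALUES (?, ?, ?)",
                [(1, "Acme Ltd", "UK"), (2, "Globex", "DE")])

# Source 2: a SaaS application normally reached over an API (stubbed here)
def fetch_open_orders():
    """Stand-in for a real-time API call; returns current open orders."""
    return [{"customer_id": 1, "order_id": "A-17", "value": 1200.0},
            {"customer_id": 1, "order_id": "A-21", "value": 430.0},
            {"customer_id": 2, "order_id": "B-03", "value": 980.0}]

def customer_360(customer_id):
    """Virtual view: combine both sources on demand, leaving data in place."""
    row = crm.execute("SELECT id, name, country FROM customers WHERE id = ?",
                      (customer_id,)).fetchone()
    orders = [o for o in fetch_open_orders() if o["customer_id"] == customer_id]
    return {"id": row[0], "name": row[1], "country": row[2],
            "open_orders": orders,
            "open_order_value": sum(o["value"] for o in orders)}

print(customer_360(1))
```

The consumer only ever talks to the view; the underlying systems stay where they are and are queried live, which is the essence of the approach described above.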


M7: Could you please help our readers understand the relationship between data virtualization and digital transformation?
CS:
This is a very good question, as it is easy to mix up the terms data virtualisation and digital transformation! Digital transformation is the strategic adoption of digital technology to meet changing market and business needs. It often involves the creation of new business processes or the modification of existing ones. As organisations reduce or eliminate paper-based processes, we see the switch from analogue to digital. It brings with it new opportunities and new customer interactions that would previously have been impossible. Interestingly, data virtualisation is often used in digital transformation projects as a key element in the new digitised architecture, giving greater agility to the business, with new products, processes and services made available through the combination of data in new and innovative ways. Take, for example, a new omnichannel approach to customer interaction, offering a mobile app, web-based access and a call centre service, as well as perhaps internal management reports and real-time executive dashboards.

These different data consumers require different technologies and therefore different formats and data delivery protocols. Without data virtualisation, each flavour of consumer would need specific interfaces and manipulation of the data to provide the right exposure. This is time-consuming, requires coding and data movement, and is both costly and potentially error-prone. It also makes it hard to make changes in a quick and agile manner. By using a data virtualisation layer, however, all the data combinations and transformations can be defined just once, linking the published data to the different data consumers from the same single data model. Now you have a very agile, easy-to-maintain system with consistent data delivered to all consumers, irrespective of the technology. Oh, and let’s not forget it is now also real-time or near real-time from the source! For many businesses, this can be truly transformational!
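As a rough illustration of "define once, publish to many consumers", the sketch below (again hypothetical, not Denodo functionality) keeps the combined data model in a single function and adds thin delivery adapters for a JSON-consuming app and a CSV-consuming reporting tool, so a change to the model is made in one place only.

```python
# Illustrative sketch only: one shared view definition published to several
# consumers in the formats they need, without re-implementing the integration
# logic per consumer. Function and field names are hypothetical.
import csv
import io
import json

def customer_view():
    """Single place where the combined, transformed data model is defined."""
    return [
        {"id": 1, "name": "Acme Ltd", "country": "UK", "open_order_value": 1630.0},
        {"id": 2, "name": "Globex", "country": "DE", "open_order_value": 980.0},
    ]

def publish_json():
    """Delivery for a mobile app or web front end (REST-style JSON payload)."""
    return json.dumps(customer_view())

def publish_csv():
    """Delivery for a reporting or dashboard tool that expects tabular data."""
    rows = customer_view()
    out = io.StringIO()
    writer = csv.DictWriter(out, fieldnames=rows[0].keys())
    writer.writeheader()
    writer.writerows(rows)
    return out.getvalue()

print(publish_json())
print(publish_csv())
```

Each consumer gets its own delivery format, but all of them draw on the same underlying model, which is what keeps the system agile and consistent.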


Many AI applications have obvious and direct benefits which most people would accept as reasonable, but as the level of sophistication grows the issues will inevitably become more complex.



M7: How has COVID-19 affected the market? What was Denodo’s strategy during the pandemic?
CS:
The demand for data virtualisation is driven by projects and strategies requiring faster, more digital business models, where data is recognised as a potential key differentiator for the business; new insights, faster interactions, new products and services, greater customer intelligence, propensity models, improved governance and compliance, reduced risk and the adoption of AI/ML are all typical business drivers for data virtualisation. The solution is very much industry-agnostic and is therefore used widely across most major business sectors, including finance, telco, pharma, manufacturing, government, retail and utilities. For this reason, we saw some sectors accelerating their adoption and others holding back. In banking, for example, branch closures drove greater digitisation and the adoption of online banking; likewise, the move to a virtually cashless economy provided far more comprehensive data sets for the analysis of customer spending patterns than could ever have been achieved when cash was in play.

For our part, Denodo was keen to help combat the global challenges we all faced. We therefore spun up a data portal with the sole aim of integrating disparate global data about COVID-19, curating it, and then providing it free of charge to data consumers such as data scientists, analysts and researchers, who could use it to look at the impact of, and potential solutions to, this disease. The portal operated for some 15 months, with categories of content from several hundred official data sources from around the world (such as NHS England, the ONS, the WHO, etc.) for everyone to consume. It was set up and running in just 21 days, which demonstrates the agility available from using data virtualisation for analytics. Some views were accessed tens of thousands of times over its period of operation. The objective was to use data insights to help whoever, wherever in the world, to fight COVID-19. It wasn’t just medical data; other examples include the ability to highlight the percentage change in visits to places like grocery stores and parks within a geographic area, and the ability to look at changes in air quality across regions over time, based on changing citizen movements. Collaboration was needed to maximise the value of this initiative to society, and the portal statistics indicate that it was put to good use over the last 18 months. It’s a great example of the use of data virtualisation for high-velocity, high-volume analytics.

We’ve also had feedback from customers on their own application of the Denodo Platform during the pandemic. NHS National Services Scotland, for example, used Denodo’s technology to solve challenges such as:

-The provisioning of data for COVID reporting to the Scottish Government and the First Minister in Scotland

-Helping to monitor oxygen availability at various hospitals

-Enabling data for Test and Trace of COVID patients and the general public

-Implementing the vaccination programme, including venue and appointment management, eligibility, uptake, etc.

-The planning and availability of required PPE and other supplies

Overall, given the huge impact on some industry sectors, we were fortunate to have been in IT as our business was able to work remotely, largely without difficulty, and we were able to continue to support our customers’ needs.


M7: How do you prepare for an AI-centric world as a Business Leader?
CS:
One of the fundamental technical requirements for good artificial intelligence is the availability of high-quality, comprehensive and timely data. For this reason, we are often asked to help by using data virtualisation to abstract data from different sources in real time and present the transformed results to AI models. Certainly, there is a huge buzz around AI and, for about 10% of businesses, it’s already being used to drive better, faster business decision-making. Based on what we see, there are three common strategies in the market: business-led, technology-led and a combined, iterative approach. The business-led strategy is driven by business demand for a specific capability. Initially, this seems to be largely for personalisation and automated customer service (chatbots), but now we’re also seeing a growing sophistication in its use for real-time fraud detection and other “detection and prevention” models, as well as improvements in operational efficiencies (e.g. customer onboarding).

Where we see technology-led strategies, businesses are using data science and business analytics to understand their data and its potential for insights – availability, timeliness, quality, veracity and other attributes. What is where? Can it be used? What combinations have value? This approach is much more of a data-led initiative. The third strategy is the combination of business-led demand and technology-driven insight into what might be possible to solve certain business needs, as an iterative process. In many companies, it is the work of the data scientist that shows the greatest potential for value in the use of AI/ML, with new insights and new data combinations. At Denodo we see growing demand for data virtualisation as it substantially accelerates access to data and therefore the surfacing of new, meaningful insights that give a competitive edge to the business.

The value of AI may be compelling, for example fraud prevention on payments or more advertising revenue from customised adverts, but we have to balance this against the need for privacy compliance with GDPR, the Data Protection Act 2018, etc. The largest cost of non-compliance on privacy is often not the fine itself but the brand damage, so we often hear of priority being given to privacy compliance over analytics needs. There are also a host of ethical issues arising from the use of AI. In the short term, enterprises have to show value to their own business and to society. Many AI applications have obvious and direct benefits which most people would accept as reasonable, but as the level of sophistication grows the issues will inevitably become more complex.

ABOUT DENODO

Denodo is the leader in data virtualization providing agile, high performance data integration, data abstraction, and real-time data services across the broadest range of enterprise, cloud, big data, and unstructured data sources at half the cost of traditional approaches. Denodo’s customers across every major industry have gained significant business agility and ROI by enabling faster and easier access to unified business information for agile BI, big data analytics, Web, cloud integration, single-view applications, and enterprise data services.
