Data-Centric Approach for AI Development

Aashish Yadav | July 11, 2022

As AI has grown in popularity over the past decade, practitioners have concentrated on gathering as much data as possible, classifying it, preparing it for use, and then iterating on model architectures and hyper-parameters to attain the desired objectives. While handling all of this data has long been recognized as laborious and time-consuming, it has typically been treated as an upfront, one-time step taken before the essential modeling phase of machine learning. Data quality concerns, label noise, model drift, and other biases are all addressed in the same way: collect and label more data, then iterate on the model again.

This approach has worked well for firms with abundant resources or strategically important problems. It works poorly for machine learning's long-tail problems, particularly those with few users and little training data.

The discovery that the prevailing deep learning method doesn't "scale down" to these industry challenges has given rise to a new trend in the field, termed "Data-Centric AI."

Implementing a Data-Centric Approach for AI Development

Leverage MLOps Practices
Data-centric AI prioritizes data over models, yet model selection, hyper-parameter tuning, experiment tracking, deployment, and monitoring all still take time. Data-centric approaches therefore emphasize automating and simplifying these ML lifecycle operations.

Standardizing and automating model building requires MLOps. MLOps practices automate the pipelines that manage the machine learning lifecycle, and the organizational structure they bring improves communication and cooperation between teams.
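As a rough sketch of what such automation looks like, the toy pipeline below chains a data-validation gate, a training step, and an evaluation step into one callable unit. All names and the "majority-label model" are illustrative stand-ins, not a real MLOps framework:

```python
# Hypothetical minimal ML-lifecycle pipeline: validate -> train -> evaluate.
# Real MLOps stacks run these as automated, monitored services.

def validate(data):
    # Data-centric first gate: drop rows with missing labels.
    return [row for row in data if row.get("label") is not None]

def train(data):
    # Stand-in for model fitting: the "model" is just the majority label.
    labels = [row["label"] for row in data]
    return max(set(labels), key=labels.count)

def evaluate(model, data):
    # Fraction of rows whose label matches the majority-label "model".
    hits = sum(1 for row in data if row["label"] == model)
    return hits / len(data)

def run_pipeline(raw):
    clean = validate(raw)           # automated data check
    model = train(clean)            # automated training step
    score = evaluate(model, clean)  # automated evaluation step
    return model, score

raw = [{"label": "spam"}, {"label": "ham"}, {"label": None}, {"label": "spam"}]
model, score = run_pipeline(raw)
print(model, round(score, 2))  # spam 0.67
```

The point is that every stage is a plain function with a defined input and output, so the whole lifecycle can be re-run automatically whenever the data changes.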

Involve Domain Expertise
Data-centric AI development requires domain-specific datasets. Data scientists can overlook intricacies of particular sectors, business processes, or even their own domain. Domain experts can provide ground truth for the AI use case and verify whether the dataset truly portrays the situation being modeled.

Complete and Accurate Data
Data gaps cause misleading results, so it is crucial to have a training dataset that accurately depicts the underlying real-world phenomenon. Data augmentation or synthetic data generation can help when gathering comprehensive, representative data is costly or challenging for your use case.
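As a concrete illustration, augmentation can be sketched as generating perturbed copies of existing rows. The function and dataset below are hypothetical, a minimal sketch assuming numeric features that tolerate small perturbations without changing the label:

```python
import random

# Minimal augmentation sketch for a tiny tabular dataset: create extra
# training rows by adding small Gaussian noise to float-valued features
# while keeping the label unchanged (an assumption that only holds for
# features where tiny perturbations don't flip the class).

def augment(rows, copies=2, sigma=0.05, seed=0):
    rng = random.Random(seed)
    out = list(rows)  # keep the originals
    for row in rows:
        for _ in range(copies):
            jittered = {
                k: (v + rng.gauss(0, sigma) if isinstance(v, float) else v)
                for k, v in row.items()
            }
            out.append(jittered)
    return out

data = [{"height": 1.7, "label": "adult"}, {"height": 0.9, "label": "child"}]
bigger = augment(data)
print(len(bigger))  # 6
```

Two originals plus two jittered copies each yields six rows; synthetic-data tools apply the same idea with far more sophisticated generators.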

Spotlight

Paradigm4

Paradigm4’s SciDB – the latest innovation from Turing Laureate, entrepreneur, and MIT Professor Michael Stonebraker – is a radically new computational database for mining insights from genomic, clinical, trading, image, RWE, sensor, and device data. Paradigm4 is changing what’s possible with big and diverse data by answering bigger, harder questions.

OTHER ARTICLES

Enhance Your Customer Experience with Data-Centric AI

Article | August 18, 2022

Data-centric AI is a unique approach to machine learning that depends on the data scientist to design the complete pipeline, from data cleaning and intake through model training. No detailed understanding of AI algorithms is needed in this method; instead, it is all about the data. The principle behind data-centric AI is simple: rather than training an algorithm first and then cleaning up a dirty dataset, begin with clean data and train the algorithm on that.

Why Is It Necessary to Centralize Datasets?
A consolidated data platform can be used to produce a single source of truth, simplifying work and assuring accuracy. When a team concentrates on continual improvement, wasted time and resources are reduced. Centralizing data also improves optimization, because it gives your team more opportunity to enhance procedures and make better judgments. A single platform promotes constant improvement in processes, products, and operational models.

Data-Centric AI for Personalized Customer Experience
Data-centric AI connects your data and analytics. It is used to detect common habits and preferences, tailor marketing campaigns, provide better suggestions, and much more. It is being used to evaluate many types of data in order to help organizations make quicker, more efficient choices. It can analyze client behavior and trends across several channels to provide personalized experiences, enabling applications and websites to adjust what individuals see according to their preferences, and advertisers to target specific consumers with tailored offers.

What Will the Future of Data-Centric AI Look Like?
Data-centric AI strives to provide a systematic approach to a wide range of domains, including product design and user experience. It is a systematic technique and technology that enables engineers and other data scientists to employ machine learning models in their own data studies. Moreover, its goal is to build best practices that make data analysis less expensive and easier for businesses to implement.


Why Adaptive AI Can Overtake Traditional AI

Article | June 9, 2022

With the ever-changing technology world, static company demands and results are no longer the norm. Businesses in a variety of sectors are using artificial intelligence (AI) technologies to solve complicated business challenges, build intelligent and self-sustaining solutions, and, ultimately, remain competitive. To that aim, ongoing attempts are being made to reinvent AI systems to do more with less. Adaptive AI is a significant step in that direction. Its capacity to enable enterprises to achieve greater outcomes while investing less time, effort, and resources is why it can overtake traditional machine learning (ML) models in the near future.

Why Adaptive AI Overtakes Traditional AI

Robust, Efficient and Agile
Robustness, efficiency, and agility are the three basic pillars of adaptive AI. Robustness is the ability to achieve high algorithmic accuracy. Efficiency is the capacity to achieve low resource utilization (for example, compute, memory, and power). Agility is the ability to adjust operational circumstances in response to changing demands. Together, these three principles provide the groundwork for highly capable AI inference on edge devices.

Data-Informed Predictions
The adaptive learning approach uses a single pipeline. With this method, you get a continually advancing learning process that keeps the framework up to date and encourages it to achieve high levels of performance. The adaptive learning method examines and learns from new changes made to the incoming values and their associated attributes. Moreover, it adapts to events that can modify market behavior in real time and, as a result, maintains its accuracy consistently. Adaptive AI ingests information from the operational environment and uses it to produce data-informed predictions.

Closing Lines
Adaptive AI will be utilized to meet changing AI computing requirements. Operational effectiveness depends on algorithmic performance and available compute resources. Edge AI frameworks that can adjust their computing demands effectively reduce compute and memory requirements. Adaptive AI is robust in CSPs' dynamic software environments, where inputs and outputs alter with each framework revamp. It can assist with network operations, marketing, customer service, IoT, security, and customer experience.
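The single-pipeline, continually updated learning described above can be sketched with a simple online learner that takes one stochastic-gradient step per incoming observation. The class and constants are illustrative, not a production adaptive-AI system:

```python
# Minimal sketch of adaptive (online) learning: a one-feature linear model
# updated one observation at a time, so the model stays current as new data
# streams in, instead of being retrained from scratch in batches.

class OnlineLinearModel:
    def __init__(self, lr=0.05):
        self.w = 0.0   # slope
        self.b = 0.0   # intercept
        self.lr = lr   # learning rate

    def predict(self, x):
        return self.w * x + self.b

    def update(self, x, y):
        # One SGD step on the squared error for this single sample.
        err = self.predict(x) - y
        self.w -= self.lr * err * x
        self.b -= self.lr * err

model = OnlineLinearModel()
# Simulated stream of (x, y) observations following y = 2x.
for x, y in [(1, 2), (2, 4), (3, 6)] * 200:
    model.update(x, y)
print(round(model.predict(4), 2))  # 8.0
```

Because each update costs only a few arithmetic operations, the same loop can keep running in production as observations arrive, which is the essence of the "single pipeline" adaptive approach.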


A Modern Application Must Have: Multi-model Database

Article | June 24, 2022

To function well, modern apps require enormous amounts of diverse data from sensors, processes, interactions, and other sources. However, these apps cannot understand unstructured big data and extract commercial value from it unless the data is maintained properly. In today's age of cloud computing, apps gather and analyze data from various sources, but the data isn't always kept in the same database or format. Besides increasing overall complexity, the multiple formats make it more difficult for apps to retain and use the data. Multi-model databases, a cutting-edge class of management system, provide a sophisticated approach to handling varied and unstructured data: rather than combining different database systems, a multi-model database allows various data models to natively share a single, integrated backend.

Why Has the Multi-Model Database Become a Necessity for Modern Applications?
Modern applications can store diverse data in a single repository owing to this flexible approach to database management, which improves agility and reduces data redundancy.

Improve Reliability
Each database can be a single point of failure for a larger system or application. Multi-model databases reduce failure points, enhancing data dependability and recovery time. Faster recovery minimizes expense and maintains customer engagement and the application experience.

Simplify Data Management
Fragmented database systems benefit contemporary applications but complicate development and operations. Multi-model databases provide a single backend that maintains data integrity and fault tolerance, eliminating the need for multiple database systems, software licenses, developers, and administrators.

Improve Fault Tolerance
Modern apps must be fault-tolerant and respond promptly to failures. Multi-model databases enable this by integrating several systems into a single backend, providing system-wide fault tolerance.

Closing Lines
As applications get more complicated, so do their database requirements. Connecting many databases and preserving consistency between data gathered from various sources is a time-consuming and expensive undertaking. Fortunately, multi-model databases provide an excellent option for supporting the data models you want on a single backend.
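To make the single-backend idea concrete, the sketch below uses one SQLite database (chosen here purely for illustration; the schema and data are hypothetical) to hold both a relational table and JSON documents, so two data models share one engine:

```python
import json
import sqlite3

# Minimal multi-model sketch: one SQLite backend serves both a relational
# table (orders) and a document store (docs holding JSON bodies), instead
# of running a separate relational database and a separate document store.

db = sqlite3.connect(":memory:")
db.execute("CREATE TABLE orders (id INTEGER PRIMARY KEY, total REAL)")
db.execute("CREATE TABLE docs (id INTEGER PRIMARY KEY, body TEXT)")

db.execute("INSERT INTO orders VALUES (1, 19.99)")
db.execute(
    "INSERT INTO docs VALUES (1, ?)",
    (json.dumps({"customer": "Ada", "tags": ["priority"]}),),
)

# Relational query and document lookup against the same backend.
total = db.execute("SELECT total FROM orders WHERE id = 1").fetchone()[0]
doc = json.loads(db.execute("SELECT body FROM docs WHERE id = 1").fetchone()[0])
print(total, doc["customer"])  # 19.99 Ada
```

A real multi-model database goes further by indexing and querying the document (or graph, or key-value) data natively, but the operational benefit is the same: one backend, one failure domain, one set of administrators.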


Why Increasing Data Maturity Helps Businesses Unlock Digital Potential

Article | July 5, 2022

There is no dispute that brands that harness and invest in data capabilities will be the ones to realize their maximum revenue potential. However, while today's marketers have access to a multitude of data sources, understanding what data to use and how to use it are two of the biggest challenges for all. Data utilization in companies is an inconsistent experience. Some businesses have sensibly invested in improving their data maturity; they can pivot quickly to maximize income potential in an unstable economic environment. Others face a cycle of declining returns as they try to reproduce previous achievements with variable outcomes.

Importance of Data Maturity for Businesses
Understanding your organization's data maturity is critical for five reasons. It can help marketers:

Align: Recognize which problems and challenges the wider organization is attempting to solve and modify techniques to support those goals.
Appreciate: Analyze honestly what the company does well presently and where adjustments are needed to enable better data decision-making.
Evaluate: Measure data literacy levels while implementing training and upskilling resources, fostering an open learning environment that encourages innovative thinking.
Anticipate: As the company's data capabilities develop, look forward to significantly more advanced analytics possibilities.
Calibrate: Optimize technology and infrastructure to extract maximum value now while appropriately planning for future resources.

Future-Proof Your Business with Data Maturity
Data maturity applies to the whole organization. It is a company-wide effort that extends beyond the goals of a single sales or marketing team. As a result, it's critical to bring together diverse internal influencers to determine how improvements to your data strategy can help everyone achieve the same objectives. The mix of stakeholders is unique to each organization and will be determined by your company's priorities.


Related News


ACA Group Acquires Data Specialist Ethos ESG to Offer First Data Analytics Product

ACA Group | September 26, 2022

ACA Group (ACA), the leading governance, risk, and compliance (GRC) advisor in financial services, today announced that it has acquired Ethos ESG (Ethos), a provider of environmental, social, and governance (ESG) ratings data and software for financial advisors, asset managers, institutions, and investors. This acquisition marks ACA's first analytics offering, which will be paired with ACA's ESG experts to form an integrated tech and advisory offering under the ESG Advisory practice.

ACA's existing ESG Advisory practice supports a range of programmatic needs for firms that integrate ESG into their business or investment activities. This currently includes advice and implementation support around strategy, policies and procedures, regulations and frameworks, training, and external reporting, among other areas. With Ethos, ACA's clients will now also be able to easily analyze investments and automate several elements of ESG reporting.

Founded in 2019, Ethos offers an interactive platform with over 350,000 impact ratings of companies, stocks, and funds across 45 ESG causes such as climate change, racial justice, and mental health. Providing full transparency into how each impact score is calculated, along with the ability to upload portfolios and create models, Ethos allows GRC professionals to understand the ESG characteristics of their investments and make responsible decisions that align with their firm's values and ESG commitments.

Ethos uses a proprietary set of approximately 100 underlying databases to generate its ratings. These databases provide a unique impact view of ratings, as well as insight into key metrics where available. The databases are fully transparent, so clients can see which underlying database is the source for each data point. Ethos has also developed capabilities to quickly scrape the public domain for material publicly available information to include in its ratings. These state-of-the-art capabilities allow Ethos to quickly add company coverage to help clients achieve full coverage of their investment portfolios.

Ethos has invested in innovation through the recent launch of its Impact Calculator, an embeddable widget that takes a dollar amount and immediately calculates the real-world equivalent impact of investing that amount in a specific fund or other product, compared to a benchmark. Additionally, Ethos recently introduced its Carbon Neutral Certification program for mutual funds and ETFs, developed in conjunction with Change Finance. Through the certification, Ethos performs an independent analysis of a fund's carbon footprint (covering Scope 1 and Scope 2 emissions) and carbon credits (offsets) to verify whether the fund is carbon neutral during a specified period.

"This is an exciting step in helping to grow our presence in the ESG space and is ACA Group's first foray into analytics as a service," said Shvetank Shah, CEO of ACA Group. "We are invigorated to be building out and launching our data capabilities, starting with Ethos ESG. Combining data with our scalable solutions will continue to empower our clients to reimagine GRC and protect and grow their business."

"We are thrilled to partner with ACA Group, as their brand and reach in the GRC space is well-known. Not only is taking the ESG impact of your decisions into consideration right on its merits, but greater transparency into ESG issues helps firms mitigate risk and make informed choices while growing sustainably," said Luke Wilcox, Founder and CEO of Ethos ESG.

"This pairing will help us to leverage data in a new way to help firms of all sizes develop and monitor their ESG programs to mitigate risk, make informed choices, combat greenwashing, and grow profitably and sustainably in the process. Access to high-quality, transparent ESG data is an essential part of any ESG endeavor, and our partnership with Ethos will allow us to build and protect our clients' ESG strategies in ways few others can," said Dan Mistler, Head of ESG Advisory at ACA Group.

About ACA Group
ACA Group (ACA) is the leading governance, risk, and compliance (GRC) advisor in financial services. We empower our clients to reimagine GRC and protect and grow their business. Our innovative approach integrates advisory, managed services, distribution solutions, and analytics with our ComplianceAlpha® technology platform, the specialized expertise of former regulators and practitioners, and our deep understanding of the global regulatory landscape.

About Ethos ESG
Founded in 2019, Ethos ESG provides data and analytics for financial advisors, asset managers, institutions, and investors. With over 350,000 impact ratings of stocks and funds across 45 causes, Ethos ESG helps firms offer robust impact reporting, monitor and address sustainability risks, and enhance quantitative research and modelling with transparent ESG data.



Data Integration Platform Dataddo Launches First-Ever Free Plan with No Extraction Limits

Dataddo | September 28, 2022

Dataddo, a SaaS startup that provides an automated, no-code data integration platform, today announced the launch of its Free plan, the first-ever free data integration plan with no extraction limits. It is a no-investment way for any professional, regardless of technical skill, to access and start working with data from disparate sources.

The plan comes as a response to company cultural obstacles, such as low data literacy, that hinder the adoption of data initiatives within organizations across industries. For at least the last five years, the great majority of leading companies across industries have been investing in data initiatives, with the percentage today being as high as 97%. Yet only 26.5% of these companies claim to actually be data-driven. Mounting evidence suggests that the main reason for this discrepancy is company culture, meaning both lack of executive buy-in on initiatives and lack of employee confidence in data skills. In a recent Accenture survey of 9,000 employees from companies across industries, nearly three quarters of respondents (74%) claimed to feel "overwhelmed or unhappy when working with data." This is critical because, as Gartner states, "the real drivers of [data-driven culture] are the people."

Dataddo's Free plan aims to help businesses overcome these obstacles by making it easier for them to share data and familiarize employees with visualization tools before investing in paid ones. The plan is the first free data integration plan on the market that puts no cap on extraction volume, and it can be used for an unlimited period of time. Under the plan, subscribers can access any ten of Dataddo's 200+ connectors and automate the weekly synchronization of any volume of marketing, sales, financial, and other cloud data to up to three dashboarding applications and/or Google Sheets. The company also offers a growing library of free templates for popular dashboarding applications, enabling professionals with any level of data skill to start analyzing immediately. The platform is SOC 2 Type II certified and complies with major security and data privacy standards around the globe, including ISO 27001, GDPR in Europe, CCPA and HIPAA in the US, LGPD in Brazil, and POPIA in South Africa.



Qlik Expands Strategic Alignment with Databricks Through SQL-Based Ingestion to Databricks Lakehouse and Partner Connect Integration

Qlik | September 27, 2022

Qlik® today announced two significant enhancements to its partnership with Databricks that make it easier than ever for customers to combine Qlik's solutions and Databricks to advance their cloud analytics strategies. First is the launch of the Databricks Lakehouse (Delta) Endpoint, a new capability in Qlik Data Integration that simplifies and improves customers' ability to ingest and deliver data to the Databricks Lakehouse. Second is the integration of Qlik Cloud® with Databricks Partner Connect, enhancing the Qlik Data Analytics trial experience with Databricks. Both deepen and expand customers' ability to combine Qlik and Databricks in their efforts to leverage the cloud for impact.

"We're excited about the potential of Qlik's Databricks Lakehouse (Delta) Endpoint to seamlessly and efficiently deliver the data customers need to drive more value from their investment in the Databricks Lakehouse. And, with Qlik Analytics now integrated with Databricks Partner Connect, we are making it even easier for customers to discover, use, and share data-driven insights across their organizations," said Roger Murff, VP of Technology Partners at Databricks.

Leveraging Databricks SQL, Qlik's Databricks (Delta) Endpoint optimizes continuous, real-time data ingestion through Qlik Data Integration into Delta Lake on Databricks. This gives organizations the ability to cost-effectively move more data from a wide range of enterprise data sources, including SAP and mainframe, into the Databricks Lakehouse while leveraging their cloud provider of choice, such as Amazon Web Services (AWS), Google Cloud Platform (GCP), or Microsoft Azure.

Qlik has also recently integrated Qlik Cloud with Databricks Partner Connect. Databricks customers can now seamlessly experience Qlik Sense® SaaS within the Databricks Lakehouse Platform through an existing Qlik tenant or a free Qlik trial. The experience includes automatic configuration of connectivity to the customer's Databricks environment, making it easier for Databricks customers to experience the power of Qlik Cloud. Both the new Databricks Lakehouse (Delta) Endpoint and the Partner Connect integration demonstrate Qlik's commitment to supporting customers like J.B. Hunt in their efforts to combine Qlik and Databricks for impact.

"We're seeing more demand for real-time data related to shippers and carriers in order to provide up-to-the-minute information on how they are performing. Qlik and Databricks help us to meet those demands," said Joe Spinelle, Director of Engineering and Technology at J.B. Hunt.

"As they migrate more and more to the cloud, Databricks customers want strategic partners that make it as easy as possible to deliver and analyze data for impact," said Itamar Ankorion, SVP of Technology Alliances at Qlik. "With our new Databricks Lakehouse Endpoint and Databricks Partner Connect integration, Qlik is clearly demonstrating its alignment with Databricks and our dedication to the Databricks community to deliver an amazing experience that furthers their overall data strategies."

About Qlik
Qlik's vision is a data-literate world, where everyone can use data and analytics to improve decision-making and solve their most challenging problems. A private company, Qlik offers real-time data integration and analytics solutions, powered by Qlik Cloud®, to close the gaps between data, insights, and action. By transforming data into Active Intelligence, businesses can drive better decisions, improve revenue and profitability, and optimize customer relationships. Qlik serves more than 38,000 active customers in over 100 countries.

