Metadata Driven Data Fabric

Aashish Yadav | July 22, 2022

The primary purpose of an enterprise data fabric is not new: to deliver the appropriate data to the right data consumer at the right moment, in the correct form, regardless of how or where it is stored. The data fabric is the common "net" that connects integrated data from many data and application sources and distributes it to diverse data consumers.

What distinguishes the data fabric approach from earlier, more conventional data integration architectures? Its primary distinction is its reliance on metadata to achieve this purpose. Implementing a data fabric requires a metadata-driven architecture capable of delivering integrated and enriched data to data consumers. Gartner coined the term "active metadata" to highlight this concept.

Data Fabric Relies on Active Metadata
Metadata describes the many properties of data, and the richer the metadata we gather, the better it supports our application scenarios. Historically, metadata has fallen into three categories:

  • Business metadata - gives meaning to data by mapping it to business terminology.
  • Technical metadata - contains information on the data's format and structure, like physical database structures, data types, and data models.
  • Operational metadata - includes the specifics of data processing and access, such as data-sharing regulations, performance, maintenance plans, and archiving and retention policies.

More recently, a fourth kind of metadata has emerged: social metadata, which usually consists of the conversations and comments that technical and business users attach to the data. Business metadata, meanwhile, has progressed from simple term mapping to include taxonomies that aid in the interpretation of data context and meaning.
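To make these categories concrete, here is a minimal sketch (in Python, with illustrative class and field names that are assumptions, not any particular catalog's schema) of a metadata record that combines all four categories:

```python
from dataclasses import dataclass, field
from typing import Dict, List

@dataclass
class BusinessMetadata:
    glossary_term: str            # business term the asset maps to
    taxonomy_path: List[str]      # taxonomy that gives the term context

@dataclass
class TechnicalMetadata:
    physical_table: str           # physical database structure
    columns: Dict[str, str]       # column name -> data type

@dataclass
class OperationalMetadata:
    sharing_policy: str           # data-sharing regulation that applies
    retention_days: int           # archiving / retention policy
    last_refreshed: str           # data-processing detail (ISO timestamp)

@dataclass
class SocialMetadata:
    comments: List[str] = field(default_factory=list)  # user discussion

@dataclass
class MetadataRecord:
    asset_name: str
    business: BusinessMetadata
    technical: TechnicalMetadata
    operational: OperationalMetadata
    social: SocialMetadata

record = MetadataRecord(
    asset_name="sales.orders",
    business=BusinessMetadata("Customer Order", ["Sales", "Orders"]),
    technical=TechnicalMetadata("dw.fact_orders", {"order_id": "BIGINT"}),
    operational=OperationalMetadata("restricted", 365, "2022-07-01T00:00:00"),
    social=SocialMetadata(comments=["Verified by the finance team"]),
)
```

The richer such a record becomes, the more application scenarios it can support.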

What distinguishes active metadata from passive metadata? According to Gartner, passive metadata is any metadata that is gathered; some Gartner analysts define active metadata as metadata that is actually being utilized. By "utilized" we mean the use of metadata by software (such as components inside the data fabric) to enable a wide variety of data integration, analysis, reporting, and other data-processing scenarios. Other analysts take the idea a step further, arguing that the data fabric generates active metadata by analyzing passive metadata and using the findings to recommend or automate operations, as in the sketch that follows.
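As a rough illustration of that last point, the toy sketch below (the field names, thresholds, and rules are assumptions) scans gathered metadata and proposes follow-up operations, which is the sense in which passive metadata becomes active:

```python
from datetime import datetime, timedelta
from typing import Dict, List

def recommend_actions(assets: List[Dict]) -> List[str]:
    """Analyze gathered (passive) metadata and propose operations."""
    actions = []
    now = datetime.utcnow()
    for asset in assets:
        refreshed = datetime.fromisoformat(asset["last_refreshed"])
        # Stale data: propose refreshing the integration pipeline.
        if now - refreshed > timedelta(days=7):
            actions.append(f"Refresh integration pipeline for {asset['name']}")
        # Restrictive sharing policy: propose masking before distribution.
        if asset.get("sharing_policy") == "restricted":
            actions.append(f"Apply masking before distributing {asset['name']}")
    return actions

print(recommend_actions([
    {"name": "sales.orders", "last_refreshed": "2022-07-01T00:00:00",
     "sharing_policy": "restricted"},
]))
```

A data fabric would run this kind of analysis continuously and feed the recommendations back into its integration and governance components.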

Closing Lines
In this entry of the blog series, we focused on metadata management in data platforms. We showed why passive metadata management does not meet the demands of current data platform designs and why it must be supplemented, if not replaced, by the autonomous processes of active metadata management.

Spotlight

Musstara Technologies Inc.

Musstara Technologies is a Canada-based data science company providing the following services to domestic and international clients:

  • Predictive analytics in health, finance, marketing, and real estate.
  • Geographical information systems: spatial data modeling, web GIS (interactive web), mapping (digitization and generation).
  • Optical and radar remote sensing: georeferencing, image processing, data format conversion, classification, time series analysis, change detection.
  • Computational biology and bioinformatics: protein sequence analysis and design, protein structure prediction, protein sequence informatics, process automation, exploratory analysis, phylogenetics, statistical modeling, scientific software and database development.

OTHER ARTICLES

Why Adaptive AI Can Overtake Traditional AI

Article | April 4, 2022

In an ever-changing technology landscape, fixed company demands and outcomes are no longer the norm. Businesses across sectors are using artificial intelligence (AI) technologies to solve complicated business challenges, build intelligent and self-sustaining solutions, and ultimately remain competitive. To that end, ongoing efforts are being made to reinvent AI systems so they do more with less. Adaptive AI is a significant step in that direction: it has the potential to outperform standard machine learning (ML) models because it enables organizations to achieve greater results while spending less time, effort, and resources.

Why Adaptive AI Overtakes Traditional AI

Robust, Efficient and Agile
Robustness, efficiency, and agility are the three basic pillars of adaptive AI. Robustness is the ability to achieve high algorithmic accuracy. Efficiency is the ability to achieve low resource utilization (for example, compute, memory, and power). Agility is the ability to adjust operating conditions in response to changing demands. Together, these three principles provide the groundwork for highly capable AI inference on edge devices.

Data-Informed Predictions
The adaptive learning approach uses a single pipeline: a continually advancing learning process keeps the framework up to date and encourages it to sustain high levels of performance (see the sketch after this article). The method examines and learns from new changes made to the data, producing values and their associated attributes. Moreover, it benefits from events that can modify market behavior in real time and, as a result, maintains its accuracy consistently. Adaptive AI recognizes information from the operational environment and uses it to produce data-informed predictions.

Closing Lines
Adaptive AI will be used to meet changing AI computing requirements. Operational effectiveness depends on algorithmic performance and available computing resources. Edge AI frameworks that can adjust their computing demands effectively reduce compute and memory requirements. Adaptive AI is robust in CSPs' dynamic software environments, where inputs and outputs change with each framework revamp. It can assist with network operations, marketing, customer service, IoT, security, and customer experience.
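The "single pipeline, continually updated" idea in the Data-Informed Predictions section can be illustrated with incremental (online) learning. Below is a minimal sketch using scikit-learn's partial_fit API; the simulated data stream, batch sizes, and model choice are assumptions for illustration, not a recipe taken from the article:

```python
# Minimal sketch of adaptive/online learning: one model is kept up to date
# as new batches of operational data arrive, instead of retraining from scratch.
import numpy as np
from sklearn.linear_model import SGDClassifier

def stream_batches(n_batches=5, batch_size=64, n_features=4, seed=0):
    """Simulate batches of new operational data arriving over time."""
    rng = np.random.default_rng(seed)
    for _ in range(n_batches):
        X = rng.normal(size=(batch_size, n_features))
        y = (X[:, 0] + 0.5 * X[:, 1] > 0).astype(int)  # stand-in labeling rule
        yield X, y

model = SGDClassifier()              # linear model that supports incremental updates
classes = np.array([0, 1])

for X, y in stream_batches():
    model.partial_fit(X, y, classes=classes)   # update on the newest batch only
    print("accuracy on latest batch:", round(model.score(X, y), 3))
```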


Data-Centric Approach for AI Development

Article | May 13, 2022

As AI has grown in popularity over the past decade, practitioners have concentrated on gathering as much data as possible, labeling it, preparing it for use, and then iterating on model architectures and hyperparameters to reach the desired objectives. While working with all of this data has long been recognized as laborious and time-consuming, it has typically been treated as an upfront, one-time step taken before the essential modeling phase of machine learning. Data quality concerns, label noise, model drift, and other biases are all addressed the same way: collect and label more data, then run more model iterations.

This approach has worked well for firms with unlimited resources or strategically critical problems. It does not work well for machine learning's long-tail problems, particularly those with fewer users and little training data. The realization that the prevailing deep learning recipe does not "scale down" to such industry challenges has given rise to a new trend termed "data-centric AI."

Implementing a Data-Centric Approach for AI Development

Leverage MLOps Practices
Data-centric AI prioritizes data over models. Model selection, hyperparameter tuning, experiment tracking, deployment, and monitoring all take time, so data-centric approaches emphasize automating and simplifying ML lifecycle operations. Standardizing and automating model building requires MLOps: it automates the pipelines that manage the machine learning lifecycle, and the accompanying organizational structure improves communication and cooperation.

Involve Domain Expertise
Data-centric AI development requires domain-specific datasets. Data scientists can overlook intricacies in different sectors, business processes, or even within the same domain. Domain experts can provide ground truth for the AI use case and verify whether the dataset truly represents the situation.

Complete and Accurate Data
Data gaps produce misleading results, so it is crucial to have a training dataset that correctly reflects the underlying real-world phenomenon. If gathering comprehensive and representative data is costly or challenging for your use case, data augmentation or synthetic data generation can help, as in the sketch after this article.
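As a hedged example of that last point on data gaps, the sketch below augments a tiny tabular dataset with jittered copies of its rows; the dataset, noise scale, and number of copies are illustrative assumptions, and real projects would choose augmentations suited to their data type (images, text, time series, and so on):

```python
# Minimal sketch of data augmentation for a small tabular training set.
import numpy as np

def augment_with_noise(X, y, copies=3, noise_scale=0.05, seed=0):
    """Return the original data plus `copies` noisy duplicates of each row."""
    rng = np.random.default_rng(seed)
    X_parts, y_parts = [X], [y]
    for _ in range(copies):
        X_parts.append(X + rng.normal(scale=noise_scale, size=X.shape))
        y_parts.append(y)             # labels are unchanged by small jitter
    return np.vstack(X_parts), np.concatenate(y_parts)

X = np.array([[1.0, 2.0], [3.0, 4.0]])    # tiny illustrative dataset
y = np.array([0, 1])
X_aug, y_aug = augment_with_noise(X, y)
print(X_aug.shape, y_aug.shape)            # (8, 2) (8,)
```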


A Modern Application Must-Have: Multi-model Database

Article | May 31, 2022

To function well, modern apps require enormous amounts of diverse data from sensors, processes, interactions, and other sources. However, these apps cannot make sense of unstructured big data and extract commercial value from it unless that data is managed properly. In today's age of cloud computing, apps gather and analyze data from various sources, but the data is not always kept in the same database or format. Multiple formats increase overall complexity and make it harder for apps to retain and use the data. Multi-model databases, a newer class of management system, offer a sophisticated approach to handling varied and unstructured data: rather than combining separate database systems, a multi-model database allows various data models to natively share a single, integrated backend (a toy sketch of this idea follows this article).

Why Has the Multi-Model Database Become a Necessity for Modern Applications?
This flexible approach to database management lets modern applications store diverse data in a single repository, which improves agility and reduces data redundancy.

Improve Reliability
Each separate database can be a single point of failure for a larger system or application. Multi-model databases reduce the number of failure points, improving data reliability and recovery time. Faster recovery lowers costs and preserves customer engagement and the application experience.

Simplify Data Management
Fragmented database systems may serve contemporary applications, but they complicate development and operations. Multi-model databases provide a single backend that maintains data integrity and fault tolerance, eliminating the need for separate database systems, software licenses, developers, and administrators.

Improve Fault Tolerance
Modern apps must be fault-tolerant and respond to failures promptly. Multi-model databases enable this by integrating several models into a single backend, which provides system-wide fault tolerance.

Closing Lines
As applications grow more complicated, so do their database requirements. Connecting many databases and preserving consistency between data gathered from various sources is a time-consuming and expensive undertaking. Fortunately, multi-model databases provide an excellent option: the data models you need, on a single backend.
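The single-backend idea can be sketched in a few lines. The toy class below is not a real database and its names are assumptions; it only illustrates how document, key-value, and graph views can share one store, something products such as ArangoDB or Azure Cosmos DB provide natively:

```python
# Purely illustrative sketch of the multi-model idea: one backend object
# exposing document, key-value, and graph views over the same store.
from collections import defaultdict

class MultiModelStore:
    def __init__(self):
        self._docs = {}                     # document model: id -> dict
        self._kv = {}                       # key-value model
        self._edges = defaultdict(set)      # graph model: adjacency sets

    def put_document(self, doc_id, doc):
        self._docs[doc_id] = doc

    def put_kv(self, key, value):
        self._kv[key] = value

    def add_edge(self, src, dst):
        self._edges[src].add(dst)

    def neighbors(self, node):
        return sorted(self._edges[node])

store = MultiModelStore()
store.put_document("order:1", {"customer": "c42", "total": 99.5})
store.put_kv("session:abc", "c42")
store.add_edge("c42", "order:1")
print(store.neighbors("c42"))               # ['order:1']
```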


Why Increasing Data Maturity Helps Businesses Unlock Digital Potential

Article | July 5, 2022

There is no dispute that brands that harness and invest in data capabilities will be the ones to realize their maximum revenue potential. However, while today's marketers have access to a multitude of data sources, understanding which data to use and how to use it remain two of the biggest challenges. Data utilization across companies is inconsistent: some businesses have sensibly invested in improving their data maturity and can pivot quickly to maximize income potential in an unstable economic environment, while others face a cycle of declining returns as they try to reproduce previous achievements with variable outcomes.

Importance of Data Maturity for Businesses
Understanding your organization's data maturity is critical for five reasons. It helps marketers to:

  • Align - recognize which problems and challenges the wider organization is trying to solve and adapt techniques to support those goals.
  • Appreciate - assess honestly what the company does well today and where adjustments are needed to enable better data decision-making.
  • Evaluate - measure data literacy levels while rolling out training and upskilling resources that foster an open learning environment and encourage innovative thinking.
  • Anticipate - look ahead to significantly more advanced analytics possibilities as the company's data capabilities develop.
  • Calibrate - optimize technology and infrastructure to extract maximum value now while planning appropriately for future resources.

Future-Proof Your Business with Data Maturity
Data maturity applies to the whole organization. It is a company-wide effort that extends beyond the goals of a single sales or marketing team. It is therefore critical to bring diverse internal influencers together to determine how improvements to your data strategy can help everyone achieve the same objectives. The mix of stakeholders is unique to each organization and will be determined by your company's priorities.


Related News


New Relic Announces Support for Amazon VPC Flow Logs on Amazon Kinesis Data Firehose

New Relic | September 17, 2022

New Relic, the observability company, announced support for Amazon Virtual Private Cloud (Amazon VPC) Flow Logs on Amazon Kinesis Data Firehose to reduce the friction of sending logs to New Relic. Amazon VPC Flow Logs is an AWS feature that allows customers to capture information about the IP traffic going to and from network interfaces in their Virtual Private Cloud (VPC). With New Relic support for Amazon VPC Flow Logs, both AWS and New Relic customers can quickly gain a clear understanding of a network's performance and troubleshoot activity without impacting network throughput or latency.

Network telemetry is challenging even for network engineers. To unlock cloud-scale observability, engineers need to explore VPC performance and connectivity across multiple accounts and regions to understand whether an issue started in the network or somewhere else. To solve this, New Relic has streamlined the delivery of Amazon VPC Flow Logs by allowing engineers to send them to New Relic via Kinesis Data Firehose, which reliably captures, transforms, and delivers streaming data to data lakes, data stores, and analytics services. With New Relic's simple "add data" interface, it only takes moments to configure Amazon VPC Flow Logs using the AWS Command Line Interface (AWS CLI) or an AWS CloudFormation template (a sketch follows this article). Instead of digging through raw logs across multiple accounts, any engineer can begin with an Amazon Elastic Compute Cloud (Amazon EC2) instance they own and explore the data that matters, regardless of AWS account or AWS Region.

"New Relic continues to invest in our relationship with AWS. Helping customers gain visibility into their cloud networking environment increases their overall application observability. Our support for Amazon VPC shows our commitment to enhancing our joint customers' observability experience," said Riya Shanmugam, GVP, Global Alliances and Channels at New Relic.

"AWS is delighted to continue our strategic collaboration with New Relic to help customers innovate and migrate faster to the cloud," said Nishant Mehta, Director of PM – EC2 and VPC Networking at AWS. "New Relic's connected experience for Amazon VPC Flow Logs, paired with the simplicity of using Kinesis Data Firehose, enables our joint customers to easily understand how their networks are performing, troubleshoot networking issues more quickly, and explore their VPC resources more readily."

With New Relic support for Amazon VPC Flow Logs on Kinesis Data Firehose, customers can:

  • Monitor and alert on network traffic from within New Relic.
  • Visualize network performance metrics such as bytes and packets per second, as well as accepts and rejects per second across every TCP or UDP port.
  • Explore flow log deviations to look for unexpected changes in network volume or health.
  • Diagnose overly restrictive security group rules or potentially malicious traffic issues.

"Our architecture contains over 200 microservices running on AWS. When something goes wrong, we need to find the root cause quickly to put out what we at Gett call 'fires,'" said Dani Konstantinovski, Global Support Manager at Gett. "With New Relic capabilities we can identify the problem, understand exactly which services were affected, what the reason is, and what we need to do to resolve it. New Relic gives us this observability—it helps us provide better service for our customers."

"Proactively managing customer experience is essential to all businesses that provide part or all of their services through applications. It is therefore essential for engineers to have a clear understanding of their network performance and the data needed to troubleshoot activity before it impacts customers. The quality of that data is also fundamental to making good decisions," said Stephen Elliot, IDC Group Vice President, I&O, Cloud Operations and DevOps. "Solutions that ensure fast delivery of high-quality data give engineers the ability to act quickly and decisively with confidence, saving businesses from the costs associated with negative customer experiences."

About New Relic
As a leader in observability, New Relic empowers engineers with a data-driven approach to planning, building, deploying, and running great software. New Relic delivers the only unified data platform that empowers engineers to get all telemetry—metrics, events, logs, and traces—paired with powerful full-stack analysis tools to help engineers do their best work with data, not opinions. Delivered through the industry's first usage-based consumption pricing that's intuitive and predictable, New Relic gives engineers more value for the money by helping improve planning cycle times, change failure rates, release frequency, and mean time to resolution. This helps the world's leading brands, including Adidas Runtastic, American Red Cross, Australia Post, Banco Inter, Chegg, GoTo Group, Ryanair, Sainsbury's, Signify Health, TopGolf, and World Fuel Services (WFS), improve uptime, reliability, and operational efficiency to deliver exceptional customer experiences that fuel innovation and growth.
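For readers who want to try the configuration step programmatically, here is a hedged sketch using boto3 rather than the AWS CLI. The VPC ID and Firehose delivery stream ARN are placeholders, and the parameter values should be confirmed against current AWS documentation:

```python
# Illustrative sketch: enable VPC Flow Logs delivered to an existing
# Kinesis Data Firehose delivery stream. IDs and ARNs are placeholders.
import boto3

ec2 = boto3.client("ec2", region_name="us-east-1")

response = ec2.create_flow_logs(
    ResourceType="VPC",
    ResourceIds=["vpc-0123456789abcdef0"],        # placeholder VPC ID
    TrafficType="ALL",                            # capture accepts and rejects
    LogDestinationType="kinesis-data-firehose",   # stream to Firehose
    LogDestination=(
        "arn:aws:firehose:us-east-1:123456789012:"
        "deliverystream/new-relic-vpc-flow-logs"  # placeholder stream ARN
    ),
)
print(response["FlowLogIds"])
```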


Informatica Powers HelloFresh’s Data-Driven Expansion

Informatica | August 26, 2022

Informatica®, an enterprise cloud data management leader, today announced that HelloFresh, an integrated food solutions group and the world's leading meal kit company, is leveraging Informatica's data management solutions to improve forecasting, scale to meet demand, and manage data as a strategic asset.

Today HelloFresh operates in 17 countries and delivered 287.3 million meals to 8.5 million customers in Q1 2022. It has seen exponential growth in the five years since it went public, launching in new countries, opening new brands, and acquiring Greenchef, Youfoodz and Factor. This rapid expansion saw the business tackling increasing challenges around the volume, complexity and obscurity of data. Recognising the value of being data-driven, HelloFresh embarked on a company-wide initiative to shift how data is perceived, managed, and used across the business.

Core to this transformation is the global technology department HelloTech, one of the company's fastest-growing teams, having doubled in size in the past two years and now home to almost 1,000 engineers, designers, data engineers, data scientists, product managers, and analysts. This team plays a critical role in democratizing data across the business, enabling teams to make business decisions and deliver product innovations based on solid analysis of millions of data points.

Informatica enables enterprises to manage, own, and derive insights from their data, which made it a good fit for HelloFresh's data-driven journey to treating data as a product. Leveraging Informatica's Enterprise Data Catalog, HelloFresh first increased transparency into its data footprint and then set out to make it discoverable and interoperable. Today, with the help of Informatica's Axon Data Governance solutions, HelloFresh has increased its ability to deliver trusted, relevant data to users across the business, reducing the time from data to insight. This is most noticeable in the domains and teams that embarked on the data governance journey as early adopters.

HelloFresh operates in a data-driven way. Data is not only a resource for technological improvements and scaling the business but also the basis of every team's day-to-day work. All business decisions, including product innovations, are based on the analysis of millions of data points to build a desirable product for customers. The company's intelligent data management approach extends beyond technology: HelloFresh recognized the shift required in organizational mindset and established a Data Literacy Program to change how data is perceived and to embed success criteria for data-related initiatives. There has been great success in moving away from Excel spreadsheets and embedding data directly into business operations.

"Data is HelloFresh's most precious resource and, working with Informatica, we've been able to serve it up as a strategic asset to teams across the business, who use it to accelerate our mission to change the way people eat. By empowering our teams with access to the right data at the right time and the know-how to use it, we can continue to offer customers fresh, healthy, affordable meals tailored to their tastes," said David Castro-Gavino, Global VP Data at HelloFresh.

In celebration of its success in driving a data-led transformation, HelloFresh was recognized as a data innovator in the 2022 Informatica Innovation Awards.

"HelloFresh is a fantastic example of an intelligent data enterprise – harnessing data as a strategic asset to fuel innovation, improve customer service and drive impressive growth," said Jitesh Ghai, Chief Product Officer, Informatica. "We were pleased to recognize the HelloFresh team in our Innovation Awards. We look forward to working together as they continue their data-driven journey, democratizing data at scale across the business to foster fast-paced innovation."

About Informatica
Informatica, an Enterprise Cloud Data Management leader, empowers businesses to realize the transformative power of data. We have pioneered a new category of software, the Informatica Intelligent Data Management Cloud™ (IDMC), powered by AI and a cloud-first, cloud-native, end-to-end data management platform that connects, manages, and unifies data across any multi-cloud, hybrid system, empowering enterprises to modernize and advance their data strategies. Over 5,000 customers in more than 100 countries and 85 of the Fortune 100 rely on Informatica to drive data-led digital transformation.

About HelloFresh
HelloFresh SE is a global food solutions group and the world's leading meal kit company. The HelloFresh Group consists of six brands that provide customers with high-quality food and recipes for different meal occasions. The company was founded in Berlin in November 2011 and operates in the USA, the UK, Germany, the Netherlands, Belgium, Luxembourg, Australia, Austria, Switzerland, Canada, New Zealand, Sweden, France, Denmark, Norway, Italy and Japan. In Q1 2022 HelloFresh delivered 287 million meals and reached 8.52 million active customers. HelloFresh went public on the Frankfurt Stock Exchange in November 2017 and has been traded on the DAX (German Stock Market Index) since September 2021. HelloFresh has offices in New York, Berlin, London, Amsterdam, Sydney, Toronto, Auckland, Paris, Copenhagen, Milan and Tokyo.


Data Engineering Automation Pioneer Nexla Acquires Fidap

Nexla | August 26, 2022

Nexla, a pioneer in data engineering automation whose customers include JPMorgan, Johnson & Johnson, LinkedIn, Doordash, Poshmark, and Instacart, announced today the acquisition of Fidap. Fidap provides clean data for advanced analytics and machine learning. Nexla is a pioneer in data engineering automation that simplifies the process of making data ready to use for enterprise data users. Fidap was started by former Google and Bloomberg product leader Ashish Singal based on the challenges he had seen in getting readily usable external data for machine learning. Fidap was backed by Google's AI-focused fund Gradient Ventures, Engineering Capital, and several key angels including Keenan Rice, Sabrina Hahn, and Ankit Jain. The global big data and data engineering services market is expected to grow to $77.37 billion by 2023, and the Financial Data Service Providers industry is valued at $19.8 billion.

"Nexla has been helping scale data engineering at enterprises through automation. With Fidap's acquisition, we will be solving last-mile challenges in getting ready-to-use data into the hands of data users. It will also broaden the capabilities of Nexsets, our data product solution," said Saket Saurabh, Co-founder and CEO at Nexla.

In addition to data engineering tools that help users make data usable, Nexla will now offer pre-packaged, ready-to-use datasets. These include public datasets such as US Census and Federal Reserve data, and commercial datasets such as equities market data. These datasets will be available in multiple velocities and formats and in major cloud warehouses, live streams, and files in popular cloud storage systems. Nexla originally started with tools to simplify how companies integrate external and third-party data. This acquisition will bring in Fidap's team and technology to help Nexla go a step further with prepared datasets for data scientists and analysts.

"Despite the economic downturn, Nexla has been running a responsible, cash-flow-positive operation for the last fifteen months while maintaining a 300% year-over-year growth rate and a 400% growth rate in its team," says Saurabh. "At a time of industry-wide layoffs and venture capitalists wanting to see a revenue model from the outset, our company has benefited from disciplined execution since day one. We are now leveraging our strategic mergers and acquisitions capabilities to expand both our product portfolio and team."

About Nexla
Nexla is a pioneer in data engineering automation and makes data ready to use for enterprises. Nexla's platform helps teams create scalable, repeatable, and predictable data flows for any data use case. Nexla's customers include JPMorgan, Johnson & Johnson, LinkedIn, Instacart, and Doordash. Analysts, business users, and data engineers across any sector, including e-commerce, insurance, travel, and healthcare, can use Nexla to integrate, automate, and monitor their incoming and outgoing data flows. The end result is predictable and reliable data access inside and outside the organization. In 2021, Nexla was voted a Gartner Cool Vendor for Data Fabric.
