Augmented Analytics: The Next Big Thing in Data Analytics

Aashish Yadav | June 9, 2022

The next significant wave in data and analytics will empower businesses to achieve business value faster, more effectively, and on a far wider scale. The integration of artificial intelligence and predictive analytics is changing the way analytical content is produced, consumed, and shared. Yes, we are talking about augmented analytics. Augmented analytics helps dig deeper into the "why" behind results and produces more accurate predictions.

Augmented Analytics: A Valuable Tech for Businesses



How Augmented Analytics Empowers Marketers in Making Better Decisions and Converting Prospects

Marketing teams all across the globe are battling frozen or declining budgets, yet they are still expected to generate pipeline and grow revenue. The good news is that, thanks to the advantages of augmented analytics, marketers no longer have to depend solely on gut instinct, past experience, estimates, or trial and error. Instead, they can make data-driven marketing decisions based on insights generated by augmented analytics.

  • Augmented analytics uses AI and ML to discover key drivers and helps marketers understand why metrics change (a minimal sketch of this kind of driver analysis follows this list).
  • Augmented analytics provides recommendations and actionable insights to marketers, helping them improve campaign outcomes.
  • Augmented analytics reduces time-to-insight by automatically surfacing actionable insights across customer data points, increasing conversions and win rates.
  • Augmented analytics minimizes manual work and turnaround time on insights, helping marketers recognize the areas of greatest potential and increase ROI on marketing spend.
  • According to Salesforce Research, the top reason marketers embraced AI in 2021 was to drive next-best actions. That is exactly what augmented analytics offers marketers: the ability to provide actionable insights to teams.
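
To make the first bullet concrete, here is a minimal sketch of the kind of key-driver analysis an augmented analytics engine automates under the hood: fit a model to campaign data and rank which features most influence conversion. The dataset, column names, and model choice are illustrative assumptions, not any particular vendor's pipeline.

```python
# Minimal key-driver sketch: rank which campaign features move conversion.
# All data and column names are hypothetical, for illustration only.
import pandas as pd
from sklearn.ensemble import RandomForestClassifier

# Hypothetical campaign-level dataset.
df = pd.DataFrame({
    "email_opens":    [5, 1, 8, 0, 3, 9, 2, 7],
    "ad_impressions": [120, 40, 200, 10, 90, 260, 30, 180],
    "site_visits":    [4, 1, 6, 0, 2, 7, 1, 5],
    "discount_used":  [1, 0, 1, 0, 0, 1, 0, 1],
    "converted":      [1, 0, 1, 0, 0, 1, 0, 1],
})

X, y = df.drop(columns="converted"), df["converted"]
model = RandomForestClassifier(n_estimators=100, random_state=0).fit(X, y)

# Feature importances approximate the "key drivers" of conversion.
drivers = pd.Series(model.feature_importances_, index=X.columns)
print(drivers.sort_values(ascending=False))
```

An augmented analytics platform runs this kind of analysis continuously at scale and translates the resulting ranking into plain-language insights and recommendations.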


Closing Notes

Every day, the rapid proliferation of IoT devices generates massive amounts of data, and AI-powered analytical tools such as augmented analytics can extract the maximum value from it.


OTHER ARTICLES

How is Data Virtualization Shifting the Tailwind in Data Management?

Article | November 16, 2022

Over the past couple of years, a significant rise in digitalization has been witnessed across almost all industries, resulting in the creation of large volumes of data. In addition, an unprecedented proliferation of applications and the rise of social media, cloud and mobile computing, and the Internet of Things have created the need to collect, combine, and curate massive amounts of data. As the importance of data continues to grow across businesses, companies aim to collect data from the web, social media, AI-powered devices, and other sources in different formats, making it trickier for them to manage this unstructured data. Hence, smarter companies are investing in innovative solutions, such as data virtualization, to access and modify data stored across siloed, disparate systems through a unified view. This helps them bring critical decision-making data together, fuel analytics, and make strategic and well-informed decisions.

Why is Data Virtualization Emerging as a New Frontier in Data Management?

In the current competitive corporate world, where data needs are increasing at the same rate as the volume of data companies hold, it is becoming essential to manage and harness data effectively. As enterprises focus on accumulating multiple types of data, the effort of managing it has outgrown the capacity of traditional data integration tools, such as data warehouse software and Extract Transform Load (ETL) systems. With the growing need for more effective data integration solutions, high-speed information sharing, and non-stop data transmission, advanced tools such as data virtualization are gaining massive popularity among corporate firms and other IT infrastructures. Data virtualization empowers organizations to accumulate and integrate data from multiple channels, locations, sources, and formats to create a unified stream of data without any redundancy or overlap, resulting in faster integration and enhanced decision-making.

What are the key features that make data virtualization a new frontier in data management? Let's see:

Modernize Information Infrastructure
With the ability to hide the underlying systems, data virtualization allows companies to replace their old infrastructure with cutting-edge cloud applications without affecting day-to-day business operations.

Enhance Data Protection
Data virtualization enables CxOs to identify and isolate vital source systems from users and applications, which helps organizations prevent unintended changes to the data while enforcing data governance and security.

Deliver Information Faster and Cheaper
Data replication takes time and costs money; the "zero replication" method used by data virtualization allows businesses to obtain up-to-the-minute information without having to invest in additional storage space, thereby saving on operating costs.

Increase Business Productivity
By delivering data in real time, data virtualization empowers businesses to access the most recent data during regular business operations. In addition, it enhances the utilization of servers and storage resources and allows data engineering teams to do more in less time, thereby increasing productivity.

Use Fewer Development Resources
Data virtualization lowers the need for manual coding, allowing developers to focus on the faster delivery of information at scale. With its simplified view-based methodology, data virtualization also enables CxOs to reduce development resources by around one-fourth.

Data Virtualization: The Future Ahead

With the growing significance of data across enterprises and increasing data volume, variety, complexity, and compliance requirements, every organization is looking for well-governed, consistent, and secure data that is easy to access and use. As data virtualization unifies and integrates data from different systems, providing new ways to access, manage, and deliver data without replicating it, more and more organizations are investing in data virtualization software and solutions to drive greater business value from their data.
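
To make the "unified view without replication" idea concrete, here is a toy sketch in Python: two silos (a SQLite database and a flat-file style export) are combined into one logical view at query time, with nothing copied into a warehouse. The schema and values are made up for the demo; production data virtualization platforms do this with federated query engines at enterprise scale, but the principle is the same.

```python
# Toy "virtualized" view: combine two data silos at query time, with no
# copy into a central warehouse. Schema and values are made up for the demo.
import sqlite3
import pandas as pd

# Silo 1: an operational SQLite database (stand-in for a CRM system).
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE customers (customer_id INT, region TEXT, revenue REAL)")
conn.executemany("INSERT INTO customers VALUES (?, ?, ?)",
                 [(1, "EMEA", 1200.0), (2, "APAC", 340.5), (3, "NA", 990.0)])

# Silo 2: an export from another system (stand-in for web analytics).
web = pd.DataFrame({"customer_id": [1, 2, 4], "visits": [14, 3, 7]})

def unified_customer_view() -> pd.DataFrame:
    """Assemble the combined view on demand; sources stay where they live."""
    crm = pd.read_sql("SELECT customer_id, region, revenue FROM customers", conn)
    return crm.merge(web, on="customer_id", how="left")

# Consumers query the view; the underlying systems are never duplicated.
print(unified_customer_view())
```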


How Artificial Intelligence Is Transforming Businesses

Article | July 12, 2022

Whilst many people associate AI with sci-fi novels and films, its reputation as an antagonist in fictional dystopian worlds is becoming a thing of the past as the technology grows more and more integrated into our everyday lives. AI technologies have become increasingly present in our daily lives, not just with Alexas in the home, but also throughout businesses everywhere, disrupting a variety of industries with often tremendous results. The technology has helped streamline even the most mundane of tasks whilst having a breathtaking impact on a company's efficiency and productivity. However, AI has not only transformed administrative processes and freed up more time for companies; it has also contributed to some ground-breaking moments in business, becoming a must-have for many in order to keep up with the competition.


Driving Digital Transformation with RPA, ML and Workflow Automation

Article | July 6, 2022

The rapid pace of technological advancement paves the way for businesses to focus on digital strategy in order to drive effective digital transformation. Digital strategy centers on leveraging technology to enhance business performance, specifying the direction in which organizations can create new competitive advantages. Despite a lot of buzz around the topic, digital transformation initiatives in most businesses are still in their infancy. Organizations that have successfully implemented digital transformation, and are effectively navigating their way toward it, have found that deploying a low-code workflow automation platform makes them more efficient.


AI and Predictive Analytics: Myth, Math, or Magic

Article | February 10, 2020

We are a species invested in predicting the future as if our lives depended on it. Indeed, good predictions of where wolves might lurk were once a matter of survival. Even as civilization made us physically safer, prediction has remained a mainstay of culture, from the haruspices of ancient Rome inspecting animal entrails to business analysts dissecting a wealth of transactions to foretell future sales. With that history in mind, I predict that in 2020 (and the decade ahead) we will struggle if we unquestioningly adopt artificial intelligence (AI) in predictive analytics, founded on an unjustified overconfidence in the almost mythical power of AI's mathematical foundations. This is another form of the disease of technochauvinism I discussed in a previous article.



Related News


ClickHouse Launches Cloud Offering For World’s Fastest OLAP Database Management System

ClickHouse | December 07, 2022

Today, ClickHouse, Inc., creators of the online analytical processing (OLAP) database management system, announced the general availability of their newest offering, ClickHouse Cloud, a lightning-fast cloud-based database that simplifies and accelerates insights and analytics for modern digital enterprises. With no infrastructure to manage, the ClickHouse Cloud architecture decouples storage and compute and scales automatically to accommodate modern workloads, so users do not have to size and tune their clusters to achieve blazing-fast query speeds. This launch includes a host of new product features enhancing the security, reliability, and usability of ClickHouse Cloud.

ClickHouse technology allows a company to turn its data into insights and innovation in near real time, whether it's a bank trying to detect fraud or a streaming service tracking the next blockbuster. With every modern business relying on massive volumes of data, the ability to derive insights in milliseconds from petabytes of data becomes critically important. The launch of ClickHouse Cloud, available both at www.clickhouse.com and through the AWS Marketplace, allows any company to access this technology on demand for the first time.

ClickHouse Cloud is production-ready, with a host of new features including SOC 2 Type II compliance and uptime SLAs for production workloads, along with a public trust center and status page that provide reassurance to customers building mission-critical data apps on the service. Following the successful acquisition of database client Arctype, users will benefit from a new SQL console that will enable them to connect, explore, and query their databases easily.

"The advantage of ClickHouse is speed and simplicity, and ClickHouse Cloud takes that to a new level, enabling businesses to start a service and analyze data at a fraction of the cost of other solutions on the market. In just a few months, the ClickHouse Cloud beta has gained over 100 customers and thousands of new users spanning developers, data analysts, marketing, and other critical areas of business where data is analyzed and stored." - Aaron Katz, CEO of ClickHouse

"ClickHouse Cloud aligns with our desire to empower our developers to go from concept to delivery on real-time analytics use cases in days, speeding up the product innovation cycle drastically. We are thrilled to collaborate with ClickHouse Cloud when it comes to performance, scalability, and security," says Eyal Manor, Chief Product Officer of Twilio.

Over 100 paying customers adopted ClickHouse Cloud during the two-month beta phase, and they are experiencing the ability to focus on developing business-critical data applications without the burden of operations resources. In addition, the serverless, production-ready ClickHouse Cloud offering adds a tier optimized for development use cases. This tier is tuned for smaller workloads and recognizes the importance of lower-cost services for rapid prototyping of new features and data products. A user can now architect, model, and experiment with ClickHouse Cloud in preparation for a full production deployment.

Alongside these product announcements, ClickHouse is delighted to further validate its market opportunity, team, and business model following fresh investment from leading technology investor Thrive Capital as an extension to its Series B. This funding will support further investment in technology and allow ClickHouse to continue building its world-leading team of software engineers.

"ClickHouse offers the most efficient database for fast and large-scale analytics. We have long admired this team and are excited to partner with them as they launch ClickHouse Cloud to an even wider audience." - Avery Klemmer, an investor at Thrive Capital

About ClickHouse: ClickHouse is the world's fastest and most resource-efficient online analytical column-oriented database management system. Now offered as a secure and scalable serverless offering in the cloud, ClickHouse Cloud allows anyone to effortlessly take advantage of efficient real-time analytical processing.
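
For developers who want to kick the tires, here is a minimal connection sketch using the open source clickhouse-connect Python driver. The hostname and credentials are placeholders to be replaced with the ones issued by your ClickHouse Cloud service; this is a generic usage sketch, not part of the announcement.

```python
# Minimal ClickHouse Cloud query sketch using the clickhouse-connect driver
# (pip install clickhouse-connect). Host and credentials are placeholders.
import clickhouse_connect

client = clickhouse_connect.get_client(
    host="your-instance.clickhouse.cloud",  # hypothetical hostname
    port=8443,
    username="default",
    password="your-password",
    secure=True,  # ClickHouse Cloud connections use TLS
)

# A quick sanity check against built-in functions.
result = client.query("SELECT version(), uptime()")
print(result.result_rows)
```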


Infleqtion Unveils SupercheQ, a Quantum Advantage for Distributed Databases

Infleqtion | December 09, 2022

Infleqtion, the global quantum ecosystem leader, today unveiled SupercheQ: Quantum Advantage for Distributed Databases, a scientific advance that extends the power of quantum computation to new applications involving distributed data. The emergence of commercial quantum hardware has been accompanied by new approaches to benchmarking quantum computers. In addition to application-centric benchmarking approaches such as Infleqtion's SupermarQ suite, scientists have developed benchmarks based on sampling from random quantum circuits. These benchmarks, including Quantum Volume, have enabled effective cross-platform comparisons, but until now have been disconnected from specific applications of quantum computers. The launch of SupercheQ changes this by endowing these random circuit sampling experiments with their first application.

"SupercheQ achieves an exponential advantage for one of the most fundamental tasks in distributed computing: checking if two files are identical. We leveraged recent advances in quantum information theory to show that the same families of circuits behind quantum volume can be used to realize this advantage." - Pranav Gokhale, Vice President of Quantum Software at Infleqtion

Gokhale will present SupercheQ on December 8th at Q2B in Santa Clara. Q2B, the world's largest non-academic quantum industry conference, brings together over 1,000 attendees from commercial companies and research institutions around the world. SupercheQ has been experimentally validated by execution on superconducting quantum hardware from IBM Quantum, which also pioneered the Quantum Volume benchmarking metric.

"The development of SupercheQ is an exciting step forward that starts to connect the dots between quality as measured by Quantum Volume and applications," said Jay Gambetta, IBM Fellow and Vice President of IBM Quantum. "The experimental validation on IBM Quantum hardware demonstrates the need for reliable and available hardware to advance quantum and build this industry together."

In addition to the experimental validation on quantum hardware, the team performed large-scale simulations by leveraging NVIDIA GPUs and the cuQuantum software development kit. "The launch of SupercheQ expands the possibilities of the tasks and types of data that can be addressed by a quantum computer," said Tim Costa, Director of HPC and Quantum Computing Products at NVIDIA. "The team's use of NVIDIA GPUs and cuQuantum has enabled them to validate SupercheQ's practical value at the scale of future quantum computers with hundreds of qubits."

SupercheQ has been integrated into SuperstaQ, Infleqtion's flagship cloud quantum software platform, and the technical details behind SupercheQ have now been released in an academic paper, "SupercheQ: Quantum Advantage for Distributed Databases." Customers can get started today with a qiskit-superstaq tutorial notebook. SupercheQ is also in a private release as a new benchmark available in the SupermarQ suite.

The development of SupercheQ, which requires both quantum computers and quantum networks, originates in Infleqtion's platform approach spanning multiple quantum technologies. "We believe that the greatest advances in quantum will arise at the intersection of quantum computing, sensing, networking, and clock technologies," said Paul Lipman, Infleqtion's President of Quantum Information Platforms. "SupercheQ is an exemplar of this approach."

About Infleqtion: Infleqtion is building an ecosystem of quantum technologies and commercial products for today that will drive the company and the entire industry toward tomorrow. The company believes in taking quantum to its limit and leading from the edge. Infleqtion is built on 15 years of pioneering quantum research from ColdQuanta. Its scalable and versatile quantum technology is used by organizations around the globe and deployed by NASA on the International Space Station. Infleqtion is based in Boulder, CO, with offices in Chicago, IL; Madison, WI; Melbourne, AU; and Oxford, UK.
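
To ground the claim, the classical baseline that SupercheQ is measured against looks like the sketch below: two parties exchange fixed-size fingerprints (hashes) of their files instead of the files themselves. This is ordinary classical hashing for illustration, not the SupercheQ protocol; the paper's advance is that quantum fingerprints can perform this equality check with exponentially less communication.

```python
# Classical baseline for the task SupercheQ targets: decide whether two
# (possibly remote) files are identical by exchanging short fingerprints
# rather than the files themselves. Plain SHA-256, not the quantum protocol.
import hashlib

def fingerprint(path: str) -> bytes:
    """Hash a file in chunks so arbitrarily large files fit in memory."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(1 << 20), b""):
            h.update(chunk)
    return h.digest()

# Demo with two local files standing in for replicas held by two parties.
for name, data in [("replica_a.dat", b"same bytes"), ("replica_b.dat", b"same bytes")]:
    with open(name, "wb") as f:
        f.write(data)

# Each party sends only its 32-byte digest; equal digests mean the files
# match (up to astronomically unlikely hash collisions).
print(fingerprint("replica_a.dat") == fingerprint("replica_b.dat"))  # True
```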


Opaque Systems, Pioneer in Confidential Computing, Unveils the First Multi-Party Confidential AI and Analytics Platform

Opaque Systems | December 08, 2022

Opaque Systems, the pioneers of secure multi-party analytics and AI for Confidential Computing, today announced the latest advancements in Confidential AI and Analytics with the unveiling of its platform. The Opaque platform, built to unlock use cases in Confidential Computing, was created by the inventors of the popular MC2 open source project, which was conceived in the RISELab at UC Berkeley. The Opaque Platform uniquely enables data scientists within and across organizations to securely share data and perform collaborative analytics directly on encrypted data protected by Trusted Execution Environments (TEEs). The platform further accelerates Confidential Computing use cases by enabling data scientists to leverage their existing SQL and Python skills to run analytics and machine learning on confidential data, overcoming the data analytics challenges inherent in TEEs due to their strict protection of how data is accessed and used.

The Opaque platform advancements come on the heels of Opaque announcing its $22M Series A funding. Confidential Computing, projected by the Everest Group to be a $54B market by 2026, provides a solution using TEEs, or "enclaves," that encrypt data during computation, isolating it from access, exposure, and threats. However, TEEs have historically been challenging for data scientists due to restricted access to data, a lack of tools that enable data sharing and collaborative analytics, and the highly specialized skills needed to work with data encrypted in TEEs. The Opaque Platform overcomes these challenges by providing the first multi-party confidential analytics and AI solution that makes it possible to run frictionless analytics on encrypted data within TEEs, enables secure data sharing, and, for the first time, enables multiple parties to perform collaborative analytics while ensuring each party only has access to the data they own.

"Traditional approaches for protecting data and managing data privacy leave data exposed and at risk when being processed by applications, analytics, and machine learning (ML) models. The Opaque Confidential AI and Analytics Platform solves this challenge by enabling data scientists and analysts to perform scalable, secure analytics and machine learning directly on encrypted data within enclaves to unlock Confidential Computing use cases." - Rishabh Poddar, Co-founder & CEO, Opaque Systems

"Strict privacy regulations result in sensitive data being difficult to access and analyze," said a Data Science Leader at a top US bank. "New multi-party secure analytics and computational capabilities and Privacy Enhancing Technology from Opaque Systems will significantly improve the accuracy of AI/ML/NLP models and speed insights."

The Opaque Confidential AI and Analytics Platform is designed to ensure that both code and data within enclaves are inaccessible to other users or processes that are collocated on the system. Organizations can encrypt their confidential data on-premises, accelerate the transition of sensitive workloads to enclaves in Confidential Computing Clouds, and analyze encrypted data while ensuring it is never unencrypted during the lifecycle of the computation. Key capabilities and advancements include:

  • Secure, Multi-Party Collaborative Analytics – Multiple data owners can pool their encrypted data together in the cloud and jointly analyze the collective data without compromising confidentiality. Policy enforcement capabilities ensure the data owned by each party is never exposed to other data owners.
  • Secure Data Sharing and Data Privacy – Teams across departments and across organizations can securely share data protected in TEEs while adhering to regulatory and compliance policies. Use cases requiring confidential data sharing include financial crime, drug research, ad targeting monetization, and more.
  • Data Protection Throughout the Lifecycle – Protects all sensitive data, including PII and SHI data, using advanced encryption and secure hardware enclave technology throughout the lifecycle of computation, from data upload to analytics and insights.
  • Multi-tiered Security, Policy Enforcement, and Governance – Leverages multiple layers of security, including Intel® Software Guard Extensions, secure enclaves, advanced cryptography, and policy enforcement to provide defense in depth, ensuring code integrity, data protection, and defense against side-channel attacks.
  • Scalability and Orchestration of Enclave Clusters – Provides distributed confidential data processing across managed TEE clusters, automates cluster orchestration to overcome performance and scaling challenges, and supports secure inter-enclave communication.

Confidential Computing is supported by all major cloud vendors, including Microsoft Azure, Google Cloud, and Amazon Web Services, and by major chip manufacturers, including Intel and AMD.

About Opaque Systems: Commercializing the open source MC2 technology invented at UC Berkeley by its founders, Opaque Systems provides the first collaborative analytics and AI platform for Confidential Computing. Opaque uniquely enables data to be securely shared and analyzed by multiple parties while maintaining complete confidentiality and protecting data end-to-end. The Opaque Platform leverages a novel combination of two key technologies layered on top of state-of-the-art cloud security: secure hardware enclaves and cryptographic fortification. This combination ensures that the overall computation is secure, fast, and scalable. The MC2 technology and Opaque innovation have already been adopted by several organizations, such as Ant Group, IBM, Scotiabank, and Ericsson.
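
As a conceptual illustration of the "encrypt before it leaves your premises" step, the sketch below encrypts a dataset locally with the general-purpose Python cryptography library before upload. This is not Opaque's API, which the announcement does not detail; it only shows the pattern: data stays encrypted everywhere outside the trusted environment, and only code running inside an attested enclave would hold the key to decrypt and process it.

```python
# Conceptual sketch of client-side encryption prior to enclave processing.
# Uses the general-purpose `cryptography` library (pip install cryptography);
# this is NOT Opaque's API, just the pattern: plaintext never leaves the owner.
from cryptography.fernet import Fernet

# The data owner generates and keeps the key; the cloud never sees plaintext.
key = Fernet.generate_key()
cipher = Fernet(key)

sensitive_rows = b"customer_id,balance\n42,1034.50\n43,99.10\n"
encrypted_blob = cipher.encrypt(sensitive_rows)

# `encrypted_blob` is what gets uploaded; only code running inside the
# attested enclave (to which the key is released) could recover the data:
#     plaintext = Fernet(key).decrypt(encrypted_blob)
print(len(encrypted_blob), "bytes of ciphertext ready for upload")
```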
