BIG DATA MANAGEMENT

Penguin Releases Decentralized Data Network for Web 3.0

Penguin | January 03, 2022

The Penguin team recently announced the launch of its decentralized data network for Web 3.0. With the advancement of blockchain technology, innovative new players are entering the market: some bring the offline world to a global audience, while others transform the way we invest in our future. Decentralized applications, DeFi, NFTs, and the Metaverse hold immense potential for future growth and real-world use. What the current crypto arena lacks, however, is an independent, one-stop web service that combines a high-performance smart contract blockchain with a decentralized storage solution. The Penguin network delivers exactly that: a universal decentralized data network designed specifically for Web 3.0.

Penguin - The Decentralized Storage Platform

Designed exclusively for Web 3.0, Penguin is a peer-to-peer network of nodes that jointly provide decentralized storage and communication services. By offering a universal decentralized data network for Web 3.0, the platform can fulfill multiple roles across the blockchain space. Moreover, Penguin aims to work with the blockchain industry to create decentralized applications (dApps), products, and services that are seamlessly accessible in Web 3.0.

A distinctive feature of the platform is automatic scaling: increases in demand for storage space are handled efficiently, which should ultimately lower costs across the blockchain arena. Penguin also provides efficient data storage and fast data retrieval. The network is economically self-sustaining through its native protocol token, PEN, thanks to a built-in smart-contract-based incentive system.
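To make the incentive mechanism concrete, here is a minimal sketch of how smart-contract-based storage rewards of this kind are commonly structured. The ledger class, the PEN-per-gigabyte rate, and the node names below are illustrative assumptions for explanation only, not Penguin's actual protocol.

```python
# Illustrative sketch of a smart-contract-style storage incentive.
# NOT Penguin's actual protocol: the rate, ledger structure, and
# node names are assumptions made for this example.
from dataclasses import dataclass, field


@dataclass
class StorageIncentiveLedger:
    """Tracks PEN credited to nodes for storage they have proven."""
    pen_per_gb_per_day: float = 0.01          # assumed price, not a real rate
    balances: dict[str, float] = field(default_factory=dict)

    def settle(self, node_id: str, gigabytes: float, days: float) -> float:
        """Credit a node once it proves it kept the data for `days`."""
        reward = gigabytes * days * self.pen_per_gb_per_day
        self.balances[node_id] = self.balances.get(node_id, 0.0) + reward
        return reward


ledger = StorageIncentiveLedger()
ledger.settle("node-A", gigabytes=250, days=30)   # credits 75.0 PEN
print(ledger.balances)                            # {'node-A': 75.0}
```

In a real deployment this accounting would live in an on-chain contract and be gated by storage proofs; the sketch only shows the economic shape of the mechanism.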

The platform's stated goal is therefore to extend the blockchain with decentralized storage and communication, positioning itself as a world computer that can serve as both an operating system and a deployment environment for dApps.

Web 3.0 - The Decentralized Internet of the Future

Web 3.0 is not merely a buzzword that the tech, crypto, and venture-capital classes have taken an interest in lately. It aims for a future where distributed users and machines can seamlessly interact with data, value, and other counterparties through peer-to-peer networks, eliminating the need for third parties. It is built primarily on three novel layers of technological innovation: edge computing, decentralized data networks, and artificial intelligence. Built on blockchain, Web 3.0 eliminates the big intermediaries, including centralized governing bodies and repositories.

Moreover, the most significant evolution enabled by Web 3.0 is the minimization of the trust required for coordination on a global scale. It expands the scale and scope of human and machine interactions to an entirely new level, ranging from simple payments to richer information flows and trusted data transfers, all without passing through a fee-charging intermediary.

Web 3.0 enhances today's internet with characteristics such as trustlessness, verifiability, permissionlessness, and self-governance. This is why a permissionless, decentralized blockchain like Penguin plays a pivotal part in developing the so-called "decentralized internet of the future." Decentralized data networks like Penguin make it possible for data generators to store or sell their data without losing ownership control, compromising privacy, or relying on intermediaries.
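A common pattern behind storing data "without losing ownership control" is client-side encryption: the owner encrypts before uploading and keeps the key, so the network only ever handles ciphertext addressed by its hash. The snippet below is a minimal sketch of that general pattern using the `cryptography` package; it is not Penguin's actual upload flow.

```python
# Sketch of owner-controlled storage: the network sees only
# ciphertext and a content address, never the plaintext or the key.
# General pattern only, not Penguin's actual upload flow.
import hashlib
from cryptography.fernet import Fernet  # pip install cryptography

key = Fernet.generate_key()                  # stays with the data owner
ciphertext = Fernet(key).encrypt(b"sensor readings, 2022-01-03")

content_address = hashlib.sha256(ciphertext).hexdigest()
network = {content_address: ciphertext}      # stand-in for the P2P network

# Only the key holder can recover the plaintext.
assert Fernet(key).decrypt(network[content_address]) == b"sensor readings, 2022-01-03"
```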

Blockchain Technology and Web 3.0

Blockchain technology and cryptocurrencies have always been integral to Web 3.0, providing financial incentives for anyone who wants to create, govern, contribute to, or improve projects. Today, the internet needs Web 3.0: a new generation of Internet protocols that facilitates free identity, free contracts, and free assets. Blockchain technology, with its advanced network fundamentals, offers a near-perfect foundation, including built-in smart contracts for self-deployment and access and decentralized addresses as accounts. Penguin, the decentralized data network, provides a readily available, decentralized, private data storage solution for all Web 3.0 developers.

How Does Penguin Benefit the Development of Web 3.0?

Today we live in a data-driven world, where companies collect massive amounts of user data and use it with the intent to deliver value. Data privacy has accordingly become a growing concern over the past few years. The internet ecosystem is now fundamentally rethinking how data privacy and storage are handled; this new phase is referred to as Web 3.0, and it delivers these guarantees by deploying blockchain.

Penguin primarily focuses on data storage with zero downtime. It also features permanent, versionable content storage, zero-error operation, and resistance to intermittent node disconnections.
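Permanent, versionable content storage is typically built on content addressing: each version is stored under the hash of its bytes and points back to its predecessor, so older versions remain retrievable. The sketch below illustrates the idea; Penguin's real chunking and manifest formats may differ.

```python
# Content-addressed version chain: every version keeps a pointer to
# its parent, so history is permanent. Illustrative only; Penguin's
# actual chunk and manifest formats may differ.
import hashlib
import json

store: dict[str, bytes] = {}


def put_version(content: bytes, parent: str | None = None) -> str:
    """Store a new version and return its content address."""
    record = json.dumps({"content": content.decode(), "parent": parent}).encode()
    address = hashlib.sha256(record).hexdigest()
    store[address] = record
    return address


v1 = put_version(b"hello web3")
v2 = put_version(b"hello web3, revised", parent=v1)

# Walking parent pointers recovers the full history.
assert json.loads(store[v2])["parent"] == v1
```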

With exceptional privacy features such as anonymous browsing, deniable storage, untraceable messaging, and file representation formats that leak no metadata, Penguin meets the web's growing demand for security. Penguin also offers continuous service and resilience against outages and targeted attacks. The platform facilitates the creation of a wide range of products, all of which rely on the APIs and SDKs provided by Penguin.

Penguin - An Infrastructure for A Self-Sovereign Society

Penguin is more than just a network; the protocol sets a strong foundation for a market economy around data storage and retrieval. The platform has also entered into a host of strategic partnerships and collaborations with projects and protocols across the DeFi, GameFi, NFT, smart contract, and wider metaverse spaces. Moreover, as a platform for permissionless publication, the Penguin network promotes freedom of information. The platform's design requirements can only be met by the network's native token, PEN.

Among the significant features Web 3.0 offers are the removal of any central point of control (by eliminating intermediaries), complete ownership of data, permissionless information sharing, fewer hacks and data breaches thanks to decentralized data, and interoperability.

Penguin, in turn, aims to build the infrastructure for a self-sovereign society. Through permissionless participation and built-in privacy, Penguin meets the needs of free speech, data sovereignty, and an open network market, while ensuring security through integrity protection, censorship resistance, and attack resilience.

Among its vital meta-values is inclusivity: bringing the underprivileged into the data economy and lowering the barriers to entry for explaining complex data flows and building decentralized applications.

Integrity of the online persona is equally essential. Because Penguin is an open-participation network offering permissionless access to publishing, sharing, and investing in data, users have complete freedom to express their intentions and full authority to decide whether to remain anonymous or share their interactions.

Incentivization ensures that participants' behavior aligns with the network's desired emergent behavior. Finally, impartiality guarantees content neutrality and prevents gatekeeping: it rules out any value that would treat a particular group as privileged or express a preference for specific content or for data from a specific source. Together, these meta-values make Penguin an efficient, decentralized, permissionless data network for Web 3.0.

Penguin’s Future-Proof Design Principles - Meeting the Needs of Web 3.0

The information society and data economy have ushered in an era where online transactions and big data are pivotal to everyday life, making a future-proof supporting technology like Penguin essential. The network offers a strong guarantee of continuity by meeting several general system requirements: a stable and resilient specification and software implementation; scalability to accommodate many orders of magnitude more users and data without loss of performance or reliability, enabling mass adoption; security and resilience against deliberate attacks; and self-sustaining autonomy, independent of human or organizational coordination and of any legal entity's business.

Spotlight

Business Intelligence and Data Analytics sound similar, don't they? From an outsider's perspective, both might serve a similar purpose, but they are used to deliver different outcomes depending on the business requirement. This article focuses on the key differences between Business Intelligence and Data Analytics.

Business Intelligence deals with the strategies and technologies that help end users analyze data and make decisions to grow their business. The earliest recorded use of the term appears in Richard Miller Devens' 'Cyclopaedia of Commercial and Business Anecdotes' (1865), where Devens used it to describe how a banker, Sir Henry Furnese, profited by analyzing his own environment to stay ahead of his competitors.

Data Analytics (or Business Analytics) is a process that helps enterprise users transform raw or unstructured data into a meaningful format. The transformed information can be used to cleanse, transform, or model data to support decision making, derive conclusions, and implement predictive analytics. Data Analytics is a common process employed under various procedures and strategies by organizations around the globe, depending on their business needs.
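As a concrete illustration of the "transform raw data into a meaningful format" step described above, the short Python example below cleanses and aggregates a small dataset with pandas; the column names and figures are invented for demonstration.

```python
# Generic cleanse-and-transform step; the dataset is invented.
import pandas as pd

raw = pd.DataFrame({
    "region": ["North", "North", "South", None],
    "sales":  ["1200", "950", "n/a", "400"],
})

clean = raw.dropna(subset=["region"]).copy()                     # drop unusable rows
clean["sales"] = pd.to_numeric(clean["sales"], errors="coerce")  # "n/a" -> NaN
clean = clean.dropna(subset=["sales"])

# Aggregate into a decision-ready summary (the "meaningful format").
print(clean.groupby("region")["sales"].sum())   # North: 2150.0
```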


Other News
DATA ARCHITECTURE

Databricks Launches Data Lakehouse for Retail and Consumer Goods Customers

Databricks | January 14, 2022

Databricks, the Data and AI company and pioneer of the data lakehouse architecture, today announced the Databricks Lakehouse for Retail, the company's first industry-specific data lakehouse for retailers and consumer goods (CG) customers. With Databricks' Lakehouse for Retail, data teams get a centralized data and AI platform tailored to help solve the most critical data challenges that retailers, partners, and their suppliers are facing. Early adopters include industry-leading customers and partners like Walgreens, Columbia, H&M Group, Reckitt, Restaurant Brands International, 84.51° (a subsidiary of Kroger Co.), Co-Op Food, Gousto, Acosta and more.

"As the retail and healthcare industries continue to undergo transformative change, Walgreens has embraced a modern, collaborative data platform that provides a competitive edge to the business and, most importantly, equips our pharmacists and technicians with timely, accurate patient insights for better healthcare outcomes," said Luigi Guadagno, Vice President, Pharmacy and HealthCare Platform Technology at Walgreens. "With hundreds of millions of prescriptions processed by Walgreens each year, Databricks' Lakehouse for Retail allows us to unify all of this data and store it in one place for a full range of analytics and ML workloads. By eliminating complex and costly legacy data silos, we've enabled cross-domain collaboration with an intelligent, unified data platform that gives us the flexibility to adapt, scale and better serve our customers and patients."

"Databricks has always innovated on behalf of our customers and the vision of lakehouse helps solve many of the challenges retail organizations have told us they're facing," said Ali Ghodsi, CEO and Co-Founder at Databricks. "This is an important milestone on our journey to help organizations operate in real-time, deliver more accurate analysis, and leverage all of their customer data to uncover valuable insights. Lakehouse for Retail will empower data-driven collaboration and sharing across businesses and partners in the retail industry."

Databricks' Lakehouse for Retail delivers an open, flexible data platform, data collaboration and sharing, and a collection of powerful tools and partners for the retail and consumer goods industries. Designed to jumpstart the analytics process, new Lakehouse for Retail Solution Accelerators offer a blueprint of data analytics and machine learning use cases and best practices to save weeks or months of development time for an organization's data engineers and data scientists. Popular solution accelerators for Databricks' Lakehouse for Retail customers include:

Real-time Streaming Data Ingestion: Power real-time decisions critical to winning in omnichannel retail with point-of-sale, mobile application, inventory and fulfillment data.

Demand forecasting and time-series forecasting: Generate more accurate forecasts in less time with fine-grained demand forecasting to better predict demand for all items and stores.

ML-powered recommendation engines: Specific recommendation models for every stage of the buyer journey - including neural network, collaborative filtering, content-based recommendations and more - enable retailers to create a more personalized customer experience.

Customer Lifetime Value: Examine customer attrition, better predict churn behaviors, and segment consumers by lifetime and value with a collection of customer analytics accelerators to help improve decisions on product development and personalized promotions.

Additionally, industry-leading Databricks partners like Deloitte and Tredence are driving lakehouse vision and value by delivering pre-built analytics solutions on the lakehouse platform that address real-time customer use cases. Tailor-made for the retail industry, featured partner solutions and platforms include:

Deloitte's Trellis solution accelerator for the retail industry is one of many examples of how Deloitte and client partners are adopting the Databricks Lakehouse architecture and platform to deliver end-to-end data and AI/ML capabilities in a simple, holistic, and cost-effective way. Trellis provides capabilities that solve retail clients' complex challenges around forecasting, replenishment, procurement, pricing, and promotion services. Deloitte has leveraged its deep industry and client expertise to build an integrated, secure, multi-cloud-ready "as-a-service" solution accelerator on top of Databricks' Lakehouse platform that can be rapidly customized based on a client's unique needs. Trellis has proven to be a game-changer for joint clients, allowing them to focus on critical shifts on both the demand and supply side, with the ability to assess recommendations, associated impact, and insights in real time, resulting in significant improvement to both top-line and bottom-line numbers.

Tredence will meet the explosive enterprise data, AI and ML demand and deliver real-time, transformative industry value by delivering solutions for Lakehouse for Retail. The partnership first launched the On-Shelf Availability (OSA) solution accelerator in August 2021, combining Databricks' data processing capability and Tredence's AI/ML expertise to enable retail, CPG and manufacturing companies to solve their trillion-dollar out-of-stock challenge. Now, with Lakehouse for Retail, Tredence and Databricks will jointly expand the portfolio of industry solutions to address other customer challenges and drive global scale together.

About Databricks
Databricks is the data and AI company. More than 5,000 organizations worldwide — including Comcast, Condé Nast, H&M, and over 40% of the Fortune 500 — rely on the Databricks Lakehouse Platform to unify their data, analytics and AI. Databricks is headquartered in San Francisco, with offices around the globe. Founded by the original creators of Apache Spark™, Delta Lake and MLflow, Databricks is on a mission to help data teams solve the world's toughest problems.
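To give a sense of what fine-grained (per item, per store) demand forecasting means in practice, here is a deliberately simple moving-average sketch with invented sales figures; Databricks' actual accelerators use far richer statistical and ML models.

```python
# Toy per-(store, item) demand forecast via moving average.
# Invented numbers; real accelerators use richer time-series models.
history = {
    ("store-1", "milk"):  [30, 28, 35, 33, 31],
    ("store-1", "bread"): [12, 15, 11, 14, 13],
    ("store-2", "milk"):  [50, 47, 52, 49, 51],
}

forecast = {key: sum(sales) / len(sales) for key, sales in history.items()}
for (store, item), demand in forecast.items():
    print(f"{store}/{item}: expect ~{demand:.1f} units/day")
```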


BIG DATA MANAGEMENT

Money.Net Launches Financial Data and Analytics Platform for Institutions

Money.Net | February 18, 2022

Money.Net, a financial data and analytics company, announced today that it has launched an enhanced platform for institutional users, which provides cost-effective access to professional-grade financial tools. Available immediately, Money.Net supports institutional users throughout the entire investment journey, leveraging next-generation technologies including artificial intelligence (AI) and machine learning to mine big data.

Money.Net provides financial data across an array of asset classes including equities, fixed income, crypto, commodities, foreign exchange, and derivatives/options. Users can access information and analytics via the web and an Excel integration, complete with customizable layouts and live support. The platform is offered at several levels, including Pro, Premium and Enterprise. Users at all levels gain access to real-time market data and news, portfolio monitoring, analysis and charting tools, and the ability to access the Symphony technology platform.

Money.Net, operating under new ownership, has established a new management team led by Vincent Sangiovanni, chief executive officer, and Jason Emerson, chief operating officer. Sangiovanni joins Money.Net from 360T GTX, a Deutsche Börse company, and Emerson joins from Pico Quantitative Trading. With over 50 years of combined experience, the two executives bring to the platform extensive industry knowledge and an intuitive understanding of investors' needs.

"It is a new era for Money.Net. We are bringing together next-generation technology with Money.Net's easy-to-use interface to help users seamlessly research, analyze and support their investment decision-making process. Our team brings broad and deep experience developing multi-asset class solutions for institutional users," said Vincent Sangiovanni, CEO of Money.Net.

As part of the launch, Money.Net has established strategic partnerships to better serve the needs of institutional investors. Cosaic's ChartIQ provides customizable charts that combine intuitive visualization with multi-asset class data to deliver actionable, tradable insights. Symphony provides access to the largest global community in financial services through its secure and compliant communication stack, with chat, voice and video meetings, and file and screen sharing, allowing investors to interact in real time. Trading Central provides AI and machine learning technology to find and validate trading opportunities while managing risk.

"Trading Central has empowered investors with actionable analytics for over two decades, making us a natural partner for Money.Net," says Alain Pellier, CEO of Trading Central. "Their mission to democratize financial research is a perfect fit for our solutions, and we're proud to see our market insights reach more investors through their platform."

"Money.Net has been a long-time ChartIQ client and partner, and we're excited to be working with the new management team as they launch their new institutional platform," says Dan Schleifer, CEO of Cosaic (founded in 2012 as ChartIQ). "Money.Net brings a wealth of market data to its clients, and we're proud that ChartIQ is the data visualization engine powering their new institutional platform."

About Money.Net
Money.Net is a financial data and analytics company serving investors across virtually all asset classes. We help investors research, analyze and monitor financial markets in real time with the support of next-generation technology. Our mission is to empower all investors with cost-effective access to professional-grade financial tools.


BIG DATA MANAGEMENT

Pathr.ai Unveils New Spatial Intelligence Analytics Tools for Retailers to Help Drive In-Store Profitability, Increase ROI

Pathr.ai | January 15, 2022

Pathr.ai, the industry's first and only artificial intelligence (AI) powered spatial intelligence platform, today announced three powerful new spatial intelligence analytics tools focused on helping retailers drive in-store profitability. Pathr.ai's CPG Display Tool, True Conversion Rate Tool, and Brand Effect vs. Location Tool are all designed to deliver previously unavailable business insights that empower retailers to obtain higher levels of revenue and make stronger business decisions for their physical stores.

"Retailers today lack in-store analytics around customer behavior - critical information that can lead to increased profitability and improved business outcomes. We designed our new tools to address some of the most pressing concerns for retailers," said George Shaw, CEO and Founder of Pathr.ai. "In addition to our Brand Effect vs. Location Tool and True Conversion Rate Tool, now, for the first time ever, retailers will be able to assess the effectiveness of CPG brands at their stores with our CPG Display Tool."

Pathr.ai's new tools include:

CPG Display Tool: For the first time, retailers have a solution that helps them assess the effectiveness of CPG brands at their stores, with store-level data to directly measure and maximize the impact of category management efforts. By analyzing shopper traffic and dwell impressions within various store departments, retailers can enhance their strategic CPG brand partnerships by offering them valuable data to improve merchandise placement and marketing promotions. This information also allows CPG brands to better understand how their products perform in different areas of a store and can be a potentially lucrative new data source for retailers.

True Conversion Rate Tool: Allows retailers to quantify group-size dynamics in their locations (e.g., families, couples, or singles) and delivers a more accurate buyer conversion rate. This is a significant departure from how conversion rate is typically calculated, with most retailers counting individuals, not groups. If a family of four enters a retail location, usually only one person from that family pays for a product, not all four. In addition to a more accurate conversion rate, this data can also inform merchandising and in-store promotion initiatives.

Brand Effect vs. Location Tool: Lets retailers assess how effectively their store-within-a-store brands and locations are performing. For example, retailers can leverage this data to understand the full business impact of a store-within-a-store, assessing whether that location drove traffic to other areas or outperformed conventional sections of the store. Retailers can quantify traffic and dwell times around store-within-a-store locations to benchmark rents for each area and guide potential adjustments in location and surrounding store signage to improve performance.

"Spatial intelligence can be a powerful asset to retailers focused on maximizing their profits and improving operational efficiencies critical to their success. We've created our insight tools to empower retailers to make business decisions in an accurate and data-driven way, and ultimately share that insight with their CPG supplier base," said Alan Flohr, Chief Revenue Officer of Pathr.ai.

Pathr.ai integrates and collects data from a retailer's existing camera infrastructure. It measures customer movement inside a physical space anonymously - allowing companies to comply with GDPR and CCPA standards and achieve positive business results in an unbiased way.

About Pathr.ai
Pathr.ai is the industry's first AI-powered spatial intelligence software company that uses anonymous location data from available and existing infrastructure to observe human behavior in any physical space. Its sophisticated technology turns raw behavioral and spatial data from existing sensors into actionable and applied business learnings - allowing companies to drive the business results that matter most to the growth of their companies in real time. Founded in 2019, Pathr.ai is headquartered in Mountain View, California.
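The arithmetic behind the group-adjusted conversion rate is easy to make concrete; the visitor and transaction counts below are invented for illustration.

```python
# Group-based vs. individual-based conversion rate (invented counts).
transactions = 50

individuals = 400                             # a family of four counts as four
per_person_rate = transactions / individuals  # 12.5%

groups = 150                                  # that same family is one buying unit
per_group_rate = transactions / groups        # ~33.3%

print(f"per-person: {per_person_rate:.1%}, per-group: {per_group_rate:.1%}")
```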


BIG DATA MANAGEMENT

Talend Acquires Gamma Soft, a Market Innovator in Change Data Capture

Talend | April 11, 2022

Talend, a global leader in data integration and management, announced today it has acquired Gamma Soft, a market innovator in change data capture (CDC). The addition of Gamma Soft's highly complementary, enterprise-class change data capture technologies will help customers streamline their data modernization initiatives, including cloud migrations, and support advanced, real-time analytics use cases across hybrid and multi-cloud environments.

Today, many organizations rely on brittle, hand-coded integrations, or on multiple data management tools with redundant capabilities across integration, replication, modeling, preparation, quality, cataloging, and governance. With the combination of Talend and Gamma Soft, data professionals will be able to solve more use cases that require support for quickly changing data, faster and more easily than ever, on a single end-to-end solution.

"We are thrilled to welcome the talented Gamma Soft team to Talend. Complementary to our product portfolio, Gamma Soft deepens our already comprehensive integration capabilities and gives us new functionality for enabling advanced, real-time business insight. More broadly, Gamma Soft extends the value we provide customers in helping them quickly build, continually monitor, and easily optimize enterprise-wide data health," said Christal Bemont, CEO, Talend.

Headquartered in Paris, France, Gamma Soft helps companies continuously track and replicate changed data in real time from a source, such as data warehouses, data lakes, and other databases, to a destination without requiring the entire data set to be extracted. This process provides multiple benefits, including streamlining and accelerating cloud data migration projects and enabling real-time business optics to drive everything from supply-chain optimization to fraud detection.

"Change data capture technologies offer speed, accuracy, and agility in data replication that can help businesses successfully optimize their real-time analytics and cloud migration initiatives," said Stewart Bond, Research Director, IDC. "According to our recent market forecast, taking control of dynamic data is a high priority for companies that need to continue their digital transformation and plan for digital resiliency. Bringing Gamma Soft into Talend's product portfolio is a great add for Talend and for its customers."

Véronique Goussard, general manager, Gamma Soft, said, "Joining Talend is a great fit from a product and cultural perspective for Gamma Soft and for our customers. Talend will help take our CDC capabilities to the next level and provide customers with a single, end-to-end solution to successfully execute on data strategies that rely on quickly capturing changing data for analysis in cloud, hybrid or multi-cloud implementations."

About Talend
Talend, a leader in data integration and data management, is changing the way the world makes decisions. Talend Data Fabric is the only platform that seamlessly combines an extensive range of data integration and governance capabilities to actively manage the health of corporate information. This unified approach is unique and essential to delivering complete, clean, and uncompromised data in real time to all employees. It has made it possible to create innovations like the Talend Trust Score™, an industry-first assessment that instantly quantifies the reliability of any data set.
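The core idea of change data capture can be sketched in a few lines: emit only the rows that changed between two states of a table instead of re-extracting the whole data set. The snapshot diff below is a generic illustration; production CDC engines such as Gamma Soft's typically read database transaction logs rather than comparing snapshots.

```python
# Generic CDC sketch: diff two keyed snapshots into change events.
# Illustration only; real CDC engines read transaction logs.

def capture_changes(old: dict[int, dict], new: dict[int, dict]) -> list[dict]:
    events = []
    for key, row in new.items():
        if key not in old:
            events.append({"op": "insert", "key": key, "row": row})
        elif row != old[key]:
            events.append({"op": "update", "key": key, "row": row})
    for key in old.keys() - new.keys():
        events.append({"op": "delete", "key": key})
    return events


before = {1: {"qty": 5}, 2: {"qty": 9}}
after  = {1: {"qty": 5}, 2: {"qty": 7}, 3: {"qty": 1}}
print(capture_changes(before, after))
# [{'op': 'update', 'key': 2, ...}, {'op': 'insert', 'key': 3, ...}]
```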

