Piano | January 19, 2022
Piano, the Digital Experience Cloud, today announced a partnership with Snowflake, the Data Cloud company, to help businesses understand and activate their data at scale. As part of the Powered by Snowflake program, Piano leverages Snowflake as the cloud-based data platform for its sophisticated analytics tool, making it fast and easy to store, query, enrich and securely share data within the Snowflake ecosystem. These advanced capabilities enable real-time, accurate analysis of customer behavior to help organizations drive personalization at scale.
Launched in September 2021, Piano Analytics delivers a powerful analytics solution designed for broad accessibility and manipulation, regardless of an employee's level of data proficiency. By democratizing access to data, businesses can eliminate data silos and ensure all teams, from marketing to sales, data science to operations, are operating from a single source of truth. A core feature of the Piano Analytics solution is its superior data harvesting, which ensures data is clean, privacy-compliant, reliable and never sampled. This reduces risk for businesses and means they're able to confidently chart their path forward using the most accurate information at their disposal.
The Snowflake partnership improves data portability for Piano Analytics customers, who can now use Secure Data Sharing within Snowflake to easily connect their high-quality data into other systems in the Snowflake Data Cloud, such as business intelligence tools or data governance tools. This process, which can be completed in as few as two clicks, eliminates data silos within an organization, ensuring all teams can access and work from the same reliable source of information.
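The announcement does not detail how Piano wires this up, but Snowflake's Secure Data Sharing is normally configured with a short sequence of SQL statements. The sketch below follows Snowflake's documented share syntax; the database, schema, table and consumer account names are invented placeholders, not Piano's actual objects.

```python
# Hypothetical sketch of setting up a Snowflake secure data share.
# The SQL follows Snowflake's documented CREATE SHARE / GRANT / ALTER SHARE
# syntax; all object and account names are placeholder assumptions.
SHARE_SETUP = [
    "CREATE SHARE IF NOT EXISTS analytics_share",
    "GRANT USAGE ON DATABASE piano_analytics TO SHARE analytics_share",
    "GRANT USAGE ON SCHEMA piano_analytics.events TO SHARE analytics_share",
    "GRANT SELECT ON TABLE piano_analytics.events.pageviews TO SHARE analytics_share",
    # The consumer account identifier below is a placeholder.
    "ALTER SHARE analytics_share ADD ACCOUNTS = partner_org.partner_account",
]

def apply_share_setup(cursor):
    """Run the setup against a live connection, e.g. a cursor from the
    snowflake-connector-python package (no connection is opened here, so
    the sketch stays runnable without credentials)."""
    for statement in SHARE_SETUP:
        cursor.execute(statement)
```

Because the share references the data in place rather than copying it, the consumer account queries the same live tables, which is what eliminates the silo.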
Thanks to Snowflake's architecture, Piano Analytics users also benefit from faster querying times, allowing teams to understand and optimize campaigns faster without compromising data quality or integrity.
"We've long admired Snowflake's leading position in the data industry. When we built our Piano Analytics platform on Snowflake, we knew it would dramatically enhance both our capabilities and our user experience. As our relationship continues, we're eager to partner with Snowflake in new ways to revolutionize how organizations work with their data and use it to create superior digital experiences for their customers."
Trevor Kaufman, CEO, Piano
Piano's tools for data analysis and activation are already used by blue chip clients in many industries, including publishing, broadcasting, financial services, travel and more, to understand their audiences and personalize customer experiences.
"The Snowflake and Piano partnership is focused on providing customers with the tools to enhance the customer experience with cutting-edge efficiency and performance," said Colleen Kapase, SVP of Worldwide Partnerships at Snowflake. "Together, we can empower joint customers to drive personalized digital strategies and connect them to other data-driven organizations through the Snowflake Data Cloud."
Learn more about this partnership during Snowflake's Media Data Cloud Summit on January 19, 2022. Piano will discuss how secure data sharing through Snowflake helps organizations put the right data in the hands of every employee.
Piano's Digital Experience Cloud empowers organizations to understand and influence customer behavior. By unifying customer data, analyzing behavior metrics and creating personalized customer journeys, Piano helps brands launch campaigns and products faster, strengthen customer engagement and drive personalization at scale from a single platform. Headquartered in Philadelphia with offices across the Americas, Europe and Asia Pacific, Piano serves a global client base, including Air France, the BBC, CBS, IBM, Kirin Holdings, Jaguar Land Rover, LinkedIn, Nielsen, The Wall Street Journal and more. Piano has been recognized as one of the fastest-growing, most innovative technology companies in the world by World Economic Forum, Red Herring, Inc. and Deloitte.
Qontigo | January 11, 2022
Qontigo, a leading provider of innovative risk, analytics and index solutions, has made available ISS ESG, Clarity AI and Sustainalytics data within its financial optimizer, Axioma Portfolio Optimizer (APO). Sustainalytics will also be integrated into Axioma Portfolio Analytics (APA) for performance attribution and reporting, as well as Axioma Risk Model Machine (RMM), which allows users to create custom risk models.
With this direct integration into Axioma portfolio construction tools, asset managers, wealth managers and asset owners will be able to construct portfolios actively tilting towards a combination of ESG attributes; identify point-in-time ESG exposures; create desired hedges; develop custom risk models that explain risk and return; and run performance attribution based on ESG attributes.
Using its proprietary Risk Entity Framework, Qontigo is able to consolidate and normalize sustainability-linked content from multiple sources, ensuring consistency across multiple asset classes.
"We know that there's a great need for sustainability to inform the investment process and so it's important for us to offer clients a breadth of best-in-class content," said Chris Sturhahn, Chief Product Officer for Analytics at Qontigo. "Clients running strategies with ESG-linked goals – from climate to impact – may already be working with a number of vendors, but they can now benefit from more seamless data integration with leading analytics tools for portfolio construction, risk and reporting, which could result in lower total cost of ownership."
The forthcoming introduction of sustainability content into Axioma's multi-asset class, cloud-native offerings will also enable investment managers to manage their portfolios against a wide spectrum of sustainability-linked investment targets, as well as generate output that supports sustainability-linked management, investor and regulatory requirements.
"We are pleased to offer our market-leading ESG data through the Axioma Portfolio Optimizer, initially starting with a focus on climate and encompassing emissions data, physical and transitional risk metrics, and scenario data across approximately 28,000 issuers," said Dr. Maximilian Horster, Head of ISS ESG, the responsible investment arm of Institutional Shareholder Services Inc. "We are excited by the opportunities this partnership affords mutual clients to access class-leading ESG and risk management solutions through an integrated channel."
"One dimension of bringing societal impact to markets is providing broad, granular and transparent ESG Risk and SDG Impact analysis, and we are excited to bring that to Qontigo and its clients," said Rebeca Minguela, Founder & CEO of Clarity AI, the global, leading sustainability tech platform. "Instead of applying analysts' subjective assessments, we leverage science-based methodologies and proprietary machine learning to deliver reliable and objective data at scale with exceptionally broad coverage. As a digital-native provider, we lead the industry in offering a comprehensive range of SaaS solutions for sustainability assessment, and integrating into Axioma Portfolio Optimizer creates the opportunity for us to illuminate paths to a more sustainable world for an even wider base of clients."
"Sustainalytics is delighted to provide Qontigo's clients with our diverse range of ESG risk and compliance solutions," said Shila Wattamwar, Global Head of Strategic Partnerships at Sustainalytics. "With ESG issues becoming a more central part of the investment decision-making process, investors can now more easily show the ESG risk attributes of their portfolios and report on them by leveraging Sustainalytics' rich ESG solutions in Qontigo's risk and analytics environment. We look forward to building on the success of our relationship with Qontigo and expanding our partnership."
Users of APO, APA and RMM will be able to access:
ISS ESG Daily Multi-Attribute files covering 2 modules across more than 28,000 entities: Climate Core and Climate Impact
Clarity AI ESG Risk ratings as well as SDG Impact metrics, with coverage that extends across 30,000 companies, 135,000 funds, 198 countries and 187 local governments
Sustainalytics Daily Multi-Attribute files covering 7 modules across more than 25,000 entities: ESG Risk Ratings, Product Involvement, Controversial Weapons, Global Standards Screening, Corporate Governance and Controversies
Qontigo currently uses ISS ESG and Sustainalytics datasets for the construction of STOXX and DAX indices.
Penguin | January 03, 2022
Recently, the Penguin team announced the launch of its decentralized data network for Web 3.0. With the advancement of blockchain technology, innovative new players are entering the market. Some are bringing the offline world to a global audience, while others are transforming the way we invest in our future. Decentralized applications, DeFi, NFTs and the Metaverse hold immense potential for future growth and real-world use. But the current crypto arena lacks an independent, one-stop web service that combines a high-performance smart contract blockchain with a decentralized storage solution. The Penguin network delivers a universal decentralized data network designed specifically for Web 3.0.
Penguin - The Decentralized Storage Platform
Designed exclusively for Web 3.0, Penguin is a peer-to-peer network of nodes that jointly provide decentralized storage and communication services. By offering a universal decentralized data network for Web 3.0, the platform can fulfill multiple roles across the blockchain space. Moreover, Penguin aims to work with the blockchain industry to create decentralized applications (DApps), products and services that are seamlessly accessible in Web 3.0.
A unique feature of the platform is automatic scaling: increases in storage demand are handled efficiently, which ultimately lowers costs for the blockchain arena. Penguin also provides efficient data storage and quick data retrieval. The network is economically automated with a native protocol token, PEN, thanks to its built-in smart-contract-based incentive system.
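The announcement does not specify how the PEN incentive system works, but the general idea of a smart-contract-based storage incentive can be sketched in a heavily simplified form: nodes that store and serve chunks accrue token balances, and requesters pay per retrieval. Every name and rule below is an illustrative assumption, not Penguin's actual protocol.

```python
# Purely illustrative sketch of a token-based storage incentive ledger.
# The real Penguin/PEN mechanics are not specified in the announcement;
# the class, pricing rule and accounts here are assumptions.
import hashlib

class IncentiveLedger:
    def __init__(self, price_per_retrieval=1):
        self.balances = {}   # account -> token balance
        self.store = {}      # chunk address -> (data, storing node)
        self.price = price_per_retrieval

    def deposit(self, account, amount):
        self.balances[account] = self.balances.get(account, 0) + amount

    def put(self, node, data: bytes) -> str:
        # Content-addressed storage: the chunk address is a hash of the data.
        address = hashlib.sha256(data).hexdigest()
        self.store[address] = (data, node)
        return address

    def get(self, requester, address) -> bytes:
        # The requester pays the storing node for serving the chunk.
        data, node = self.store[address]
        if self.balances.get(requester, 0) < self.price:
            raise ValueError("insufficient token balance")
        self.balances[requester] -= self.price
        self.balances[node] = self.balances.get(node, 0) + self.price
        return data

ledger = IncentiveLedger()
ledger.deposit("alice", 10)
addr = ledger.put("node-1", b"hello web3")
retrieved = ledger.get("alice", addr)
```

In a real network this accounting would live in on-chain smart contracts rather than a single in-memory object, but the economic loop is the same: storage and bandwidth are paid for in the protocol token, which is what makes the network self-sustaining.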
The platform's stated goal, therefore, is to extend the blockchain with decentralized storage and communication, positioning itself as a world computer that can efficiently serve as an operating system and deployment environment for DApps.
Web 3.0 - The Decentralized Internet of the Future
Web 3.0 is not merely a buzzword that the tech, crypto and venture-capital classes have become interested in lately. It aims to provide a future where distributed users and machines can seamlessly interact with data, value and other counterparties through peer-to-peer networks, eliminating the need for third parties. It is built primarily on three novel layers of technological innovation: edge computing, decentralized data networks and artificial intelligence. Built on blockchain, Web 3.0 eliminates the big intermediaries, including centralized governing bodies and repositories.
Moreover, the most significant evolution enabled by Web 3.0 is the minimization of the trust required for coordination on a global scale. It fundamentally expands the scale and scope of human and machine interactions to an entirely new level, ranging from easy payments to richer information flows and trusted data transfers, all without passing through a fee-charging intermediary.
Web 3.0 enhances today's internet with key characteristics such as trustlessness, verifiability, permissionlessness and self-governance. This is why a permissionless, decentralized blockchain like Penguin plays a pivotal part in developing the so-called "decentralized internet of the future." Decentralized data networks like Penguin make it possible for data generators to store or sell their data without losing ownership control, compromising privacy or relying on intermediaries.
Blockchain Technology and Web 3.0
Blockchain technology and cryptocurrencies have always been an integral part of Web 3.0, providing financial incentives for anyone who wants to create, govern, contribute to or improve projects. The internet today needs Web 3.0, a new generation of Internet protocol that facilitates free identity, free contracts and free assets. Blockchain technology, with its advanced network fundamentals, offers a near-perfect foundation, with built-in smart contracts for self-deployment and access, decentralized addresses as accounts, and more. Penguin, the decentralized data network, provides a readily available, decentralized, private data storage solution for all Web 3.0 developers.
How Does Penguin Benefit the Development of Web 3.0?
Today we live in a data-driven world, where companies often collect massive amounts of user data with the intent of delivering value, and data privacy has become a greater concern over the past few years. Web 3.0 fundamentally rethinks how the internet ecosystem handles concerns like data privacy and storage, and it does so by deploying blockchain.
Penguin primarily focuses on data storage with zero downtime. It also features permanent versionable content storage, zero error operation, and resistance to intermittent disconnection of nodes.
With exceptional privacy attributes like anonymous browsing, deniable storage, untraceable messaging and file representation formats that leak no metadata, Penguin meets the web's growing demand for security. Penguin also offers continuous service and resilience against outages or targeted attacks. The platform facilitates the creation of many products, all of which rely on APIs and SDKs provided by Penguin.
Penguin - An Infrastructure for A Self-Sovereign Society
Penguin is more than just a network; the protocol sets a strong foundation for creating a market economy around data storage and retrieval. The platform has also entered into a host of prospective and strategic partnerships and collaborations with different projects and protocols across the DeFi, GameFi, NFT, smart contract and metaverse spaces. Moreover, as a platform for permissionless publication, the Penguin network promotes freedom of information. The platform's design requirements can only be met by the network's native token, PEN.
Some of the significant features that Web 3.0 offers are zero central point of control by removing intermediaries, complete ownership of data, sharing information in a permissionless manner, reducing hacks and data breaches with decentralized data, and interoperability.
Penguin, for its part, aims to build an infrastructure for a self-sovereign society. Through permissionless participation and privacy, Penguin meets the needs of freedom of speech, data sovereignty and an open network market, while ensuring security through integrity protection, censorship resistance and attack resilience.
One of its vital meta-values is inclusivity: bringing the underprivileged into the data economy by lowering the barriers to entry for understanding complex data flows and building decentralized applications.
Integrity of the online persona is another. Because Penguin is an open-participation network offering services and permissionless access to publishing, sharing and investing in data, users have complete freedom to express their intentions and full authority to decide whether to remain anonymous or share their interactions.
Incentivization, through economic incentives, ensures that participants' behavior aligns with the network's desired emergent behavior. Finally, impartiality guarantees content neutrality and prevents gatekeeping: it rules out treating any particular group as privileged or expressing preference for specific content or data from any specific source. Together, these meta-values make Penguin an efficient, decentralized, permissionless data network for Web 3.0.
Penguin’s Future-Proof Design Principles - Meeting the Needs of Web 3.0
The information society and data economy have ushered in an era where online transactions and big data are pivotal to everyday life, so a future-proof supporting technology like Penguin is essential. The Penguin network offers a strong guarantee of continuity by meeting several general requirements, or system attributes: stable and resilient specifications and software implementations; scalability to accommodate orders of magnitude more users and data without degrading performance or reliability, enabling mass adoption; security and resilience against deliberate attacks; and self-sustaining autonomy, independent of human or organizational coordination or any legal entity's business.
DataRobot | December 17, 2021
DataRobot today announced DataRobot Core, a comprehensive offering that broadens its AI Cloud platform for code-first data science experts. DataRobot also announced its latest platform release, extending the capabilities of AI Cloud for all users with broader and more sophisticated analytical capabilities for data scientists, enhanced decision intelligence, and new features to manage and scale operations in production.
The unprecedented demand for AI, combined with the complexity in delivering AI to production, has created significant delays in data science initiatives for all businesses at a time when AI has never been more vital to business outcomes: 87% of organizations continue to struggle with long deployment timelines, while data scientists spend at least 50% of their time on non-strategic model deployment. To scale quickly and remain agile, data science teams need the tools and product capabilities to deliver high-impact results, faster.
DataRobot Core brings together a complete portfolio of purpose-built capabilities that give data scientists ultimate flexibility in how they deliver AI to the business, enabling faster experimentation and rapid time to value, while making teams more efficient and effective at driving clear business impact from AI:
Platform: Unified environment with first-class, embedded and multilanguage notebook experience; Composable ML to seamlessly pivot between code-first and automated model generation; code-centric pipelines on top of Apache Spark; open API to enable programmatic access to the full AI Cloud platform; and built for the modern enterprise with support for the reliability, governance, compliance and scale needs across industries.
Resources: Extensive portfolio of accelerators, third-party integrations and libraries to expedite AI delivery and drive efficiency, along with evolving education resources to advance skills and enable data scientists to stay at the cutting edge.
Community: Shared knowledge and access to the unique expertise of the DataRobot team, industry experts and thousands of community members from DataRobot customers representing some of the largest and most successful AI implementations in the world.
DataRobot’s team of over 300 data scientists is pioneering efforts in AI, with applied expertise across more than a million active projects for customers across industries on a global scale. Leveraging DataRobot AI Cloud, full-service direct mortgage lender Embrace Home Loans eliminated 43 million lines of code, freeing up their data scientists to build even more complex and strategic solutions.
“DataRobot has been transformational for our business,” said Keith Portman, Chief Analytics Officer at Embrace Home Loans. “DataRobot’s AI Cloud platform enabled us to double our return on marketing investment spend and maintain a notebook-first approach. Our data scientists can now build complex models with flexibility and seamless integration, gaining back hours of time.”
Alongside Core, the launch of DataRobot 7.3 introduces over 80 new features and capabilities designed for all users to enable AI-driven decisions across all lines of business, within a single platform. DataRobot 7.3 offers:
Expanded Support for Diverse Use Cases. Data science teams gain native, out-of-the-box flexibility across data types: users can now run anomaly detection on images and leverage the next generation of Text AI, along with comprehensive tools including Multimodal Clustering, Time-Series Segmented Modeling and Multilabel Classification.
Better, Faster Decisions with Decision Intelligence. Teams can rapidly deploy models that combine complex rules and business logic, post-process prediction scores with simple APIs, and build fully customized AI applications in a matter of minutes with no coding required.
Enhanced Performance Monitoring, Compliance and Regulatory Capabilities. Automated compliance documentation now extends to custom models built outside of DataRobot, streamlining regulation readiness for all users. With all models in production, users can easily evaluate and compare challenger models against live models, and clearly see if a model should be replaced in order to maintain peak performance for the business.
“For organizations today, translating data and AI into tangible outcomes is critical in order to remain competitive and thrive,” said Nenshad Bardoliwalla, Chief Product Officer at DataRobot. “DataRobot Core and 7.3 are designed to meet increasing demand and scale, and empower the largest number of AI creators, from code-centric data science teams to business analysts and decision makers, to experiment fast and collaborate effectively on the same platform. Together, these solutions provide the much-needed flexibility, speed and control that brings trustworthy AI solutions to life for every organization.”
In support of DataRobot Core, DataRobot is also announcing an expanded partnership with AtScale to deliver more comprehensive data access and feature modeling to customers. AtScale brings its semantic layer technology to DataRobot Core, simplifying connections from DataRobot to a broad range of cloud data platforms and providing a powerful modeling canvas for feature engineering. Together, DataRobot and AtScale deliver complete services for organizations to operationalize AI/ML workloads with support for a wide range of data platforms, protocols and visualization platforms.
DataRobot AI Cloud is the next generation of AI. DataRobot's AI Cloud vision is to bring together all data types, all users, and all environments to deliver critical business insights for every organization. DataRobot is trusted by global customers across industries and verticals, including a third of the Fortune 50.