BIG DATA MANAGEMENT

Kyligence’s Intelligent Data Cloud Platform Now Available on Google Cloud

Kyligence | February 16, 2022

Kyligence, originator of Apache Kylin and developer of the AI-augmented data services and management platform Kyligence Cloud, today announced the beta availability of its intelligent data cloud platform on Google Cloud. Kyligence’s one-stop, cloud-native big data OLAP solution helps data analysts and business users quickly discover the business value in the massive amounts of data in the cloud.

Designed and built for today’s cloud, Kyligence Cloud lets enterprise organizations create fast, flexible, and cost-optimized big data analytics applications on a data lake, based on cloud-native compute and storage. With automated model optimization using AI-enhanced semantic modeling based on past analysis, business users can make more informed decisions. Kyligence Cloud integrates seamlessly with Google Cloud Storage to help maximize the use of existing cloud assets.

“This is a significant step in the evolution of both the Kyligence Cloud and the Google Cloud ecosystem. Having Kyligence’s cloud platform available on Google Cloud makes it even more seamless for customers around the world to leverage Kyligence to unlock insights faster from big data.”

Li Kang, vice president, North America, Kyligence

Kyligence’s AI-augmented data services and management platform provides analysts and business users with a unified, governed, and optimized semantic layer. Through multiple interfaces—such as SQL, MDX, and REST APIs—Kyligence Cloud seamlessly connects business applications, popular BI tools, and AI/ML environments, enabling users to work efficiently with familiar tools.
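To make the interface story concrete, here is a minimal sketch of the kind of aggregate query a BI tool might push through such a semantic layer. It runs against an in-memory SQLite database purely for illustration; the schema, table, and column names are invented for this sketch and are not Kyligence's actual API.

```python
# Illustrative only: the kind of OLAP-style rollup a semantic layer
# answers on behalf of a BI tool. Table and column names are invented.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE sales (region TEXT, product TEXT, amount REAL)")
conn.executemany(
    "INSERT INTO sales VALUES (?, ?, ?)",
    [("NA", "widget", 120.0), ("NA", "gadget", 80.0),
     ("EU", "widget", 200.0), ("EU", "gadget", 50.0)],
)

# A typical rollup: total revenue per region, highest first.
rows = conn.execute(
    "SELECT region, SUM(amount) AS revenue "
    "FROM sales GROUP BY region ORDER BY revenue DESC"
).fetchall()
print(rows)  # [('EU', 250.0), ('NA', 200.0)]
```

The value of a governed semantic layer is that every tool issuing a query like this sees the same definitions of "region" and "revenue," rather than each BI tool re-deriving them.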

Kyligence seamlessly connects to native data sources such as Cloud Storage to get the most out of data on Google Cloud, building a comprehensive Google Cloud big data solution. Kyligence Cloud on Google Cloud is currently in beta.

About Kyligence
Founded by the creators of Apache Kylin, Kyligence provides Kyligence Cloud, an intelligent analytics performance layer that sits between data sources and BI tools. Kyligence features an AI-augmented learning engine to ensure peak performance and vastly simplified data modeling. The result is sub-second query response time for BI, SQL, OLAP, and Excel users, even against petabytes of data.

Kyligence is headquartered in San Jose, CA. Investors include Redpoint Ventures, Cisco, China Broadband Capital, Shunwei Capital, Eight Roads Ventures (the proprietary investment arm of Fidelity International Limited), and Coatue Management. Kyligence serves a global customer base that includes UBS, Costa, Appzen, McDonald’s, YUM, L’OREAL, Porsche, Xactly, China Merchants Bank, and China Construction Bank.



Other News
DATA ARCHITECTURE

Fractal expands cloud AI offerings with the acquisition of Neal Analytics

Fractal | January 12, 2022

Fractal, a global provider of artificial intelligence and advanced analytics to Fortune 500® companies, today announced the acquisition of Neal Analytics, a cloud, data, engineering, and AI Microsoft Gold consulting partner, for an undisclosed amount. Neal Analytics strengthens Fractal's AI engineering capabilities and cloud-first offerings on Microsoft's multi-cloud ecosystems, enabling clients to scale AI and power decisions. It also strengthens Fractal's presence in the Pacific Northwest, Canada, and India.

Founded in 2011, Neal Analytics is a Microsoft Gold Consulting Partner supporting cross-industry clients, such as PepsiCo and Microsoft, in their data-driven transformation initiatives. Neal Analytics' 200+ team brings deep expertise on the Azure stack across data science, AI and ML, IoT, edge computing, BI, application development, migration and modernization, and automation. Neal Analytics also has partnerships with Intel, Nvidia, and Databricks.

"We are excited about partnering with Dylan and his talented team at Neal Analytics. They have built a great client-centric, people-oriented culture, and have an impressive track record of solving and scaling AI engineering challenges, especially on the Microsoft platform, for marquee clients. This partnership will accelerate our ability to power data-driven decisions end-to-end for our Fortune 500®-sized clients."

Srikanth Velamakanni, Co-founder and Group Chief Executive, Fractal

Dylan Dias, CEO, Neal Analytics, said, "This is the successful culmination of a thorough, year-long process. Our goals were to find the best long-term home for Neal's 200+ employees, a platform to scale faster, and the ability to play a bigger role in this fast-accelerating space. Fractal was a clear choice. Our culture and vision are 100% aligned. This is an exciting opportunity to empower our people and work alongside like-minded practitioners to transform businesses with cloud, data, and AI. It will enable Neal Analytics and Fractal to grow and achieve more together."

Satish Raman, Chief Strategy Officer, Fractal, said, "Hyperscale cloud infrastructures are enabling organizations to be more agile and improve their business performance to better serve clients. Neal's award-winning experts in Microsoft's Azure and cloud technologies have been helping organizations securely migrate, modernize, and accelerate their business transformation within the cloud for more than a decade. We're delighted that this acquisition of Neal will further strengthen our rapidly growing analytics, AI, and data engineering business in North America."

About Fractal
Fractal is one of the most prominent players in the artificial intelligence space. Fractal's mission is to power every human decision in the enterprise, bringing AI, engineering, and design to the world's most admired Fortune 500® companies. Fractal's products include Qure.ai to assist radiologists in making better diagnostic decisions, Crux Intelligence to assist CEOs and senior executives in making better tactical and strategic decisions, Theremin.ai to improve investment decisions, Eugenie.ai to find anomalies in high-velocity data, Samya.ai to drive next-generation enterprise revenue growth management, and Senseforth.ai to automate customer interactions at scale to grow the top and bottom lines. Analytics Vidhya, part of Fractal, is the largest analytics and data science community, offering industry-focused training programs.

About Neal Analytics
Neal Analytics is a cloud, data, and AI Microsoft Gold Consulting Partner supporting data-driven transformation initiatives from data strategy to solution design, architecture, development, operationalization, and support. Neal leverages Agile methodologies and flexible engagement models to deliver measurable customer value, with a focus on right-sized, pragmatic approaches to digital transformation.

Read More

BIG DATA MANAGEMENT

Civis Analytics Launches Toolkit for a Data-Driven COVID Vaccine Campaign

Civis Analytics | February 23, 2021

Civis Analytics, a data science firm innovating at the intersection of public good and scientific best practices, today announced the launch of its COVID Vaccine Campaign Toolkit. This resource hub includes key information for organizations looking to use data to inform persuasive and equitable COVID vaccination outreach. The toolkit provides guidance on each aspect of an outreach campaign that requires a specific, tailored approach. These include:

• Messaging and messenger: Results from scientific experiments to guide messaging and spokespeople, so the most persuasive language is used for each audience. This includes new research on employer-specific messaging, conducted in partnership with the U.S. Chamber of Commerce Foundation. Crystal Son, MPH, Director of Healthcare Analytics at Civis Analytics, also shares examples of counterintuitive findings, tangible advice for any vaccine campaign team, and comparisons to Civis research on other vaccine messaging.

• Resource allocation: An interactive vaccine hesitancy map that visualizes which U.S. counties are most likely to need additional intervention to drive up vaccination rates. This map can guide groups that need to determine where to focus education or outreach efforts.

• Process: Leveraging best practices from recent 2020 Census outreach campaigns, Civis outlines a five-step process organizations can follow to ensure their vaccine campaign is data-driven and set up for success.

"Vaccine uptake is absolutely critical to herd immunity -- which is what we all need to recover from the pandemic," said Son. "Too often, we assume that a one-size-fits-all approach to public health campaigns can work. We continue to see that the language and messengers that work for one group can backfire with another -- so we really need to eliminate our own biases and rely on the data to guide our campaigns. For many of us, this is the most important campaign of our lifetime, and we need to do this right."

About Civis Analytics
Civis Analytics helps leading public and private sector organizations use data to gain a competitive advantage in how they identify, attract, and engage people. With a blend of proprietary data, technology and advisory services, and an interdisciplinary team of data scientists, developers, and survey science experts, Civis helps organizations stop guessing and start using statistical proof to guide decisions.

Read More

BIG DATA MANAGEMENT

Penguin Releases the Decentralized Data Network for Web 3.0

Penguin | January 03, 2022

Penguin recently announced the launch of its decentralized data network for Web 3.0. With the advancement of blockchain technology, innovative new players are entering the market. Some are bringing the offline world to a global audience, while others transform the way we invest in our future. Decentralized applications, DeFi, NFTs, and the metaverse hold immense potential for future growth and real-world use. But the current crypto arena lacks an independent, one-stop web service that combines a high-performance smart contract blockchain with a decentralized storage solution. The Penguin network brings a universal decentralized data network specifically designed for Web 3.0.

Penguin - The Decentralized Storage Platform
Designed exclusively for Web 3.0, Penguin is a peer-to-peer network of nodes that jointly provides decentralized storage and communication services. By offering a universal decentralized data network for Web 3.0, the platform can fulfill multiple roles across the blockchain space. Penguin aims to work with the blockchain industry to create decentralized applications (DApps), products, and services that are seamlessly accessible in Web 3.0. A distinctive feature of the platform is automatic scaling: increases in demand for storage space are handled efficiently, which should eventually lower costs across the blockchain arena. Penguin also provides efficient data storage and quick data retrieval. The network is economically automated with a native protocol token, PEN, through its built-in smart-contract-based incentive system. The platform's stated goal is to extend the blockchain with decentralized storage and communication, positioning itself as a world computer that can serve as an operating system and deployment environment for DApps.

Web 3.0 - The Decentralized Internet of the Future
Web 3.0 is not merely a buzzword that the tech, crypto, and venture-capital classes have become interested in lately. It aims at a future where distributed users and machines can seamlessly interact with data, value, and other counterparties through peer-to-peer networks, eliminating the need for third parties. It is built largely on three layers of technological innovation: edge computing, decentralized data networks, and artificial intelligence. Built on blockchain, Web 3.0 removes the big intermediaries, including centralized governing bodies and repositories. Its most significant evolution is minimizing the trust required for coordination on a global scale, expanding the scale and scope of human and machine interactions to a new level: from simple payments to richer information flows and trusted data transfers, all without passing through a fee-charging intermediary. Web 3.0 enhances the current internet with characteristics such as trustlessness, verifiability, permissionlessness, and self-governance. This is why a permissionless, decentralized network like Penguin plays a pivotal part in developing the so-called "decentralized internet of the future." Decentralized data networks like Penguin make it possible for data generators to store or sell their data without losing ownership control, compromising privacy, or relying on intermediaries.

Blockchain Technology and Web 3.0
Blockchain technology and cryptocurrencies have always been integral to Web 3.0, providing financial incentives for anyone who wants to create, govern, contribute to, or improve projects. Today the internet needs Web 3.0, a new generation of internet protocol that facilitates free identity, free contracts, and free assets. Blockchain technology, with its advanced network fundamentals, offers a near-perfect foundation: built-in smart contracts for self-deployment and access, decentralized addresses as accounts, and more. Penguin, the decentralized data network, provides a readily available decentralized private data storage solution for all Web 3.0 developers.

How Penguin Benefits the Development of Web 3.0
We live in a data-driven world, where companies collect massive amounts of user data with the intent of delivering value, and data privacy has become a growing concern in recent years. Web 3.0 addresses concerns like data privacy and storage by deploying blockchain. Penguin focuses on data storage with zero downtime, and features permanent, versionable content storage, zero-error operation, and resistance to intermittent node disconnection. With privacy attributes such as anonymous browsing, deniable storage, untraceable messaging, and file representation formats that leak no metadata, Penguin meets the growing demand for security on the web. It also offers continuous service and resilience against outages or targeted attacks. The platform enables the creation of many products, all of which rely on APIs and SDKs provided by Penguin.

Penguin - An Infrastructure for a Self-Sovereign Society
Penguin is more than a network; the protocol lays a strong foundation for a market economy around data storage and retrieval. The platform has also entered a host of strategic partnerships and collaborations with projects and protocols across the DeFi, GameFi, NFT, smart contract, and wider metaverse spaces. As a platform for permissionless publication, the Penguin network promotes information freedom, and its design requirements can only be met by the network's native token, PEN. Web 3.0 promises zero central points of control by removing intermediaries, complete ownership of data, permissionless information sharing, fewer hacks and data breaches through decentralized data, and interoperability. Penguin, in turn, aims to build an infrastructure for a self-sovereign society: through permissionlessness and privacy, it meets the needs of freedom of speech, data sovereignty, and an open network market, ensuring security through integrity protection, censorship resistance, and attack resilience.

Penguin's meta values are: inclusivity, the need to include the underprivileged in the data economy by lowering the barrier of entry to explain complex data flows and build decentralized applications; integrity of the online persona, since Penguin is a network with open participation offering permissionless access to publishing, sharing, and investing data, so users have complete freedom to express their intentions and full authority to decide whether to remain anonymous or share their interactions; incentivization, where economic incentives ensure that participants' behavior aligns with the network's desired emergent behavior; and impartiality, which guarantees content neutrality and prevents gatekeeping, ruling out values that would privilege any particular group or express preference for specific content or data from a specific source. Together, these meta values make Penguin an efficient decentralized, permissionless data network for Web 3.0.

Penguin's Future-Proof Design Principles - Meeting the Needs of Web 3.0
The information society and data economy have ushered in an era where online transactions and big data are pivotal to everyday life, so a future-proof supporting technology like Penguin is essential. The network offers a strong guarantee of continuity by meeting several general system requirements: a stable and resilient specification and software implementation; scalability to accommodate many orders of magnitude more users and data without degrading performance or reliability; security and resilience against deliberate attacks; and self-sustaining autonomy, independent of human or organizational coordination or any legal entity's business.

Read More

DATA ARCHITECTURE

Databricks Launches Data Lakehouse for Retail and Consumer Goods Customers

Databricks | January 14, 2022

Databricks, the data and AI company and pioneer of the data lakehouse architecture, today announced Databricks Lakehouse for Retail, the company's first industry-specific data lakehouse for retail and consumer goods (CG) customers. With Databricks' Lakehouse for Retail, data teams gain a centralized data and AI platform tailored to help solve the most critical data challenges facing retailers, partners, and their suppliers. Early adopters of Databricks' Lakehouse for Retail include industry-leading customers and partners like Walgreens, Columbia, H&M Group, Reckitt, Restaurant Brands International, 84.51° (a subsidiary of Kroger Co.), Co-op Food, Gousto, Acosta, and more.

"As the retail and healthcare industries continue to undergo transformative change, Walgreens has embraced a modern, collaborative data platform that provides a competitive edge to the business and, most importantly, equips our pharmacists and technicians with timely, accurate patient insights for better healthcare outcomes," said Luigi Guadagno, Vice President, Pharmacy and HealthCare Platform Technology at Walgreens. "With hundreds of millions of prescriptions processed by Walgreens each year, Databricks' Lakehouse for Retail allows us to unify all of this data and store it in one place for a full range of analytics and ML workloads. By eliminating complex and costly legacy data silos, we've enabled cross-domain collaboration with an intelligent, unified data platform that gives us the flexibility to adapt, scale and better serve our customers and patients."

"Databricks has always innovated on behalf of our customers and the vision of lakehouse helps solve many of the challenges retail organizations have told us they're facing," said Ali Ghodsi, CEO and Co-Founder at Databricks. "This is an important milestone on our journey to help organizations operate in real-time, deliver more accurate analysis, and leverage all of their customer data to uncover valuable insights. Lakehouse for Retail will empower data-driven collaboration and sharing across businesses and partners in the retail industry."

Databricks' Lakehouse for Retail delivers an open, flexible data platform; data collaboration and sharing; and a collection of powerful tools and partners for the retail and consumer goods industries. Designed to jumpstart the analytics process, new Lakehouse for Retail Solution Accelerators offer a blueprint of data analytics and machine learning use cases and best practices to save weeks or months of development time for an organization's data engineers and data scientists. Popular solution accelerators for Databricks' Lakehouse for Retail customers include:

• Real-time streaming data ingestion: Power real-time decisions critical to winning in omnichannel retail with point-of-sale, mobile application, inventory, and fulfillment data.

• Demand forecasting and time-series forecasting: Generate more accurate forecasts in less time with fine-grained demand forecasting to better predict demand for all items and stores.

• ML-powered recommendation engines: Recommendation models for every stage of the buyer journey - including neural network, collaborative filtering, content-based recommendations, and more - enable retailers to create a more personalized customer experience.

• Customer lifetime value: Examine customer attrition, better predict churn behaviors, and segment consumers by lifetime and value with a collection of customer analytics accelerators that help improve decisions on product development and personalized promotions.
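As a toy illustration of what "fine-grained" demand forecasting means, the sketch below fits one trivially simple moving-average forecast per (store, item) pair rather than a single aggregate model. The data, names, and method are invented for this sketch; production accelerators of this kind use far richer time-series models.

```python
# Fine-grained forecasting, illustrated: one forecast per (store, item)
# pair instead of one model for the whole chain. Data is invented.

# (store, item) -> weekly unit sales, most recent last
history = {
    ("store_1", "sku_A"): [10, 12, 11, 13],
    ("store_1", "sku_B"): [5, 4, 6, 5],
    ("store_2", "sku_A"): [20, 22, 21, 23],
}

def moving_average_forecast(series, window=3):
    """Forecast the next period as the mean of the last `window` points."""
    tail = series[-window:]
    return sum(tail) / len(tail)

# One independent forecast per store/item combination.
forecasts = {key: moving_average_forecast(series)
             for key, series in history.items()}
print(forecasts[("store_1", "sku_A")])  # (12 + 11 + 13) / 3 = 12.0
```

The point of the grain, not the model, is what matters here: forecasting each store/item series separately captures local patterns that a chain-wide aggregate would average away, at the cost of fitting many more (small) models.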
Additionally, industry-leading Databricks partners like Deloitte and Tredence are driving lakehouse vision and value by delivering pre-built analytics solutions on the lakehouse platform that address real-time customer use cases. Tailor-made for the retail industry, featured partner solutions and platforms include:

Deloitte's Trellis solution accelerator for the retail industry is one of many examples of how Deloitte and client partners are adopting the Databricks Lakehouse architecture and platform to deliver end-to-end data and AI/ML capabilities in a simple, holistic, and cost-effective way. Trellis provides capabilities that solve retail clients' complex challenges around forecasting, replenishment, procurement, pricing, and promotion services. Deloitte has leveraged its deep industry and client expertise to build an integrated, secure, multi-cloud-ready "as-a-service" solution accelerator on top of Databricks' Lakehouse platform that can be rapidly customized to each client's unique needs. Trellis has proven to be a game-changer for joint clients, allowing them to focus on the critical shifts occurring on both the demand and supply side, with the ability to assess recommendations, associated impact, and insights in real time, resulting in significant improvement to both top-line and bottom-line numbers.

Tredence will meet the explosive enterprise data, AI, and ML demand and deliver real-time, transformative industry value by delivering solutions for Lakehouse for Retail. The partnership first launched the On-Shelf Availability (OSA) solution accelerator in August 2021, combining Databricks' data processing capability and Tredence's AI/ML expertise to enable retail, CPG, and manufacturing customers to solve their trillion-dollar out-of-stock challenge. Now, with Lakehouse for Retail, Tredence and Databricks will jointly expand the portfolio of industry solutions to address other customer challenges and drive global scale together.

About Databricks
Databricks is the data and AI company. More than 5,000 organizations worldwide — including Comcast, Condé Nast, H&M, and over 40% of the Fortune 500 — rely on the Databricks Lakehouse Platform to unify their data, analytics and AI. Databricks is headquartered in San Francisco, with offices around the globe. Founded by the original creators of Apache Spark™, Delta Lake and MLflow, Databricks is on a mission to help data teams solve the world's toughest problems.

Read More
