BIG DATA MANAGEMENT

DAS42 and AtScale Partner to Deliver Advanced Data Technology Solutions

DAS42 | October 19, 2021

DAS42, a provider of FullStack data technology implementation and advisory services, and AtScale, the leading provider of semantic layer solutions for modern business intelligence and data science teams, today announced a partnership to deliver innovative solutions to enterprises implementing modern data platforms.

“Creating a standardized and centralized source of truth for metrics and business definitions is a central part of our FullStack approach for helping companies build modern data analytics environments and data-centric cultures,” said DAS42 CEO Nick Amabile. “AtScale provides a world-class semantic layer solution to make data, models, and analysis more accessible, consistent, and secure across organizations. We’re excited to be joining with them to help clients get the most out of their data – quickly and efficiently.”

“DAS42 has an excellent reputation for helping some of the world’s leading organizations to uncover, understand, and best utilize data across their organizations, improving decision-making at all levels. The combination of our semantic layer technology and DAS42’s expertise makes a powerful package for organizations looking to scale their data use while maintaining consistency and establishing control and governance over that data.”

David Mariani, CTO and Founder of AtScale
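
The centralized metric definitions Amabile describes can be pictured as a shared registry that every downstream dashboard reads from instead of re-deriving its own numbers. The Python sketch below is purely illustrative and assumes invented metric names and fields; it is not AtScale's actual model format.

```python
# Illustrative only: a centralized metric registry, the kind of shared
# definition a semantic layer exposes so every BI tool computes
# "net_revenue" identically instead of re-deriving it per report.
METRICS = {
    "net_revenue": {
        "expression": "SUM(order_amount) - SUM(refund_amount)",
        "grain": ["order_date", "region"],  # valid aggregation levels
        "owner": "finance",
    },
    "active_customers": {
        "expression": "COUNT(DISTINCT customer_id)",
        "grain": ["order_date"],
        "owner": "analytics",
    },
}

def metric_sql(name: str) -> str:
    """Return the single agreed-upon SQL expression for a metric."""
    return METRICS[name]["expression"]

print(metric_sql("net_revenue"))
```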

About DAS42
DAS42 is a leading provider of cloud-based data analytics consulting and professional services. Based in New York and with offices across the United States, our clients include some of the world’s largest companies. We work with cutting-edge technology partners to help organizations use data to improve their operations, reduce the time to actionable insights, and empower them to make better decisions, faster.

About AtScale
AtScale enables smarter decision-making by accelerating the flow of data-driven insights. The company’s semantic layer platform simplifies, accelerates, and extends business intelligence and data science capabilities for enterprise customers across all industries. With AtScale, customers are empowered to democratize data, implement self-service BI and build a more agile analytics infrastructure for better, more impactful decision making.

Spotlight

By centering your database and data analytics workload development and deployment on Kubernetes-based containers, you can create a more efficient and speedy data life cycle. Access this white paper to learn how to improve key capabilities for database and data analytics workloads across hybrid cloud environments.


Other News
BIG DATA MANAGEMENT

ListenFirst Announces BI Connector For Tableau

ListenFirst | January 13, 2022

ListenFirst, the premier enterprise social analytics solution, today announced the launch of BI Connector for Tableau, giving customers an interface that connects directly to the ListenFirst API and provides on-demand access to data within their Tableau dashboards. The new feature introduces codeless integration and enables ListenFirst customers to tell a multitude of stories by manipulating and layering different insights within Tableau's 24 visualization types and reporting customizations.

Available in Tableau's app directory, ListenFirst's BI Connector for Tableau creates a quicker, more seamless way to pull data in bulk without needing frontend access. Reporting cycles are readily available in Tableau through dynamic date ranges that automatically refresh the latest insights for the desired dates and parameters.

"More than half of ListenFirst API customers use Tableau, and BI Connector for Tableau creates a quick path to insights for these data power users," explained Jimmy Li, Product Manager at ListenFirst. "Users can now effortlessly conduct cross-data-source analysis, on-demand reporting, advanced charting, and data visualization. Additionally, insights can be published side by side with other data points from non-ListenFirst sources, simplifying complex stories into insights that are instantly understandable."

Optimized for self-service, BI Connector for Tableau supports flexible data queries for brand-, content-, and paid-level datasets.

About ListenFirst
ListenFirst is the premier social analytics solution used by the world's leading brands. With a breadth of data and award-winning expertise unmatched in the market, we offer an easy, one-stop solution to optimize social media marketing and maximize ROI. ListenFirst has been honored with multiple accolades, including a 2020 SIIA CODiE Award for Best Emerging Technology, a 2020 Cynopsis AdTech Award for Outstanding Data Solution, 2022 High Performer recognition from G2 Crowd, and a MarTech Breakthrough Award for Best Social Media Monitoring Software, and it has been named one of the Inc. 500 fastest-growing companies. Founded in 2012, ListenFirst is trusted by leading global brands including AT&T, Amazon, NBCUniversal, and Peloton.
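
The connector's core mechanics, polling a REST API over a dynamic date range so dashboards always reflect the latest window, can be sketched briefly. The Python snippet below is a minimal illustration under assumptions: the endpoint URL, parameter names, and authentication scheme are hypothetical placeholders, not the actual ListenFirst API.

```python
from datetime import date, timedelta

import requests

# Placeholders only; the real ListenFirst API endpoint and auth differ.
API_URL = "https://api.example.com/v1/brand-metrics"
API_KEY = "YOUR_API_KEY"

def fetch_last_n_days(n: int = 30) -> dict:
    """Fetch metrics for a dynamic date range ending today, so a
    scheduled refresh always pulls the most recent window."""
    end = date.today()
    start = end - timedelta(days=n)
    resp = requests.get(
        API_URL,
        headers={"Authorization": f"Bearer {API_KEY}"},
        params={"start_date": start.isoformat(), "end_date": end.isoformat()},
        timeout=30,
    )
    resp.raise_for_status()
    return resp.json()

if __name__ == "__main__":
    print(fetch_last_n_days(7))
```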


BIG DATA MANAGEMENT

Voxco Launches Voxco Intelligence, a No-code Data Analytics Platform to Fuel the Future of Customer Insights

Voxco | April 06, 2022

Voxco, the actionable insights platform, today announced an extension to its existing survey research platform with the launch of Voxco Intelligence. The launch comes at a time when the pandemic has transformed the way Voxco does business, with an ever-growing number of organisations realising the importance of using digital platforms to better serve their customers. After serving several major players in the retail, automotive, and finance industries, Voxco Intelligence (previously Actify by Voxco) will now be available to organisations globally.

The new offering, Voxco Intelligence, is a no-code data analytics platform that will help organisations unlock the true potential of customer data using predictive analytics and AI and machine learning models. Voxco Intelligence enables businesses to understand customers faster, uncover hidden insights, and make effective decisions. Voxco's existing omnichannel survey capabilities and Voxco Audience (its global panel aggregation platform) will be integrated as one offering under Voxco Research. Voxco Intelligence complements Voxco Research: combined, the two ensure a seamless end-to-end solution for enterprises looking to gather feedback, measure sentiment, uncover insights, and act on them. It enables organisations to fuel experiences, foster loyalty, and maximise customer lifetime value.

"Most organisations struggle with implementing customer-centric solutions due to the poor quality of data they have. Often, they also lack the technical expertise that's required to make sense of their data. Voxco Intelligence, with its AI and ML capabilities, helps them unlock their true growth potential by unifying and analysing huge volumes of siloed data, developing actionable intelligence, and enabling business transformations."

Sumit Aneja, CEO, Voxco

Transform experiences and survey research with Voxco Intelligence's core capabilities:

Single Source of Truth: Gather customer data from multiple data sources and interactive channels, filter out fraudulent data, and integrate and standardise it to create a complete 360-degree view of your customers.

Predictive Insights: Analyse omnichannel customer data to understand customer needs, measure emotion, predict next behaviours, and forecast business metrics in real time.

Advanced Analytics: Use text analytics to identify and prioritise the most pressing issues, analysing the underlying satisfaction drivers to understand customer sentiment and behaviour.

Real-Time Actions: Combine AI and ML to recommend high-value actions to relevant teams in real time.

Voxco Intelligence also enhances efficiency through automation of manual tasks, standardisation of data for easy analysis, and improved data visibility across levels.

About Voxco
Voxco, a leading actionable insights platform, helps the world's leading brands make data-driven decisions to drive growth and fuel omnichannel experiences. Using Voxco, organisations can foster loyalty, increase customer lifetime value, and enhance risk management, delivering exceptional returns on investment. More than 500 market research organisations, governments and government agencies, universities, and global corporations use Voxco to gather data, measure sentiment, uncover insights, and act on them.
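
To make the "single source of truth" capability concrete, here is a minimal sketch in Python, assuming two hypothetical data sources, of merging customer records into one 360-degree view and deriving a naive churn-risk flag. The column names and the hand-written rule are illustrative stand-ins for the trained AI and ML models a platform like Voxco Intelligence applies.

```python
import pandas as pd

# Two hypothetical siloed sources keyed on customer_id.
crm = pd.DataFrame({
    "customer_id": [1, 2, 3],
    "lifetime_value": [1200.0, 300.0, 870.0],
})
surveys = pd.DataFrame({
    "customer_id": [1, 2, 3],
    "satisfaction": [9, 4, 7],  # 0-10 survey score
})

# Integrate and standardise into a single 360-degree customer view.
view = crm.merge(surveys, on="customer_id", how="outer")

# A naive rule standing in for a trained churn-prediction model.
view["churn_risk"] = (view["satisfaction"] < 5) & (view["lifetime_value"] < 500)
print(view)
```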


BIG DATA MANAGEMENT

Penguin Releases the Decentralized Data Network for Web3.0

Penguin | January 03, 2022

The Penguin team recently announced the launch of its decentralized data network for Web 3.0. With the advancement of blockchain technology, innovative new players are entering the market. Some are bringing the offline world to a global audience, while others transform the way we invest in our future. Decentralized applications, DeFi, NFTs, and the Metaverse hold immense potential for future growth and real-world uses. But what the current crypto arena lacks is an independent, one-stop web service that includes a high-performance smart contract blockchain together with a decentralized storage solution. The Penguin network brings a universal decentralized data network specifically designed for Web 3.0.

Penguin - The Decentralized Storage Platform

Designed exclusively for Web 3.0, Penguin is a peer-to-peer network of nodes that jointly provides decentralized storage and communication services. By offering a universal decentralized data network for Web 3.0, the platform can fulfill multiple roles in different areas of the blockchain space. Moreover, Penguin aims to work with the blockchain industry to create decentralized applications (DApps), products, and services seamlessly accessible in Web 3.0.

A unique feature of the platform is automatic scaling: increases in demand for storage space are handled efficiently, which will eventually lower costs across the blockchain arena. Penguin also facilitates efficient data storage and quick data retrieval. The network is economically automated with a native protocol token, PEN, thanks to its built-in smart-contract-based incentive system. The purported goal of the platform is therefore to extend the blockchain by utilizing decentralized storage and communication, positioning itself as a world computer that can efficiently serve as an operating system and deployment environment for DApps.

Web 3.0 - The Decentralized Internet of the Future

Web 3.0 is not merely a buzzword that the tech, crypto, and venture-capital classes have become interested in lately. It aims to provide a future where distributed users and machines can seamlessly interact with data, value, and other counterparties through peer-to-peer networks, eliminating the need for third parties. It is built largely on three novel layers of technological innovation: edge computing, decentralized data networks, and artificial intelligence.

Web 3.0, built on blockchain, eliminates the big intermediaries, including centralized governing bodies and repositories. The most significant evolution enabled by Web 3.0 is the minimization of the trust required for coordination on a global scale. It fundamentally expands the scale and scope of human and machine interactions, ranging from easy payments to richer information flows and trusted data transfers, all without passing through a fee-charging intermediary. Web 3.0 enhances the current internet with characteristics like trustlessness, verifiability, permissionlessness, and self-governance. This is why a permissionless, decentralized blockchain like Penguin plays a pivotal part in developing the so-called "decentralized internet of the future." Decentralized data networks like Penguin make it possible for data generators to store or sell their data without losing ownership control, compromising privacy, or relying on intermediaries or go-betweens.

Blockchain Technology and Web 3.0

Blockchain technology and cryptocurrencies have always been an integral part of Web 3.0. They provide financial incentives for anyone who wants to create, govern, contribute to, or improve projects. Today the internet needs Web 3.0, a new generation of internet protocol that facilitates free identity, free contracts, and free assets. Blockchain technology, with its advanced network fundamentals, offers a near-perfect solution, with built-in smart contracts for self-deployment and access, decentralized addresses as accounts, and more. Penguin, the decentralized data network, provides a readily available, decentralized, private data storage solution for all Web 3.0 developers.

How Does Penguin Benefit the Development of Web 3.0?

Today we live in a data-driven world, where companies often collect massive amounts of user data and use this data with the intent to deliver value. Data privacy has become a greater concern over the past few years, and the internet ecosystem is fundamentally changing to address concerns like data privacy and storage. This new phase is referred to as Web 3.0, and it addresses those concerns by deploying blockchain.

Penguin primarily focuses on data storage with zero downtime. It also features permanent, versionable content storage, zero-error operation, and resistance to intermittent disconnection of nodes. With exceptional privacy attributes like anonymous browsing, deniable storage, untraceable messaging, and file representation formats that leak no metadata, Penguin meets the growing demand for security on the web. Penguin also offers continuous service and resilience against outages or targeted attacks. The platform facilitates the creation of many products, all of which rely on the APIs and SDKs provided by Penguin.

Penguin - An Infrastructure for a Self-Sovereign Society

Penguin is more than just a network; the protocol sets a strong foundation for creating a market economy around data storage and retrieval. The platform has also entered into a host of prospective and strategic partnerships and collaborations with different projects and protocols in the DeFi, GameFi, NFT, smart contract, and other metaverse spaces. Moreover, as a platform for permissionless publication, the Penguin network promotes freedom of information. The platform's design requirements can only be met by the network-native token, PEN.

Some of the significant features that Web 3.0 offers are zero central points of control through the removal of intermediaries, complete ownership of data, permissionless information sharing, fewer hacks and data breaches thanks to decentralized data, and interoperability. Penguin, for its part, aims to build an infrastructure for a self-sovereign society. Through permissionless participation and privacy, Penguin meets the needs of freedom of speech, data sovereignty, and an open network market, while ensuring security through integrity protection, censorship resistance, and attack resilience.

Some of its vital meta-values are:

Inclusivity: include the underprivileged in the data economy by lowering the barrier of entry to explaining complex data flows and building decentralized applications.

Integrity: the integrity of the online persona is necessary. Because Penguin is a network with open participation that offers permissionless access to publishing, sharing, and investing in your data, users have complete freedom to express their intentions and full authority to decide whether to remain anonymous or share their interactions.

Incentivization: economic incentives ensure that participants' behavior aligns with the network's desired emergent behavior.

Impartiality: content neutrality prevents gatekeeping, ruling out any value system that treats a particular group as privileged or expresses a preference for specific content or for data from any specific source.

These meta-values make Penguin an efficient, decentralized, permissionless data network for Web 3.0.

Penguin's Future-Proof Design Principles - Meeting the Needs of Web 3.0

The information society and data economy have ushered in an era where online transactions and big data are pivotal to everyday life, so a future-proof supporting technology like Penguin is essential. The network offers a strong guarantee of continuity by meeting several general requirements, or system attributes:

Stability: a stable, resilient specification and software implementation.

Scalability: the ability to accommodate many orders of magnitude more users and data without loss of performance or reliability, enabling mass adoption.

Security: resilience against deliberate attacks.

Self-sustainability: Penguin is an autonomous solution, independent of human or organizational coordination and of any legal entity's business.
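
The storage model described above is content-addressed: chunks are written and read by cryptographic hash, which is what makes integrity protection and trustless retrieval possible. The Python toy below illustrates only that general pattern; the chunk size, the in-memory dict standing in for peer nodes, and the function names are assumptions, not the actual Penguin protocol or SDK.

```python
import hashlib

CHUNK_SIZE = 4096  # bytes; a real network fixes this in its protocol

def store(data: bytes, node_store: dict) -> list:
    """Split data into chunks and store each under its content hash."""
    refs = []
    for i in range(0, len(data), CHUNK_SIZE):
        chunk = data[i:i + CHUNK_SIZE]
        ref = hashlib.sha256(chunk).hexdigest()
        node_store[ref] = chunk  # in a real network this is pushed to peers
        refs.append(ref)
    return refs

def retrieve(refs: list, node_store: dict) -> bytes:
    """Fetch chunks by reference, verifying integrity before reassembly."""
    out = bytearray()
    for ref in refs:
        chunk = node_store[ref]
        if hashlib.sha256(chunk).hexdigest() != ref:
            raise ValueError("integrity check failed for chunk " + ref)
        out.extend(chunk)
    return bytes(out)

nodes = {}  # stand-in for the peer-to-peer storage layer
refs = store(b"hello web3 " * 1000, nodes)
assert retrieve(refs, nodes) == b"hello web3 " * 1000
```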


BIG DATA MANAGEMENT

Software AG, SAP partner on Industry 4.0 data

Software AG | February 24, 2021

Software AG and SAP have partnered to better surface supply chain management data with the aim of improving product quality. The news, which landed as Software AG held its Capital Markets Day, highlights how multiple players are forming partnerships to focus on the Industry 4.0 market.

Software AG's alliance with SAP will combine SAP's S/4HANA Cloud with Software AG's TrendMiner, a self-service industrial analytics product for smart factories. According to the companies, the partnership will bring sensor-generated time-series data into the analytics and operational performance fold.

Software AG in January reported bookings growth of 31% for the fourth quarter and 24% for 2020, with digital transformation driving demand. For 2020, Software AG reported revenue of €834.8 million, down 3.8% from a year earlier. Net income (non-IFRS) was €125.4 million. Software AG said it will "double down on its existing strategy by focusing on five priority areas, namely: continuing to develop its subscription growth engine, fostering product innovation, driving internal simplification to improve productivity, progressing the ongoing cultural transformation driving its success, and taking a more proactive stance towards Mergers & Acquisitions."
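
The technical heart of the alliance is sensor-generated time-series data. As a rough illustration of that kind of workload, the Python snippet below flags anomalous sensor readings with a rolling z-score; the synthetic data, window size, and threshold are invented for the example and do not represent TrendMiner's actual methods.

```python
import numpy as np
import pandas as pd

# Synthetic one-reading-per-minute sensor signal with one injected fault.
rng = np.random.default_rng(0)
readings = pd.Series(
    20 + rng.normal(0, 0.5, 500),
    index=pd.date_range("2021-02-01", periods=500, freq="min"),
)
readings.iloc[250] = 35.0  # simulated sensor fault

# Rolling z-score over a 30-minute window; large |z| marks an anomaly.
mean = readings.rolling("30min").mean()
std = readings.rolling("30min").std()
z = (readings - mean) / std

print(readings[z.abs() > 4])  # timestamps of anomalous readings
```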


