BIG DATA MANAGEMENT
Pariveda | August 05, 2022
Pariveda, a leader specializing in solving complex business problems, announces it has earned the Data Warehouse Migration to Microsoft Azure advanced specialization. The specialization validates a solution partner's deep knowledge, extensive experience, and proven expertise in analyzing existing workloads, generating schema models, and performing extract, transform and load (ETL) operations to migrate data and enable cloud-based analytics in Azure.
The Data Warehouse Migration to Microsoft Azure advanced specialization is earned only by partners that meet stringent criteria for customer success and staff skilling and that pass a third-party audit of their data warehouse migration practices, including their ability to migrate data from Netezza and Teradata appliances.
As companies adopt digital transformation and increase their data usage and demand, they require more scalable and agile analytics solutions than on-premises legacy systems can offer. These companies are looking for a partner with advanced skills to migrate their existing data warehouses to the cloud and enable cloud-based analytics.
"The Data Warehouse Migration to Microsoft Azure advanced specialization highlights the partners who can be viewed as most capable when it comes to migrating data warehouses and implementing cloud-based analytics in Azure. Pariveda clearly demonstrated that they have both the skills and the experience to deliver best-in-class cloud-based analytics capabilities to customers with Azure."
Andrew Smith, General Manager, Partner Program Management at Microsoft
"This advanced specialization places Pariveda in a unique position to help enterprise data organizations advance at an innovative pace," said T Linson, Vice President, Pariveda. "We are proud of how we developed our Migration offering suite to address the key areas we see clients face every day."
Pariveda is a consulting firm solving complex technology and business problems by aligning our people-development focus with the mission of our clients. As an employee-owned company, our people are naturally curious, driven individuals comfortable with complexity. We are invested in helping our clients identify, architect and develop custom solutions to help their organization succeed now and into the future. Headquartered in Dallas, Texas, we live and work in major cities across North America.
BIG DATA MANAGEMENT
MOSTLY AI | July 06, 2022
MOSTLY AI, which pioneered the creation of AI-generated synthetic data, has today launched new editions of its platform for mid-market businesses that want to speed up test data generation through automation and better support agile processes. By experimenting with the free edition of the platform, test engineers, QA leads, and test automation experts can see for themselves how the platform easily and automatically synthesizes complex data structures. Alongside this efficiency boost, the platform generates high-quality test data for QA, a critical need for businesses that must deliver increasingly personal and relevant customer experiences.
“Scaled synthetic datasets generated through our platform offer absolute protection of customer data, with zero risk of re-identification and therefore full compliance with data privacy laws such as GDPR. What’s more, the datasets preserve the granular behavioral insights embedded in the production data. This is of course valuable for innovative companies focused on accelerating the agile delivery of robust software applications that enhance customer experience.”
Dr. Tobias Hann, CEO at MOSTLY AI
For tests such as load and performance testing, MOSTLY AI’s platform completely removes the need to use production data or manually created dummy data, which is what the majority of testers still rely on today. Beyond the clear privacy issues of that approach, it is a major time sink: a large share of the average tester’s time is spent waiting for test data, searching for it, or creating it manually.
“Our research over the past months confirms this risky habit of testers using production or dummy data,” says Hann, adding, “and coupled with the fact that 20% of test data will be synthetically generated by 2025, it’s the right time for us to bring AI-generated synthetic data to the mid-market and be instrumental in reaching the synthetic-data tipping point we know is on the horizon.”
AI-generated synthetic data is not mock data or fake data. It is not generated manually, as it was ten years ago, but by a powerful AI engine capable of learning all the qualities of the dataset on which it is trained. Using the MOSTLY AI platform, testers no longer need to manually configure business rules, and they can create as little or as much data as they need: generating small, manageable, and referentially intact subsets of data to speed up cycles and reduce storage sizes, or upscaling small datasets to massive sizes for stress testing applications.
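The train-then-sample idea behind synthetic test data can be illustrated with a toy sketch. This is not MOSTLY AI's implementation (the company uses deep generative models); it is a minimal stand-in that learns only the means and covariance of a small numeric table and then samples as many synthetic rows as needed, showing how a handful of real records can be "upscaled" into a large test dataset with a similar statistical shape. The column names and figures are invented for illustration.

```python
# Toy sketch of synthetic data generation: learn simple statistics from a
# small production-like table, then sample new rows. Real platforms use far
# richer generative models; this only demonstrates the train-then-sample idea.
import numpy as np

def fit_and_sample(real_rows: np.ndarray, n_samples: int, seed: int = 0) -> np.ndarray:
    """Estimate column means and covariance from real_rows, then draw
    n_samples synthetic rows from the fitted multivariate normal."""
    rng = np.random.default_rng(seed)
    mean = real_rows.mean(axis=0)
    cov = np.cov(real_rows, rowvar=False)  # columns are variables
    return rng.multivariate_normal(mean, cov, size=n_samples)

# Hypothetical "production" data: (age, monthly_spend) for six customers.
real = np.array([[25, 120.0], [31, 150.0], [42, 300.0],
                 [38, 260.0], [29, 140.0], [55, 410.0]])

# Upscale: generate 1000 synthetic rows from just six real ones.
synthetic = fit_and_sample(real, n_samples=1000)
print(synthetic.shape)  # (1000, 2)
```

No real customer ever appears in `synthetic`, yet aggregate statistics (means, correlations between age and spend) are preserved, which is the property that makes synthetic data usable for load and performance testing without exposing production records.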
“Mid-market companies have an advantage over larger corporations - they can adopt and roll out new tech quickly without the red tape that often makes this a drawn-out process. Adopting AI-generated synthetic data for testing is a win-win situation – for testers who get to work smarter and faster, and for businesses wanting to innovate and deliver the best in customer experience,” concludes Hann.
BIG DATA MANAGEMENT
Snowplow | July 04, 2022
Snowplow Analytics, based in London, offers organizations a platform for creating structured behavioral data assets that are suited to specific AI and BI applications while remaining fully compliant with data privacy regulations.
These assets capture consumer behaviors and decisions, as well as the context in which they are made. The company announced today that it has raised USD 40 million in a Series B funding round. The platform delivers AI/BI-ready data straight to the data warehouse or lakehouse, whether streamed for real-time applications or supplemented with third-party data and systems to meet future use cases.
Snowplow intends to expand both locally and globally with the support of this round of investment, which was led by the multinational venture capital firm NEA. The business plans to do this by increasing its workforce and adding support for a broader range of data types.