DATA SCIENCE

Newgen Software to acquire Number Theory, an AI/ML data science platform company

Newgen Software Technologies Limited | January 21, 2022

Newgen Software, a leading provider of a unified digital transformation platform, is pleased to announce that it is acquiring India-based Number Theory, an AI/ML (artificial intelligence and machine learning) data science platform company, subject to the completion of conditions as stated in the approved Share Purchase Agreement.

Number Theory's platform, AI Studio, brings intuitive AI/ML to every enterprise, unifying the entire data and machine learning lifecycle, from data preparation to model development and monitoring. It empowers both citizen and expert data scientists to work faster and more efficiently, helping them accomplish key machine learning tasks in hours or days rather than months. The acquisition will further strengthen Newgen's low code digital transformation platform, NewgenONE, with AI/ML modeling and data analytics capabilities.
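
A minimal, generic sketch of the data-preparation, model-development, and monitoring loop that platforms like AI Studio aim to unify, shown here with scikit-learn on a toy dataset; this is purely illustrative and is not Number Theory's or Newgen's API.

```python
# Generic ML lifecycle sketch: prepare data, develop a model, log one monitoring metric.
# Uses scikit-learn's bundled breast cancer dataset purely as a stand-in.
from sklearn.datasets import load_breast_cancer
from sklearn.model_selection import train_test_split
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_auc_score

X, y = load_breast_cancer(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.2, random_state=0)

# Data preparation and model development captured in one reusable pipeline
model = make_pipeline(StandardScaler(), LogisticRegression(max_iter=1000))
model.fit(X_train, y_train)

# "Monitoring" reduced to logging a single evaluation metric on held-out data
auc = roc_auc_score(y_test, model.predict_proba(X_test)[:, 1])
print(f"hold-out ROC AUC: {auc:.3f}")
```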

"Our customers are increasingly looking to leverage data for deeper insights and accelerated growth. Number Theory will bring domain expertise, along with a powerful engine to extract actionable insights in real time. AI/ML projects often get complex, expensive, and not rewarding. What we like about Number Theory's platform is that it is for every enterprise. It lets fusion teams build, deploy, and collaborate on the entire modeling lifecycle in low code and on cloud. We look forward to welcoming the Number Theory team to the Newgen family."

Virender Jeet, CEO, Newgen

"Newgen has developed mission-critical and complex business applications for its customers across the globe, including for enterprises in the banking and insurance space. We felt that Newgen, with its strong customer portfolio and partner ecosystem, is the perfect growth partner. We are looking forward to helping our joint customers utilize their data in the enterprise with full potential using AI/ML technologies," said Rajan Nagina and Tarun Gulyani, co-founders of Number Theory.

About Newgen Software Technologies Limited
Newgen is the leading provider of a unified digital transformation platform with native process automation, content services, and communication management capabilities. Globally, successful enterprises rely on Newgen's industry-recognized low code application platform to develop and deploy complex, content-driven, and customer-engaging business applications on the cloud. From onboarding to service requests, lending to underwriting, and many more use cases across industries, Newgen unlocks simple with speed and agility.

Other News
BIG DATA MANAGEMENT, DATA SCIENCE

Amazon Unveils Additional Analytics and Data to Empower Seller Success

Amazon | September 16, 2022

Today at Accelerate, Amazon’s annual seller conference, Amazon announced new features for Manage Your Experiments, a tool that helps sellers optimize content on product detail pages to drive higher conversion rates and increase their sales by up to 25%. Amazon also enhanced the Product Opportunity Explorer and Search Analytics Dashboard with new capabilities that help brands analyze marketing campaigns and identify areas to acquire new customers and drive repeat purchases. This new set of industry-leading tools makes it easier for sellers to tap into customer insights and analytics data to launch new products and increase sales.

“We’re focused on supporting sellers as they work to build and grow their business,” said Benjamin Hartman, vice president of Amazon North America Selling Partner Services. “The tools we’re announcing today are a direct result of seller feedback and target every step of their Amazon sales funnel, from new customer acquisition to increased lifetime value. We’re committed to continuing to develop tools and features that deliver actionable insights for sellers.”

“We have been working with Amazon since the beginning, leveraging data to build our business into one of the largest jewelry sellers on Amazon,” said Tal Masica, founder of PAVOI Jewelry. “Thanks to enhancements to the Search Analytics Dashboard and Product Opportunity Explorer, we now have the ability to analyze search trends at a granular level, giving us actionable insights to improve both trend forecasting and design for future collections – so we can continue delivering quality sustainable jewelry that our customers love to wear every day.”

Amazon offers a range of industry-leading tools that empower sellers to optimize their listings, better understand customers, differentiate their brands, and grow their business. The following new tools were announced at Accelerate 2022:

Manage Your Experiments is designed to increase the quality of product detail pages and drive higher conversion. With Manage Your Experiments, brands are able to run A/B tests on their titles, main images, and A+ content to see what performs best. Now, brands can also A/B test bullet points and descriptions, and review machine learning-based recommendations for product images and titles to drive better conversion. Additionally, brands can now opt in to auto-publish winning experiments to the product detail page, automating their A/B tests. Sellers benefit from traffic from hundreds of millions of Amazon customers, and the new Manage Your Experiments features make it easier to test more content, faster.

Search Analytics Dashboard has expanded since its launch in early 2022 to offer a new insights dashboard that provides sellers with anonymized data to better understand customers’ interests and shopping habits. For the first time, brands can download Search Query and Catalog Performance data and new ASIN-level details. This new capability enables brands to easily assess marketing campaigns and identify areas to drive repeat purchases and acquire new customers—either directly from within Amazon’s tools or by combining Amazon data with the seller’s own business data. The enhanced Search Analytics Dashboard is launching worldwide in September.

Product Opportunity Explorer builds on its successful beta introduction in 2021, continuing to offer rich, accurate data that helps sellers understand, gauge, and evaluate product opportunities in the Amazon store. Sellers can assess the likelihood of a new product gaining traction with customers and forecast sales potential. Amazon has now introduced an enhanced Product Opportunity Explorer with a new feature, Customer Reviews Insight. This feature helps sellers work backward from the customer, using customer feedback from product reviews and star ratings to help brands determine which features they should build and prioritize as they launch new products or modify existing ones.

Marketplace Product Guidance, initially announced in 2021, has been enhanced to provide Selection Recommendations—products in high demand—for U.S. sellers looking to expand to France, Italy, and Spain. Selection Recommendations give sellers insight into products not currently offered that fit a seller’s portfolio, surfacing new growth opportunities. The tool takes the guesswork out of which products should be considered in those stores, based on customer demand. These recommendations are personalized and ranked by an opportunity score calculated by machine learning models designed to predict the best opportunities for new selection.

Every year, Amazon invests billions of dollars to improve the infrastructure, tools, services, fulfillment solutions, and resources dedicated to helping sellers succeed. Sellers are responsible for more than half of Amazon’s physical product sales, and sellers in the Amazon store employed more than 1.5 million people in the United States.

About Amazon
Amazon is guided by four principles: customer obsession rather than competitor focus, passion for invention, commitment to operational excellence, and long-term thinking. Amazon strives to be Earth’s Most Customer-Centric Company, Earth’s Best Employer, and Earth’s Safest Place to Work. Customer reviews, 1-Click shopping, personalized recommendations, Prime, Fulfillment by Amazon, AWS, Kindle Direct Publishing, Kindle, Career Choice, Fire tablets, Fire TV, Amazon Echo, Alexa, Just Walk Out technology, Amazon Studios, and The Climate Pledge are some of the things pioneered by Amazon.
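
A hedged illustration of the kind of comparison an A/B test like Manage Your Experiments produces: a standard two-proportion z-test on conversion rates. The counts are invented and this is not Amazon's methodology, just a common way a seller might sanity-check which variant won.

```python
# Generic two-proportion z-test on hypothetical A/B conversion counts.
from math import sqrt
from statistics import NormalDist

def two_proportion_z_test(conv_a, n_a, conv_b, n_b):
    """Return the z statistic and two-sided p-value for two conversion counts."""
    p_a, p_b = conv_a / n_a, conv_b / n_b
    p_pool = (conv_a + conv_b) / (n_a + n_b)
    se = sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    z = (p_b - p_a) / se
    p_value = 2 * (1 - NormalDist().cdf(abs(z)))
    return z, p_value

# Hypothetical experiment: variant B (new main image) vs. variant A (current image)
z, p = two_proportion_z_test(conv_a=480, n_a=10_000, conv_b=560, n_b=10_000)
print(f"z = {z:.2f}, p = {p:.4f}")  # a small p-value suggests the lift is unlikely to be noise
```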

BIG DATA MANAGEMENT, DATA VISUALIZATION

Syniti Announces New Data Quality and Catalog Capabilities to Help Deliver Clean, Actionable Data

Syniti | September 15, 2022

Syniti, a global leader in enterprise data management, today announced new data quality and catalog capabilities available in its industry-leading Syniti Knowledge Platform, building on the data migration and data matching enhancements added earlier this year. The Syniti Knowledge Platform now includes data quality, catalog, matching, replication, migration and governance, all available under one login in a single cloud solution. This provides users with a complete and unified data management platform, enabling them to deliver faster and better business outcomes with data they can trust.

Trustworthy data is critical for the decisions businesses must make to reduce risk, drive competitive advantage and deliver bottom-line growth. According to Gartner® research, "Significant data quality issues remain a key impediment for organizations' digital initiatives. Failure to address data quality issues for critical use cases puts organizations at a disadvantage delivering business value and has severe consequences."1

Historically, in order to get better data, companies had to buy multiple point-based solutions, such as heavy data catalog tools that require massive teams to build and maintain, or data quality solutions that only identify problems rather than helping fix them. This approach is expensive, unnecessarily complex and does not address the data needs of today's businesses. With the Syniti Knowledge Platform, customers now have a unified solution to address the data needed to drive critical business objectives now and in the future. The same Gartner research states that, "From an end-user perspective, organizations are attracted to [unified data management platforms] this option as well, anticipating improved total cost of ownership due to less integration and maintenance between data quality solutions and adjacent applications."1

1 Gartner, The State of Data Quality Solutions: Augment, Automate and Simplify, Melody Chien, Ankush Jain, 15 March 2022

Each enhanced component of the Syniti Knowledge Platform includes significant new functionality, updates and enhancements, all of which are amplified by their integration. With these new combined capabilities, organizations will benefit from:

More efficient data management: From data identification through to resolution, stakeholders can collaborate in one platform. With a single catalog that underpins all data management activities, work can be reused across multiple projects, helping drive faster and cheaper data management initiatives.

Better resourcing and improved business processes: Linking data management and quality to business outcomes improves processes and decision-making, while also helping ensure more bang for the buck when allocating time and resources. Data quality issues with the greatest impact are automatically detected, and KPI improvements are tracked over time with smart remediation pipelines.

Faster ROI and savings potential: The Syniti Knowledge Platform offers hundreds of proven, out-of-the-box data quality rules and reports, plus business outcome-related dashboards, which can help users discover millions of dollars in savings. Rules created during data migrations can be reused for ongoing data quality, saving time and enforcing compliance. Knowledge re-use can help reduce the effort of future data projects by 50%.

"Data quality isn't a one-time event. Organizations need a unified approach that enables them not just to rapidly find bad data, but to efficiently fix it and sustain that high quality to drive continuous, ongoing value. The Syniti Knowledge Platform's new capabilities allow our users to leverage a more efficient, interconnected and user-friendly platform in a way that's directly tied to business outcomes and objectives."

Jon Green, Vice President, Product Management, Syniti

Kevin Campbell, CEO, Syniti, said: "Poor quality data pollutes the entire organization, negatively impacting business operations and wasting time, money and resources. We have purpose-built a data platform to drive business value, as opposed to the many siloed solutions that treat data quality as purely a technical exercise. We want our customers to spend more time drawing insights from trusted data versus finding and fixing data problems."

Allan Coulter, global chief technology officer for SAP Services, IBM, said: "The strategic importance of clean, high-quality data cannot be overstated – it is critical to any business modernization effort and to unlocking potential from future analytics and insights. It is exciting to see the new capabilities Syniti is adding to its Syniti Knowledge Platform to help customers succeed in their transformation journeys."

GARTNER is a registered trademark and service mark of Gartner, Inc. and/or its affiliates in the U.S. and internationally and is used herein with permission. All rights reserved.

About Syniti
Syniti solves the world's most complex data challenges by uniquely combining intelligent, AI-driven software and vast data expertise to yield certain and superior business outcomes. For over 25 years, Syniti has partnered with the Fortune 2000 to unlock valuable insights that ignite growth, reduce risk and increase their competitive advantage. Syniti's silo-free enterprise data management platform supports data migration, data quality, data replication, master data management, analytics, data governance, and data strategy in a single, unified solution. Syniti is a portfolio company of private equity firm Bridge Growth Partners LLC.
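
To make the idea of out-of-the-box data quality rules concrete, here is a minimal, generic sketch of completeness, uniqueness, and validity checks using pandas. The column names, reference values, and helper function are illustrative assumptions, not Syniti's actual rules or API.

```python
# Generic data quality rule checks (completeness, uniqueness, validity) with pandas.
import pandas as pd

def run_quality_rules(df: pd.DataFrame) -> dict:
    """Evaluate a few common data quality rules and return pass/fail per rule."""
    return {
        # Completeness: key business fields should not be null
        "customer_id_complete": bool(df["customer_id"].notna().all()),
        # Uniqueness: customer_id should identify exactly one row
        "customer_id_unique": bool(~df["customer_id"].duplicated().any()),
        # Validity: country codes must come from an approved reference list
        "country_code_valid": bool(df["country_code"].isin(["US", "DE", "IN", "GB"]).all()),
    }

df = pd.DataFrame({
    "customer_id": [1, 2, 2, None],
    "country_code": ["US", "DE", "XX", "IN"],
})
print(run_quality_rules(df))  # flags the null, the duplicate, and the invalid code
```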

BIG DATA MANAGEMENT

integrate.ai Announces Availability of New Platform for Collaborative Machine Learning and Analytics Across Sensitive Data

integrate.ai | August 18, 2022

integrate.ai, a SaaS company helping developers solve the world’s most important problems without risking sensitive data, today announces the availability of its privacy-preserving machine learning and analytics platform. The platform leverages federated learning and differential privacy technologies to unlock a range of machine learning and analytics capabilities on data that would otherwise be difficult or impossible to access due to privacy, confidentiality, or technical hurdles.

Traditional approaches to machine learning and analytics require centralization and aggregation of data sources, often necessitating data-sharing agreements and supporting infrastructure. This can present an insurmountable roadblock for the world’s most important data-driven problems, particularly in the healthcare, industrial, and finance sectors, where data custodians must enforce the highest privacy and security standards to ensure regulatory and contractual compliance.

With integrate.ai’s solution, collaboration barriers can be broken because data does not need to move. It allows data to stay distributed in its original protected environments while unlocking its value with privacy-protective machine learning and analytics. Operations such as model training and analytics are performed locally, and only end results are aggregated in a secure and confidential manner.

“When data can be securely accessed and collaborated upon, we unlock boundless opportunities for life-saving research and innovation. By allowing organizations to work in a federated way, our platform helps reduce cost structure, accelerate progress against product roadmaps and capture new revenue opportunities—all with more speed and flexibility than any other solution on the market. Business and technology leaders alike increasingly recognize the global shift towards a more distributed paradigm. After serving at the forefront of this shift over the past five years, this platform will continue to grow into a product suite of easy-to-use tools for developers addressing humanity’s greatest challenges.”

Steve Irvine, founder and CEO of integrate.ai

integrate.ai is packaged as a developer tool, enabling developers to seamlessly integrate these capabilities into almost any solution with an easy-to-use software development kit (SDK) and supporting cloud service for end-to-end management. Once integrated, end users can collaborate across sensitive data sets while data custodians retain full control. Solutions incorporating integrate.ai can serve as both effective experimentation tools and production-ready services.

DNAstack, a company that offers software for scientists to more efficiently find, access, and analyze the world’s exponentially growing volumes of genomic and biomedical data, is using integrate.ai’s product platform to support federated learning in its work on autism. DNAstack leads the Autism Sharing Initiative, an international collaboration to create the largest federated network of autism data, empowering better genetic insights and accelerating precision healthcare approaches.

“Autism is complex and research has shown the value of connecting massive datasets to drive critical insights. Genetic and health datasets are large, sensitive, and globally distributed, making it impossible to bring them all together in one place,” said Marc Fiume, co-founder and CEO of DNAstack. “Federated learning will empower us to ask new questions about autism across global networks while preserving privacy of research participants.”

In the heavily regulated worlds of healthcare, financial services, and manufacturing, roadblocks to collaborating with sensitive data abound – from existing and proposed privacy regulations and intellectual property (IP) concerns to the high cost of centralizing massive datasets. Data science initiatives often fail or never start in the areas where their impact could be most life changing, such as early cancer diagnosis and fraud detection, underscoring the considerable need for privacy-preserving data analytics solutions.

Armed with experience serving enterprises across six industries and the construction of its own data network, which leveraged 20B interactions between businesses and people, integrate.ai enables safe access to sensitive data with developer tools for privacy-safe machine learning and analytics.

About integrate.ai
integrate.ai is a SaaS company democratizing access to privacy-enhancing technology to help developers solve the world’s most important problems without risking sensitive data. By breaking down collaboration barriers within and between organizations, integrate.ai empowers developers and data teams with the privacy-preserving tools they need to harness collective intelligence. Armed with experience serving enterprises across six industries and the building of its own data network, which leveraged 20B interactions between businesses and people, integrate.ai’s product platform is increasing quality data access in healthcare research, financial services, industrial IoT and manufacturing, process automation, advertising, marketing and more.
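
A toy sketch of the federated averaging idea the release describes: each data custodian trains locally and only model parameters, not raw records, are shared and averaged. This is a generic NumPy illustration, not integrate.ai's SDK, and it omits the differential privacy noise a production system would add.

```python
# Minimal federated averaging (FedAvg) illustration with two simulated sites.
import numpy as np

rng = np.random.default_rng(0)

def local_linear_fit(X, y):
    """Each site fits a least-squares model on its own private data."""
    w, *_ = np.linalg.lstsq(X, y, rcond=None)
    return w

# Two hypothetical sites whose raw data never leaves their environment
true_w = np.array([2.0, -1.0])
sites = []
for _ in range(2):
    X = rng.normal(size=(200, 2))
    y = X @ true_w + rng.normal(scale=0.1, size=200)
    sites.append((X, y))

# Only the locally trained weights are sent to the aggregator
local_weights = [local_linear_fit(X, y) for X, y in sites]
global_w = np.mean(local_weights, axis=0)  # one unweighted FedAvg aggregation step
print("aggregated model weights:", np.round(global_w, 3))
```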

BUSINESS INTELLIGENCE, BIG DATA MANAGEMENT

InfluxData Brings Native Data Collection to InfluxDB

InfluxData | August 24, 2022

InfluxData, creator of the leading time series platform InfluxDB, today announced new serverless capabilities to expedite time series data collection, processing, and storage in InfluxDB Cloud. InfluxDB Native Collectors enable developers building with InfluxDB Cloud to subscribe to, process, transform, and store real-time data from messaging and other public and private brokers and queues with a click of a button. Currently available for MQTT, Native Collectors introduce the fastest way to get data from third-party brokers into InfluxDB Cloud without the need for additional software or new code.

Time series data comes from many different sources and widely distributed assets and applications. To make sense of all this data, developers need to consolidate it in a central location. However, the pipelines from data sources to the database are complex and require resource-intensive customizations, creating additional challenges for developers. Other systems require an intermediary layer to transfer and transform data from external systems to the cloud. InfluxDB Native Collectors expedite this process by removing that intermediary layer, allowing cloud data sources to connect directly to InfluxDB Cloud so developers can collect, transform, and store time series data in cloud environments directly and without writing new code.

“Data is born in the cloud at an exponential rate, but existing data pipeline tools that integrate multi-vendor cloud services are expensive, complex, and a burden for developers to manage. With Native Collectors, we’re expediting device-to-cloud data transfers so developers can focus on building and scaling applications with their time series data. These updates enable InfluxDB Cloud to become a serverless consumer of data through easily configured topic subscriptions, greatly simplifying time series data pipelines and applications alike.”

Rick Spencer, Vice President of Products, InfluxData

According to Gartner®, “Organizations want to make decisions faster and with more confidence. Data and analytics (D&A) leaders are under pressure to support these decisions with high-quality data governed across a range of users, use cases, architectures and deployment options. Data management teams are often too busy responding to requests (execution focused) to ensure data availability. This leaves them little time to focus on enablement and innovation. Many D&A leaders expect cloud migration/modernization to solve the above challenges and bring additional cost and time savings.”

Expedite Time Series Cloud Ingestion with InfluxDB Native Collectors
InfluxData’s Native Collectors give teams a faster way to ingest time series data in the cloud in one step and without customization, orchestration, or additional hosting services. After setting up service-to-service integrations between third-party brokers and InfluxDB Cloud with a few simple steps, users can:

Ingest data with simple low code setup: Ingest data into InfluxDB Cloud with a click of a button and without writing any code for immediate processing.

Contextualize data across distributed architectures: Plug Native Collectors into device-to-cloud data streams to enhance application operations, performance, and security.

Reduce complexity through platform consolidation: Consume data directly into InfluxDB Cloud through standard subscriptions with no additional agents or coding, removing the need to run InfluxDB-specific processing in third-party platforms or code.

Deliver real-time data ingestion: Achieve faster data onboarding and ingestion with unprecedented simplicity, speed, and scale, directly from the InfluxDB Cloud user interface.

Out-of-the-box data filtering and processing: Enrich, format, and process data before it’s ingested into InfluxDB Cloud for analysis, and reduce storage costs through automatic data filtering.

“We are seeing new and fast-growing workloads within our cloud-native MQTT service, but most developers have difficulty in efficiently offloading this data into a time series database,” said Ian Skerrett, Vice President of Marketing, HiveMQ. “InfluxData’s Native Collectors eliminate this challenge, move the database integration workloads back to the database, and turn InfluxDB Cloud into a simple MQTT client – a model that IoT and application developers understand and use regularly.”

Native MQTT is available immediately for InfluxDB Cloud users. Additional Native Collectors for Apache Kafka and AMQP are planned for late 2022, with new collectors to be rolled out in the future.

About InfluxData
InfluxData is the creator of InfluxDB, the leading time series platform. We empower developers and organizations, such as Cisco, IBM, Siemens and Tesla, to build real-time IoT, analytics and cloud applications with time-stamped data. Our technology is purpose-built to handle the massive volumes of data produced by sensors, systems or applications that change over time. Easy to start and scale, InfluxDB gives developers time to focus on the features and functionalities that give their apps a competitive edge. InfluxData is headquartered in San Francisco, with a workforce distributed worldwide.
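
For contrast with the Native Collectors approach described above, here is a minimal sketch of the kind of intermediary bridge developers have traditionally written themselves: a small process that subscribes to an MQTT topic and writes each message to InfluxDB Cloud. The broker host, topic, token, org, and bucket are placeholders; it assumes the paho-mqtt 1.x callback API and the influxdb-client Python library, and is not InfluxData's collector implementation.

```python
# Hypothetical MQTT -> InfluxDB Cloud bridge (the glue layer Native Collectors remove).
import paho.mqtt.client as mqtt
from influxdb_client import InfluxDBClient, Point
from influxdb_client.client.write_api import SYNCHRONOUS

# Placeholder InfluxDB Cloud credentials
influx = InfluxDBClient(url="https://us-east-1-1.aws.cloud2.influxdata.com",
                        token="YOUR_TOKEN", org="YOUR_ORG")
write_api = influx.write_api(write_options=SYNCHRONOUS)

def on_message(client, userdata, msg):
    # Treat each MQTT payload as a numeric sensor reading and tag it with its topic
    point = Point("sensor_reading").tag("topic", msg.topic).field("value", float(msg.payload))
    write_api.write(bucket="iot", record=point)

client = mqtt.Client()           # paho-mqtt 1.x style client
client.on_message = on_message
client.connect("broker.example.com", 1883)
client.subscribe("sensors/#")
client.loop_forever()            # with Native Collectors, none of this glue code is needed
```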
