BUSINESS INTELLIGENCE

Avantive Solutions Selects Hitachi Solutions America to Optimize Customer Experience with Advanced Data Analytics

Avantive Solutions | January 05, 2022

Advanced Data Analytics
Avantive Solutions, a global technology and business process outsourcer (BPO) specializing in innovative customer experience (CX), strategic sales, and digital marketing solutions, today announced a partnership with Hitachi Solutions America, Ltd., a leading provider of global industry solutions powered by cloud services from Microsoft, to enhance its data analytics capabilities by improving the performance of Microsoft Power BI with Azure Databricks. The partnership will allow Avantive to drive best-in-class performance using machine learning and artificial intelligence solutions.

"The goal of our partnership is to take our clients' results to the next level. This will allow Avantive to use optimized technology to increase our contacts, conversions, as well as improve our customers' reachability and level of trust."

Amy Brennan, Avantive's VP of Operational Excellence

As Avantive's digital transformation partner, Hitachi Solutions will develop a customized data platform — fueled by machine learning (ML) and artificial intelligence (AI) — that will let Avantive leverage the power of cutting-edge analytics and customer insights.

"With the ML and analytics scalability of Microsoft Azure and Databricks, Avantive will be able to collect and aggregate data in real time and make efficient moment-by-moment adjustments to live customer outreaches. This capability will markedly improve their contacts, close rates, and performance," explained John Young, Hitachi Solutions' VP of Data Science and Machine Learning.

The Avantive team is focused on how they can control and leverage data to make interactions more personalized and relevant for their clients. This is what sets them apart from the competition — using data-driven insights to successfully reach their customers and communicate with them on a more meaningful level.

"Hitachi Solutions is helping us to not just append the data, but to find the trends in the data in the blink of an eye. At 5:00pm on Tuesday, we will know which households in which state to call, and we will have the ability to personalize our conversations based on the demographic data we can access with speed and security," said Brennan.

Avantive will implement new insights built on multi-faceted demographic appends, giving it greater ability to reach its customers. The new data analysis speed and capacity will let Avantive supply clients with richer trending insights and, ultimately, personalized call scripts.
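
The release does not spell out how the appends are applied, but at its core a demographic append is a join of contact records against licensed demographic data, followed by scoring. A rough, hypothetical pandas sketch (file names, columns, and the scoring rule are invented for illustration; Avantive's actual ranking is described as ML-driven):

```python
import pandas as pd

# Hypothetical inputs: a dialing list and a purchased demographic append file.
contacts = pd.read_csv("contact_list.csv")            # household_id, phone, state, last_outcome
demographics = pd.read_csv("demographic_append.csv")  # household_id, income_band, segment

# The "append": enrich each contact record with demographic attributes.
enriched = contacts.merge(demographics, on="household_id", how="left")

# A deliberately naive priority score standing in for the ML-driven ranking
# described in the announcement.
enriched["priority"] = (
    (enriched["segment"] == "high_propensity").astype(int) * 2
    + (enriched["last_outcome"] == "no_answer").astype(int)
)

call_queue = enriched.sort_values("priority", ascending=False)
print(call_queue[["household_id", "state", "priority"]].head())
```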

Avantive chose Hitachi Solutions as their partner due to the relationship CEO Frank Pettinato built with the Hitachi Solutions team over the past two years. Already using Power BI and Azure, Avantive sought to collaborate further with a Microsoft solutions and technology leader.

"Hitachi Solutions impressed us with their team and the comprehensive nature of the solution they provided based on our market position. We've been very impressed with their nimbleness to come up with this unique solution for our company," said Pettinato.

With the project well underway, Brennan is working daily with the Hitachi Solutions team on delivery and review, preparing the release of this cutting-edge technology in the first quarter.

"Our goal is to drive strong, measurable performance for our clients. We are already a market leader in insights and innovation. We believe Hitachi Solutions will make that capability richer, allowing us to provide additional actionable insights and placing Avantive Solutions several years ahead of market competitors," added Brennan.

About Avantive Solutions
Avantive Solutions, founded in 1988, is a Purpose-Driven global technology and business process outsourcer (BPO) specializing in designing, building, and delivering innovative customer experience (CX), strategic sales, and digital marketing solutions. The Company's Omni-Touch™ integrated solution provides actionable insights and drives desired outcomes through advanced analytics, artificial intelligence (AI), and machine learning platforms. Avantive Solutions partners with the world's most recognized brands in communications and media, healthcare, energy, financial technology (Fintech), and eCommerce. To learn more about how Avantive Solutions is bringing purpose to the customer experience, go to avantivesolutions.com.

About Hitachi Solutions America, Ltd.
Hitachi Solutions America, Ltd. helps its customers successfully compete with the largest global enterprises using powerful, easy-to-use, and affordable industry solutions built on Microsoft cloud services. Hitachi Solutions America provides global capabilities with regional offices in the United States, Canada, Europe, India/Middle East, Japan, and Asia Pacific. To learn more about how Hitachi Solutions can support your organization leveraging Microsoft solutions and technologies, go to global.


Other News
BUSINESS STRATEGY

InterSystems Takes Data Fabrics to the Next Level with Ever-Increasing Embedded Analytics Capabilities

InterSystems | August 17, 2022

InterSystems®, a provider of next-generation solutions dedicated to helping customers solve the most critical data challenges, has announced a series of new releases to its award-winning InterSystems IRIS® data platform. The company has also recently announced a string of new customer wins as well as a new partnership as it continues to take data fabrics to the next level. According to Gartner® analysts, “by 2024, data fabric deployments will quadruple efficiency in data utilization, while cutting human-driven data management tasks in half” (Gartner, Top Strategic Technology Trends for 2022: Data Fabric, October 2021).

InterSystems IRIS empowers customers to adopt a microservices-based architecture without the typical issues associated with microservices in data-intensive applications. By adopting a unified data platform instead of dozens of individual services, InterSystems customers avoid the challenges of building a data fabric from scratch, including integration time and risk, maintenance of the architecture, high costs of maintaining multiple overlapping infrastructure services, and the complexity associated with data duplication.

Recent releases of InterSystems IRIS include new capabilities and enhancements that speed and simplify the creation of smart data fabric architectures, including Embedded Python and IntegratedML, as well as a new facility that lets data analysts and data scientists collaborate easily. Data analysts working on BI can develop measures, dimensions, and labels that follow business needs and are immediately usable by data scientists working on AI. Conversely, ML models created by data scientists are directly available to data analysts for use in dashboards, reports, and applications. This functionality connects AI and BI under the hood without needing to move the data, thereby streamlining operations and enabling real-time insights for the business.

Further enhancements have also been made to performance and scalability, to handle high-throughput, high-performance transactional-analytic use cases, and to Adaptive Analytics, which provides self-service capabilities that empower business users to freely explore the data, ask ad hoc questions, and drill down via additional queries based on initial findings. When embedded into the data fabric, these analytics capabilities put the ‘smart’ into the next-gen ‘smart data fabric’ architectural approach that InterSystems champions. In doing so, InterSystems allows business users and data scientists alike to benefit from a wide range of built-in analytics capabilities, including data exploration, business intelligence, natural language processing, and machine learning.

Adding to its data catalog, data lineage, and data governance capabilities, InterSystems has announced a partnership with Collibra, a data intelligence platform built for governance, quality, and privacy. The integration between the two platforms allows enterprise customers to take advantage of these extended capabilities on data that resides anywhere in the organization. Further to this, InterSystems has released enhancements to the InterSystems Kubernetes Operator (IKO) to make scale-up and management of data-intensive applications in Kubernetes easier. InterSystems IRIS® and InterSystems IRIS for Health™ are available as managed cloud services, as well as a variety of smart data services in the cloud, including InterSystems FHIR Transformation Services and InterSystems FHIR Server.
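
As a rough illustration of the BI/AI hand-off described above, IntegratedML exposes model definition, training, and prediction as SQL, which can be driven from Embedded Python inside InterSystems IRIS. The sketch below uses hypothetical table, column, and model names; check the IntegratedML documentation for the exact syntax supported by your IRIS version.

```python
# Sketch only: intended to run as Embedded Python inside InterSystems IRIS,
# where the `iris` module is available. Table, column, and model names are
# hypothetical.
import iris

# Define and train a model using IntegratedML's SQL extensions.
iris.sql.exec("CREATE MODEL ChurnModel PREDICTING (Churned) FROM Analytics.CustomerHistory")
iris.sql.exec("TRAIN MODEL ChurnModel")

# Score new rows; the same predictions are immediately usable by BI dashboards
# and reports built on the data platform.
rs = iris.sql.exec("SELECT CustomerId, PREDICT(ChurnModel) AS ChurnRisk FROM Analytics.NewCustomers")
for row in rs:
    print(row[0], row[1])
```
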
InterSystems is also celebrating a number of new customer wins, including UST and Harris Associates, demonstrating the continued appeal of InterSystems IRIS among key industries such as financial services and supply chain. Next week’s Gartner Data and Analytics Summit will see InterSystems take to the stage alongside representatives from Harris Associates to discuss the role of the smart data fabric in delivering real-time data access and prescriptive analytics for financial services.

About the Gartner Data & Analytics Summit
The Gartner Data & Analytics Summit provides insights for data and analytics leaders to enable a data-and-analytics-centric culture within their organizations by tying strategy to business outcomes and promoting the adoption of technologies, such as artificial intelligence (AI), while creating a resilient culture that accelerates change and where data literacy, digital trust, governance, and data-driven critical thinking are pervasive.

About InterSystems
Established in 1978, InterSystems is the leading provider of technology for critical data initiatives in the healthcare, finance, manufacturing, and supply chain sectors, including production applications at most of the top global banks. Its cloud-first data platforms solve interoperability, speed, and scalability problems for large organizations around the globe. InterSystems is committed to excellence through its award-winning, 24×7 support for customers and partners in more than 80 countries. Privately held and headquartered in Cambridge, Massachusetts, InterSystems has 25 offices worldwide.

BIG DATA MANAGEMENT, DATA SCIENCE

Cloudflare Launches Data Localization Suite in Asia to Help Customers Achieve Data Sovereignty

Cloudflare | September 22, 2022

Cloudflare, Inc., the security, performance, and reliability company helping to build a better Internet, today announced that Cloudflare’s Data Localization Suite (DLS) is now available in three new countries in the Asia Pacific region: Australia, India, and Japan. The Data Localization Suite will help businesses based in these countries, as well as global companies that do business in them, comply with their data localization obligations by using Cloudflare to easily set rules and controls on where their domestic data goes and who has access to it. This ultimately allows any business with customers in these countries to service their data locally while benefiting from the speed, security, and scalability of Cloudflare’s global network.

Nearly 70% of countries in Asia have passed or drafted new data protection and privacy legislation. This often makes it difficult for regional companies to use foreign-based vendors to handle domestic traffic. Without regional support, many businesses are under pressure to use only in-country vendors and may be required to restrict their application to one data center or one cloud provider’s region. This creates a trade-off between compliance and fast, secure experiences for end users. With the Data Localization Suite, businesses of any size or industry can now use Cloudflare to get more choice and control over how to meet their data locality needs, without sacrificing security or performance.

“No business should have to choose between compliance with local data regulation and a superior experience for their customers. And yet, we hear time and again that companies are forced to do so in the face of a complex and ever-changing landscape of regional legislation,” said Matthew Prince, co-founder and CEO, Cloudflare. "By expanding our Data Localization Suite to our customers in Australia, India, and Japan, we're ensuring data locality doesn't have to come at the expense of the speed, security, and privacy users expect and deserve online."

Now, businesses in Australia, India, and Japan can use Cloudflare’s Data Localization Suite to:

Control where traffic is serviced: Companies can choose the data center locations where their traffic is inspected. Businesses can also use Cloudflare’s Geo Key Manager to choose where private keys are held.

Build and deploy serverless code, with regional control: Build applications that allow developers to combine global performance with local compliance regulations. Jurisdiction Restrictions for Workers Durable Objects makes it easy to build serverless applications that are confined to a specific region.

Use Cloudflare’s security features to protect their web properties: Customers can use WAF, Bot Management, DDoS protection, and more to ensure their websites are safe and stay online.

Align with global and regional security certifications: Businesses can trust that they are compliant with global privacy and security certifications like ISO 27001, 27701, and 27018 while still offering performance and speed at scale.

“Asia Pacific has over 2.5 billion Internet users, representing more than half of the total Internet users in the world, and data protection and privacy have become increasingly important in this region. Preserving end-user privacy is core to Cloudflare’s mission of helping to build a better Internet, and we look forward to working with businesses across Australia, India, and Japan to enable them to provide fast, private, reliable, and secure services to their end-users.”

Jonathon Dixon, VP and Managing Director, Asia Pacific, Japan, and China, Cloudflare

The Data Localization Suite has supported Cloudflare customers in alignment with European localization requirements and regulations since 2020. “We're thrilled to extend Cloudflare's localization benefits to our customers, providing them greater control as they manage international data transfer requirements,” said Blake Brannon, Chief Strategy Officer, OneTrust. “Our partnership with Cloudflare supports our mission to empower our customers to navigate the evolving regulatory landscape with ease.”

Today, Cloudflare’s global network spans more than 275 cities in over 100 countries, including more than 100 points of presence across Asia Pacific, bringing its security, performance, and reliability solutions as close to its regional customers as possible. Cloudflare continues to invest in the region, with offices in Beijing, Singapore, Sydney, and Tokyo. In March, Cloudflare also announced 18 new cities added to its global network, including Bhubaneshwar, India; Fukuoka, Japan; Kanpur, India; and Naha, Japan.

About Cloudflare
Cloudflare, Inc. is on a mission to help build a better Internet. Cloudflare’s suite of products protects and accelerates any Internet application online without adding hardware, installing software, or changing a line of code. Internet properties powered by Cloudflare have all web traffic routed through its intelligent global network, which gets smarter with every request. As a result, they see significant improvement in performance and a decrease in spam and other attacks. Cloudflare was named to Entrepreneur Magazine’s Top Company Cultures 2018 list and ranked among the World’s Most Innovative Companies by Fast Company in 2019. Headquartered in San Francisco, CA, Cloudflare has offices in Austin, TX, Champaign, IL, New York, NY, San Jose, CA, Seattle, WA, Washington, D.C., Toronto, Lisbon, London, Munich, Paris, Beijing, Singapore, Sydney, and Tokyo.
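
As one concrete example of the controls listed above, Geo Key Manager restrictions can be applied when a custom certificate is uploaded through Cloudflare's API via a geo_restrictions setting. The Python sketch below is illustrative only: the zone ID, token, and file paths are placeholders, and the field names should be verified against Cloudflare's current API reference.

```python
# Hedged sketch: upload a custom certificate and ask Geo Key Manager to keep the
# private key within a restricted region. Zone ID, API token, and file paths are
# placeholders; confirm the request shape in Cloudflare's API docs.
import requests

ZONE_ID = "<zone-id>"
API_TOKEN = "<api-token>"

with open("example.com.pem") as cert_file, open("example.com.key") as key_file:
    payload = {
        "certificate": cert_file.read(),
        "private_key": key_file.read(),
        # Assumed Geo Key Manager option: restrict private-key storage to the EU.
        "geo_restrictions": {"label": "eu"},
    }

resp = requests.post(
    f"https://api.cloudflare.com/client/v4/zones/{ZONE_ID}/custom_certificates",
    headers={"Authorization": f"Bearer {API_TOKEN}"},
    json=payload,
    timeout=30,
)
resp.raise_for_status()
print(resp.json()["result"]["id"])
```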

BIG DATA MANAGEMENT, BUSINESS STRATEGY

New Relic Announces Support for Amazon VPC Flow Logs on Amazon Kinesis Data Firehose

New Relic | September 17, 2022

New Relic, the observability company, announced support for Amazon Virtual Private Cloud (Amazon VPC) Flow Logs on Amazon Kinesis Data Firehose to reduce the friction of sending logs to New Relic. Amazon VPC Flow Logs is an AWS feature that allows customers to capture information about the IP traffic going to and from network interfaces in their Virtual Private Cloud (VPC). With New Relic support for Amazon VPC Flow Logs, both AWS and New Relic customers can quickly gain a clear understanding of a network’s performance and troubleshoot activity without impacting network throughput or latency.

Network telemetry is challenging even for network engineers. To unlock cloud-scale observability, engineers need to explore VPC performance and connectivity across multiple accounts and regions to understand whether an issue started in the network or somewhere else. To solve this, New Relic has streamlined the delivery of Amazon VPC Flow Logs by allowing engineers to send them to New Relic via Kinesis Data Firehose, which reliably captures, transforms, and delivers streaming data to data lakes, data stores, and analytics services. With New Relic’s simple “add data” interface, it only takes moments to configure Amazon VPC Flow Logs using the AWS Command Line Interface (AWS CLI) or an AWS CloudFormation template. Instead of digging through raw logs across multiple accounts, any engineer can start with an Amazon Elastic Compute Cloud (Amazon EC2) instance they own and explore the data that matters, regardless of the AWS account or AWS Region.

“New Relic continues to invest in our relationship with AWS. Helping customers gain visibility into their cloud networking environment increases their overall application observability. Our support for Amazon VPC shows our commitment to enhancing our joint customers’ observability experience.”

Riya Shanmugam, GVP, Global Alliances and Channels at New Relic

“AWS is delighted to continue our strategic collaboration with New Relic to help customers innovate and migrate faster to the cloud,” said Nishant Mehta, Director of PM – EC2 and VPC Networking at AWS. “New Relic’s connected experience for Amazon VPC Flow Logs, paired with the simplicity of using Kinesis Data Firehose, enables our joint customers to easily understand how their networks are performing, troubleshoot networking issues more quickly, and explore their VPC resources more readily.”

With the New Relic support for Amazon VPC Flow Logs on Kinesis Data Firehose, customers can:

Monitor and alert on network traffic from within New Relic.

Visualize network performance metrics such as bytes and packets per second, as well as accepts and rejects per second, across every TCP or UDP port.

Explore flow log deviations to look for unexpected changes in network volume or health.

Diagnose overly restrictive security group rules or potentially malicious traffic issues.

“Our architecture contains above 200 microservices running on AWS. When something goes wrong, we need to find the root cause quickly to put out what we at Gett term as ‘fires,’” said Dani Konstantinovski, Global Support Manager at Gett. “With New Relic capabilities we can identify the problem, understand exactly what services were affected, what’s the reason, and what we need to do to resolve it. New Relic gives us this observability—it helps us to provide better service for our customers.”

“Proactively managing customer experience is essential to all businesses that provide part or all of their services through applications. Therefore it’s essential for engineers to have a clear understanding of their network performance and the data needed to troubleshoot activity before it impacts customers. Also, the quality of the data is fundamental to making good decisions,” said Stephen Elliot, IDC Group Vice President, I&O, Cloud Operations and DevOps. “Solutions that ensure fast delivery of high-quality data provide engineers with the ability to act quickly and decisively with confidence, saving businesses from the costs associated with negative customer experiences.”

About New Relic
As a leader in observability, New Relic empowers engineers with a data-driven approach to planning, building, deploying, and running great software. New Relic delivers the only unified data platform that empowers engineers to get all telemetry—metrics, events, logs, and traces—paired with powerful full-stack analysis tools to help engineers do their best work with data, not opinions. Delivered through the industry’s first usage-based consumption pricing that’s intuitive and predictable, New Relic gives engineers more value for the money by helping improve planning cycle times, change failure rates, release frequency, and mean time to resolution. This helps the world’s leading brands, including Adidas Runtastic, American Red Cross, Australia Post, Banco Inter, Chegg, GoTo Group, Ryanair, Sainsbury’s, Signify Health, TopGolf, and World Fuel Services (WFS), improve uptime, reliability, and operational efficiency to deliver exceptional customer experiences that fuel innovation and growth.
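
The announcement points to the AWS CLI or a CloudFormation template for setup; a heavily simplified boto3 sketch of the same flow is shown below. The stream name, ARNs, bucket, VPC ID, and the New Relic endpoint URL and license key are placeholders or assumptions and should be taken from the AWS and New Relic documentation rather than from this example.

```python
# Hedged sketch: publish VPC Flow Logs to a Kinesis Data Firehose delivery stream
# that forwards records to New Relic's HTTP endpoint. All identifiers below are
# placeholders; the New Relic endpoint URL is an assumption to verify in its docs.
import boto3

firehose = boto3.client("firehose")
ec2 = boto3.client("ec2")

# 1) A delivery stream that posts incoming records to New Relic.
firehose.create_delivery_stream(
    DeliveryStreamName="newrelic-vpc-flow-logs",
    DeliveryStreamType="DirectPut",
    HttpEndpointDestinationConfiguration={
        "EndpointConfiguration": {
            "Name": "New Relic",
            "Url": "https://aws-api.newrelic.com/firehose/v1",  # assumed US endpoint
            "AccessKey": "<NEW_RELIC_LICENSE_KEY>",
        },
        # Firehose requires an S3 location for records that fail delivery.
        "S3Configuration": {
            "RoleARN": "arn:aws:iam::123456789012:role/firehose-backup-role",
            "BucketARN": "arn:aws:s3:::my-firehose-backup-bucket",
        },
    },
)

# 2) Send this VPC's flow logs straight to that delivery stream.
ec2.create_flow_logs(
    ResourceType="VPC",
    ResourceIds=["vpc-0123456789abcdef0"],
    TrafficType="ALL",
    LogDestinationType="kinesis-data-firehose",
    LogDestination="arn:aws:firehose:us-east-1:123456789012:deliverystream/newrelic-vpc-flow-logs",
)
```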

BIG DATA MANAGEMENT, BUSINESS STRATEGY

Striim and Databricks Partner to Bring Real-Time Data Integration to the Databricks Lakehouse

Striim | September 21, 2022

Striim, a global leader in unified real-time data integration and streaming, today announced at the Big Data LDN conference and expo that Striim has joined the Databricks Technology Partner Program. Databricks Technology Partners integrate their solutions with Databricks to provide complementary capabilities for ETL, data ingestion, business intelligence, machine learning, and governance. Striim’s integration with Databricks enables enterprises to leverage the Databricks Lakehouse Platform’s reliability and scalability to innovate faster while deriving valuable data insights in real time via Striim’s real-time streaming capabilities.

“It’s clear that the businesses that can make accurate, data-driven decisions more quickly have a clear advantage over their competitors. We intentionally partner with technology providers like Striim to enable our customers to speed their time-to-insight via Databricks AI/ML solutions. We are excited to partner with Striim, providing our customers access to a fully managed cloud service to seamlessly connect data from databases, applications, and disparate clouds to Databricks in real time.”

Ariel Amster, director of strategic technology partners at Databricks

The Databricks Lakehouse Platform combines the best elements of data lakes and data warehouses, enabling users to unify their data, analytics, and AI, build on open-source technology, and maintain a consistent platform across clouds. Striim enables Databricks AI/ML to create new models that leverage real-time data, resulting in far more accurate business predictions. This, in turn, means better decisions made more quickly, giving businesses a significant competitive edge.

“In today's digital economy, customer experience, data movement, and data governance require real-time streaming data. Legacy batch processes simply are not enough to meet the demands of today’s AI/ML applications,” said Philip Cockrell, Striim’s senior vice president of Business Development. “Striim’s software-as-a-service offering delivers ‘best-in-class’ capabilities for real-time data integration, helping Databricks customers more fully realize the proven AI/ML functionality Databricks delivers.”

Striim Cloud delivers these capabilities for the enterprise in a managed service format that eliminates the complexity of building low-latency streaming data pipelines at scale. Instead of spending weeks implementing new infrastructure, global enterprises can now integrate data from disparate sources in just a few clicks.

About Striim
Striim, Inc. is the only supplier of unified, real-time data streaming and integration for analytics and operations in the Digital Economy. Striim Platform and Striim Cloud make it easy to continuously ingest, process, and deliver high volumes of real-time data from diverse sources (both on-premises and in the cloud) to support multi- and hybrid cloud infrastructure. Striim collects data in real time from enterprise databases (using non-intrusive change data capture), log files, messaging systems, and sensors, and delivers it to virtually any target on-premises or in the cloud with sub-second latency, enabling real-time operations and analytics.
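
Striim pipelines are configured in its own platform rather than hand-coded, but the target pattern the integration feeds (continuously applying change-data-capture events to a Delta Lake table on Databricks) can be sketched with the open-source Delta Lake Python API. Table, column, and operation names below are hypothetical; this is not Striim's implementation.

```python
# Illustrative only: merge micro-batches of CDC events (hypothetical columns
# id and op) into a Delta table. Assumes a Databricks notebook where `spark`
# is predefined; a managed tool such as Striim would generate and run an
# equivalent pipeline for you.
from delta.tables import DeltaTable

def apply_cdc_batch(microbatch_df, batch_id):
    target = DeltaTable.forName(spark, "lakehouse.customers")
    (
        target.alias("t")
        .merge(microbatch_df.alias("s"), "t.id = s.id")
        .whenMatchedDelete(condition="s.op = 'DELETE'")
        .whenMatchedUpdateAll(condition="s.op = 'UPDATE'")
        .whenNotMatchedInsertAll(condition="s.op = 'INSERT'")
        .execute()
    )

# Attach the merge logic to a stream of change events (source table is hypothetical).
(
    spark.readStream.table("raw_change_events")
    .writeStream
    .foreachBatch(apply_cdc_batch)
    .option("checkpointLocation", "/tmp/checkpoints/customers_cdc")
    .start()
)
```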
