Article | March 12, 2020
Homeless policy needs to join the big data revolution. A data tsunami is transforming our world. Ninety percent of existing data was created in the last two years, and Silicon Valley is leveraging it with powerful analytics to create self-driving cars and to revolutionize business decision-making in ways that drive innovation and efficiency.

Unfortunately, this revolution has yet to help the homeless. It is not due to a lack of data. Sacramento alone maintains data on half a million service interactions with more than 65,000 homeless individuals. California is considering integrating the data from its 44 continuums of care to create a richer pool of data. Additionally, researchers are uncovering troves of relevant information in educational and social service databases.

These data, however, are only useful if they are aggressively mined for insights, looking for problems to solve and successful practices to replicate. At that juncture, California falls short.
Article | January 28, 2021
Since the internet became popular, the way we purchase things has evolved from a simple process into a far more complicated one. Unlike traditional shopping, online shoppers cannot experience a product first-hand before buying. On top of that, a single product now comes in more options and variants than ever before, which makes deciding even harder.
To avoid a bad purchase, the consumer has to rely heavily on reviews posted by people who already use the product. However, sorting through relevant reviews for different products across multiple eCommerce platforms, and then comparing them, can be overwhelming. To solve this problem, Amazon performs sentiment analysis on product review data using artificial intelligence, which helps it develop products that are most likely to suit its customers.
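To make the idea concrete, here is a minimal sketch of sentiment scoring on review text. This is not Amazon's actual system, which relies on far richer machine-learned models; the word lists and the scoring rule below are illustrative assumptions only.

```python
# Minimal lexicon-based sentiment scoring for product reviews.
# NOTE: POSITIVE/NEGATIVE word lists and the scoring rule are
# illustrative assumptions, not Amazon's actual method.

POSITIVE = {"great", "excellent", "love", "perfect", "fast", "reliable"}
NEGATIVE = {"bad", "broken", "slow", "terrible", "waste", "defective"}

def review_sentiment(text: str) -> float:
    """Score a review in [-1, 1]: +1 if all sentiment words are
    positive, -1 if all are negative, 0 if none are found."""
    words = text.lower().split()
    pos = sum(w.strip(".,!?") in POSITIVE for w in words)
    neg = sum(w.strip(".,!?") in NEGATIVE for w in words)
    total = pos + neg
    return 0.0 if total == 0 else (pos - neg) / total

reviews = [
    "Great battery life, fast shipping, love it!",
    "Arrived broken and the replacement was terrible.",
]
for r in reviews:
    print(f"{review_sentiment(r):+.2f}  {r}")
```

Aggregating such scores across thousands of reviews for one product is what turns individual opinions into a signal a product team can act on.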
A consumer wants to see only relevant and useful reviews when deciding on a product. A rating system is a quick way to gauge a product's quality, but ratings alone cannot tell the whole story, as they can be biased. Detailed textual reviews are necessary to improve the consumer experience and to help shoppers make informed choices. That experience, in turn, is a vital tool for understanding customer behavior and increasing sales.
Amazon has come up with a unique way to make things easier for its customers. Rather than promoting products that merely resemble a customer's search history, it recommends products similar to the one the user is currently looking at. In this way, it guides the customer using the correlation between products.
To understand this concept better, we must look at how Amazon's recommendation algorithm has evolved over time.
The history of Amazon's recommendation algorithm
Before Amazon applied machine learning to sentiment analysis of customer product reviews, it relied on standard collaborative filtering to make recommendations. Collaborative filtering is the most common way to recommend products online. Early systems used user-based collaborative filtering, which proved a poor fit because it left too many factors unaccounted for.
Researchers at Amazon came up with a better way to recommend products, one that depends on the correlation between products instead of similarities between customers. In user-based collaborative filtering, a customer is shown recommendations based on the purchase histories of people with similar search histories. In item-to-item collaborative filtering, customers are shown products similar to those in their recent purchase history. For example, a person who bought a mobile phone will be shown accessories for that phone.
Amazon's Personalization team found that working with purchase history at the product level yields better recommendations. This form of filtering also offers a computational advantage. User-based collaborative filtering requires analyzing many users with similar shopping histories, a time-consuming process given the demographic factors involved, such as location, gender, and age. A customer's shopping history can also change within a day, so keeping the data relevant would mean updating the index of shopping histories daily.
Item-to-item collaborative filtering, by contrast, is easier to maintain because only a small subset of the site's customers purchase any specific product. Computing the list of people who bought a particular item is much cheaper than scanning all customers for similar shopping histories. There is, however, a proper science behind calculating the relatedness of two products: you cannot merely count how many times two items were bought together, as that alone does not produce accurate recommendations.
Amazon research instead uses a relatedness metric to generate recommendations: item Y is considered related to item X only if purchasers of X are more likely than average customers to buy Y. Only then is Y treated as an accurate recommendation alongside X.
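The relatedness idea described above can be sketched as a "lift" calculation over co-purchase data: recommend Y alongside X only when buyers of X are more likely than the average customer to buy Y. The toy baskets and the specific lift formula below are assumptions for illustration, not Amazon's published method.

```python
# Sketch of a relatedness (lift) metric over co-purchase data.
# NOTE: the baskets and the exact formula are illustrative assumptions.
from collections import defaultdict
from itertools import combinations

# Each inner set is one customer's purchase history (toy data).
baskets = [
    {"phone", "case", "charger"},
    {"phone", "case"},
    {"phone", "charger"},
    {"book"},
    {"book", "charger"},
]

n = len(baskets)
item_count = defaultdict(int)   # customers who bought each item
pair_count = defaultdict(int)   # customers who bought both items of a pair
for basket in baskets:
    for item in basket:
        item_count[item] += 1
    for a, b in combinations(sorted(basket), 2):
        pair_count[(a, b)] += 1

def relatedness(x: str, y: str) -> float:
    """Lift = P(buy y | bought x) / P(buy y); > 1 means y is
    over-represented among buyers of x, so it is worth recommending."""
    both = pair_count[tuple(sorted((x, y)))]
    if item_count[x] == 0 or item_count[y] == 0:
        return 0.0
    p_y_given_x = both / item_count[x]
    p_y = item_count[y] / n
    return p_y_given_x / p_y

print(relatedness("phone", "case"))   # case is over-represented among phone buyers
print(relatedness("phone", "book"))   # books are unrelated to phone buyers
```

Dividing by the item's overall popularity is what separates this from naive co-purchase counting: a universally popular item co-occurs with everything, but its lift stays near 1, so it is not recommended everywhere.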
In order to provide a good recommendation, you must show products that have a high chance of being relevant. There are countless products on Amazon's marketplace, and a customer will not wade through many of them to find the best one; faced with thousands of options, the customer will eventually grow frustrated and try a different platform. So Amazon had to develop a unique and efficient way of recommending products that works better than its competition.
User-based collaborative filtering worked well enough until competition increased. As product listings in the marketplace have grown, the old algorithms no longer suffice; there are more filters and factors to consider than before. Item-to-item collaborative filtering is far more efficient because it automatically narrows the candidates to products likely to be purchased, limiting the factors that must be analyzed to produce useful recommendations.
Amazon has grown into the industry's biggest marketplace because customers trust and rely on its service. It frequently updates its approach to fit recent trends and provide the best customer experience possible.
Article | February 27, 2020
When it comes to adopting artificial intelligence (AI) and machine learning (ML) capabilities, it's important to look at the range of effects from many different viewpoints.

According to Martin Stanley, Senior Advisor for AI at the Cybersecurity and Infrastructure Security Agency (CISA), his agency wanted to look at adoption through three different perspectives: how CISA was going to use AI, how stakeholders will use AI, and how U.S. adversaries are going to use AI.

"You have to understand the needs of your stakeholders, but you also have to do it fast," Stanley said at a Feb. 26 ServiceNow Federal Forum, adding that it is a challenge to take in all the necessary information and deliver an outcome. AI and ML can help streamline this process. Stanley noted that a big part of AI implementation is being purposeful about how the government's data is managed, and that taking care of the data and technology is a key part of the adoption process. He also said that making work more efficient is central to why AI adoption matters: "At the end of the day, this is all about helping people."
Article | January 6, 2021
As organizations go digital, the amount of data they generate, whether in-house or from outside, is humongous. In fact, this data keeps growing with every tick of the clock.
There is no doubt that much of this data is junk. At the same time, it is also the data set from which an organization can draw a great deal of insight about itself.
Organizations that fail to use this data to build value are likely to hasten their own obsolescence, or at least risk losing their competitive edge in the market.
Interestingly, it is not just larger firms that can harness this data and analytics to improve overall performance and achieve operational excellence. Even small private equity firms can leverage it to create value and develop a competitive edge, achieving a high return on a low initial investment.
The private equity industry has been skeptical about data and analytics, citing the belief that they are meant for larger firms, or for firms with pockets deep enough to revamp or replace their technology infrastructure. Meanwhile, some private equity investment professionals would like to use advanced data and analytics but cannot, for lack of the required knowledge.
US private equity firms are beginning to grasp the importance of advanced data and analytics and are seeking professionals with expertise in both. For these firms, the imperative is to select the use cases that hold the greatest promise for creating value. Top private equity firms around the world can use those cases to create quick wins, which in turn build momentum for a wider transformation of the business.
Pinpointing the right use cases requires strategic thinking from private equity investment professionals as they fill relevant gaps and address vulnerabilities. It also requires thinking operationally, to recognize where the available data can be found.
Top private equity firms in the US have to realize that the insights offered by big data and advanced analytics represent an incredible growth opportunity for the industry; once firms grasp that potential, they will understand how invaluable those insights are.
Private equity firms can use these insights to study any target organization, including its competitive position in the market, and plan their next move, whether that means bidding aggressively for organizations that show promise for growth or walking away from one burdened with underlying issues.
For all of this, and to build a career in private equity, a reputable qualification matters as well. A qualified private equity investment professional will be able to devise information-backed strategies in very little time.
In addition, with big data and analytics in place, private equity firms can hand numerous manual tasks over to technology and let it do the dirty work. Various studies have shown how big data and analytics can help a private equity firm.