Article | April 9, 2020
In some ways at least, technology has been able to tell us more about how and where the virus is spreading. Mostly, this has involved creatively harnessing the power of big data: using temperature readings from smart thermometers to detect COVID-19 hot spots, or aggregating cellphone location data to identify the areas of the country where people are staying home. But against the backdrop of the debate between civil liberties and public health, we also need to ask where the line is digitally: how much surveillance is acceptable in the service of the greater good?
Article | February 23, 2020
Does the success of companies like Google depend on their algorithms or on their data? Today’s fascination with artificial intelligence (AI) reflects both our appetite for data and our excitement about the new opportunities in machine learning. Amalio Telenti, Chief Data Scientist and Head of Computational Biology at Vir Biotechnology Inc., argues that newcomers to the field of data science are blinded by the shiny object of magical algorithms and forget the critical infrastructures that are needed to create and to manage data in the first place. Data management and infrastructure are the ugly duckling of data science, but they are necessary for a successful program and therefore need to be built with purpose. This requires careful consideration of strategies for data capture, storage of raw and processed data, and instruments for retrieval. Beyond the virtues of analysis, there are also the benefits of facilitated retrieval. While there are many solutions for visualizing corporate or industrial data, there is still a need for flexible retrieval tools in the form of search engines that query the diverse sources and forms of data and information generated at a given company or institution.
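The retrieval tools described above typically rest on an inverted index. As a minimal sketch of the idea (a hypothetical toy example, not any particular product or the author's system), the index maps each term to the set of documents that contain it, so multi-term queries become set intersections:

```python
# Toy inverted-index search sketch (illustrative only; document names are made up).
from collections import defaultdict

def build_index(docs):
    """Map each lowercase term to the set of document ids containing it."""
    index = defaultdict(set)
    for doc_id, text in docs.items():
        for term in text.lower().split():
            index[term].add(doc_id)
    return index

def search(index, query):
    """Return ids of documents that contain every term in the query."""
    terms = query.lower().split()
    if not terms:
        return set()
    results = set(index.get(terms[0], set()))
    for term in terms[1:]:
        results &= index.get(term, set())
    return results

docs = {
    "report_1": "quarterly sales data for the retail division",
    "report_2": "raw sensor data from the manufacturing line",
    "memo_3": "retention policy for raw and processed data",
}
index = build_index(docs)
print(search(index, "raw data"))  # documents mentioning both terms
```

A production retrieval system would add tokenization, ranking, and connectors to the various data sources, but the core lookup structure is often this simple.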
Article | March 23, 2021
Learn, Relearn and Unlearn
In the times we are living in, we have to upgrade ourselves constantly in order to stay afloat in our industry, be it logistics, traditional business, agriculture, or any other. Technology is constantly changing the way we used to live, the way we live now, and the way we will live. Anyone who thinks technology is not their cup of tea will, I would argue, find no place left in the world. Whether it is a blessing or a curse on the human race, only time will tell, but its effects are already surfacing in the market in the form of job cuts, poverty, and roles that are no longer needed or have been replaced.
The poor are getting poorer and the rich are getting richer. COVID-19 has not only brought a curse upon the human race; it has also been a blessing in disguise for tech giants and e-commerce. Technology is changing not only business but every person's outlook on life, family structure, the globalization of talent, and more. It is nerve-wracking to imagine what the world will look like 20 years from now. Can all of us adapt to the "learn, relearn, and unlearn" motto? Or will we have to depend on countries and governments to announce a minimum wage to sustain our basic needs? Uncertainties loom as the world comes closer together through technology yet drifts further apart emotionally. It is sad to see children and colleagues communicating via emails and messages within the same home or office. Humanity is losing its touch and feel.
Resisting learning, unlearning, and relearning can narrow one's choices to none in the long run. Delay in adapting to change can be increasingly expensive, as one can lose one's place in the world sooner than one thinks. Back in 1992, when few people had access to the internet, people used to stay in one job for life. Today, those same people are turned away at interviews for lack of experience, simply because they kept doing the same work in one job without exposing themselves to the world's new requirement to learn, relearn, and unlearn. The chances of this group getting a job are slim. The world has thrown different kinds of challenges at people, communities, jobs, and businesses: those who were once applauded for staying in one job for life are now viewed by corporate firms as redundant because of technology. So should people keep changing jobs every few years just to keep learning, relearning, and unlearning, or should they wait for their existing companies to face challenges and disappear from the market? Only time and technology will determine what is in store for the human race next.
Some studies show that the longer a nation delays adopting technology, the lower its per capita income. This points to an extreme reliance on technology, but can all of us adopt technology at the same rate at which it is introduced to us? Can our children and the coming generations adopt technology at the same scale? Or is the future either technology or nothing; in short, a job or joblessness, with no option in between?
Stephen Goldsmith, director of the Innovations in Government Program and Data-Smart City Solutions at the John F. Kennedy School of Government at Harvard University, said that in some areas, technological advancements have exceeded expectations made in 2000.
The Internet has also exploded beyond expectations. From 2000 to 2010, the number of Internet users increased 500 percent, from 361 million worldwide to almost 2 billion. Now, close to 4 billion people throughout the world use the Internet. People go online for everything from buying groceries and clothes to finding a date. They can register their cars online, earn a college degree, shop for houses, and apply for a mortgage. But the same question arises: can each one of us use, or advance our skills to use, technology at the same scale, or are we leaving our senior generations behind and crippling them in today's society? And what about middle-aged people in their 50s who will soon join the senior population: can they get jobs and advance their skills to meet technology's demands, to learn, unlearn, and relearn? Or will not only the pandemic but technology itself make people redundant before their actual retirement and render their knowledge and skills obsolete?

There should be a way forward to achieve balance. Absolute reliance on technology is not only a cyber threat to governments; in the long term, unemployment, creating jobs, and paying a minimum wage to the unemployed masses will be a huge worry. At the end of the day, humans need the basics first and luxury second. Technology can bring ease of doing business, connect businesses and their flows, and link wholesalers to end users, but along the way many jobs and heads will be slashed, and the impact will be dire. Therefore, humans have to prepare themselves to learn, unlearn, and relearn to meet today's technology requirements, or prepare themselves for early retirement.
Article | December 21, 2020
Machine Learning (ML) has taken strides over the past few years, establishing its place in data analytics. In particular, ML has become a cornerstone of data science, alongside data wrangling and data visualization, among other facets of the field. Yet we observe that many organizations are still hesitant to allocate a budget for it in their data pipelines. The data engineer role seems to attract lots of attention, but few companies leverage a machine learning expert or engineer. Could it be that ML can add value to other enterprises too? Let's find out by clarifying certain concepts.
What Machine Learning is
So that we are all on the same page, let's look at a down-to-earth definition of ML that you can include in a company meeting, a report, or even within an email to a colleague who isn't in this field. Investopedia defines ML as "the concept that a computer program can learn and adapt to new data without human intervention." In other words, if your machine (be it a computer, a smartphone, or even a smart device) can learn on its own, using some specialized software, then it's under the ML umbrella. It's important to note that ML is also a stand-alone field of research, predating most AI systems, even if the two are linked, as we'll see later on.
How Machine Learning is different from Statistics
It's also important to note that ML is different from Statistics, even if some people like to view the former as an extension of the latter. However, there is a fundamental difference that most people aren't aware of yet. Namely, ML is data-driven while Statistics is, for the most part, model-driven. This statement means that most Stats-based inferences are made by assuming a particular distribution in the data, or the interactions of different variables, and making predictions based on our mathematical models of these distributions. ML may employ distributions in some niche cases, but for the most part, it looks at data as-is, without making any assumptions about it.
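To make the model-driven versus data-driven contrast concrete, here is a toy sketch (illustrative only; the data and functions are made up for this example). The Stats-style approach assumes a linear model y = a·x + b and estimates its parameters; the ML-style approach, here a simple nearest-neighbor predictor, uses the observed data as-is with no distributional assumption:

```python
# Toy contrast: model-driven (assumes linearity) vs. data-driven (no assumption).
xs = [1.0, 2.0, 3.0, 4.0, 5.0]
ys = [2.1, 3.9, 6.2, 8.1, 9.8]  # roughly y = 2x with some noise

def fit_linear(xs, ys):
    """Model-driven: assume y = a*x + b and estimate a, b by least squares."""
    n = len(xs)
    mean_x = sum(xs) / n
    mean_y = sum(ys) / n
    a = sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys)) / \
        sum((x - mean_x) ** 2 for x in xs)
    b = mean_y - a * mean_x
    return lambda x: a * x + b

def nearest_neighbor(xs, ys):
    """Data-driven: predict the y of the closest observed x; no model assumed."""
    pairs = list(zip(xs, ys))
    return lambda x: min(pairs, key=lambda p: abs(p[0] - x))[1]

linear = fit_linear(xs, ys)
knn = nearest_neighbor(xs, ys)
print(linear(2.5))  # interpolates from the fitted line
print(knn(2.5))     # echoes the label of the nearest observed point
```

The linear fit generalizes smoothly but only insofar as the linearity assumption holds; the nearest-neighbor predictor makes no such commitment, which is the data-driven stance described above.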
Machine Learning’s role in data science work
Let’s now get to the crux of the matter and explore how ML can be a significant value-add to a data science pipeline. First of all, ML can potentially offer better predictions than most Stats models in terms of accuracy, F1 score, etc. Also, ML can work alongside existing models to form model ensembles that tackle problems more effectively. Additionally, if transparency is important to the project stakeholders, there are ML-based options for offering some insight into which variables in the data at hand matter most for making predictions. Moreover, ML is more parametrized, meaning that you can tweak an ML model more, adapting it to the data you have and ensuring more robustness (i.e., reliability). Finally, you can learn ML without needing a Math degree or any other formal training. The latter, however, may prove useful if you wish to delve deeper into the topic and develop your own models. This innovation potential is a significant aspect of ML, since it's not as easy to develop new models in Stats (unless you are an experienced Statistics researcher) or even in AI. Besides, there are various "heuristics" within the ML group of algorithms that facilitate your data science work, regardless of which predictive model you end up using.
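The ensemble idea mentioned above can be sketched in a few lines (a toy illustration; the three member "models" are hypothetical stand-ins for anything you might have fitted). Averaging the members' predictions often cancels out their individual biases:

```python
# Toy ensemble sketch: combine hypothetical models by averaging their predictions.
def model_a(x):
    return 2.0 * x          # e.g., a fitted linear model

def model_b(x):
    return 2.0 * x + 0.6    # a model with a constant upward bias

def model_c(x):
    return 2.0 * x - 0.6    # a model biased the other way

def ensemble(models, x):
    """Average the predictions of all member models for input x."""
    preds = [m(x) for m in models]
    return sum(preds) / len(preds)

models = [model_a, model_b, model_c]
print(ensemble(models, 3.0))  # the two opposite biases cancel out
```

Real ensembles (bagging, boosting, stacking) use weighted or learned combinations rather than a plain mean, but the principle of pooling complementary models is the same.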
Machine Learning and AI
Many people conflate ML with AI these days. This confusion is partly because many ML models involve artificial neural networks (ANNs) which are the most modern manifestation of AI. Also, many AI systems are employed in ML tasks, so they are referred to as ML systems since AI can be a bit generic as a term. However, not all ML algorithms are AI-related, nor are all AI algorithms under the ML umbrella. This distinction is of import because certain limitations of AI systems (e.g., the need for lots and lots of data) don't apply to most ML models, while AI systems tend to be more time-consuming and resource-heavy than the average ML one. There are several ML algorithms you can use without breaking the bank and derive value from your data through them. Then, if you find that you need something better, in terms of accuracy, you can explore AI-based ones. Keep in mind, however, that some ML models (e.g., Decision Trees, Random Forests, etc.) offer some transparency, while the vast majority of AI ones are black boxes.
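To hint at the transparency point above (a minimal sketch, not a production tree learner): even a single decision stump, the one-split building block of Decision Trees and Random Forests, yields a rule you can read in plain language, something a black-box neural network cannot offer. The data here is invented for illustration:

```python
# Minimal decision-stump sketch: a one-rule "tree" whose logic is fully readable.
def fit_stump(samples, labels):
    """Pick the threshold on a 1-D feature that best separates two classes."""
    best = None
    for threshold in sorted(set(samples)):
        # Count how many samples the rule "predict True if x >= threshold" gets right.
        correct = sum(
            (x >= threshold) == label for x, label in zip(samples, labels)
        )
        if best is None or correct > best[1]:
            best = (threshold, correct)
    return best[0]

feature = [150, 160, 170, 180, 190]         # toy 1-D feature values
labels  = [False, False, True, True, True]  # toy binary target
t = fit_stump(feature, labels)
print(f"Rule: predict True if feature >= {t}")  # the whole model is one readable rule
```

Full tree learners split recursively and use impurity measures rather than raw accuracy, but every internal node remains an inspectable threshold rule like this one, which is exactly the transparency advantage over most AI models.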
Learning more about the topic
Naturally, it's hard to do this topic justice in a single article. It is so vast that someone could write a book on it! That's what I did earlier this year, through the Technics Publications publishing house. You can learn more about this topic via this book, which is titled Julia for Machine Learning (Julia is a modern programming language used in data science, among other fields, and it's popular among various technical professionals). Feel free to check it out and explore how you can use ML in your work. Cheers!