Article | September 2, 2021
Massive amounts of data are collected and stored by companies in the search for the “Holy Grail”. One crucial component is the discovery and application of novel approaches that yield a more complete picture of datasets than the local (sometimes global) event-based analytic strategies that currently dominate a given field.
Bringing qualitative data to life is essential, since it provides context and nuance for management decisions. An NLP perspective for uncovering word-based themes across documents facilitates the exploration and exploitation of qualitative data, which is often hard to “identify” in a global setting. NLP can be used to perform different kinds of analysis for mapping drivers.
Broadly speaking, drivers are factors that cause change and affect institutions, policies and management decision making. More precisely, a “driver” is a force that has a material impact on a specific activity or entity, that is contextually dependent, and that affects the financial market at a specific time (Litterio, 2018). Major drivers often lie outside the immediate institutional environment, such as elections or regional upheavals, or stem from non-institutional factors such as Covid or climate change. In Total global strategy: Managing for worldwide competitive advantage, Yip (1992) develops a framework based on a set of four industry globalization drivers, which highlights the conditions for a company to become more global while also reflecting differentials in a competitive environment. In The lexicons: NLP in the design of Market Drivers Lexicon in Spanish, I have proposed a categorization into micro drivers, macro drivers and temporality, and a distinction among social, political, economic and technological drivers. Considering the “big picture”, “digging” beyond the usual sectors and timeframes is key to state-of-the-art findings.
Working with qualitative data.
There is certainly no unique “recipe” when applying NLP strategies. Different pipelines can be used to analyse any sort of textual data, from social media posts and reviews to focus group notes, blog comments and transcripts, to name just a few, when a MetaQuant team is looking for drivers.
Generally, when textual data is the source, it is preferable to avoid manual tasks on the part of the analyst, though sometimes, depending on the domain, content, cultural variables, etc., they may be required. If qualitative data is the core, then the preferred format is .csv, because its plain nature typically handles written responses better. Once the data has been collected and exported, the next step is some pre-processing. The basics include normalisation, morphosyntactic analysis, sentence structure analysis, tokenization, lexicalization and contextualization. The goal is simply to simplify the data to make analysis easier.
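As a minimal sketch of this pre-processing stage (the stop-word list and the regular expression here are illustrative assumptions, not a prescribed standard), the basic normalisation and tokenization steps might look like this:

```python
import re

# Illustrative stop-word list; a real pipeline would use a fuller,
# language-specific list (e.g. from NLTK or spaCy).
STOPWORDS = {"the", "a", "an", "of", "and", "to", "is", "in"}

def preprocess(text):
    """Normalise and tokenize a raw response: lowercase, keep only
    alphabetic tokens, and drop stop words."""
    text = text.lower()
    tokens = re.findall(r"[a-z]+", text)
    return [t for t in tokens if t not in STOPWORDS]

print(preprocess("The market drivers are shifting in 2021!"))
# ['market', 'drivers', 'are', 'shifting']
```

In a real project, lemmatization and morphosyntactic tagging would follow, typically with a dedicated library rather than hand-rolled rules.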
Topic modelling refers to the task of recognizing the words from the main topics that best describe a document or a corpus of data. LDA (Latent Dirichlet Allocation) is one of the most powerful algorithms, with excellent implementations in Python’s Gensim package.
The challenge: how to extract good-quality topics that are clear and meaningful. Of course, this depends mostly on the nature of the text pre-processing, the strategy for finding the optimal number of topics, and the creation of the lexicon(s) and the corpora. We can say that a topic is defined or construed around its most representative keywords. But are keywords enough? Well, there are some other factors to be observed, such as:
1. The variety of topics included in the corpora.
2. The choice of topic modelling algorithm.
3. The number of topics fed to the algorithm.
4. The algorithm’s tuning parameters.
As you have probably noticed, finding “the needle in the haystack” is not that easy. And only those who can use NLP creatively will have the advantage of positioning themselves for global success.
Article | September 2, 2021
Homeless policy needs to join the big data revolution. A data tsunami is transforming our world. Ninety percent of existing data was created in the last two years, and Silicon Valley is leveraging it with powerful analytics to create self-driving cars and to revolutionize business decision-making in ways that drive innovation and efficiency.

Unfortunately, this revolution has yet to help the homeless. It is not due to a lack of data. Sacramento alone maintains data on half a million service interactions with more than 65,000 homeless individuals. California is considering integrating the data from its 44 continuums of care to create a richer pool of data. Additionally, researchers are uncovering troves of relevant information in educational and social service databases.

These data, however, are only useful if they are aggressively mined for insights, looking for problems to solve and successful practices to replicate. At that juncture, California falls short.
Article | September 2, 2021
If you want an explicit answer without having to know the extra details, then here it is: Yes, there is a possibility that quantum computers can replace supercomputers in the field of high performance computing, under certain conditions.
Now, if you want to know how and why this scenario is a possibility and what those conditions are, I’d encourage you to peruse the rest of this article. To start, we will run through some very simple definitions.
If you work in the IT sector, you have probably heard the terms ‘high performance computing’, ‘supercomputers’ and ‘quantum computers’ many times. These words are thrown around quite often nowadays, especially in the areas of data science and artificial intelligence. Perhaps you have deduced their meanings from the context in which they are used, but you may not have had the opportunity to explicitly sit down and research what they are and why they are used. Therefore, it is a good idea to go through their definitions, so that you have a better understanding of each concept.
High Performance Computing: It is the process of carrying out complex calculations and computations on data at a very high speed. It is much faster than regular computing.
Supercomputer: It is a type of computer that is used to efficiently perform powerful and quick computations.
Quantum Computer: It is a type of computer that makes use of quantum mechanical concepts like entanglement and superposition in order to carry out powerful computations.
Now that you’ve gotten the gist of these concepts, let’s dive in a little more to get a wider scope of how they are implemented throughout the world.
High performance computing is a thriving area in the sector of information technology, and rightly so, due to the rapid surge in the amount of data that is produced, stored, and processed every second. Over the last few decades, data has become increasingly significant to large corporations, small businesses, and individuals, as a result of its tremendous potential in their growth and profit. By properly analysing data, it is possible to make beneficial predictions and determine optimal strategies.
The challenge is that there are huge amounts of data being generated every day. If traditional computers are used to manage and compute all of this data, the outcome would take an irrationally long time to be produced. Massive amounts of resources like time, computational power, and expenses would also be required in order to effectuate such computations.
Supercomputers were therefore introduced into the field of technology to tackle this issue. These computers facilitate the computation of huge quantities of data at much higher speeds than a regular computer. They are a great investment for businesses that require data to be processed often and in large amounts at a time. The main advantage of supercomputers is that they can do what regular computers need to do, but much more quickly and efficiently. They have an overall high level of performance.
Till date, they have been applied in the following domains:
• Nuclear Weapon Design
• Medical Diagnosis
• Weather Forecasting
• Online Gaming
• Study of Subatomic Particles
• Tackling the COVID-19 Pandemic
Quantum computers, on the other hand, use a completely different principle when functioning. Unlike regular computers that use bits as the smallest units of data, quantum computers generate and manipulate ‘qubits’ or ‘quantum bits’, which are subatomic particles like electrons or photons. These qubits have two interesting quantum properties which allow them to powerfully compute data –
• Superposition: Qubits, like regular computer bits, can be in a state of 1 or 0. However, they also have the ability to be in both states of 1 and 0 simultaneously. This combined state allows quantum computers to calculate a large number of possible outcomes, all at once. When the final outcome is determined, the qubits fall back into a state of either 1 or 0. This property is called superposition.
• Entanglement: Pairs of qubits can exist in such a way that two members of a pair of qubits exist in a single quantum state. In such a situation, changing the state of one of the qubits can instantly change the state of the other qubit. This property is called entanglement.
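The superposition property above can be illustrated with a tiny state-vector simulation. This is a toy sketch under stated assumptions (a single qubit represented by just two amplitudes; the function names are our own, not from any quantum library), showing how the Hadamard gate puts a definite state into an equal superposition:

```python
import math

# A single-qubit state as amplitudes (alpha, beta) for |0> and |1>.
zero = (1.0, 0.0)

def hadamard(state):
    """Apply the Hadamard gate, which maps a basis state into an
    equal superposition of |0> and |1>."""
    a, b = state
    s = 1 / math.sqrt(2)
    return (s * (a + b), s * (a - b))

def probabilities(state):
    """Born rule: measurement probabilities are the squared
    magnitudes of the amplitudes."""
    a, b = state
    return (abs(a) ** 2, abs(b) ** 2)

plus = hadamard(zero)       # equal superposition of |0> and |1>
print(probabilities(plus))  # ~ (0.5, 0.5): either outcome is equally likely
```

On measurement, the superposition collapses and exactly one of the two outcomes is observed, with the probabilities computed above.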
Their most promising applications so far include:
• Drug Designing
• Financial Modelling
• Weather Forecasting
• Artificial Intelligence
• Workforce Management
Despite their distinct features, both supercomputers and quantum computers are immensely capable of providing users with strong computing facilities. The question is, how do we know which type of system would be the best for high performance computing?
High performance computing requires robust machines that can deal with large amounts of data - This involves the collection, storage, manipulation, computation, and exchange of data in order to derive insights that are beneficial to the user. Supercomputers have successfully been used so far for such operations.
When the concept of a quantum computer first came about, it caused quite a revolution within the scientific community. People recognised its innumerable and widespread abilities, and began working on ways to convert this theoretical innovation into a realistic breakthrough.
What makes a quantum computer so different from a supercomputer? Let’s have a look at Table 1.1 below.
From the table, we can draw the following conclusions about supercomputers and quantum computers -
1. Supercomputers have been around for a longer duration of time, and are therefore more advanced. Quantum computers are relatively new and still require a great depth of research to sufficiently understand how they work and to develop a sustainable system.
2. Supercomputers are easier to provide inputs to, while quantum computers need a different input mechanism.
3. Supercomputers are fast, but quantum computers are much faster.
4. Supercomputers and quantum computers have some similar applications.
5. Quantum computers can be perceived as extremely powerful and highly advanced supercomputers.
Thus, we find that while supercomputers surpass quantum computers in terms of development and span of existence, quantum computers are comparatively much better in terms of capability and performance.
We have seen what supercomputers and quantum computers are, and how they can be applied in real-world scenarios, particularly in the field of high performance computing. We have also gone through their differences and made significant observations in this regard.
We find that although supercomputers have worked well so far, and continue to provide substantial computing power to researchers, organisations, and individuals who need to process enormous amounts of data quickly, quantum computers have the potential to perform much better and deliver faster, more adequate results.
Thus, quantum computers can potentially make supercomputers obsolete, especially in the field of high performance computing, if and only if researchers are able to come up with a way to make the development, deployment, and maintenance of these computers scalable, feasible, and optimal for consumers.
Article | September 2, 2021
Primarily, the IoT stack is going beyond merely ingesting data to data analytics and management, with a focus on real-time analysis and autonomous AI capacities. Enterprises are finding more advanced ways to apply IoT for better and more profitable outcomes. IoT platforms have evolved to use standard open-source protocols and components. Now enterprises are primarily focusing on resolving business problems such as predictive maintenance or the use of smart devices to streamline business operations. Platforms focus on similar things, but early attempts at creating highly discrete solutions around specific use cases, in place of broad platforms, have been successful. That means more vendors offer more choices for customers, broadening the chances of success. Clearly, IoT platforms sit at the heart of value creation in the IoT.