BIG DATA ANALYSIS AND ITS NEED FOR EFFECTIVE E-GOVERNANCE

September 27, 2017

Big Data refers to data sets so large and complex that traditional data processing tools and technologies cannot cope with them. The process of examining such data to uncover hidden patterns is referred to as Big Data Analytics. Data is growing rapidly, and analyzing it with various mining techniques yields valuable results that offer better insight into the future. This paper focuses on the impact of big data analysis on e-governance.
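To make the idea concrete, here is a minimal sketch of the kind of analysis this could involve in an e-governance setting: aggregating citizen service requests to surface patterns across districts. The file name, schema and metrics are hypothetical placeholders, and PySpark is used only as one example of a tool designed for data too large for traditional processing.

from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("egov-service-analysis").getOrCreate()

# Hypothetical log of citizen service requests; in practice this would be far
# larger than a single machine could comfortably process.
requests = spark.read.csv("citizen_service_requests.csv",
                          header=True, inferSchema=True)

# One "hidden pattern" to uncover: which districts and services resolve slowest.
summary = (
    requests.groupBy("district", "service_type")
            .agg(F.count("*").alias("requests"),
                 F.avg("days_to_resolve").alias("avg_days_to_resolve"))
            .orderBy(F.desc("avg_days_to_resolve"))
)
summary.show(10)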

Spotlight

Technica Corporation

Technica has been providing technology integration, professional services, products, and innovative technology solutions to defense, intelligence, law enforcement, and civilian agencies since 1991. We specialize in network operations and infrastructure; cyber defense and security; government application integration; software development and support; systems engineering and training; and product deployment planning and support. Our research and development department provides customer-tailored, budget-sensitive solutions in emerging technologies such as big data analytics. As an experienced systems integrator, we serve as a trusted advisor to help you navigate the often complex government buying process and procure the services your agency needs…

OTHER ARTICLES

Exploiting IoT Data Analytics for Business Success

Article | January 21, 2021

The Internet of Things has been the hype of the past few years and is set to play an important role across industries. Not only businesses but also consumers are trying to keep up with the developments that come with connected devices. Smart meters, sensors, and manufacturing equipment can all remodel the way companies work. According to Statista, the IoT market, valued at 248 billion US dollars in 2020, is expected to reach 1.6 trillion US dollars by 2025. The global market supports IoT development and its power to drive economic growth. But the success of IoT is impossible without the integration of data analytics. The major growth component of IoT is the blend of IoT and Big Data, together known as IoT Data Analytics.

Understanding IoT Data Analytics

IoT Data Analytics is the analysis of the large volumes of data gathered from connected devices. Because IoT devices generate a great deal of data even in the shortest period, analyzing such enormous volumes becomes complex. IoT data is quite similar to big data, but differs in its size and number of sources. To overcome the difficulty of IoT data integration, IoT data analytics is the best solution. With this combination, the process of data analysis becomes cost-effective, easier, and faster.

Why Will Data Analytics and IoT Be Indispensable?

Data analytics is an important part of the success of IoT investments and applications. IoT, along with data analytics, will allow businesses to make efficient use of their datasets. How? Let’s get into it!

Impelling Revenue

By using data analytics in IoT investments, businesses gain insight into customer behavior and can craft offers and services accordingly. As a result, companies see a rise in their profits and revenue.

Volume

The vast data sets used by IoT applications need to be organized and analyzed to uncover patterns. This can easily be achieved with IoT analytics software.

Competitive Advantage

In an era full of IoT devices and applications, competition has also increased. You can gain a competitive advantage by hiring developers who can help with IoT analytics implementations, which assists businesses in providing better services and standing out from the competition.

Now the next question arises: where is it being implemented? Companies such as Amazon, Microsoft, Siemens, VMware, and Huawei use IoT data analytics for product usage analysis, sensor data analysis, camera data analysis, improved equipment maintenance, and operations optimization.

The Rise of IoT Data Analytics

With the help of IoT Data Analytics, companies can obtain more information that can be used to improve their overall performance and revenue. Although it has not yet reached every corner of the market, it is already being used to make workplaces more efficient and safe. The ability to analyze and predict data in real time is a game-changer for companies that need all of their equipment working efficiently all the time, and it continues to grow, providing insights that were never possible before.
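As a rough illustration of what IoT data analytics can look like in practice, the sketch below flags temperature readings from connected devices that drift far from each device's recent baseline. The device IDs, readings, window size and threshold are hypothetical placeholders, and pandas stands in for whatever analytics stack a real deployment would use.

import pandas as pd

# Hypothetical export of temperature readings from connected devices.
readings = pd.DataFrame({
    "device_id": ["m-01", "m-01", "m-01", "m-02", "m-02", "m-02"],
    "timestamp": pd.to_datetime([
        "2021-01-20 08:00", "2021-01-20 08:05", "2021-01-20 08:10",
        "2021-01-20 08:00", "2021-01-20 08:05", "2021-01-20 08:10",
    ]),
    "temperature": [21.3, 21.6, 35.9, 20.8, 21.1, 21.0],
})

# Per-device rolling baseline (window size is an assumption).
readings["baseline"] = (
    readings.sort_values("timestamp")
            .groupby("device_id")["temperature"]
            .transform(lambda s: s.rolling(window=3, min_periods=1).mean())
)

# Flag readings more than 5 degrees away from the recent baseline.
readings["anomaly"] = (readings["temperature"] - readings["baseline"]).abs() > 5.0
print(readings[readings["anomaly"]])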

Read More

Rethinking and Recontextualizing Context(s) in Natural Language Processing

Article | June 10, 2021

We discursive creatures are construed within a meaningful, bounded communicative environment, namely context(s), and not in a vacuum. Context(s) co-occur in different scenarios, that is, in mundane talk as well as in academic discourse, where the goal of natural language communication is mutual intelligibility and hence the negotiation of meaning. Discursive research focuses on the context-sensitive use of the linguistic code and its social practice in particular settings, such as medical talk, courtroom interactions, and financial/economic and political discourse, which may restrict its validity when ascribing to a theoretical framework and its propositions regarding its application. This is also reflected in artificial intelligence approaches to context(s), such as the development of context-sensitive parsers, context-sensitive translation machines and context-sensitive information systems, where the validity of an argument and its propositions is at stake.

Context is at the heart of pragmatics, or better said, context is the anchor of any pragmatic theory: sociopragmatics, discourse analysis and ethnomethodological conversation analysis. Academic disciplines such as linguistics, philosophy, anthropology, psychology and literary theory have also studied various aspects of the context phenomenon. Yet the concept of context has remained fuzzy or is generally undefined. It seems that the denotation of the word [context] has become murkier as its uses have been extended in many directions.

Context or/and contexts?

In order to be "felicitously" integrated into the pragmatic construct, the definition of context needs some delimitation. Depending on the frame of research, context is delimited to the global surroundings of the phenomenon to be investigated: if its surroundings are of an extra-linguistic nature, it is called the socio-cultural context; if it comprises features of a speech situation, it is called the linguistic context; and if it refers to cognitive material, that is, a mental representation, it is called the cognitive context.

Context is a transcendental notion which plays a key role in interpretation. Language is no longer considered a set of decontextualized sentences; instead, language is seen as embedded in larger activities, through which it becomes meaningful. In a dynamic outlook on communication, the acts of speaking (which generate a form of discourse, for instance conversational discourse, a lecture or a speech) and interpreting build contexts and at the same time constrain the building of such contexts. In Heritage’s terminology, “the production of talk is doubly contextual” (Heritage 1984: 242). An utterance relies upon the existing context for its production and interpretation, and it is, in its own right, an event that shapes a new context for the action that will follow.

A linguistic context can be decontextualized at a local level, and it can be recontextualized at a global level. There is intra-discursive recontextualization anchored to local decontextualization, and there is interdiscursive recontextualization anchored to global recontextualization. “A given context not only 'legislates' the interpretation of indexical elements; indexical elements can also mold the background of the context” (Ochs, 1990). In the case of recontextualization, in a particular scenario, it is valid to ask “what do you mean?” or “how do you mean?”.
Making a reference to context and a reference to meaning helps to clarify matters when there is a controversy about communicative status, and at the same time provides a frame for recontextualization. A linguistic context is intrinsically linked to a social context and to a subcategory of the latter, the socio-cultural context. The social context can be considered unmarked, hence a default context, whereas a socio-cultural context can be conceived as a marked type of context in which specific variables are interpreted in a particular mode. Culture provides us, the participants, with a filter mechanism which allows us to interpret a social context in accordance with particular socio-cultural constraints and requirements. Besides, the socially constitutive qualities of context are unavoidable, since each interaction updates the existing context and prepares new ground for subsequent interaction.

Now, how are these conceptualizations and views reflected in NLP? Most of the research work has focused on the linguistic context, that is, on word-level surroundings and lexical meaning. One approach produces sense embeddings for the lexical meanings within a lexical knowledge base, embeddings which lie in a space comparable to that of contextualized word vectors. Contextualized word embeddings have been used effectively across several tasks in Natural Language Processing, as they have proved to carry useful semantic information. The task of associating a word in context with the most suitable meaning from a predefined sense inventory is better known as Word Sense Disambiguation (Navigli, 2009).

Linguistically speaking, “context encompasses the total linguistic and non-linguistic background of a text” (Crystal, 1991). Notice that the nature of context(s) is clearly crucial when reconstructing the meaning of a text. Therefore, “meaning-in-context should be regarded as a probabilistic weighting of the list of potential meanings available to the user of the language.” The so-called disambiguating role of context should be taken with a pinch of salt. The main reason language models such as BERT (Devlin et al., 2019), RoBERTa (Liu et al., 2019) and SBERT (Reimers, 2019) have proved beneficial in most NLP tasks is that contextualized embeddings of words encode the semantics defined by their input context. In the same vein, a novel method for contextualized sense representations has recently been employed: SensEmBERT (Scarlini et al., 2020), which computes sense representations that can be applied directly to disambiguation.

Still, there is a long way to go in context(s) research. The linguistic context is just one of the necessary conditions for a sentence’s embeddedness in “a” context. For interpretation to take place, we need well-formed sentences and constructions, that is, linguistic strings which must be grammatical but may be constrained by cognitive sentence-processability and pragmatic relevance, as well as the particular linguistic-context and social-context configurations that make their production and interpretation meaningful.
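The disambiguating effect of linguistic context described above can be observed directly in contextualized embeddings. The sketch below, which uses the publicly available bert-base-uncased checkpoint via the Hugging Face transformers library, compares the vector of the word "bank" in two financial contexts with its vector in a riverside context. The sentences are illustrative, not drawn from any sense-annotated corpus, and this is only the underlying intuition behind approaches like SensEmBERT, not that method itself.

import torch
from transformers import AutoModel, AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("bert-base-uncased")
model = AutoModel.from_pretrained("bert-base-uncased")
model.eval()

def word_vector(sentence: str, word: str) -> torch.Tensor:
    """Return the contextualized vector of the first occurrence of `word`."""
    enc = tokenizer(sentence, return_tensors="pt")
    with torch.no_grad():
        hidden = model(**enc).last_hidden_state[0]   # (tokens, hidden_size)
    tokens = tokenizer.convert_ids_to_tokens(enc["input_ids"][0].tolist())
    return hidden[tokens.index(word)]

river   = word_vector("she sat on the bank of the river", "bank")
deposit = word_vector("he opened an account at the bank", "bank")
loan    = word_vector("the bank approved the loan", "bank")

cos = torch.nn.functional.cosine_similarity
# The two financial uses should be closer to each other than to the river use.
print(cos(deposit, loan, dim=0).item(), cos(deposit, river, dim=0).item())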

Read More

Self-supervised learning: The plan to make deep learning data-efficient

Article | March 23, 2020

Despite the huge contributions of deep learning to the field of artificial intelligence, there’s something very wrong with it: it requires huge amounts of data. This is one thing that both the pioneers and critics of deep learning agree on. In fact, deep learning didn’t emerge as the leading AI technique until a few years ago because of the limited availability of useful data and the shortage of computing power to process that data. Reducing the data-dependency of deep learning is currently among the top priorities of AI researchers.
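Self-supervised learning, the article's subject, derives the training signal from the data itself rather than from human labels. The sketch below shows a toy masked-reconstruction pretext task in PyTorch; the dimensions, masking ratio and random data are placeholders meant only to illustrate the principle, not any particular published method.

import torch
from torch import nn

encoder = nn.Sequential(nn.Linear(64, 32), nn.ReLU())
decoder = nn.Sequential(nn.Linear(32, 64))
optimizer = torch.optim.Adam(
    list(encoder.parameters()) + list(decoder.parameters()), lr=1e-3)

x = torch.randn(256, 64)                       # stand-in for unlabeled data

for step in range(100):
    mask = (torch.rand_like(x) > 0.3).float()  # hide roughly 30% of each input
    recon = decoder(encoder(x * mask))
    loss = ((recon - x) ** 2 * (1 - mask)).mean()  # score only the hidden parts
    optimizer.zero_grad()
    loss.backward()
    optimizer.step()

# The pretrained `encoder` can then be fine-tuned on a small labeled set.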

Read More

DATA CENTRE MARKET EXPECTED TO ACCELERATE OWING TO INCREASING CLOUD DEMAND

Article | February 28, 2020

An enormous amount of data is generated daily through various mediums, and storage has become a great concern for organizations. Currently, two significant styles of data storage are available: the cloud and the data centre. The main difference between the two is that a data centre refers to on-premise hardware, while the cloud refers to off-premise computing. The cloud stores data in the public cloud, while a data centre stores data on the company’s own hardware. Many businesses are turning to the cloud. In fact, Gartner, Inc. projected that the worldwide public cloud services market would grow 17.5 percent in 2019 to total US$214.3 billion. For many businesses, utilizing the cloud makes sense, while in many other cases having an in-house data centre is the better option. Maintaining an in-house data centre is often expensive, but it can be beneficial to have total control over the computing environment.
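The cost trade-off mentioned above can be made concrete with a back-of-the-envelope comparison of cumulative spend. All figures in this sketch are invented placeholders, not vendor pricing, and a real decision would also weigh compliance, latency, and control requirements.

# Hypothetical monthly cloud spend vs. in-house data centre capex plus opex.
cloud_monthly = 9_000            # pay-as-you-go managed cloud services
dc_upfront    = 250_000          # in-house hardware and build-out
dc_monthly    = 3_000            # power, cooling, staff, maintenance

for months in (12, 24, 36, 48, 60):
    cloud_total = cloud_monthly * months
    dc_total    = dc_upfront + dc_monthly * months
    cheaper = "cloud" if cloud_total < dc_total else "data centre"
    print(f"{months:>2} months: cloud ${cloud_total:,} vs DC ${dc_total:,} -> {cheaper}")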

Read More

