Data Replication in Hadoop

June 6, 2018

In October 2017, Hortonworks launched DataPlane Service, a portfolio of solutions that enables businesses to manage, secure, and govern data spread across on-premises and in-cloud data lakes. Data Lifecycle Manager (DLM) was the first extensible service built on the DataPlane platform. We recently sat down with Niru Anisetti, Principal Product Manager, to talk about DLM. Prior to joining Hortonworks, Niru was Program Director in the product management team for Spark services at IBM.

Traditionally, Apache Hadoop has been associated with data storage and compute. Do you think there is an awareness of data replication in the Hadoop space?

You are absolutely right that Apache Hadoop was associated with big data storage and batch compute in the past, and that is still true in many cases. What has significantly changed is where and how the data is consumed. DLM helps customers by moving data to where their business applications run, whether that is in the cloud or in a specific data center in an EU region to comply with GDPR.

Data replication, backup, and restore are fairly mature technologies. Why has Hortonworks decided to enter this market?

As noted in Gartner's July 2017 Magic Quadrant for Disaster Recovery as a Service report, "… DRaaS is now a mainstream offering." Gartner estimated "it to be a $2.02 billion business currently, and it is expected to reach $3.73 billion by 2021."
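The interview does not go into DLM's internals, but for readers who want a concrete picture of what cross-cluster replication in Hadoop can look like, the stock tool most Hadoop deployments reach for is DistCp. The sketch below is a minimal, hypothetical wrapper around it (the NameNode URIs and paths are invented, and it assumes the Hadoop CLI is installed on the local host); it illustrates the general technique, not how DLM itself works.

```python
import subprocess

# Hypothetical source and target clusters -- replace with real NameNode URIs and paths.
SRC = "hdfs://onprem-nn:8020/data/sales/2018"
DST = "hdfs://cloud-nn:8020/backups/sales/2018"

def replicate(src: str, dst: str) -> None:
    """Copy a dataset between clusters with DistCp, preserving file attributes (-p)
    and only transferring files that changed since the last run (-update)."""
    cmd = ["hadoop", "distcp", "-update", "-p", src, dst]
    subprocess.run(cmd, check=True)  # requires the Hadoop CLI on this machine

if __name__ == "__main__":
    replicate(SRC, DST)
```

In practice a job like this would be scheduled from a workflow manager so the target cluster is kept continuously in sync with the source.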

Spotlight

Q4 Inc

Q4 is a leading global provider of cloud-based investor relations and capital market solutions. Q4 empowers customers to be leaders in IR through innovative technology and exceptional customer service. Our comprehensive portfolio of IR solutions, including quantitative and real-time shareholder analytics, IR desktop, websites, and webcasting arm industry professionals with the tools and insights required to run award-winning IR programs, make effective business decisions, and better engage with the street.

OTHER ARTICLES

Will Quantum Computers Make Supercomputers Obsolete in the Field of High Performance Computing?

Article | May 12, 2021

If you want an explicit answer without having to know the extra details, then here it is: yes, there is a possibility that quantum computers can replace supercomputers in the field of high performance computing, under certain conditions. If you want to know how and why this scenario is a possibility, and what those conditions are, I'd encourage you to peruse the rest of this article. To start, we will run through some very simple definitions.

Definitions

If you work in the IT sector, you have probably heard the terms 'high performance computing', 'supercomputer', and 'quantum computer' many times. These words are thrown around quite often nowadays, especially in the areas of data science and artificial intelligence. Perhaps you have deduced their meanings from the context in which they are used, but you may not have had the opportunity to explicitly sit down and do the research on what they are and why they are used. It is therefore a good idea to go through their definitions, so that you have a better understanding of each concept.

High Performance Computing: The process of carrying out complex calculations and computations on data at very high speed. It is much faster than regular computing.

Supercomputer: A type of computer that is used to perform powerful and quick computations efficiently.

Quantum Computer: A type of computer that makes use of quantum-mechanical concepts like entanglement and superposition in order to carry out powerful computations.

Now that you've gotten the gist of these concepts, let's dive in a little more to get a wider scope of how they are implemented throughout the world.

Background

High performance computing is a thriving area of information technology, and rightly so, due to the rapid surge in the amount of data that is produced, stored, and processed every second. Over the last few decades, data has become increasingly significant to large corporations, small businesses, and individuals as a result of its tremendous potential for their growth and profit. By properly analysing data, it is possible to make beneficial predictions and determine optimal strategies.

The challenge is that there are huge amounts of data being generated every day. If traditional computers were used to manage and compute all of this data, the outcome would take an irrationally long time to produce. Massive amounts of resources such as time, computational power, and money would also be required to carry out such computations. Supercomputers were therefore introduced to tackle this issue. These computers facilitate the computation of huge quantities of data at much higher speeds than a regular computer, and they are a great investment for businesses that need data processed often and in large amounts at a time. The main advantage of supercomputers is that they can do what regular computers do, but much more quickly and efficiently; they have an overall high level of performance. To date, they have been applied in the following domains:

• Nuclear Weapon Design
• Cryptography
• Medical Diagnosis
• Weather Forecasting
• Online Gaming
• Study of Subatomic Particles
• Tackling the COVID-19 Pandemic

Quantum computers, on the other hand, work on a completely different principle.
Unlike regular computers, which use bits as the smallest units of data, quantum computers generate and manipulate 'qubits', or 'quantum bits', realised with subatomic particles such as electrons or photons. Qubits have two interesting quantum properties that allow them to compute powerfully (a small simulation illustrating both properties appears at the end of this article):

• Superposition: Like regular computer bits, qubits can be in a state of 1 or 0. However, they can also be in both states simultaneously. This combined state allows quantum computers to evaluate a large number of possible outcomes at once; when the final outcome is read out, each qubit falls back into a state of either 1 or 0. This property is called superposition.

• Entanglement: Pairs of qubits can exist in such a way that the two members of the pair share a single quantum state. In that situation, changing the state of one qubit instantly changes the state of the other. This property is called entanglement.

The most promising applications of quantum computers so far include:

• Cybersecurity
• Cryptography
• Drug Designing
• Financial Modelling
• Weather Forecasting
• Artificial Intelligence
• Workforce Management

Despite their distinct features, both supercomputers and quantum computers are immensely capable of providing users with strong computing facilities. The question is, how do we know which type of system would be best for high performance computing?

A Comparison

High performance computing requires robust machines that can deal with large amounts of data: collecting, storing, manipulating, computing, and exchanging data in order to derive insights that benefit the user. Supercomputers have been used successfully for such operations so far. When the concept of a quantum computer first came about, it caused quite a revolution within the scientific community. People recognised its innumerable and widespread abilities, and began working on ways to convert this theoretical innovation into a realistic breakthrough. What makes a quantum computer so different from a supercomputer? Let's have a look at Table 1.1 below. From the table, we can draw the following conclusions about supercomputers and quantum computers:

1. Supercomputers have been around for longer and are therefore more advanced. Quantum computers are relatively new and still require a great deal of research before we fully understand how they work and can build sustainable systems.
2. Supercomputers are easier to provide inputs to, while quantum computers need a different input mechanism.
3. Supercomputers are fast, but quantum computers are much faster.
4. Supercomputers and quantum computers share some applications.
5. Quantum computers can be perceived as extremely powerful and highly advanced supercomputers.

Thus, while supercomputers surpass quantum computers in terms of maturity and span of existence, quantum computers are comparatively much better in terms of capability and performance.

The Verdict

We have seen what supercomputers and quantum computers are, and how they can be applied in real-world scenarios, particularly in the field of high performance computing. We have also gone through their differences and made significant observations in this regard.
We find that although supercomputers have served us well so far, and they continue to provide substantial support to researchers, organisations, and individuals who need intense computational power to process enormous amounts of data quickly, quantum computers have the potential to perform much better and deliver faster, more adequate results. Thus, quantum computers can potentially make supercomputers obsolete, especially in the field of high performance computing, if and only if researchers can make the development, deployment, and maintenance of these computers scalable, feasible, and optimal for consumers.
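Superposition and entanglement are easiest to see in a tiny state-vector simulation. The sketch below is a plain NumPy illustration (not a real quantum device or any particular SDK): it puts one qubit into an equal superposition with a Hadamard gate, then entangles two qubits into a Bell state with a CNOT gate, and prints the resulting measurement probabilities.

```python
import numpy as np

# Single-qubit basis states |0> and |1>
zero = np.array([1.0, 0.0])
one = np.array([0.0, 1.0])

# Superposition: a Hadamard gate turns |0> into (|0> + |1>)/sqrt(2),
# so a measurement yields 0 or 1 with equal probability.
H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)
plus = H @ zero
print("P(0), P(1) after superposition:", np.abs(plus) ** 2)  # [0.5, 0.5]

# Entanglement: applying CNOT to (H|0>) tensor |0> gives the Bell state
# (|00> + |11>)/sqrt(2); the two qubits' measurement outcomes are perfectly correlated.
CNOT = np.array([[1, 0, 0, 0],
                 [0, 1, 0, 0],
                 [0, 0, 0, 1],
                 [0, 0, 1, 0]])
bell = CNOT @ np.kron(plus, zero)
print("P(00), P(01), P(10), P(11):", np.abs(bell) ** 2)  # [0.5, 0, 0, 0.5]
```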

Read More

How to Overcome Challenges in Adopting Data Analytics

Article | April 20, 2020

Achieving organizational success and making data-driven decisions in 2020 requires embracing tech tools like data analytics, but simply collecting, storing, and analysing data isn't enough. Real data-driven, measurable growth and development come with the establishment of a data-driven company culture. In this type of culture, the company actively uses data resources as a primary asset to make smart decisions and ensure future growth. Despite the rapid growth of analytic solutions, a recent Gartner survey revealed that almost 75% of organizations thought their analytics maturity had not reached a level that optimized business outcomes. Just like with any endeavor, your organization must have a planned strategy to achieve its analytical goals. Let's explore ways of overcoming common blockers, and elements used in successful analytics adoption strategies.

Table of Contents:
- AMM: Analytic Maturity Model
- What are the blockers to achieving strategy-driven analytics?
- What are the adoption strategies to achieve analytics success?
- Conclusion

AMM: Analytic Maturity Model

The Analytic Maturity Model (AMM) evaluates the analytic maturity of an organization. The model identifies the five stages an organization travels through to reach optimization. Organizations must implement the right tools, engage their team in proper training, and provide the management support necessary to generate predictable outcomes with their analytics. Based on the maturity of these processes, the AMM divides organizations into five maturity levels:

- Organizations that can build reports.
- Organizations that can build and deploy models.
- Organizations that have repeatable processes for building and deploying analytics.
- Organizations that have consistent enterprise-wide processes for analytics.
- Enterprises whose analytics is strategy driven.

READ MORE: EFFECTIVE STRATEGIES TO DEMOCRATIZE DATA SCIENCE IN YOUR ORGANIZATION

What are the blockers to achieving strategy-driven analytics?

- Missing an analytics strategy
- Analytics is not for everyone
- Data quality presents unique challenges
- Siloed data
- Changing the culture

What are the adoption strategies to achieve analytics success?

• Have you got a plan to achieve analytics success?

The strategy begins with business intelligence and moves toward advanced analytics. The approach differs based on the AMM level. The plan may address the strategy for a single year, or it may span three or more years; ideally it has milestones for what the team will do. Forming an analytics strategy can be expensive and time-consuming at the outset. While organizations are encouraged to seek projects that can generate quick wins, the truth is that it may be months before any actionable results are available. During this period, the management team is frantically diverting resources from other high-profile projects. If funds are tight, this situation alone may cause friction, and it may not be apparent to everyone how the changes are expected to help. Here are the elements of a successful analytics strategy:

• Keep the focus tied to tangible business outcomes

The strategy must support business goals first. With as few words as possible, your plan should outline what you intend to achieve, how to complete it, and a target date for completion of the plan. Companies may fail at this step because they mistake implementing a tool for having a strategy. To keep it relevant, tie it to customer-focused goals. The strategy must dig below the surface with the questions that it asks.
Instead of asking surface questions such as "How can we save money?", ask, "How can we improve the quality of the outcomes for our customers?" or "What would improve the productivity of each worker?" These questions are more specific and will get the results the business wants. You may need to use actual business cases from your organization to think through the questions.

• Select modern, multi-purpose tools

The organization should look for an enterprise tool that supports integrating data from various databases, spreadsheets, or even external web-based sources. Typically, organizations may have their data stored across multiple systems such as Salesforce, Oracle, and even Microsoft Access. The organization can move ahead more quickly when the relevant data is accessible in a single repository. With the data combined, analysts have one specific location to find reports and dashboards. The interface needs to be robust enough to show the data from multiple points of view, and it should allow future enhancements, such as when the organization makes the jump into data science.

Incorta's Data Analytics platform simplifies and processes data to provide meaningful information at speed that helps make informed decisions. Incorta is special in that it allows business users to ask the same complex and meaningful questions of their data that typically require many IT people and data scientists to get the answers they need to improve their line of business. At the digital pace of business today, that can mean millions of dollars for business leaders in finance, supply chain, or even marketing. Speed is a key differentiator for Incorta in that rarely has anyone been able to query billions of rows of data in seconds for a line-of-business owner.

- Tara Ryan, CMO, Incorta

Technology implementations take time. That should not stop you from starting in small areas of the company to look for quick wins. Typically, the customer-facing processes have areas where it is easier to collect data and show opportunities for improvement.

• Ensure staff readiness

If your current organization is not data literate, then you will need resources who understand how to analyze and use data for process improvement. It is possible to make data available and still have workers not realize what they can do with it. The senior leadership may also need training on how to use data and what data analytics makes possible.

• Start Small to Control Costs and Show Potential

If the leadership team questions the expense, consider doing a proof of concept that focuses on integrating the tools and data quickly and efficiently to show measurable success. The business may favor specific projects or initiatives to move the company forward over long-term enterprise transformations (Bean & Davenport, 2019). Keeping the project goals precise and directed helps control costs and improve the business. As said earlier, the strategy needs to answer deeper business questions.

Consider other ways to introduce analytics into the business. Use initiatives that target smaller areas of the company to build competencies. Provide an analytics sandbox with access to tools and training to encourage non-analytics workers (or citizen data scientists) to play with the data. One company formed a SWAT team including individuals from across the organization; the smaller team, with varied domain experience, was better able to drive results.
There are also other approaches to use; the key is to show immediate and desirable results that align with organizational goals.

• Treating poor data quality

What can you do about poor data quality at your company? Several solutions can help to improve productivity and reduce the financial impact of poor data quality in your organization:

• Create a team to set the proper objectives

Create a team who owns the data quality process. This is important to prove to yourself, and to anyone with whom you are discussing data, that you are serious about data quality. The size of the team is not as important as having members from the parts of the organization with the right impact on and knowledge of the process. When the team is set, make sure that they create a set of goals and objectives for data quality; to gauge performance, you need a set of metrics to measure it. After you create the proper team to govern your data quality, ensure that the team focuses on the data you need first. Everyone knows the rules of "good data in, good data out" and "bad data in, bad data out." To put this to work, make sure that your team knows the relevant business questions in progress across the various data projects, so that they focus on the data that supports those questions.

• Focus on the data you need now as the highest priority

Once you do that, you can look at the potential data quality issues associated with each of the relevant downstream business questions, and put the proper processes and data quality routines in place so that poor data quality has a low probability of continuing to affect that data. As you decide which data to focus on, remember that the key for innovators across industries is not the size of the data but having the right data (Wessel, 2016).

• Automate the process of data quality when data volumes grow too large

When data volumes become unwieldy and the quality becomes difficult to manage, automate the process. Many data quality tools on the market do a good job of removing the manual effort from the process; open source options include Talend and DataCleaner, and commercial products include offerings from DataFlux, Informatica, Alteryx, and Software AG (a tiny illustrative sketch of such automated checks appears at the end of this article). As you search for the right tool for you and your team, be aware that although the tools help with organization and automation, the right processes and knowledge of your company's data are paramount to success.

• Make the process of data quality repeatable

Remember that the process is not a one-time activity; it needs regular care and feeding. While good data quality can save you a lot of time, energy, and money downstream, it does take time, investment, and practice to do well. As you improve the quality of your data and the processes around that quality, look for other opportunities to avoid data quality mishaps.

• Beware of data that lives in separate databases

When data is stored in different databases, there can be issues with different terms being used for the same subject. The good news is that if you have followed the previous solutions, you should have more time to invest in looking for the best cases. As always, look for the opportunities with the biggest bang for the buck first.
You don't want to be answering questions from the steering committee about why you are looking for differences between "HR" and "Hr" if you haven't solved bigger issues, like knowing the difference between "Human Resources" and "Resources," for example.

• De-siloing data

The solution to removing data silos typically isn't some neatly packaged, off-the-shelf product. Attempts to quickly create a data lake by simply pouring all the siloed data together can result in an unusable mess, turning it into a data swamp. This is a process that must be done carefully to avoid confusion, liability, and error. Try to identify high-value opportunities and find the various data stores required to execute those projects. Working with various business groups to find business problems that are well suited to data science solutions, and then gathering the necessary data from the various data stores, can lead to high-visibility successes. As value is proved from joining disparate data sources together to create new insights, it becomes easier to get buy-in from upper levels to invest time and money into consolidating key data stores. In the first efforts, getting data from different areas may be akin to pulling teeth, but as with most things in life, the more you do it, the easier it gets. Once the wheels get moving on a few of these integration projects, make wide-scale integration the new focus. Many organizations at this stage appoint a Chief Analytics Officer (CAO) who helps increase collaboration between the IT and business units, ensuring their priorities are aligned. As you work to integrate the data, make sure that you don't inadvertently create a new "analytics silo." The final aim here is an integrated platform for your enterprise data.

• Education is essential

When nearly 45% of workers generally prefer the status quo over innovation, how do you encourage an organization to move forward? If the workers are not engaged, or see the program as merely the latest management trend, it may be tricky to convince them. Larger organizations may have a culture that is slow to change because of their size or outside forces.

There's also a culture shift required - moving from experience and knee-jerk reactions to immersion and exploration of rich insights and situational awareness.

- Walter Storm, Chief Data Scientist, Lockheed Martin

Some companies spend a year talking about an approved analytics tool before moving forward. That gives employees time to consider the change and to understand the new skill sets needed. Once the entire team embraces the change, the organization can move forward swiftly to convert existing data and reports into the new tool. In the end, the corporation is more successful, and the employees remain in alignment with the corporate strategy.

If using data to support decisions is a foreign concept to the organization, it's a smart idea to ensure the managers and workers have similar training. This training may involve everything from basic data literacy to selecting the right data for management presentations. However, it cannot stop at training; the leaders must then ask for the data, moving forward with requests that support the conclusions used to make critical decisions across the business. These methods make it easier to sell the idea and keep the organization's analytics strategy moving forward. Once senior leadership uses data to make decisions, everyone else will follow their lead. It is that simple.
Conclusion

The analytics maturity model serves as a useful framework for understanding where your organization currently stands regarding strategy, progress, and skill sets. Advancing along the various levels of the model will become increasingly imperative as early adopters of advanced analytics gain a competitive edge in their respective industries. Delay or failure to design and incorporate a clearly defined analytics strategy into an organization's existing plan will likely result in a significant missed opportunity.

READ MORE: BIG DATA ANALYTICS STRATEGIES ARE MATURING QUICKLY IN HEALTHCARE
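As a deliberately tiny illustration of what "automating the data quality process" can mean in practice, the hypothetical sketch below uses pandas to run a few repeatable checks (missing values, duplicate rows, a simple value-range rule) over a made-up customer extract. Dedicated tools such as Talend or DataCleaner go far beyond this, but the idea of encoding rules once and re-running them on every new extract is the same.

```python
import pandas as pd

# Hypothetical customer extract -- in practice this would come from a database or file.
df = pd.DataFrame({
    "customer_id": [1, 2, 2, 4],
    "country":     ["US", "DE", "DE", None],
    "age":         [34, 29, 29, 212],
})

def data_quality_report(frame: pd.DataFrame) -> dict:
    """Run repeatable quality checks and return counts, so the same rules
    can be applied every time a new extract arrives."""
    return {
        "rows": len(frame),
        "missing_values": int(frame.isna().sum().sum()),
        "duplicate_rows": int(frame.duplicated().sum()),
        "out_of_range_ages": int((~frame["age"].between(0, 120)).sum()),
    }

print(data_quality_report(df))
# {'rows': 4, 'missing_values': 1, 'duplicate_rows': 1, 'out_of_range_ages': 1}
```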

Read More

THE NOT-SO-DISTANT FUTURE OF WORK

Article | November 20, 2020

As smart machines, data, and algorithms usher in dramatic technological transformation, global reactions span from cautious optimism to doomsday scenarios. Widespread transformation, displacement, and disaggregation of labor markets is anticipated both globally and in countries like India, which expects a workforce of an estimated 600 million by 2022. Even today, we are witnessing the resurgence of 'hybrid' jobs, where distinctive human abilities are paired with data and algorithms, and 'super' jobs that involve deep tech.

Our historical response to such tectonic shifts and upheavals has been predictable so far: trepidation and uncertainty in the beginning, followed by a period of painful transition. Communities and nations that can sense and respond will be able to shape social, economic, and political order decisively. However, with general AI predicted to come of age by 2050-60, governments will need to frame effective policies to meet their obligations to their citizens. This involves the creation of a new social contract between the individual, enterprise, and state for an inclusive and equitable society.

The present age is marked by automation, augmentation, and amplification of human talent by transformative technologies. A typical career may go through 15-20 transitions, and given the gig economy, the shelf life of skills is rapidly shrinking. Many agree that over the next 30 years, the nature and the volume of jobs will be significantly redefined. So even though it is nearly impossible to gaze into a crystal ball 100 years ahead, one can take a shot at what jobs may emerge in the next 20-30 years given the present state. Here is a glimpse into the kind of technological changes the next generation might witness that will change the employment scenario:

RESTORATION OF BIODIVERSITY

Our biodiversity is shrinking frighteningly fast, for both flora and fauna. Extinct-species revivalists may be challenged with restoring and reintegrating pertinent elements back into the natural environment. Without biodiversity, humanity will perish.

PERSONALIZED HEALTHCARE

Medicine is rapidly getting personalized as genome sequencing becomes commonplace. Even today, Elon Musk's Neuralink is working on brain-machine interfaces, so you may soon be able to upload your brain onto a computer where it can be edited, transformed, and re-uploaded back into you. Anti-aging practitioners will be tasked with extending human life spans to ensure we stay productive late into our twilight years. Gene sequencers will help personalize treatments, and epigenetic therapists will manipulate gene expression to overcome disease and decay. Brain neurostimulation experts and augmentationists may be commonplace to ensure we are happier, healthier, and disease-free. In fact, happiness itself may get redefined as it shifts from the quality of our relationships to that of man-machine integration.

THE QUANTIFIED SELF

As more of the populace interacts and engages with a digitized world, digital rehabilitators will help you detox and regain your sense of self, which may become inseparably intertwined with smart machines and interfaces.

DATA-LED VALUE CREATION

Data is exploding at a torrid pace and becoming a source of value creation. While today's organizations are scrambling to create data lakes, future data centers will be entrusted with sourcing high-value data, securing rights to it, and even licensing it to others.
Data will increasingly create competitive asymmetries among organizations and nations. Data brokers will be the new intermediaries, and data detectives, analysts, monitors or watchers, auditors, and frackers will emerge as new-age roles. Since data and privacy issues are entwined, data regulators, ethicists, and trust professionals will thrive, and many new cyber laws will come into existence.

HEALING THE PLANET

As the world grapples with the specter of climate change, our focus on sustainability and clean energy will intensify. Our landfills are choked with both toxic and non-toxic waste; plastic alone takes almost 1,000 years to degrade, so landfill operators will use earthworm-like robots to help decompose waste and recoup precious recyclables. Nuclear fusion will emerge as a new source of clean energy, creating a broad gamut of engineers, designers, integrators, architects, and planners around it. We may even generate power in space. Since our oceans are infested with waste, many initiatives and roles will emerge around cleaning the marine environment to ensure natural habitat and food security.

TAMING THE GENOME

As technologies like CRISPR and prime editing mature, we may see a resurgence of biohackers and programmable healthcare. Our health and nutrition may be algorithmically managed. CRISPR-like advancements will need a swathe of engineers, technicians, auditors, and regulators for genetically engineered healthcare that may overcome a wide variety of diseases and extend life expectancy.

THE RISE OF BOTS

Humanoid and non-humanoid robots will need entire workforce ecosystems around them, spanning suppliers, programmers, operators, and maintenance experts to ethicists and UI designers. Smart-robot psychologists will have to counsel them and ensure they are safe and friendly. Regulators may grant varying levels of autonomy to robots.

DATA LOADS THE GUN, CREATIVITY FIRES THE TRIGGER

Today's deep-learning Generative Adversarial Networks (GANs) can create music like Mozart and paintings like Picasso. Such advancements will give birth to a wide array of AI-enhanced professionals: musicians, painters, authors, quantum programmers, cybersecurity experts, educators, and more.

FROM AUGMENTATION TO AUTONOMY

Autonomous driving is about to mature in the next few years and will extend to air and space travel. Safety will exceed human capabilities, and we may soon reach a state of diminishing returns where we employ fewer humans to prevent mishaps and unforeseen occurrences. This industry will need supportive command-center managers, traffic analyzers, fleet managers, and people to ensure the onboarding experience.

BLOCKCHAIN BECOMES PERVASIVE

Blockchain will create many jobs for its mainstream and derivative applications. Even though most of its present applications are in the financial services, supply chain, and asset management industries, its adoption and integration will soon be far more expansive. Engineers, designers, UI/UX experts, analysts, auditors, and regulators will be required to manage blockchain-related applications. With crypto being one of its better-known applications, many transaction specialists, miners, insurers, wealth managers, and regulators will be needed, and crypto exchanges will come under the purview of regulatory frameworks.

3D PRINTING TURNS GAME-CHANGER

Additive manufacturing, also popularly called 3D printing, will mature in its precision, capabilities, and market potential. Lab-grown, 3D-printed food will be part of our regular diet.
Transplantable organs will be generated using stem cell research and 3D printing. Amputees and the disabled will adopt 3D-printed limbs and prosthetics. Its applications for high-precision reconstructive surgery are already commonplace, and pills are being 3D printed as we speak. So again, we are looking at 3D-printing operators, material scientists, pharmacists, construction experts, and more.

THE COLONIZATION OF OUTER SPACE

Jeff Bezos's Blue Origin and Elon Musk's SpaceX signal a new horizon. As space tech enters a new trajectory, a new breed of commercial space pilots, mission planners, launch managers, cargo experts, ground crew, and experience designers will be required. Since we have already ravaged the limited resources of our planet, mankind will need to venture into asteroid mining for rare and precious metals. This will need scouts and surveyors, meteorologists, remote bot operators, remotely managed factories, and more.

THE HYPER-CONNECTED WORLD

By 2020, we already had anywhere between 50 and 75 billion connected devices. By 2040, this will likely swell to more than 100 trillion sensors spewing out a dizzying volume of real-time data ready for analytics and AI. A complete IoT system as we know it is aware, autonomous, and actionable, just like a self-driving car. Imagine the number of data modelers, sensor designers and installers, and signal architects and engineers that will be needed. Home automation will be pervasive, and smart medicines, implants, and wearables will be the norms of the day.

DRONES USHER IN DISRUPTION

Unmanned aerial and underwater drones are already becoming ubiquitous for applications in aerial surveillance, delivery, and security. Countries are awakening to their potential as well as the possibilities of misuse. Command centers, just like those for space travel, will manage them as countries rush to put regulatory frameworks in place. An army of designers, programmers, security experts, and traffic-flow optimizers will harness their true potential.

SHIELDING YOUR DATA

With data come cyber threats, data breaches, cyber warfare, cyber espionage, and a host of other issues. The more data-dependent and connected the world is, the bigger the problem of cybersecurity will be. The severity of the problem will increase manifold from current issues like phishing, spyware, malware, viruses and worms, ransomware, DoS/DDoS attacks, and hacktivism, and cybersecurity will indeed be big business. The problem is that threats are increasing 10X faster than investments in this space, and the interesting thing is that it is much more about audits, governance, policies, and compliance than about technology alone.

FOOD-TECH COMES OF AGE

As the world population grows to 9.7 billion people by 2050, cultured food and lab-grown meat will hit our tables to ensure food security. Entire food chains and value delivery networks will see unprecedented change. Agriculture will be transformed with robotics, IoT, and drones, and the food-tech sector will take off in a big way.

QUANTUM COMPUTING SOLVES INTRACTABLE PROBLEMS

Finally, while the list is very long, let's touch upon the advent of qubits, or quantum computing. With its ability to break the best encryption on the planet, the traditional asymmetric encryption, public key infrastructure, digital envelopes, and digital certificates in use today will be rendered useless. Bring in the quantum programmers, analysts, privacy and trust managers, health monitors, and more.
As we brace for the world that looms ahead of us, the biggest enabler, and one that will itself be transformed, will be Education 4.0. Education will cease to be a single phase in your life; lifelong interventions will be needed to adapt, impart, and shape the skills of individuals so that they are ready for the future of work. More power to the people!

Read More

The case for hybrid artificial intelligence

Article | March 4, 2020

Deep learning, the main innovation that has renewed interest in artificial intelligence in recent years, has helped solve many critical problems in computer vision, natural language processing, and speech recognition. However, as deep learning matures and moves from the peak of the hype cycle to its trough of disillusionment, it is becoming clear that it is missing some fundamental components.

Read More
