Want Agile BI? You Need These Capabilities

| April 10, 2018

The need for immediate access to data is a popular theme in business today. The most successful companies will be those that react quickly to changing environments. One of the biggest challenges organizations face today is ensuring that many different types of users get timely access to granular data so they can perform analytics on it. Gone are the days when day-old, aggregated data was sufficient for making critical business decisions. This is why the notion of agile business intelligence (BI) is becoming top of mind. Agile BI allows data consumers to analyze data at every level of detail without the delays typical of a traditional environment. Users don’t want to, or can’t, endure delays caused by slow processes or technology limitations. They also don’t want to have to spell out exactly what they need before they can get access to the data. They want the flexibility to explore a variety of data sets, including ones they did not anticipate needing, to get the insights that let them succeed in their jobs.

Spotlight

BuilDATAnalytics

BuilDATAnalytics (BDA) is a business intelligence company for the commercial construction industry. BDA provides field crews with its flagship, patent-pending software platform, CTBIM™. CTBIM captures real-time field activities, which enables the production of insightful analytics for project owners and contractors. Simply stated, CTBIM™ transforms the current way of capturing and managing information during pre-construction, construction, and post-construction into an integrated, streamlined, and more efficient process. Users of CTBIM™ realize fewer costly errors, reduced risk, more efficient use of time, and greater profits.

OTHER ARTICLES

THE NOT-SO-DISTANT FUTURE OF WORK

Article | November 20, 2020

As smart machines, data, and algorithms usher in dramatic technological transformation, global reactions span from cautious optimism to doomsday scenarios. Widespread transformation, displacement, and disaggregation of labor markets is anticipated both globally and in countries like India, which is expected to have a workforce of roughly 600 million by 2022. Even today, we are witnessing the resurgence of 'hybrid' jobs, where distinctive human abilities are paired with data and algorithms, and 'super' jobs that involve deep tech. Our historical response to such tectonic shifts and upheavals has been predictable so far: trepidation and uncertainty at the beginning, followed by a period of painful transition. Communities and nations that can sense and respond will be able to shape the social, economic, and political order decisively. However, with general AI predicted to come of age by 2050-60, governments will need to frame effective policies to meet their obligations to their citizens. This involves the creation of a new social contract between the individual, enterprise, and state for an inclusive and equitable society.

The present age is marked by automation, augmentation, and amplification of human talent by transformative technologies. A typical career may go through 15-20 transitions. And given the gig economy, the shelf-life of skills is rapidly shrinking. Many agree that over the next 30 years, the nature and the volume of jobs will be significantly redefined. So even though it is nearly impossible to gaze into a crystal ball 100 years out, one can take a shot at what jobs may emerge in the next 20-30 years given the present state. Here is a glimpse into the kind of technological changes the next generation might witness that will change the employment scenario:

RESTORATION OF BIODIVERSITY

Our biodiversity is shrinking frighteningly fast, for both flora and fauna. Extinct-species revivalists may be challenged with restoring and reintegrating pertinent elements back into the natural environment. Without biodiversity, humanity will perish.

PERSONALIZED HEALTHCARE

Medicine is rapidly getting personalized as genome sequencing becomes commonplace. Even today, Elon Musk's Neuralink is working on brain-machine interfaces. So you may soon be able to upload your brain onto a computer where it can be edited, transformed, and re-uploaded back into you. Anti-aging practitioners will be tasked with enhancing human life-spans to ensure we stay productive late into our twilight years. Gene sequencers will help personalize treatments, and epigenetic therapists will manipulate gene expression to overcome disease and decay. Brain neurostimulation experts and augmentationists may be commonplace to ensure we are happier, healthier, and disease-free. In fact, happiness itself may get redefined as it shifts from the quality of our relationships to that of man-machine integration.

THE QUANTIFIED SELF

As more of the populace interacts and engages with a digitized world, digital rehabilitators will help you detox and regain your sense of self, which may get inseparably intertwined with smart machines and interfaces.

DATA-LED VALUE CREATION

Data is exploding at a torrid pace and becoming a source of value creation. While today's organizations are scrambling to create data lakes, future data centers will be entrusted with sourcing high-value data, securing rights to it, and even licensing it to others.
Data will increasingly create competitive asymmetries among organizations and nations. Data brokers will be the new intermediaries, and data detectives, analysts, monitors or watchers, auditors, and frackers will emerge as new-age roles. Since data and privacy issues are entwined, data regulators, ethicists, and trust professionals will thrive. Many new cyber laws will come into existence.

HEALING THE PLANET

As the world grapples with the specter of climate change, our focus on sustainability and clean energy will intensify. Our landfills are choked with both toxic and non-toxic waste. Plastic alone takes almost 1,000 years to degrade, so landfill operators will use earthworm-like robots to help decompose waste and recoup precious recyclable material. Nuclear fusion will emerge as the new source of clean energy, creating a broad gamut of engineers, designers, integrators, architects, and planners around it. We may even generate power in space. Since our oceans are infested with waste, many initiatives and roles will emerge around cleaning the marine environment to ensure natural habitat and food security.

TAMING THE GENOME

As technologies like CRISPR and prime editing mature, we may see a resurgence of biohackers and programmable healthcare. Our health and nutrition may be algorithmically managed. CRISPR-like advancements will need a swathe of engineers, technicians, auditors, and regulators for genetically engineered health that may overcome a wide variety of diseases and extend life expectancy.

THE RISE OF BOTS

Humanoid and non-humanoid robots will need entire workforce ecosystems around them, spanning suppliers, programmers, operators, and maintenance experts as well as ethicists and UI designers. Smart-robot psychologists will have to counsel them and ensure they are safe and friendly. Regulators may grant varying levels of autonomy to robots.

DATA LOADS THE GUN, CREATIVITY FIRES THE TRIGGER

Today's deep-learning Generative Adversarial Networks (GANs) can create music like Mozart and paintings like Picasso. Such advancements will give birth to a wide array of AI-enhanced professionals: musicians, painters, authors, quantum programmers, cybersecurity experts, educators, and more.

FROM AUGMENTATION TO AUTONOMY

Autonomous driving is about to mature in the next few years and will extend to air and space travel. Safety will exceed human capabilities, and we may soon reach a state of diminishing returns where we employ fewer humans to prevent mishaps and unforeseen occurrences. This industry will need supporting command-center managers, traffic analyzers, fleet managers, and people to ensure a good onboarding experience.

BLOCKCHAIN BECOMES PERVASIVE

Blockchain will create many jobs for its mainstream and derivative applications. Even though most of its present applications are in the financial services, supply chain, and asset management industries, its adoption and integration will soon be far more expansive. Engineers, designers, UI/UX experts, analysts, auditors, and regulators will be required to manage blockchain-related applications. With crypto being one of its better-known applications, many transaction specialists, miners, insurers, wealth managers, and regulators will be needed. Crypto exchanges will come under the purview of the regulatory framework.

3D PRINTING TURNS GAME-CHANGER

Additive manufacturing, also popularly called 3D printing, will mature in its precision, capabilities, and market potential. Lab-grown, 3D-printed food will be part of our regular diet.
Transplantable organs will be generated using stem cell research and 3D printing. Amputees and the disabled will adopt 3D-printed limbs and prosthetics. Its applications for high-precision reconstructive surgery are already commonplace. Pills are being 3D printed as we speak. So again, we are looking at 3D-printer operators, material scientists, pharmacists, construction experts, and more.

THE COLONIZATION OF OUTER SPACE

Jeff Bezos's Blue Origin and Elon Musk's SpaceX signal a new horizon. As space tech enters a new trajectory, a new breed of commercial space pilots, mission planners, launch managers, cargo experts, ground crew, experience designers, and others will be required. Since we have already ravaged the limited resources of our planet, mankind will need to venture into asteroid mining for rare and precious metals. This will need scouts and surveyors, meteorologists, remote bot operators, remotely managed factories, and more.

THE HYPER-CONNECTED WORLD

By 2020, we already have somewhere between 50 and 75 billion connected devices. By 2040, this will likely swell to more than 100 trillion sensors that will spew out a dizzying volume of real-time data ready for analytics and AI. A complete IoT system as we know it is aware, autonomous, and actionable, just like a self-driving car. Imagine the number of data modelers, sensor designers and installers, and signal architects and engineers that will be needed. Home automation will be pervasive, and smart medicines, implants, and wearables will be the norm.

DRONES USHER IN DISRUPTION

Unmanned aerial and underwater drones are already becoming ubiquitous for applications in aerial surveillance, delivery, and security. Countries are awakening to their potential as well as to the possibilities of misuse. Command centers, just like those for space travel, will manage them as countries rush to put a regulatory framework around them. An army of designers, programmers, security experts, and traffic-flow optimizers will harness their true potential.

SHIELDING YOUR DATA

With data come cyber threats, data breaches, cyber warfare, cyber espionage, and a host of other issues. The more data-dependent and connected the world is, the bigger the problem of cybersecurity will be. The severity of the problem will increase manifold beyond current issues like phishing, spyware, malware, viruses and worms, ransomware, DoS/DDoS attacks, and hacktivism, and cybersecurity will indeed be big business. The problem is that threats are increasing 10X faster than investments in this space, and the interesting thing is that the response is a lot more about audits, governance, policies, and compliance than technology alone.

FOOD-TECH COMES OF AGE

As the world population grows to 9.7 billion people in 2050, cultured food and lab-grown meat will hit our tables to ensure food security. Entire food chains and value delivery networks will see unprecedented change. Agriculture will be transformed by robotics, IoT, and drones, and the food-tech sector will take off in a big way.

QUANTUM COMPUTING SOLVES INTRACTABLE PROBLEMS

Finally, while the list is very long, let’s touch upon the advent of qubits, or quantum computing. With its ability to break the best encryption on the planet, the traditional asymmetric encryption, public key infrastructure, digital envelopes, and digital certificates in use today will be rendered useless. Bring in the quantum programmers, analysts, privacy and trust managers, health monitors, and more.
As we brace for the world that looms ahead of us, the biggest enabler, one that will itself be transformed, will be Education 4.0. Education will cease to be a single phase in your life. Life-long interventions will be needed to adapt, impart, and shape the skills of individuals so they are ready for the future of work. More power to the people!

Read More

COVID-19: A crisis that necessitates Open Data

Article | November 20, 2020

The coronavirus outbreak that began in China has grown into a pandemic and is affecting global health as well as social and economic dynamics. An ever-increasing velocity and scale of analysis, in terms of both processing and access, is required to succeed in the face of unprecedented shifts in market, health, and social paradigms. The COVID-19 pandemic is accompanied by an infodemic. With the global novel coronavirus pandemic filling headlines, TV news space, and social media, it can seem as if we are drowning in information and data about the virus. With so much data being pushed at us and shared, it can be hard for the general public to know what is correct, what is useful, and (unfortunately) what is dangerous.

In general, levels of trust in scientists are quite high, albeit with differences across countries and regions. A 2019 survey conducted across 140 countries showed that, globally, 72% of respondents trusted scientists at “high” or “medium” levels. However, the proportion expressing “high” or “medium” levels of trust in science ranged from about 90% in Northern and Western Europe to 68% in South America and 48% in Central Africa (Rabesandratana, 2020). In times of crisis, like the ongoing spread of COVID-19, both scientific and non-scientific data should serve as trusted sources for information, analysis, and decision making.

While global sharing and collaboration of research data has reached unprecedented levels, challenges remain. Trust in at least some of the data is relatively low, and outstanding issues include the lack of specific standards, coordination, and interoperability, as well as data quality and interpretation. To strengthen the contribution of open science to the COVID-19 response, policy makers need to ensure adequate data governance models, interoperable standards, sustainable data sharing agreements involving the public sector, private sector, and civil society, incentives for researchers, sustainable infrastructures, human and institutional capabilities, and mechanisms for access to data across borders.

COVID-19 data is considered critical for vaccine discovery, for planning and forecasting healthcare capacity, and for setting up emergency systems, and it is expected to contribute to policy objectives such as higher transparency and accountability, more informed policy debates, better public services, greater citizen engagement, and new business development. This is precisely why “open data” access to COVID-19 information is critical for humanity to succeed. In global emergencies like the coronavirus (COVID-19) pandemic, open science policies can remove obstacles to the free flow of research data and ideas, and thus accelerate the pace of research critical to combating the disease. UNESCO is playing a leading role in this direction by opening access to data. Thankfully, scientists around the world working on COVID-19 are able to work together, share data and findings, and hopefully make a difference to the containment and treatment of, and eventually vaccines for, COVID-19.

Science and technology are essential to humanity’s collective response to the COVID-19 pandemic. Yet the extent to which policymaking is shaped by scientific evidence and by technological possibilities varies across governments and societies, and can often be limited. At the same time, collaborations across science and technology communities have grown in response to the current crisis, holding promise for enhanced cooperation in the future as well.
A prominent example of this is the Coalition for Epidemic Preparedness Innovations (CEPI), launched in 2017 as a partnership between public, private, philanthropic, and civil society organizations to accelerate the development of epidemic vaccines. Its ongoing work has cut the expected development time for a COVID-19 vaccine to 12–18 months, and its grants are providing quick funding for some promising early candidates. It is estimated that an investment of USD 2 billion will be needed, with resources being made available from a variety of sources (Yamey et al., 2020).

The Open COVID Pledge was launched in April 2020 by an international coalition of scientists, lawyers, and technology companies, and calls on authors to make all intellectual property (IP) under their control available, free of charge and without encumbrances, to help end the COVID-19 pandemic and reduce the impact of the disease. Some notable signatories include Intel, Facebook, Amazon, IBM, Sandia National Laboratories, Hewlett Packard, Microsoft, Uber, Open Knowledge Foundation, the Massachusetts Institute of Technology, and AT&T. The signatories will offer a specific non-exclusive, royalty-free Open COVID license to use IP for the purpose of diagnosing, preventing, and treating COVID-19.

Also illustrating the power of open science, online platforms are increasingly facilitating the collaborative work of COVID-19 researchers around the world. A few examples include:

1. Research on treatments and vaccines is supported by Elixir, REACTing, CEPI, and others.
2. WHO-funded research and data organization.
3. The London School of Hygiene and Tropical Medicine released a dataset about the environments that have led to significant clusters of COVID-19 cases, containing more than 250 records with date, location, whether the event was indoors or outdoors, and how many individuals became infected. (7/24/20)
4. The European Union Science Hub published a report on the concept of data-driven Mobility Functional Areas (MFAs). They demonstrate how mobile data calculated at a European regional scale can be useful for informing policies related to COVID-19 and future outbreaks. (7/16/20)

While clinical, epidemiological, and laboratory data about COVID-19 is widely available, including genomic sequencing of the pathogen, a number of challenges remain:

1. Not all data is sufficiently findable, accessible, interoperable, and reusable (FAIR).
2. Sources of data tend to be dispersed; even though many pooling initiatives are under way, curation needs to be performed “on the fly”.
3. Many issues arise around the interpretation of data. This can be illustrated by the widely followed epidemiological statistics. Typically, the statistics concern “confirmed cases”, “deaths”, and “recoveries”. Each of these items seems to be treated differently in different countries, and they are sometimes subject to methodological changes within the same country (a small illustration of this appears at the end of this article).
4. Specific standards for COVID-19 data therefore need to be established, and this is one of the priorities of the UK COVID-19 Strategy. A working group within the Research Data Alliance has been set up to propose such standards at an international level.

Given the achievements and challenges of open science in the current crisis, lessons from prior experience, including the global SARS and MERS outbreaks, can be drawn to assist the design of open science initiatives to address the COVID-19 crisis.
The following actions can help to further strengthen open science in support of responses to the COVID-19 crisis:

1. Providing regulatory frameworks that would enable interoperability within the networks of large electronic health records providers, patient mediated exchanges, and peer-to-peer direct exchanges. Data standards need to ensure that data is findable, accessible, interoperable and reusable, including general data standards, as well as specific standards for the pandemic.
2. Working together by public actors, private actors, and civil society to develop and/or clarify a governance framework for the trusted reuse of privately-held research data toward the public interest. This framework should include governance principles, open data policies, trusted data reuse agreements, transparency requirements and safeguards, and accountability mechanisms, including ethical councils, that clearly define duties of care for data accessed in emergency contexts.
3. Securing adequate infrastructure (including data and software repositories, computational infrastructure, and digital collaboration platforms) to allow for recurrent occurrences of emergency situations. This includes a global network of certified, trustworthy, and interlinked repositories with compatible standards to guarantee the long-term preservation of FAIR COVID-19 data, as well as preparedness for any future emergencies.
4. Ensuring that adequate human capital and institutional capabilities are in place to manage, create, curate and reuse research data, both in individual institutions and in institutions that act as data aggregators, whose role is real-time curation of data from different sources.

In increasingly knowledge-based societies and economies, data are a key resource. Enhanced access to publicly funded data enables research and innovation, and has far-reaching effects on resource efficiency, productivity and competitiveness, creating benefits for society at large. Yet these benefits must also be balanced against associated risks to privacy, intellectual property, national security and the public interest. Entities such as UNESCO are helping the open science movement to progress towards establishing norms and standards that will facilitate greater, and more timely, access to scientific research across the world. Independent scientific assessments that inform the work of many United Nations bodies are indicating areas needing urgent action, and international cooperation can help with national capacities to implement them. At the same time, actively engaging with different stakeholders in countries around the dissemination of the findings of such assessments can help in building public trust in science.
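To make the interpretation challenge mentioned above concrete, here is a minimal sketch in Python of how pooled case counts can carry each reporting country's case definition alongside the numbers; the country names, figures, and column names are hypothetical placeholders, not any official dataset or schema.

```python
import pandas as pd

# Hypothetical daily reports from two countries; the figures are illustrative only.
country_a = pd.DataFrame({"date": ["2020-07-01"], "confirmed": [1200]})
country_b = pd.DataFrame({"date": ["2020-07-01"], "confirmed": [950]})

# Record each country's reporting methodology alongside the numbers.
country_a["country"] = "Country A"
country_a["case_definition"] = "laboratory-confirmed only"
country_b["country"] = "Country B"
country_b["case_definition"] = "laboratory-confirmed plus clinically diagnosed"

# Pool the data without losing the provenance that tells downstream users
# whether the two "confirmed" series are actually comparable.
pooled = pd.concat([country_a, country_b], ignore_index=True)
print(pooled)
```

Carrying the definition as an explicit column is one simple way to keep the methodological differences visible once the per-country files are merged into a shared dataset.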

Read More

Self-supervised learning: The plan to make deep learning data-efficient

Article | November 20, 2020

Despite the huge contributions of deep learning to the field of artificial intelligence, there’s something very wrong with it: it requires huge amounts of data. This is one thing that both the pioneers and critics of deep learning agree on. In fact, deep learning didn’t emerge as the leading AI technique until a few years ago because of the limited availability of useful data and the shortage of computing power to process that data. Reducing the data-dependency of deep learning is currently among the top priorities of AI researchers.
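As a rough illustration of the idea, here is a minimal sketch of one common self-supervised pretext task, rotation prediction, written in Python and assuming PyTorch is available; the model, random data, and training loop are illustrative placeholders rather than any method described in the article.

```python
# Minimal sketch of a self-supervised pretext task (rotation prediction).
# The encoder learns from unlabeled images by predicting how each image was
# rotated; the labels come for free from the data itself.
import torch
import torch.nn as nn
import torch.nn.functional as F

class SmallEncoder(nn.Module):
    """Tiny convolutional encoder; a stand-in for a real backbone."""
    def __init__(self, feature_dim: int = 64):
        super().__init__()
        self.conv = nn.Sequential(
            nn.Conv2d(1, 16, 3, stride=2, padding=1), nn.ReLU(),
            nn.Conv2d(16, 32, 3, stride=2, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1),
        )
        self.fc = nn.Linear(32, feature_dim)

    def forward(self, x):
        return self.fc(self.conv(x).flatten(1))

def rotate_batch(images: torch.Tensor):
    """Create 4 rotated copies (0/90/180/270 degrees) and their rotation labels."""
    rotations = [torch.rot90(images, k, dims=(2, 3)) for k in range(4)]
    labels = torch.arange(4).repeat_interleave(images.size(0))
    return torch.cat(rotations, dim=0), labels

# Stand-in for an unlabeled dataset: 128 random grayscale 28x28 "images".
unlabeled = torch.rand(128, 1, 28, 28)
encoder = SmallEncoder()
head = nn.Linear(64, 4)  # predicts one of the 4 rotations
optimizer = torch.optim.Adam(
    list(encoder.parameters()) + list(head.parameters()), lr=1e-3
)

for step in range(10):  # a few illustrative training steps
    x, y = rotate_batch(unlabeled)
    logits = head(encoder(x))
    loss = F.cross_entropy(logits, y)
    optimizer.zero_grad()
    loss.backward()
    optimizer.step()

# After this pretraining, `encoder` could be fine-tuned on a much smaller labeled set.
```

The point of a pretext task like this is that no manual labels are needed during pretraining, which is exactly the data-efficiency argument the article makes.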

Read More

How to Overcome Challenges in Adopting Data Analytics

Article | November 20, 2020

Achieving organizational success and making data-driven decisions in 2020 requires embracing tech tools like data analytics, but simply collecting, storing, and analysing data isn’t enough. Real data-driven, measurable growth and development come with the establishment of a data-driven company culture. In this type of culture, the company actively uses data resources as a primary asset to make smart decisions and ensure future growth. Despite the rapid growth of analytic solutions, a recent Gartner survey revealed that almost 75% of organizations thought their analytics maturity had not reached a level that optimized business outcomes. Just like any endeavor, your organization must have a planned strategy to achieve its analytical goals. Let’s explore ways of overcoming common blockers, and elements used in successful analytics adoption strategies.

Table of Contents:
- AMM: Analytic Maturity Model
- What are the blockers to achieving strategy-driven analytics?
- What are the adoption strategies to achieve analytics success?
- Conclusion

AMM: Analytic Maturity Model

The Analytic Maturity Model (AMM) evaluates the analytic maturity of an organization. The model identifies the five stages an organization travels through to reach optimization. Organizations must implement the right tools, engage their team in proper training, and provide the management support necessary to generate predictable outcomes with their analytics. Based on the maturity of these processes, the AMM divides organizations into five maturity levels:

- Organizations that can build reports.
- Organizations that can build and deploy models.
- Organizations that have repeatable processes for building and deploying analytics.
- Organizations that have consistent enterprise-wide processes for analytics.
- Enterprises whose analytics is strategy driven.

READ MORE: EFFECTIVE STRATEGIES TO DEMOCRATIZE DATA SCIENCE IN YOUR ORGANIZATION

What are the blockers to achieving strategy-driven analytics?

- Missing an analytics strategy
- Analytics is not for everyone
- Data quality presents unique challenges
- Siloed data
- Changing the culture

What are the adoption strategies to achieve analytics success?

• Have you got a plan to achieve analytics success?

The strategy begins with business intelligence and moves toward advanced analytics. The approach differs based on the AMM level. The plan may address the strategy for a single year, or it may span 3 or more years. It ideally has milestones for what the team will do. Forming an analytics strategy can be expensive and time consuming at the outset. While organizations are encouraged to seek projects that can generate quick wins, the truth is that it may be months before any actionable results are available. During this period, the management team is frantically diverting resources from other high-profile projects. If funds are tight, this situation alone may cause friction. It may not be apparent to everyone how the changes are expected to help. Here are the elements of a successful analytics strategy:

• Keep the focus tied to tangible business outcomes

The strategy must support business goals first. With as few words as possible, your plan should outline what you intend to achieve, how to complete it, and a target date for completion of the plan. Companies may fail at this step because they mistake implementing a tool for having a strategy. To keep it relevant, tie it to customer-focused goals. The strategy must dig below the surface with the questions that it asks.
Instead of asking surface questions such as “How can we save money?”, ask “How can we improve the quality of the outcomes for our customers?” or “What would improve the productivity of each worker?” These questions are more specific and will get the results the business wants. You may need to use actual business cases from your organization to think through the questions.

• Select modern, multi-purpose tools

The organization should be looking for an enterprise tool that supports integrating data from various databases, spreadsheets, or even external web-based sources. Typically, organizations may have their data stored across multiple systems such as Salesforce, Oracle, and even Microsoft Access. The organization can move ahead more quickly when access to the relevant data is in a single repository (a minimal sketch of this kind of consolidation follows at the end of this section). With the data combined, analysts have a single location to find reports and dashboards. The interface needs to be robust enough to show the data from multiple points of view. It should also allow future enhancements, such as when the organization makes the jump into data science. Incorta’s data analytics platform simplifies and processes data to provide meaningful information at speed that helps make informed decisions.

“Incorta is special in that it allows business users to ask the same complex and meaningful questions of their data that typically require many IT people and data scientists to get the answers they need to improve their line of business. At the digital pace of business today, that can mean millions of dollars for business leaders in finance, supply chain or even marketing. Speed is a key differentiator for Incorta in that rarely has anyone been able to query billions of rows of data in seconds for a line of business owner.”
- Tara Ryan, CMO, Incorta

Technology implementations take time. That should not stop you from starting in small areas of the company to look for quick wins. Typically, the customer-facing processes have areas where it is easier to collect data and show opportunities for improvement.

• Ensure staff readiness

If your current organization is not data literate, then you will need resources who understand how to analyze and use data for process improvement. It is possible to make data available and have workers still not realize what they can do with it. The senior leadership may also need training on how to use data and what data analytics makes possible.

• Start Small to Control Costs and Show Potential

If the leadership team questions the expense, consider doing a proof of concept that focuses on the tools and data being integrated quickly and efficiently to show measurable success. The business may favor specific projects or initiatives to move the company forward over long-term enterprise transformations (Bean & Davenport, 2019). Keeping the project goals precise and directed helps control costs and improve the business. As said earlier, the strategy needs to answer deeper business questions. Consider other ways to introduce analytics into the business. Use initiatives that target smaller areas of the company to build competencies. Provide an analytics sandbox with access to tools and training to encourage other non-analytics workers (or citizen data scientists) to play with the data. One company formed a SWAT team including individuals from across the organization. The smaller team, with varied domain experience, was better able to drive results.
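As referenced above, here is a minimal sketch of consolidating exports from several source systems into one analysis-ready table, written in Python with pandas; the file names, column names, and the "customer_id" join key are hypothetical placeholders rather than any specific vendor's schema or API.

```python
import pandas as pd

# Hypothetical flat-file exports from different source systems (e.g. CRM, ERP, a survey tool).
crm = pd.read_csv("crm_accounts.csv")      # customer_id, region, segment
erp = pd.read_csv("erp_orders.csv")        # customer_id, order_date, amount
survey = pd.read_csv("nps_survey.csv")     # customer_id, nps_score

# Aggregate the transactional data before joining so the grain matches (one row per customer).
order_summary = (
    erp.groupby("customer_id", as_index=False)
       .agg(total_revenue=("amount", "sum"), orders=("order_date", "count"))
)

# A single consolidated view that analysts can query from one place.
combined = (
    crm.merge(order_summary, on="customer_id", how="left")
       .merge(survey, on="customer_id", how="left")
)
print(combined.head())
```

Aggregating the transactional data before joining keeps the combined table at one row per customer, which is usually what report and dashboard builders expect from a single repository.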
There are also other approaches to use; the key is to show immediate and desirable results that align with organizational goals.

• Treating poor data quality

What can you do about poor data quality at your company? Several solutions can help to improve productivity and reduce the financial impact of poor data quality in your organization:

• Create a team to set the proper objectives

Create a team who owns the data quality process. This is important to prove to yourself, and to anyone with whom you are conversing about data, that you are serious about data quality. The size of the team is not as important as having members from the parts of the organization that have the right impact on and knowledge of the process. When the team is set, make sure that they create a set of goals and objectives for data quality. To gauge performance, you need a set of metrics to measure it. After you create the proper team to govern your data quality, ensure that the team focuses on the data you need first. Everyone knows the rules of "good data in, good data out" and "bad data in, bad data out." To put this to work, make sure that your team knows the relevant business questions that are in progress across various data projects, so that they focus on the data that supports those business questions.

• Focus on the data you need now as the highest priority

Once you do that, you can look at the potential data quality issues associated with each of the relevant downstream business questions and put the proper processes and data quality routines in place to ensure that poor data quality has a low probability of continuing to affect that data. As you decide which data to focus on, remember that the key for innovators across industries is that the size of the data isn’t the most critical factor; having the right data is (Wessel, 2016).

• Automate the process of data quality when data volumes grow too large

When data volumes become unwieldy and the quality becomes difficult to manage, automate the process (a minimal sketch of such an automated check follows at the end of this section). Many data quality tools in the market do a good job of removing the manual effort from the process. Open source options include Talend and DataCleaner. Commercial products include offerings from DataFlux, Informatica, Alteryx and Software AG. As you search for the right tool for you and your team, be aware that although the tools help with organization and automation, the right processes and knowledge of your company's data are paramount to success.

• Make the process of data quality repeatable

Remember that the process is not a one-time activity; it needs regular care and feeding. While good data quality can save you a lot of time, energy, and money downstream, it does take time, investment, and practice to do well. As you improve the quality of your data and the processes around that quality, you will want to look for other opportunities to avoid data quality mishaps.

• Beware of data that lives in separate databases

When data is stored in different databases, there can be issues with different terms being used for the same subject. The good news is that if you have followed the previous solutions, you should have more time to invest in looking for the best cases. As always, look for the opportunities with the biggest bang for the buck first.
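As referenced above, here is a minimal sketch of a repeatable, automated data-quality check written in Python with pandas; the sample data, column names, and the "no duplicates" rule are hypothetical, and a dedicated tool such as those named in this section would replace this in a mature pipeline.

```python
import pandas as pd

def run_quality_checks(df: pd.DataFrame) -> dict:
    """Return simple quality metrics that can be tracked run over run."""
    return {
        "row_count": len(df),
        "duplicate_rows": int(df.duplicated().sum()),
        "null_rate_per_column": df.isna().mean().round(3).to_dict(),
        "negative_amounts": int((df["amount"] < 0).sum()),
    }

# Stand-in for a real extract (e.g. the consolidated table from the previous sketch).
orders = pd.DataFrame({
    "order_id": [1, 2, 3, 4],
    "customer_id": ["A", "B", "B", None],
    "amount": [120.0, 75.5, 75.5, -10.0],
})

report = run_quality_checks(orders)
print(report)

# In an automated pipeline, fail fast when quality degrades beyond agreed limits.
assert report["duplicate_rows"] == 0, "Duplicate records found; investigate upstream."
```

Running the same checks on every load, and comparing the metrics over time, is what makes the process repeatable rather than a one-time cleanup.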
You don't want to be answering questions from the steering committee about why you are looking for differences between "HR" and "Hr" if you haven't solved bigger issues like knowing the difference between "Human Resources" and "Resources," for example. • De-Siloing Data The solution to removing data silos typically isn’t some neatly packaged, off-the-shelf product. Attempts to quickly create a data lake by simply pouring all the siloed data together can result in an unusable mess, turning more into a data swamp. This is a process that must be done carefully to avoid confusion, liability, and error. Try to identify high-value opportunities and find the various data stores required to execute those projects. Working with various business groups to find business problems that are well-suited to data science solutions and then gathering the necessary data from the various data stores can lead to high-visibility successes. As value is proved from joining disparate data sources together to create new insights, it will be easier to get buy-in from upper levels to invest time and money into consolidating key data stores. In the first efforts, getting data from different areas may be akin to pulling teeth, but as with most things in life, the more you do it, the easier it gets. Once the wheels get moving on a few of these integration projects, make wide-scale integration the new focus. Many organizations at this stage appoint a Chief Analytics Officer (CAO) who helps increase collaboration between the IT and business units ensuring their priorities are aligned. As you work to integrate the data, make sure that you don’t inadvertently create a new “analytics silo.” The final aim here is an integrated platform for your enterprise data. • Education is essential When nearly 45% of workers generally prefer status quo over innovation, how do you encourage an organization to move forward? If the workers are not engaged or see the program as merely just the latest management trend, it may be tricky to convince them. Larger organizations may have a culture that is slow to change due to their size or outside forces. There’s also a culture shift required - moving from experience and knee-jerk reactions to immersion and exploration of rich insights and situational awareness. - Walter Storm, the Chief Data Scientist, Lockheed Martin Companies spend a year talking about an approved analytics tool before moving forward. The employees had time to consider the change and to understand the new skill sets needed. Once the entire team embraced the change, the organization moved forward swiftly to convert existing data and reports into the new tool. In the end, the corporation is more successful, and the employees are still in alignment with the corporate strategy. If using data to support decisions is a foreign concept to the organization, it’s a smart idea to ensure the managers and workers have similar training. This training may involve everything from basic data literacy to selecting the right data for management presentations. However, it cannot stop at the training; the leaders must then ask for the data to move forward with requests that will support conclusions that will be used to make critical decisions across the business. These methods make it easier to sell the idea and keep the organization’s analytic strategy moving forward. Once senior leadership uses data to make decisions, everyone else will follow their lead. It is that simple. 
Conclusion

The analytics maturity model serves as a useful framework for understanding where your organization currently stands regarding strategy, progress, and skill sets. Advancing along the various levels of the model will become increasingly imperative as early adopters of advanced analytics gain a competitive edge in their respective industries. Delay or failure to design and incorporate a clearly defined analytics strategy into an organization’s existing plan will likely result in a significant missed opportunity.

READ MORE: BIG DATA ANALYTICS STRATEGIES ARE MATURING QUICKLY IN HEALTHCARE

Read More

Events