Over the past decade, business intelligence has been revolutionized. Data exploded and became big. We all gained access to the cloud. Spreadsheets finally took a backseat to actionable, insightful data visualizations and interactive business dashboards. The rise of self-service analytics democratized the data product chain. Suddenly, advanced analytics was no longer just for the analysts.
2016 was a particularly significant year for the business intelligence industry. The trends we presented last year will continue to play out through 2017, but the BI landscape is evolving, and there are emerging trends in business intelligence to keep an eye on. In 2017, business intelligence strategy will become increasingly customized to each business. Businesses of all sizes are no longer asking whether they need increased access to business intelligence analytics, but which BI solution best fits their specific needs. They are no longer wondering whether data visualizations improve analyses, but how best to tell each data story. 2017 will be the year of collaboration and embedded BI tools: clean and secure data combined with simple and powerful presentation. It will also be a year of digitization and artificial intelligence. datapine is excited to see what 2017 will bring. Read on to see our top 11 business intelligence trends for 2017!
1) Predictive and Prescriptive Analytics Tools
The business analytics of tomorrow is focused on the future and tries to answer the questions: What will happen? How can we make it happen? Accordingly, predictive and prescriptive analytics are by far the most discussed business intelligence trends among BI professionals.
Predictive analytics is the practice of extracting information from existing data sets in order to forecast future probabilities. It is an extension of data mining, which refers only to past data. Because predictive analytics estimates future data, it always carries a possibility of error by definition. Predictive analytics indicates what might happen in the future with an acceptable level of reliability, including a few alternative scenarios and a risk assessment. Applied to business, predictive analytics is used to analyze current data and historical facts in order to better understand customers, products and partners, and to identify potential risks and opportunities for a company.
Industries harness predictive analytics in different ways. Airlines use it to decide how many tickets to sell at each price for a flight. Hotels try to predict the number of guests they can expect on any given night in order to adjust prices to maximize occupancy and increase revenue. Marketers predict customer responses or purchases and set up cross-sell opportunities, whereas bankers use it to generate a credit score: the number generated by a predictive model that incorporates all of the data relevant to assessing a person's creditworthiness.
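As a toy illustration of the banking example, a predictive scoring model can be sketched as a logistic function over a few applicant features. The weights and the 300-850 range below are purely illustrative and not those of any real credit bureau:

```python
import math

def credit_score(income, debt_ratio, late_payments):
    """Toy scoring model: the weights are invented for illustration only."""
    # Linear combination of applicant features (higher income helps,
    # higher debt ratio and more late payments hurt)
    z = 2.0 + 0.00002 * income - 3.0 * debt_ratio - 0.8 * late_payments
    # Logistic function maps the raw value to a repayment probability
    p_repay = 1.0 / (1.0 + math.exp(-z))
    # Rescale the probability onto a familiar 300-850 score range
    return round(300 + 550 * p_repay)
```

A stronger applicant profile then yields a higher score, mirroring how a real model ranks creditworthiness.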
Among the various predictive analytics methods, two have recently attracted the most publicity: Artificial Neural Networks (ANN) and the Autoregressive Integrated Moving Average (ARIMA) model. In neural networks, data is processed in a way similar to biological neurons. Technology mimics biology: information flows into the mathematical neuron, is processed by it, and the results flow out. This single process becomes a mathematical formula that is repeated many times. As in the human brain, the power of neural networks lies in their capability to connect sets of neurons together in layers, creating a multidimensional network. The input to the second layer comes from the output of the first layer, and the pattern repeats with every subsequent layer. This structure makes it possible to capture associations or discover regularities within sets of patterns of considerable volume, number of variables or diversity.
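A minimal sketch of this layered flow, with a sigmoid activation and hand-picked weights chosen purely for illustration:

```python
import math

def neuron(inputs, weights, bias):
    # Weighted sum of inputs, squashed by a sigmoid activation
    z = sum(x * w for x, w in zip(inputs, weights)) + bias
    return 1.0 / (1.0 + math.exp(-z))

def layer(inputs, weight_matrix, biases):
    # One layer: every neuron sees the full output of the previous layer
    return [neuron(inputs, w, b) for w, b in zip(weight_matrix, biases)]

# Two-layer network: the second layer's input is the first layer's output
x = [0.5, -1.2]
hidden = layer(x, [[0.8, -0.5], [0.3, 0.9]], [0.1, -0.2])
output = layer(hidden, [[1.2, -0.7]], [0.05])
```

Real networks learn their weights from data rather than having them hand-picked, but the forward flow of information is exactly this repeated formula.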
ARIMA is a model used for time series analysis that applies data from the past to model the existing data and make predictions about the future. The analysis includes inspecting the autocorrelations (comparing how current data values depend on past values), and in particular choosing how many steps into the past should be taken into consideration when making predictions. Each part of ARIMA handles a different side of model creation: the autoregressive (AR) part tries to estimate the current value from previous ones, while any difference between predicted and real values is handled by the moving average (MA) part. We can check whether these residuals are normal, random and stationary, i.e. with constant variation. Deviations at these points can bring insight into the behavior of the data series, predicting new anomalies or helping to discover underlying patterns not visible to the naked eye. ARIMA techniques are complex, and drawing conclusions from the results may not be as straightforward as with more basic statistical approaches, but once the basic principles are grasped, ARIMA provides a very powerful tool for predictive analysis.
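Full ARIMA modeling adds differencing and moving-average terms, but the autoregressive idea alone can be sketched in a few lines: estimate the coefficient of an AR(1) model by least squares, then roll the forecast forward step by step. This is a simplified illustration, not a complete ARIMA implementation:

```python
def fit_ar1(series):
    """Estimate phi in x[t] ~ phi * x[t-1] by ordinary least squares."""
    pairs = list(zip(series[:-1], series[1:]))
    num = sum(prev * cur for prev, cur in pairs)
    den = sum(prev * prev for prev, _ in pairs)
    return num / den

def forecast(series, phi, steps):
    # Roll the model forward: each prediction feeds the next one
    preds, last = [], series[-1]
    for _ in range(steps):
        last = phi * last
        preds.append(last)
    return preds
```

On a noise-free series generated with a known coefficient, the estimator recovers it exactly; with real data, the residuals would feed the MA part described above.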
Prescriptive analytics goes a step further into the future. It examines data or content to determine what decisions should be made and what steps should be taken to achieve an intended goal. It is characterized by techniques such as graph analysis, simulation, complex event processing, neural networks, recommendation engines, heuristics, and machine learning. Prescriptive analytics tries to see what the effect of future decisions will be, so that the decisions can be adjusted before they are actually made. This greatly improves decision-making, as future outcomes are taken into consideration in the prediction. Prescriptive analytics can help you optimize scheduling, production, inventory and supply chain design to deliver what your customers want in the most efficient way.
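The idea of adjusting decisions before they are made can be sketched as a tiny simulation: evaluate each candidate decision against a set of predicted demand scenarios and pick the one with the best expected outcome. The unit costs and scenarios below are invented for illustration:

```python
def expected_profit(order_qty, demand_scenarios, unit_cost=4.0, unit_price=10.0):
    # Average profit across the predicted demand scenarios
    profits = [unit_price * min(order_qty, d) - unit_cost * order_qty
               for d in demand_scenarios]
    return sum(profits) / len(profits)

def best_order(demand_scenarios, candidates):
    # Prescriptive step: evaluate each possible decision before committing
    return max(candidates, key=lambda q: expected_profit(q, demand_scenarios))
```

With demand predicted at 80, 100 or 120 units, the simulation recommends ordering 100: enough to capture most upside without paying for stock that rarely sells.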
2) Artificial Intelligence (AI)
This is the number-one trend chosen by Gartner in its 2017 Strategic Technology Trends report. Artificial intelligence (AI) is the science that aims to make machines execute what is usually done by complex human intelligence.
Often depicted as the great foe of the human race in movies (Skynet in Terminator, the Machines of The Matrix or the Master Control Program of Tron), AI is not yet on the verge of destroying us, despite the fears of some reputed scientists and tech entrepreneurs.
While we work on programs to prevent any such inconvenience, AI and machine learning are already revolutionizing the way we interact with our analytics and data management.
We are evolving from static, passive reports of things that have already happened to proactive analytics with real-time dashboards that help businesses see what is happening at every second and send alerts when something deviates from the norm. The datapine solution includes an AI algorithm based on advanced neural networks, providing high accuracy in anomaly detection as it learns from historical trends and patterns. That way, any unexpected event triggers an alert to notify you immediately.
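datapine's own detection is built on neural networks; as a much simpler illustration of the underlying idea, historical values can define a "normal" band, and anything far outside it raises an alert. This z-score sketch is not datapine's algorithm, just the basic principle:

```python
import statistics

def detect_anomalies(history, new_values, threshold=3.0):
    """Flag values more than `threshold` standard deviations from the historical mean."""
    mean = statistics.mean(history)
    stdev = statistics.stdev(history)
    alerts = []
    for v in new_values:
        z = abs(v - mean) / stdev
        if z > threshold:
            alerts.append((v, round(z, 1)))  # value plus how extreme it is
    return alerts
```

A KPI that has hovered around 100 will sail through at 101, while a sudden spike to 150 is flagged immediately, which is exactly the kind of event a real-time dashboard should alert on.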
The demand for real-time data analysis tools is increasing, and the arrival of the Internet of Things (IoT) is bringing an enormous amount of data, which will push statistical analysis and data management to the top of the priorities list. However, businesses today want to go further, and predictive analytics is another trend to monitor closely, as we have seen above. Gartner predicts that by 2018, more than half of all large organizations worldwide will use advanced analytics and the algorithms built on them to be more competitive. AI will be at the heart of those algorithms that understand the data and can predict what is coming, and deep learning will probably let machines operate autonomously and make decisions in place of a real person. Such a change would mightily transform decision-making, and managers will need to understand how algorithms reach their conclusions, and adjust where necessary. Businesses will also have to decide whether (semi-)automated decision-making should be in the hands of algorithms or not.
3) Business Intelligence Center of Excellence
Moving towards a more secure, simpler, and more effective business intelligence strategy won't all fall on IT. The complexity of the data management bundle in big companies is staggering, and the need to reinforce and clarify it is becoming a priority. As one of the major business intelligence trends of 2017, we will see an increasing number of organizations establishing a BI and Analytics Center of Excellence (CoE) to foster the adoption of self-service analytics. These centers will play a critical role in implementing a data-driven culture and extracting the maximum benefit from a BI solution.
Through tools like online forums and one-on-one training, CoEs will empower even non-experts to incorporate data into their decision-making. They are a good way to get people, processes and technology aligned in a structured manner, and hence play a great role in change management: interactions between different geographies, cultures and units are facilitated. According to Liquidhub, three different models can be implemented depending on a company's reporting culture:
- CoE can be part of an IT unit reporting to the CIO.
- CoE can be part of a functional shared services model.
- CoE can be part of a corporate shared services model, leveraged by all the divisions.
Over time, these centers will increasingly enable data to inform workflows across the entire organization, streamlining both strategy formulation and resource organization.
4) Collaborative Business Intelligence
Today, managers and workers need to interact differently as they face an ever more competitive environment. More and more, we see a new kind of business intelligence rising: collaborative BI. It is the combination of collaboration tools, including social media and other 2.0 technologies, with business intelligence software. It has developed in a context of enhanced collaboration, addressing the new challenges of fast-paced business, where more analyses are performed and more reports are produced. When talking about collaborative BI, the term "self-service BI" quickly pops up, in the sense that self-service BI tools do not require an IT team to access, interpret and understand the data.
These BI tools make sharing easier, for instance by generating automated reports that can be scheduled for specific times and recipients; they let you set up intelligent alerts and share public or embedded dashboards with a flexible level of interactivity. All these possibilities are accessible on any device, which enhances decision-making and problem-solving.
Collaborative information, information enhancement and collaborative decision-making are the key focus of new BI tools. But collaborative BI is not only about exchanging or updating documents. It has to track the progress of meetings, calls, email exchanges and idea collection. As Barry Devlin, founder of 9sight Consulting, says: "It is much more than sharing the results from a particular BI tool; it's about sharing the set of information that is being gathered within a team."
5) Cloud Analytics
The ubiquity of the cloud is nothing new for anybody who stays up to date with business intelligence trends. In 2017, the cloud will continue its reign as more and more companies move towards it, driven by the proliferation of cloud-based tools available on the market. Moreover, entrepreneurs will learn to embrace the power of cloud analytics, where most of the elements (data sources, data models, processing applications, computing power, analytic models and data storage) are located in the cloud.
6) Embedded Business Intelligence
This business intelligence trend refers to the integration of a BI tool like datapine, or selected features of it, into another business application to fill gaps in that application's analytics or reporting functionality. With embedded BI, you can turn raw data into interactive dashboards, enhance the user experience with real-time analytics and innovative data visualizations, and enable people to make data-driven decisions faster and on their own.
These capabilities may be located outside of the application, but they have to be easily accessible from inside the app so that the user doesn't have to switch between systems or get accustomed to another user interface and structure. This way, embedded BI adds features usually specific to BI software, enriching the application and keeping things simple for the user, who won't need to install or adapt to a new tool. The time between data collection and analysis is also shortened.
Today, collecting data has become easier than ever before, but critics often say that by the time business users get the reports or dashboards, it is already too late to take action. This is where embedded BI steps in, helping address that issue by shifting from reactive to proactive analytics.
At datapine, our Authentication and Value Communication Module (AVCM) eases and speeds up the complex process of displaying only the content relevant to each user and restricting access to only the data that user is allowed to see. This module enables you to deliver user-specific content on all your embedded dashboards, which can themselves be styled and re-branded to match the look and feel of your existing application.
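The AVCM itself is proprietary, but the general idea of user-specific content can be sketched as row-level filtering against a permission map. All names and fields below are hypothetical, not part of any datapine API:

```python
# Hypothetical permission map: which dashboard regions each user may see
PERMISSIONS = {
    "alice": {"EMEA", "APAC"},
    "bob": {"EMEA"},
}

def visible_rows(user, rows):
    """Return only the dashboard rows whose region the user is allowed to see."""
    allowed = PERMISSIONS.get(user, set())
    return [r for r in rows if r["region"] in allowed]

rows = [
    {"region": "EMEA", "revenue": 120},
    {"region": "APAC", "revenue": 80},
]
```

Each viewer of the same embedded dashboard then sees a different slice of the data, with unknown users seeing nothing at all.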
7) Data Security

Security is without doubt one of the biggest business intelligence trends of recent years. The news seems to be filled with reports of data breaches and data security issues, including huge data losses by big brands such as AOL, MySpace, Compass Bank, AT&T, NHS, LinkedIn, Apple, JP Morgan Chase, and Anthem. While the big companies make the news, concerns are also being raised over the vulnerability of small businesses.
Database security has become a hot debate in both public and private organizations. This will only pick up speed in 2017. Business owners will increasingly search for the most secure solutions to avert the risk of data breaches and losses.
In this context, a perennially hot debate is the choice between on-premises and cloud-based BI tools: whether the software is installed locally on the company's own servers, or hosted in the cloud. At datapine we support both options. Earlier this year, we wrote an article weighing the advantages and drawbacks of each solution, which can be summarized with the chart below.
8) Data Governance
According to the DGI (Data Governance Institute), data governance is "the exercise of decision-making and authority for data-related matters." In other words, it is the control ensuring that any data entry is made according to particular standards. Data, access, and security issues are not all about data breaches. In 2017, organizations will increase their focus on data governance and data quality. As data is only useful when it is accessible, organizations will increasingly look to strike a balance between data access and security. They must also learn to remain agile and adapt their governance as the business changes.
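The "particular standards" idea can be sketched as a small set of validation rules that every record must pass before entry. The rules and fields below are invented for illustration:

```python
# Illustrative governance rules: each field must satisfy its agreed standard
RULES = {
    "email": lambda v: isinstance(v, str) and "@" in v,
    "revenue": lambda v: isinstance(v, (int, float)) and v >= 0,
    "country": lambda v: v in {"DE", "US", "FR"},
}

def validate(record):
    """Return the list of fields that violate the agreed data standards."""
    return [field for field, check in RULES.items()
            if not check(record.get(field))]
```

Records that pass every rule flow into the warehouse; any violations are surfaced by field name, so data stewards know exactly what to fix.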
New data preparation tools and methods will help fuel this trend and narrow the cultural gap between business and technology. Organizations are learning that data governance can help nurture a culture of analytics and meet business needs. People are also more likely to dig into their data when they have centralized, clean, and fast data sources. As Gartner analyst Merv Adrian recently tweeted, "Well-managed data is mandatory before you move to advanced analytics. Build controls for your Big Data & Advanced Analytics Pipeline (BAAP)."
The rush to implement self-service business intelligence capabilities has led to major Excel-like governance issues for a lot of organizations. In 2017, organizations will look to reinstate trust and reliability back into analytics practices.
9) Digitization

Digitization is the process of turning any kind of analog signal (an image, sound, or video, for instance) into a digital format that can be understood by computers and electronic devices. This information is often easier to store, access and share than the original format (for instance, a recorded song turned into binary code).
Applied to companies, transforming manual or offline business processes into online, computer-supported processes will be a major business intelligence trend in 2017. According to a McKinsey study, the benefits of digitizing information-intensive processes are tremendous: costs can be cut by up to 90%, and turnaround times can improve dramatically as well. Developing and using software to replace paper-based and manual processes enables businesses to collect and monitor data in real time, which helps managers see and address issues before they become critical. In this way, they can better understand process performance, cost drivers and causes of risk.
In the future, the most important raw material will be smart data, which will need to be managed with the right tools. To avoid lagging behind, companies will have to hop on the digitization train, but also implement new data sources such as sensors and internet-connected devices, and develop new models to drive business processes that used to be analog.
10) Visual Data Discovery
Big Data has reached a volume that is now hard to grasp even for data scientists. When they step into the data, they don't initially know where it will lead. Often, they begin their analysis with visual data discovery to find patterns or structures in data sets that seem impenetrable at first sight. Using different data visualization tools, they try to discover relationships between data elements across multiple data sets for subsequent analysis. That is the value of visual data discovery: you arrive at unexpected insights, identified on the fly in real time, and can respond quickly and decisively to reduce risk, enhance profits or seize short-lived business opportunities.
Similarly to visual data discovery, explorational visual analytics tools let you dig into big data using visualizations and best practices in visual perception. Such tools support business agility and self-service BI through a variety of innovations, which may include in-memory processing and the mashing of multiple data sources to inform your decisions. Explorational visual analytics is based on experimentation and creativity rather than predefined questions; visualizations are often created ad hoc to check different alternatives.
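One common discovery step, finding unexpectedly related fields across a data set, can be sketched as a pairwise correlation scan. This is a simplified illustration of what visual discovery tools automate behind the scenes:

```python
import statistics

def correlation(xs, ys):
    # Pearson correlation between two equal-length series
    mx, my = statistics.mean(xs), statistics.mean(ys)
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sum((x - mx) ** 2 for x in xs) ** 0.5
    sy = sum((y - my) ** 2 for y in ys) ** 0.5
    return cov / (sx * sy)

def strongest_pairs(columns, min_abs=0.7):
    """Scan every pair of columns and surface the strongly related ones."""
    names = list(columns)
    hits = []
    for i, a in enumerate(names):
        for b in names[i + 1:]:
            r = correlation(columns[a], columns[b])
            if abs(r) >= min_abs:
                hits.append((a, b, round(r, 2)))
    return hits
```

Fed with, say, ad spend, sales and an unrelated metric, the scan surfaces only the ad-spend/sales pair, which is exactly the kind of relationship a data scientist would then explore visually.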
11) Data Storytelling and Data Journalism
Recent years have witnessed a major shift from written to visual communication. The volume of incoming information increases, attention spans get shorter, and we are used to jumping from headline to headline or from bullet point to bullet point rather than digging into the text. To catch and retain our attention, journalists and other professionals tasked with passing on information turn to infographics. Thanks to its ability to communicate a complex set of data in a single meaningful graphic, a data visualization is worth a thousand words.
In 2017, the use of programming to gather and combine information will be an obvious necessity. Moreover, the use of data visualizations will grow, as more and more data presenters notice that it is attractive visuals, rather than tables of numbers or paragraphs of text, that succeed in grabbing our attention.
Become data-driven in 2017!
Being data-driven is no longer an ideal; it is an expectation in the modern business world. 2017 will be an exciting year of looking past the hype and moving towards extracting maximum value from state-of-the-art business intelligence software.