As this year comes to an end, the internet is swirling with trends and buzzwords to follow next year, and we are excited to present our top 10 analytics and business intelligence buzzwords for 2018. If you want to know what the next big things to watch out for are, read on!
Virtual Assistant (VA) was on everyone’s lips in 2017, and we want to open our top 10 business intelligence buzzwords with it. VA is a term that covers a rather broad scope, since a virtual assistant can be 100% human or 100% robot, and both matter equally. On the human side, a VA is someone providing support services to other people or businesses from a remote location. In theory they do everything that support staff do (secretarial work, for instance), but the role is not limited to clerical work: many virtual assistants also provide help with web design, marketing, and other PR work.
Here, however, we are more interested in the “virtual” and digitized side of the VA, which came to a head this year as Amazon, Google, and Sonos each held big events to draw attention to their latest smart speakers – and virtual assistants. These virtual assistants, also called AI assistants (for Artificial Intelligence), are application programs that understand natural language voice commands. Their goal is to complete tasks for the user: taking dictation, reading emails aloud, looking up phone numbers or movie showtimes, saving appointments in the calendar, giving weather forecasts, or controlling other smart home devices. Today’s popular virtual assistants include Amazon’s Alexa, Google Now, Apple’s Siri, and Microsoft’s Cortana.
The rise of virtual assistants is clear, and Gartner predicts that within two years, by 2019, a quarter of households in developed countries will use a VA as the primary interface to home services. What about businesses? Numerous daily technical tasks might quickly be taken over by VAs, and customer interaction is also gradually being handed to chatbots. Likewise, the business intelligence teams within companies might be transformed: with a VA connected to all of the organization’s data and information, business users would only need to ask the VA their strategic questions, and it would automatically, and faster than any human could, analyze the data and glean actionable insights for important decisions. That would mean an incredible amount of time saved and a big competitive advantage.
These virtual assistants are built on artificial intelligence platforms, machine learning, natural language processing, and speech recognition. All of them require a large amount of data to “feed” the algorithms, which learn from that input and improve their assistance through better predictions.
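To make the natural-language-understanding part less abstract, here is a minimal sketch of keyword-based intent matching, the simplest possible stand-in for what a virtual assistant does with a voice command once it has been transcribed. The intents and keyword lists are invented for the example; real assistants rely on trained language models rather than word overlap.

```python
# Minimal sketch of keyword-based intent matching. A virtual assistant
# first decides WHAT the user wants (the "intent") before acting on it.
# These intents and keywords are purely illustrative.

INTENTS = {
    "weather": ["weather", "forecast", "rain", "temperature"],
    "calendar": ["appointment", "meeting", "schedule", "agenda"],
    "dining": ["eat", "restaurant", "dinner", "food"],
}

def detect_intent(command: str) -> str:
    """Return the intent whose keywords best overlap the spoken command."""
    words = set(command.lower().split())
    scores = {
        intent: len(words & set(keywords))
        for intent, keywords in INTENTS.items()
    }
    best = max(scores, key=scores.get)
    return best if scores[best] > 0 else "unknown"

print(detect_intent("what is the weather forecast for tomorrow"))  # weather
print(detect_intent("where can I eat out tonight"))                # dining
```

A production assistant would also extract parameters (which day, which cuisine) and carry conversational context; this only shows the first classification step.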
This need for data input has, however, raised a lot of privacy concerns: smart home assistants like Alexa are always “listening”, since they need to respond to the wake word “Alexa” – at which point a light turns on to let the user know the device is ready for voice commands. A VA needs to remember all interactions and commands in order to learn from them and provide a better user experience: if you often visit Chinese restaurants, it knows you like Chinese food and will be more likely to suggest Chinese places when you ask “where can I eat out tonight?”. Cortana, for instance, needs data from the user’s device (emails, contacts, texts, location, search history, …) to work better. There is always the possibility of not signing in and using it “anonymously” (as anonymous as our era lets us be…), but the assistance is of course limited. And that is the whole dilemma with virtual assistants – and AI in general: how much do we give up for how much do we get? For many, it means opening Pandora’s box and abandoning our privacy for the sake of technology. AI, natural language processing, and voice recognition need us and our data to improve and work better. For businesses, the concern is even greater, as industrial espionage and data collection could be taken to a whole new level.
For the moment, all we can say is that their development mirrors the pace of innovation: exponential. With speech recognition improving every day, and virtual assistance moving towards cognitive computing, who knows what tomorrow holds in store? “Her” might not be that far away after all. Just try not to fall in love with your AI.
Pattern recognition is one of the strongest assets to use in business intelligence. Going through voluminous datasets from the past is a long and tedious task; you can ease the work by applying a machine-learning algorithm that identifies trends, finds patterns, and applies forecasting techniques to predict the next expected value of a data series.
Pattern recognition in business intelligence lets users predict the value of an upcoming period based on their historical data. With it, they can compare the expected value to the actual outcome and draw conclusions about their strategy and efforts. It can also surface unexpected changes in any part of the business chain – above all, behavior that doesn’t fit the known pattern.
Because today’s decision-makers need accurate, coherent, up-to-date information, we need to deliver actionable analytics and insights to help them build solid strategies and make tactical decisions. Pattern BI is here to provide them with just that.
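The forecasting idea described above can be sketched in a few lines: fit a simple linear trend to a historical series, predict the next value, and flag an actual outcome that deviates too far from the expected pattern. The sales figures and the 20% tolerance are invented for illustration; real BI tools use far richer models than a straight line.

```python
# Toy pattern-recognition sketch: fit a least-squares line through a
# historical series, forecast the next period, and flag anomalies.

def linear_forecast(series):
    """Fit a line through (0, y0), (1, y1), ...; predict the next point."""
    n = len(series)
    xs = range(n)
    mean_x = sum(xs) / n
    mean_y = sum(series) / n
    slope = sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, series)) / \
            sum((x - mean_x) ** 2 for x in xs)
    intercept = mean_y - slope * mean_x
    return slope * n + intercept  # expected value for the next period

def is_anomaly(series, actual, tolerance=0.2):
    """Flag the actual outcome if it misses the forecast by more than 20%."""
    expected = linear_forecast(series)
    return abs(actual - expected) > tolerance * abs(expected)

sales = [100, 110, 120, 130]      # a steady upward pattern
print(linear_forecast(sales))     # 140.0 - the next expected value
print(is_anomaly(sales, 138))     # False - within the pattern
print(is_anomaly(sales, 90))      # True  - unexpected behavior
```

The same comparison of expected versus actual values is what lets a BI tool draw conclusions about strategy, as described above.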
Automation is not exactly a new business intelligence buzzword, but it has been getting more important year after year and has grown exponentially over the past one. Formerly considered rather unreliable, business intelligence is today a fast-growing field driven by artificial intelligence (AI), pattern recognition, and machine learning. More and more companies are making informed decisions thanks to data and information, and the next stage is automation. Companies that have adopted a business intelligence strategy typically report cost benefits, revenue growth, an increase in customer satisfaction, and overall growth in performance.
Thanks to online BI tools, organizations manage their voluminous datasets better and can find more accurate insights and answers to their questions, quickly producing reports, dashboards, and visualizations. That is an improvement, but many important challenges remain, especially given the overwhelming volume and complexity of big data: which questions should business users ask, and what type of analysis do we need? BI produces so many analyses and insights, because it has so much data at hand, that it becomes hard to tell which are valuable and which can actually make an impact.
Automation with artificial intelligence addresses these issues. From insight discovery (automating the valuable questions to ask and areas to explore, not just what the user thinks of and enters as input) to insight synthesis (automatically ranking insights from most impactful to least and recognizing relationships between them), you benefit from faster, more purposeful decision-making. BI automation can be taken a step further still, with automated delivery of insights in understandable language (like English), explaining each result with a short text. It can also help remove human error due to bias (bad habits, outdated beliefs, inaccurate perceptions, personal convictions, etc.), thereby decreasing the margin for error and the risk that results will be missing crucial information.
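As a toy illustration of insight synthesis, the snippet below takes a handful of candidate insights (metric changes between two periods), ranks them from most to least impactful, and renders each one as a short English sentence. The metric names and figures are invented for the example; real automated BI scores insights with far more than raw relative change.

```python
# Toy "insight synthesis": rank metric changes by impact and deliver
# each as a plain-English sentence. Metrics and values are invented.

metrics = {
    "revenue":         (120_000, 150_000),   # (previous, current)
    "churn_rate":      (0.050, 0.048),
    "support_tickets": (400, 620),
}

def synthesize_insights(metrics):
    insights = []
    for name, (prev, curr) in metrics.items():
        change = (curr - prev) / prev            # relative change
        direction = "rose" if change > 0 else "fell"
        text = f"{name} {direction} by {abs(change):.1%}"
        insights.append((abs(change), text))
    insights.sort(reverse=True)                  # biggest impact first
    return [text for _, text in insights]

for line in synthesize_insights(metrics):
    print(line)
# support_tickets rose by 55.0%
# revenue rose by 25.0%
# churn_rate fell by 4.0%
```

Ranking by magnitude alone is of course naive, but it captures the idea: surface the most impactful findings first and express them in language anyone can read.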
Gartner predicts that by 2020, 40% of data science tasks will be automated, which also means that fewer data scientists will be needed to do the same amount of work – so data scientists should sharpen other skills to stay relevant! Every type of job will be affected by automation, and it is likely that some white-collar jobs will start to disappear as well. Even if AI is said to create more jobs than it takes, the question remains whether the workers whose jobs disappear will find positions at a similar level.
In any case, business intelligence will improve even faster in 2018 as automation and machine learning develop exponentially.
Humanized Big Data
By now, everyone has heard of and knows about big data – datasets, gathered everywhere by everything, so humongous that traditional data processing software is unfit to deal with them. The whole concept of “humanizing” big data might seem paradoxical at first, but think twice: data starts with humans, so they have to come back into the loop at some point. Getting back in touch with the human side is what is at stake here, and what we’ll discuss with this next data analytics buzzword.
For many, data analysis cannot be entirely automated; we cannot abandon the processing task to super-smart machines, because the human element cannot be removed from the analytical process. Big data’s biggest strength (the sheer quantity of information) is also its greatest weakness, as it loses touch with reality – the human reality. In his book Humanizing Big Data, Colin Strong asks many questions about how organizations manage the daily tidal wave of information big data delivers. How should we approach it? What should we do with it? Can only marketers benefit from it? Do we forget consumers in the process by surrendering all of our strategies to soulless technology?
Humanizing big data means working directly with it to get the full story, gleaning real insights into human behavior, and letting those insights drive consumer strategy. Humanizing aims to bring more context to information in order to tell its story: who and what is generating it? It seeks to simplify the complex, to make the abstract concrete. Non-data scientists should be able to derive answers and insights from big data analyses, allowing faster and more accurate decision-making. Humanized big data provides organizations with incredible business value and greater innovation opportunities.
Also on our business intelligence buzzwords shortlist last year, data wrangling continues to be talked about. Data wrangling is the process of manually converting data from one raw format into another, allowing easier and more convenient consumption of the data. It is a very time-consuming data-cleaning step that is necessary in any data analysis.
The data wrangling process follows three steps: data extraction, the wrangling itself (using algorithms), and finally collection of the results for later use. This is an increasingly important process to master, as the amount of data gathered by organizations is not going to decrease any time soon and big data is here to stay. With technological improvements in business intelligence in general, and self-service BI tools in particular, data wrangling will become easier and less long and complex for IT teams and data scientists in the future.
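The three steps can be sketched with a small, hypothetical example: extract fields from messy raw rows, wrangle them into clean, typed records, and collect the results. Real wrangling is done with dedicated tools; this only illustrates the shape of the process.

```python
# Minimal data-wrangling sketch: extraction, wrangling, collection.
# The raw rows are invented, messy on purpose (stray spaces, mixed
# capitalization, a missing value).

raw_rows = [
    "  Alice , 2018-01-05 , 1200 ",
    "Bob,2018-01-06,",              # record with a missing amount
    "  carol ,2018-01-07, 950",
]

def extract(rows):
    """Step 1: split raw text lines into stripped fields."""
    return [[field.strip() for field in row.split(",")] for row in rows]

def wrangle(records):
    """Step 2: normalize names, convert types, drop incomplete records."""
    clean = []
    for name, date, amount in records:
        if not amount:
            continue                 # discard records missing data
        clean.append({"name": name.title(), "date": date,
                      "amount": int(amount)})
    return clean

# Step 3: collect the results for downstream analysis.
result = wrangle(extract(raw_rows))
print(result)   # two clean records: Alice and Carol
```

Even this tiny example shows why the step is so time-consuming at scale: every source format needs its own extraction and cleaning rules.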
Continuous Integration / Continuous Delivery / Continuous Deployment
Our next business intelligence buzzwords are more on the IT side and tackle software development. Continuous Integration (CI) is a method that requires developers to integrate code into a shared repository several times a day. The goal is to catch code errors as soon as possible, as each check-in is verified by an automated build and run of the tests. That way, teams can locate errors far more easily than if they had to go through hundreds of lines of code at the end of the day, or wait for a release to merge their changes into the release branch. CI emphasizes test automation to ensure that the application is not broken when new changes are integrated.
Continuous Delivery (CD) is not to be confused with continuous deployment, which shares the CD acronym. Continuous delivery makes sure that changes can be released quickly and sustainably. The automated tests are topped by an automated release process that lets you release your app and its changes faster and more frequently, at the click of a button. It reduces risk, time, and cost by enabling more incremental updates to apps in production.
With continuous deployment, you go a step further by removing human intervention: the release is automatic and is only stopped by a failed test during integration. Continuous deployment relieves teams of a lot of stress, as “release day” no longer exists: developers can focus on building the software and see their work in production minutes after they finish.
In short, continuous deployment is like continuous delivery but with automatic releases, while continuous integration is stage one of both continuous delivery and deployment.
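That distinction can be captured in a toy release-gate function. The parameters stand in for a real pipeline’s test stage and a release manager’s approval click, and are of course invented for illustration; actual pipelines are configured in CI tools, not written like this.

```python
# Toy sketch of the release gates separating CI, continuous delivery,
# and continuous deployment. All names here are illustrative.

def should_release(tests_pass: bool, manually_approved: bool,
                   continuous_deployment: bool) -> bool:
    """Decide whether a change ships to production."""
    if not tests_pass:
        return False                  # CI: a failed check-in never ships
    if continuous_deployment:
        return True                   # deployment: no human in the loop
    return manually_approved          # delivery: releasable at a click

# Continuous delivery: passing tests alone are not enough to ship.
print(should_release(True, False, continuous_deployment=False))   # False
# Continuous deployment: every passing change goes straight out.
print(should_release(True, False, continuous_deployment=True))    # True
```

The only difference between the two CDs is that last human gate; everything upstream (integration, automated build and tests) is shared.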
Informed Data Lake
Once again, the informed data lake made it onto our business intelligence buzzwords shortlist for 2018. Data lakes are usually contrasted with data warehouses, so let’s go over some definitions to clear things up.
A data lake, according to Gartner research director Nick Heudecker, is “marketed as an enterprise wide data management platform, for analyzing disparate sources of data in its native format”. The idea behind it is rather simple: move data in its original format into a data lake instead of placing it in a purpose-built data store. That way, you can bypass the often-heavy upfront costs of data ingestion. Once you have placed your data in the lake, anyone in the organization can access it and, ultimately, analyze it.
The difference from a data warehouse is that data lakes can support all types of data and all types of users; they also adapt to change more easily and provide faster insights. That doesn’t mean data warehouses are being left behind: on the other side, relational database software continues to improve, and its development is heading towards making data warehouses more scalable, faster, and more reliable.
This data analytics buzzword is somewhat of a déjà vu: augmented analytics was previously referred to as “smart data discovery”. It is the combination of several data processes that, instead of just giving back data, provide valuable, strategy-changing recommendations. It is an approach that automates insights using natural language generation and machine learning – and as we have seen throughout this article, machine learning automation is everywhere, affecting everything and transforming the way we will build, analyze, and consume analytics in the future.
Augmented analytics includes augmented data preparation, augmented data discovery, and finally augmented data science and machine learning. As Gartner explains, central to the development of augmented analytics is the use of machine learning automation to augment human intelligence and contextual understanding across the whole analytics workflow. Augmented analytics will help provide unbiased material for better decisions and a more impartial comprehension of context, and will transform the way we interact with data.
Augmented analytics will also pave the way for an upcoming trend, not yet developed or democratized, that we mentioned at the beginning of this post with the development of virtual assistants: conversational analytics. Using natural language processing (voice or text), business users will be able to explore their data, generate queries, and receive and act on insights via a VA or mobile device.
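As a rough sketch of what conversational analytics might look like, the snippet below maps a plain-English question onto a query over a tiny, invented sales dataset. Real systems would use full natural language processing rather than keyword matching, but the flow is the same: question in, data-backed answer out.

```python
# Toy conversational-analytics sketch: answer plain-English questions
# from a dataset. The regions, figures, and grammar are invented.

sales_by_region = {"north": 42_000, "south": 35_500, "west": 51_200}

def answer(question: str) -> str:
    q = question.lower()
    if "best" in q or "highest" in q:
        region = max(sales_by_region, key=sales_by_region.get)
        return f"{region} had the highest sales: {sales_by_region[region]}"
    for region, value in sales_by_region.items():
        if region in q:
            return f"sales in {region}: {value}"
    return "sorry, I did not understand the question"

print(answer("Which region had the highest sales?"))
print(answer("Show me sales in the south"))
```

Swap the keyword matching for a speech-recognition and NLP front end, and you have the virtual-assistant-driven analytics described above.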
Another data analytics buzzword that we like and chose to add to our list is the digital twin. Imagine if you had a digital version of yourself that you could run tests on to see how your life would turn out according to different choices you made.
Oh, you want to see what that move to Paris would have been like? Or how getting that extra degree would have affected your current state of affairs?
A digital twin is sort of like that, but for supply chains and industrial applications. Basically, the technology works by “building” a virtual version of a factory, or a wind turbine, or an element in a supply chain.
This digital twin can then be subjected to tests designed to increase efficiency without doing any physical work. The digital twin can also receive data from sensors on its physical twin, allowing equipment safety and status to be monitored without physically checking the equipment as often.
This technology is already being used in 3D CAD models and manufacturing simulations, but in 2018 things will be taken further – and results are already being seen. GE reports that Black & Decker is using digital twins on one of its factory assembly lines, leading to “labor utilization improvements of 12% and a 10% increase in throughput.”
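The pattern itself is easy to sketch: a virtual object mirrors sensor readings streamed from its physical counterpart, reports its status, and lets you run “what if” tests without touching the real equipment. The turbine, thresholds, and readings below are all hypothetical.

```python
# Highly simplified digital-twin sketch: the twin ingests sensor data
# from the physical asset, monitors safety, and runs what-if tests.
# Threshold and readings are invented for illustration.

class TurbineTwin:
    MAX_SAFE_TEMP = 90.0   # hypothetical safety threshold

    def __init__(self):
        self.temperature = None

    def ingest(self, sensor_reading: float):
        """Update the twin from the physical twin's sensor feed."""
        self.temperature = sensor_reading

    def status(self) -> str:
        if self.temperature is None:
            return "no data"
        return "alert" if self.temperature > self.MAX_SAFE_TEMP else "ok"

    def simulate(self, extra_load_temp: float) -> str:
        """What-if test: how would added load affect safety?"""
        projected = self.temperature + extra_load_temp
        return "alert" if projected > self.MAX_SAFE_TEMP else "ok"

twin = TurbineTwin()
twin.ingest(72.5)            # reading streamed from the real turbine
print(twin.status())         # ok - no site visit needed
print(twin.simulate(25.0))   # alert - the test ran only on the twin
```

The monitoring and the simulation both happen on the virtual side, which is exactly what makes the approach cheap and safe compared to experimenting on the physical asset.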
Last but not least, we will finish this round of analytics and business intelligence buzzwords with accessible BI. As a result of another BI buzzword mentioned above – automation – and permanent innovation, business intelligence will become widely accessible to ordinary users.
The image of SQL experts, data scientists, and systems analysts laboring over data to extract the maximum possible is becoming obsolete. BI has already helped simplify data analysis for many business users, and the widespread adoption of self-service BI software has democratized data within organizations. Automated business intelligence accelerates that process and will make BI accessible to anyone and everyone: it will no longer be restricted to small groups of specialists, and “citizen data scientists” will become the norm. Modern BI means less specialization, more automation, and an easy approach to data analytics for everyone.
By creating more streamlined processes to dig deep into business data, productivity will increase, and the skills gap will be easier to overcome. Business intelligence will hence become more accessible, democratizing data in 2018 more than ever before.
What Are The Analytics & Business Intelligence Buzzwords For 2018?
To survive and thrive in our digital era, it is important to stay up-to-date with the latest innovation and trends. Here’s our top 10 analytics and business intelligence buzzwords we’ll all discuss next year:
- Virtual Assistant
- Pattern BI
- Automation
- Humanized Big Data
- Data Wrangling
- Continuous Integration/Continuous Delivery/Continuous Deployment
- Informed Data Lake
- Augmented Analytics
- Digital Twin
- Accessible BI
Everything evolves and changes, today faster than ever before. Once again, we expect a big future for the BI industry, and we are more than excited to see how all the buzz we discussed above will develop in 2018!