The modern world is changing faster with each passing year. If you don’t pay attention to these changes, it’s easy to fall behind the times (and the market) while other companies beat you to the punch. The solution? Keep abreast of current changes – at least at the level of basic understanding. On top of that, if you can’t understand the buzzwords others are using in conversation, it’s much harder to look smart while participating in it. In this post, we’re going to give you the 12 IT & technology buzzwords you won’t be able to avoid in 2017, so that you can stay poised to take advantage of market opportunities and new conversations alike.
Virtual Assistants / Intelligent Apps

Alexa, Siri, Google Now, or Cortana: we all know those names, and use them more and more often: “Siri, what movies are showing at Cineplex 7?”
Google Now and Siri have become ubiquitous, thanks in large part to voice recognition software becoming much more powerful over the past year. Siri, Cortana, and Alexa all share similar roles – bringing us one step closer to the futuristic notion of AI virtual assistants that can do anything we need on a whim.
Some of these once-crazy applications are now reality. For example, iPhone users can now use Siri to book an Uber or Lyft – without ever opening the app. All you have to do is say, “Siri, take me home with an UberX,” and Siri will use your current location, along with your stored home address, to order a ride.
Artificial Intelligence (AI)
Scientists have been working on AI for years, and in 2017 we will see new applications of it. We have already mentioned Artificial Intelligence in our Business Intelligence trends for 2017, and we will probably talk about it again in the future, given its growing importance.
AI refers to the autonomous, intelligent behavior of software or machines that have a human-like ability to make decisions and to improve over time by learning from experience. Popular approaches currently include statistical methods, computational intelligence, and traditional symbolic AI, drawing on a large toolbox of techniques: versions of search and mathematical optimization, logic, methods based on probability and economics, and many others.

In business intelligence, we are evolving from static reports on what has already happened to proactive analytics with real-time dashboards that help businesses report more accurately. They let you see what is happening at every moment and send alerts when something is off-trend. For instance, the datapine solution includes an alerting algorithm based on advanced neural networks. Because it learns from historical trends and patterns, it identifies anomalies with high accuracy: every unexpected event triggers an alert.
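To make the alerting idea concrete, here is a minimal sketch of an off-trend check. datapine’s actual algorithm uses neural networks; the simple standard-deviation rule, function names, and numbers below are purely illustrative assumptions:

```python
from statistics import mean, stdev

def check_anomaly(history, new_value, threshold=3.0):
    """Flag new_value if it deviates from the historical trend
    by more than `threshold` standard deviations."""
    mu = mean(history)
    sigma = stdev(history)
    if sigma == 0:
        return new_value != mu
    return abs(new_value - mu) / sigma > threshold

# Hypothetical daily revenue history; today's figure is far off-trend.
history = [100, 102, 98, 101, 99, 103, 97]
print(check_anomaly(history, 250))  # off-trend -> alert
print(check_anomaly(history, 100))  # in line with history -> no alert
```

A learning-based system replaces the fixed threshold with a model of historical patterns, but the workflow is the same: compare the incoming value against what the history predicts and alert on large deviations.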
More examples of AI applications can be found across domains: in 2017 we will see more AI combined with Big Data in healthcare, where heart monitors, health monitors, and EEG signal-processing algorithms are already on the research frontline. Likewise, major advances have been made in the field of self-driving cars: Google’s standalone company Waymo logged 800,000 miles of autonomous driving in 2016, and Uber is teaming up with Mercedes-Benz to get self-driving cars on the road sooner rather than later. Another direction AI is heading is the truly smart smartphone that can handle tasks for us. For example, you could tell your phone about a trip you’re planning, and it would book the most convenient flight, hotel, and rental car for you. Who knows – AI may even have the potential to make, or at least assist with, our strategic business decisions in the near future.
A Business Insider article shows the stunning potential for growth in this space, stating: “Research firm Markets and Markets estimates that the AI market will grow from $420 million in 2014 to $5.05 billion by 2020.”
Augmented Reality / Virtual Reality
Another trending tech buzzword is Augmented Reality, which exploded onto the world stage with the release of Pokemon Go. With over 500 million downloads in the two months after release, Pokemon Go was extremely popular, to say the least. Smartphones already use augmented reality in small ways to project data onto the physical environment, and this trend will only continue.
If you don’t know how augmented reality works, Pokemon Go is a good example: as you walk around during your daily life, the app shows Pokemon superimposed on reality (as streamed in through your smartphone’s camera).
Pokemon Go is just the beginning – after the monumental success of this app, we’re sure to see many other developers looking to cash in on the AR space.
On the virtual reality side of things, Oculus Rift was released in 2016, and the potential applications of this technology are only just beginning to be explored. Essentially, the Oculus Rift allows people to see and hear in a virtual reality environment through goggles with non-flickering screens and 360-degree sound. While mostly used for games, Facebook has been exploring work applications of the Oculus Rift, and the Norwegian Army has used a development kit of the Rift for training armored unit drivers. Fox Sports even produced some college football coverage designed for consumption via the Oculus Rift.
In a previous post on the BI buzzwords for 2017, we considered applying Augmented Reality to the data visualization domain and came up with the possibility of visualizing big data with augmented reality – in supply chain or logistics, for instance – to streamline data management.
Deep Learning / Advanced Machine Learning
Deep learning is the new big trend in machine learning and an IT buzzword that we will see trending in 2017. It promises more powerful and faster machine learning, moving us one step closer to AI.
A prime application of this trend is Artificial Neural Networks (ANNs), a predictive-analytics method of analyzing data. Neural networks form a system of interconnected layers, with each subsequent layer acting as a filter for increasingly complex features that combine those of the previous layer. This feature hierarchy, and the filters that model significance in the data, make it possible for the layers to learn from experience. Thus, deep nets can crunch unstructured data that was previously unavailable for unsupervised analysis.
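As a toy sketch of that layered filtering – two fully connected layers, where the second layer’s features are built from the first layer’s outputs. The weights and sizes here are arbitrary assumptions, not a trained network:

```python
def relu(x):
    """Common activation function: passes positives, zeroes out negatives."""
    return max(0.0, x)

def layer(inputs, weights, biases):
    """One fully connected layer: each output neuron combines
    all features produced by the previous layer."""
    return [relu(sum(w * i for w, i in zip(ws, inputs)) + b)
            for ws, b in zip(weights, biases)]

# Two stacked layers: layer 2's features are combinations of layer 1's.
x = [0.5, -1.2, 3.0]
h1 = layer(x, [[0.2, -0.5, 0.1], [0.7, 0.3, -0.2]], [0.0, 0.1])
h2 = layer(h1, [[1.0, -1.0]], [0.0])
print(h2)  # a single higher-level feature built from the layer below
```

Training (which this sketch omits) adjusts the weights so that each layer’s filters come to represent features that are actually significant in the data.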
Blockchain

Blockchain is the trendiest trend of the moment. Everyone has heard of it, but few actually understand how it works due to its relative complexity, so we will try to introduce it here. Blockchain is the technology behind Bitcoin, and depending on the circles you run in, Bitcoin was either:
- The shadiest thing that has ever happened to the internet
- The coolest innovation in modern currency, ever
- A non-factor, since you had no idea what it was
In order to clear up any potential confusion, Bitcoin is a system of currency that doesn’t rely on banks, countries, or any outside institutions. This is potentially a very big deal, as many people living in developing countries have to deal with issues like hyperinflation, being unable to exchange their currency for others, and having to exchange currency on the black market.
Bitcoin can solve all that – but the technology underlying it, called blockchain, is the real star of the show. Blockchain is what enables Bitcoin users to exchange currency without any fear of being ripped off or receiving “counterfeit” Bitcoin. Basically, blockchain works by keeping a record of every transaction made using Bitcoin as a currency. This record is completely transparent to everyone and is part of the fundamental structure of Bitcoin.
As Investopedia puts it: “To use conventional banking as an analogy, the blockchain is like a full history of banking transactions. Bitcoin transactions are entered chronologically in a blockchain just the way bank transactions are. Blocks, meanwhile, are like individual bank statements.”
This blockchain structure makes it very difficult to forge Bitcoin or carry out any fraudulent activity involving the currency itself. While Bitcoin has yet to fully take off, it made history in early 2017 when its price surpassed that of gold for the first time. Blockchain’s solidity also shows in its adoption by major banks around the world as a way to transfer large sums with less time spent on security, thanks to the safety of the blockchain. For example, in October 2016 Wells Fargo and The Commonwealth Bank of Australia made history by using blockchain to facilitate payment for a shipment of cotton from the U.S. to China.
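A drastically simplified sketch of that record-keeping idea: each block stores a hash of the previous block, so tampering with any past transaction breaks the chain and is immediately detectable. Real Bitcoin blocks also involve proof-of-work, timestamps, and Merkle trees; every name and transaction below is illustrative:

```python
import hashlib
import json

def make_block(transactions, prev_hash):
    """Each block records its transactions plus the previous block's hash,
    chaining the whole history together."""
    body = json.dumps({"tx": transactions, "prev": prev_hash}, sort_keys=True)
    return {"tx": transactions, "prev": prev_hash,
            "hash": hashlib.sha256(body.encode()).hexdigest()}

def chain_is_valid(chain):
    """Recompute every hash and check every link; any tampering shows up."""
    for i, block in enumerate(chain):
        body = json.dumps({"tx": block["tx"], "prev": block["prev"]},
                          sort_keys=True)
        if block["hash"] != hashlib.sha256(body.encode()).hexdigest():
            return False
        if i > 0 and block["prev"] != chain[i - 1]["hash"]:
            return False
    return True

genesis = make_block(["alice pays bob 1 BTC"], prev_hash="0")
block2 = make_block(["bob pays carol 0.5 BTC"], genesis["hash"])
chain = [genesis, block2]
print(chain_is_valid(chain))                 # True
chain[0]["tx"] = ["alice pays bob 100 BTC"]  # try to rewrite history
print(chain_is_valid(chain))                 # False
```

This is why the transparent record works: rewriting one transaction would require recomputing every subsequent block’s hash, which the rest of the network would reject.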
Everything On-Demand (The Uber Effect)
Uber has dramatically changed some aspects of modern life, spawning a whole category of apps that give you the ability to get things on demand via your phone. Let’s take a look at some of them:
- Food delivered to you in 15 minutes or less for $8 – via UberEATS
- Flowers delivered to any location you like within 90 minutes – via BloomThat
- Groceries delivered directly to your door within 1 hour- via Instacart
- Your laundry picked up, laundered or dry-cleaned, and delivered to you – via DryV
- Alcohol delivered to your door within 1 hour – via Saucey
This trend will continue to affect our lives as more and more industries get “Uberfied”.
Digital Twin

Imagine if you had a digital version of yourself that you could run tests on to see how your life would turn out according to different choices. Want to see what that move to Paris would have been like? Or how getting that extra degree would have affected your current state of affairs?
A digital twin is sort of like that, but for supply chains and industrial applications. Basically, the technology works by “building” a virtual version of a factory, or a wind turbine, or an element in a supply chain. This digital twin can then be subjected to tests designed to increase efficiency without doing any physical work. The digital twin also can get data from sensors on the physical twin, allowing monitoring of equipment safety and status without physically checking the equipment as often.
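In code, the monitoring side of a digital twin might look like this minimal sketch – a virtual object that mirrors sensor readings from its physical counterpart and flags unsafe conditions. The class, threshold, and sensor values are assumptions for illustration:

```python
class DigitalTwin:
    """A minimal virtual mirror of a physical machine: it ingests
    sensor readings and reports the equipment's status remotely."""

    def __init__(self, name, max_temp_c):
        self.name = name
        self.max_temp_c = max_temp_c
        self.readings = []

    def ingest(self, temp_c):
        """Receive a temperature reading from the physical twin's sensor."""
        self.readings.append(temp_c)

    def status(self):
        """Check safety without anyone physically visiting the machine."""
        latest = self.readings[-1]
        return "ALERT" if latest > self.max_temp_c else "OK"

turbine = DigitalTwin("wind-turbine-7", max_temp_c=80.0)
turbine.ingest(72.5)
print(turbine.status())  # OK
turbine.ingest(91.0)
print(turbine.status())  # ALERT
```

Industrial digital twins extend this idea with full physics simulations, so efficiency experiments can be run on the virtual copy before touching the real equipment.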
This technology is already being used in CAD 3D models and manufacturing simulations, but in 2017 things are going to be taken further, and results are already being seen. As GE reports, Black & Decker is using digital twins on one of their factory’s assembly lines, leading to “labor utilization improvements of 12%, and a 10% increase in throughput.”
Smart Factory / Industry 4.0
The first industrial revolution introduced the mechanization of production using water and steam power. A hundred years later electricity, petroleum, new materials and substances, including alloys and chemicals, and communication technologies gave rise to the second industrial revolution. The third one was digital, with electronics and IT further automating production.
For the last few years scientists have been talking about the fourth revolution, which is happening now as an effect of the advent of collective technologies and concepts of value chain organization. In short, cyber-physical systems (i.e. workpiece carriers, assembly stations and products), humans, and smart factories will be able to connect and communicate with each other via the Internet of Things and the Internet of Services. The factory of the future – the smart factory – will be a merger of IT and manufacturing. Machines will be interconnected and intelligent and, thanks to software applications, will make use of data to make smart, timely, data-driven decisions.
Actionable Analytics / Self-service analytics
Actionable Analytics will be one of the most used technology buzzwords in 2017 as the volume, velocity and variety of big data will push business analytics to become more actionable. In 2017, software will crunch and correlate structured data and move closer to the point of action in real time. As opposed to older systems that primarily aggregated and computed structured data, actionable analytics tools will be able to reason, learn and deliver prescriptive advice.
New self-service business intelligence tools will provide users with advanced analytics and friendly user interfaces that enable autonomous, informed decision-making. Business people will be able to pull valuable data insights whenever and wherever they want, as cloud-based warehousing enables easy access on the fly. In theory, missed business decisions should become a thing of the past – but some decision-makers will still ignore data and follow their guts; humans will remain the weakest part of the system.
Internet of Things / Device Mesh / Ambient UX
Without a doubt, the Internet of Things (IoT) is one of the most influential IT buzzwords of recent years, and it will continue to grow in popularity as its applications become more and more tangible.
With the Internet of Things, the physical world will become one big information system. Everyday physical objects will be connected to the Internet and to each other, creating ambient intelligence. The new task of designers will be facilitating an ambient user experience that flows smoothly across, and exploits, different devices. The device mesh refers to an expanding set of endpoints people use to access applications and information: mobile devices, wearables, consumer and home electronics, automotive devices, and environmental devices – all the sensors in the Internet of Things that interact and cooperate with each other and that will constitute our Internet-connected reality.
The major holdup right now that is preventing the “smart home” revolution from happening is that there are too many differing platforms on the market. Google, Amazon, and Apple each have their own “ecosystems” that don’t play well with each other. However, it’s likely that soon this problem will be solved by the forces of innovation and capitalism – so keep an eye on this space in 2017.
React JS / React Native
The main idea of ReactJS is building reusable components, which makes code reuse and testing cheaper. ReactJS improves teamwork by enforcing workflow patterns, increasing development speed, and simplifying the coding process. This concept of componentized user interfaces is the future of web development.
Quantum Computing

Last but not least, Quantum Computing will be a rising tech buzzword in 2017. In general, quantum computers can solve much more complex problems than classical electronic computers by using quantum bits (qubits) instead of binary digits (bits). This means that data no longer has to be limited to two defined states, 0 or 1, which makes quantum computing far more flexible by allowing computations to be performed in parallel.
However, the real challenge is how quantum machines actually carry out these computations. Scientists have been researching this field for decades, so it may still take some years before quantum computing becomes a practical reality.
The rate of change in our modern world is accelerating with each passing year. However, by keeping pace with these changes, you can take advantage of new opportunities in the market that your slower competitors either can’t see or can’t act upon quickly enough.
Hopefully by reading these 12 technology buzzwords to watch out for in 2017 you’ve learned what to pay attention to in the headlines – the rest is up to you!