These days, it seems whenever technology is mentioned in the media, there is a new buzzword to boot.
Navigating them can become a nightmare, and you’d be forgiven for not being able to tell your VR from your AI and your IoT.
I-O-what-to-who?
I’ve decided to gather the super-futuristic tech lingo in one place and lay it out in layman’s terms.
If you want to sound cool in front of your web developer pals, or just want to see a simple, clear list of the tech advances that will shape the landscape over the next few decades, you’re in the right place. Interested in one term in particular? Simply select it from the list below to skip right to it!
- Net Neutrality
- Big Data
- Data Mining
- Actionable Analytics
- Artificial Intelligence (AI)
- Machine Learning
- Personalization
- Voice Recognition
- Chatbots
- Augmented Reality (AR)
- Virtual Reality (VR)
- Robotics
- Smart Industry 4.0
- Internet Of Things (IoT)
- Quantum Computing
- Blockchain
- Technological Unemployment
If you’ve got more terms you think are a) misunderstood or b) baffling, let us know and we’ll add them in!
1. Net Neutrality
One term that always seems to come back into view is Net Neutrality. It has it all: futuristic, political, and a little bit sinister. Net Neutrality is the concept that a government or an Internet Provider should treat all data on the internet the same way, no matter where it comes from, where it is going, or what it contains.
This concept is vital to keeping the internet open and fair, because it prevents companies or governments from paying to have their traffic prioritized over everyone else's.
Picture this: Without Net Neutrality, an Internet Provider could prioritize the traffic of its own video streaming service over Netflix’s. This could mean that Netflix’s service would be terrible in that area, forcing users to sign up for the Internet Provider’s service instead.
Conversations around Net Neutrality are currently coming up again in the United States. If you live in the US, I urge you to inform yourself and join the discussion – even contact your government representatives and put the pressure on!
2. Big Data
A lot of the buzzwords we hear these days have to do with data, the most common being Big Data. This buzzword is used to describe very large amounts of data collected by companies or institutions.
Big Data typically refers to a set of data so big that traditional analysis software struggles to analyze it.
This could be data about what users click on a particular website, or users’ viewing habits on Netflix. This type of data can be useful for companies, but only if they know how to glean information from it.
3. Data Mining
The concept of discovering patterns from large amounts of data is known as Data Mining.
However, it is important to note that the term Data Mining is often misused to describe the tools and processes by which those patterns are extracted, or the computer applications that put that information to use.
Some of the terms below are tools or techniques used to mine data.
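To make that concrete, here’s a tiny Python sketch of the idea using made-up shopping baskets: the “pattern” we’re mining for is simply which products are most often bought together.

```python
from collections import Counter
from itertools import combinations

# Made-up purchase data: each list is one customer's shopping basket.
baskets = [
    ["bread", "butter", "jam"],
    ["bread", "butter"],
    ["coffee", "bread", "butter"],
    ["coffee", "milk"],
]

# Count how often each pair of products appears in the same basket.
pair_counts = Counter()
for basket in baskets:
    for pair in combinations(sorted(set(basket)), 2):
        pair_counts[pair] += 1

# The most frequent pairs are the kind of pattern a data miner is after.
print(pair_counts.most_common(2))  # [(('bread', 'butter'), 3), ...]
```

Real data mining tools apply the same idea to millions of records at once, but the principle is identical: let the data reveal patterns nobody programmed in by hand.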
4. Actionable Analytics
Actionable Analytics is a term you will hear in conversations about Big Data.
It refers to the analysis of data that leads a company to take some sort of concrete action.
In other words, it means analyzing data to find problem areas and their causes, to then determine what to do to solve those issues.
For example, a data-driven company could apply Actionable Analytics to determine why they aren’t selling a particular product and figure out what they need to do to sell more of it.
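As a (completely made-up) illustration in Python, imagine the analysis boils down to comparing sales and stock availability per region. The “actionable” part is the concrete fix the numbers point to:

```python
# Made-up sales records: (region, units_sold, days_out_of_stock)
sales = [
    ("North", 120, 2),
    ("South", 15, 21),
    ("East", 95, 4),
    ("West", 18, 19),
]

# Find the problem areas and a likely cause, then point to an action.
for region, units, stockout_days in sales:
    if units < 50:
        print(f"{region}: only {units} units sold and out of stock "
              f"for {stockout_days} days - fix the supply chain here.")
```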
5. Artificial Intelligence (AI)
When talking about data and gaining insights from it, the term “Artificial Intelligence” will often be thrown around.
AI is a broad term which refers to the display of intelligence by machines. It’s a field of study which focuses on a machine’s ability to capture data about its environment and learn or adapt from it to achieve some goal.
Through AI advances, machines have bested the best human players at chess, Jeopardy, and Go. And that’s even before ChatGPT and its associates arrived on the scene!
While the current discourse claims that “it won’t be long before machines are better than humans at almost everything”, that’s not quite right. If you have a good command of the building blocks of coding, AI and web development will be a winning combination.
6. Machine Learning
Machine Learning is a type of Artificial Intelligence which gives computer programs the ability to learn from new data without being explicitly programmed to do so.
Machines dig through data to find patterns and then modify the way they work accordingly.
For example, Facebook uses Machine Learning algorithms to personalize the content of your news feed, based on what type of links you often click on.
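Here’s a deliberately tiny Python sketch of that idea (not Facebook’s actual algorithm, of course): the program estimates which topics you click on most and re-ranks new links accordingly. Nobody ever hard-coded “this user likes sports” – it’s learned from the data.

```python
from collections import defaultdict

# Hypothetical click log: (topic of the link, did the user click it?)
click_log = [
    ("sports", True), ("politics", False), ("sports", True),
    ("tech", True), ("politics", False), ("tech", False),
]

# "Learn" from the data: estimate how likely the user is to click each topic.
clicks = defaultdict(int)
shown = defaultdict(int)
for topic, clicked in click_log:
    shown[topic] += 1
    clicks[topic] += clicked

click_rate = {topic: clicks[topic] / shown[topic] for topic in shown}

# Use what was learned: rank new links by the user's estimated interest.
new_links = ["politics", "sports", "tech"]
feed = sorted(new_links, key=lambda topic: click_rate.get(topic, 0), reverse=True)
print(feed)  # ['sports', 'tech', 'politics']
```

Feed the program more click data and the ranking updates itself – that’s the “learning” part.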
7. Personalization
Personalization is the concept of customizing the information presented to a user of a product. This is often the reason why companies collect and analyze large amounts of data: so they can personalize your experience with their product, in order to keep you coming back for more.
Personalization is often applied by large companies in the form of targeted ads or recommendation tools, such as Google Ads, Amazon’s product recommendation, or Facebook’s friend recommendation.
8. Voice Recognition
Voice Recognition is the concept of translating human speech into text that can be understood by computers.
This text can be used by computer programs in a multitude of ways, whether to write documents or carry out commands. This field can be incredibly complex because language itself is so complex – for this reason, CareerFoundry UX Design Course writers have worked with Amazon Alexa to create a specialization course for Voice Design.
A word’s meaning can be entirely dependent on the context in which it appears.
As a simple example, the word pair, meaning two of something, sounds exactly like the fruit pear. Voice Recognition programs need to work out which of these words the user actually said based on context from the rest of the sentence – and remember that decision the next time a similar case comes up, which is a great example of Machine Learning in action.
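A toy Python sketch of the pair/pear problem might look like this. Real systems use statistical language models rather than hand-written word lists, but the intuition is the same: lean on the surrounding words.

```python
# Made-up context clues for two words that sound identical.
context_clues = {
    "pair": {"two", "couple", "shoes", "socks", "of"},
    "pear": {"fruit", "eat", "tree", "ripe", "juicy"},
}

def disambiguate(candidates, sentence):
    """Pick the candidate sharing the most words with the rest of the sentence."""
    words = set(sentence.lower().split())
    return max(candidates, key=lambda w: len(context_clues[w] & words))

print(disambiguate(["pair", "pear"], "she bought a new ___ of shoes"))  # pair
print(disambiguate(["pair", "pear"], "he ate a ripe ___ for lunch"))    # pear
```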
9. Chatbots
Chatbots are computer programs which conduct “human-like” conversations with users, typically via text. They’re often used in instant messaging applications, such as Slack, or on websites to help users with frequently asked questions.
Machine Learning and Language Analysis are used to make chatbots seem more human-like and improve the way they communicate with their users.
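At its simplest, a chatbot can be nothing more than keyword matching, as in this made-up Python sketch. Machine Learning is what takes it from this to something that feels genuinely conversational.

```python
# A minimal rule-based FAQ bot; the questions and answers are invented.
faq = {
    "opening hours": "We're open 9am-6pm, Monday to Friday.",
    "refund": "You can request a refund within 30 days of purchase.",
    "password": "Click 'Forgot password' on the login page to reset it.",
}

def reply(message):
    text = message.lower()
    for keyword, answer in faq.items():
        if keyword in text:
            return answer
    return "Sorry, I didn't get that. Could you rephrase?"

print(reply("How do I reset my password?"))
print(reply("What are your opening hours?"))
```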
10. Augmented Reality (AR)
AR is the notion of adding computer-generated elements, such as sounds, videos, or GPS data, to the real world. These elements are typically displayed through a piece of technology like a smartphone or AR glasses.
Pokémon GO is a great example of an Augmented Reality game: it overlays game elements onto the real world through your phone, and what you see varies depending on where you are and how you interact with it.
11. Virtual Reality (VR)
Virtual Reality is similar to AR but it usually encompasses more of the user’s senses.
A user will typically wear goggles and headphones which block out the outside world and completely immerse them in the virtual reality created by the program. Examples include the Oculus Rift, the Google Daydream, and the PlayStation VR.
Virtual Reality has become one of the most promising trends in the gaming industry, but it can also be used to train people for complex procedures, such as flying a plane or performing surgery.
12. Robotics
Robotics is the field of science and engineering that focuses on designing and building robots and the computer programs that control them. Robotics has been around for a while, but it has become more important than ever with advances in Artificial Intelligence.
Robotics has a multitude of uses, ranging from the military (ever seen a bomb disposal robot in action?) to commercial products and manufacturing – which leads us to our next term.
13. Smart Industry 4.0
Industry 4.0 is the term now used to describe the current trend of automation and data exchange in the manufacturing sector.
The goal of this trend is to create “Smart Factories”, which are factories that analyze the way they work, make improvements to their processes through Machine Learning and then share their results with other factories so they can adapt as well.
14. Internet of Things (IoT)
The Internet of Things is a term that encompasses everything involved in connecting everyday devices to the Internet, in order to collect data from them, exchange data between devices or control them from a distance.
These devices can be cars, home automation systems or your everyday toaster.
15. Quantum Computing
Traditional computers work with bits, which can only have two states, 0 or 1.
Quantum computers, on the other hand, work with qubits, which can exist in a blend (a “superposition”) of both states at once. In a nutshell, this lets quantum computers tackle certain kinds of problems far faster than traditional machines ever could.
Quantum computing is still in its infancy but there is interesting research being conducted based on new advances in quantum theory.
Quantum Computing is considered the future of computing, and could well mean a complete overhaul of everything we know about computers thus far.
16. Blockchain
Blockchain is a new type of database, which is distributed (not just in one place) and encrypted by default.
Bear with me:
Traditional databases overwrite a record when a change occurs, but with Blockchain, every change creates a new record that is timestamped and contains a link to the previous version of the record.
This means that you can see all the transactions that ever occurred on a particular record since it was created, which facilitates verification and validation.
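Here’s a stripped-down Python sketch of that chaining idea (it leaves out mining, consensus, and the distributed part entirely): each record carries a timestamp, its own hash, and the hash of the record before it.

```python
import hashlib
import json
import time

def new_record(data, previous_hash):
    """Create a timestamped record that links back to the previous one."""
    record = {"data": data, "timestamp": time.time(), "previous_hash": previous_hash}
    record["hash"] = hashlib.sha256(
        json.dumps(record, sort_keys=True).encode()
    ).hexdigest()
    return record

# Every "change" is a new record chained to the one before it.
genesis = new_record({"owner": "Alice"}, previous_hash=None)
update = new_record({"owner": "Bob"}, previous_hash=genesis["hash"])

# Tampering with an old record would break the link, which makes verification easy.
print(update["previous_hash"] == genesis["hash"])  # True
```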
The popular cryptocurrency Bitcoin is based on Blockchain technology, and Blockchain is currently being evaluated for use in medical and banking records.
17. Technological Unemployment
Our last term is a fiery hot one right now – possibly the gloomy side-effect of the exciting developments explored above, depending on how governments and society respond:
Technological Unemployment is the notion of humans losing jobs to machines. Advances in robotics, machine learning and automation have led to jobs being taken over by machines.
Currently, physical labor jobs such as manufacturing or truck driving are the most affected, but intellectual jobs are increasingly under threat as AI advances.
Discussions around Technological Unemployment have led to interesting conversations around Universal Basic Income, which means everyone receiving an income from government agencies to cover the basic costs of living.
Interesting Universal Basic Income trials have started to take place in Oakland, Ontario, Uganda and Finland, to name only a few.
People who oppose the scheme claim that giving people money for doing nothing will destroy the economy. One thing is clear – we need to start thinking about new approaches to careers and unemployment.
What should we do now? Wait for the tech apocalypse?
Well – you could sit back and hope universal basic income is introduced in your town – or you can get ahead of the curve and learn tech skills!