Artificial intelligence has become something of a buzzword over the past few years. Companies in nearly every corner of our lives, from the cars we drive to the music streaming platforms we use to our search engines, now say they rely on some kind of artificial intelligence. So, how does AI work?
But while many companies claim to use AI in their products, it’s less clear what work this AI is doing. Some will say that AI helps automate work or find patterns in data. Others will claim that AI helps make their technology more “intelligent.”
Rarely, however, will you get a clear breakdown of what that really means.
Below, we’ll cover the basics of AI. This includes how it’s defined, how it works and how computer scientists implement AI in practice.
What Is Artificial Intelligence?
Part of the difficulty in discussing AI is that researchers in the field haven't settled on a single definition.
If you look in an AI textbook, you may see AI defined as any system that perceives its environment and takes actions to maximize its chance of achieving its goals.
Others focus on defining AI as being “intelligent.”
By this definition, intelligent technology is any tech that emulates how humans think and learn.
In current practice, AI mostly means algorithms and predictive models powered by massive amounts of data. These models are designed to find patterns and relationships when presented with new information similar to data they’ve already seen.
How Does AI Work?
Any successful AI project starts with as much relevant and high-quality data as possible. For example, some researchers may want to build an AI algorithm that helps doctors find signs of illness in scans. They would probably start with a database of scans with relevant information, like whether or not the tissue in a given scan showed signs of disease.
The researchers then “train” the AI on this dataset. Over time, the AI gets better at identifying patterns in similar sets of information.
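To make "training" concrete, here is a minimal sketch in Python. The two numeric "scan features," the labels and the nearest-centroid method are all invented for illustration; real medical imaging models are far more sophisticated:

```python
# Toy sketch of "training" on labeled data using a nearest-centroid rule.
# The scan features and labels here are invented for illustration only.

# Each "scan" is reduced to two made-up numeric features, paired with a
# label: 1 = signs of disease, 0 = healthy tissue.
training_data = [
    ((0.9, 0.8), 1), ((0.8, 0.9), 1), ((0.95, 0.7), 1),
    ((0.1, 0.2), 0), ((0.2, 0.1), 0), ((0.15, 0.25), 0),
]

def train(data):
    """'Training' here just averages the features seen for each label."""
    sums = {0: [0.0, 0.0], 1: [0.0, 0.0]}
    counts = {0: 0, 1: 0}
    for (x, y), label in data:
        sums[label][0] += x
        sums[label][1] += y
        counts[label] += 1
    return {label: (s[0] / counts[label], s[1] / counts[label])
            for label, s in sums.items()}

def predict(centroids, scan):
    """Classify a new scan by whichever learned average it sits closest to."""
    def dist(label):
        c = centroids[label]
        return (scan[0] - c[0]) ** 2 + (scan[1] - c[1]) ** 2
    return min(centroids, key=dist)

model = train(training_data)
print(predict(model, (0.85, 0.75)))  # resembles the diseased group -> 1
print(predict(model, (0.12, 0.18)))  # resembles the healthy group -> 0
```

The key idea is the same as in real systems: the model's parameters (here, two averages) are derived entirely from labeled examples, and new inputs are judged by their resemblance to what the model has already seen.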
There are a few popular approaches to AI development. The ones you're most likely to hear about are machine learning, neural networks and deep learning. These are nested rather than competing techniques: neural networks are one machine learning method, and deep learning refers to neural networks stacked many layers deep.
All work toward roughly the same goal: an AI algorithm that can solve a problem when presented with new data.
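As a small illustration of the mechanic these approaches share, here is a single artificial neuron, the basic building block of a neural network, trained by gradient descent. The data and learning rate are invented for illustration:

```python
import math

# A single artificial neuron trained by gradient descent. The invented task:
# learn to output ~1 for "large" inputs and ~0 for "small" ones.
samples = [(0.1, 0), (0.2, 0), (0.3, 0), (0.7, 1), (0.8, 1), (0.9, 1)]

w, b = 0.0, 0.0        # the neuron's weights start with no knowledge
learning_rate = 1.0

def sigmoid(z):
    """Squash any number into the range (0, 1)."""
    return 1.0 / (1.0 + math.exp(-z))

for _ in range(3000):  # repeated passes over the data = "training"
    for x, target in samples:
        out = sigmoid(w * x + b)
        error = out - target             # how wrong the neuron currently is
        w -= learning_rate * error * x   # nudge the weights to shrink error
        b -= learning_rate * error

print(sigmoid(w * 0.15 + b))  # small input: output near 0
print(sigmoid(w * 0.85 + b))  # large input: output near 1
```

The update rule is the whole trick: measure the error, then nudge every weight in the direction that reduces it. Deep learning applies this same rule to millions of weights across many layers at once.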
Examples of AI in Practice
Because AI has become such a common approach to problem-solving, you can find great examples of AI in almost every field.
In medicine, some researchers have used AI to improve cancer detection in tissue samples. Others have used AI to smooth out the movements of a surgical robot that replicates the live hand motions of a surgeon.
In tech, entertainment and retail, AI helps companies provide more relevant recommendations in search engines and online storefronts.
Soon, AI in the form of “machine vision” may help autonomous cars navigate the world’s roads. Logistics companies are already using similar tech in warehouses, where it powers self-piloting, cargo-moving robots.
In some cases, developers use AI to generate wholly new content as well. With an approach called a generative adversarial network (GAN), researchers pit two neural networks against each other: one, the generator, attempts to produce new data with statistics similar to the original dataset, while the other, the discriminator, tries to tell the generated data apart from the real thing.
As a result, these algorithms learn, over time, to create new content or data that looks like it could have come from the original dataset.
In practice, these algorithms generate content like the computer-created portraits at This Person Does Not Exist.
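The adversarial loop can be sketched in miniature. The one-dimensional "data," the tiny linear networks and the learning rate below are invented for illustration; real GANs use deep networks and vastly more data:

```python
import math
import random

# A drastically simplified one-dimensional GAN. "Real" data comes from a
# normal distribution centered at 4.0; the generator reshapes noise into
# fakes, and the discriminator learns to tell real from fake.
random.seed(0)

wg, bg = 1.0, 0.0   # generator:     g(z) = wg * z + bg
wd, bd = 0.1, 0.0   # discriminator: d(x) = sigmoid(wd * x + bd)
lr = 0.05

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

for _ in range(5000):
    real = random.gauss(4.0, 0.5)   # a sample of "real" data
    z = random.gauss(0.0, 1.0)      # noise fed to the generator
    fake = wg * z + bg

    # Discriminator step: push d(real) toward 1 and d(fake) toward 0.
    err_real = sigmoid(wd * real + bd) - 1.0
    err_fake = sigmoid(wd * fake + bd)
    wd -= lr * (err_real * real + err_fake * fake)
    bd -= lr * (err_real + err_fake)

    # Generator step: adjust wg, bg so the discriminator rates fakes as real.
    err_g = sigmoid(wd * fake + bd) - 1.0
    wg -= lr * err_g * wd * z
    bg -= lr * err_g * wd

fakes = [wg * random.gauss(0.0, 1.0) + bg for _ in range(200)]
print(sum(fakes) / len(fakes))  # typically drifts toward the real mean (4.0)
```

Over many rounds, the generator's output distribution drifts toward the real data. The same dynamic, scaled up to deep networks trained on photographs, is what lets full-size GANs produce convincing faces.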
How AI Is Changing Our Use of Technology
Artificial intelligence is on track to become a commonly used option for businesses that want to analyze and respond to massive amounts of data.
Most of this AI tech works in about the same way. An AI learns from existing data to make predictions or analyze new data. If you see a company claiming to use AI to boost their tech, what they’re probably doing is using AI to find new, subtle patterns in information.