Top 20 AI terms

July 2020 | Article

By Bhavik Patel

In 1950, Alan Turing changed history with a simple question:

Can machines think?

Artificial intelligence answers Turing’s question with an emphatic ‘yes’. But while we’ve all heard of AI, what exactly is it? To be honest, you’ll get a different answer depending on who you ask, but here are the dictionary definitions of the two words:

Artificial – made or produced by human beings
Intelligence – the ability to learn, understand, make judgements or have opinions based on reason

What is AI?

Put simply, artificial intelligence is the ability of a machine or computer program to think and learn like a human. A machine can therefore perform tasks that typically require human decision-making.

Google’s CEO, Sundar Pichai, has described AI as “one of the most important things that humanity is working on… More profound than electricity or fire”.

AI has become so much a part of our daily lives that we barely notice it. If you’re reading this article on LinkedIn, it’s because of AI. Movie recommendations on Netflix? That’s AI. Siri and Alexa? AI again. And Google, Bing and every other search engine you can think of return results based on what their algorithms have learnt from your previous searches.

Given all this, it’s hardly surprising that a PwC report predicted that AI will contribute $15.7 trillion to the global economy by 2030 – more than the current output of China and India combined.

It’s the future of our economy, so we’ve put together 20 widely used AI terms you should be aware of.

Top 20 AI terms

1. Algorithm

A process or set of rules that precisely defines a sequence of operations to be followed in calculations or other problem-solving operations, especially by a computer.
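To make this concrete, here’s a minimal sketch of one well-known algorithm, binary search, written in Python (the function name and sample data are purely illustrative):

```python
# A classic algorithm: a precise sequence of steps for
# finding a value in a sorted list by repeatedly halving it.
def binary_search(items, target):
    low, high = 0, len(items) - 1
    while low <= high:
        mid = (low + high) // 2
        if items[mid] == target:
            return mid          # found: return its position
        elif items[mid] < target:
            low = mid + 1       # discard the lower half
        else:
            high = mid - 1      # discard the upper half
    return -1                   # not present

print(binary_search([2, 5, 8, 12, 21], 12))  # → 3
```

Every step is fully specified in advance, which is exactly what distinguishes a plain algorithm from the learning systems described below.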

2. Artificial intelligence (AI)

The ability of a machine to think and learn from data in order to perform tasks that would normally require human intelligence – for example, visual perception, speech recognition, decision-making and language translation.

3. Chatbot

A computer program designed to simulate conversation with human users, especially over the internet. A chatbot, short for ‘chatterbot’, communicates with us via text. Chatbots are often found on websites, apps or instant messengers.

4. Cognitive computing

A computerized model which mimics the human thought process, often used in complex situations where answers may be ambiguous. Cognitive computing overlaps with AI and involves many of the same underlying technologies.

5. Data mining

A process for finding patterns, correlations and anomalies within a larger set of raw data in order to predict outcomes.

6. Deep learning

A subset of machine learning inspired by the human brain, in which machines learn from experience and acquire skills without human involvement. Deep learning algorithms perform a task repeatedly, each time tweaking it a little to improve the outcome.

7. Digital ecosystem

An interdependent group of IT resources, enterprises and people which share standardised digital platforms and function as a unit for mutually beneficial purposes. A digital ecosystem is typically made up of suppliers, customers, partners and applications.

8. Hyperautomation

The drawing together of AI and machine learning, robotic process automation, process mining, analytics and advanced tools in order to eliminate silos and automate processes more intelligently.

9. Intelligent automation

A combination of RPA and AI which turns vast amounts of data into processable information and synthesizes it into a usable format. Because it is ‘intelligent’, it can make decisions, learn as it goes and recommend courses of action.

10. Machine learning (ML)

The ability of a machine or system to learn from experience and improve, without being explicitly programmed to do so.

11. Natural language processing (NLP)

A subfield of linguistics, computer science, information engineering and artificial intelligence concerned with the interactions between computers and human languages. NLP allows computers to process, understand, interpret and manipulate large amounts of natural human language in order to make decisions.

12. Neural network

A computing system comprising multiple layers of highly interconnected processing elements. Designed to mimic the functionality of the human brain, the network can learn the best way to perform tasks using both example and experience, without the need for task-specific programming.
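The building block of such a network is a single artificial neuron: it weighs up its inputs, adds a bias, and squashes the result through an activation function. A hedged sketch in Python (the weights and inputs are made-up numbers for illustration):

```python
import math

# One artificial neuron: weighted inputs plus a bias,
# passed through a sigmoid 'squashing' activation.
# Real networks connect many layers of these units.
def neuron(inputs, weights, bias):
    total = sum(i * w for i, w in zip(inputs, weights)) + bias
    return 1 / (1 + math.exp(-total))   # sigmoid: output between 0 and 1

out = neuron([0.5, 0.3], [0.8, -0.2], 0.1)
print(round(out, 3))
```

Training a network simply means adjusting those weights, using example and experience, until the outputs are useful.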

13. Optical character recognition (OCR)

The use of photoelectric devices and computer software to distinguish printed characters. The technology reads typed, printed or handwritten characters and converts them into machine-encoded text, allowing it to be searchable and editable.

14. Predictive analytics

The use of data, statistics and modelling to calculate the likelihood of future outcomes based on current and historical data.

15. Process mining

A process analysis method that helps organizations understand their current business processes. It uses available event logs in the systems to find variations, repetition, problems and bottlenecks so that an organization can gauge whether it’s worth investing in improvements.

16. Robotic process automation (RPA)

Software that can be easily programmed to perform complex rules-based work. Robots are given a defined set of instructions which enables them to carry out tasks just as a human would, but with greater speed and consistency.

17. Structured data

Any data which resides in a fixed field within a record or file. This includes data contained in relational databases or spreadsheets such as Excel, where information is placed in orderly columns and rows.
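A quick illustration of why structure matters: because every record shares the same fixed fields, structured data can be queried column by column. A minimal Python sketch (the field names and values are invented for the example):

```python
import csv
import io

# Structured data: every record has the same fixed fields,
# just like rows and columns in a spreadsheet or database table.
raw = """name,department,salary
Amara,Sales,52000
Ben,Engineering,61000
Chloe,Sales,48000
"""

rows = list(csv.DictReader(io.StringIO(raw)))
sales = [r["name"] for r in rows if r["department"] == "Sales"]
print(sales)  # → ['Amara', 'Chloe']
```

Unstructured data – emails, images, free text – has no such fixed fields, which is why it is much harder to query this way.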

18. Supervised learning

A form of machine learning in which both the input data and required output data are provided. The algorithm is effectively ‘taught’ or supervised by a training dataset that knows the correct answer. The machine iteratively makes predictions on the training data, is corrected by the ‘teacher’ and stops learning once it achieves an acceptable level of performance.
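That teach-predict-correct loop can be sketched in a few lines of plain Python. Here a single-weight model learns the relationship y = 2x from labelled examples (the data and learning rate are illustrative, not from any real system):

```python
# Supervised learning sketch: inputs AND correct answers are provided.
xs = [1.0, 2.0, 3.0, 4.0]   # input data
ys = [2.0, 4.0, 6.0, 8.0]   # required outputs – the 'teacher'

w = 0.0                      # the model: a single weight, initially wrong
for _ in range(200):         # iterate: predict, get corrected, tweak
    for x, y in zip(xs, ys):
        pred = w * x
        error = pred - y     # how far off the teacher says we are
        w -= 0.01 * error * x  # small tweak toward the correct answer

print(round(w, 2))  # → 2.0
```

After enough corrections the weight settles at 2.0, at which point the model has ‘learnt’ the rule without it ever being explicitly programmed.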

19. Unstructured data

The opposite of structured data, usually referring to information that does not reside in a traditional row-column database.

20. Unsupervised learning

A form of machine learning in which only the input data is provided, with no required output data. The machine is left on its own to find patterns from the input data. The aim is to model the underlying structure or distribution in the data in order to learn more about the data.
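Contrast that with the supervised example above: here the machine gets only inputs and must find structure by itself. A hedged sketch of simple one-dimensional clustering (a stripped-down k-means; the data points are invented):

```python
# Unsupervised learning sketch: only inputs, no correct answers.
# The algorithm discovers two natural groups in the data on its own.
data = [1.0, 1.2, 0.8, 8.0, 8.3, 7.9]
c1, c2 = data[0], data[3]    # initial guesses for the cluster centres

for _ in range(10):
    # assign each point to its nearest centre...
    g1 = [x for x in data if abs(x - c1) <= abs(x - c2)]
    g2 = [x for x in data if abs(x - c1) > abs(x - c2)]
    # ...then move each centre to the middle of its group
    c1 = sum(g1) / len(g1)
    c2 = sum(g2) / len(g2)

print(round(c1, 1), round(c2, 1))  # → 1.0 8.1
```

Nobody told the program there were two groups near 1 and 8; it modelled the underlying structure of the data by itself, which is the essence of unsupervised learning.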

According to a Statista survey, 84% of enterprises believe that investing in AI will lead to greater competitive advantages. Every industry and sector can benefit from it, including yours.


For information and ideas on how AI can transform your organization,
get in touch with the Quantanite experts.
