Explanations of machine learning tend to be either too complex or overly simplistic. I’ve had some luck explaining it to people in person with a few simple analogies:

The Jet of Machine Learning

Scratch the surface, and you see that machine learning is basically a kind of ‘statistical thinking.’ We’ve long had tools for doing statistical analysis on data. Machine learning just automates that analysis so we can do it at a much larger scale. The basic techniques have been around for decades, but machine learning didn’t really explode in popularity until just a few years ago, with the advent of powerful new processors (Graphics Processing Units and later Tensor Processing Units) and large-scale datasets from Internet services like Google Search, Amazon and Facebook.

Andrew Ng makes the analogy that compute power is the jet engine and data is the jet fuel of machine learning. Rather than fly you to Chicago, this jet builds statistical models that draw on their underlying data to simulate reality, somewhat analogously to the way we simulate reality in our own brains. These algorithmic models extend our biological brains to help them do something they’re not really built for: thinking statistically.

Big Data and Models

Before this powerful new jet showed up, we were already using machine learning to automate the building of statistical models. It saved a great deal of time and energy over the labor-intensive statistical techniques that preceded it, and that opened up interesting new applications, such as analyzing inventory levels in a warehouse, estimating the threat of overfishing from commercial boats, and predicting stock prices.
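To make that idea concrete, here is a minimal sketch of what “automating the building of a statistical model” looks like in code. It uses scikit-learn, a library the article does not name, and the warehouse-inventory features and numbers are hypothetical, synthetic illustrations rather than real data:

```python
# A minimal sketch of automated statistical model building, using
# scikit-learn (an illustrative choice, not one named in the article).
# The "inventory" features and synthetic data below are hypothetical.
import numpy as np
from sklearn.linear_model import LinearRegression

rng = np.random.default_rng(seed=0)

# Hypothetical features: units on hand and daily demand for 100 products.
X = rng.uniform(0, 1000, size=(100, 2))
# Synthetic target: days until restock is needed, with some noise.
y = 0.05 * X[:, 0] - 0.02 * X[:, 1] + rng.normal(0, 1, size=100)

# One line replaces the regression a statistician once fit by hand.
model = LinearRegression().fit(X, y)
print(model.coef_, model.intercept_)

# Predict for a new product: 500 units on hand, 30 units/day demand.
print(model.predict([[500.0, 30.0]]))
```

The point of the sketch is not the particular model, which is a plain linear regression, but that fitting it is a single automated step that scales to far more rows and columns than manual analysis ever could.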

These kinds of applications are often described as “Big Data,” or data analytics. In this work’s early phases, the models were typically static, a kind of snapshot analysis of the underlying data. Despite this limitation, these techniques proved valuable for analyzing large datasets. That made them very popular in large corporations and resulted in a thriving ecosystem of data analytics companies.

Deepening the Automation

It’s worth calling out one specific trick we now use to automate the building of these statistical models. It’s called Deep Learning, and it has taken the machine learning world by storm. The reason Deep Learning is so popular is that it allows developers to build models automatically, through exposure to large datasets. It does this with artificial neural networks composed of multiple layers, much the way animal brains are. The lower layers of these networks identify the simplest and most concrete features in the data, handing off their results to subsequent layers, which work on progressively more complex and holistic interpretations. The graphic below, from Nvidia, illustrates the layers of a deep neural network for identifying cars, starting with rudimentary lines, moving to wheel wells, doors, and other car parts, and finally on to full cars.
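The layer-by-layer idea is easy to see in code. Below is a minimal sketch of a small deep network, written with PyTorch as an illustrative choice (the article does not specify a framework); the layer sizes and the car/not-car labels are arbitrary assumptions that echo the Nvidia example rather than reproduce it:

```python
# A minimal sketch of a layered ("deep") neural network, loosely mirroring
# the Nvidia car example. Framework (PyTorch) and layer sizes are
# illustrative assumptions, not the article's or Nvidia's actual model.
import torch
import torch.nn as nn

model = nn.Sequential(
    # Lower layers: detect simple, concrete features such as edges.
    nn.Conv2d(3, 16, kernel_size=3, padding=1), nn.ReLU(),
    nn.MaxPool2d(2),
    # Middle layers: combine edges into parts (wheel wells, doors).
    nn.Conv2d(16, 32, kernel_size=3, padding=1), nn.ReLU(),
    nn.MaxPool2d(2),
    # Upper layers: assemble parts into a whole-object judgment.
    nn.Flatten(),
    nn.LazyLinear(2),  # two hypothetical classes: car / not car
)

# One fake 64x64 RGB image in; one score per class out.
scores = model(torch.randn(1, 3, 64, 64))
print(scores.shape)  # torch.Size([1, 2])
```

Each stage consumes the previous stage’s output, which is exactly the hand-off from simple features to holistic interpretations described above; training on a large labeled dataset is what tunes the layers to find those features automatically.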
