
My Data Science Friend Explained Linear Algebra for Machine Learning to Me — In One Simple Analogy

LORY
11 min read · Nov 21, 2024


How a Rubber Sheet Helped Me Understand Linear Algebra Concepts in 20 Minutes

Background

Like most developers, I’ve always been curious about how neural networks, computer vision, and large language models (LLMs) work under the hood. Every data scientist I talked to pointed me back to linear algebra and calculus, saying that understanding these would make everything much clearer. So, I spent last month studying, and then started the conversation below with my friend to try to piece it all together.

The Story

Me: “Hey man, I’ve been diving back into some math because you told me that linear algebra is essential for machine learning. I also remember you told me to study vector spaces and matrices, SVD, PCA, and so on. I did… but it’s all still kind of fuzzy. I don’t get what all these concepts really mean — things like linear transformations, dot products, eigenvectors, and matrix multiplication. Could you help me understand them without any formulas?”

DS Friend: “Take it easy, bro! Linear algebra can feel complex, but it’s pretty straightforward with the right analogies. Let me give you a simple picture. Once you get it, everything will make sense because it’s all interconnected.”

Dot Product
