
Is linear algebra more important than calculus for machine learning?

5 Posts
6 Users
0 Reactions
102 Views
0
Topic starter

I've been messing around with scikit-learn for my car price predictor project in Berlin (trying to get it live by next month), but I've hit a wall on the theory side. My logic was that calculus would be the big one because of gradient descent, but now that I'm digging into SVD and eigendecomposition for dimensionality reduction, I'm seeing linear algebra everywhere. It's like everything is just matrices under the hood. I always heard calc was the foundation, but now I'm kind of pivoting my study plan because LA feels way more practical for actually building the models. Am I crazy for wanting to skip the heavy multivariable calc and just grind out more matrix theory instead?
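For context, here's roughly the kind of thing I'm playing with (toy data and made-up numbers, not my actual car dataset):

```python
# Sketch of the "SVD for dimensionality reduction" idea with
# scikit-learn's TruncatedSVD. Toy data: 20 noisy features that
# really only carry 3 underlying dimensions of signal.
import numpy as np
from sklearn.decomposition import TruncatedSVD

rng = np.random.default_rng(0)
latent = rng.normal(size=(100, 3))           # 3 hidden factors
mixing = rng.normal(size=(3, 20))            # spread across 20 features
X = latent @ mixing + 0.01 * rng.normal(size=(100, 20))

svd = TruncatedSVD(n_components=3, random_state=0)
X_reduced = svd.fit_transform(X)

print(X_reduced.shape)                       # (100, 3)
print(svd.explained_variance_ratio_.sum())   # close to 1.0 here
```

Three components recover almost all the variance, which is exactly the "everything is matrices" feeling I mean.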


5 Answers
11

Quick question: are you using gradient boosting or just basic regression for the car project? In my experience, you shouldn't blow your budget on expensive courses or new books.

  • get a used Texas Instruments TI-84 Plus CE Graphing Calculator for checking manual calcs
  • pick up O'Reilly Hands-On Machine Learning with Scikit-Learn, Keras, and TensorFlow (3rd Edition) second hand

You honestly don't need the heavy theory until you start writing custom loss functions.
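If it helps, here's the kind of comparison I mean, all toy data and made-up numbers (not your actual features):

```python
# Hypothetical check: basic linear regression vs. gradient boosting
# on synthetic "car price" data with one nonlinear term.
import numpy as np
from sklearn.ensemble import GradientBoostingRegressor
from sklearn.linear_model import LinearRegression
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
n = 500
X = rng.uniform(0, 10, size=(n, 3))   # pretend: age, mileage, engine size
y = 30000 - 2000 * X[:, 0] - 1500 * X[:, 1] + 500 * X[:, 2] ** 2
y += rng.normal(0, 1000, size=n)      # noise

X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

lin = LinearRegression().fit(X_tr, y_tr)
gbr = GradientBoostingRegressor(random_state=0).fit(X_tr, y_tr)

print(f"linear R^2:   {lin.score(X_te, y_te):.3f}")
print(f"boosting R^2: {gbr.score(X_te, y_te):.3f}")
```

If boosting wins by a lot on your real data, that's a hint there's nonlinearity a plain linear model is missing.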


10

Re: "Quick question, are you using gradient boosting or..."

  • LA was my focus early on for saving cash. Grab a used Dover Publications Linear Algebra, Second Edition, and you'll be set. Let me know if you need tips!


3

I would suggest being careful before skipping calculus entirely. Linear algebra is the language of data, but calculus is the engine of optimization. In my experience, neglecting derivatives can lead to unstable models that fail later.

  • Study matrix decompositions for SVD
  • Don't skip partial derivatives

You might want to consider Cambridge University Press Mathematics for Machine Learning (Hardcover) for a reliable roadmap.
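To make the "engine of optimization" point concrete, here is a minimal sketch of gradient descent on least squares with the partial derivatives written out by hand (toy data, nothing to do with your actual project):

```python
# Gradient descent on mean squared error, by hand with NumPy.
# The update rule IS the vector of partial derivatives -- this is
# the calculus hiding inside every fit() call.
import numpy as np

rng = np.random.default_rng(42)
X = rng.normal(size=(200, 2))
true_w = np.array([3.0, -1.5])
y = X @ true_w + rng.normal(scale=0.1, size=200)

w = np.zeros(2)
lr = 0.1
for _ in range(500):
    # partial derivatives of MSE with respect to each weight
    grad = (2 / len(y)) * X.T @ (X @ w - y)
    w -= lr * grad

print(w)  # close to true_w = [3.0, -1.5]
```

If you can write that `grad` line yourself and explain why it looks the way it does, you have most of the multivariable calculus you actually need day to day.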


1

Works great for me


1

Tbh, skipping multivariable calc makes me a bit nervous for your project. Like someone mentioned, LA is the language, but I would suggest being really careful about dropping the calculus side entirely. I tried taking that shortcut once and it just led to a lot of confusion when my gradients started acting weird... it is kind of like trying to fix a car engine without knowing how combustion actually works.

I have been comparing Pearson Calculus: Early Transcendentals (15th Edition) with W. H. Freeman Linear Algebra and Its Applications (6th Edition) lately. While the LA book feels way more immediate for the coding part, the calculus one is what actually explains why your model is learning in the first place.

You might want to consider at least grinding out the partial derivatives. It'll make your life way easier when you have to debug why your car price predictor is giving wonky numbers next month. Let me know if you want pointers on which specific chapters are actually useful so you don't waste time on the super dense theory stuff.
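For example (totally made-up numbers), this is the kind of "gradients acting weird" I mean: leave mileage unscaled next to age and plain gradient descent just blows up:

```python
# One unscaled feature (mileage in the hundreds of thousands) makes
# hand-rolled gradient descent diverge: the gradient along that axis
# is enormous, so the weights overflow to inf/nan.
import numpy as np

rng = np.random.default_rng(1)
n = 100
age = rng.uniform(0, 10, n)            # years
mileage = rng.uniform(0, 200_000, n)   # km -- wildly different scale
X = np.column_stack([age, mileage])
y = 30_000 - 2_000 * age - 0.05 * mileage

w = np.zeros(2)
lr = 1e-3
for _ in range(100):
    grad = (2 / n) * X.T @ (X @ w - y)
    w -= lr * grad

print(w)  # has overflowed to inf/nan
```

Standardizing the features first (e.g. with scikit-learn's StandardScaler) fixes it, and understanding *why* it diverges is a calculus question, not a linear algebra one.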

