Honestly, I'm so sick of these machine learning bootcamps and easy tutorials that claim you don't need any math background, because it's a total lie. I started one popular course last month (paid $40 for it on sale) and everything was fine while we were just importing scikit-learn and calling fit, but as soon as we got to the actual mechanics of gradient descent and backpropagation I hit a brick wall. Every time I think I understand what a model is doing, I see some notation with capital sigmas or partial derivatives and my brain just shuts off. It's so frustrating because I really want to build a recommendation tool for my vintage camera collection site, but I feel like an idiot staring at these formulas.
The problem is I haven't touched a math textbook since high school, about 10 years ago, and even then I barely scraped by in pre-calc. I don't have the time or money to go back for a full math degree, but I can't just keep guessing at what the hyperparameters are doing. I tried looking up "math for ML" and the lists are endless: multivariate calculus, linear algebra, probability theory, statistics... It's overwhelming. Do I really need to know how to calculate a Hessian matrix by hand, or is that overkill?
I have about $60 left in my learning budget for this quarter and I want to spend it on the right things. If I spend the next three months focusing on just one or two areas of math so I can actually understand what's happening under the hood of these neural networks, which should they be? If you had to pick the absolute essentials that actually show up in the code, as opposed to pure theory, what am I looking at? Is it just linear algebra, or is statistics more important for things like error distributions? I just need a roadmap that isn't "learn everything," because I'm ready to give up on this whole career pivot if I keep hitting these walls...
@Reply #2 - good point! Honestly, most of those "zero to hero" courses are kind of a disaster. I had trouble with the fast-track tutorials too; they fall apart the moment you actually need to debug a model or figure out why the loss won't drop. Unfortunately, you can't skip the foundations if you want to stop guessing, because tweaking hyperparameters without knowing the underlying mechanics is just trial and error. Quick question before you buy anything, though: are you building a simple content-based filter for the cameras, or something with deep learning? That changes which math to focus on first. If you've got $60, try these:
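For what it's worth, if you go the content-based route, the math is lighter than you might fear: ranking items by similarity is mostly dot products. Here's a rough sketch with made-up camera features (the names, features, and numbers are all hypothetical, just to show the shape of it):

```python
import numpy as np

# rows = cameras, columns = numeric features (era, format, price tier...)
# everything here is invented for illustration
catalog = np.array([
    [1.0, 0.0, 0.3],   # "Leica M3"
    [0.9, 0.1, 0.4],   # "Canon P"
    [0.1, 1.0, 0.8],   # "Polaroid SX-70"
])
names = ["Leica M3", "Canon P", "Polaroid SX-70"]

def recommend(liked_idx, catalog, names, k=2):
    """Rank cameras by cosine similarity to the one the user liked."""
    liked = catalog[liked_idx]
    # cosine similarity = dot product of the vectors, scaled by their lengths
    norms = np.linalg.norm(catalog, axis=1) * np.linalg.norm(liked)
    scores = catalog @ liked / norms
    order = np.argsort(-scores)  # best match first
    return [names[i] for i in order if i != liked_idx][:k]

print(recommend(0, catalog, names))  # the Canon P ranks first
```

If that's all your site needs, linear algebra alone covers it; deep learning is where the calculus starts to matter.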
> it feels like every time i think i understand what a model is doing i see some notation with capital Sigmas or partial derivatives and my brain just shuts off.

Man, I totally get that. I remember trying to build my first image classifier years ago and feeling like a total fraud because I couldn't explain a simple dot product to save my life. I wasted way too much money on heavy university textbooks that just gathered dust before realizing you don't actually need a full degree for this. If you only have 60 bucks left, honestly skip the fancy paid bootcamps for now. Grab "Mathematics for Machine Learning" (Cambridge University Press), or find the legal free PDF version online if you want to save the cash for a hardware upgrade. In my experience, linear algebra is 80% of the battle because everything is matrices and vectors under the hood. Once you get how those transform data, the scary sigmas just look like simple loops. Focus on matrix multiplication and the basic chain rule for backprop; that'll get you way further than any generic tutorial.
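To back up the "sigmas are just loops" point: the capital-sigma sum in the textbook dot product, and the chain-rule gradient step behind one-variable gradient descent, each fit in a few lines. A minimal sketch with toy numbers (nothing here is from any particular course):

```python
import numpy as np

# A capital sigma is just a loop: dot(x, w) = Σ_i x_i * w_i
x = np.array([1.0, 2.0, 3.0])
w = np.array([0.5, -1.0, 2.0])
total = 0.0
for xi, wi in zip(x, w):
    total += xi * wi
assert np.isclose(total, x @ w)  # same thing numpy computes in one operator

# The chain rule is all of "backprop" for a one-weight model:
#   prediction p = w*x, loss L = (p - y)^2
#   dL/dw = dL/dp * dp/dw = 2*(p - y) * x
def grad_step(w, x, y, lr=0.1):
    p = w * x
    grad = 2 * (p - y) * x   # chain rule, written out
    return w - lr * grad     # gradient descent update

w_scalar = 0.0
for _ in range(50):
    w_scalar = grad_step(w_scalar, x=2.0, y=6.0)
print(round(w_scalar, 3))  # converges to 3.0, since 3 * 2 == 6
```

Real networks just do this with matrices of weights instead of one scalar, which is exactly why linear algebra plus the chain rule covers so much ground.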
Like someone mentioned, focusing on specific areas of math instead of trying to learn everything at once is the only way to stay sane. When I was first trying to build a custom ranking algorithm for a side project, I almost quit because every tutorial assumed I already knew multivariable calculus. I ended up buying a few different books to see what stuck, and some were way too academic for a dev just trying to get code running. Here are a couple of reliable options I found genuinely helpful without being boring: