The math is pretty trivial, relatively speaking. As the previous post stated, ML and statistics have a huge amount of overlap. Calculus: you need it for some derivatives, and maybe other areas like optimal transport if you count that as ML. Just run through some tutorials to get a feel, and look at the solutions from some Kaggle winners.
Thanks for the info, will review some statistics stuff!
Also brush up on linear algebra and probability. That will help.
I think spending a little time on foundations (linear algebra, stats) will go a long way. Spending time on basic ML methods (linear regression, trees, PCA...) will also be very useful. Then you can go head-on into DL and CV. My two cents.
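To see how little machinery one of those basics actually needs, here's a minimal sketch of PCA via the SVD using only NumPy (the data here is made up for illustration):

```python
import numpy as np

rng = np.random.default_rng(7)
# Toy 2-D data, stretched much more along one axis than the other
X = rng.normal(size=(300, 2)) @ np.array([[3.0, 0.0], [0.0, 0.5]])

Xc = X - X.mean(axis=0)          # center the data first
U, S, Vt = np.linalg.svd(Xc, full_matrices=False)
explained = S**2 / np.sum(S**2)  # fraction of variance per component
X_1d = Xc @ Vt[0]                # project onto the first principal component
```

Most of the first component's job is done by plain linear algebra; `explained[0]` should be close to 1 for this stretched toy data.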
Lol at the guy who thinks you need to understand p-values to do machine learning. Don't listen to that, OP. Pick up a Kaggle challenge and learn as you solve it. Use PyTorch. If you don't know any linear algebra then you're doomed, that's true. Otherwise, only basic probability is necessary to get started.
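The "linear algebra plus basic calculus is enough to start" claim can be made concrete with a tiny gradient-descent sketch. NumPy is standing in for PyTorch here just to keep it dependency-free; the idea (loss, gradient, update loop) is the same one PyTorch automates:

```python
import numpy as np

rng = np.random.default_rng(42)
X = rng.normal(size=(128, 3))
true_w = np.array([1.0, -2.0, 0.5])
y = X @ true_w + 0.05 * rng.normal(size=128)

w = np.zeros(3)   # start from zero weights
lr = 0.1          # learning rate
for _ in range(500):
    # gradient of mean squared error: d/dw mean((Xw - y)^2)
    grad = 2 * X.T @ (X @ w - y) / len(y)
    w -= lr * grad
```

After the loop, `w` recovers something close to `true_w`; everything above is one derivative and a few matrix products.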
Can you please provide more details on why you made that remark about my comment? Just curious. I understand some very basic stuff may be enough to get started, but that would only lead to weak fundamentals, as OP will not be able to understand how exactly the algorithms work, or how they would be affected by changes.
Sure. Modern machine learning (neural nets) has little to do with frequentist statistics. Bayesian stats, yes. I've written and read many papers in the field, and I use neural nets every day at work. I have literally never seen p-values used in the same paper as neural nets (though I'm sure you can find some at the intersection of ML and neuroscience). Going back in ML time, GPs are pure Bayesian stats (read GPML by Chris Williams). SVMs have more to do with old-school stats. But if you know linear algebra, calculus, and basic Bayesian probability, you'll know more than enough to get a great start in neural nets. Finally, I honestly believe practice is as necessary as fundamentals: doing deep learning right is still very much a matter of intuition, unfortunately.
ML and statistics are closely related. For example, if you obtain a model and don't know how to interpret a p-value, that can be a challenge. Determining how the confidence intervals around best-fit curves change with more data or new independent parameters also helps with model interpretation. I would suggest taking a basic statistics course before diving into ML; ML is not about using scikit-learn and being happy with a 90% adjusted R². Calculus I'm not so sure about, unless you're trying to deep-dive into the math behind each algorithm, like actually building a lasso model without using a library.
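For what the p-value and confidence-interval point looks like in practice, here's a hedged sketch using only NumPy and the standard library, with a normal approximation in place of the exact t-distribution (the data and true slope of 0.5 are made up for illustration):

```python
import math
import numpy as np

rng = np.random.default_rng(1)
n = 200
x = rng.normal(size=n)
y = 0.5 * x + rng.normal(size=n)  # true slope is 0.5

# OLS fit with an intercept column
A = np.column_stack([np.ones(n), x])
beta, *_ = np.linalg.lstsq(A, y, rcond=None)

# Standard errors from the residual variance
resid = y - A @ beta
sigma2 = resid @ resid / (n - 2)
cov = sigma2 * np.linalg.inv(A.T @ A)
se = np.sqrt(np.diag(cov))

# Two-sided p-values under a normal approximation: erfc(|z|/sqrt(2))
z = beta / se
p = np.array([math.erfc(abs(zi) / math.sqrt(2)) for zi in z])

# 95% confidence interval for the slope
ci = (beta[1] - 1.96 * se[1], beta[1] + 1.96 * se[1])
```

A tiny p-value on the slope says the data would be very surprising if the true slope were zero, and `se` shrinking as `n` grows is exactly the "more data tightens the interval" point above.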