Multivariate Regression and Gradient Descent
In a previous video, we used linear and logistic regression to test the gradient descent algorithm. When I was asked to do a video on logistic regression, I realized the example I used in the gradient descent video was rather complex: not only is the model more complicated than linear regression, but I was also using multiple features. This is akin to fitting a plane to 3-D data rather than a line to 2-D data, as in simple linear regression. In addition, there is a lot of matrix manipulation involved in vectorizing the code. In this video, I want to go over that matrix manipulation using the simple case of linear regression and show that by doing this, we not only get a speed benefit from vectorization, but can also generalize our code to handle multiple parameters with a single gradient function.
Gradient Descent Part 1: https://youtu.be/trvgzYjUr-Y
Gradient Descent Part 2: https://youtu.be/J1ghebX8XGY
Gradient Descent Part 3: https://youtu.be/Twxe59IjHDk
Linear Regression: https://youtu.be/jmKfDvk4k6g
More on Linear Regression: https://youtu.be/UX_b6ZuZLbI
Tip Jar: https://paypal.me/kpmooney
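The vectorized approach described above can be sketched in NumPy. This is a hypothetical illustration (not the exact code from the video): the feature matrix X carries a column of ones for the intercept, so the same gradient function works unchanged whether there is one feature or many.

```python
import numpy as np

def gradient(theta, X, y):
    """Vectorized gradient of the mean-squared-error cost:
    (1/m) * X^T (X theta - y). Works for any number of features."""
    m = len(y)
    return X.T @ (X @ theta - y) / m

def gradient_descent(X, y, lr=0.1, n_iter=5000):
    """Plain batch gradient descent; lr and n_iter are illustrative choices."""
    theta = np.zeros(X.shape[1])
    for _ in range(n_iter):
        theta -= lr * gradient(theta, X, y)
    return theta

# Fit y = 2 + 3x on synthetic data (assumed example, not from the video)
rng = np.random.default_rng(0)
x = rng.uniform(0, 1, 100)
y = 2 + 3 * x + rng.normal(0, 0.01, 100)
X = np.column_stack([np.ones_like(x), x])   # column of ones -> intercept
theta = gradient_descent(X, y)               # approximately [2, 3]
```

Because the gradient is a single matrix expression, adding more features just means adding columns to X; no per-parameter loop is needed.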