MLearning / Algorithms

1. Supervised ML: Classifier, Linear model, Basis expansion, Regularization
2. Matplotlib: Subplots, Pyplot
3. Datasets: Iris species, Diabetes, Breast cancer, Simulated data
4. Numpy: Matrices, Sparse matrices, Vectorize, Average, Standard deviation, Reshape, Multiplication
5. Pandas: Read data, Data cleaning, Find values, Group rows, Merge data
6. Calculus: Derivatives, Integrals
7. Algorithms: K nearest neighbors, Linear regression, Gradient descent
ML Algorithms: Linear Regression

1. Load training data
2. Choose a line for the model
3. Compute the sum of errors
4. Minimize the error
5. Make predictions

Linear-regression Algorithm

Use the line of best fit to make predictions. Minimizing the error relies on differentiation, for example \(
f(x) = ax^2 \enspace \text{then} \enspace f'(x) = 2ax
\)
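The error being minimized is the mean squared error; differentiating it with respect to the slope \(m\) and intercept \(b\) gives the gradients used in the code below (a standard derivation, written out here for clarity):

\(
E(m, b) = \frac{1}{n} \sum_{i=1}^{n} \left( y_i - (m x_i + b) \right)^2
\)

\(
\frac{\partial E}{\partial m} = -\frac{2}{n} \sum_{i=1}^{n} x_i \left( y_i - (m x_i + b) \right)
\qquad
\frac{\partial E}{\partial b} = -\frac{2}{n} \sum_{i=1}^{n} \left( y_i - (m x_i + b) \right)
\)

Gradient descent repeatedly steps \(m\) and \(b\) opposite to these gradients until the error stops shrinking.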

""" Linear Regression Algorithm
Collect data, points with x, y values
x, y = [1, 2, 3], [2, 4, 5]
Choose a line of best fit
y = ax + b
Calculate the error difference
error = y_pred - y
Minimize the error, find the best fit
use optimization algorithms (gradient descent)
Make predictions
use the line of best fit to make predictions
"""
import numpy as np

# Training data
x = np.array([1, 2, 3, 4, 5])
y = np.array([2, 4, 5, 4, 5])

# Initialize slope (m) and intercept (b)
m = 0
b = 0
learning_rate = 0.01
num_iterations = 1000

for i in range(num_iterations):
    y_pred = m * x + b  # current predictions
    error = y_pred - y  # prediction error
    # Gradients of the mean squared error with respect to m and b
    m_derivative = -(2 / len(x)) * sum(x * (y - y_pred))
    b_derivative = -(2 / len(x)) * sum(y - y_pred)
    # Step against the gradient to reduce the error
    m = m - learning_rate * m_derivative
    b = b - learning_rate * b_derivative

m = round(m, 2)
b = round(b, 2)
print(f"y = {m}x + {b}")  # y = 0.62x + 2.14