The problem of overfitting
Machine Learning
[Figure: three Price vs. Size plots for linear regression showing underfitting ("high bias"), a good fit, and overfitting ("high variance"); analogous x2 vs. x1 plots for logistic regression with hθ(x) = g(θ0 + θ1x1 + ...), where g = sigmoid function]
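The three regression fits correspond to hypotheses of increasing polynomial order; a reconstruction of the slide's standard examples:

```latex
\begin{aligned}
h_\theta(x) &= \theta_0 + \theta_1 x && \text{(underfit / high bias)} \\
h_\theta(x) &= \theta_0 + \theta_1 x + \theta_2 x^2 && \text{(good fit)} \\
h_\theta(x) &= \theta_0 + \theta_1 x + \theta_2 x^2 + \theta_3 x^3 + \theta_4 x^4 && \text{(overfit / high variance)}
\end{aligned}
```

Overfitting: with too many features, the learned hypothesis may fit the training set very well (J(θ) ≈ 0) but fail to generalize to new examples.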
Andrew Ng
Example: predicting housing prices from many features: size of house, no. of bedrooms, no. of floors, age of house, average income in neighborhood, kitchen size

[Figure: overfit curve of Price vs. Size]
Addressing overfitting:
Options:
1. Reduce number of features.
   - Manually select which features to keep.
   - Model selection algorithm (later in course).
2. Regularization.
   - Keep all the features, but reduce the magnitude/values of the parameters θj.
   - Works well when we have a lot of features, each of which contributes a bit to predicting y.
Regularization
Cost function
Intuition

[Figure: Price vs. Size of house, a quadratic fit vs. an overfit fourth-order polynomial fit]

Suppose we penalize and make θ3, θ4 really small.
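The penalized objective from this slide can be reconstructed as follows (the large multipliers are the lecture's illustrative choice for a heavy penalty):

```latex
\min_\theta \; \frac{1}{2m} \sum_{i=1}^{m} \left( h_\theta(x^{(i)}) - y^{(i)} \right)^2
+ 1000\,\theta_3^2 + 1000\,\theta_4^2
```

Minimizing this forces θ3 ≈ 0 and θ4 ≈ 0, so the quartic hypothesis collapses toward a quadratic.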
Regularization.
Small values for parameters θ0, θ1, ..., θn:
- "Simpler" hypothesis
- Less prone to overfitting

Housing:
- Features: x1, x2, ..., x100
- Parameters: θ0, θ1, ..., θ100
Regularization.

[Figure: Price vs. Size of house, regularized fit]
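The regularized linear-regression cost function this slide introduces, reconstructed in standard form (by convention θ0 is not penalized):

```latex
J(\theta) = \frac{1}{2m} \left[ \sum_{i=1}^{m} \left( h_\theta(x^{(i)}) - y^{(i)} \right)^2
+ \lambda \sum_{j=1}^{n} \theta_j^2 \right]
```

Here λ is the regularization parameter: it controls the trade-off between fitting the training data well and keeping the parameters small.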
In regularized linear regression, we choose θ to minimize the regularized cost. What if λ is set to an extremely large value? Then θ1, ..., θn are all penalized toward zero and the hypothesis underfits.

[Figure: Price vs. Size of house, nearly flat line hθ(x) ≈ θ0]
Regularization
Regularized linear regression
Gradient descent
Repeat
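The repeated update rule can be reconstructed from the regularized cost function (θ0 is updated without the regularization term):

```latex
\text{Repeat } \{
\begin{aligned}
\theta_0 &:= \theta_0 - \alpha \, \frac{1}{m} \sum_{i=1}^{m} \bigl( h_\theta(x^{(i)}) - y^{(i)} \bigr) x_0^{(i)} \\
\theta_j &:= \theta_j - \alpha \left[ \frac{1}{m} \sum_{i=1}^{m} \bigl( h_\theta(x^{(i)}) - y^{(i)} \bigr) x_j^{(i)}
+ \frac{\lambda}{m}\,\theta_j \right] \quad (j = 1, 2, \dots, n)
\end{aligned}
\}
```

Equivalently, θj := θj(1 − αλ/m) − α(1/m)Σi(hθ(x(i)) − y(i))xj(i): since 1 − αλ/m is slightly less than 1, each iteration shrinks θj a little before taking the usual gradient step.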
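A minimal Python sketch of this update (the one-feature housing data below is hypothetical, not from the lecture):

```python
# Regularized linear regression via batch gradient descent (pure Python).
# theta[0] is the intercept and is not regularized.

def gradient_descent(X, y, lam=1.0, alpha=0.01, iters=5000):
    m = len(y)
    n = len(X[0])
    theta = [0.0] * n
    for _ in range(iters):
        preds = [sum(t * xj for t, xj in zip(theta, x)) for x in X]
        errors = [p - yi for p, yi in zip(preds, y)]
        new_theta = []
        for j in range(n):
            grad = sum(e * x[j] for e, x in zip(errors, X)) / m
            if j > 0:                      # penalize every theta_j except theta_0
                grad += lam / m * theta[j]
            new_theta.append(theta[j] - alpha * grad)
        theta = new_theta
    return theta

# Tiny synthetic data set: x0 = 1 (intercept), x1 = size; y = price = 1 + size.
X = [[1.0, 1.0], [1.0, 2.0], [1.0, 3.0], [1.0, 4.0]]
y = [2.0, 3.0, 4.0, 5.0]

theta = gradient_descent(X, y, lam=0.1)
print(theta)  # close to [1, 1]; regularization shrinks theta[1] slightly
```

With lam=0 the fit recovers the exact line; a positive lam pulls theta[1] slightly below 1 while theta[0] compensates, illustrating the shrinkage effect of the λ/m term.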
Normal equation
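The closed-form solution shown on this slide, reconstructed in standard notation:

```latex
\theta = \left( X^{\mathsf T} X + \lambda
\begin{bmatrix}
0 & & & \\
 & 1 & & \\
 & & \ddots & \\
 & & & 1
\end{bmatrix}
\right)^{-1} X^{\mathsf T} y
```

The (n+1)×(n+1) matrix has a 0 in the top-left entry so that θ0 is not regularized.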
Non-invertibility (optional/advanced).
Suppose m ≤ n, where m = #examples and n = #features. Then XᵀX is non-invertible (singular/degenerate).
If λ > 0, the regularized matrix XᵀX + λ·diag(0, 1, ..., 1) is invertible, so regularization also fixes this numerical problem.
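A small pure-Python illustration (hypothetical numbers, not from the lecture): with duplicated feature columns, XᵀX is singular, but adding λ to the non-intercept diagonal entries makes it invertible:

```python
# X has an intercept column and two identical feature columns,
# so X^T X is rank-deficient (singular).
X = [[1.0, 2.0, 2.0],
     [1.0, 3.0, 3.0],
     [1.0, 5.0, 5.0]]

def matmul_T(X):
    """Compute X^T X for a list-of-rows matrix."""
    n = len(X[0])
    return [[sum(row[i] * row[j] for row in X) for j in range(n)]
            for i in range(n)]

def det3(A):
    """Determinant of a 3x3 matrix by cofactor expansion."""
    return (A[0][0] * (A[1][1] * A[2][2] - A[1][2] * A[2][1])
          - A[0][1] * (A[1][0] * A[2][2] - A[1][2] * A[2][0])
          + A[0][2] * (A[1][0] * A[2][1] - A[1][1] * A[2][0]))

A = matmul_T(X)
print(det3(A))  # 0.0: singular, the plain normal equation would fail

lam = 1.0
for j in range(1, 3):   # regularize every diagonal entry except the intercept's
    A[j][j] += lam
print(det3(A))  # nonzero: the regularized matrix is invertible
```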
Regularization
Regularized logistic regression
[Figure: x2 vs. x1 decision boundary for a high-order polynomial hypothesis that overfits the training data]
Cost function:
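The regularized logistic-regression cost function from this slide, reconstructed in standard form:

```latex
J(\theta) = -\frac{1}{m} \sum_{i=1}^{m} \left[ y^{(i)} \log h_\theta(x^{(i)})
+ \bigl(1 - y^{(i)}\bigr) \log\bigl(1 - h_\theta(x^{(i)})\bigr) \right]
+ \frac{\lambda}{2m} \sum_{j=1}^{n} \theta_j^2
```

As with linear regression, the sum in the penalty starts at j = 1, leaving θ0 unregularized.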
Gradient descent
Repeat
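The repeated update is cosmetically identical to the regularized linear-regression rule, reconstructed here; the difference is that now hθ(x) is the sigmoid hypothesis:

```latex
\text{Repeat } \{
\begin{aligned}
\theta_0 &:= \theta_0 - \alpha \, \frac{1}{m} \sum_{i=1}^{m} \bigl( h_\theta(x^{(i)}) - y^{(i)} \bigr) x_0^{(i)} \\
\theta_j &:= \theta_j - \alpha \left[ \frac{1}{m} \sum_{i=1}^{m} \bigl( h_\theta(x^{(i)}) - y^{(i)} \bigr) x_j^{(i)}
+ \frac{\lambda}{m}\,\theta_j \right] \quad (j = 1, 2, \dots, n)
\end{aligned}
\}
\qquad \text{with } h_\theta(x) = \frac{1}{1 + e^{-\theta^{\mathsf T} x}}
```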
Advanced optimization

function [jVal, gradient] = costFunction(theta)
  jVal = [code to compute J(theta)];
  gradient(1) = [code to compute the partial derivative of J(theta) w.r.t. theta(1)];
  gradient(2) = [code to compute the partial derivative of J(theta) w.r.t. theta(2)];
  ...
  gradient(n+1) = [code to compute the partial derivative of J(theta) w.r.t. theta(n+1)];
end

This costFunction can then be passed to an advanced optimizer such as fminunc.