Linear Regression Error Analysis


As mentioned earlier, there is an error in the solution obtained by the linear regression method.
We assume the input comes from a p-dimensional space (p is the number of features) and is mapped to R (the real numbers).
We can write the linear regression model simply as $$f(X) = \beta_0 + \sum_{j=1}^{p} X_j \beta_j$$
Now the residual sum of squares is given by $$RSS(\beta) = \sum_{i=1}^{N} \big(y_i - f(x_i)\big)^2 = \sum_{i=1}^{N} \Big(y_i - \beta_0 - \sum_{j=1}^{p} x_{ij}\beta_j\Big)^2$$
Here it is also assumed that the input data points are independent of each other.
Now, the aim is to minimize this error. Let X denote the matrix of order N×(p+1), where N is the number of data points, p is the number of features, and one extra column is added for the intercept; let y be the N-vector of outputs.
Hence we can write the residual sum of squares as $$RSS(\beta) = (y - X\beta)^T (y - X\beta)$$
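As a quick sketch (using hypothetical random data), the sum form and the matrix form of RSS can be checked against each other in NumPy; the only bookkeeping step is prepending a column of ones to X for the intercept:

```python
import numpy as np

# Hypothetical small dataset: N = 5 points, p = 2 features.
rng = np.random.default_rng(0)
N, p = 5, 2
X_raw = rng.normal(size=(N, p))
y = rng.normal(size=N)
beta = rng.normal(size=p + 1)  # beta[0] plays the role of the intercept beta_0

# Sum form: RSS(beta) = sum_i (y_i - beta_0 - sum_j x_ij beta_j)^2
rss_sum = sum((y[i] - beta[0] - X_raw[i] @ beta[1:]) ** 2 for i in range(N))

# Matrix form: prepend a column of ones so X has order N x (p + 1)
X = np.hstack([np.ones((N, 1)), X_raw])
r = y - X @ beta
rss_matrix = r @ r  # (y - X beta)^T (y - X beta)

assert np.isclose(rss_sum, rss_matrix)
```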

Differentiating with respect to β, we obtain $$\frac{\partial RSS}{\partial \beta} = -2X^T(y - X\beta), \qquad \frac{\partial^2 RSS}{\partial \beta\,\partial \beta^T} = 2X^TX$$
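The derivative above can be sanity-checked numerically (again with hypothetical random data): the analytic gradient −2Xᵀ(y − Xβ) should agree with a central finite-difference approximation of RSS:

```python
import numpy as np

rng = np.random.default_rng(1)
N, p = 6, 3
X = np.hstack([np.ones((N, 1)), rng.normal(size=(N, p))])
y = rng.normal(size=N)
beta = rng.normal(size=p + 1)

def rss(b):
    r = y - X @ b
    return r @ r

# Analytic gradient from the derivation: -2 X^T (y - X beta)
grad_analytic = -2 * X.T @ (y - X @ beta)

# Central finite differences as an independent check
eps = 1e-6
grad_numeric = np.array([
    (rss(beta + eps * e) - rss(beta - eps * e)) / (2 * eps)
    for e in np.eye(p + 1)
])

assert np.allclose(grad_analytic, grad_numeric, atol=1e-4)
```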

So a unique solution minimizing the error is obtained by setting the first derivative equal to zero (the second derivative 2XᵀX is positive definite when X has full column rank, so this is indeed a minimum). The solution is given by $$\hat{\beta} = (X^TX)^{-1}X^Ty$$
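A minimal sketch of this closed-form solution, assuming synthetic data generated from a known coefficient vector: in practice one solves the normal equations (XᵀX)β = Xᵀy with a linear solver rather than forming the matrix inverse explicitly, and the result can be cross-checked against NumPy's least-squares routine:

```python
import numpy as np

rng = np.random.default_rng(2)
N, p = 50, 3
X = np.hstack([np.ones((N, 1)), rng.normal(size=(N, p))])
true_beta = np.array([1.0, -2.0, 0.5, 3.0])  # hypothetical true coefficients
y = X @ true_beta + 0.01 * rng.normal(size=N)  # outputs with small noise

# Normal equations: solve (X^T X) beta = X^T y instead of inverting X^T X
beta_hat = np.linalg.solve(X.T @ X, X.T @ y)

# Cross-check against NumPy's built-in least-squares solver
beta_lstsq, *_ = np.linalg.lstsq(X, y, rcond=None)

assert np.allclose(beta_hat, beta_lstsq)
assert np.allclose(beta_hat, true_beta, atol=0.1)
```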




