The document discusses two-variable analysis in regression models, focusing on estimating the population regression function using methods like ordinary least squares (OLS) and maximum likelihood (ML). OLS, attributed to Carl Friedrich Gauss, is favored for its simplicity and ability to minimize the sum of squared residuals, providing unbiased estimators for coefficients. The statistical properties of OLS estimators include ease of computation from observable quantities and the provision of point estimates for population parameters.
Ch 3 Two Variable Analysis & Estimation
Chap 03: Two-Variable Analysis: Regression Model
• Our first task is to estimate the population regression function (PRF) on the basis of the sample regression function (SRF) as accurately as possible.
• Two generally used methods of estimation: (1) ordinary least squares (OLS) and (2) maximum likelihood (ML).
• By and large, it is the method of OLS that is used extensively in regression analysis, primarily because it is intuitively appealing and mathematically much simpler than the method of maximum likelihood.

Ordinary Least Squares Method: OLS
• The method of ordinary least squares is attributed to Carl Friedrich Gauss, a German mathematician.
• By definition: a regression technique used to find the best-fitted line that minimizes the sum of squared residuals (errors) between the observed data points and the values predicted by a linear regression model.
• It is widely used in linear regression to estimate the relationship between variables by choosing the parameters that minimize this error, providing unbiased estimators for coefficients such as B1 and B2.
• To understand this method, we first explain the least squares principle.
• Given n pairs of observations on Y and X, we would like to determine the SRF in such a manner that it is as close as possible to the actual Y.
• To this end, we might adopt the following criterion: choose the SRF so that the sum of the residuals ûi = (Yi - Ŷi) is as small as possible. (BUT WE CAN'T: positive and negative residuals cancel each other, so the sum can be near zero even for a badly fitting line.)
• We might instead take the absolute value of each residual. (BUT WE CAN'T, at least not conveniently: sums of absolute values are mathematically awkward to minimize.)
• Then what? Minimize the sum of the squared residuals, Σûi² = Σ(Yi - Ŷi)², which counts every deviation positively and gives greater weight to large errors.

OLS Estimations: Statistical Properties of OLS Estimators
• Under certain assumptions the method of least squares has some very attractive statistical properties that have made it one of the most powerful and popular methods of regression analysis.
• I. The OLS estimators are expressed solely in terms of the observable (i.e., sample) quantities X and Y; therefore, they can be easily computed.
• II. They are point estimators: given the sample, each estimator provides only a single (point) value of the relevant population parameter.
• III. Once the OLS estimates are obtained from the sample data, the sample regression line can be easily obtained.
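The least-squares principle above can be sketched in a few lines of code. This is a minimal illustration, not part of the original slides: the function name `ols_two_variable` and the sample data are invented for the example. It uses the closed-form two-variable OLS formulas (slope b2 = Σ(xi - x̄)(yi - ȳ) / Σ(xi - x̄)², intercept b1 = ȳ - b2·x̄) and shows that the resulting residuals sum to (numerically) zero, which is exactly why minimizing the plain sum of residuals is not a usable criterion.

```python
# Sketch of two-variable OLS for the model Y_i = B1 + B2*X_i + u_i.
# Names and data are illustrative, not from the source text.

def ols_two_variable(x, y):
    """Return (b1, b2): the intercept and slope that minimize the
    sum of squared residuals, sum((y_i - b1 - b2*x_i)**2)."""
    n = len(x)
    x_bar = sum(x) / n
    y_bar = sum(y) / n
    # Slope: sum of cross-deviations over sum of squared x-deviations.
    sxy = sum((xi - x_bar) * (yi - y_bar) for xi, yi in zip(x, y))
    sxx = sum((xi - x_bar) ** 2 for xi in x)
    b2 = sxy / sxx
    # The OLS line always passes through the point of means (x_bar, y_bar).
    b1 = y_bar - b2 * x_bar
    return b1, b2

x = [1, 2, 3, 4, 5]
y = [2.1, 3.9, 6.2, 7.8, 10.1]
b1, b2 = ols_two_variable(x, y)
residuals = [yi - (b1 + b2 * xi) for xi, yi in zip(x, y)]

print("b1 =", b1, "b2 =", b2)
# The residuals cancel: their plain sum is (numerically) zero,
# so only the sum of SQUARED residuals is a meaningful criterion.
print("sum of residuals =", sum(residuals))
```

Note that b1 and b2 here are computed directly from the observable sample quantities X and Y (property I above), and each is a single point value (property II).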