Econometrics for finance (2017-I)
1. False - Homoscedasticity refers to the assumption that the variance of the errors (residuals) is
constant across all levels of the independent variable(s). It does not specifically refer to the
linearity of the random errors for all values of x.
2. True - Plotting values of X against corresponding residuals is indeed a method for determining
the linearity of X-Y data. If the residuals show a pattern or trend, it may indicate non-linearity in
the relationship between X and Y.
3. True - The prediction line in a regression model provides a point estimate of the value of Y for
any given x. It is an estimate of the expected value of Y based on the regression equation.
4. False - A strong linear relationship between X and Y does not necessarily imply causality.
Correlation and causation are distinct concepts. A strong linear relationship simply indicates that
there is a strong association between X and Y, but it does not establish a cause-and-effect
relationship.
5. True - A high coefficient of determination (R-squared) indicates that a larger proportion of the
variation in Y is explained by X. It suggests that the regression model accounts for a significant
portion of the variability in the dependent variable (Y).
6. False - Plotting X against the variability of the residual is not a method for determining the
variance of the difference between observed and predicted values of Y in simple linear
regression. This is typically done by calculating the mean squared error (MSE) or the standard
deviation of the residuals.
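The MSE calculation mentioned above can be sketched briefly. This is a minimal illustration with hypothetical data (the variable names and values are assumed, not from the exam):

```python
import numpy as np

# Hypothetical data roughly following y = 2x; fit a simple linear regression
# and summarize residual spread with the mean squared error (MSE).
x = np.array([1.0, 2.0, 3.0, 4.0, 5.0])
y = np.array([2.0, 4.1, 5.9, 8.2, 9.8])

b1, b0 = np.polyfit(x, y, 1)        # OLS slope and intercept
residuals = y - (b0 + b1 * x)       # observed minus predicted values
mse = np.mean(residuals ** 2)       # mean squared error of the fit
```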
Explanation: The correlation coefficient (r) measures the strength and direction of the linear
relationship between two variables. In this case, a correlation coefficient of 0.7 indicates a
moderate to strong positive linear relationship. The coefficient of determination (R-squared) is
the square of the correlation coefficient, which represents the proportion of variance in the
dependent variable (Y) that is predictable from the independent variable (X). Therefore, $r^2 =
0.7^2 = 0.49$, indicating that 49% of the variations in Y are explained by X.
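The relationship $r^2 = 0.7^2 = 0.49$ can be checked numerically; the sketch below uses made-up sample data (the arrays are assumptions for illustration only):

```python
import numpy as np

# Hypothetical sample with a strong positive linear relationship
x = np.array([1.0, 2.0, 3.0, 4.0, 5.0, 6.0])
y = np.array([2.1, 2.9, 4.2, 4.8, 6.1, 6.9])

r = np.corrcoef(x, y)[0, 1]   # Pearson correlation coefficient
r_squared = r ** 2            # coefficient of determination

# For the exam's value: r = 0.7 implies r^2 = 0.49, i.e. 49% of the
# variation in Y is explained by X.
assert abs(0.7 ** 2 - 0.49) < 1e-12
```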
Explanation: The Dickey-Fuller test is used to test for the presence of a unit root in a time series, which relates to the concept of stationarity. Heteroscedasticity, on the other hand, is the condition in which the variance of the errors (residuals) is not constant across all levels of the independent variable(s). Therefore, the pair "Dickey-Fuller test – Heteroscedasticity" is not correctly matched.
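The idea behind the Dickey-Fuller test can be sketched with a simulation (the AR(1) setup, coefficient, and sample size below are assumptions, not part of the exam): regress the first difference of the series on its lagged level; a slope near zero suggests a unit root, while a clearly negative slope suggests stationarity.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 500
eps = rng.standard_normal(n)

# Stationary AR(1): y_t = 0.5 * y_{t-1} + e_t
y = np.zeros(n)
for t in range(1, n):
    y[t] = 0.5 * y[t - 1] + eps[t]

dy = y[1:] - y[:-1]                  # first differences
lag = y[:-1]                         # lagged levels
slope = np.polyfit(lag, dy, 1)[0]    # OLS slope of dy on y_{t-1}
# For this stationary series the slope should be near 0.5 - 1 = -0.5;
# for a random walk (unit root) it would be near 0.
```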
3.
4. C
5. B
6. B
7. D
8. B
9. A
a) Specification error.
Explanation: Dropping a variable from a model can result in a specification error because the
model may no longer accurately represent the relationship between the remaining variables and
the dependent variable.
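Omitted-variable bias, one form of specification error, can be demonstrated with simulated data (the coefficients, sample size, and seed below are assumptions for illustration): dropping a relevant variable that is correlated with a retained regressor shifts the retained coefficient away from its true value.

```python
import numpy as np

rng = np.random.default_rng(2)
n = 1000
x = rng.standard_normal(n)
z = 0.7 * x + rng.standard_normal(n)          # relevant and correlated with x
y = 1.0 + 2.0 * x + 1.5 * z + rng.standard_normal(n)

# Correct specification: regress y on both x and z
full = np.linalg.lstsq(np.column_stack([np.ones(n), x, z]), y, rcond=None)[0]

# Misspecified model: z is dropped, so its effect loads onto x
short = np.polyfit(x, y, 1)[0]
# Expected bias: 1.5 * cov(x, z) / var(x) = 1.5 * 0.7, so short drifts toward ~3.05
```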
Explanation: Residuals represent the difference between the actual observed values of the
dependent variable (Y) and the predicted values obtained from the regression model.
12. Assuming a linear relationship between X and Y, which of the following is true if the
coefficient of correlation $(r)$ equals 0?
d. There is no correlation.
Explanation: If the coefficient of correlation $(r)$ equals 0, it indicates that there is no linear
correlation between the variables X and Y.
14. In linear regression, the variable being predicted is usually called which of the following?
a. Dependent variable.
Explanation: In linear regression, the variable being predicted is referred to as the dependent
variable.
Explanation: The Y intercept $(b_0)$ represents the predicted value of the dependent variable (Y)
when the independent variable (X) is equal to zero.
- Continuous variables: Variables that can take on an infinite number of values within a given
range (e.g., height, weight).
- Discrete variables: Variables that can only take on specific, distinct values (e.g., number of
children, number of cars).
- Categorical variables: Variables that represent categories or groups (e.g., gender, race, marital
status).
- Covariance: Measures the average product of the deviations of paired scores of two variables
from their respective means.
- Correlation: Measures the strength and direction of the linear relationship between two
variables. It is standardized covariance.
- Coefficient of determination: Also known as $R^2$, it represents the proportion of the total
variation in the dependent variable that is explained by the independent variable in a regression
model.
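The link between these three measures can be illustrated numerically; this is a minimal sketch with hypothetical data (the arrays are assumed), showing that correlation is covariance standardized by the two standard deviations:

```python
import numpy as np

x = np.array([1.0, 2.0, 3.0, 4.0, 5.0])
y = np.array([1.2, 1.9, 3.1, 3.8, 5.2])

cov_xy = np.cov(x, y, ddof=1)[0, 1]                     # sample covariance
r = cov_xy / (np.std(x, ddof=1) * np.std(y, ddof=1))    # standardized covariance
r_squared = r ** 2                                      # coefficient of determination

# Standardizing the covariance recovers the Pearson correlation exactly
assert abs(r - np.corrcoef(x, y)[0, 1]) < 1e-12
```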
When you include an irrelevant variable in a regression, the coefficient estimates for the relevant
variables remain unbiased, but they become less efficient: their standard errors grow, especially
when the irrelevant variable is correlated with the relevant ones (multicollinearity). The irrelevant
variable contributes nothing to predicting the dependent variable while making the estimates less
precise.
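This inefficiency can be shown with simulated data (the data-generating process, coefficients, and seed below are assumptions for illustration): the slope on the relevant variable stays near its true value whether or not the irrelevant regressor is included, but its sampling variability tends to be larger in the bigger model.

```python
import numpy as np

rng = np.random.default_rng(1)
n = 200
x = rng.standard_normal(n)
z = 0.8 * x + 0.6 * rng.standard_normal(n)    # irrelevant, but correlated with x
y = 1.0 + 2.0 * x + rng.standard_normal(n)    # true model excludes z

X_small = np.column_stack([np.ones(n), x])        # correct specification
X_big = np.column_stack([np.ones(n), x, z])       # includes the irrelevant z
b_small = np.linalg.lstsq(X_small, y, rcond=None)[0]
b_big = np.linalg.lstsq(X_big, y, rcond=None)[0]
# Both slope estimates on x should be near the true value 2.0; the big model's
# estimate is merely noisier, not biased.
```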