№ | Condition | free/or 0.5$
m68203 | Explain briefly the meaning of
a. Autocorrelation
b. First-order autocorrelation
c. Spatial correlation |
buy |
m68204 | Explain briefly the meaning of:
a. Categorical variables.
b. Qualitative variables.
c. Analysis-of-variance (ANOVA) models.
d. Analysis-of-covariance (ANCOVA) models.
e. The dummy variable trap.
f. Differential intercept dummies.
g. Differential slope dummies. |
buy |
m68205 | Explain briefly what is meant by
a. Log-log model
b. Log-lin model
c. Lin-log model
d. Elasticity coefficient
e. Elasticity at mean value |
buy |
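As a worked reference for the functional forms named in m68205: in the log-log model the slope coefficient is itself the elasticity,

```latex
\ln Y_i = \beta_1 + \beta_2 \ln X_i + u_i
\quad\Rightarrow\quad
\frac{dY/Y}{dX/X} = \beta_2 ,
```

so the elasticity is constant along the curve. In the log-lin and lin-log models the slope measures, respectively, the relative change in Y per unit change in X and the absolute change in Y per relative change in X, so the elasticity varies and must be evaluated at a chosen point, commonly the mean values of X and Y.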
m68206 | Explain carefully the meaning of
a. Underidentification,
b. Exact identification, and
c. Overidentification. |
buy |
m68207 | Explain carefully the meaning of
a. An unbiased estimator.
b. A minimum variance estimator.
c. A best, or efficient, estimator.
d. A linear estimator.
e. A best linear unbiased estimator (BLUE). |
buy |
m68208 | Explain carefully the meaning of each of the following terms:
a. Population regression function (PRF).
b. Sample regression function (SRF).
c. Stochastic PRF.
d. Linear regression model.
e. Stochastic error term (uᵢ).
f. Residual term (eᵢ).
g. Conditional expectation.
h. Unconditional expectation.
i. Regression coefficients or parameters.
j. Estimators of regression coefficients. |
buy |
m68226 | Explain step by step the procedure involved in
a. Testing the statistical significance of a single multiple regression coefficient.
b. Testing the statistical significance of all partial slope coefficients. |
buy |
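The two procedures asked for in m68226 can be sketched numerically. This is a minimal illustration on simulated data (the dataset, variable names, and coefficient values are all made up for the example), computing the single-coefficient t test and the overall F test by hand:

```python
# Sketch of the two testing procedures in m68226 on simulated data
# (all variable names and true coefficients here are illustrative).
import numpy as np
from scipy import stats

rng = np.random.default_rng(42)
n = 50
x1 = rng.normal(size=n)
x2 = rng.normal(size=n)
y = 1.0 + 2.0 * x1 + 0.5 * x2 + rng.normal(size=n)

# OLS via the normal equations: b = (X'X)^{-1} X'y
X = np.column_stack([np.ones(n), x1, x2])
k = X.shape[1]                       # number of parameters (incl. intercept)
b = np.linalg.solve(X.T @ X, X.T @ y)
resid = y - X @ b
df = n - k
s2 = resid @ resid / df              # estimated error variance

# (a) t test for a single coefficient, H0: beta_j = 0
se = np.sqrt(s2 * np.diag(np.linalg.inv(X.T @ X)))
t_stats = b / se
p_t = 2 * stats.t.sf(np.abs(t_stats), df)

# (b) F test that ALL partial slope coefficients are jointly zero:
# F = [(TSS - RSS)/(k-1)] / [RSS/(n-k)]
tss = np.sum((y - y.mean()) ** 2)
rss = resid @ resid
F = ((tss - rss) / (k - 1)) / (rss / df)
p_F = stats.f.sf(F, k - 1, df)
```

With the strong simulated signal, both the slope t test and the joint F test reject at conventional levels.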
m68234 | Explain the meaning of
a. Degrees of freedom.
b. Sampling distribution of an estimator.
c. Standard error. |
buy |
m68235 | Explain the meaning of
a. Expected value
b. Variance
c. Standard deviation
d. Covariance
e. Correlation
f. Conditional expectation |
buy |
m68236 | Explain the meaning of
a. Least squares.
b. OLS estimators.
c. The variance of an estimator.
d. Standard error of an estimator.
e. Homoscedasticity.
f. Heteroscedasticity.
g. Autocorrelation.
h. Total sum of squares (TSS).
i. Explained sum of squares (ESS).
j. Residual sum of squares (RSS).
k. r².
l. Standard error of the estimate.
m. BLUE.
n. Test of significance.
o. t test.
p. One-tailed test.
q. Two-tailed test.
r. Statistically significant. |
buy |
m68237 | Explain the meaning of the following terms:
a. Dynamic models
b. Distributed lag models
c. Autoregressive models |
buy |
m68243 | Explain whether the following statements are true or false. Give reasons.
a. Although the expected value of an r.v. can be positive or negative, its variance is always positive.
b. The coefficient of correlation will have the same sign as that of the covariance between the two variables.
c. The conditional and unconditional expectations of an r.v. mean the same thing.
d. If two variables are independent, their correlation coefficient will always be zero.
e. If the correlation coefficient between two variables is zero, it means that the two variables are independent.
f. E(1 / X) = 1 / E(X)
g. E[(X - µX)²] = [E(X - µX)]² |
buy |
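Several of the true/false claims in m68243 are easy to spot-check numerically. A small sketch with an arbitrary illustrative sample:

```python
# Numerical spot-checks for some of the claims in m68243,
# using a small illustrative sample (values chosen arbitrarily).
import numpy as np

x = np.array([1.0, 2.0, 3.0, 4.0])
y = np.array([2.0, 1.0, 5.0, 4.0])

# (a) the variance of a non-degenerate r.v. is positive
var_x = np.var(x)

# (b) correlation has the same sign as covariance
# (corr = cov / (sd_x * sd_y), and standard deviations are positive)
cov = np.cov(x, y)[0, 1]
corr = np.corrcoef(x, y)[0, 1]

# (f) E(1/X) is generally NOT 1/E(X) (Jensen's inequality):
# here E(1/X) ≈ 0.521 while 1/E(X) = 0.4
e_inv = np.mean(1 / x)
inv_e = 1 / np.mean(x)
```

The check for (f) makes the asymmetry concrete: for this sample E(1/X) exceeds 1/E(X), as Jensen's inequality predicts for the convex function 1/x.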
m68270 | Express the following in Σ notation:
a. x1 + x2 + x3 + x4 + x5
b. x1 + 2x2 + 3x3 + 4x4 + 5x5
c. (x1² + y1²) + (x2² + y2²) + ... + (xk² + yk²) |
buy |
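For orientation, the pattern these Σ-notation exercises follow can be written compactly; for example, items (a) and (b) read

```latex
x_1 + x_2 + x_3 + x_4 + x_5 = \sum_{i=1}^{5} x_i ,
\qquad
x_1 + 2x_2 + 3x_3 + 4x_4 + 5x_5 = \sum_{i=1}^{5} i\,x_i .
```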
m68302 | Figure 4-1 gives you the normal probability plot for Example 4.4.
a. From this figure, can you tell if the error term in Eq. (4.62) follows the normal distribution? Why or why not?
b. Is the observed Anderson-Darling A² value of 0.468 statistically significant? If it is, what does that mean? If it is not, what conclusion do you draw?
c. From the given data, can you identify the mean and variance of the error term?
[Figure 4-1: Normal probability plot for Example 4.4; AD = Anderson-Darling statistic] |
buy |
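The Anderson-Darling normality test referenced in m68302 can be run in scipy. The residuals of Eq. (4.62) are not reproduced in this listing, so the sketch below uses simulated stand-in data; the decision rule (compare the statistic against the tabulated critical value) is the point:

```python
# Anderson-Darling normality test, as in m68302; the data here are
# simulated stand-ins for the Example 4.4 residuals, which are not
# reproduced in this listing.
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
resid = rng.normal(loc=0.0, scale=1.0, size=200)   # placeholder residuals

res = stats.anderson(resid, dist='norm')
# res.statistic is the AD statistic; compare it with the critical value
# at the chosen significance level (e.g., 5%): if the statistic is below
# the critical value, normality is NOT rejected.
crit_5pct = dict(zip(res.significance_level, res.critical_values))[5.0]
reject = res.statistic > crit_5pct
```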
m68304 | Fill in the blanks in Table 5-12 (Functional Forms of Regression Models). |
buy |
m68305 | Fill in the gaps in the manner of (a) below.
a. The expected value or mean is a measure of central tendency.
b. The variance is a measure of...
c. The covariance is a measure of...
d. The correlation is a measure of... |
buy |
m68446 | Find the critical t values in the following cases:
a. n = 4, α = 0.05 (two-tailed test)
b. n = 4, α = 0.05 (one-tailed test)
c. n = 14, α = 0.01 (two-tailed test)
d. n = 14, α = 0.01 (one-tailed test)
e. n = 60, α = 0.05 (two-tailed test)
f. n = 200, α = 0.05 (two-tailed test) |
buy |
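The critical t values asked for in m68446 can be read from scipy instead of a t table. One caveat: the sketch below treats the stated n as the degrees of freedom, which is an assumption about the exercise's convention rather than something stated in this listing:

```python
# Critical t values for m68446 via scipy (the stated n is treated as
# the degrees of freedom -- an assumption about the exercise's convention).
from scipy import stats

def critical_t(df, alpha, two_tailed=True):
    """Return the positive critical t value for the given tail setup."""
    tail = alpha / 2 if two_tailed else alpha
    return stats.t.ppf(1 - tail, df)

t_two = critical_t(4, 0.05, two_tailed=True)    # case (a): t(0.025, 4)
t_one = critical_t(4, 0.05, two_tailed=False)   # case (b): t(0.05, 4)
```

These match the familiar table values t(0.025, 4) ≈ 2.776 and t(0.05, 4) ≈ 2.132.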
m68456 | Find the critical Z values in the following cases:
a. α = 0.05 (two-tailed test)
b. α = 0.05 (one-tailed test)
c. α = 0.01 (two-tailed test)
d. α = 0.02 (one-tailed test) |
buy |
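The critical Z values in m68456 come straight from the standard normal quantile function; a minimal sketch:

```python
# Critical Z values for m68456 via scipy, replacing the normal table.
from scipy import stats

z_a = stats.norm.ppf(1 - 0.05 / 2)   # (a) alpha = 0.05, two-tailed
z_b = stats.norm.ppf(1 - 0.05)       # (b) alpha = 0.05, one-tailed
z_c = stats.norm.ppf(1 - 0.01 / 2)   # (c) alpha = 0.01, two-tailed
z_d = stats.norm.ppf(1 - 0.02)       # (d) alpha = 0.02, one-tailed
```

These reproduce the table values 1.96, 1.645, 2.576, and 2.054.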
m68483 | Find the expected value of the following PDF:
f(X) = X²/9, 0 ≤ X ≤ 3 |
buy |
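The expected value in m68483 works out analytically to E(X) = ∫₀³ X · (X²/9) dX = [X⁴/36]₀³ = 81/36 = 2.25, which a quick numerical integration confirms:

```python
# Numerical check of m68483: E(X) = integral of x * f(x) over [0, 3]
# for the density f(x) = x**2 / 9.
from scipy.integrate import quad

f = lambda x: x**2 / 9
ex, _ = quad(lambda x: x * f(x), 0, 3)   # expected value
total, _ = quad(f, 0, 3)                 # sanity check: density integrates to 1
```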
m68637 | For a sufficiently large d.f., the chi-square distribution can be approximated by the standard normal distribution as Z = √(2χ²) − √(2k − 1) ~ N(0, 1). Let k = 50.
a. Use the chi-square table to find out the probability that a chi-square value will exceed 80.
b. Determine this probability by using the preceding normal approximation.
c. Assume that the d.f. are now 100. Compute the probability from the chi-square table as well as from the given normal approximation. What conclusions can you draw from using the normal approximation to the chi-square distribution? |
buy |
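Parts (a) and (b) of m68637 can be checked side by side in scipy: the exact chi-square tail probability versus the normal approximation Z = √(2χ²) − √(2k − 1). A minimal sketch for k = 50 and a cutoff of 80:

```python
# Exact vs. approximate tail probability for m68637: P(chi2_50 > 80),
# first from the chi-square distribution itself, then via the normal
# approximation Z = sqrt(2*chi2) - sqrt(2k - 1).
import math
from scipy import stats

k, x = 50, 80.0
p_exact = stats.chi2.sf(x, df=k)                 # exact tail probability
z = math.sqrt(2 * x) - math.sqrt(2 * k - 1)      # approximating Z statistic
p_approx = stats.norm.sf(z)                      # normal-approximation tail
```

Both probabilities are well under 1% and close to each other, which is the conclusion part (c) is driving at: the approximation improves as the d.f. grow.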