(Limited Information Maximum Likelihood Estimation) Consider a bivariate distribution for x and y that is a function of two parameters, α and β. The joint density is f(x, y | α, β). We consider maximum likelihood estimation of the two parameters. The full information maximum likelihood estimator is the now familiar maximum likelihood estimator of the two parameters. Now, suppose that we can factor the joint distribution as done in Exercise 3, but in this case, f(x, y | α, β) = f(y | x, α, β) f(x | α). That is, the conditional density for y is a function of both parameters, while the marginal distribution for x involves only α.
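For concreteness, one pair of densities with exactly this structure (this particular model is an illustration, not part of the exercise) is an exponential marginal for x together with a conditional normal for y:

```latex
f(x \mid \alpha) = \alpha e^{-\alpha x}, \qquad x > 0,\ \alpha > 0,
\qquad
f(y \mid x, \alpha, \beta) = \frac{1}{\sqrt{2\pi}}\,
  \exp\!\left[-\tfrac{1}{2}\,(y - \alpha - \beta x)^{2}\right].
```

Here the marginal depends on α only, while the conditional depends on both parameters, so the joint density f(x, y | α, β) = f(y | x, α, β) f(x | α) has the form described above.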
a. Write down the general form of the log-likelihood function based on the joint density.
b. Because the joint density equals the product of the conditional density and the marginal density, the log-likelihood function can be written equivalently in terms of the factored density. Write this down, in general terms.
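In general terms (no particular densities assumed), the two equivalent forms asked for in parts a and b take the shape:

```latex
\begin{align*}
\ln L(\alpha,\beta)
  &= \sum_{i=1}^{n} \ln f(x_i, y_i \mid \alpha, \beta)
     && \text{(part a: joint density)} \\
  &= \sum_{i=1}^{n} \ln f(y_i \mid x_i, \alpha, \beta)
   + \sum_{i=1}^{n} \ln f(x_i \mid \alpha)
     && \text{(part b: factored form)}
\end{align*}
```

The second line follows from taking logs of the factorization f(x, y | α, β) = f(y | x, α, β) f(x | α) and summing over the sample.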
c. The parameter α can be estimated by itself using only the data on x and the log-likelihood formed from the marginal density of x. It can also be estimated jointly with β by using the full log-likelihood function and data on both y and x. Show this.
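Schematically, the two estimators of α in part c solve different likelihood equations:

```latex
\begin{align*}
\text{LIML: } & \hat\alpha_{L} \ \text{solves}\
  \sum_{i=1}^{n} \frac{\partial \ln f(x_i \mid \alpha)}{\partial \alpha} = 0, \\[4pt]
\text{FIML: } & (\hat\alpha_{F}, \hat\beta_{F}) \ \text{solve}\
  \sum_{i=1}^{n} \frac{\partial}{\partial \alpha}
    \Bigl[\ln f(y_i \mid x_i, \alpha, \beta) + \ln f(x_i \mid \alpha)\Bigr] = 0
  \quad \text{and} \quad
  \sum_{i=1}^{n} \frac{\partial \ln f(y_i \mid x_i, \alpha, \beta)}{\partial \beta} = 0.
\end{align*}
```

The LIML estimator discards the information about α contained in the conditional density of y, which is the source of the variance difference examined in part d.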
d. Show that the first estimator in Part c has a larger asymptotic variance than the second one. This is the difference between a limited information maximum likelihood estimator and a full information maximum likelihood estimator.
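A small Monte Carlo sketch can illustrate the variance ranking in part d. The model below (x exponential with rate α, y | x normal with mean α + βx and unit variance) is an assumed illustration, not part of the exercise; under it the LIML estimator of α uses only the marginal of x (its MLE is 1/x̄), while the FIML estimator maximizes the full log-likelihood numerically.

```python
# Monte Carlo comparison of LIML vs. FIML estimators of alpha.
# Assumed illustrative model (not specified by the exercise):
#   x ~ Exponential(rate = alpha)        -> marginal depends on alpha only
#   y | x ~ Normal(alpha + beta * x, 1)  -> conditional depends on both
import numpy as np
from scipy.optimize import minimize

ALPHA, BETA = 1.0, 0.5   # true parameter values (assumed)
N, REPS = 200, 500       # sample size and Monte Carlo replications

def neg_full_loglik(params, x, y):
    """Negative full log-likelihood; alpha parametrized as exp(log_a) > 0."""
    log_a, beta = params
    alpha = np.exp(log_a)
    marginal = len(x) * log_a - alpha * x.sum()                 # sum ln f(x_i | alpha)
    conditional = -0.5 * ((y - alpha - beta * x) ** 2).sum()    # sum ln f(y_i | x_i, .) + const
    return -(marginal + conditional)

rng = np.random.default_rng(0)
liml, fiml = [], []
for _ in range(REPS):
    x = rng.exponential(scale=1.0 / ALPHA, size=N)
    y = ALPHA + BETA * x + rng.standard_normal(N)
    liml.append(1.0 / x.mean())  # MLE of the exponential rate from x alone
    res = minimize(neg_full_loglik, x0=[0.0, 0.0], args=(x, y), method="BFGS")
    fiml.append(np.exp(res.x[0]))

var_liml, var_fiml = np.var(liml), np.var(fiml)
print(f"Var(LIML alpha-hat) = {var_liml:.5f}")
print(f"Var(FIML alpha-hat) = {var_fiml:.5f}")
```

With these settings the sample variance of the FIML estimator of α should come out noticeably below that of the LIML estimator, consistent with part d: in this assumed model the LIML asymptotic variance is roughly α²/n, while the full log-likelihood adds information about α from the conditional density and so yields a strictly smaller asymptotic variance.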