1. Ordered Model
\[
y_i^* = x_i'\beta + \varepsilon_i, \qquad i = 1, \ldots, N \tag{1}
\]
in which the latent variable $y_i^*$, ranging from $-\infty$ to $+\infty$, is mapped to an observed variable $y_i$. The variable $y_i$ is thought of as incomplete information about the underlying $y_i^*$ according to the measurement model:
\[
\begin{aligned}
y_i &= 1 && \text{if } \mu_0 = -\infty \le y_i^* < \mu_1 \\
    &= 2 && \text{if } \mu_1 \le y_i^* < \mu_2 \\
    &\;\;\vdots \\
    &= J && \text{if } \mu_{J-1} \le y_i^* < \mu_J = +\infty
\end{aligned} \tag{2}
\]
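As a quick illustration of (1) and (2), the sketch below simulates the latent regression and maps $y_i^*$ into observed categories through a set of thresholds. The standard normal error, the coefficient values, and the thresholds are illustrative assumptions, not values taken from the text.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative setup (assumed): K = 3 regressors, J = 4 outcome categories.
N = 1000
beta = np.array([0.5, -0.3, 0.8])
mu = np.array([-np.inf, -1.0, 0.0, 1.0, np.inf])   # mu_0 = -inf, ..., mu_J = +inf

X = rng.normal(size=(N, 3))
eps = rng.normal(size=N)                           # assumed standard normal disturbances
y_star = X @ beta + eps                            # latent regression (1)

# Measurement model (2): y_i = m when mu_{m-1} <= y_i^* < mu_m
y = np.digitize(y_star, mu[1:-1]) + 1              # observed categories coded 1, ..., J
```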
This can also be expressed in the following way:
\[
\Pr[y_i = m \mid x_i] = \Pr[\varepsilon_i \le \mu_m - x_i'\beta] - \Pr[\varepsilon_i \le \mu_{m-1} - x_i'\beta], \qquad m = 1, \ldots, J \tag{4}
\]
With a full set of normalizations imposed, the probabilities that enter the likelihood function for estimation of the model parameters are given by:
\[
\Pr[y_i = m \mid x_i] = F(\mu_m - x_i'\beta) - F(\mu_{m-1} - x_i'\beta), \qquad m = 1, \ldots, J \tag{5}
\]
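If $F$ is specialized to the standard normal CDF (an ordered probit; the text leaves $F$ generic, so this is an assumption), the cell probabilities in (5) can be computed for all observations at once. The function name and arguments below are illustrative.

```python
import numpy as np
from scipy.stats import norm

def cell_probabilities(X, beta, mu):
    """Pr[y = m | x] = F(mu_m - x'beta) - F(mu_{m-1} - x'beta), m = 1, ..., J.

    mu is the full threshold vector (mu_0 = -inf, ..., mu_J = +inf);
    F is taken to be the standard normal CDF (probit assumption).
    """
    xb = X @ beta                                  # x_i'beta for every observation
    cdf = norm.cdf(mu[None, :] - xb[:, None])      # N x (J+1) values of F(mu_j - x_i'beta)
    return cdf[:, 1:] - cdf[:, :-1]                # N x J matrix; column m-1 is Pr[y_i = m]
```

Each row of the returned matrix sums to one, which is a convenient sanity check.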
\[
\frac{\partial \Pr[y_i = m \mid x_i]}{\partial x_{ik}} = \frac{\partial F(\mu_m - x_i'\beta)}{\partial x_{ik}} - \frac{\partial F(\mu_{m-1} - x_i'\beta)}{\partial x_{ik}} \tag{6}
\]
\[
= \beta_k \left[ f(\mu_{m-1} - x_i'\beta) - f(\mu_m - x_i'\beta) \right] \tag{7}
\]
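Continuing with the probit assumption, (7) gives the marginal effect of the $k$-th regressor on the probability of outcome $m$ as $\beta_k\,[f(\mu_{m-1}-x_i'\beta)-f(\mu_m - x_i'\beta)]$. A minimal sketch at a single observation (names again illustrative):

```python
import numpy as np
from scipy.stats import norm

def marginal_effects(x, beta, mu):
    """Marginal effects in (7) at one observation x, probit assumption.

    Returns a J x K matrix whose (m-1, k) entry is
    d Pr[y = m | x] / d x_k = beta_k * [f(mu_{m-1} - x'beta) - f(mu_m - x'beta)].
    """
    xb = x @ beta
    pdf = norm.pdf(mu - xb)                        # f(mu_j - x'beta), with f(+-inf) = 0
    return np.outer(pdf[:-1] - pdf[1:], beta)      # outer product over outcomes and regressors
```

Because the outcome probabilities sum to one, each column of this matrix sums to zero across outcomes, another quick check on an implementation.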
2. Random parameters and heterogeneity in the ordered choice model
\[
\Pr[y_i = m \mid x_i] = F(\mu_m - x_i'\beta) - F(\mu_{m-1} - x_i'\beta) > 0, \qquad m = 1, \ldots, J \tag{8}
\]
\[
\beta_i = \beta + W v_i \tag{9}
\]
\[
\Pr[y_i = m \mid x_i, v_i] = F(\mu_m - x_i'\beta_i) - F(\mu_{m-1} - x_i'\beta_i) \tag{11}
\]
where $\beta_i$ varies with the unobserved terms $v_i$. However, the probability in (11) contains the unobserved random terms $v_i$. The terms that enter the log-likelihood function for estimation purposes must be unconditional on the unobservables. They are therefore integrated out, which gives the next equation:
\[
\Pr[y_i = m \mid x_i] = \int_{v_i} \left[ F(\mu_m - x_i'\beta_i) - F(\mu_{m-1} - x_i'\beta_i) \right] f(v_i)\, dv_i \tag{12}
\]
\[
\log L_s(\beta, \mu, W) = \sum_{i=1}^{N} \log \left\{ \frac{1}{R} \sum_{r=1}^{R} \left[ F\!\big(\mu_m - x_i'(\beta + W v_{ir})\big) - F\!\big(\mu_{m-1} - x_i'(\beta + W v_{ir})\big) \right] \right\} \tag{13}
\]
where $v_{ir}$ is one of $R$ multivariate random draws used for the simulation. The estimated parameters must be transformed to yield estimates of the marginal changes, that is, to determine how a marginal change in one regressor changes the distribution of all the outcome probabilities.
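A maximum simulated likelihood sketch of (13) is given below, assuming a probit link, standard normal draws $v_{ir}$, and a lower-triangular $W$; these choices, and all names, are illustrative rather than fixed by the text.

```python
import numpy as np
from scipy.stats import norm

def simulated_loglik(beta, mu, W, X, y, draws):
    """Simulated log-likelihood (13) with beta_ir = beta + W v_ir (probit assumption).

    draws has shape (N, R, K): R multivariate standard normal draws per observation.
    mu is the full threshold vector (mu_0 = -inf, ..., mu_J = +inf); y is coded 1, ..., J.
    """
    beta_ir = beta[None, None, :] + draws @ W.T              # (N, R, K) simulated coefficients
    xb = np.einsum("nk,nrk->nr", X, beta_ir)                 # x_i'(beta + W v_ir), shape (N, R)
    p_ir = norm.cdf(mu[y][:, None] - xb) \
         - norm.cdf(mu[y - 1][:, None] - xb)                 # simulated cell probabilities
    p_i = p_ir.mean(axis=1)                                  # (1/R) * sum over the R draws
    return np.sum(np.log(np.clip(p_i, 1e-300, None)))        # sum of log simulated probabilities
```

In practice the negative of this function would be handed to a numerical optimizer with the same draws reused at every iteration; Halton sequences are a common substitute for pseudo-random draws.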
Taking the partial derivative of (12) with respect to $x_i$ yields the marginal effect,
\[
\frac{\partial \Pr[y_i = m \mid x_i]}{\partial x_{ik}} = \int_{v_i} \left[ \frac{\partial F(\mu_m - x_i'\beta_i)}{\partial x_{ik}} - \frac{\partial F(\mu_{m-1} - x_i'\beta_i)}{\partial x_{ik}} \right] f(v_i)\, dv_i
\]
\[
= \int_{v_i} \left[ f(\mu_{m-1} - x_i'\beta_i) - f(\mu_m - x_i'\beta_i) \right] \beta_{ik}\, f(v_i)\, dv_i \tag{14}
\]
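The integral in (14) can be approximated with the same simulation device as (13): evaluate the analytic marginal effect at each simulated coefficient vector and average over the draws. A sketch under the same probit and normal-draws assumptions:

```python
import numpy as np
from scipy.stats import norm

def simulated_marginal_effects(x, beta, W, mu, draws_i):
    """Simulated version of (14) at one observation x (probit assumption).

    draws_i has shape (R, K). Returns a J x K matrix of marginal effects
    averaged over the simulated coefficient vectors beta_ir = beta + W v_ir.
    """
    beta_ir = beta[None, :] + draws_i @ W.T        # (R, K) simulated coefficients
    xb = beta_ir @ x                               # x'beta_ir for each draw, shape (R,)
    pdf = norm.pdf(mu[None, :] - xb[:, None])      # (R, J+1) densities f(mu_j - x'beta_ir)
    # beta_{ik} * [f(mu_{m-1} - x'beta_i) - f(mu_m - x'beta_i)], averaged over draws
    effects = (pdf[:, :-1] - pdf[:, 1:])[:, :, None] * beta_ir[:, None, :]
    return effects.mean(axis=0)                    # J x K matrix of simulated marginal effects
```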