
EXCEL Template for Computing Starting or Fixed Values
for Latent Variable (LV) Interactions and Quadratics
Using a Single Indicator Interaction Specification

The APA citation for this paper is Ping, R.A. (2017). "EXCEL template for computing starting or fixed values for
latent variable (LV) interactions and quadratics using a single indicator interaction specification." [on-line paper].
http://www.wright.edu/~robert.ping/jmr(1).doc.

(Note: a previous version of this paper is available at www.wright.edu/~robert.ping/jmr.doc.)

The EXCEL spreadsheet is intended to assist in the specification of the single
interaction/quadratic indicators, x:z, x:x and z:z (i.e., the Ping 1995, 1996 single indicators). It uses
measurement model parameter estimates for the loadings, the measurement error variances, the
variances of the latent variables X and Z, and the correlation between X and Z,
along with the SAS, SPSS, etc. covariances among X, Z, XZ, XX and ZZ. The spreadsheet
assumes that X and Z are unidimensional, preferably consistent (i.e., their measurement models
fit the data), with mean-centered indicators, and it assumes that there are no correlated
measurement errors involving X or Z.
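As a sketch of the calculations involved: under mean-centered indicators and multivariate normality, the Ping (1995) results give the single indicator x:z = X*Z the loading L_X*L_Z and the error variance L_X^2*Var(X)*T_Z + L_Z^2*Var(Z)*T_X + T_X*T_Z, where L_X and T_X are the loading and error variance of the averaged composite X. The following minimal Python illustration follows my reading of those formulas; the function names are mine, not the spreadsheet's, and the spreadsheet's own cell formulas may differ in detail.

```python
# Sketch of the single-indicator calculations, assuming averaged,
# mean-centered indicators and uncorrelated measurement errors.
# Formulas follow my reading of Ping (1995).

def composite(loadings, errors):
    """Loading and error variance of the averaged composite
    X = (x1 + ... + xn)/n, given the items' measurement model
    loadings and error variances."""
    n = len(loadings)
    return sum(loadings) / n, sum(errors) / n ** 2

def interaction_indicator(lam_x, th_x, var_x, lam_z, th_z, var_z):
    """Fixed loading and error variance of x:z = X*Z."""
    loading = lam_x * lam_z
    theta = (lam_x ** 2 * var_x * th_z
             + lam_z ** 2 * var_z * th_x
             + th_x * th_z)
    return loading, theta

def quadratic_indicator(lam_x, th_x, var_x):
    """Fixed loading and error variance of x:x = X*X."""
    loading = lam_x ** 2
    theta = 4 * lam_x ** 2 * var_x * th_x + 2 * th_x ** 2
    return loading, theta
```

For example, with measurement model estimates for X's items (say, loadings 0.8, 0.9, 1.0 and error variances 0.3, 0.2, 0.4), `composite` returns roughly L_X = 0.9 and T_X = 0.1, which then feed the two indicator functions.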
To use this spreadsheet, plan to estimate a measurement model containing at least X and
Z (a larger or full measurement model can be used as long as all of its latent
variables are unidimensional). If starting values for all the phi's (covariances) associated with
XX, XZ and ZZ are desired, also obtain SAS, SPSS, etc. estimates of the covariance matrix for
X, Z, XX, XZ and ZZ.
The spreadsheet may be a bit confusing or vague about 1) “averaging,” 2) the “phi’s”
and the “Error Attenuated Cov’s” on the spreadsheet, and 3) the “two steps.” Regarding 1), I
prefer “averaging” the XX, XZ and ZZ loadings even though Ping 1995 and 1996 describe
unaveraged results; the spreadsheet assumes averaging because otherwise the model covariance
matrix can produce determinants that are “too large” for some computers.
Regarding 2), the “Error Attenuated Cov's” on the spreadsheet are optional unless “phi”
starting values for XX, XZ or ZZ are desired. (I usually use phi starting values: they are
formally correct for multivariate normal X and Z, and they seem to speed up estimation. Note that
the averaged composites X = (x1+...+xn)/n and Z = (z1+...+zm)/m are used to create the “Error
Attenuated Cov’s.”)
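The averaging just described amounts to a few raw-data steps. The following is a hypothetical Python stand-in for the SAS/SPSS step (variable and function names are illustrative, not part of the spreadsheet):

```python
# Illustrative stand-in for the SAS/SPSS step: form averaged,
# mean-centered composites X and Z from raw item scores, build the
# products XX, XZ and ZZ, and compute the raw ("error attenuated")
# covariances the spreadsheet asks for.

def mean_center(values):
    m = sum(values) / len(values)
    return [v - m for v in values]

def covariance(a, b):
    """Sample covariance (n - 1 denominator), as SAS/SPSS report it."""
    n = len(a)
    ma, mb = sum(a) / n, sum(b) / n
    return sum((x - ma) * (y - mb) for x, y in zip(a, b)) / (n - 1)

def composites(x_items, z_items):
    """x_items and z_items are per-case lists of item scores."""
    X = mean_center([sum(row) / len(row) for row in x_items])
    Z = mean_center([sum(row) / len(row) for row in z_items])
    XZ = [x * z for x, z in zip(X, Z)]
    XX = [x * x for x in X]
    ZZ = [z * z for z in Z]
    return X, Z, XX, XZ, ZZ
```

The `covariance` of each pair of these columns then fills the “Error Attenuated Cov's” matrix.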
Regarding 3), Ping 1996 suggested estimating XX, XZ, etc. in “two steps.” In Step 1, the
structural model is estimated using the spreadsheet’s (fixed) loadings, error variances, etc. for
XX, XZ, etc., computed from the measurement model for X and Z. In Step 2, a second structural
model is estimated with fixed loadings, measurement error variances, etc. for XX, XZ
recomputed (again using the spreadsheet) from the Step 1 structural model estimates. The
loadings, error variances, etc. for XX, XZ, etc. will differ between the two structural
models (due to non-normality), and the second structural model’s results should be reported, unless
all the model’s t-values are practically unchanged between the two structural model estimations
(e.g., different in the second decimal place or less). (Experience suggests a third estimation
is rarely if ever required, unless there are errors in the XX, XZ, etc. specifications. If more
than two structural models are necessary to obtain “convergence” in the model estimates, please
email me for suggestions.)
The bold entries and the italicized entries on the spreadsheet should be deleted (to avoid
contamination with the example data). The result should be error messages or zeroes in most of
the non-blank areas of the spreadsheet; these should correct themselves once new data is entered.
(Note that much of the spreadsheet is “protected” to ensure the calculations, etc. cannot be erased.)
Next, the measurement model loadings, measurement error variances, and variances for
X and Z should be entered (or copied and pasted) into the appropriate locations on the
spreadsheet (i.e., loadings go in the "lambda" lines, measurement error variances go in the
"theta" lines, and measurement model variances/covariances for X and Z go in the "phi" matrix).
(Note there are locations for 10 loadings and measurement errors; the example has fewer than 10
loadings and measurement errors, with blanks, because its measures had 5 and 4 items. These
entries will all appear on the EXCEL spreadsheet in bold font, as in the example; again, the
unbolded cells are unrelated to entering data.) At this point the loadings (lambda) and
measurement error variances (theta) for XX, XZ and ZZ will be available near the bottom of the
spreadsheet, along with reliabilities and AVE’s for X, Z, XX, XZ, and ZZ.
If starting “phi” values for X, Z, XX, XZ, and ZZ are desired, also enter the SAS, SPSS,
etc. covariances for X, Z, XX, XZ, and ZZ in the "Error Attenuated Cov's" matrix (again, don’t
forget to average: X = (x1+...+xn)/n and Z = (z1+...+zm)/m). These entries will all appear on the
EXCEL spreadsheet in italicized font, as they did in the example. Once this is
accomplished, the rest of the covariances in the "phi" matrix should be nonzero.
Obviously, deleting old data is important when using this spreadsheet, and it is probably a
good idea to always delete the "Error Attenuated Cov's" entries, even if starting values for the
balance of the "phi's" are not desired, to avoid accidentally using incorrect phi’s later.
Once the spreadsheet has been downloaded, it can be saved for repeated later use (i.e., without
going back on line). Thus, it is possible to save a “master” copy of the on-line version of this
EXCEL spreadsheet locally for modification, subsequent calculations, saving modified copies,
etc. For example, I use the spreadsheet to calculate X and Z reliabilities and AVE’s even for
models with no XZ, etc. (I keep track of which model variables X and Z represent in the saved file
name--e.g., “XisSAT_YisALt.xls,” etc.)

At the risk of overdoing it, I have a few more words about Latent Variable (LV)
Interaction and Quadratic validity (a disinterested reader could skip to the bottom of the text).
Authors in the Social Sciences disagree on what constitutes an adequate demonstration of
validity. Nevertheless, a minimal demonstration of the validity of any LV should probably
include the content or face validity of its indicators (how well they tap into the conceptual
definition of the construct), the LV's construct validity, and its convergent and
discriminant validity (e.g., Bollen, 1989; DeVellis, 1991; Nunnally, 1993). The "validity" of the
LV would then be qualitatively assessed considering its reliability and its performance over this
minimal set of validity criteria.

Construct validity is concerned in part with an LV's correspondence or correlation with
other LV's: the other LV's in the study should be valid and reliable, and their correlations with
the target LV (e.g., significance, direction and magnitude) should be theoretically sound.
Convergent and discriminant validity are Campbell and Fiske's (1959) proposals involving the
measurement of multiple constructs with multiple methods, and they are frequently considered to
be additional facets of construct validity. Convergent measures are highly correspondent (e.g.,
correlated) across different methods; discriminant measures are comparatively less correspondent
with measures of other constructs. However, convergent and discriminant validity are frequently
not assessed in substantive articles as Campbell and Fiske (1959) intended (i.e., using multiple
traits and multiple methods). Perhaps because constructs are frequently measured with a single
method (i.e., the study at hand), reliability is frequently substituted for convergent validity, and
LV correlational distinctness (e.g., the target LV's correlations with other measures are less than
about 0.7) is substituted for discriminant validity.

However, LV reliability is a measure of the correspondence between the items and their
LV (i.e., the correlation between an LV and its items), and "correlations less than 0.7" ignores
measurement error. Fornell and Larcker (1981) suggested that adequately convergent LV's should
have measures that contain more than 50% explained or common variance in the factor-analytic
sense (less than 50% error variance; also see Dillon and Goldstein 1984), and they proposed a
statistic they termed Average Variance Extracted (AVE) as a measure of convergent validity. AVE
is a measure of the shared or common variance in an LV: the amount of variance that is captured
by the LV in relation to the amount of variance due to its measurement error (Dillon and
Goldstein 1984). In different terms, AVE is a measure of the error-free variance of a set of items.
(AVE and its computation are discussed in detail elsewhere on this web site.)
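For a standard congeneric measurement model, AVE can be computed directly from the loadings and measurement error variances. A minimal sketch (the exact formula variant used elsewhere on this web site may differ):

```python
def ave(loadings, errors, lv_variance=1.0):
    """Average Variance Extracted (Fornell and Larcker 1981):
    LV-explained indicator variance as a share of total
    (explained + error) indicator variance, assuming uncorrelated
    measurement errors."""
    explained = sum(l ** 2 * lv_variance for l in loadings)
    return explained / (explained + sum(errors))
```

For example, standardized loadings of 0.8 and 0.9 with error variances of 0.36 and 0.19 produce an AVE above the 0.50 benchmark.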

AVE can also be used to gauge discriminant validity (Fornell and Larcker 1981). If the
squared (error-disattenuated or structural equation model) correlation between two LV's is less
than either of their individual AVE's, this suggests that the LV's each have more internal (extracted)
variance than variance shared between them. If this is true for the target LV and all the other
LV's, it suggests the discriminant validity of the target LV.
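The check itself is mechanical; a hypothetical one-function helper (using the conservative reading, squared correlation below both AVE's):

```python
def discriminant_valid(ave_a, ave_b, disattenuated_corr):
    """Fornell-Larcker criterion: the squared (error-disattenuated)
    correlation between two LV's should be below both of their AVE's."""
    return disattenuated_corr ** 2 < min(ave_a, ave_b)
```

For example, two LV's with AVE's of 0.60 and 0.55 remain distinct at a disattenuated correlation of 0.70 (squared: 0.49), but not at 0.80 (squared: 0.64).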

Unfortunately, experience suggests that AVE in LV Interactions and Quadratics is
typically low, frequently less than 50%. For example, although they are not below 50%, note the
lower LV Interaction and Quadratic AVE's in the EXCEL spreadsheet example when compared
to the high AVE’s of X and Z. Thus, to judge the validity of an LV Interaction or Quadratic, it first
must be acceptably reliable (validity assumes reliability). Content or face validity is usually
assumed, unless fewer than all the indicators of the constituent variables are used to itemize the
LV Interaction or Quadratic. Construct or correlational validity is usually difficult to judge, and it
might be ignored. Convergent validity (AVE) should be 0.50 or above (the LV Interaction or
Quadratic should be composed of 50% or less error), and the LV Interaction or Quadratic should
be discriminant valid with the other model LV's, except perhaps its constituent variables (X or Z)
(i.e., it should be empirically distinct from the other model LV's--its AVE should be larger than its
squared correlations with the other LV's). In summary, while there are no hard and fast rules,
reliability, and content, convergent and discriminant validity, are probably sufficient to suggest
the validity of an LV Interaction or Quadratic. Reliability, and content and convergent validity,
would be necessary, and construct (correlational) validity is usually ignored. With an AVE near
0.50, an LV Interaction or Quadratic might be argued to be empirically indistinct from 5-10% of
the other model LV's by chance (depending on reviewers). More than that would suggest the LV
Interaction or Quadratic is discriminant invalid, and its validity is impugned.

Experience suggests that the substantive effect of the typically low AVE's in LV Interactions
and Quadratics is that their structural coefficients and significances vary widely across
replications. Specifically, with an AVE near 0.50, an hypothesized interaction or quadratic can be
significant in one study but nonsignificant in a replication or near-replication. As a result,
replication of a model test with hypothesized interactions or quadratics becomes comparatively
more important: an hypothesized interaction or quadratic that is nonsignificant in a model test
could be significant in a replication, or vice versa.

For an LV Interaction or Quadratic with an AVE below 0.50, the alternative, besides
ignoring AVE and hoping reviewers do likewise, is to improve the AVE of the LV Interaction or
Quadratic. Low AVE in XZ is caused by low correlation between X and Z and/or comparatively
large measurement errors in the items of X and/or Z (i.e., low X and/or Z reliability). (Please see
www.wright.edu/~robert.ping/ImprovXZ_AVEa.doc for more on improving XZ and XX
reliability and validity.)

REFERENCES

Bollen, Kenneth A. (1989), Structural Equations with Latent Variables, New York: Wiley.
Campbell, Donald T. and Donald W. Fiske (1959), "Convergent and Discriminant Validation by the Multitrait-
Multimethod Matrix," Psychological Bulletin, 56, 81-105.
DeVellis, Robert F. (1991), Scale Development: Theory and Applications, Newbury Park, CA: SAGE Publications.
Dillon, William R. and Matthew Goldstein (1984), Multivariate Analysis: Methods and Applications, New York:
Wiley.
Fornell, Claes and David F. Larcker (1981), "Evaluating Structural Equation Models with Unobservable Variables
and Measurement Error," Journal of Marketing Research, 18 (February), 39-50.
Nunnally, Jum C. (1993), Psychometric Theory, 3rd Edition, New York, NY: McGraw-Hill.
Ping, R. A. (1995) “A Parsimonious Estimating Technique for Interaction and Quadratic Latent Variables,” Journal
of Marketing Research, 32 (August), 336-347.
Ping, R. A. (1996), “Latent Variable Interaction and Quadratic Effect Estimation: A Two-step Technique Using
Structural Equation Analysis,” Psychological Bulletin, 119 (January), 166-175.
