Yueming Jia, Youn Joo Oh, Bernadette Sibuma, Frank LaBanca & Mhora
Lorentson
To cite this article: Yueming Jia, Youn Joo Oh, Bernadette Sibuma, Frank LaBanca & Mhora
Lorentson (2016) Measuring twenty-first century skills: development and validation of a
scale for in-service and pre-service teachers, Teacher Development, 20:2, 229-252, DOI:
10.1080/13664530.2016.1143870
Introduction
Teachers have been called to better prepare students for work in the twenty-first cen-
tury workplace. Twenty-first century workforce readiness requires students to be
able to apply knowledge to complex and challenging tasks using skills such as prob-
lem solving, evaluation, reasoning, decision-making, and the ability to use digital
technology (Darling-Hammond 2007; Greenhill 2010; Trilling and Fadel 2009). In
response, curriculum and instruction strategies that incorporate twenty-first century
skills into teaching and learning are being implemented nationwide (Coutinho and
Mota 2011; Gibson 2005; Lowther et al. 2012; O’Sullivan and Dallas 2010;
Thomas, Ge, and Greene 2011). However, there is a shortage of instruments
available to measure teachers’ teaching of twenty-first century skills. This article describes the
development and validation of such a self-report scale for the teaching of twenty-
first century skills. Such a scale can pinpoint specific areas in which pre-service
teacher education and teacher professional development are needed to better
support and enhance twenty-first century skills teaching.
To develop the scale, the authors first generated a pool of items based on a
thorough review of literature. Subsequently, two experts in measurement and cur-
riculum reviewed the items. After revision of the items based on expert reviews, the
items in the scale were tested by two groups of pre-service teachers and two groups
of in-service teachers for revision and validation.
first century learning occurs when a learner gains skills in each of the four clusters.
Acquisition of these skills is considered to lead to academic achievement. Addition-
ally, enGauge stated that the use of these standards in practice would decrease
achievement gaps and provide equal opportunities for individuals from all socioeco-
nomic, racial, and cultural groups.
Similarly, the National Research Council (Hilton 2008, 2010) designated five
key twenty-first century skills: adaptability, complex communication/social skills,
non-routine problem solving, self-management/self-development, and systems think-
ing. These skills have been found to be critical to success in the current and future
workplace. A detailed summary of the five frameworks based on Ruiz-Primo’s
review is provided in Table 1.
While each framework contains similar components, there is a lack of consis-
tency as to how specific indicators of twenty-first century skills are categorized.
Ruiz-Primo (2009) compared the above frameworks, concluded that non-routine
problem solving and complex communication/social skills most closely indicated
science proficiency, and proposed that the dimensions in these frameworks could be
loaded onto cross-functional skills – cognitive skills such as thinking, understanding,
learning, and remembering that are likely to be used in any domain. Cross-functional
skills, in combination with dispositions (general inclinations or attitudes of mind)
and science knowledge, were proposed as complete domains of twenty-first century
skills in the context of science education.
In the meantime, based on NCREL/enGauge, ISTE/NETS, and the Partnership
for 21st Century Skills standards, Costa and Cogan-Drew (2009) further developed
the areas of twenty-first century skills and proposed the following six essentials:
Information Literacy, Collaboration, Communication, Innovation/Creativity, Problem
Solving, and Responsible Citizenship. In our study, we used Costa and Cogan-Drew’s
six skills as the framework for creating the item pool of our twenty-first century
skills teaching measure.
Empirical studies
The majority of articles on twenty-first century skills describe implementation of the
skills but provide little corresponding evidence (Lowther et al. 2012; Moylan
2008; Thomas, Ge, and Greene 2011). Of the empirical studies that have been
Table 1 (excerpt). Summary of twenty-first century skills frameworks.

NCREL’s enGauge framework (2002)
1. Digital age literacy: basic, scientific, economic, and technological literacy skills; visual, information, and multicultural literacies; and global awareness
2. Inventive thinking: an individual’s adaptability to change, management of complexity, curiosity, creativity, risk-taking, and higher order thinking
3. Effective communication: the ability to collaborate and interact with others, and one’s sense of personal, social, and civic responsibility
4. High productivity: being able to prioritize, plan, and manage for results, and to use tools effectively and create relevant, high-quality products

National Research Council (2008)
1. Adaptability: an individual’s ability to function well under new or different circumstances, such as learning new tools and technologies on the job, or handling crises, new personalities, or cultures with ease
2. Complex communication/social skills: being able to effectively articulate ideas as well as incorporate verbal and nonverbal information to generate favorable responses from others, including subordinates and superiors
3. Non-routine problem solving: thinking creatively about solutions to problems or products
4. Self-management/self-development: the ability to work independently, including taking initiative to learn skills needed for a job
5. Systems thinking: the ability to understand the interrelatedness/interconnections between different parts of an organization
skills were imparted by which tools. Likewise, in Lambert and Gong’s (2010) study,
ISTE’s NET-S standards were used to redesign an educational technology course for
pre-service teachers. They examined the effects of the redesign on pre-service teach-
ers’ self-efficacy, attitudes, and acquisition of technology skills. Although the con-
tent of the course was twenty-first century skills, the study specifically focused on
technology skill acquisition, not communication, collaboration, or problem solving.
A few empirical studies map directly onto the stated twenty-first century skills.
Hodge and Lear (2011) conducted a survey of 254 business students and 37 profes-
sors to determine what skills were perceived to be the most important for success in
the twenty-first century workforce. These skills were selected after a literature
review including the Partnership for 21st Century Learning. The researchers asked
United States and international faculty and students to rank 17 twenty-first century
workforce skills according to their perceived importance to successful employment
after graduation. Differences were found in the rankings between faculty and
the United States and international students, and between students and employers.
The United States students ranked managerial skills as the most important skills for
obtaining a job after graduation, while faculty members ranked them significantly
lower and instead considered interpersonal skills to be most valuable. Critical thinking
and problem-solving skills were considered more important to job procurement by
faculty than by the United States students. The researchers also found that the United
States students perceived critical thinking, problem solving, communication, and
creativity to be lower in importance than business executives did.
Sardone and Devlin-Scherer (2010) conducted a qualitative study on teacher can-
didates’ use of digital games with high school and middle school students to teach
twenty-first century skills: critical thinking, problem solving, creativity, innovation,
communication, and collaboration. They determined that peer modelling and posi-
tive student feedback made it more likely that teacher candidates would continue to
use digital games in the classroom.
A study conducted in Malaysia cited the enGauge twenty-first century skills and
the Partnership for 21st Century Learning frameworks and focused on inventive
thinking as it was presented in both frameworks (Abdullah and Osman 2010). Using
a questionnaire framed by the enGauge twenty-first century skills, the study deter-
mined that 410 grade 5 students in Brunei scored higher on inventive thinking than
students in Malaysia, while Malaysian students scored higher on risk taking and
curiosity. Both groups scored low on curiosity and problem-solving skills. Abdullah
and Osman’s research provides an interesting lens to examine cross-cultural differ-
ences in twenty-first century skill acquisition using the existing enGauge twenty-first
century skills and the Partnership for 21st Century Learning frameworks.
In summary, although a few published empirical studies were found, evidence
for the effectiveness of interventions which enhanced twenty-first century skills in
teaching and learning is noticeably lacking. The field is ripe for rigorous research
grounded in a well-developed twenty-first century skills framework.
The present study develops a self-report scale to understand teachers’ perceptions of
their capability and readiness to teach twenty-first century skills to high school
students.
Based on Costa and Cogan-Drew’s (2009) framework of six foundational skills for
twenty-first century success, a pool of 32 items was created tapping six aspects of
twenty-first century skills teaching: Information Literacy, Collaboration, Communi-
cation, Innovation/Creativity, Problem Solving, and Responsible Citizenship. Infor-
mation literacy contained items about teaching students how to locate, select, and
use information. Collaboration contained items related to teaching students to work
in a team, be a team leader, and work independently. Communication included items
on the use of technology to effectively communicate ideas and concepts.
Innovation and Creativity items examined the ability to create an innovative solution
to a scientific or business problem and strategies to determine if an idea for a service
or product is original, useful, and valuable. Problem solving included items on
designing research to solve real-world problems, analyzing data, evaluating questions
and validity of data, and applying research findings. Responsible citizenship was
indicated by items related to respect for individuals from other cultures, responsible
use of technology, and creation of projects or ideas valuable to society.
Expert review
After the establishment of the initial pool of items, two experts in measurement, cur-
riculum, and literature on twenty-first century skills individually reviewed all items.
The experts agreed on the quality of 16 items, which were evaluated based on
clarity of language, relevance to the concept being assessed, and uniqueness of the
idea the item conveyed relative to other items. The 16 items retained the six aspects of
teaching twenty-first century skills: Information Literacy, Collaboration, Communi-
cation, Innovation/Creativity, Problem Solving, and Responsible Citizenship.
Field testing
After the initial creation of the 16-item scale, we administered the scale to teachers
for further revision. Some researchers have argued that teachers’ perceptions were
formed based on both formal knowledge and classroom teaching experience (Novak
and Knowles 1992; Powell 1992). We speculated that the scale may work differently
for pre-service teachers whose teaching perceptions were shaped mainly through
formal knowledge, than for in-service teachers whose teaching perceptions may be
ship. In the current study, the majority of the in-service teachers taught STEM.
Therefore, questions were framed in STEM settings to specifically address twenty-
first century skills when teaching in STEM, e.g. ‘Teaching students to identify
necessary information to accomplish a STEM task.’ Participants rated their level of
confidence in teaching activities defined by each item using a 7-point Likert scale
ranging from 1 (not at all confident) to 7 (completely confident).
Procedures
Two groups of teachers, one of pre-service teachers and the other of in-service
teachers, were administered the 16 items of the 21st Century Skills Teaching
Scale online using a web-enabled computer. Another two groups of pre-service and
in-service teachers were administered 10 items. Consent was collected
prior to survey administration.
Analytic strategies
The purpose of field testing was to help us understand the construct validity of the
scale, providing evidence for scale revision and validation. To achieve this goal, we
relied on two types of analyses: exploratory factor analysis (EFA) and confirmatory
factor analysis (CFA).
The EFA examined the dimensionality and the quality of items, forming the base
for removal or maintenance of factors or items. The number of dimensions was
determined based on scree plot, eigenvalues, and model fit indices. Model fit indices
such as standardized root mean square residual (SRMR), root mean square error of
approximation (RMSEA), comparative fit index (CFI), and the Tucker–Lewis index
(TLI) were used as the criteria for goodness of fit.1 The chi-square value was
also provided; however, chi-square is often biased by sample size. Therefore, χ²/
df was calculated, with a value close to 2 indicating a good fit of the model (Bentler
1995; Byrne 2001). Item quality was examined based on factor loadings. A
high-quality item should have a factor loading larger than .45 and no cross-loading
with other factors.2 The quality of dimensions was evaluated according to the number
of items (more than three) and the quality of the items in the dimension. In each round
of analysis, low-quality items and dimensions were removed and new rounds
of analysis were conducted. After several rounds of EFAs, a factor model containing
high-quality items and dimensions was selected and examined in a CFA model
using a new group of participants.
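The item-quality rule described above (a primary loading above .45 with no cross-loading) can be sketched as a short screening routine. This is an illustrative sketch, not the authors' code, and the loading matrix below is invented for demonstration.

```python
import numpy as np

def screen_items(loadings, primary_min=0.45, cross_max=0.35):
    """Flag EFA items for retention.

    loadings: (n_items, n_factors) array of rotated factor loadings.
    An item is retained only if its largest absolute loading exceeds
    primary_min and no other loading exceeds cross_max.
    """
    L = np.abs(np.asarray(loadings, dtype=float))
    keep = []
    for row in L:
        primary = row.argmax()
        others = np.delete(row, primary)
        keep.append(bool(row[primary] > primary_min and not (others > cross_max).any()))
    return np.array(keep)

# Hypothetical loadings: item 3 cross-loads, item 4 loads weakly.
loadings = [
    [0.72, 0.10],  # clean item on factor 1
    [0.80, 0.05],  # clean item on factor 1
    [0.50, 0.40],  # cross-loading above .35: remove
    [0.30, 0.20],  # primary loading below .45: remove
    [0.08, 0.66],  # clean item on factor 2
]
print(screen_items(loadings))  # -> [ True  True False False  True]
```

Applied round after round, a rule like this reproduces the iterative removal process described above.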
The CFA was conducted to verify the validity of the items and factors identified in the EFA.
The goodness of fit for the CFA model was judged based on the same model fit
indices used in the EFA such as SRMR, RMSEA, CFI, and TLI. Cronbach’s alpha
of the final model was also calculated as an estimate of the reliability.
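Cronbach's alpha, used here as the reliability estimate, can be computed directly from an item-score matrix. The sketch below uses made-up 7-point ratings, not the study's data.

```python
import numpy as np

def cronbach_alpha(scores):
    """Cronbach's alpha for an (n_respondents, n_items) score matrix."""
    X = np.asarray(scores, dtype=float)
    k = X.shape[1]
    item_vars = X.var(axis=0, ddof=1)      # variance of each item
    total_var = X.sum(axis=1).var(ddof=1)  # variance of the sum score
    return (k / (k - 1)) * (1 - item_vars.sum() / total_var)

# Hypothetical 7-point Likert responses from five teachers on three items.
ratings = np.array([
    [6, 5, 6],
    [4, 4, 5],
    [7, 6, 7],
    [3, 4, 3],
    [5, 5, 6],
])
print(round(cronbach_alpha(ratings), 2))  # -> 0.94
```

Because the three invented items move together across respondents, alpha is high, much like the .93 reported for the in-service scale below.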
All analyses were conducted using Mplus Version 6.1 (Muthén and Muthén
1998–2010). Maximum likelihood estimation was applied to a correlation matrix. The
potential factors in the scale were assumed to correlate with each other, and oblique
rotation was applied to reach an interpretable solution. Although Likert scale vari-
ables are categorical in nature, the use of seven points allowed us to treat them as
continuous variables (Johnson and Creech 1983).
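The eigenvalue criterion used to choose the number of dimensions (factors with eigenvalues above 1, per the notes) can be illustrated on a toy correlation matrix. The matrix below is invented so that two item pairs form two clear factors; it is not drawn from the study.

```python
import numpy as np

# Hypothetical correlation matrix for four items: items 1-2 and items 3-4
# form two correlated pairs, so two factors should be retained.
R = np.array([
    [1.0, 0.8, 0.1, 0.1],
    [0.8, 1.0, 0.1, 0.1],
    [0.1, 0.1, 1.0, 0.8],
    [0.1, 0.1, 0.8, 1.0],
])

eigvals = np.sort(np.linalg.eigvalsh(R))[::-1]  # descending eigenvalues
n_factors = int((eigvals > 1).sum())            # Kaiser criterion: eigenvalue > 1
explained = eigvals / eigvals.sum()             # fractions a scree plot would show

print(n_factors)            # two factors retained
print(np.round(explained, 2))
```

Here the first two eigenvalues (2.0 and 1.6) exceed 1 and together account for 90% of the variance, while the remaining eigenvalues drop off sharply, which is the pattern a scree plot makes visible.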
Pre-service teachers
Participants
This sub-study involved two groups of pre-service teachers. Group 1 included 151
pre-service teachers studying at the school of education in one Canadian university.
Seventy-five percent of the participants were female. Fifty percent of this sample
focused on social science education, 17% on art education, 14% on science educa-
tion, 2% on music education, 1% on technology education, 4% on other areas, and
13% failed to report their area of study. EFAs were conducted with data collected
from this group.
Group 2 involved 70 pre-service teachers. Sixty-one percent of the participants
were female. Sixteen percent of them focused on science education, 16% on math
education, and 69% on other areas. CFAs were conducted with data collected from this group.
EFA
Based on indicators such as eigenvalues, the scree plot, and model fit indices,3 a promis-
ing four-construct model of the scale was revealed after the first round of factor analy-
sis, meaning that the items clustered into groups that measured four
different constructs. A review of the items in the four clusters showed that the items
assessed constructs related to utility of technology, collaboration, innovation and
problem solving, and responsible citizenship (see Table 3) respectively. However, all
items in the sub-construct of responsible citizenship simultaneously loaded on
multiple clusters, indicating that this construct is not uniquely defined by its items.
Therefore, we dropped this construct from the scale. Correspondingly, the two items
that primarily loaded on responsible citizenship, ‘Using reflective practices (e.g. blog
journal, discussions with mentor, teacher, or peer) to foster a lifelong learning pro-
cess’ and ‘Conducting a project that has a value to society’ were also removed from
the scale. Using the remaining 14 items, we conducted another round of factor
analysis and examined the items and factors again. A total of three rounds of factor
analyses were conducted before a final acceptable model was identified. During
these analyses, another four items were removed from the scale because they
cross-loaded on multiple clusters, loaded only weakly on a cluster, failed to load on
any cluster, or belonged to a low-quality factor.
In the fourth round of factor analysis, a three-construct solution containing 10
items (Table 3) was eventually identified. All items in this model clearly loaded on
one factor or another and had a loading larger than .45. The model fit indices
(Table 2) revealed that this three-factor solution had an acceptable model fit. The 10
items measured three constructs: Construct 1, utility of technology, contained three
items; Construct 2, collaboration, contained three items; and Construct 3, innovation
and problem solving, contained four items (Table 3). The three constructs explained 68%
of the total variance. Correlations among the three constructs ranged from .32 to
.47, which indicated they were separate but related.4 The internal consistency
reliability of each factor, which indicated how much the items were correlated and
measured the same idea, was acceptable with .78 for Construct 1, .74 for Construct
2, and .81 for Construct 3.
In-service teachers
Participants
This sub-study involved two groups of in-service teachers. Group 1 included 158 in-
service STEM teachers. Sixty-three percent of the participants were female. Forty-
three percent were science teachers, 24% math teachers, 22% technology teachers,
and 11% taught other subjects. The majority (81%) had more than five years of
teaching experience. EFAs were conducted with data collected from this group.
Group 2 involved 95 in-service STEM teachers. Sixty-three percent of the partic-
ipants were female. Forty percent were science teachers, 22% math teachers, 37%
technology teachers, and 1% did not identify their teaching area. The majority (77%)
had more than five years of teaching experience. CFAs were conducted with data
collected from this group.
Furthermore, the KMO measure of sampling adequacy for the sample was .93,
indicating good factorability of the current sample.
EFA
The first round of analysis revealed a promising three-construct model,6 in which
the 16 items clustered into three groups. However, two of the three constructs were
highly correlated (r = .75). Therefore, the two constructs were merged into one, which
led to a two-construct solution. One group of items assessed skills related to utility
of technology; the other included items assessing collaboration, communication,
innovation and creativity, problem solving as well as responsible citizenship, which
corresponded to Ruiz-Primo’s (2009) definition of cross-functional skills. Therefore,
we named this construct cross-functional skills (see Table 4). Two items, ‘Teaching
students to identify necessary information to accomplish a STEM task’ and ‘Teach-
ing students to take the lead on a group STEM project’, cross-loaded on both
factors (see Table 4). Thus, these items were removed from the scale and a
new round of analysis was conducted with the remaining 14 items.
After four rounds of analyses, a good-quality one-construct model with 10 items
was revealed, measuring cross-functional skills. The one-construct model showed an
acceptable fit (see Table 4), and explained 62% of the total variance. The internal
consistency of the scale was also high (.93).
skills. Pair 2 comprised items associated with creating ideas: ‘Engaging students
in identifying real-world challenges or problems in STEM areas’ and ‘Teaching stu-
dents to evaluate the quality of an idea for a STEM product’. Pair 3 focused on
STEM application: ‘Teaching students to evaluate the validity of data or evidence
collected from a STEM product’ and ‘Encouraging students to apply STEM con-
cepts to solve problems in other areas’. Therefore, correlations between these pairs
were added to the model. However, the items in Pair 4, ‘Engaging students in
collaborating with peers to achieve a goal on a STEM project’ and ‘Engaging
students in making oral presentations to clearly communicate STEM topics’, were
not conceptually connected, and thus no correlation for this pair was included in the model.
The new model, with the three pair correlations added, showed an improved and
acceptable fit (Table 7). The internal consistency of this one-dimensional scale was
high (.96). In sum, the CFA showed that this one-dimensional 21st Century Skills
Teaching Scale had good construct validity.
Discussion
The goal of the study was to develop a self-report scale to understand teachers’ per-
ceptions of their capability and readiness to teach twenty-first century
skills. After a literature review, pilot test, and expert reviews, a pool of 16 items was
developed and examined to understand its dimensionality and item quality in
two groups: one of pre-service teachers and the other of in-service teachers.
The results identified two different scales for pre-service teachers and in-service
STEM teachers. The former was a 10-item three-dimensional scale including utility
of technology, collaboration, and innovation and problem solving; the latter was a
10-item uni-dimensional scale tapping into teaching of what Ruiz-Primo (2009)
called ‘cross-functional skills’. The factor structures and item quality of the two
identified scales were confirmed in further analyses with two new samples of
pre-service and in-service teachers.
Consistent with the concept of the ‘Four Cs’ of twenty-first century skills (Beers
2001), both scales contain items tapping into the teaching of creative and
innovative thinking, problem-solving skills, effective communication, and working
in a group. However, the categorization of these items did not map well with the
hypothesized subcategories (information literacy, collaboration, communication,
innovation and creativity, problem solving, and responsible citizenship) that served
as the basis for this scale. In the two studies, the six categories either collapsed
together or fell out of the scale, resulting in one (in-service teachers) or three
(pre-service teachers) broader categories. For instance, items in the innovation and
creativity and problem solving categories were loaded together in both studies.
Moreover, for in-service teachers, these items were related to items from collabora-
tion, communication, and responsible citizenship, leading to one single category.
This supports Ruiz-Primo’s (2009) proposition that twenty-first century skills are
highly intertwined and could be framed in fewer broad categories.
A new dimension, utility of technology, was identified for pre-service teachers.
This dimension consisted of three items, ‘Teaching students to use digital tools to
locate information’, ‘Teaching students to use technology tools to clearly communi-
cate concepts’, and ‘Teaching students to use technology in a responsible way’. The
first item was designed to test information literacy teaching, the second to tap into the
issue of teaching communication, and the last item to examine teaching of responsible
citizenship. However, the three items did not load with the other items in respective
hypothesized categories; rather, they loaded together into an independent factor
termed utility of technology. The utility of technology category also formed at one
point for in-service teachers, but was eventually removed from the scale due to the
poor quality of the factor and its items (only two items loaded on the factor, and one
of them was double loaded). A revision of the items may lead to the establishment of this dimension
for in-service teachers. The utility of technology dimension was supported by both
the enGauge framework (Lemke 2002) and NRC Technological Literacy (Gamire and
Pearson 2006). Future studies are needed to develop items tapping the issues in this
dimension using samples of in-service teachers.
For both scales, the category of responsible citizenship failed to be identified. For
pre-service teachers, the four items of responsible citizenship loaded together into
one factor in the first round of analysis. However, each of them also cross-loaded on
other factors, indicating that it is not a distinguishable category. After several rounds
of analysis, two items in this category could not be identified with any established
factors and eventually were removed from the scale. The other two items were sepa-
rately loaded on two different factors. For in-service teachers, the four items did not
load on an independent factor in any round of analysis. Finally, only two of the
items remained in the scale. These results indicate that responsible citizenship is not
a well-recognized domain of twenty-first century skills teaching.
In this study, a pool of items tapping twenty-first century skills teaching was
tested in two groups: in-service STEM teachers and pre-service teachers. The analy-
ses revealed a different scale for each group. According to Richardson (1996),
personal experience, experience with school and instruction, and experience with
formal knowledge shape pre-service teachers’ pedagogical beliefs. Similarly, in-ser-
vice teachers’ pedagogical beliefs are formulated based on the same three sources,
but with an addition of knowledge gained through current teaching practice (Novak
and Knowles 1992; Powell 1992). Therefore, we speculate that the different percep-
tions of twenty-first century skills between in-service and pre-service teachers may
represent some gaps between theories of twenty-first century skills teaching in tea-
cher education programs and how this understanding is actually put into practice
within the classroom setting. Furthermore, in this study, the in-service teachers
taught mainly STEM; the term ‘STEM’ was added to most of the items to specify
the subject domain in the in-service teacher survey. Thus, the different perceptions
of pre-service teachers and in-service teachers revealed in the study may represent
domain-specific characteristics of twenty-first century skills. Future studies are
needed to examine these hypotheses by testing pre-service teachers in the STEM-
specific domain and in-service teachers in the general domain.
Disclosure statement
No potential conflict of interest was reported by the authors.
Funding
This work was supported by the Research and Evaluation Program at the Nellie Mae
Education Foundation.
Notes
1. Conventionally, SRMR smaller than .05, RMSEA smaller than .06, CFI larger than .96,
and TLI larger than .95 are considered to indicate a good fit (Hu and Bentler 1999),
whereas SRMR smaller than .08, RMSEA smaller than .1, CFI larger than .90, and TLI
larger than .90 are considered to indicate an acceptable fit (Browne and Cudeck 1993).
2. Loading size larger than .60 was considered to be high and larger than .45 was
considered acceptable. Loading size larger than .35 in over one factor was considered as
cross-loading (Pett, Lackey, and Sullivan 2003; Silvera, Martinussen, and Dahl 2001;
Tabachnick and Fidell 2001).
3. Four factors had an eigenvalue larger than 1, indicating a four-factor solution. The scree
plot also showed that from the fourth factor on, the fractions of the total variance
explained by each successive factor were minimal. Model fit indices for each of the four
factor models are shown in Table 2.
4. According to Cohen (1988), a correlation of .30 to .50 is moderate.
5. The factor loadings ranged from .75 to .81 for Utility of Technology, .55 to .82 for
Collaboration, and .64 to .81 for Innovation and Problem Solving.
6. Three factors had an eigenvalue larger than 1, indicating a three-factor solution. The
scree plot showed that, after the third factor, the fraction of the total variance explained
by each successive factor dramatically dropped. Model fit indices for each of the four
factor models are shown in Table 4.
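The fit-index cutoffs listed in note 1 amount to a simple decision rule. As an illustrative sketch (not part of the original analysis), a model's fit could be classified as follows:

```python
def classify_fit(srmr, rmsea, cfi, tli):
    """Classify model fit using the conventional cutoffs cited in note 1."""
    if srmr < 0.05 and rmsea < 0.06 and cfi > 0.96 and tli > 0.95:
        return "good"
    if srmr < 0.08 and rmsea < 0.10 and cfi > 0.90 and tli > 0.90:
        return "acceptable"
    return "poor"

# Hypothetical fit indices for three models.
print(classify_fit(srmr=0.04, rmsea=0.05, cfi=0.97, tli=0.96))  # prints "good"
print(classify_fit(srmr=0.07, rmsea=0.08, cfi=0.92, tli=0.91))  # prints "acceptable"
print(classify_fit(srmr=0.12, rmsea=0.15, cfi=0.85, tli=0.80))  # prints "poor"
```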
Notes on contributors
Yueming Jia, PhD, conducted this work while a Senior Research Associate in the Learning
and Teaching Division at Education Development Center, Inc. She is now a primary research
scientist at ABA Research & Educational Consulting in Cambridge, MA.
Youn Joo Oh, EdD, formerly a Project Director in the Learning and Teaching Division at
Education Development Center, Inc., is now a primary research scientist at ABA Research &
Educational Consulting in Cambridge, MA.
Bernadette Sibuma, EdD, is a Research Associate in the Learning and Teaching Division at
Education Development Center, Inc. in Waltham, MA.
Frank LaBanca, EdD, is the Director of the Center for 21st Century Skills at EDUCATION
CONNECTION.
Mhora Lorentson, PhD, is the Director of the Center for Collaborative Evaluation and Strate-
gic Change at EDUCATION CONNECTION.
References
AASL (American Association for School Librarians). 2007. The American Association for
School Librarians Standards for the 21st Century Learner. Chicago: The American Asso-
ciation for School Librarians.
Abdullah, M., and K. Osman. 2010. “Inventive Thinking Skills in Science: A Comparative
Study between Students in Malaysia and Brunei.” International Journal of Learning 17
(9): 227–236.
Beers, S. 2001. Teaching 21st Century Skills: An ASCD Action Tool. Alexandria: ASCD.
Bentler, P. M. 1995. EQS Structural Equations Program Manual. Encino: Multivariate Soft-
ware.
Browne, M. W., and R. Cudeck. 1993. “Alternative Ways of Assessing Model Fit.” In Testing
Structural Equation Models, edited by K. A. Bollen and J. S. Long, 136–162. Thousand
Oaks: Sage.
Byrne, B. M. 2001. Structural Equation Modeling with AMOS: Basic Concepts, Applications
and Programming. Mahwah: Erlbaum.
Cohen, J. 1988. Statistical Power Analysis for the Behavioral Sciences. 2nd ed.
Mahwah: Lawrence Erlbaum.
Costa, J. P., and D. Cogan-Drew. 2009. Six Skills That Form the Foundation for 21st Century
Success. Litchfield: EDUCATION CONNECTION.
Coutinho, C., and P. Mota. 2011. “Web 2.0 Technologies in Music Education in Portugal:
Using Podcasts for Learning.” Computers in the Schools 28 (1): 56–74.
Darling-Hammond, L. 2007. “Building a System for Powerful Teaching and Learning.” In
Building a 21st Century U.S. Education System, edited by B. Wehling and C. Schneider,
65–74. Washington, DC: National Commission on Teaching and America’s Future.
Gamire, E., and G. Pearson. 2006. Tech Tally: Approaches to Assessing Technological Liter-
acy. Washington, DC: The National Academies Press.
Gibson, I. W. 2005. “Constructing Meaning in a Technology-rich, Global Learning Environ-
ment.” Computers in the Schools 22 (1–2): 169–182.
Gorsuch, R. L. 1983. Factor Analysis. 2nd ed. Mahwah: Lawrence Erlbaum.
Greenhill, V. 2010. 21st Century Knowledge and Skills in Educator Preparation. Washington,
DC: Partnership for 21st Century Skills.
Hilton, M. 2008. Research on Future Skill Demands: A Workshop Summary. Washington,
DC: The National Academies Press.
Hilton, M. 2010. Exploring the Intersection of Science Education and 21st Century Skills: A
Workshop Summary. Washington, DC: The National Academies Press.
Hodge, K. A., and J. L. Lear. 2011. “Employment Skills for 21st Century Workplace: The
Gap between Faculty and Student Perceptions.” Journal of Career and Technical Educa-
tion 26 (2): 28–41.
Hu, L., and P. M. Bentler. 1999. “Cutoff Criteria for Fit Indexes in Covariance Structure
Analysis: Conventional Criteria Versus New Alternatives.” Structural Equation Model-
ing: A Multidisciplinary Journal 6 (1): 1–55.
International Society for Technology in Education. 2000. “ISTE National Educational Tech-
nology Standards.” Accessed June 2012. http://www.iste.org/standards.
Johnson, D. R., and J. C. Creech. 1983. “Ordinal Measures in Multiple Indicator Models: A
Simulation Study of Categorization Error.” American Sociological Review 48: 398–407.
Lambert, J., and Y. Gong. 2010. “21st Century Paradigms for Pre-Service Teacher Technology
Preparation.” Computers in the Schools 27 (1): 54–70. doi:10.1080/07380560903536272.
Lemke, C. 2002. enGauge 21st Century Skills: Digital Literacies for a Digital Age. Naper-
ville: North Central Regional Educational Laboratory.
Lowther, D. L., F. A. Inan, S. M. Ross, and J. Strahl. 2012. “Do One-to-One Initiatives
Bridge the Way to 21st Century Knowledge and Skills?” Journal of Educational Comput-
ing Research 46 (1): 1–30.
Mayrath, M., J. Clarke-Midura, D. H. Robinson, and G. Schraw. 2012. Technology-Based
Assessments for 21st Century Skills: Theoretical and Practical Implications from Modern
Research. Charlotte: Information Age Publishing.
McCreery, M. P., P. G. Schrader, and S. Krach. 2011. “Navigating Massively Multiplayer
Online Games: Evaluating 21st Century Skills for Learning within Virtual Environ-
ments.” Journal of Educational Computing Research 44 (4): 473–493. doi:10.2190/
EC.44.4.f.
Moylan, W. 2008. “Learning by Project: Developing Essential 21st Century Skills Using
Student Team Projects.” International Journal of Learning 15 (9): 287–292.
Muthén, L. K., and B. O. Muthén. 1998–2010. Mplus Version 6.1 [Computer Software]. Los
Angeles: Muthén and Muthén.
Novak, D., and J. G. Knowles. 1992. “Life Histories and the Transition to Teaching as a Sec-
ond Career.” Paper presented at the annual meeting of the American Educational
Research Association, Chicago, IL, April.
O’Sullivan, M. K., and K. B. Dallas. 2010. “A Collaborative Approach to Implementing 21st
Century Skills in a High School Senior Research Class.” Education Libraries 33 (1):
3–9.
Partnership for 21st Century Skills. 2002. “Learning for the 21st Century: A Report and Mile
Guide for 21st Century Skills.” http://www.21stcenturyskills.org/images/stories/otherdocs/
p21up_Report.pdf
Pett, M. A., N. R. Lackey, and J. J. Sullivan. 2003. Making Sense of Factor Analysis: The
Use of Factor Analysis for Instrument Development in Health Care Research. Thousand
Oaks: Sage.
Powell, R. 1992. “The Influence of Prior Experiences on Pedagogical Constructs of Tradi-
tional and Nontraditional Preservice Teachers.” Teaching and Teacher Education 8 (3):
225–238.
Richardson, V. 1996. “The Role of Attitudes and Beliefs in Learning to Teach.” In Handbook
of Research on Teacher Education, 2nd ed., edited by J. Sikula, 102–119. New York:
Macmillan.
Ruiz-Primo, M. A. 2009. “Towards a Framework for Assessing 21st Century Science Skills.”
Paper prepared for the Workshop on Exploring the Intersection of Science Education and
the Development of 21st Century Skills, Washington, DC, February. http://www7.nation
alacademies.org/bose/RuizPrimo.pdf
Sardone, N. B., and R. Devlin-Scherer. 2010. “Teacher Candidate Responses to Digital
Games.” Journal of Research on Technology in Education 42 (4): 409–425.
Silvera, D. H., M. Martinussen, and T. I. Dahl. 2001. “The Tromsø Social Intelligence Scale,
a Self-report Measure of Social Intelligence.” Scandinavian Journal of Psychology 42:
313–319.
Smith, J. J., and E. Dobson. 2011. “Beyond the Book: Using Web 2.0 Tools to Develop 21st
Century Literacies.” Computers in the Schools 28 (4): 316–327. doi:10.1080/
07380569.2011.620939.
Tabachnick, B. G., and L. S. Fidell. 2001. Using Multivariate Statistics. 4th ed. Boston:
Allyn and Bacon.
Thomas, M. K., X. Ge, and B. A. Greene. 2011. “Fostering 21st Century Skill Development
by Engaging Students in Authentic Game Design Projects in a High School Computer
Programming Class.” Journal of Educational Computing Research 44 (4): 391–408.
Trilling, B., and C. Fadel. 2009. 21st Century Skills: Learning for Life in Our Times. San
Francisco: John Wiley & Sons.