Comparison of Instructors' and Students' Perceptions of the Effectiveness of Online Courses
This study used an extensive online course evaluation inventory to analyze the subjects' perceptions of course effectiveness in the following subscales: flexibility, user interface, navigation, getting started, technical assistance, course management, universal design, communications, instructional design, and content. Survey results were used to compare perceptions across instructors, students, and demographic variables, including age, gender, educational level, and course experience. Results indicated that both students and instructors had positive perceptions of course effectiveness, with instructors having higher perceptions than students in some subscales. The results also indicated positive correlations between perceptions and teaching experience, suggesting the need for further research in the flexibility, communications, and online instructional design course effectiveness subscales.
• Soonhwa Seok, The Center for Research on Learning, The University of Kansas, 1000 Sunnyside, Lawrence, KS 66045. Telephone: (785) 864-2700.
The Quarterly Review of Distance Education, Volume 11(1), 2010, pp. 25–36 ISSN 1528-3518
Copyright © 2010 Information Age Publishing, Inc. All rights of reproduction in any form reserved.
… and students. For example, in 2000 the BellSouth Foundation (2003) launched the "Power to Teach Program" to explore ways to help create a critical mass of K-12 teachers capable of incorporating technology into everyday classroom experiences. The data showed that while teachers felt they were making dramatic leaps in using technology to create new learning experiences for their students, students saw few changes in their classroom instruction. In fact, students revealed that they were hungry for more opportunities to use technology in the learning environment.

So, how can there be such a disparity in perceptions? The Web-based learning model—the Model of Community of Inquiry—developed at the University of Alberta, assumes that learning occurs within the community through the interaction of three elements: cognitive presence, social presence, and teaching presence (Garrison, Anderson, & Archer, 2000; Seok, 2006, 2007a, 2007b). One of the contributing factors may have to do with teaching presence. While cognitive and social presences are considered core elements in learning, whether or not learning is achieved depends on the presence of a teacher to facilitate the learning activities. Other contributing factors may be at play as well with regard to social presence.

Although the BellSouth program focused on K-12, in general it sheds light on the importance of perceptions in the classroom. For instructors to deliver more effective Web-based learning in community college settings, we need to know more about the perceptions of online courses held by the producers and consumers of online technology—instructors and students.

Purpose

This study compared instructors' and students' perceptions of the effectiveness of online courses in community college settings. Course effectiveness was analyzed along the following composites: flexibility, user interface, navigation, getting started, technical assistance, course management, universal design, communications, online instructional design, and content. This study was an extension of a validation study conducted by Seok (2006) that piloted a comprehensive online course evaluation instrument at the postsecondary level.

In this study, two survey questionnaires were used to gather students' and instructors' demographic information and their perceptions of online course effectiveness in community colleges. The variables of gender, age, native language, academic major, educational level, technology skill, and experience with online courses were taken into consideration to determine whether they significantly affected subjects' perceptions of the effectiveness of Web-based courses. Educational theories, principles, practices, and research on perceptions of course effectiveness in Web-based instruction formed the research base for this study. The findings may be used to help instructors and course designers gain a better understanding of how to evaluate, design, and deliver more effective Web-based learning environments for each subscale and across students' demographics (Seok, 2007a, 2007b, 2008).

Research Questions

In order to compare instructors' and students' perceptions of the effectiveness of online courses in a community college setting, answers were sought to the following research questions:

• Research Question 1: Are there significant relationships between students' and instructors' perceptions of online course effectiveness and students' and instructors' demographic characteristics (i.e., gender, age, native language, major, educational level, technology skill, and learning experience)?
• Research Question 2: Are there significant differences between students' perceptions and instructors' perceptions of online course effectiveness?
Koohang and Durante (2003) found a significant relationship between subjects' experience with the Internet and their perceptions of the web-based distance learning activities/assignments portion of the hybrid program. In other words, subjects who had more experience with the Internet indicated significantly more favorable perceptions of the web-based distance learning activities and assignments portion than did subjects with less experience with technology and the Internet.

Finally, Jurczyk, Benson, and Savery (2004) used a standards-based approach to measure student perceptions in web-based courses in an effort to develop a process for evaluating perceptions. Standards were based on a literature review and interviews with 147 individuals. Subjects included faculty members, students, and administrators at six leading accredited institutions in distance education. Forty-five benchmarks were identified and organized into seven categories: institutional support, course development, teaching/learning process, course structure, evaluation and assessment, student support, and faculty support. In addition to a process, a questionnaire was developed to measure student attitudes before, during, and after taking the web-based course.

Instructors' Perceptions of Online Course Effectiveness

Positive findings have also been found with regard to instructors' perceptions of online course effectiveness. For example, Wingard (2004) found that a significant number of faculty thought that adding web-improved preparations, for themselves as well as for their students, contributed to greater student engagement and active learning in the classroom. Within an online environment, faculty often felt they were more familiar with their students' academic progress during the term, and they reported a growing expectation that students could take more responsibility for independently learning the fundamentals from the readily available resources provided on the Web.

Guidera (2004) investigated the perceptions of faculty at both public and nonprofit private institutions in the United States—including 2-year institutions, 4-year colleges, and universities—on the effectiveness of online instruction in terms of the seven principles of effective undergraduate education. These seven principles of good instructional practice are to encourage student-faculty contact, encourage cooperation among students, encourage active learning, provide prompt feedback, emphasize time on task, communicate high expectations, and respect diverse talents and ways of learning. The results indicated that online instruction was rated slightly more effective overall, and more effective for promoting prompt feedback, time on task, respect for diverse learning styles, and communication of high expectations. However, it was rated less effective for promoting student-faculty contact and cooperation among students. Interestingly, perceived effectiveness was higher for experienced faculty and increased with the number of online courses taught (Guidera, 2004).

Finally, Seok (2006, 2007a, 2007b, 2008) identified indicators to evaluate online instruction at the postsecondary level. Ninety-nine indicators applicable to the evaluation of online instruction were identified and validated. In doing so, Seok (2006) developed 11 categories as follows: flexibility (e.g., schedules, technical skills, and work settings), user interface (e.g., consistent user interface and appealing screens), navigation (e.g., menus and labels), getting started (e.g., class orientation and performance expectations), technical assistance (e.g., online help and on-call support), course management—instructor (e.g., manage student assignments and monitor student progress), course management—student (e.g., detailed syllabus and benchmarks for completing course requirements), universal design (e.g., accommodate students with disabilities and allow students to vary font size), communication (e.g., individual responses, distribute announcements, discussions, and chat),
online instructional design (e.g., glossary of all terms, content maps of all lessons, and additional resources for enrichment), and content. These validated indicators can be transformed into item scales and subscales to evaluate the effectiveness of online instruction. It is these indicators that are examined in this study. It is hoped that instruments derived from this study's findings may be used to evaluate the productivity and processes of e-learning (Seok, 2007a, 2007b).

METHODOLOGY

Independent Variables

The independent variables included subjects' demographic characteristics and the number of online courses completed (for students) and taught (for instructors).

Dependent Variables

Eleven dependent variables were included to ascertain students' perceptions of online course effectiveness: flexibility, user interface, navigation, getting started, technical assistance, course management (instructor), course management (student), universal design, communications, instructional design, and content.
Validity and Reliability of the Instrument

Validity. Ninety-nine items were developed and solidly validated in a previous study (Seok, 2006) to develop an instrument to evaluate online courses, utilizing 4,856 pairwise comparisons by subject matter experts (SMEs) and implementing multidimensional scaling (MDS).

Reliability. Cronbach's coefficient alphas were used to compute internal consistency estimates of reliability for each subscale (flexibility, user interface, navigation, getting started, technical assistance, course management, universal design, communications, instructional design, and content) of the measurement instrument.

According to Nunnally (1978), 0.7 is an acceptable reliability coefficient. Using an internal consistency estimate of reliability, individuals are administered a measure with multiple parts on a single occasion (Green & Salkind, 2005). In the current instrument, no items needed to be reverse-scaled, since all survey questions presented positively worded statements. Furthermore, all items shared the same metric, since the response scale for all items ranged from 1 = strongly disagree to 5 = strongly agree. All subscales in this study were found to have alpha levels greater than 0.7, indicating acceptable reliability (see Table 1).

In this study, factor analysis consisted of the four typical phases: Phase I—item development, Phase II—item distribution to a large sample, Phase III—factor analysis application, and Phase IV—item revision.

As mentioned earlier, MDS was conducted. This quantitative statistical application is the process of item validation, which corresponds to Phase I in factor analysis. Multidimensional scaling utilizes SMEs to validate the identified and developed items (which is one of the differences between factor analysis and multidimensional scaling) and, for the purposes of this study, consisted of six phases. Phase I involved a literature search to identify indicators crucial to effective online learning. Phase II involved two subject matter experts (SMEs) reaching consensus on the indicators, consisting of a comprehensive set of 99 independent indicators considered to be representative of online course structure for postsecondary education. Phase III involved developing an online MDS instrument to …
TABLE 1
Reliability Statistics (Cronbach's Alpha) for Internal Consistency of Instructors' and Students' Perceptions of Online Course Effectiveness, by Subscale

Subscale                          Alpha (Instructors/Students)   Number of Items
Flexibility                       .72/.76                         6
User interface                    .83/.84                         9
Navigation                        .84/.84                         6
Getting started                   .80/.81                         6
Technical assistance              .83/.81                         4
Course management (instructor)    .89/.91                        10
Course management (student)       .80/.84                         7
Universal design                  .80/.79                         7
Communications                    .87/.87                         8
Online instructional design       .90/.94                        22
Content                           .92/.95                        14
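The subscale alphas in Table 1 are internal consistency estimates of the kind described above. For illustration only, a minimal Python sketch (not the study's code; the item names and simulated responses are hypothetical) of computing Cronbach's alpha for one subscale from item-level Likert responses:

    import numpy as np
    import pandas as pd

    def cronbach_alpha(items: pd.DataFrame) -> float:
        """Cronbach's coefficient alpha for a set of items (columns)
        answered by the same respondents (rows)."""
        items = items.dropna()                     # listwise deletion of incomplete responses
        k = items.shape[1]                         # number of items in the subscale
        sum_item_var = items.var(axis=0, ddof=1).sum()
        total_var = items.sum(axis=1).var(ddof=1)  # variance of respondents' total scores
        return (k / (k - 1)) * (1 - sum_item_var / total_var)

    # Hypothetical 5-point responses to a 6-item subscale (e.g., flexibility);
    # the items share a common factor so that alpha is meaningfully above zero.
    rng = np.random.default_rng(0)
    latent = rng.normal(size=(200, 1))
    raw = 3 + latent + rng.normal(scale=0.8, size=(200, 6))
    responses = pd.DataFrame(np.clip(np.rint(raw), 1, 5),
                             columns=[f"flex_{i}" for i in range(1, 7)])
    print(f"alpha = {cronbach_alpha(responses):.2f}")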
The results of the Seok (2006) study indicated that a three-dimensional solution was an appropriate model for the analysis, with a STRESS fit value of .24222 and an R² of 74%. The three dimensions were labeled accessibility, adaptability, and clarity of communication. Four clusters of indicators were identified: contextual accommodation, instructional access, guided learning, and organizational clarity (Seok, 2006, 2007a, 2007b).
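As an illustration of the multidimensional scaling technique referred to above (not the original study's code or data), a minimal sketch that fits a three-dimensional MDS solution to a precomputed item-dissimilarity matrix and reports a Kruskal-type stress value; the matrix here is randomly generated and purely hypothetical:

    import numpy as np
    from scipy.spatial.distance import pdist, squareform
    from sklearn.manifold import MDS

    # Hypothetical symmetric dissimilarity matrix for n indicators,
    # e.g., aggregated SME judgments of how dissimilar each pair of items is.
    n = 20
    rng = np.random.default_rng(1)
    upper = np.triu(rng.uniform(0.1, 1.0, size=(n, n)), k=1)
    dissim = upper + upper.T                      # symmetric, zero diagonal

    mds = MDS(n_components=3, dissimilarity="precomputed", random_state=0)
    coords = mds.fit_transform(dissim)            # item coordinates in three dimensions

    # Kruskal's Stress-1: sqrt(sum of squared residuals / sum of squared dissimilarities)
    fitted = squareform(pdist(coords))
    stress1 = np.sqrt(((dissim - fitted) ** 2).sum() / (dissim ** 2).sum())
    print(f"Stress-1 = {stress1:.3f}")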
ANOVA tests were used to determine whether:

• the means of subjects' perceptions of the course effectiveness subscales were significantly different across gender, age, native language, academic major, educational level, and technology skill; and
• the means of students' and instructors' perceptions of course effectiveness were significantly different.

The subsequent sections describe the data collected on the demographic section of the surveys, which showed significant findings.

Gender

Gender was found to be a statistically significant factor for instructors and students alike. Both female instructors and female students had significantly higher perceptions of online course effectiveness than males. One-way ANOVAs indicated a significant difference between male and female instructors' perceptions of the following subscales. For getting started, F(2, 191) = 4.97, p < .05, the mean of female instructors was significantly higher (M = 4.2, SD = .57) than the mean of male instructors (M = 4.1, SD = .61). For technical assistance, F(2, 191) = 7.83, p < .05, the mean of female instructors was significantly higher (M = 3.9, SD = .75) than the mean of male instructors (M = 3.6, SD = .80).
TABLE 2
Instructors' and Students' Perceptions of Online Course Effectiveness by Subscale (N, mean, and standard deviation for instructors and students)

TABLE 3
Pearson Correlation Coefficients Between Number of Online Courses Taught and the Subscales
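Table 3 reports Pearson correlations between the number of online courses taught and the perception subscales. A minimal sketch of computing one such correlation on hypothetical data (the values below are illustrative, not the study's):

    import numpy as np
    from scipy import stats

    rng = np.random.default_rng(3)
    # Hypothetical data: online courses taught and mean "communications" subscale rating.
    courses_taught = rng.integers(0, 20, size=194)
    communications = np.clip(3.2 + 0.04 * courses_taught + rng.normal(0, 0.5, size=194), 1, 5)

    r, p = stats.pearsonr(courses_taught, communications)
    print(f"r = {r:.2f}, p = {p:.3f}")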
Native Language

The mean of native-English-speaking students was significantly higher (M = 4.0, SD = .62) than the mean of speakers of other languages (M = 3.4, SD = .88). Non-native-English-speaking students had significantly lower satisfaction with online course effectiveness in the user interface and getting started subscales than native-English-speaking students. Such a finding deserves further attention in future research. Topics might include how computer user interface design affects students' online learning with regard to cultural diversity, and how to provide a better getting-started design in online learning for non-native-English-speaking students of diverse cultural backgrounds. No statistically significant differences were found in instructors' perceptions of the online course effectiveness subscales across instructors' native languages or academic majors.

Educational Level

Significant differences were found across instructors' educational levels (less than high school, high school diploma or GED, associate's degree, bachelor's degree, master's degree, and doctoral degree) in their perceptions of the following online course effectiveness subscales: navigation, F(5, 188) = 2.66, p < .05; getting started, F(5, 188) = 4.32, p < .05; course management, F(5, 188) = 3.22, p < .05; and universal design, F(5, 188) = 3.64, p < .05.

Significant differences also emerged across students' educational levels in their perceptions of online course effectiveness in the subscales of instructional design, F(6, 135) = 3.68, p < .05, and content, F(6, 135) = 5.20, p < .05. The finding for instructors indicates that instructors with higher educational levels may deliver more effectively designed online courses in terms of navigation, getting started, course management, and universal design.

Technology Skills

Results indicated significant differences across instructors' technology skills in their perceptions of online course effectiveness for the following subscales: flexibility, F(3, 190) = 3.81, p < .05; user interface, F(3, 190) = 3.82, p < .05; communications, F(3, 190) = 4.44, p < .05; online instructional design, F(3, 190) = 4.55, p < .05; and content, F(3, 190) = 6.74, p < .05. For the students, statistically significant differences were found across technology skills in their perceptions of online course effectiveness for the content subscale, F(3, 138) = 3.83, p < .05.
TABLE 4
Means of Instructors' Perceptions of the Subscales Across Technology Skill Groups
The flexibility mean value for beginners was higher than the value for instructors with intermediate skills. However, both mean values were significantly lower than the mean value for instructors with advanced technology skills. Thus, increased technology skills may contribute to instructors' perceptions of delivering more effectively designed online courses in the following course effectiveness subscales: flexibility, user interface, communications, online instructional design, and content. As a result, the delivery of effective online courses may depend upon increased technology skills, as illustrated in Table 4.

Comparison Between Instructors' and Students' Perceptions of Online Course Effectiveness

Results of the ANOVA showed that instructors had statistically significantly higher perceptions of online course effectiveness than students in the following subscales: getting started, course management, communications, and content (see Table 1).

Implications

The aforementioned results indicated that instructors had statistically significantly higher perceptions of the effectiveness of online courses than did students. These results deserve further attention and should be researched subsequently. For example, it would be advantageous to explore whether the difference in perceptions is due to the generational gap between what Prensky (2003) refers to as digital natives and digital immigrants.

Results also showed positive correlations between the number of online courses taught and most subscales, except flexibility and navigation. These findings indicated that the delivery of effective courses depends upon teaching experience. Instructors with advanced technology skills had statistically significantly higher perceptions in some course effectiveness subscales, further implying that, with rapid advances in information technology, popular use of course management systems, comfort with using a computer, and increased online learning and teaching experience, come increased positive perceptions of online course effectiveness on the part of both students and instructors.

Furthermore, native-English-speaking students had statistically significantly higher perceptions of online course effectiveness than did non-native-English-speaking students in terms of user interface and getting started. Non-native-English-speaking students had significantly lower satisfaction with online course effectiveness in these subscales than did native English speakers. Such findings may imply that non-native English speakers may need additional assistance when taking online courses.
Suggestions for Future Research

Based on the findings of this study, further research is recommended related to the following course effectiveness subscales: flexibility, communications, and online instructional design. For flexibility, further research is recommended on effective self-paced online learning and teaching, because of the demand for self-paced flexibility from online course students and the concerns expressed by faculty about students' success in the self-paced learning environment. For communications, it is recommended that appropriate communication theories and principles that meet the needs of online learning environments be included in future research. Finally, for online instructional design, it is recommended that future studies address students' learning preferences (Dunn & Dunn, 1978) in online instructional design. It is also the recommendation of the authors that non-native-English speakers' perceptions receive further attention in future research.

CONCLUSION

The descriptive results indicated that, overall, students and instructors had positive perceptions of online course effectiveness. The findings, generally speaking, are in line with past studies investigating the perceptions of instructors and students with regard to online courses.

As previously discussed, there are three major aspects of online learning: cognitive, social, and teaching (Seok, 2006, 2007a, 2007b). The findings of this research related to the teaching aspect, in terms of students whose native language is not English, teaching experience, and technology skills. Teaching experience and technology skills were found to be highly correlated with perceptions of online course effectiveness, while students with native languages other than English had lower perceptions of online course effectiveness.

Taken together, these findings underscore the importance of faculty development in online learning and the development of cognitive and social strategies for students with different cultural and linguistic backgrounds.
APPENDIX: INDEPENDENT VARIABLES

Nominal variables: Gender: 1 = female and 2 = male; Native language: 1 = English and 2 = other; Academic major: 1 = business, 2 = continuing education, 3 = engineering or technology, 4 = fine arts, 5 = humanities, 6 = information technology, 7 = mathematics, 8 = natural sciences, 9 = nursing, dental, or allied health, and 10 = social sciences.

Ordinal variables: Educational level: 1 = less than high school, 2 = high school diploma or GED, 3 = associate's degree, 4 = bachelor's degree, 5 = master's degree, and 6 = doctoral degree; Technology skills: 1 = beginner (students who have minimal experience in using a computer and the Internet), 2 = intermediate (students who use a computer and the Internet on a daily basis), and 3 = advanced (students who have the ability to solve their own problems in using a computer and the Internet).

Interval variable: Number of online courses taught and completed.
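As an illustration of how the appendix coding scheme might be represented for analysis, a short sketch using pandas categorical types; the records and column names below are hypothetical assumptions, not the study's data set:

    import pandas as pd

    # Hypothetical respondent records coded as in the appendix (values are illustrative).
    df = pd.DataFrame({
        "gender": [1, 2, 1],
        "native_language": [1, 1, 2],
        "educational_level": [4, 5, 3],
        "technology_skills": [2, 3, 1],
        "online_courses_completed": [3, 12, 0],   # interval: a simple count
    })

    # Nominal variables: unordered categories.
    df["gender"] = pd.Categorical(df["gender"].map({1: "female", 2: "male"}))
    df["native_language"] = pd.Categorical(df["native_language"].map({1: "English", 2: "other"}))

    # Ordinal variables: ordered categories, preserving the appendix ordering.
    edu_levels = ["less than high school", "high school diploma or GED",
                  "associate's degree", "bachelor's degree",
                  "master's degree", "doctoral degree"]
    df["educational_level"] = pd.Categorical(
        df["educational_level"].map(dict(enumerate(edu_levels, start=1))),
        categories=edu_levels, ordered=True)

    skills = ["beginner", "intermediate", "advanced"]
    df["technology_skills"] = pd.Categorical(
        df["technology_skills"].map(dict(enumerate(skills, start=1))),
        categories=skills, ordered=True)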
REFERENCES

Allen, I. E., & Seaman, J. (2005). Growing by degrees: Online education in the United States. Needham, MA: Sloan-C. Retrieved from http://www.sloan-c.org/resources/growing_by_degrees.pdf

BellSouth Foundation. (2003). The big difference: The growing technology gap between schools and students [Electronic version]. Retrieved from http://www.bellsouthfoundation.org/pdfs/pttreport03.pdf

Dunn, R., & Dunn, K. (1978). Teaching students through their individual learning styles: A practical approach. Reston, VA: Prentice Hall.
Dwyer, D., Barbieri, K., & Doerr, H. (1995). Creating a virtual classroom for interactive education on the Web [Electronic version]. Retrieved from http://www.igd.fhg.de/archive/1995_www95/papers/62/ctc.virtual.class/ctc.virtual.class.html

Garrison, D. R., Anderson, T., & Archer, W. (2000). Critical inquiry in a text-based environment: Computer conferencing in higher education. The Internet and Higher Education, 2(2-3), 87-105.

Green, S. B., & Salkind, N. J. (2005). Using SPSS for Windows and Macintosh: Internal consistency estimates of reliability. Upper Saddle River, NJ: Pearson Prentice Hall.

Guidera, S. G. (2004). Perceptions of the effectiveness of online instruction in terms of the seven principles of effective undergraduate education. Journal of Educational Technology Systems, 32(2 & 3), 139-178.

Illinois Online Network. (2007). Instructional strategies for online courses [Electronic version]. Retrieved from http://www.ion.illinois.edu/resources/tutorials/pedagogy/instructionalstrategies.asp

Jurczyk, J., Benson, S., & Savery, J. R. (2004, Winter). Measuring student perceptions in web-based courses: A standards-based approach [Electronic version]. Online Journal of Distance Learning Administration. Retrieved from http://www.westga.edu/~distance/ojdla/winter74/jurczyk74.htm

Koohang, A., & Durante, A. (2003). Learners' perceptions toward the Web-based distance learning activities/assignments portion of an undergraduate hybrid instructional model. Journal of Information Technology Education, 2, 105-113.

Nunnally, J. (1978). Psychometric theory. New York, NY: McGraw-Hill.

O'Malley, J., & McGraw, H. (1999, Winter). Students' perceptions of distance learning, online learning and the traditional classroom [Electronic version]. Online Journal of Distance Learning Administration. Retrieved from http://www.westga.edu/~distance/omalley24.html

Online Learning. (2005). Online learning: Concepts, strategies and application [Electronic version]. Retrieved from http://www.prenhall.com/dabbagh/ollresources/resources9.html

Prensky, M. (2003, May/June). Overcoming educators' digital immigrant accents: A rebuttal [Electronic version]. The Technology Source Archives. Retrieved from http://technologysource.org/article/overcoming_educators_digital_immigrant_accents

Ragan, L. (1999). Good teaching is good teaching: An emerging set of guiding principles and practices for the design and development of distance education. CAUSE/EFFECT Journal, 22, 1.

Seok, S. (2006). Validation of items by rating the proximity between similarity and dissimilarity among items in pairs for online course evaluation instrument at the postsecondary educational level. Unpublished doctoral dissertation, University of Kansas, Lawrence.

Seok, S. (2007a). Item validation of online postsecondary courses: Rating the proximity between similarity and dissimilarity among item pairs (Validation study series I—Multidimensional scaling) [Electronic version]. Retrieved March 11, 2009, from http://www.springerlink.com/content/wxp145x6125q8j27

Seok, S. (2007b). Standards, accreditations, benchmarks in distance education. Quarterly Review of Distance Education, 8(4), 387-398.

Seok, S. (2008). Teaching aspects of e-learning. The International Journal on E-Learning, 7(4), 725-741.

Tung, C. K. (2007). Perceptions of students and instructors of online and Web-enhanced course effectiveness in community colleges. Unpublished doctoral dissertation, University of Kansas, Lawrence.

Wingard, R. G. (2004). Classroom teaching changes in Web-enhanced courses: A multi-institutional study. Educause Quarterly, 1, 26-35.
AUTHOR BIOGRAPHICAL DATA
Aries Cobb, an assistant professor of educational technology in the Division of Education at Baldwin-Wallace College, works with teaching professionals and teaching candidates to use technology-based instruction in the classroom. Formerly principal investigator of Enhancing Education Through Technology (EETT) for the Cleveland Metropolitan School District, Cobb assessed the EETT program, provided teachers with instructional strategies to integrate technology in the classroom, and assisted teachers in increasing their students' academic achievement by maintaining an e-Portfolio for their students. She is the author of "e-Portfolio: Action Research Team Professional Development Plan," published in Distance Learning. Cobb's research interests relate to cooperative learning and the use of instructional technologies for the improvement of teaching and learning.

Boaventura DaCosta has a BS in computer science and an MA and PhD in instructional systems design. He is a researcher and the cofounder of Solers Research Group, Inc. in Orlando, FL. In addition to his research interests in cognitive psychology and information and communication technology innovations, DaCosta is interested in how games can be used in learning. Complementing his work as a researcher, DaCosta has worked in the commercial and government training sectors for the past 15 years as a software engineer and has been involved in a number of defense programs, including the Warfighters' Simulation, the One Semi-Automated Forces simulation, and Future Combat Systems.

Taurean T. Davis graduated with his master's degree in student affairs/counselor education from Clemson University in Clemson, SC. He serves as career counselor for outreach at the University of Virginia. His research interests include first-generation students, transfer students, and multicultural issues in higher education.

Jianxia Du earned her BA from Southwest Normal University in China, where she later served as an assistant professor. After coming to the United States, she earned an MA in educational policy and technology and a PhD in educational technology at the University of Illinois at Urbana-Champaign. She has enjoyed her role as assistant professor in the Department of Instructional Systems, Leadership, and Workforce Development at Mississippi State University for the past several years. Her research interests include race and gender issues in instructional technology, online discussion, and collaborative learning. Du's professional accomplishments include over 20 articles and professional presentations.
Kerry W. Foxx completed his graduate coursework in student affairs/counselor education at Clemson University in 2009. He is currently the assistant director for the Career and Community Engagement Center at Lewis & Clark College. His research interests include leadership, social justice, and the intersection among leadership, social justice, and civic engagement in higher education.

Pamela Havice is an associate professor at Clemson University. She has published and presented widely over the last 15 years on the topics of distance and distributed learning environments. Presently she is the coordinator of the student affairs/counselor education graduate program and serves as a faculty member in the higher education doctoral program.

William L. (Bill) Havice is the associate dean for academic support services and undergraduate studies in the College of Health, Education and Human Development at Clemson University. In this role, Havice oversees undergraduate curriculum, student support, and technology for the college. He has been actively involved in researching, presenting, and publishing on instructional technology and distributed learning environments for the past 20 years.

Carolyn Kinsell holds a PhD in instructional technology and a certification in human performance. Her career spans more than 18 years, in which she has focused on the application of training, from analysis, to the development of virtual environments, to defining requirements and solutions for human performance standards, and, more recently, to the research and development of training applications. She has worked closely with the military, including cryptologists, intelligence specialists, naval diving and salvage experts, and the Force XXI Battle Command Brigade and Below Joint Capabilities Release. She has also supported commercial clients such as Cingular and North America Honda.

Julie A. McElhany is the coordinator for instructional design in the Department of Instructional Technology and Distance Education at Texas A&M University-Commerce and, as an adjunct faculty member, teaches graduate courses in the educational technology leadership program. Her research interests include effective instructional design and practices related to online learning and adult learners, as well as the integration of educational technology in the classroom. She serves as a member of the Distance Education Advisory Council for the Texas A&M University System and is a member of the eCollege Product Advisory Board. She has been invited to conduct workshops on effective instructional design practices for online learning and technology integration.

Soonhwa Seok has an MA and PhD in curriculum and instruction in special education from the University of Kansas. She has interests in educational communication and technology, with applications for teaching English as a second language and special education. Most recently, as a postdoctoral researcher, she has examined and developed intersensory learning models, assistive technology, and motivation and feedback for students with learning disabilities. Another research focus is assistive technology evaluation, such as functional evaluation for assistive technology and supports intensity scales for implementing assistive technology for students with disabilities. She has served as a peer reviewer for conference proposals, presented on web accessibility, and published articles on distance education and special education technology.

Chan Tung holds a PhD in instructional technology and has been teaching computer systems networking and telecommunications at Kansas City Community College for over 15 years.