
Assessment & Evaluation in Higher Education

ISSN: 0260-2938 (Print) 1469-297X (Online) Journal homepage: http://www.tandfonline.com/loi/caeh20

The impact of a university teaching development programme on student approaches to studying and learning experience: evidence from Chile

Jorge Marchant, Carlos González & Jaime Fauré

To cite this article: Jorge Marchant, Carlos González & Jaime Fauré (2017): The impact
of a university teaching development programme on student approaches to studying and
learning experience: evidence from Chile, Assessment & Evaluation in Higher Education, DOI:
10.1080/02602938.2017.1401041

To link to this article: http://dx.doi.org/10.1080/02602938.2017.1401041

Published online: 10 Nov 2017.


Full Terms & Conditions of access and use can be found at


http://www.tandfonline.com/action/journalInformation?journalCode=caeh20

Download by: [University of Sussex Library] Date: 13 November 2017, At: 07:19
Assessment & Evaluation in Higher Education, 2017
https://doi.org/10.1080/02602938.2017.1401041

The impact of a university teaching development programme on student approaches to studying and learning experience: evidence from Chile

Jorge Marchant^a, Carlos González^b and Jaime Fauré^c

^a Vicerrectoría de Pregrado, Universidad Diego Portales, Santiago de Chile, Chile; ^b Facultad de Educación, Pontificia Universidad Católica de Chile, Santiago de Chile, Chile; ^c Vicerrectoría Académica, Universidad Andrés Bello, Santiago de Chile, Chile

ABSTRACT
In this paper, we analyse the impact of teacher participation in a University Teaching Diploma on student approaches to studying and learning experience. A quasi-experimental and multilevel design was employed. University teachers answered the Approaches to Teaching Inventory and students completed the Course Experience Questionnaire and the Study Process Questionnaire. In addition, contextual variables were included for both teachers and students. The total sample included 44 teachers and 686 students. Of these, 25 university teachers had completed the University Teaching Diploma and 19 had not; 373 students were in courses with a diploma teacher and 313 in courses without one. Results show that teachers who completed the programme had, in their courses, students who were more likely to declare having adopted a deep approach to studying than teachers who had not participated in the diploma. At the same time, no significant impact was found on the student learning experience. For practical purposes, this investigation provides evidence for the value of teaching development programmes in promoting deeper approaches to studying. For research purposes, it proposes the use of multilevel models to evaluate the impact of university teaching diplomas.

KEYWORDS
Teaching development programme; evaluation impact; approaches to studying; learning experience

Introduction
In Chile, the quality of university teaching and its impact on the student learning experience has come
under scrutiny in recent years. Two phenomena have led to the emergence of this questioning. In the
first place, enrolments have expanded significantly: undergraduate enrolment grew from 149,689 in 1994
to 1,165,654 in 2015 (CNED 2011, 2014; Zapata and Tejeda 2016) and, currently, 7 out of 10 students
are the first in their families to attend higher education institutions (OECD 2013). Secondly, reports
from international agencies have shown that teaching-centred approaches to teaching are common
in Chilean universities and, at the same time, that it is difficult to engage teachers in implementing
teaching innovations (OECD 2009; World Bank 2011).
Both government agencies and universities have been aware of these problems and have, hence, implemented initiatives to improve the situation. Between 2005 and 2010, a considerable number of universities
created centres for teaching development. These centres emerged with the aim of supporting university teachers to professionalise their teaching and, in this way, improve the learning experience they provide their students. During their relatively short period of existence, the centres have organised a range of professional development activities, such as workshops, class observation and feedback, and university teaching diplomas, the diplomas being the most relevant of these.

CONTACT Jorge Marchant jorge.marchant@udp.cl, jorge.marchant.m@gmail.com
© 2017 Informa UK Limited, trading as Taylor & Francis Group
The diplomas are quite similar amongst universities. In terms of contents, they include modules on
conceptions of teaching and learning, course design, active learning methods and information and
communication technology integration. Diplomas are relatively long – lasting for one or two years – and
require a significant amount of time and commitment from teachers. Their implementation is relatively
recent: some of them are in their first version, while others are being revised. They are usually mandatory
for university teachers starting their careers and voluntary for others (Marchant 2017).
Despite the importance of these diplomas in promoting better university teaching in Chilean uni-
versities, there has been no formal evaluation to date of their impact. Thus, conducting research that
explores this issue is both timely and relevant. In this article, we present a study that evaluates the
impact on student approaches to studying and learning experience for one of these programmes: the
University Teaching Diploma implemented by the Unit for Educational Innovation at the University of
Santiago, Chile. To this end, a quasi-experimental multilevel design was employed. Our results show
the value of university teaching diplomas: teachers who participated are more likely to have students
reporting they adopt deeper approaches to studying.

Research on the impact of teaching development programmes


University teaching diplomas have become common practice in universities around the world (see
e.g. Gibbs and Coffey 2004; Postareff, Lindblom-Ylänne, and Nevgi 2007; Trigwell, Caballero Rodríguez,
and Han 2012). Research on learning and teaching in higher education has demonstrated that deep
approaches to studying are more likely to be adopted in contexts where teachers present learning-fo-
cused approaches to teaching, and are able to productively organise key elements of the learning expe-
rience (e.g. good teaching, appropriate assessment, appropriate workload, clarity of goals, intellectual
environment) (Ramsden 1998, 2003; Entwistle 2000; Trigwell, Prosser, and Waterhouse 1999; Biggs and
Tang 2011). This evidence has highlighted the importance of teachers participating in development
programmes in which they embrace student-centred approaches to teaching. At the same time, univer-
sities have encouraged these programmes because: (1) there is a need to promote quality learning in a
context of mass higher education, with an increasingly diverse population of students (Trigwell, Prosser,
and Waterhouse 1999), and (2) teaching quality is a key element that influences institutional quality
and therefore needs to be addressed systematically (Trigwell, Caballero Rodríguez, and Han 2012).
Two types of studies have explored the impact of teaching development programmes. The first
group focuses on their impact on university teachers only, while the second type advances to investi-
gate their impact on learning.
Regarding the first group, research shows that there is a positive effect both in decreasing the use
of teacher-centred approaches to teaching and increasing the use of student-centred approaches, as
well as in strengthening self-efficacy beliefs (Postareff, Lindblom-Ylänne, and Nevgi 2007; Hanbury,
Prosser, and Rickinson 2008). However, the authors reporting these results found that change towards
student-centred approaches is slow and progressive: teachers show some significant changes only
after at least one year of participation in development programmes (Postareff, Lindblom-Ylänne, and
Nevgi 2008).
In relation to the second group, research has shown inconclusive results: studies by Ho, Watkins,
and Kelly (2001), Gibbs and Coffey (2004) and Trigwell, Caballero Rodríguez, and Han (2012) reported a
positive impact; while studies by Stes et al. (2012a, 2012b) and Stes, Coertjens, and Van Petegem (2013)
reported no, limited or even negative impact.
Studies that show an impact are related to changes in students’ approaches to studying and
improvements in students’ evaluation of teaching. Concerning changes in students’ approaches to
studying, research has shown that teachers who developed student-centred approaches to teaching
were more likely to have students adopting deep approaches to studying, while those who maintained
teacher-centred approaches were more likely to have students adopting surface approaches (Ho,
Watkins, and Kelly 2001; Gibbs and Coffey 2004). Besides, teachers who participated in teaching devel-
opment programmes tended to obtain better student evaluations (Ho, Watkins, and Kelly 2001; Gibbs
and Coffey 2004) as well as having students with higher levels of satisfaction than those who had not
participated (Trigwell, Caballero Rodríguez, and Han 2012).
Unlike these studies, Stes et al. (2012a) tested multilevel models (called the gross model, the net
model and two interaction with context models) to explore the impact of a one-year teaching develop-
ment programme on students’ learning. Results from the gross model showed non-significant effects
on students learning outcomes (affective learning outcomes, psychomotor learning outcomes, generic
and information skills, and knowledge and subject-specific skills); while the net model found one sig-
nificant effect, which was negative, on scale knowledge and subject-specific skills. They also reported
small negative effects on the affective learning outcomes for both models, and a small negative effect
on the generic and information skills in the net model. Therefore, the experimental group did not present better learning outcomes than the control group in the post-test, as the authors had hypothesised. Results from the first interaction model showed a stronger teacher impact on first-year classes
than on non-first-year classes. However, non-significant differences were found between experimental
and control teachers teaching first-year students. The second interaction model, related to class size,
established that the impact is more positive for teachers in medium or large classes, but few significant
differences were found between experimental and control groups.
The same authors (Stes et al. 2012b) investigated associations between teacher participation in development courses and student perceptions of the learning environment, testing the same models. In this case, the gross model showed non-significant effects on student perceptions of the learning environment. The net model showed a significant effect only for the teaching for understanding scale, and this was negative. The interaction models showed only two significant effects: a significant post-test effect for non-first-year students on the interest and enjoyment scale, and a negative post-test effect in large classes on the student support scale. Finally, Stes, Coertjens, and Van Petegem (2013) investigated the effect of
participation in a teaching development programme on student perceptions of teaching and changes
in teacher behaviour, as perceived by students, after the end of the programme. No significant differ-
ences were found on student perceptions of the learning and teaching contexts, or on perceptions of
teaching behaviour.
It is important to acknowledge that methodological problems have been identified in studies on the impact of university teachers' participation in university teaching diplomas (Gibbs and Coffey 2004;
Postareff, Lindblom-Ylänne, and Nevgi 2007; Stes et al. 2010). Several authors agree that studies need
to consider more variables and use more sophisticated methods of analysis (Stes et al. 2010; Trigwell,
Ashwin, and Millan 2013). In the first place, regarding the incorporation of more variables, research has
focused both on the implications that teaching development programmes have for university teaching
(e.g. Postareff, Lindblom-Ylänne, and Nevgi 2007, 2008) and on student experiences of learning (e.g.
Gibbs and Coffey 2004). However, studies that analyse the relationships between learning and teaching,
and consider a wider range of variables, are scarce (Stes, Coertjens, and Van Petegem 2013).
Considering only a few variables, or not controlling for them, is a major issue in some of the studies
we reviewed. In relation to the analysis methods used so far, some studies have examined the data
using, for example, t-tests (e.g. Gibbs and Coffey 2004) or analysis of variance (e.g. Postareff, Lindblom-
Ylänne, and Nevgi 2007). These analysis methods work with only a few variables, and do not establish
the relationships between student and teacher answers. Therefore, using methods that appropriately
consider these relationships, and use multiple sources of information for the evaluation of these pro-
grammes, is needed. In this respect, studies by Stes et al. (2012a, 2012b) are in the right direction. It
is striking, however, that these investigations found limited or no evidence of the impact of teaching
development programmes. Although the authors attribute this lack of effect to the short timeframe of the programmes (not allowing enough time to 'capture' an effect), to voluntary participation (which implies an intrinsic motivation for improving teaching) and to the low number of participants (Stes et al. 2012a, 2012b; Stes, Coertjens, and Van Petegem 2013), the results suggest that the more complex the analysis, the harder it is to find an effect on student learning.
In summary, research has shown that participation in university teaching diplomas promotes teach-
ers’ adoption of student-centred approaches, and that, on the side of student learning, research reports
both positive and no or limited effects. At the same time, research reporting no or limited effect has
used more sophisticated methods of analysis, which questions to some extent the previous studies
that use less sophisticated analysis. This contrasting evidence leaves important questions on the impact
of university teaching diplomas unanswered, at least for the student learning experience. Given the
increasing pressures to provide evidence that these programmes promote better teaching and learn-
ing (Brew 2007), and the inconclusive evidence found in the literature, it is both timely and relevant to
conduct research that helps unveil whether they do have an impact.
This study aims to provide further evidence towards answering this question, by investigating the
impact of a University Teaching Diploma on student approaches to studying and learning experience.
In so doing, we followed suggestions by Stes et al. (2010) and Trigwell, Ashwin, and Millan (2013): we incorporated variables not considered in previous studies and used an analysis strategy that considers the nested nature of this phenomenon. At the same time, we provide evidence from a country
– Chile – where no previous studies in this area have been carried out.

Methodology
The present study investigates the relative impact of university teacher participation in a university
teaching diploma on student approaches to studying and perceived learning experience. Other varia-
bles are also included to evaluate this impact (for students: gender, age, perceived workload, perceived
course relevance, parental responsibilities, work and self-efficacy beliefs; for teachers: discipline, gender,
age, academic degrees, experience, previous training in university teaching, motivation with teaching
development, type of contract and course size). These were selected based upon previous research into
the broader area of learning and teaching in higher education (e.g. Rosario et al. 2013). In this manner,
we would be able to determine the specific effect of participation in the university teaching diploma on
student approaches to studying and learning experience. A quasi-experimental and multilevel design
was employed. This was considered appropriate for this study, as it facilitates examining relationships
between selected variables considering the nested or hierarchical nature of the data (two levels in this
case: university teachers and students) (Snijders and Bosker 1999; Kline 2010). The research questions
are the following:

(1) What is the relative impact of teacher participation in a University Teaching Diploma on student
approaches to studying?
(2) What is the relative impact of teacher participation in a University Teaching Diploma on student
perceived learning experience?

The University Teaching Diploma at the University of Santiago


The University Teaching Diploma at the University of Santiago started in 2009. It aims to promote critical reflection in teachers on their practice and provide them with strategies and tools for developing a
student-centred pedagogy. The programme may be completed in one or two years, depending on the
time and dedication of each participant. Participation is voluntary for all university teachers, except for
those who have entered the University from 2009, for whom it is compulsory. At the time the data for
this study was collected, 509 teachers had finished the diploma.
The University Teaching Diploma is organised into four modules:

(1) curricular design;
(2) students’ learning assessment;
(3) ICT integration; and


(4) active learning methods and reflective practice.

The diploma starts with the modules on curricular design and learning assessment. Then, more advanced modules are conducted on ICT integration and university teachers' reflection on their own practice. By the end of the programme, participants must have completed 160 hours of face-to-face attendance.

Participants
Forty-four university teachers agreed to participate in this study. Twenty-five of them had participated
in the University Teaching Diploma and nineteen had not; 56% were males and 44% females. The age
mean was 48.7 years (SD = 13.11). In total, 686 students answered the questionnaires: 373 were partici-
pating in courses of diploma teachers, while 313 were participating in courses with teachers not taking
the diploma; 53% were males and 47% females. Mean age was 22 years (SD = 3.57). Table 1 provides
demographic information for the teachers participating in this study.

Instruments
Well known instruments from the student learning research tradition were employed. For students, the
Course Experience Questionnaire (CEQ) and the Study Process Questionnaire (SPQ) were used; and for
teachers the Approaches to Teaching Inventory (ATI) was employed. Questions related to contextual
variables were incorporated. All these instruments were validated prior to being used in this study
(Marchant, Fauré, and Abricot 2016; Marchant 2017).
The CEQ, SPQ and ATI were translated and compared with previous versions produced in Chile
(González et al. 2011; Montenegro and González 2013). Then, qualified experts analysed them to reduce
bias on linguistic, psychological and cultural differences. Experts also judged questions for contex-
tual variables. The process was carried out according to guidelines proposed by Muñiz, Elosua, and
Hambleton (2013).
The instrument for gathering student data included a shortened version of the CEQ, the SPQ and
questions for contextual variables.

Table 1. Distribution of UTD and non-UTD teachers according to their academic degree, other teaching commitments, gender and teaching experience.

Teachers                    UTD teachers (experimental)   Non-UTD teachers (control)   Total
Academic degree
  Professional degree       7                             8                            15
  Master                    10                            5                            15
  PhD                       8                             6                            14
  Total                     25                            19                           44
Teaches somewhere else
  No                        19                            11                           30
  Yes                       6                             8                            14
  Total                     25                            19                           44
Gender
  Male                      16                            12                           28
  Female                    9                             7                            16
  Total                     25                            19                           44
Teaching experience
  Less than 5 years         0                             4                            4
  Between 6 and 10 years    10                            2                            12
  More than 10 years        15                            13                           28
  Total                     25                            19                           44
The CEQ version we employed is the shortened version reported by Webster et al. (2009). This is com-
posed of 17 five-point Likert items, grouped into four scales: good teaching, clear goals and standards,
appropriate assessment and appropriate workload. Confirmatory factor analysis was carried out for
each scale; the scales were considered independent, following Richardson (2006). Goodness of fit was
appropriate: good teaching, χ2 (5.36, N = 668) = 100.38, p = .084, CFI = 0.997, TLI = .985, SRMR = .011; clear
goals and standards, χ2 (483.4, N = 657) = 855.84, p = .061, CFI = .975, TLI = .951, SRMR = .038; appropri-
ate workload, χ2 (586.3, N = 569) = 980.31, p = .075, CFI = .971, TLI = .949, SRMR = .039, and appropriate
assessment, χ2 (1994, N = 579) = 522.72, p = .053, CFI = .985, TLI = .956, SRMR = .024. Cronbach’s alpha
ranged from acceptable to very good (good teaching, α = .8585; clear goals and standards, α = .7408;
appropriate workload, α = .7817; appropriate evaluation, α = .6977).
The SPQ questionnaire consists of 20 five-point Likert items, grouped into two scales: deep and sur-
face learning. The data fitted the model well: χ2 (1821, N = 575) = 1450.89, p = .086, CFI = .965, TLI = .939,
SRMR = .041. Very good Cronbach’s α were found (deep learning, α = .84; surface learning, α = .80).
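Internal consistency figures like these can be reproduced mechanically. Below is a minimal sketch of Cronbach's alpha in Python; the synthetic five-point Likert data, seed and noise level are illustrative assumptions, not the study's responses:

```python
import numpy as np

def cronbach_alpha(items: np.ndarray) -> float:
    """Cronbach's alpha for a (respondents x items) score matrix."""
    k = items.shape[1]
    item_vars = items.var(axis=0, ddof=1)      # variance of each item
    total_var = items.sum(axis=1).var(ddof=1)  # variance of the summed scale
    return (k / (k - 1)) * (1 - item_vars.sum() / total_var)

# Synthetic 4-item, five-point Likert scale driven by one shared trait,
# so the items cohere and alpha should land well above the .70 convention
rng = np.random.default_rng(0)
latent = rng.normal(size=(200, 1))
items = np.clip(np.round(3 + latent + rng.normal(scale=0.8, size=(200, 4))), 1, 5)
alpha = cronbach_alpha(items)
```

Applied to the real CEQ, SPQ and ATI response matrices, the same computation yields the scale alphas reported in this section.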
Contextual variables included were: socio-demographic (gender and age), perceived workload, per-
ceived course relevance (for the semester curriculum and personal) and self-efficacy beliefs (adapted
from Richardson 2006). Also, questions about whether students cared for their children and whether they were studying full-time were incorporated.
In the case of university teachers, the instrument used included the ATI as well as contextual var-
iables. The ATI is a scale composed of 22 five-point Likert items, grouped into two sub-scales: conceptual change/student-focused (CCSF) and information transmission/teacher-focused (ITTF). The data fitted
the model well (CFI = .998; TLI = .995; SRMR = .027; RMSEA = .023). Acceptable Cronbach’s alphas were
obtained (CCSF, α = .70; ITTF, α = .85).
The contextual variables included discipline, socio-demographic (gender, age, academic degrees)
and work-related elements (experience, working full or part-time in the institution, course size). Self-
efficacy beliefs items (related to confidence in content knowledge, confidence in students’ learning and
confidence in pedagogical knowledge; Lindblom-Ylänne et al. 2006) were included. Table 2 presents
scales from questionnaires employed with a representative item for each of them.
Students and teachers answered the instruments during their regular class time within the classroom.
In all cases, one member of the research team was in the classroom where the instruments were being
answered. In this way, the team member communicated the study objectives as well as the voluntary
and anonymous character of participation both verbally and in writing.

Analysis
Multilevel hierarchical regressions represent an analysis sensitive to the nested or hierarchical nature of
educational phenomena. As stated before, there are few studies investigating the impact of university
teaching diplomas that use this type of analysis (e.g. Stes et al. 2012a, 2012b). We employed multilevel

Table 2. Representative items from each questionnaire's scales.

Questionnaire and scale Representative item


CEQ
Good teaching The teacher works hard to make his/her subject interesting
Clear goals and standards I have usually had a clear idea of where I am going and what is expected of me
in this course
Appropriate assessment To do well in this course all you really need is a good memory
Appropriate workload I am generally given enough time to understand the things I have to learn
SPQ
Deep learning I work hard at my studies because I find the material interesting
Surface learning I see no point in learning material which is not likely to be in the examination
ATI
Conceptual change/student-focused In teaching sessions, I deliberately provoke debate and discussion
Information transmission/teacher-focused In this subject, my teaching focuses on the good presentation of information to
students
hierarchical regressions (Kline 2010) to calculate the relative impact of a number of variables on stu-
dent approaches to studying and learning experiences. Variables were organised in two levels: those
associated with university teachers and those associated with groups of students (Snijders and Bosker
1999). Figure 1 represents them.
In order to understand the impact of these variables on student approaches to studying and learn-
ing experience, six random intercept and fixed slope models were calculated, using the maximum
likelihood method (Fielding and Goldstein 2006). One model was built for each SPQ scale (deep and
surface learning) and one for each CEQ scale (good teaching, clear goals and standards, appropriate
assessment and appropriate workload).
The tested models are the following:
SPQ_ij = β00 + β10·CEQ_ij + β20·S_ij + β30·F_j + β40·ATI_j + β50·X_j + β60·I_j + u_0j + ϵ_ij

CEQ_ij = δ00 + δ10·SPQ_ij + δ20·S_ij + δ30·F_j + δ40·ATI_j + δ50·X_j + δ60·I_j + υ_0j + ε_ij

where SPQ_ij corresponds to each approach-to-studying scale for student i in course j; CEQ_ij represents the learning experience of student i in course j and includes the four CEQ scales; S_ij corresponds to the vector of features of student i in course j, which includes self-efficacy, course relevance, workload, sex and age; F_j represents whether the teacher of course j finished the University Teaching Diploma; ATI_j represents the approaches to teaching of the teacher of course j and includes its two scales; X_j corresponds to the vector of context and teacher features for course j, which includes self-efficacy, motivation to improve teaching skills, previous university teaching qualification, gender, age, disciplinary area and course size; I_j corresponds to the vector of interaction effects of participation in the University Teaching Diploma with the ATI scales and with the self-efficacy and motivation-to-improve-teaching-skills variables; u_0j and υ_0j correspond to the course-level random components (random intercepts); finally, ϵ_ij and ε_ij correspond to the student-level random components (errors).

Figure 1. Multilevel analysis model.
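The two-level structure these equations encode can be illustrated with a short simulation. The sketch below generates SPQ-deep-style scores for students nested in courses, with a course-level random intercept u_0j and a diploma indicator F_j; every numeric value (effect size, variances, group sizes) is an illustrative assumption, not an estimate from this study:

```python
import numpy as np

rng = np.random.default_rng(42)

n_teachers, n_students = 44, 16           # roughly the study's sample structure
diploma = np.repeat([1, 0], [25, 19])     # F_j: 25 UTD teachers, 19 controls

u0 = rng.normal(scale=0.5, size=n_teachers)  # course-level random intercepts u_0j
beta_diploma = 0.8                           # assumed (illustrative) UTD effect

# Score of student i in course j: intercept + diploma effect + u_0j + eps_ij
spq_deep = (3.0 + beta_diploma * diploma[:, None]
            + u0[:, None]
            + rng.normal(scale=1.0, size=(n_teachers, n_students)))

# Naive group contrast; a multilevel model would weight courses properly
gap = spq_deep[diploma == 1].mean() - spq_deep[diploma == 0].mean()
```

A mixed-model routine (e.g. statsmodels' MixedLM) would recover the coefficient and the two variance components from such data; the naive contrast above only shows the direction of the simulated effect.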

Results
Impact of the University Teaching Diploma on student approaches to studying
Table 3 presents the complete models obtained from calculating the University Teaching Diploma’s
impact on student approaches to studying.
In the first place, the results show that university teachers who completed the diploma had students with higher scores in the deep approach scale and lower scores in the surface approach scale (β(SPQ deep) = 2.3947, p < .05; β(SPQ surface) = −3.3571, p < .05). Moreover, results show that completing the diploma, compared to other variables such as age, sex or previous formation, has the highest load as a predictor of the SPQ scale scores.

Table 3. Standardised multilevel linear regression coefficients for SPQ scales.

                                                       β(SPQ deep)   β(SPQ surface)
Fixed effects
Level 1
  CEQ good teaching                                    .2912*        −.1175*
  CEQ clear goals and standards                        .0799*        .0227
  CEQ appropriate evaluation                           −.0352        −.2946*
  CEQ appropriate workload                             −.0285        −.0295
  Student self-efficacy                                .0740*        −.0476*
  Course relevance (for semester curriculum)           .1173*        −.0544
  Course relevance (personal)                          .3103*        −.2969*
  Cares for his/her children                           .1204*        −.0311
  Full-time study                                      −.0061        −.0608
  Gender                                               .0953*        −.1908*
  Age                                                  .0121*        −.0061
Level 2
  UTD participation (completed)                        2.3947*       −3.3571*
  ATI CCSF                                             .0443         −.0157
  ATI ITTF                                             .0471         −.0454*
  Self-efficacy (confidence in content knowledge)      −.0614        .0478
  Self-efficacy (confidence in students' learning)     .0049         −.0079
  Self-efficacy (confidence in pedagogical knowledge)  −.0339        .0686
  Motivation to improve teaching skills                −.0797        .0284
  Undergraduate degree in education                    −.3399*       .1137
  Postgraduate degree in education                     .4741*        −.3505*
  Previous university teaching qualification           .0304         −.1104
  Gender                                               −.0649        −.0566
  Age                                                  −.0021        −.0048*
  Disciplinary area                                    .1423         −.1227*
  Course size                                          −.0024        .0009
Random effects
  Within course variance                               .0738         .0450*
  Residual variance                                    .4380         .4430*
N                                                      635           635
Courses                                                40            40
Log pseudolikelihood                                   −384.08       −387.08

Notes: Models were calculated with a constant, but this is omitted in the results. Robust standard errors are also omitted.
*p < .05.
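One way to read the random-effects rows of Table 3 is through the intraclass correlation. Assuming the "within course variance" row is the course-level (between-group) component and the residual is the student-level component — the table's labels are ambiguous, so this is an interpretation, not the authors' own computation — the ICC for the deep approach model works out as:

```python
# Variance components from Table 3, SPQ deep column (interpretation assumed)
between_course = 0.0738   # course-level variance
residual = 0.4380         # student-level (residual) variance

# ICC: share of SPQ-deep variance lying between courses rather than within them
icc = between_course / (between_course + residual)
print(round(icc, 3))
```

An ICC in this range would indicate that a modest but non-trivial share of the variation in deep-approach scores sits at the course level, which is what motivates the multilevel specification.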
In the second place, the CEQ scales also predicted SPQ scale scores. Those students with higher scores
in the deep approach scale tended to present higher scores in the good teaching (β(SPQ deep) = .2912,
p < .05) and clear goals and objectives (β(SPQ deep) = .0799, p < .05) scales. At the same time, students
who presented higher scores in the surface approach scale tended to have lower scores in the good
teaching (β(SPQ surface) = −.1175, p < .05) and appropriate evaluation (β(SPQ surface) = −.2946, p < .05) scales.
Finally, when considering the other variables included in the multilevel models, some important associations can be seen. At the level of the students, gender (β(SPQ deep) = .0953, p < .05; β(SPQ surface) = −.1908, p < .05), self-efficacy (β(SPQ deep) = .0740, p < .05; β(SPQ surface) = −.0476, p < .05) and course relevance (personal) (β(SPQ deep) = .3103, p < .05; β(SPQ surface) = −.2969, p < .05) had a significant predictive effect on both SPQ scales. At the same time, age (β(SPQ deep) = .0121, p < .05), course relevance (for semester curriculum) (β(SPQ deep) = .1173, p < .05) and caring for his/her children (β(SPQ deep) = .1204, p < .05) were positively significant for the deep approach scale. On the other hand, at the level of the university teachers, postgraduate formation in education (β(SPQ deep) = .4741, p < .05; β(SPQ surface) = −.3505, p < .05) showed a significant predictive impact on both SPQ scales. The teaching-centred scale of the ATI (β(SPQ surface) = −.0454, p < .05), age (β(SPQ surface) = −.0048, p < .05) and disciplinary area (β(SPQ surface) = −.1227, p < .05) were negatively significant for the surface approach scale. It is worth highlighting that university teachers whose first degrees were in school teaching tended to promote the deep learning approach less strongly in their students (β(SPQ deep) = −.3399, p < .05).
Therefore, in relation to the first question, our results show that university teacher participation in a University Teaching Diploma has a positive impact on student adoption of deep approaches to studying, and that this is the variable with the greatest explanatory power.
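The structure of the models behind these coefficients can be illustrated with a short sketch. The code below is not the authors' analysis code (which is not provided); it is a minimal, hypothetical example on synthetic data, using Python's statsmodels to fit a two-level random-intercept model in which students (level 1) are nested within courses whose teachers did or did not complete the diploma (level 2). All variable names and effect sizes here are invented for illustration.

```python
# Illustrative only: synthetic data mimicking the nesting of students in
# courses; NOT the study's data or the authors' code.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)
n_courses, n_students = 40, 16              # 40 courses, as in the study
course = np.repeat(np.arange(n_courses), n_students)

# Level-2 (teacher/course) predictor: diploma completed or not (hypothetical)
diploma = np.repeat(rng.integers(0, 2, n_courses), n_students)
# Level-1 (student) predictor: standardised self-efficacy (hypothetical)
self_eff = rng.standard_normal(n_courses * n_students)

# Simulate a positive diploma effect plus a random course intercept
u = np.repeat(rng.normal(0, 0.4, n_courses), n_students)
spq_deep = (0.5 * diploma + 0.1 * self_eff + u
            + rng.normal(0, 0.7, n_courses * n_students))

df = pd.DataFrame({"spq_deep": spq_deep, "diploma": diploma,
                   "self_eff": self_eff, "course": course})

# A random intercept per course captures the two-level structure
result = smf.mixedlm("spq_deep ~ diploma + self_eff",
                     df, groups=df["course"]).fit()
print(result.params["diploma"])   # recovers a positive diploma effect
```

A model of this general form, with student covariates at level 1 and teacher covariates at level 2, corresponds to the kind of specification whose standardised coefficients the tables report.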

Impact of the University Teaching Diploma on student learning experience


The complete models calculated for CEQ scales are presented in Table 4.
Results show that there are no associations between having completed the diploma and the CEQ scales: good teaching (β(CEQ good teaching) = −2.0447; p > .05), clear goals and objectives (β(CEQ clear goals and objectives) = −1.927; p > .05), appropriate evaluation (β(CEQ appropriate evaluation) = −2.4381; p > .05) or appropriate workload (β(CEQ appropriate workload) = −1.803; p > .05).
On the other hand, we found that the SPQ scales predict CEQ scores. Higher scores on the deep approach scale predicted higher scores on the good teaching (β(CEQ good teaching) = .4327; p < .05) and clear goals and objectives (β(CEQ clear goals and objectives) = .3046; p < .05) scales. At the same time, an inverse relation was found between the deep approach and appropriate evaluation scales (β(CEQ appropriate evaluation) = −.2876; p < .05): those who score higher on the deep approach scale tend to consider the evaluation inappropriate. Conversely, a higher score on the surface approach scale predicts lower scores on the appropriate evaluation (β(CEQ appropriate evaluation) = −.7493; p < .05) and appropriate workload (β(CEQ appropriate workload) = −.143; p < .05) scales, which means that those who tend to adopt surface approaches to learning perceive the evaluation as inappropriate and the workload as too high.
It is also relevant to consider other variables with significant results. At the level of the students, self-efficacy had a positive predictive effect on both the clear goals and objectives and appropriate workload scales (β(CEQ clear goals and objectives) = .0843; p < .05; β(CEQ appropriate workload) = .093; p < .05); course relevance (for semester curriculum) had a positive predictive effect on the appropriate evaluation scale (β(CEQ appropriate evaluation) = .1964; p < .05) and a negative one on the appropriate workload scale (β(CEQ appropriate workload) = −.3007; p < .05); and course relevance (personal) had a positive predictive effect on the clear goals and objectives scale (β(CEQ clear goals and objectives) = .1385; p < .05). At the level of the teachers, motivation to improve teaching skills had a negative predictive effect on the good teaching, appropriate evaluation and appropriate workload scales (β(CEQ good teaching) = −.4932; p < .05; β(CEQ appropriate evaluation) = −.34; p < .05; β(CEQ appropriate workload) = −.5402; p < .05); course size had a negative predictive effect on the good teaching and appropriate workload scales (β(CEQ good teaching) = −.0133; p < .05; β(CEQ appropriate workload) = −.02; p < .05); teacher age had a significant negative predictive effect on appropriate evaluation

Table 4. Standardised multilevel linear regression coefficients for CEQ scales.

                                                      β(CEQ Good teaching)   β(CEQ Clear goals and standards)   β(CEQ appropriate evaluation)   β(CEQ appropriate workload)
Fixed effects
Level 1
 SPQ deep                                                  .4327*                 .3046*                            −.2876*                         .0148
 SPQ surface                                               −.0653                 −.0328                            −.7493*                         −.143*
 Student self-efficacy                                     .0316                  .0843*                            .017                            .093*
 Course relevance (for semester curriculum)                −.0225                 −.0913                            .1964*                          −.3007*
 Course relevance (personal)                               .102                   .1385*                            .1041                           .0724
 Cares for his/her children                                −.1124                 −.0614                            −.0711                          .0552
 Full-time study                                           .0175                  .0637                             .0163                           .0183
 Gender                                                    −.0553                 .0029                             −.0015                          −.0244
 Age                                                       .002                   −.0011                            −.0088                          −.0012
Level 2
 UTD participation (completed)                             −2.0447                −1.927                            −2.4381                         −1.803
 ATI CCSF                                                  .378                   −.0989                            .2319                           .472
 ATI ITTF                                                  −.0858                 −.1413                            −.3225*                         −.1879
 Self-efficacy (confidence in content knowledge)           −.1                    −.0294                            −.0004                          −.1008
 Self-efficacy (confidence in students’ learning)          0                      .087                              −.0767                          .1067
 Self-efficacy (confidence in pedagogical knowledge)       .2057                  .0973                             .2902*                          −.0197
 Motivation to improve teaching skills                     −.4932*                −.2074                            −.34*                           −.5402*
 Undergraduate degree in education                         −.3302                 .0851                             −.2195                          −.7522
 Postgraduate degree in education                          .2737                  −.052                             −.075                           .5208
 Previous university teaching qualification                −.1117                 −.3093                            .0437                           −.2003
 Gender                                                    .1926                  −.0125                            −.0858                          .0774
 Age                                                       −.0035                 −.0008                            −.0115*                         −.0013
 Disciplinary area                                         .0939                  .0313                             .2427                           .1286
 Course size                                               −.0133*                −.0049                            −.005                           −.02*
Random effects
 Within course variance                                    .2609*                 .1432*                            .1505*                          .3123*
 Residual variance                                         .5283*                 .5452*                            .6093*                          .6268*
N                                                          635                    635                               635                             635
Courses                                                    40                     40                                40                              40
Log pseudolikelihood                                       −526.26                −530.10                           −599.51                         −635.15

Notes: Models were calculated with a constant, but this is omitted in the results. Robust standard errors are also omitted.
*p < .05.

(β(CEQ appropriate evaluation) = −.0115; p < .05); and confidence in pedagogical knowledge had a positive predictive effect on the appropriate evaluation scale (β(CEQ appropriate evaluation) = .2902; p < .05).
Therefore, regarding our second question, the answer is negative: university teachers’ participation in a University Teaching Diploma does not have an impact on students’ perceived learning experience.

Discussion and conclusions


In this study, we investigated the impact of a university teaching diploma on student approaches to studying and learning experiences. We found that: (1) the diploma had a positive and direct effect on student approaches to studying: university teachers who completed the diploma promoted deeper approaches to studying in their students and decreased the use of surface approaches; moreover, the diploma’s effect on the adoption of a deep approach to studying is quantitatively larger than that of any other variable included in the analysis; and (2) university teachers’ participation in the diploma did not have a significant impact on students’ perceived learning experiences (understood as perceptions of good teaching, clear goals and objectives, appropriate evaluation and appropriate workload).
These results on approaches to studying are consistent with those reported in previous research by Ho, Watkins, and Kelly (2001) and Gibbs and Coffey (2004), who found that teachers who participated in teaching development were more likely to have students adopting deeper approaches to studying. On the other hand, our results are not aligned with those of Ho, Watkins, and Kelly (2001), Gibbs and Coffey (2004) and Trigwell, Caballero Rodríguez, and Han (2012), who found that participation in a teaching development programme had a positive impact on one or more elements of the student learning experience. Our findings are, however, aligned with those of Stes, Coertjens, and Van Petegem (2013), who found a positive but non-significant impact on the learning experience. In summary, using a multilevel approach, we found results on student approaches to studying similar to those of studies that used less sophisticated analysis techniques (Ho, Watkins, and Kelly 2001; Gibbs and Coffey 2004), and results on the learning experience similar to those of a previous study using a multilevel approach (Stes, Coertjens, and Van Petegem 2013), but different from studies using less sophisticated techniques (Ho, Watkins, and Kelly 2001; Gibbs and Coffey 2004; Trigwell, Caballero Rodríguez, and Han 2012).
It is important to acknowledge that the lack of impact on student perceptions of the learning experience may be explained by the indirect nature of the relationship. The University Teaching Diploma’s effect may be masked by other variables, such as approaches to studying, particularly considering that the approaches to studying scales do predict the learning experience scales.
The present study is not without its limitations. First, the study was conducted in only one university and with students from a relatively small number of courses; therefore, we do not claim that its results are generalisable. Rather, we see our study as providing further evidence on the impact of teaching development on student approaches to studying and learning experiences, and as proposing a more complex way of investigating this phenomenon. Second, we acknowledge that the use of contrast groups constructed post hoc is problematic: quasi-experiments do not allow causes to be established, owing to the absence of a randomly generated contrast group (Lacave, Molina, and del Castillo 2014). That said, we believe this is the most appropriate approach for phenomena that occur in real contexts. The issues inherent in the design stem from the nature of the object of study and do not prevent the evaluation method we employed from being valuable. This method can be replicated, and in doing so improved, to evaluate other teaching development initiatives.
Thus, the results presented in this article have implications for further research and for practice. Future studies will need to provide further evidence on the impact of teaching development programmes. In particular, it is important to conduct research that incorporates a broader range of variables. We argued that student perceptions are influenced by diverse variables, such as age or gender, which encouraged us to incorporate those variables when exploring the specific effect of the diploma. Other variables were incorporated at both the level of students (e.g. self-efficacy or the personal relevance of the course) and of teachers (e.g. motivation, previous training, disciplinary area or course size). We suggest that future research continue investigating the specific effect these variables have, together with participation in teaching development programmes, on student approaches to studying and learning experience. At the same time, we contend that analyses that consider the relationships between teachers’ and students’ answers (multilevel modelling and structural equation modelling) are appropriate for achieving this aim.
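One concrete reason such multilevel analyses are appropriate can be sketched briefly: when students share a course, their answers are correlated, and the intraclass correlation (ICC) quantifies how much of the outcome variance lies between courses. The following is a hypothetical illustration on synthetic data (not drawn from this study), using Python's statsmodels; all numbers are invented.

```python
# Hypothetical illustration: estimating the intraclass correlation (ICC)
# from an intercept-only multilevel model on synthetic data.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(1)
n_courses, n_students = 40, 16
course = np.repeat(np.arange(n_courses), n_students)

# Course-level intercepts (sd 0.5) plus student-level noise (sd 1.0)
u = np.repeat(rng.normal(0, 0.5, n_courses), n_students)
y = 3.0 + u + rng.normal(0, 1.0, n_courses * n_students)
df = pd.DataFrame({"y": y, "course": course})

# Null model: no predictors, only a random course intercept
res = smf.mixedlm("y ~ 1", df, groups=df["course"]).fit()
between = res.cov_re.iloc[0, 0]   # between-course variance
within = res.scale                # residual (student-level) variance
icc = between / (between + within)
print(round(icc, 2))              # true value here is 0.25/1.25 = 0.20
```

A non-trivial ICC of this kind is what makes ordinary single-level regression yield misleadingly small standard errors, and hence why teachers' and students' answers need to be modelled jointly at two levels.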
Results from this study also have practical implications. In a context of increasingly limited funding and competing demands within universities, centres for teaching development around the world are expected to demonstrate evidence of the effectiveness of their development programmes. This is the case in Chile, where these centres must compete within their universities and, at least for those initially funded by the government, demonstrate their worth to external agencies. Results from this study provide evidence that teacher participation in a university teaching diploma does have an impact on student approaches to studying. They demonstrate that, at least in the case of the University of Santiago’s University Teaching Diploma, conducting teaching development programmes is worthwhile. Moreover, this research may provide a means of evaluating such programmes in other universities, thus becoming a practical tool for other university teaching diplomas to demonstrate their impact.
In conclusion, we found that teaching development has a positive and direct impact on approaches to studying, but does not produce, at least directly, a significant impact on the student learning experience. We propose that further studies incorporate a wider range of variables and use analyses that capture the relationship between teachers’ and students’ answers. We also suggest that this article offers a model for evaluating other university teaching diplomas in a context of competing demands, where they are constantly expected to demonstrate their value.

Disclosure statement
No potential conflict of interest was reported by the authors.

Notes on contributors
Jorge Marchant, PhD, is Director of Curriculum Development at Universidad Diego Portales. His research interests are learn-
ing and teaching in higher education, curriculum evaluation and teaching development programmes in higher education.
Carlos González, PhD, is an associate professor and Director of Postgraduate Studies at the Faculty of Education, Pontificia
Universidad Católica de Chile. His research interests are learning and teaching in higher education, the student experience
and learning analytics.
Jaime Fauré is an analyst of Teaching Development at Universidad Andres Bello. His research interests are learning trajec-
tories, teaching in higher education and teaching development programmes.

ORCID
Jorge Marchant http://orcid.org/0000-0003-4207-3817
Jaime Fauré http://orcid.org/0000-0001-6644-3339

References
Biggs, J., and C. Tang. 2011. Teaching for Quality Learning at University. Buckingham: Open University Press.
Brew, A. 2007. “Evaluating Academic Development in a Time of Perplexity.” International Journal for Academic Development
12 (2): 69–72. doi:10.1080/13601440701604823.
CNED. 2011. Evolución de la matrícula de Educación Superior 1994–2011: Departamento de Investigación e Información Pública
[Evolution of enrolment in Higher Education 1994–2011. Department of Research and Public Information]. Santiago:
Consejo Nacional de Educación.
CNED. 2014. Índices: Estadísticas y Bases de Datos. Retrieved May 28, 2014, from http://www.cned.cl/public/Secciones/
SeccionIndicesEstadisticas/indices_estadisticas_sistema.aspx
Entwistle, N. 2000. “Promoting Deep Learning through Teaching and Assessment: Conceptual Frameworks and Educational Contexts.” Paper presented at the TLRP Conference, University of Leicester, England.
Fielding, A., and H. Goldstein. 2006. Cross-classified and Multiple Membership Structures in Multilevel Models: An Introduction
and Review. London: Research Report 791 for DfES.
Gibbs, G., and M. Coffey. 2004. “The Impact of Training of University Teachers on Their Teaching Skills, Their Approach
to Teaching and the Approach to Learning of Their Students.” Active Learning in Higher Education 5 (1): 87–100.
doi:10.1177/1469787404040463.
González, C., H. Montenegro, A. López, I. Munita, and P. Collao. 2011. “Relación entre la Experiencia de Aprendizaje de
estudiantes universitarios y la docencia de sus profesores.” Calidad en la Educación 35: 21–49. doi:10.4067/S0718-
45652011000200002.
Hanbury, A., M. Prosser, and M. Rickinson. 2008. “The Differential Impact of UK Accredited Teaching Development Programmes
on Academics’ Approaches to Teaching.” Studies in Higher Education 33 (4): 469–483. doi:10.1080/03075070802211844.

Ho, A., D. Watkins, and M. Kelly. 2001. “The Conceptual Change Approach to Improving Teaching and Learning: An Evaluation
of a Hong Kong Staff Development Programme.” Higher Education 42 (2): 143–169. doi:10.1023/A:1017546216800.
Kline, R. 2010. Principles and Practice of Structural Equation Modelling. New York: Guilford Press.
Lacave, C., A. Molina, and E. del Castillo. 2014. “Evaluación de una innovación docente a través de un diseño estadístico
cuasi-experimental: aplicación al aprendizaje de la recursividad.” Calidad y evaluación de la docencia 20 (3): 159–166.
Lindblom-Ylänne, S., K. Trigwell, A. Nevgi, and P. Ashwin. 2006. “How Approaches to Teaching Are Affected by Discipline
and Teaching Context.” Studies in Higher Education 31 (3): 285–298. doi:10.1080/03075070600680539.
Marchant, J. 2017. “La formación en docencia universitaria en Chile y su impacto en profesores y estudiantes.” PhD diss., Leiden University, The Netherlands. https://openaccess.leidenuniv.nl/handle/1887/46488
Marchant, J., J. Fauré, and N. Abricot. 2016. “Preliminary Adaptation and Validation of the SPQ and the CEQ for the Study of Teaching Development Programmes in a Chilean University Context.” Psykhe 25 (2): 1–18. doi:10.7764/psykhe.25.2.873.
Montenegro, H., and C. González. 2013. “Análisis factorial confirmatorio del cuestionario “Enfoques de Docencia
Universitaria” (Approaches to Teaching Inventory, ATI-R).” Estudios Pedagógicos 39 (2): 213–230. doi:10.4067/S0718-
07052013000200014.
Muñiz, J., P. Elosua, and R. Hambleton. 2013. “Directrices para la Traducción y Adaptación de los tests: segunda edición.” Psicothema 25 (2): 151–157.
OECD. 2009. La Educación Superior en Chile. París: Organización para la Cooperación y el Desarrollo Económico.
OECD. 2013. Revisión de Políticas Nacionales de Educación: El Aseguramiento de la Calidad en la Educación Superior en Chile
2013. Santiago: OECD Publishing.
Postareff, L., S. Lindblom-Ylänne, and A. Nevgi. 2007. “The Effect of Pedagogical Training on Teaching in Higher Education.” Teaching and Teacher Education 23 (5): 557–571. doi:10.1016/j.tate.2006.11.013.
Postareff, L., S. Lindblom-Ylänne, and A. Nevgi. 2008. “A Follow-up Study of the Effect of Pedagogical Training on Teaching
in Higher Education.” Higher Education 56 (1): 29–43. doi:10.1007/s10734-007-9087-z.
Ramsden, P. 1998. “Managing the Effective University.” Higher Education Research and Development 17 (3): 347–370.
doi:10.1080/0729436980170307.
Ramsden, P. 2003. Learning to Teach in Higher Education. London: Routledge Falmer.
Richardson, J. T. 2006. “Investigating the Relationship between Variations in Students’ Perceptions of Their Academic
Environment and Variations in Study Behaviour in Distance Education.” British Journal of Educational Psychology 76 (4):
867–893. doi:10.1348/000709905X69690.
Rosario, P., J. Núñez, A. Valle, O. Paiva, and S. Polydoro. 2013. “Approaches to Teaching in High School When Considering
Contextual Variables and Teacher Variables.” Revista de Psicodidáctica 18 (1): 25–45. doi:10.1387/RevPsicodidact.6215.
Snijders, T., and R. Bosker. 1999. Multilevel Analysis: An Introduction to Basic and Advanced Multilevel Modelling. London: Sage.
Stes, A., L. Coertjens, and P. Van Petegem. 2013. “Instructional Development in Higher Education: Impact on Teachers’
Teaching Behavior as Perceived by Student.” Instructional Science 41 (6): 1103–1126. doi:10.1007/s11251-013-9267-4.
Stes, A., S. De Maeyer, D. Gijbels, and P. Van Petegem. 2012a. “Instructional Development for Teachers in Higher Education:
Effects on Students’ Learning Outcomes.” Teaching in Higher Education 17 (3): 295–308. doi:10.1080/13562517.2011.6
11872.
Stes, A., S. De Maeyer, D. Gijbels, and P. Van Petegem. 2012b. “Instructional Development for Teachers in Higher Education:
Effects on Students’ Perceptions of the Teaching–Learning Environment.” British Journal of Educational Psychology 82
(3): 398–419. doi:10.1111/j.2044-8279.2011.02032.x.
Stes, A., M. Min-Leliveld, D. Gijbels, and P. Van Petegem. 2010. “The Impact of Instructional Development in Higher Education:
The State-of-the-Art of the Research.” Educational Research Review 5 (1): 25–49. doi:10.1016/j.edurev.2009.07.001.
Trigwell, K., P. Ashwin, and E. Millan. 2013. “Evoked Prior Learning Experience and Approach to Learning as Predictors of
Academic Achievement.” British Journal of Educational Psychology 83 (3): 363–378. doi:10.1111/j.2044-8279.2012.02066.x.
Trigwell, K., K. Caballero Rodríguez, and F. Han. 2012. “Assessing the Impact of a University Teaching Development
Programme.” Assessment & Evaluation in Higher Education 37 (4): 499–511. doi:10.1080/02602938.2010.547929.
Trigwell, K., M. Prosser, and F. Waterhouse. 1999. “Relations between Teachers’ Approaches to Teaching and Students’
Approaches to Learning.” Higher Education 37 (1): 57–70. doi:10.1023/A:1003548313194.
Webster, B., W. Chan, M. Prosser, and D. Watkins. 2009. “Undergraduates’ Learning Experience and Learning Process:
Quantitative Evidence from the East.” Higher Education 58: 375–386. doi:10.1007/s10734-009-9200-6.
World Bank. 2011. Educación superior en Iberoamérica: informe 2011. Santiago: RIL.
Zapata, G., and I. Tejeda. 2016. Educación Superior en Chile: Informe Nacional. Santiago: Centro Interuniversitario de Desarrollo.
