To cite this article: Jorge Marchant, Carlos González & Jaime Fauré (2017): The impact
of a university teaching development programme on student approaches to studying and
learning experience: evidence from Chile, Assessment & Evaluation in Higher Education, DOI:
10.1080/02602938.2017.1401041
ABSTRACT
In this paper, we analyse the impact of teacher participation in a University Teaching Diploma on student approaches to studying and learning experience. A quasi-experimental and multilevel design was employed. University teachers answered the Approaches to Teaching Inventory and students completed the Course Experience Questionnaire and the Study Process Questionnaire. In addition, contextual variables were included for both teachers and students. The total sample included 44 teachers and 686 students. Of these, 25 university teachers had completed the University Teaching Diploma and 19 had not; 373 students were in courses with a diploma teacher and 313 in courses with a teacher who had not completed it. Results show that university teachers who completed the programme had, in their courses, students who were more likely to declare having adopted a deep approach to studying than teachers who had not participated in the diploma. At the same time, no significant impact was found on the student learning experience. For practical purposes, this investigation provides evidence for the value of teaching development programmes in promoting deeper approaches to studying. For research purposes, it proposes the use of multilevel models to evaluate the impact of university teaching diplomas.

KEYWORDS
Teaching development programme; evaluation impact; approaches to studying; learning experience
Introduction
In Chile, the quality of university teaching and its impact on the student learning experience have come under scrutiny in recent years. Two phenomena have led to this questioning. In the first place, enrolments have expanded significantly: undergraduate enrolment grew from 149,689 students in 1994 to 1,165,654 in 2015 (CNED 2011, 2014; Zapata and Tejeda 2016), and, currently, 7 out of 10 students
are the first in their families to attend higher education institutions (OECD 2013). Secondly, reports
from international agencies have shown that teaching-centred approaches to teaching are common
in Chilean universities and, at the same time, that it is difficult to engage teachers in implementing
teaching innovations (OECD 2009; World Bank 2011).
Both government agencies and universities have been aware of these problems and have, hence, implemented initiatives to improve the situation. Between 2005 and 2010, a significant number of universities created centres for teaching development. These centres emerged with the aim of supporting university teachers in professionalising their teaching and, in this way, improving the learning experience they provide to their students. During their relatively short existence, the centres have organised a range of professional development activities, such as workshops, class observation and feedback, and university teaching diplomas, the last of these being the most relevant.
The diplomas are quite similar amongst universities. In terms of contents, they include modules on
conceptions of teaching and learning, course design, active learning methods and information and
communication technology integration. Diplomas are relatively long – lasting for one or two years – and
require a significant amount of time and commitment from teachers. Their implementation is relatively
recent: some of them are in their first version, while others are being revised. They are usually mandatory
for university teachers starting their careers and voluntary for others (Marchant 2017).
Despite the importance of these diplomas in promoting better university teaching in Chilean uni-
versities, there has been no formal evaluation to date of their impact. Thus, conducting research that
explores this issue is both timely and relevant. In this article, we present a study that evaluates the
impact on student approaches to studying and learning experience for one of these programmes: the
University Teaching Diploma implemented by the Unit for Educational Innovation at the University of
Santiago, Chile. To this end, a quasi-experimental multilevel design was employed. Our results show
the value of university teaching diplomas: teachers who participated are more likely to have students
reporting they adopt deeper approaches to studying.
Previous research has shown that teachers who adopted student-centred approaches to teaching after participating in development programmes were more likely to have students adopting deep approaches to studying, while those who maintained teacher-centred approaches were more likely to have students adopting surface approaches (Ho, Watkins, and Kelly 2001; Gibbs and Coffey 2004). In addition, teachers who participated in teaching development programmes tended to obtain better student evaluations (Ho, Watkins, and Kelly 2001; Gibbs and Coffey 2004), as well as having students with higher levels of satisfaction than those who had not participated (Trigwell, Caballero Rodríguez, and Han 2012).
Unlike these studies, Stes et al. (2012a) tested multilevel models (termed the gross model, the net model and two interaction-with-context models) to explore the impact of a one-year teaching development programme on students' learning. Results from the gross model showed non-significant effects on students' learning outcomes (affective learning outcomes, psychomotor learning outcomes, generic and information skills, and knowledge and subject-specific skills), while the net model found one significant effect, which was negative, on the knowledge and subject-specific skills scale. They also reported small negative effects on the affective learning outcomes for both models, and a small negative effect on the generic and information skills in the net model. Therefore, contrary to the authors' hypothesis, the experimental group did not present better learning outcomes than the control group in the post-test. Results from the first interaction model showed a stronger teacher impact in first-year classes than in non-first-year classes; however, non-significant differences were found between experimental and control teachers teaching first-year students. The second interaction model, related to class size, showed that the impact was more positive for teachers of medium or large classes, but few significant differences were found between experimental and control groups.
The same authors (2012b) investigated associations between teacher participation in development courses and student perceptions of the learning environment, testing the same models. In this case, the gross model showed non-significant effects on student perceptions of the learning environment. The net model showed a significant effect only for the teaching for understanding scale, and this was negative. The interaction models showed only two significant effects: the post-test effect was significant for non-first-year students on the interest and enjoyment scale, and the post-test effect in large classes on the student support scale was negative. Finally, Stes, Coertjens, and Van Petegem (2013) investigated the effect of
participation in a teaching development programme on student perceptions of teaching and changes
in teacher behaviour, as perceived by students, after the end of the programme. No significant differ-
ences were found on student perceptions of the learning and teaching contexts, or on perceptions of
teaching behaviour.
It is important to acknowledge that methodological problems have been identified in studies on the impact of university teachers' participation in university teaching diplomas (Gibbs and Coffey 2004;
Postareff, Lindblom-Ylänne, and Nevgi 2007; Stes et al. 2010). Several authors agree that studies need
to consider more variables and use more sophisticated methods of analysis (Stes et al. 2010; Trigwell,
Ashwin, and Millan 2013). In the first place, regarding the incorporation of more variables, research has
focused both on the implications that teaching development programmes have for university teaching
(e.g. Postareff, Lindblom-Ylänne, and Nevgi 2007, 2008) and on student experiences of learning (e.g.
Gibbs and Coffey 2004). However, studies that analyse the relationships between learning and teaching,
and consider a wider range of variables, are scarce (Stes, Coertjens, and Van Petegem 2013).
Considering only a few variables, or not controlling for them, is a major issue in some of the studies we reviewed. In relation to the analysis methods used so far, some studies have examined the data
using, for example, t-tests (e.g. Gibbs and Coffey 2004) or analysis of variance (e.g. Postareff, Lindblom-
Ylänne, and Nevgi 2007). These analysis methods work with only a few variables, and do not establish
the relationships between student and teacher answers. Therefore, methods that appropriately consider these relationships, and that use multiple sources of information, are needed for the evaluation of these programmes. In this respect, the studies by Stes et al. (2012a, 2012b) are a step in the right direction. It is striking, however, that these investigations found limited or no evidence of the impact of teaching development programmes. The authors attribute this lack of effect to the short timeframe of the programmes, which may not allow enough time to 'capture' an effect; to voluntary participation, which implies an intrinsic motivation for improving teaching; and to the low number of participants (Stes et al. 2012a, 2012b; Stes, Coertjens, and Van Petegem 2013). Nonetheless, these results suggest that the more complex the analysis, the harder it is to find an effect on student learning.
In summary, research has shown that participation in university teaching diplomas promotes teach-
ers’ adoption of student-centred approaches, and that, on the side of student learning, research reports
both positive and no or limited effects. At the same time, the research reporting no or limited effects has used more sophisticated methods of analysis, which to some extent calls into question the previous studies that used less sophisticated analyses. This contrasting evidence leaves important questions on the impact
of university teaching diplomas unanswered, at least for the student learning experience. Given the
increasing pressures to provide evidence that these programmes promote better teaching and learn-
ing (Brew 2007), and the inconclusive evidence found in the literature, it is both timely and relevant to
conduct research that helps clarify whether they do have an impact.
This study aims to provide further evidence towards answering this question, by investigating the
impact of a University Teaching Diploma on student approaches to studying and learning experience.
In so doing, we followed suggestions by Stes et al. (2010) and Trigwell, Ashwin, and Millan (2013): we incorporated variables not considered in previous studies and used an analysis strategy that con-
siders the nested nature of this phenomenon. At the same time, we provide evidence from a country
– Chile – where no previous studies in this area have been carried out.
Methodology
The present study investigates the relative impact of university teacher participation in a university
teaching diploma on student approaches to studying and perceived learning experience. Other varia-
bles are also included to evaluate this impact (for students: gender, age, perceived workload, perceived
course relevance, parental responsibilities, work and self-efficacy beliefs; for teachers: discipline, gender,
age, academic degrees, experience, previous training in university teaching, motivation with teaching
development, type of contract and course size). These were selected based upon previous research into
the broader area of learning and teaching in higher education (e.g. Rosario et al. 2013). In this manner,
we would be able to determine the specific effect of participation in the university teaching diploma on
student approaches to studying and learning experience. A quasi-experimental and multilevel design
was employed. This was considered appropriate for this study, as it facilitates examining relationships between selected variables while accounting for the nested or hierarchical nature of the data (two levels in this
case: university teachers and students) (Snijders and Bosker 1999; Kline 2010). The research questions
are the following:
(1) What is the relative impact of teacher participation in a University Teaching Diploma on student
approaches to studying?
(2) What is the relative impact of teacher participation in a University Teaching Diploma on student
perceived learning experience?
The University Teaching Diploma under study is composed of four modules:
(1) curricular design;
(2) students' learning assessment;
(3) information and communication technology (ICT) integration;
(4) reflection on teaching practice.
The diploma starts with the modules on curricular design and learning assessment. Then, more advanced modules are conducted on ICT integration and university teachers' reflection on their own practice. By the end of the programme, participants must have completed 160 hours of face-to-face attendance.
Participants
Forty-four university teachers agreed to participate in this study. Twenty-five of them had participated
in the University Teaching Diploma and nineteen had not; 56% were male and 44% female. Mean age was 48.7 years (SD = 13.11). In total, 686 students answered the questionnaires: 373 were participating in courses of diploma teachers, while 313 were participating in courses with teachers who had not taken the diploma; 53% were male and 47% female. Mean age was 22 years (SD = 3.57). Table 1 provides
demographic information for the teachers participating in this study.
Instruments
Well-known instruments from the student learning research tradition were employed. For students, the
Course Experience Questionnaire (CEQ) and the Study Process Questionnaire (SPQ) were used; and for
teachers the Approaches to Teaching Inventory (ATI) was employed. Questions related to contextual
variables were incorporated. All these instruments were validated prior to being used in this study
(Marchant, Fauré, and Abricot 2016; Marchant 2017).
The CEQ, SPQ and ATI were translated and compared with previous versions produced in Chile
(González et al. 2011; Montenegro and González 2013). Then, qualified experts analysed them to reduce bias arising from linguistic, psychological and cultural differences. The experts also judged the questions on contextual variables. The process was carried out according to guidelines proposed by Muñiz, Elosua, and
Hambleton (2013).
The instrument for gathering student data included a shortened version of the CEQ, the SPQ and
questions for contextual variables.
Table 1. Distribution of UTD and non UTD teachers according to their academic degree, other teaching commitments, gender and teaching experience.

                              UTD    Non-UTD    Total
Gender
  Male                         16         12       28
  Female                        9          7       16
  Total                        25         19       44
Teaching experience
  Less than 5 years             0          4        4
  Between 6 and 10 years       10          2       12
  More than 10 years           15         13       28
  Total                        25         19       44
The CEQ version we employed is the shortened version reported by Webster et al. (2009). This is com-
posed of 17 five-point Likert items, grouped into four scales: good teaching, clear goals and standards,
appropriate assessment and appropriate workload. Confirmatory factor analysis was carried out for
each scale; the scales were treated as independent, following Richardson (2006). Goodness of fit was
appropriate: good teaching, χ2 (5.36, N = 668) = 100.38, p = .084, CFI = 0.997, TLI = .985, SRMR = .011; clear
goals and standards, χ2 (483.4, N = 657) = 855.84, p = .061, CFI = .975, TLI = .951, SRMR = .038; appropri-
ate workload, χ2 (586.3, N = 569) = 980.31, p = .075, CFI = .971, TLI = .949, SRMR = .039, and appropriate
assessment, χ2 (1994, N = 579) = 522.72, p = .053, CFI = .985, TLI = .956, SRMR = .024. Cronbach’s alpha
ranged from acceptable to very good (good teaching, α = .8585; clear goals and standards, α = .7408;
appropriate workload, α = .7817; appropriate evaluation, α = .6977).
The SPQ questionnaire consists of 20 five-point Likert items, grouped into two scales: deep and sur-
face learning. The data fitted the model well: χ2 (1821, N = 575) = 1450.89, p = .086, CFI = .965, TLI = .939,
SRMR = .041. Very good Cronbach’s α were found (deep learning, α = .84; surface learning, α = .80).
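As an illustration only, the following is a minimal sketch (not the authors' code) of how Cronbach's alpha coefficients such as those reported above can be computed from a respondents-by-items matrix of Likert scores; the variable names and the simulated data are hypothetical.

```python
import numpy as np

def cronbach_alpha(items: np.ndarray) -> float:
    """Cronbach's alpha for a (respondents x items) matrix of scores."""
    items = np.asarray(items, dtype=float)
    k = items.shape[1]                               # number of items in the scale
    item_variances = items.var(axis=0, ddof=1)       # variance of each item
    total_variance = items.sum(axis=1).var(ddof=1)   # variance of the scale totals
    return (k / (k - 1)) * (1 - item_variances.sum() / total_variance)

# Hypothetical example: 50 students answering the 10 deep-approach items of the SPQ
rng = np.random.default_rng(0)
deep_items = rng.integers(1, 6, size=(50, 10))       # five-point Likert responses
print(round(cronbach_alpha(deep_items), 2))
```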
The contextual variables included were: socio-demographic characteristics (gender and age), perceived workload, perceived course relevance (for the semester curriculum and personally) and self-efficacy beliefs (adapted from Richardson 2006). Questions on whether the student cared for children and whether they were studying full time were also incorporated.
In the case of university teachers, the instrument used included the ATI as well as contextual var-
iables. The ATI is composed of 22 five-point Likert items, grouped into two sub-scales: conceptual
change/student-focused (CCSF) and information transmission/teacher-focused (ITTF). The data fitted
the model well (CFI = .998; TLI = .995; SRMR = .027; RMSEA = .023). Acceptable Cronbach’s alphas were
obtained (CCSF, α = .70; ITTF, α = .85).
The contextual variables included discipline, socio-demographic characteristics (gender, age, academic degrees) and work-related elements (experience, working full- or part-time in the institution, course size). Self-
efficacy beliefs items (related to confidence in content knowledge, confidence in students’ learning and
confidence in pedagogical knowledge; Lindblom-Ylänne et al. 2006) were included. Table 2 presents
scales from questionnaires employed with a representative item for each of them.
Students and teachers answered the instruments during their regular class time within the classroom.
In all cases, one member of the research team was in the classroom where the instruments were being
answered. In this way, the team member communicated the study objectives as well as the voluntary
and anonymous character of participation both verbally and in writing.
Analysis
Multilevel hierarchical regressions represent an analysis sensitive to the nested or hierarchical nature of
educational phenomena. As stated before, there are few studies investigating the impact of university
teaching diplomas that use this type of analysis (e.g. Stes et al. 2012a, 2012b). We employed multilevel
hierarchical regressions (Kline 2010) to calculate the relative impact of a number of variables on stu-
dent approaches to studying and learning experiences. Variables were organised in two levels: those
associated with university teachers and those associated with groups of students (Snijders and Bosker
1999). Figure 1 represents them.
In order to understand the impact of these variables on student approaches to studying and learn-
ing experience, six random intercept and fixed slope models were calculated, using the maximum
likelihood method (Fielding and Goldstein 2006). One model was built for each SPQ scale (deep and
surface learning) and one for each CEQ scale (good teaching, clear goals and standards, appropriate
assessment and appropriate workload).
The tested models are the following:
SPQ_ij = β_00 + β_10·CEQ_ij + β_20·S_ij + β_30·F_j + β_40·ATI_j + β_50·X_j + β_60·I_j + u_0j + e_ij

CEQ_ij = δ_00 + δ_10·SPQ_ij + δ_20·S_ij + δ_30·F_j + δ_40·ATI_j + δ_50·X_j + δ_60·I_j + υ_0j + ε_ij
where SPQ_ij corresponds to each approach-to-studying scale for student i in course j; CEQ_ij represents the learning experience of student i in course j and includes the four CEQ scales; S_ij corresponds to the vector of features of student i in course j, which included self-efficacy, course relevance, workload, sex and age; F_j indicates whether the teacher of course j completed the University Teaching Diploma; ATI_j represents the approaches to teaching of the teacher of course j and includes its two scales; X_j corresponds to the vector of context and teacher features for course j, which included self-efficacy, motivation to improve teaching skills, previous university teaching qualification, gender, age, disciplinary area and course size; I_j corresponds to the vector of interaction effects of participation in the University Teaching Diploma with the ATI scales and with the self-efficacy and motivation to improve teaching skills variables; u_0j and υ_0j correspond to the random components at the course level (the deviation of each course from the overall average); and, finally, e_ij and ε_ij correspond to the random components at the student level (each case's deviation, or error).
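To make the analysis strategy concrete, the following is a minimal sketch, under assumed data and column names, of how one of these random intercept and fixed slope models (the deep approach model) could be fitted with Python's statsmodels; this is not the authors' code, only a subset of the predictors is shown, and the file and variable names are hypothetical.

```python
import pandas as pd
import statsmodels.formula.api as smf

# One row per student (level 1); teacher and course variables are repeated within
# each course and identified by the grouping column 'course_id' (level 2).
df = pd.read_csv("survey_data.csv")  # hypothetical file with student and teacher variables

# Student-level predictors (CEQ scales, self-efficacy, gender, age) plus
# course-level predictors (diploma completion, ATI scales, course size).
formula = (
    "spq_deep ~ ceq_good_teaching + ceq_clear_goals + ceq_appropriate_assessment"
    " + ceq_appropriate_workload + self_efficacy + gender + age"
    " + utd_completed + ati_ccsf + ati_ittf + course_size"
)

# groups= gives each course its own random intercept (u_0j); slopes remain fixed,
# and reml=False requests maximum likelihood estimation.
model = smf.mixedlm(formula, data=df, groups=df["course_id"])
result = model.fit(reml=False)
print(result.summary())
```

An analogous model with a CEQ scale as the outcome and the SPQ scales among the predictors would correspond to the second equation above.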
Results
Impact of the University Teaching Diploma on student approaches to studying
Table 3 presents the complete models obtained from calculating the University Teaching Diploma’s
impact on student approaches to studying.
In the first place, the results show that university teachers who completed the diploma had stu-
dents with higher scores in the deep approach scale and lower scores in the surface approach scale
(β(SPQ deep) = 2.3947, p < .05; β(SPQ surface) = −3.3571, p < .05). Moreover, results show that completing the
diploma, compared to other variables such as age, sex or previous training, has the highest load as a predictor of the SPQ scale scores.
Notes: Models were calculated with a constant but this is omitted in the results. Also, robust standard errors were omitted.
*p < .05.
In the second place, the CEQ scales also predicted SPQ scale scores. Those students with higher scores
in the deep approach scale tended to present higher scores in the good teaching (β(SPQ deep) = .2912,
p < .05) and clear goals and objectives (β(SPQ deep) = .0799, p < .05) scales. At the same time, students
who presented higher scores in the surface approach scale tended to have lower scores in the good
teaching (β(SPQ surface) = −.1175, p < .05) and appropriate evaluation (β(SPQ surface) = −.2946, p < .05) scales.
Finally, when considering the other variables included in the multilevel models, some important associations can be seen. At the level of the students, gender (β(SPQ deep) = .0953, p < .05; β(SPQ surface) = −.1908, p < .05), self-efficacy (β(SPQ deep) = .0740, p < .05; β(SPQ surface) = −.0476, p < .05) and course relevance (personal) (β(SPQ deep) = .1173, p < .05; β(SPQ surface) = −.0544, p < .05) had a significant predictive effect on both SPQ scales. At the same time, age (β(SPQ deep) = .0121, p < .05), course relevance (for the semester curriculum) (β(SPQ deep) = .1173, p < .05) and caring for his/her children (β(SPQ deep) = .1204, p < .05) were positively significant for the deep approach scale. On the other hand, at the level of the university teachers, postgraduate training (β(SPQ deep) = .4741, p < .05; β(SPQ surface) = −.3505, p < .05) showed a significant predictive impact on both SPQ scales. The teaching-centred scale of the ATI (β(SPQ surface) = −.0454, p < .05), age (β(SPQ surface) = −.0048, p < .05) and disciplinary area (β(SPQ surface) = −.1227, p < .05) were negatively significant for the surface approach scale. It is worth highlighting that university teachers who had studied their first degrees as school teachers tended to promote the deep learning approach less strongly in their students (β(SPQ deep) = −.3399, p < .05).
Therefore, in relation to the first question, our results show that university teacher participation in a
University Teaching Diploma impacts positively on student adoption of deep approaches to studying;
and that this is the variable with the highest explanatory load.
                                                       β(CEQ good    β(CEQ clear goals   β(CEQ appropriate   β(CEQ appropriate
                                                       teaching)     and standards)      evaluation)         workload)
Fixed effects
Level 1
  SPQ deep                                              .4327*        .3046*              −.2876*              .0148
  SPQ surface                                          −.0653        −.0328               −.7493*             −.143*
  Student self-efficacy                                 .0316         .0843*               .017                 .093*
  Course relevance (for semester curriculum)           −.0225        −.0913                .1964*              −.3007*
  Course relevance (personal)                           .102          .1385*               .1041                .0724
  Cares for his/her children                           −.1124        −.0614               −.0711                .0552
  Full-time study                                       .0175         .0637                .0163                .0183
  Gender                                               −.0553         .0029               −.0015               −.0244
  Age                                                   .002         −.0011               −.0088               −.0012
Level 2
  UTD participation (completed)                        −2.0447       −1.927               −2.4381              −1.803
  ATI CCSF                                              .378         −.0989                .2319                .472
  ATI ITTF                                             −.0858        −.1413               −.3225*              −.1879
  Self-efficacy (confidence in content knowledge)      −.1           −.0294               −.0004               −.1008
  Self-efficacy (confidence in students' learning)      0             .087                −.0767                .1067
  Self-efficacy (confidence in pedagogical knowledge)   .2057         .0973                .2902*              −.0197
  Motivation to improve teaching skills                −.4932*       −.2074               −.34*                −.5402*
  Undergraduate degree in education                    −.3302         .0851               −.2195               −.7522
  Postgraduate degree in education                      .2737        −.052                −.075                 .5208
  Previous university teaching qualification           −.1117        −.3093                .0437               −.2003
  Gender                                                .1926        −.0125               −.0858                .0774
  Age                                                  −.0035        −.0008               −.0115*              −.0013
  Disciplinary area                                     .0939         .0313                .2427                .1286
  Course size                                          −.0133*       −.0049               −.005                −.02*
Random effects
  Within course variance                                .2609*        .1432*               .1505*               .3123*
  Residual variance                                     .5283*        .5452*               .6093*               .6268*
N                                                       635           635                  635                  635
Courses                                                 40            40                   40                   40
Log pseudolikelihood                                   −526.26       −530.10              −599.51              −635.15
Notes: Models were calculated with a constant but this is omitted in the results. Also, robust standard errors were omitted.
*p < .05.
At the level of the university teachers, age had a negative predictive effect on appropriate evaluation (β(CEQ appropriate evaluation) = −.0115, p < .05), and pedagogical confidence had a positive predictive effect on appropriate evaluation (β(CEQ appropriate evaluation) = .2902, p < .05).
Therefore, regarding our second question, the answer is negative: university teachers’ partici-
pation in a University Teaching Diploma does not have an impact on the students’ perceived learning
experience.
Two main findings emerge from this study: (1) university teachers who completed the diploma were more likely to have students adopting deep approaches and less likely to have students adopting surface approaches. Moreover, the diploma effect on adopting a deep approach to studying is quantitatively
larger than any of the other variables included in the analysis; and (2) university teachers’ participa-
tion in the diploma did not have a significant impact on the students’ perceived learning experiences
(understood as perceptions of good teaching, clear goals and objectives, appropriate evaluation and
appropriate workload).
These results on approaches to studying are consistent with those reported in previous research by
Ho, Watkins, and Kelly (2001) and Gibbs and Coffey (2004), who found that teachers who participated
in teaching development were more likely to have students adopting deeper approaches to studying.
On the other hand, our results are not aligned with those by Ho, Watkins, and Kelly (2001), Gibbs and
Coffey (2004) and Trigwell, Caballero Rodríguez, and Han (2012), who found that participation in a
teaching development programme had a positive impact on one or more elements of the student
learning experience. Our findings, however, are aligned with Stes, Coertjens, and Van Petegem (2013)
who found a positive but non-significant impact on the learning experience. In summary, using a multilevel
approach, we found results on student approaches to studying similar to those of the studies that used
less sophisticated analysis techniques (Ho, Watkins, and Kelly 2001; Gibbs and Coffey 2004); and similar
results on the impact on the learning experience to those found by a previous study using a multilevel
approach (Stes, Coertjens, and Van Petegem 2013), but different from studies using less sophisticated
techniques (Ho, Watkins, and Kelly 2001; Gibbs and Coffey 2004; Trigwell, Caballero Rodríguez, and
Han 2012).
It is important to acknowledge, in relation to the lack of impact on student perceptions of the
learning experience, that this may be explained by the indirect nature of the relationship. Thus, the
University Teaching Diploma effect may be hidden by other variables such as approaches to studying,
particularly, considering that approaches to studying scales do predict the learning experience scales.
The present study is not without its limitations. First, the study was conducted only in one university
and with students from a relatively small number of courses. Therefore, we do not claim that it is generalisable. We rather think of our study as providing further evidence on the impact of teaching development on student approaches to studying and learning experiences, and as proposing a more complex way of investigating this phenomenon. Second, we acknowledge that the use of contrast groups formed post hoc is problematic: quasi-experiments do not allow causes to be established, owing to the absence of a randomly generated contrast group (Lacave, Molina, and del Castillo 2014). Having said that, we are aware this is the most appropriate approach for phenomena that happen in real contexts. The issues inherent in the design are thus due to the nature of the object of study, and do not prevent the evaluation method we employed from being valuable. This method can be replicated, and in doing
so improved, to evaluate other teaching development initiatives.
Thus, the study results presented in this article are likely to have implications for further research and
for practice. Future studies will need to provide further evidence on the impact of teaching development
programmes. Indeed, it is particularly important to conduct research that incorporates a broader range
of variables. We argued that student perceptions are influenced by diverse variables, such as age or gender, which, in turn, encouraged us to incorporate those variables when exploring the specific effect of the diploma. Other variables were incorporated, both at the level of students (e.g. self-efficacy, personal importance of the course or course size) and of teachers (e.g. motivation, previous training or disciplinary area). We suggest that future research should continue investigating the specific effect these variables have, together with participation in teaching development programmes, on student
approaches to studying and learning experience. At the same time, we claim that analyses that con-
sider the relationships between teachers and students’ answers – multilevel modelling and structural
equation modelling – are appropriate for achieving this aim.
Results from this study also have practical implications. In a context of increasingly limited fund-
ing and competing demands within universities, centres for teaching development around the world
are expected to demonstrate evidence of the effectiveness of their development programmes. This is
similar to the case in Chile, where these centres need to compete within their universities and, at the
same time, at least for those initially funded by the government, demonstrate their worth to external
agencies. Results from this study provide evidence that teacher participation in a university teaching
diploma does have an impact on student approaches to studying. It demonstrates that, at least in the
case of the University of Santiago’s University Teaching Diploma, conducting teaching development
programmes is worthwhile. Moreover, the research may provide a means of evaluating these pro-
grammes in other universities, thus becoming a practical tool for other university teaching diplomas
to demonstrate their impact.
In conclusion, we found that teaching development has a positive and direct impact on approaches
to studying, but that it does not produce, at least directly, a significant impact on student learning
experience. We proposed that further studies incorporate a wider range of variables, and use analyses
that capture the relationship between teacher and student answers. We also claimed that this article
proposes a model for evaluating other university teaching diplomas in a context of competing demands,
where they are constantly expected to demonstrate their value.
Disclosure statement
No potential conflict of interest was reported by the authors.
Notes on contributors
Jorge Marchant, PhD, is Director of Curriculum Development at Universidad Diego Portales. His research interests are learn-
ing and teaching in higher education, curriculum evaluation and teaching development programmes in higher education.
Carlos González, PhD, is an associate professor and Director of Postgraduate Studies at the Faculty of Education, Pontificia
Universidad Católica de Chile. His research interests are learning and teaching in higher education, the student experience
and learning analytics.
Jaime Fauré is an analyst of Teaching Development at Universidad Andres Bello. His research interests are learning trajec-
tories, teaching in higher education and teaching development programmes.
ORCID
Jorge Marchant http://orcid.org/0000-0003-4207-3817
Jaime Fauré http://orcid.org/0000-0001-6644-3339
References
Biggs, J., and C. Tang. 2011. Teaching for Quality Learning at University. Buckingham: Open University Press.
Brew, A. 2007. “Evaluating Academic Development in a Time of Perplexity.” International Journal for Academic Development
12 (2): 69–72. doi:10.1080/13601440701604823.
CNED. 2011. Evolución de la matrícula de Educación Superior 1994–2011: Departamento de Investigación e Información Pública
[Evolution of enrolment in Higher Education 1994–2011. Department of Research and Public Information]. Santiago:
Consejo Nacional de Educación.
CNED. 2014. Índices: Estadísticas y Bases de Datos. Retrieved May 28, 2014, from http://www.cned.cl/public/Secciones/
SeccionIndicesEstadisticas/indices_estadisticas_sistema.aspx
Entwistle, N. 2000. Promoting Deep Learning through Teaching and Assessment: Conceptual Frameworks and Educational
Contexts. Paper presented at the TLRP Conference, University of Leicester, England.
Fielding, A., and H. Goldstein. 2006. Cross-classified and Multiple Membership Structures in Multilevel Models: An Introduction
and Review. Research Report 791. London: DfES.
Gibbs, G., and M. Coffey. 2004. “The Impact of Training of University Teachers on Their Teaching Skills, Their Approach
to Teaching and the Approach to Learning of Their Students.” Active Learning in Higher Education 5 (1): 87–100.
doi:10.1177/1469787404040463.
González, C., H. Montenegro, A. López, I. Munita, and P. Collao. 2011. “Relación entre la Experiencia de Aprendizaje de
estudiantes universitarios y la docencia de sus profesores.” Calidad en la Educación 35: 21–49. doi:10.4067/S0718-
45652011000200002.
Hanbury, A., M. Prosser, and M. Rickinson. 2008. “The Differential Impact of UK Accredited Teaching Development Programmes
on Academics’ Approaches to Teaching.” Studies in Higher Education 33 (4): 469–483. doi:10.1080/03075070802211844.
Ho, A., D. Watkins, and M. Kelly. 2001. “The Conceptual Change Approach to Improving Teaching and Learning: An Evaluation
of a Hong Kong Staff Development Programme.” Higher Education 42 (2): 143–169. doi:10.1023/A:1017546216800.
Kline, R. 2010. Principles and Practice of Structural Equation Modelling. New York: Guilford Press.
Lacave, C., A. Molina, and E. del Castillo. 2014. “Evaluación de una innovación docente a través de un diseño estadístico
cuasi-experimental: aplicación al aprendizaje de la recursividad.” Calidad y evaluación de la docencia 20 (3): 159–166.
Lindblom-Ylänne, S., K. Trigwell, A. Nevgi, and P. Ashwin. 2006. “How Approaches to Teaching Are Affected by Discipline
and Teaching Context.” Studies in Higher Education 31 (3): 285–298. doi:10.1080/03075070600680539.
Marchant, J. 2017. La formación en docencia universitaria en Chile y su impacto en profesores y estudiantes (Doctoral
dissertation). Leiden University, The Netherlands. Retrieved from https://openaccess.leidenuniv.nl/handle/1887/46488
Marchant, J., J. Fauré, and N. Abricot. 2016. “Preliminary adaptation and validation of the SPQ and the CEQ for the study of
teaching development programmes in a Chilean university context.” Psykhe 25 (2): 1–18. doi:10.7764/psykhe.25.2.873.
Montenegro, H., and C. González. 2013. “Análisis factorial confirmatorio del cuestionario “Enfoques de Docencia
Universitaria” (Approaches to Teaching Inventory, ATI-R).” Estudios Pedagógicos 39 (2): 213–230. doi:10.4067/S0718-
07052013000200014.
Muñiz, J., P. Elosua, and R. Hambleton. 2013. “Directrices para la Traducción & Adaptación de los tests: segunda edición.”