AC 2012-3237: An Experience Using Reflection in Software Engineering
Dr. Alexandra Martinez, University of Costa Rica
Alexandra Martinez has been working since 2009 as an Invited Professor in the Department of Computer
and Information Science at the University of Costa Rica (UCR). She has taught courses in databases, soft-
ware testing, and bioinformatics, and done applied research in software testing at UCR’s Research Center
on Information and Communication Technologies. Previously, she worked as a Software Design Engi-
neer in Test at Microsoft Corporation in Redmond, Wash., and as a Software Engineer at ArtinSoft in San
Jose, Costa Rica. She received her Ph.D. in computer engineering from the University of Florida in 2007,
her M.S. in computer engineering from the University of Florida in 2006, and her B.S. in computer and
information science from the Universidad de Costa Rica in 2000. She also received a scholarship to study
in the pre-doctoral program in computer science at the École Polytechnique Fédérale de Lausanne, Switzerland, from 2001 to 2002. Her research interests include software testing, data quality, bioinformatics, and
databases.
Marcelo Jenkins obtained a B.S. degree in computer and information sciences from the University of Costa Rica in 1986, and M.Sc. and Ph.D. degrees from the University of Delaware, USA, in 1988 and 1992, respectively. Since 1986, he has been teaching computer science at the University of Costa Rica. From 1993
until 1998, he coordinated the Graduate Committee, and from 1998 through 2001, he was the Chairman
of the Department of Computer and Information Sciences. His research interests are in software engineer-
ing, software quality assurance, project management, and object-oriented programming. He has authored
more than 40 technical papers on these subjects. As an independent consultant, he has worked with some
of the largest software companies in the Central America region in establishing software quality man-
agement systems. In the last 15 years, he has taught several seminars on software quality assurance and
software project management in seven different countries. Jenkins is an ASQ Certified Software Quality
Engineer (CSQE).
© American Society for Engineering Education, 2012
An Experience Using Reflection in Software Engineering
Abstract
This paper reports the results of a case study in which two different reflection mechanisms were used in two graduate courses in the area of software engineering. A learning journal was used in a Software Testing course, whereas a two-part reflection questionnaire was used in a Software Quality Assurance course. We evaluated the reflection mechanisms from the students' perspective by means of a survey, and from the teacher's perspective through a qualitative assessment of observed strengths and limitations. We analyzed some of the results obtained from implementing both approaches and reached some conclusions on how to improve our reflection method.
1. INTRODUCTION
There are three common ways in which 'reflection' is used and understood [9]. The first is that reflection occurs between the learning process and the representation of that learning. The second is that reflection has an implicit purpose, i.e., we reflect consciously with the aim of attaining a useful outcome. The third is that reflection emerges when processing issues for which there is no obvious solution.
John Dewey, an American philosopher, psychologist, and educator, is often cited as the father of modern critical thinking theory, which was originally called 'reflective thinking' [3]. Dewey defines reflective thinking as an 'active' process, in which we think things through, raise questions, and find relevant information for ourselves, as opposed to receiving information from someone else in a passive way [3]. The key principles underlying Dewey's approach to reflection are the need for 'perplexity' (a state of doubt or uncertainty), the sense of goal orientation, and the notion of testing (which later evolved into the experiential learning approach) [9]. Dewey believed that there is a close link between reflection and learning, as evidenced by his quote, "We do not learn from experience… we learn from reflecting on experience."
Building upon the work of Dewey and others, David Kolb, an American educational theorist, laid the foundations of experiential learning theory (ELT) [8], a model of the learning process that emphasizes the central role of experience in learning and development. ELT defines learning as "the process whereby knowledge is created through the transformation of experience." The experiential learning cycle developed by Kolb [7] consists of four stages: concrete experience, reflective observation (where the learner consciously reflects on the experience), abstract conceptualization (where the learner tries to assimilate and distill her reflections into abstract models or theories), and active experimentation (where the learner tests her models and theories with new experiences) [8].
Learning journals, diaries, and portfolios are increasingly used in higher education to facilitate and to assess learning. They may be highly structured or free-form, but regardless of their shape and form, they generally seem to be helpful in personalizing and deepening the quality of learning and in integrating the learning material [10]. What distinguishes a learning journal from other types of writing is that "…it focuses on ongoing issues over time and there will be some intention to learn from either the process of doing it or from the results of it" [10]. Some of the reasons why we can learn from journals are [10]: (i) journal writing is a process that accentuates favorable conditions for learning, (ii) journal writing encourages reflection, and reflection is associated with deep learning, (iii) writing in a journal encourages metacognition, and (iv) the act of writing is associated with learning or the enhancement of learning.
There were two main motivations for introducing a reflection mechanism in our courses.
First, we wanted to increase students’ engagement in the courses given that there had been a
declining student interest in the courses offered by the Master’s Program (evidenced by a
decreasing number of students taking courses each semester). In response to this situation,
we decided to adopt an "active learning" approach in our courses, including the use of reflection mechanisms. Second, we wanted students to achieve a deeper and more enduring learning from our courses (rather than a superficial, short-term one), and inspired by Kenneth Bain's ideas, we decided to try two implementations of student reflection, namely a learning journal and a reflection questionnaire.
Previous experiences with the use of reflection in engineering have been described by Gillet [4], Kelly [6], and Walther [12]. Previous work on learning portfolios has been described by Brown [1] and Chang [2]. The use of wikis to support learning in computer science has been reported by Hamalainen [5] and Tsai [11]. Most of these works have been developed and studied at the undergraduate level only. The contribution of our work is to report on two experiences using reflection at the graduate level, thus expanding the contexts in which reflection can be used.
Kelly et al. [6] report on their experience with class reflections in an introductory materials
science and engineering course at Arizona State University. Their work aimed at answering
the question "How can we use class reflections to support student learning, attitude, and re-
tention?” After each class period, students were asked to fill out a Class Reflection about
what interested them, what they found confusing, and what they found most valuable. In order to measure conceptual learning gains, the instructors created pre- and post-Topical Module Assessments. The authors evaluated in depth the records of six students who were representative of general class trends. They claim that the use of classroom reflections supported student learning, attitude, and retention. The type of reflection we used is quite similar to
the one used by these authors, except for two aspects: the frequency of reflections (they ask
students for a reflection after each class, but we do it less often); and the moment when those
reflections take place (at the end of the class period in their case, and out-of-class in our
case).
Tsai et al. [11] present a case study of using wiki web sites for collaborative learning in a junior-level course (Introduction to Software Engineering) at Arizona State University. They based their approach on Web 2.0 principles, including the Web as platform, harnessing collective intelligence, data as "the next Intel Inside," and rich user experiences. They report that
their approach has enhanced the educational experience and improved student performance.
It also increased motivation and participation, improved peer support for learning, improved
accessibility of learning, improved learning results, and increased self-directed learning ac-
tivities/skills. One of the differences between their work and ours is that their wiki web sites
are public, with the spirit of sharing their knowledge base and encouraging peer-evaluation
(through ranking of the web sites).
Brown [1] performed a study that aimed to describe adult students' perspectives on the learning that resulted from their creation of an experiential learning portfolio at Barry University's School of Adult and Continuing Education (ACE) in Miami, Florida. The major findings were: an increase in the participants' self-knowledge after portfolio development; a
greater recognition of the value of learning from work and from mentors; and improved
communication and organization skills plus a greater appreciation of the role of reflection in
recognizing learning. Brown's experiential portfolio differs from ours in that it is intended to document prior student learning from professional and community activities, and it is targeted at adult students. Our student population consists mostly of young graduate students, and our learning journal aims to document the learning process that took place in a specific course.
The rest of the paper is organized as follows. Section 2 describes the context of the courses.
Section 3 presents the methodology. Section 4 discusses our findings. And Section 5 con-
cludes the paper and outlines our plans for future work.
2. COURSE CONTEXT
The main objective of the Software Testing (ST) course is to provide students with a practical introduction to
software testing processes, techniques, and activities within the context of quality assurance.
After completing this course, students will be able to:
Explain the fundamental concepts of software testing.
Identify relevant elements and best practices of software quality related to software
testing.
Compare the different software testing techniques, levels, and types.
Reflect on their learning process.
The main objective of our Software Quality Assurance (SQA) course is to prepare students to apply modern tools in the design and implementation of a quality assurance system for the software development process. At the end of the course, students will be able to:
Identify the main elements associated with the quality of the software process.
Design and implement components of a quality assurance system for a software de-
velopment process.
Apply a clear vision of quality in the various components of the software develop-
ment process from a practical perspective.
Perform quality audits of the software development process and technical reviews of
software products.
3. METHODOLOGY
We present a case study where two different reflection mechanisms were used in a couple of
graduate courses in the area of software engineering. In the Software Testing course a learning journal was used, whereas the Software Quality Assurance course used a two-part reflection questionnaire. Figure 1 illustrates the two approaches: there were five interventions in the Software Testing course and two in the Software Quality Assurance course. It also shows that there was a
formal Department survey at the end of each course, which was used as an independent as-
sessment instrument for cross-validating some of our findings. The purpose of the imple-
mented reflection methods differs from that of the official Department survey. The Depart-
ment survey aimed at evaluating both the instructor’s performance and the quality of the
course, from the students’ viewpoint. Meanwhile, the reflection methods served as a tool for
students to organize their reflections about their learning, and as a tool for teachers to obtain
early feedback about the teaching and learning process. Another way in which the reflection
mechanisms differ from the Department survey is in the timing of their application, i.e., the
latter is applied at the end of the course whereas the former are applied throughout the course. Nevertheless, we acknowledge that some of the questions asked
in the reflection mechanisms are similar to those in the Department survey, and in this regard
we view them as complementary assessments.
3.1. Software Testing Course
3.1.1. Implementation
The reflection mechanism used in the ST course was a learning journal. Students were asked
to write a journal entry every 3 weeks (on average) and there were 5 checkpoints during the
semester. This journal accounted for 15% of the final course grade. The style and format
were mainly free with the only constraint that the journal had to be electronic and accessible
online. Most of the students created a blog for their journals, and all of them made the journal private but shared it with the teacher (i.e., it was not publicly accessible).
In order to guide students, the teacher gave them four questions to consider when writing
their journal entries:
1. What have you learned so far?
2. What activities have helped you to better understand the material (content)?
3. What activities have not helped you to understand the material?
4. What suggestions do you have for future classes?
Figure 1. The two different approaches used to implement reflection.
Grading of the student learning journals was done by the teacher, and the feedback was pro-
vided either through the blog itself (when the blog website allowed it) or via email. The as-
pects that were evaluated at each journal checkpoint were:
- Degree to which the student addressed the guideline questions.
- Degree to which the student reflected about all the material studied in the course (not
only a part of it).
- Writing: Was it easy to understand? Were the ideas logically organized? Were there signs of rushed or careless work? Did the student put extra effort and time into writing the journal entry (for example, some students looked for additional material
about topics they were curious about and linked to it from their journal)?
3.1.2. Assessment
The use of a learning journal was evaluated both from the student’s and the teacher’s per-
spective. Student feedback was obtained through an official course survey and an informal
course survey. On the other hand, the teacher made a qualitative assessment based on ob-
served strengths and limitations.
The official survey is run by our Master’s Program every semester with the intent to evaluate
several aspects of the graduate courses taught during that semester. This survey has general
questions that apply to all courses. Participation in the survey is anonymous and voluntary.
In the semester in which this study took place, the official survey was conducted between June
and July of 2011 and had a total of 10 participants from the ST course.
The informal survey was designed by the teacher specifically for the Software Testing
course. Students had to complete this survey as part of their learning journal (filling out the
web survey counted as a journal entry) but they were informed that grading of this part
would be based on effort only (regardless of their answers or comments). This survey was
conducted in July of 2011 and had a total of 14 participants. Most questions were multiple-
choice type but had an additional field where students could write comments (not mandato-
ry).
3.2. Software Quality Assurance Course
3.2.1. Implementation
The reflection mechanism used in the SQA course was a reflection questionnaire. Students
were asked to fill out a questionnaire twice during the semester: the first time immediately after the first term exam, and the second time at the end of the course. Together, the questionnaires accounted for 10% of the final grade (5% each).
The following guidelines were given to students regarding the reflection questionnaire:
“The reflection questionnaire will have two deliverables and consists of developing a personal document in which you record your learning experiences, reflect on them, and evaluate yourself using certain parameters.”
The first reflection questionnaire consisted of eight open-ended questions that students had to answer.
The second reflection questionnaire was more structured: it had ten multiple-choice ques-
tions, each with an optional comments/justification field.
3.2.2. Assessment
Besides our reflection questionnaire, an official standard course survey was conducted at the
end of the course (same used for all courses of the Master’s Program). This allowed us to
compare and contrast some of the answers from the questionnaire and the survey. Out of 17
students in the group, 14 filled out the reflection questionnaires and only 11 filled out the final official survey.
4. FINDINGS AND DISCUSSION
[Figure 2. Students' ratings of the suitability of the learning journal (N = 14). Response options: very suitable, somewhat suitable, somewhat unsuitable, very unsuitable.]
Furthermore, for the same question from the informal survey, 12 out of 14 students who
completed the survey wrote comments. Table 2 shows all comments from students. We ob-
serve from Table 2 that three students failed to see value in the learning journal or were not
able to take full advantage of it (see comment from students 8, 11, and 12). This was some-
what expected, especially given the fact that it was the first time they were confronted with
writing a learning journal. It is interesting that two students (students 7 and 9) indicated that
the journal was somewhat suitable but their comments were rather negative, with a possible
explanation being that those students indeed thought that using a journal was suitable but
they disagreed with some aspects of the implementation of this journal. Another issue point-
ed out by two students (students 4 and 6) was that more time should be allowed between
journal entries, which we assume was proposed with the intent of allowing students more
time to reflect and also more material to think about. On the bright side, two students (stu-
dents 2 and 5) indicated that the journal helped them to review what they had learned, and
two other students (students 1 and 3) conveyed that they liked the journal (one of them
pointing out that it allowed him to provide real time feedback on the process).
Next we present the most relevant results from the official survey. The course was rated
by students with an average grade of 8.5 out of 10 (actual ratings ranged between 7 and 10).
The teacher was rated with an average grade of 8.6 out of 10 (actual ratings also ranged be-
tween 7 and 10). There were other questions that could be related to some extent to the use
of a learning journal in the course, but the relationship is not always direct, therefore we cau-
tion that results from these questions cannot be taken as conclusive. Next we present these
results.
Figure 3 summarizes student responses to the question “Does the teacher pose activities that
allow you to think in a critical, diverse and novel way?” We believe that the learning journal
used in the course can be considered an activity that allows students to think critically, since
reflection is an inherent part of critical thinking. It also allows students to think in a novel
way because they have to think about what they are learning and how they are learning it
(meta-cognition level). We observe from Figure 3 that 50% of the students consider that the
teacher poses activities that allow them to think in a critical, diverse and novel way “almost
always” or “always”.
[Figure 3. Student responses (N = 10): no response 0%, never/rarely 30%, sometimes 20%, almost always/always 50%.]
Table 3 summarizes student responses to the question “With regard to the course objectives
outlined in the program syllabus, you believe that the course met...?” Recall that one of the
course objectives was that students be able to reflect on their learning process. We observe
from Table 3 that 80% of the students considered that the course objectives had been fulfilled at a level of 60% or more.
Some common tendencies found in the first journal entries were: (i) summarizing the material studied in class with little or no reflection on the learning process, (ii) writing only about the part of the material students found most interesting, and (iii) very brief entries. These issues were addressed through formative assessment, in addition to the summative grade associated with each checkpoint.
Formative assessment was provided to students individually after each journal checkpoint,
identifying strong and weak points in their work. This feedback was normally in the form of
an email, although sometimes comments were left in the blog itself. The following are samples of feedback sent to students by the teacher:
“Your journal entry is fairly complete. I liked that you related what you have
learned in the course to your professional practice. I hope you keep making this
link in future entries of the journal.”
“I really liked your journal entry, especially the analysis you made of the teach-
ing methodology of the course, since it gives me clues as to what things I'm do-
ing right or wrong as a teacher. The level of technical detail used for the analysis
of what was learned was also appropriate.”
“I really liked your journal blog, especially the links you included. The only ob-
servation I have is that you neglected the ‘form’, i.e. you have some typos and
other writing errors, which gives the impression of a rushed work. For the future
checkpoints, try to be more careful with grammar and spelling details so that the
work has a higher quality.”
“Your journal entry is okay but somewhat short; I feel you could have expanded
your reflection on what you have learned (what did you get from the topics stud-
ied, how did you connect/relate the different topics studied, more details about
what you found most interesting or more contradictory with respect to precon-
ceived ideas, etc). In general it is easier for me to assess the learning process
when students express their ideas in more detail.”
“Your journal entry was very original, as usual. I liked the video about Unit test-
ing and TDD. I also found very helpful the other video on Coded UI tests. I have
only one observation: you did not mention anything about the test plan docu-
mentation, which was a topic extensively studied in class, and I would have
liked to know what you learned on that dimension.”
In three of the five journal checkpoints, two students failed to submit the journal entry on time, resulting in a grade of zero for those students. Only one student failed to submit a journal entry twice during the course; the other students missed a submission only once in the semester. Additionally, journal grades were reasonably good overall, as can be observed in Table 4, which shows the average and standard deviation of grades obtained by students at every journal checkpoint. Note that the standard deviation at checkpoints 1, 3, and 4 was considerably larger than at checkpoints 2 and 5, which is attributable to the fact that those were the checkpoints at which two students obtained a zero. Note also that all students obtained the maximum possible grade (100) in the last journal checkpoint, because this checkpoint corresponded to completing the survey prepared by the teacher, and students were given full credit if they completed it and commented on some of the questions. The teacher believes that these are indicators of a successful experience, since they show that most students were making the effort and putting in the time to write good journal entries, thus meeting the purpose of reflecting on their learning process.
As it was mentioned in the Introduction, one of the motivations for introducing reflection
mechanisms in our courses was that we wanted students to achieve a deeper and more endur-
ing learning from the course. Being able to measure the degree to which this aim was at-
tained is, however, a difficult task. One way this could be evaluated is to survey students a year (or so) after the course has ended, both on the use of reflection and on key
concepts of the material studied in the course that we would expect them to have learned
deeply. Unfortunately, we haven’t done this yet but it is an idea that we would like to pursue
in the near future.
A major drawback of learning journals was the burden posed by grading since it is very
time-consuming to read over every journal entry (some of them were rather large) and pro-
vide individual feedback to students. Hence, we believe that this reflection mechanism does
not scale up well for large classes (unless the grading is delegated to teaching assistants).
On the other hand, Figure 5 shows the way students see their own preparation for the course
exams, in response to the question “Do you prepare adequately for the course evaluations?”
The answers for this question were significantly better than for the previous one, with 45%
of the students saying they “always” or “almost always” prepare adequately for the exams.
Of most concern for us is the lack of in-class participation from some students, as shown in
Figure 6. This chart summarizes students’ answers to the question “Do you participate, raise
questions and make comments in class?” A total of 54% of students said they “never” or
“almost never” participate with questions or comments during class. In fact, this question prompted comments in the survey such as “the teacher should encourage more student par-
ticipation during class” and “the teacher should make classes more participatory”, underlin-
ing the need for more in-class involvement by the students.
[Figure 4. Student responses (N = 11). Response options: no response, never/almost never, sometimes, almost always/always; 55% answered “almost always” or “always”.]
[Figure 5. Responses to “Do you prepare adequately for the course evaluations?” (N = 11); 45% answered “almost always” or “always”.]
[Figure 6. Responses to “Do you participate, raise questions and make comments in class?” (N = 11); 54% answered “never” or “almost never”.]
Some of our findings from students’ reflection questionnaire were that students needed more
in-class hands-on exercises to grasp the concepts, and that most of their learning was not
done in class but during the development of the term project.
Finally, Table 5 shows the results from the official survey on how the students rated three important characteristics of the course on a discrete 0-to-10 scale. On average, they assigned a grade of 8.7 to the overall course and a grade of 8.9 to the teacher's overall performance, both of which are considered good ratings. Interestingly, they rated their own performance as students in the course with an average grade of 8.5, suggesting that they believe there is room for improvement in some of their academic activities related to this course.
Table 5. Students’ opinion about overall aspects of the course (N=11).
Question Average Std. dev.
Overall course quality 8.7 1.45
Overall teacher performance 8.9 1.97
Overall student self performance 8.5 1.85
Regarding the students' own interpretation of their course performance, the reflection questionnaire included the question “What could you personally do to enhance your learning?” The three most common answers were: attending more classes, doing the pre-class reading more often, and doing more hands-on exercises during class. We will take these observations into account the next time we offer this course.
The main shortcoming of the reflection questionnaires is that, except for the grade, no individual formative feedback was given to the students. Their strength is that, despite not being anonymous, they allowed our students to be truthful and forthcoming in expressing their opinions and ideas; students did not hold back anything about the different aspects of the course and the teacher. Furthermore, the reflection questionnaires, which most students used for the first time, forced them to think about how they learned in the course, making the experience more fruitful.
5. CONCLUSIONS AND FUTURE WORK
Among the lessons learned from our experience, we recommend that the reflection questionnaires be continuously revised and improved, experimenting with different types of questions depending on whether they are anonymous or not. Our study used graded non-
anonymous questionnaires, but it is clear to us that both options (anonymous and non-
anonymous) have advantages and disadvantages, and that the choice made imposes re-
strictions on the kind of questions that can be asked to the students. In the future we would
like to experiment with both anonymous and non-anonymous reflections in the same course,
always having the official Department survey as a complementary assessment method.
Another lesson learned is that before using a reflection mechanism, instructors need to be clear about its main purpose (e.g., formative, summative, or both) and who its principal users are (the Department, the students, or the instructors), in order to choose suitable types of questions and the most suitable method of carrying it out (anonymously or non-anonymously).
Our students are just starting to use reflection to enhance their learning; thus, some of them do not fully appreciate its value yet. We as instructors must do a better job of explaining it at the start of the course by showing them the benefits of reflection using examples and case studies from previous courses.
Both instructors are satisfied with the initial results and plan to continue using reflection in the near future. To that end, we plan to explore how to merge the two approaches described in this paper into one combined reflection method to be applied consistently across several graduate-level courses, in such a way that the students themselves learn to appreciate its value too. This will also entail the development of a rubric for grading students' reflective writing in a more consistent and objective way, as well as the definition of clear and realistic learning/teaching outcomes for the reflection mechanism used (for example, a balance is needed between the teachers' and students' benefits and invested effort).
6. REFERENCES
1. Brown, J.O. Know Thyself: The Impact of Portfolio Development on Adult Learning. Adult Education Quarterly, 52, 3 (2002).
2. Chang, C.C. Construction and Evaluation of a Web-based Learning Portfolio System: An Electronic As-
sessment Tool. Innovations in Education and Teaching International, 38, 2 (2001).
3. Fisher, A. Critical Thinking: An Introduction. Cambridge University Press, United Kingdom (2001).
4. Gillet, D., Nguyen Ngoc, A.V., Rekik, Y. Collaborative Web-based Experimentation in Flexible Engineering Education. IEEE Trans. Educ., 48, 4 (2005).
5. Hamalainen, H., Ikonen, J., Porras, J. Using Wiki for Collaborative Studying and Maintaining Personal Learning Diary in a Computer Science Course. Ninth IEEE International Conference on Advanced Learning Technologies (2009).
6. Kelly, J., Graham, A., Eller, A., Baker, D., Tasooji, A., Krause, S. Supporting Student Learning, Attitude,
and Retention Through Critical Class Reflections. Proceedings of the American Society for Engineering
Education Annual Conference (2010).
7. Kolb, D.A. Experiential Learning: Experience as the Source of Learning and Development. Prentice-Hall,
New Jersey (1984).
8. Kolb, D.A., Boyatzis, R.E., Mainemelis, C. Experiential Learning Theory: Previous Research and New Directions. In: Perspectives on Thinking, Learning, and Cognitive Styles. Lawrence Erlbaum Associates, New Jersey (2001).
9. Moon, J. Reflection in learning & professional development: Theory and practice. Kogan Page, London
(1999).
10. Moon, J. Learning Journals; a handbook for academics, students and professional development,
RoutledgeFalmer, London (2004).
11. Tsai, W.T., Li, W., Elston, J., Chen, Y. Collaborative Learning Using Wiki Web Sites for Computer Sci-
ence Undergraduate Education: A Case Study. IEEE Transactions on Education, 99 (2010).
12. Walther, J., Sochacka, N.W., Kellam, N.N. Emotional Indicators as a Way to Initiate Student Reflection in
Engineering Programs. Proceedings of the American Society for Engineering Education Annual Confer-
ence (2011).