Towards a Framework for Assessing Computational Competencies for
Engineering Undergraduate Students
Abstract
One of the challenges of preparing engineers for a rapidly changing workplace is providing
the foundations they need to choose among, and use, a wide range of evolving computational tools.
The ability of engineers to understand both engineering and computational principles, which
allows them to select and use computational tools to solve engineering problems by moving
between physical, chemical, or biological systems and their computational abstractions, is crucial to
engineering practice [1, 2, 3]. Our educational challenge is to implement evidence-based
instructional strategies that help engineering students move from a novice understanding of
computing toward competence, while maintaining motivation grounded in their engineering
disciplines. To tackle this challenge and develop innovative instructional practices, it is necessary
to develop the foundational knowledge that uniquely characterizes computational problem-
solving behavior and expertise in engineering practice.
Assessment is a central component of educational reform efforts; the use of valid and accurate
assessment tools enables us to determine the impact of a given intervention on student learning
and to see the relationship between instruction and student learning outcomes.
Assessment has become a central theme in engineering education, and there are several
examples of assessment tools and frameworks aimed at determining what a student knows and can
do in the context of engineering practice [4]. A large body of literature addresses characterizing
and assessing engineering design [5, 6, 7]. However, despite the central role that computation plays
in engineering practice, there is a distinct lack of valid assessments that accurately measure
computational competencies in the engineering context.
This paper describes our progress towards characterizing the skills and behaviors
associated with computational competency as students solve engineering problems. This work will
contribute to our understanding of computational thinking in engineering and bolster the small
body of research devoted to assessing computation in engineering education.
2. Project Overview
The Collaborative Process to Align Computing Education with Engineering Workforce Needs
(CPACE) team developed a partnership among Michigan State University (MSU), Lansing
Community College (LCC), and business and industry leaders to redesign the role of computing
within the engineering programs at MSU and LCC. The project comprised two phases. In
CPACE I, we: a) identified the computational competencies needed in the engineering
workplace, based on employer interviews and employee surveys conducted across a
representative sample of engineering businesses and industries; and b) translated our research
findings into fundamental computer science (CS) concepts that could be used in curricular
implementation, by evaluating three different computational frameworks and developing a
'data-to-CS-concept map'. Figure 1 shows the distribution of the computational competencies
mapped to CS concepts, i.e., the CPACE computational competencies. Our results are consistent
with other research on engineering education; details of the process and findings from CPACE I
are presented elsewhere [1, 8-11].
However, as Alexander (2003) points out, expertise is domain-specific [12, 15]. What is relevant
about a given computational concept for a computer scientist may be far from what is relevant for a
mechanical or chemical engineer attempting to use the concept in practice. Our work in CPACE I
revealed the same point: in order to translate our findings (the computational competencies needed
in the engineering workplace) into fundamental computer science (CS) concepts, we examined
frameworks from computer science [16, 17] but found that they focus on principles and concepts
that reflect the deep understanding of expert computer scientists [9]. The need remains to better
characterize computational competencies as they are applied in the context of engineering practice.
A major effort during CPACE II, and the subject of this paper, is to determine students'
computational skills and capabilities while they solve engineering problems. The guiding research
question is: what features broadly characterize the knowledge, skills, and behaviors
associated with computational competencies for undergraduate engineering students?
A major challenge emerged during our initial analyses of student artifacts using the CPACE
computational competencies framework (Table 1). We discovered that, using the artifacts alone, it
was very difficult to recognize students' computational problem solving (CPS) processes, let
alone measure computational competencies, as they solved engineering problems. To
complement our analyses and understand the complexity of students' computational thinking
processes, we conducted semi-structured interviews with the students. This paper focuses on the
preliminary analyses of the student interviews and describes our efforts to define a
triangulation process across data sources that will allow us to characterize the skills and
behaviors associated with computational competency as students solve engineering problems.
Detailed survey findings are published elsewhere [18, 19].
To better understand the complexity of students' computational problem solving processes,
we conducted semi-structured interviews; the objective was to focus on the role of computation
during the problem solving process. Drawing from qualitative methods, our analyses follow three
major steps common to grounded theory: coding the text for themes, linking themes into
theoretical models, and displaying and validating the models [20].
4.1. Interviews
Our objective was to better understand the process by which students solve engineering
problems using computational tools; specifically, how, when, and why computational tools were
employed during the problem solving process. We targeted courses in both the chemical and civil
engineering disciplines at the sophomore, junior, and senior levels. We conducted a total of
13 face-to-face, one-hour, semi-structured interviews, including three pilots. Operationally,
students enrolled in the target courses were recruited via e-mail within two months of project
completion and offered a monetary incentive. A copy of their project report was provided in
advance of the interview. To prepare for the interview, the interviewers (part of the CPACE
team) reviewed the project specifications and the students' reports. The interview protocol was
designed to probe three main themes: (1) general questions about the project assignment and
its objectives; (2) computational themes, including probes to elicit learning and application of
computational tools to problem solving; and (3) probes to determine conceptual understanding of
computation and the use of computational tools. The last set of questions targeted students'
curricular experiences related to computation. The protocol was piloted three times to refine
the final instrument.
During the interviews, students described their thought processes as they recalled how they solved
the engineering problem; a video recorder focused on the project report captured references to
figures and other relevant data. The ATLAS.ti qualitative analysis software was used to organize and
analyze the interview data. Interviews were organized according to course level; verbatim
transcriptions were linked to the audio so that researchers could concurrently read and
listen to the interview data as they segmented and coded.
The overall process is summarized in Figure 2. The boxes represent distinct steps in the
process, which are detailed in the following sections. The process begins with immersion: the
researcher listens to, reads, and watches all recorded interviews while taking observational notes.
Figure 2. CPACE Qualitative Process: interview coding and alignment to CPACE computational
competencies.
Primary Coding: Each segment of the transcript is coded according to the behavior exhibited.
Primary codes and code definitions are established based on the utterances that broadly
characterize student experiences across interviews. As one codes across interviews, patterns
begin to emerge, allowing for categorization and grouping. For example, a student talking about
understanding a computational tool gives rise to the 'comprehension of tool' (COT) primary
code. Each primary code has a rule of inclusion, or definition, that limits or allows its application.
The code is then applied across interviews each time a participant exhibits 'comprehension of
tool'.
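
To make the mechanics of primary coding concrete, the following Python sketch shows how a codebook of rules of inclusion might be applied across interview segments. It is an illustration only, not part of the CPACE toolchain: the code names, rule text, and keyword cues are hypothetical, and the keyword match merely stands in for the coder's interpretive judgment against the rule of inclusion.

from dataclasses import dataclass, field

@dataclass
class PrimaryCode:
    """A primary code with its rule of inclusion (definition)."""
    name: str                                      # e.g., 'COT' for comprehension of tool
    rule_of_inclusion: str                         # definition limiting/allowing application
    keywords: list = field(default_factory=list)   # naive textual cues, for this sketch only

@dataclass
class Segment:
    """One segment of an interview transcript."""
    interview_id: str
    text: str
    codes: list = field(default_factory=list)

def apply_codebook(segments, codebook):
    """Apply each primary code to every segment whose text matches its cues.

    Real qualitative coding is interpretive; the keyword match below only
    stands in for a coder's judgment against the rule of inclusion.
    """
    for seg in segments:
        for code in codebook:
            if any(kw in seg.text.lower() for kw in code.keywords):
                seg.codes.append(code.name)
    return segments

# Hypothetical codebook entry and transcript segment.
cot = PrimaryCode(
    name="COT",
    rule_of_inclusion="Student demonstrates comprehension of a computational tool.",
    keywords=["understand the tool", "how the solver works"],
)
segments = [Segment("ChE-jr-01", "I had to understand the tool before modeling the reactor.")]
for seg in apply_codebook(segments, [cot]):
    print(seg.interview_id, seg.codes)   # ChE-jr-01 ['COT']
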
Secondary Coding: Once exhaustive primary coding is completed across interviews, the
researcher has a codebook with an associated definition, or rule of inclusion, for each code. Based
on frequencies of occurrence, patterns, and co-occurrences, primary codes that share some
characteristic are grouped into code families. Second-order coding and theme development
focus on refining high-density primary codes into more strictly defined secondary codes,
which include related secondary sub-codes that capture specific skills and behaviors. This
allows the coders to determine alignment or non-alignment with the CPACE computational
competencies framework. Figure 3 depicts an example of the primary and secondary code
structure; it includes the definitions for each of the codes as well as the interview quotes
associated with those codes.
Figure 3. CPACE primary and secondary code structure: the alignment between secondary codes
and interview quotations is color-coded.
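
As a companion sketch (again illustrative, with hypothetical code names and an arbitrary threshold), the frequency-and-co-occurrence step of secondary coding can be expressed as follows. The researcher, not the script, decides whether a surfaced pairing constitutes a meaningful code family.

from collections import Counter
from itertools import combinations

def cooccurrence_counts(coded_segments):
    """Count how often each pair of primary codes appears on the same segment."""
    pairs = Counter()
    for codes in coded_segments:
        for a, b in combinations(sorted(set(codes)), 2):
            pairs[(a, b)] += 1
    return pairs

def candidate_families(coded_segments, min_cooccurrence=2):
    """Surface code pairs that co-occur at least min_cooccurrence times.

    The threshold is arbitrary here; a human researcher judges whether a
    candidate grouping is a meaningful code family.
    """
    pairs = cooccurrence_counts(coded_segments)
    return [pair for pair, n in pairs.items() if n >= min_cooccurrence]

# Hypothetical primary codes assigned to three interview segments.
coded_segments = [
    ["COT", "TOOL_SELECTION"],
    ["COT", "TOOL_SELECTION", "DEBUGGING"],
    ["DEBUGGING"],
]
print(candidate_families(coded_segments))   # [('COT', 'TOOL_SELECTION')]
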
Tertiary Coding: After primary and secondary code/family refinement, the researchers assessed
code alignment against the stricter constraints of the CPACE computational competencies
framework. The tertiary code structure corresponds more specifically to the characteristics and
behaviors encompassed in the CPACE computational competencies framework. However, it is
important to note that using the framework as a guide did not restrict the development of
emergent themes or of additional computational competencies not included in the
initial framework.
Two major categories emerged from these preliminary analyses: (1) Computational Problem
Solving (CPS) and (2) Non-CPS.
Computational Problem Solving (CPS): Includes all code families that reflect behaviors
specifically related to the CPACE competencies framework or that are otherwise related to
computation in definition or content.
Table 2 depicts examples of CPS tertiary coding. It shows the alignment of interview
quotations to descriptors used in the CPACE computational competencies framework. Each
exemplar tertiary code referenced in the table includes its definition, which is based on the
CPACE competencies framework (Table 1).
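
The alignment step itself can be pictured as a mapping from secondary codes to the framework's competencies, with unmatched codes preserved as emergent themes. The sketch below is hypothetical; the code names and mapping are illustrative stand-ins, not the contents of Table 1.

# Hypothetical alignment of secondary codes to CPACE competencies; the
# real descriptors are those of the framework in Table 1.
CPACE_ALIGNMENT = {
    "ALGORITHM_DESIGN": "Algorithmic Thinking & Programming",
    "MODEL_BUILDING": "Modeling & Abstraction",
    "TOOL_LIMITS": "Limitations of Information Technology",
}

def tertiary_code(secondary_code):
    """Map a secondary code to a CPACE competency, or flag it as emergent.

    Unmatched codes are kept rather than discarded, mirroring the point
    that the framework guides but does not restrict theme development.
    """
    return CPACE_ALIGNMENT.get(secondary_code, "EMERGENT:" + secondary_code)

print(tertiary_code("MODEL_BUILDING"))   # Modeling & Abstraction
print(tertiary_code("PEER_LEARNING"))    # EMERGENT:PEER_LEARNING
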
Inter-coder Reliability and Arbitration: Inter-coder reliability tests were performed after primary
and secondary coding to ensure that multiple coders perceived the same constructs/codes as applying
to the same interview data segments. Investigators, including chemical and civil engineers as well
as computer scientists, were given the codebook and a subset of interviews and asked to
code independently. While the final analysis is still pending, preliminary results indicate robust
coding consensus. Furthermore, the exemplars used in this paper include a second round of
inter-coder testing by a member of our team with expertise in both computational science and
educational research methods.
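
As an illustration of how such agreement can be quantified (one common choice for two coders, not necessarily the statistic used in our analysis), Cohen's kappa corrects the observed percent agreement for agreement expected by chance. A minimal sketch, with hypothetical code assignments:

from collections import Counter

def cohens_kappa(codes_a, codes_b):
    """Cohen's kappa for two coders' code assignments over the same segments.

    kappa = (p_o - p_e) / (1 - p_e), where p_o is the observed agreement and
    p_e is the agreement expected by chance from each coder's marginals.
    """
    assert len(codes_a) == len(codes_b) and codes_a
    n = len(codes_a)
    p_o = sum(a == b for a, b in zip(codes_a, codes_b)) / n
    freq_a, freq_b = Counter(codes_a), Counter(codes_b)
    p_e = sum((freq_a[c] / n) * (freq_b[c] / n) for c in set(codes_a) | set(codes_b))
    return (p_o - p_e) / (1 - p_e)

# Hypothetical codes assigned by two coders to six interview segments.
coder1 = ["COT", "COT", "CPS", "CPS", "NON_CPS", "COT"]
coder2 = ["COT", "CPS", "CPS", "CPS", "NON_CPS", "COT"]
print(round(cohens_kappa(coder1, coder2), 2))   # 0.74
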
At the time of writing, we have completed primary and secondary coding for all 13
interviews, representing the chemical and civil engineering disciplines and including
sophomore-, junior-, and senior-level courses. We have also completed most of the tertiary
coding: all interviews from chemical engineering (sophomore, junior, and senior levels) and the
interviews from civil engineering at the sophomore and junior levels. The interview
quotes used in the figures and tables are taken from the chemical engineering interviews. Due to
space restrictions, the tertiary-coding exemplars presented here focus on three CPACE
computational competencies: Algorithmic Thinking & Programming, Modeling & Abstraction,
and Limitations of Information Technology.
Our initial approach used a set of rubrics based on the CPACE computational
competencies framework (Table 1) to compare self-reported survey data about students'
computational abilities with their actual performance on classroom assignments. With this
approach it was very difficult to recognize students' computational problem solving (CPS)
processes as they solved engineering problems. To complement our analysis, we used qualitative
interview data along with the survey data and student artifacts. The interview codification process
and analysis we developed were grounded in the lived experiences of students, which allowed for
a deeper examination of the role computation plays during problem solving. In addition, the
qualitative process helped the CPACE team identify several emergent themes that might further
inform rubric development and future research questions centered on the roles that sociocultural
and disciplinary factors play during problem solving within the engineering context.
The examples presented here represent only a fraction of our qualitative analyses and serve
to demonstrate our process for creating a framework for assessing computational
competencies for undergraduate engineering students. Nevertheless, the exemplars do point to
preliminary understandings of the important role computation plays as students solve complex
engineering problems. We highlight two broad categories, Computational Problem Solving
(CPS) and Non-CPS, each encompassing several related code families that expose a more
focused view of how students approach computational problem solving and operationalize
knowledge. Currently, we are completing the categorization and coding process and the alignment
to the CPACE computational competencies framework.
Initial analyses reveal alignment with our survey findings; one conclusion of our study is that
students have difficulty making connections between different computational tools. Moreover,
students have difficulty utilizing learned computational skills in ways that are integrated and
operationalized across new and different contexts. This points to a limited understanding of
underlying computational principles. Our studies point to the need for a deeper revision of
instructional practices for the teaching and learning of computational competencies, particularly at
the introductory level, to help students develop cohesive computational knowledge based on
computing principles that are well integrated with engineering practice.
The CPACE computational competencies framework helps disaggregate the level at which
students think computationally across curricular levels, and thus allows researchers to address
particular needs. Principally, it represents a step forward in the development of assessment tools
and scoring rubrics to measure computational competencies for engineers. These instruments can
be used for both pedagogical and research purposes; for example, faculty can design instruction
that is better matched to the ways students organize their knowledge of the subject.
Challenges/Limitations:
During the interviews, students were asked to describe or recount their thought processes
surrounding an already completed project, and the time elapsed between the interview and
completion of the project may have affected their responses. Further, by the very nature of
the research and interviews, leading questions and interviewer confirmation of particular
computational processes were difficult to avoid and may have biased answers. Moderator
acceptance bias may also have been present, whereby interviewees provide answers to please
the moderator: respondents might interpret what they believe the moderator wants to hear and
answer accordingly. All instances of suspected bias were noted during the coding process.
Acknowledgment
This material is based upon work supported by the National Science Foundation (NSF) under
award 0939065. Any opinions, findings and conclusions or recommendations expressed in this
material are those of the author(s) and do not necessarily reflect the views of the NSF.
References
[1] Vergara, C. E., Urban-Lurain, M., Dresen, C., Coxen, T., MacFarlane, T., Frazier, K., et al. (2009). Leveraging
workforce needs to inform curricular change in computing education for engineering: The CPACE project.
Computers in Education Journal, XIX(4), 84-98.
[2] Sheppard, S. D., Macatangay, K., Colby, A., Sullivan, W. M., & Shulman, L. S. (2008). Educating engineers:
Designing for the future of the field. Jossey-Bass.
[3] Lattuca, L. R., Terenzini, P. T., & Volkwein, J. F. (2006). Engineering change: A study of the impact of
EC2000. Baltimore, MD: ABET, Inc.
[4] Olds, B. M., Moskal, B. M., & Miller, R. L. (2005). Assessment in engineering education: Evolution,
approaches and future collaborations. Journal of Engineering Education, 94(1), 13-25.
[5] Kilgore, D., Atman, C. J., Yasuhara, K., Barker, T. J., & Morozov, A. (2007). Considering context: A study of
first-year engineering students. Journal of Engineering Education, 96(4), 321-334.
[6] Adams, R. S., & Atman, C. J. (2000). Characterizing engineering student design processes: An illustration of
iteration. Proceedings of the ASEE Annual Conference and Exposition, 2000.
[7] Atman, C. J., Adams, R., Cardella, M. E., Turns, J., Mosborg, S., & Saleem, J. (2007). Engineering design
processes: A comparison of students and expert practitioners. Journal of Engineering Education, 96(4),
359-379.
[8] Vergara, C. E., Urban-Lurain, M., Dresen, C., Coxen, T., MacFarlane, T., Frazier, K., et al. (2009). Aligning
computing education with engineering workforce computational needs: New curricular directions to improve
computational thinking in engineering graduates. Paper presented at the Frontiers in Education Conference,
San Antonio, TX.
[9] Vergara, C. E., Urban-Lurain, M., Dresen, C., Frazier, K., et al. (2011). Computational expertise in
engineering: Aligning workforce computing needs with computer science concepts. ASEE Annual Conference,
Vancouver, BC, AC 2011-1050.
[10] Committee on the Engineer of 2020. (2005). Educating the engineer of 2020: Adapting engineering education
to the new century. Washington, DC: National Academies Press.
[11] The Carnegie Foundation for the Advancement of Teaching. (2008). Educating engineers: Designing for the
future of the field.
[12] Alexander, P. A. (2003). The development of expertise: The journey from acclimation to proficiency.
Educational Researcher, 32(8), 10-14.
[13] Bransford, J. (Ed.). (2000). How people learn: Brain, mind, experience, and school (Expanded ed.).
Washington, DC: National Academy Press.
[14] Byrnes, J. P. (1996). Cognitive development and learning in instructional contexts. Boston, MA: Allyn and
Bacon.
[15] Alexander, P. A., & Murphy, P. K. (1999). Nurturing the seeds of transfer: A domain-specific perspective.
International Journal of Educational Research, 31, 561-576.
[16] Denning, P. J. (2003). Great principles of computing. Communications of the ACM, 46(11), 15-20.
[17] Wing, J. M. (2006). Computational thinking. Communications of the ACM, 49(3), 33-35.
[18] Vergara, C. E., Briedis, D., Buch, N., Esfahanian, A.-H., Sticklen, J., Urban-Lurain, M., Paquette, L., Dresen,
C., & Frazier, K. (2012). Integrating computation across engineering curricula: Preliminary impact on students.
Frontiers in Education Annual Conference, Seattle, WA.
[19] Vergara, C. E., Urban-Lurain, M., Sticklen, J., Esfahanian, A.-H., McQuade, H., League, A., Bush, C. J., &
Cavanaugh, M. (2014). Towards improving computational competencies for undergraduate engineering
students. Paper presented at the ASEE Annual Conference, Indianapolis, IN, June 2014.
[20] Bernard, H. R. (2011). Research methods in anthropology: Qualitative and quantitative approaches (5th ed.).
Plymouth, UK: AltaMira Press.