
Paper ID #26302

Assessing Students’ Prior Knowledge and Learning in an Engineering Management Course for Civil Engineers
Dr. Simon Thomas Ghanat P.E., The Citadel
Dr. Simon Ghanat is an Assistant Professor of Civil and Environmental Engineering at The Citadel
(Charleston, S.C.). He received his Ph.D., M.S., and B.S. degrees in Civil and Environmental Engineering
from Arizona State University. His research interests are in Engineering Education and Geotechnical
Earthquake Engineering. He previously taught at Bucknell University and Arizona State University.
Dr. William J. Davis P.E., The Citadel
William J. Davis is Dept. Head & D. Graham Copeland Professor of Civil Engineering and Director of Construction Engineering at The Citadel in Charleston, SC. His academic experience includes transportation infrastructure planning and design, infrastructure resilience, traffic operations, highway safety, and geographic information systems. His research interests include constructing spatial databases for better management of transportation infrastructure; improving transportation design, operation, safety and construction; understanding long-term effects of urban development patterns; and advancing active living within the built environment for improved public health. He teaches courses in interchange design, transportation engineering, highway design, engineering management, geographic information systems, and land surveying. He has served in numerous leadership positions in ITE, ASCE and TRB.

© American Society for Engineering Education, 2019


Assessing Students’ Prior Knowledge and Learning in an
Undergraduate Engineering Management Course for Civil
Engineers

Abstract

The objectives of this study were (1) to use a pre-test to assess the knowledge of Civil
Engineering students at The Citadel in their understanding of engineering management topics
prior to taking a required junior-level course in engineering management, and (2) to use a post-test to assess student learning resulting from the various pedagogies employed through the course instruction. Additional insight into broader student performance indicators was gained by comparing post-test results with embedded indicator data, which are collected annually, evaluated against department standards, and used in department assessment of student outcomes.

Introduction

Inclusion of engineering management within the curriculum provides beneficial learning experiences for undergraduate engineering students, including expanded professional skills, preparation for successful careers, and bridging of competency gaps [1]. Development of professional and leadership skills has been shown to progressively improve through the college experience when included as part of the curriculum [2]. Placing an emphasis on “softer” engineering skills can complement the traditionally required technical curriculum, where most of the course material is focused on teaching students analytical methods [3]. Preparing graduates with the competencies to function as engineering managers is a strategically important topic for engineering educators and department assessment procedures to address [4].
Engineering management is an important course in the curriculum to engage students in
developing lifelong learning skills, considering global economic issues and understanding the
role of professional societies, beyond traditional analytical course material [5]. To prepare
graduates with expanded professional skills, undergraduate programs are modifying curriculum
and course material to meet the needs of the engineering profession [6].
Engineering Management Course at The Citadel
Engineering Management is a required three-credit hour course for undergraduate civil
engineering students taken during their junior or senior year at The Citadel, and is a prerequisite
for the two-course capstone design sequence. Engineering Management focuses on development
of professional skills needed to prepare graduates for careers in consulting engineering, public
works administration, and construction management. In recent years, the curriculum has been
modified to incorporate expanded professional skill outcomes, as identified by the American Society of Civil Engineers (ASCE) in “A Vision for Civil Engineers 2025” and the ASCE Body of Knowledge (BOK) 2 [7-9]. These landmark policy documents have influenced undergraduate
engineering curriculums, across the U.S. and beyond, to include a more specific list of
professional skills needed to prepare graduates to meet management and leadership challenges in
their future career paths.
ASCE’s “Vision for Civil Engineers in 2025” states graduates should be prepared to lead society in establishing a sustainable world and improving the global quality of life. Future practicing civil
engineers are envisioned to be master builders, stewards of the environment, innovators,
managers of risk, and leaders of public policy [7-9]. The ASCE Body of Knowledge (BOK) 2
provides an aspirational foundation for how engineering programs should prepare civil
engineering students to meet ever increasing societal and public policy demands for engineering
practice [10]. Based on this vision for future engineers set forth in ASCE BOK 2, faculty in the
Department of Civil and Environmental Engineering (CEE) adopted 22 student outcomes, eight
(8) of which are directly focused on developing student professional skills and competencies. As
shown in Table 1, all eight (8) of these outcomes are included as course objectives in
Engineering Management, identified with adopted levels of Bloom’s Taxonomy. Assessing
student achievement of fundamental course objectives is relatively straightforward through
application of course Embedded Indicators [11, 12].

Table 1 – Engineering Management Course objectives and Bloom’s Taxonomy


Course Objective (Bloom’s Taxonomy level)
1. Explain lifelong learning skills needed for successful engineering careers. (3-Application)
2. Apply key aspects of project management and scheduling within an engineering context. (3-Application)
3. Demonstrate the ability of multidisciplinary teams to effectively examine engineering solutions. (3-Application)
4. Use key business concepts to illustrate effective approaches to business development, project relationships, proposal submittal, and consultant selection. (4-Analysis)
5. Relate characteristics of effective communication to project design, alternatives evaluation, and recommended solutions. (4-Analysis)
6. Recognize fundamental influences of public policy on engineering standards, design requirements, and professional practice. (2-Comprehension)
7. Explain legal and ethical responsibilities of professional engineers. (2-Comprehension)
8. Identify leadership principles and proficiencies used to address challenges within the engineering profession. (2-Comprehension)

CEE Department Embedded Indicators

Departmental outcomes aligning the curriculum with professional skills were established to link course goals in a course-by-course strategy for student development. An essential
component of this plan was adoption of Embedded Indicators, aligned with CEE Department
outcomes, and mapped across all four years of the undergraduate curriculum. Embedded
Indicators are mapped to appropriate Bloom’s Taxonomy levels and organized sequentially to
provide a progression of student learning and instructional development.
After students are exposed to instructional material, Embedded Indicators collectively measure
student performance as determined by graded test questions, assignments, reports and projects
commonly used by instructors to assess student learning. Prior to teaching a Civil Engineering course, faculty pre-identify specific Embedded Indicator tools for use in measuring each goal contained in the course syllabus. Throughout the semester, students are assessed using the pre-designated tools. If the student performance average for an Embedded Indicator is 75% or higher, it is concluded that students have collectively achieved the appropriate learning requirements and met departmental standards. Example work from three representative students (good, average, poor) is included with an Embedded Indicator summary that provides an assessment of student performance and is mapped to the appropriate program outcomes (1-22) and Bloom’s Taxonomy level.
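
To make the 75% standard concrete, the following minimal sketch (Python, with invented scores; the department's actual assessment tooling is not described in this paper) checks a class average for one Embedded Indicator against the departmental threshold.

# Minimal sketch (not the department's actual tooling): check whether the
# class average for an Embedded Indicator meets the 75% departmental standard.
def indicator_met(scores, threshold=75.0):
    """Return the class average (percent) and whether it meets the standard."""
    average = sum(scores) / len(scores)
    return average, average >= threshold

# Hypothetical graded scores (percent) on a pre-designated question or assignment.
business_concepts_scores = [55, 70, 82, 64, 48, 77]
avg, met = indicator_met(business_concepts_scores)
print(f"average = {avg:.1f}%, standard met: {met}")   # average = 66.0%, standard met: False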

Pedagogical Techniques Employed in Course

Students learn more effectively by actively analyzing, discussing, and applying content in
meaningful ways rather than by passively absorbing information [13]. Various teaching and
learning techniques were employed to improve the student learning of key concepts in
engineering management.

To assist students with learning course material, each student was required to teach a lesson during the semester. This method benefits both the students being taught and the peer teachers [14, 15]: peer teachers reinforce their own learning by instructing others, and students feel more comfortable when interacting with a peer [14, 15]. Daily quizzes
on assigned reading were administered at the beginning of class. These quizzes were given to
increase students’ attendance, preparation, participation, study habits and to improve exam
scores. Short YouTube videos were shown daily to facilitate and stimulate some introductory
discussions on each day’s topic. One-Minute papers [16] were used to monitor student learning
and address students’ misconceptions and preconceptions. Students were typically asked to
write a concise summary of the presented topic, write an exam question for the topic, or answer
in 60 seconds a big picture question from the material that was presented in the current or
previous course lesson.

In-class debates cultivate active engagement of students, placing responsibility for comprehension on the shoulders of the students [17]. Debates afford many benefits in addition to promoting active engagement and mastery of the content [17]. Because debates require listeners and participants to evaluate competing choices [17], they develop higher-order critical thinking skills by advancing up levels of Bloom’s Taxonomy [15, 17, 18]. For these reasons, debates of ethical dilemma case studies were employed to further facilitate active learning and promote critical
thinking skills. Students were provided with three ethical dilemma case studies. The class was
divided into six teams; two teams were assigned to each case. The members of each team
worked together to prepare a solution to their ethical dilemma, which they presented to the class.
Students were required to devise a solution and to explain and defend it through an ethically-based argument. Each team was required to prepare a presentation consisting of three slides. Teams were assessed based on the strength and delivery of their ethical argument. Every student was responsible for being familiar with all three cases.
An icebreaker activity was used to facilitate teamwork prior to the beginning of the term project. Each team was asked to build the tallest free-standing structure in 18 minutes out of 20 sticks of spaghetti, one yard of tape, one yard of string, and one marshmallow. This activity was a great way for each team to get acquainted and dive into the dynamics of effective teamwork.
To further influence understanding of engineering management concepts, students were asked to
conduct an in-depth study of on-campus parking at The Citadel. The project was predicated on
data demonstrating that the demand for parking has dramatically increased over the past few
years and often parking facilities cannot meet the parking demand, resulting in a variety of
problems that could be resolved through engineering-based solutions. Students were asked to
prepare an engineering proposal to be submitted to the university decision makers and
transportation engineering faculty in the CEE Department. Each team prepared a proposal containing a detailed scope of work; a description of parking issues; a preliminary evaluation of available data; identification of possible solutions; establishment of a project management plan; and a schedule of tasks needed to develop project deliverables and engage stakeholders.
Assessment of Prior Knowledge and Learning

A commonly accepted assessment instrument useful for both diagnostic and formative purposes
is the concept inventory [19, 20, 21], which refers to any kind of research-based assessment
technique that measures conceptual understanding [19, 20, 21]. Use of concept inventories helps instructors measure the effectiveness of their teaching [20, 21] and determine whether students have a correct understanding of important concepts on specific topics. When the same set of
questions is used, concept inventories may help in evaluating students’ pre- and post-knowledge
on a subject. Pre-tests establish students’ prior knowledge on a subject, and post-tests measure
the learning at the end of the educational experience [19, 20, 22]. These types of tests are also
helpful in distinguishing between learning and performance [22].
Students’ academic performance, such as examination scores, is often used as a proxy for learning [23-26] but does not represent direct evidence of learning [26-29]. Learning is different from performance. Learning is the difference between what students know at the beginning of the semester and what they know at the end of the semester [22]. Performance is the demonstration of mastery of course material, such as a correct answer to a question on the final exam. This distinction between
learning and performance is important since students enter courses with unequal and varying
levels of knowledge, skills, and educational experiences [22]. Consequently, pretests are useful
in establishing prior knowledge, and post-tests are effective in measuring student learning [22].

An eight-question pre- and post-test was developed based upon key concepts in the engineering management course (see Table 2). The pre-tests were administered to measure students’ prior engineering management knowledge and to identify student misconceptions at the beginning of the semester. The same short-answer test was administered on the last day of the term to assess knowledge gained as a result of the course experience. Each question was scored against an established correct answer. One of the authors graded the short-answer test for all sections to ensure uniform application of the grading rubric. When grading the pre- and post-test instruments, a
professor looked for key words, phrases and concepts. The professor scored each of the eight
questions with a bracketed score of zero (no credit for incorrect answer or no answer), 0.5
(partial credit for partially correct answer), or one (full credit for correct answer). It is important
to note neither the pre-test nor post-test counted toward the course grade.
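
For illustration only, the bracketed 0 / 0.5 / 1 rubric described above can be aggregated into a percentage score as in the short sketch below (Python; the per-question scores shown are hypothetical, not data from the study).

# Sketch of the bracketed scoring rubric: each of the eight questions earns
# 0 (incorrect or blank), 0.5 (partially correct), or 1 (correct), and the
# total is reported as a percentage. Question scores below are invented.
def test_percentage(question_scores):
    assert len(question_scores) == 8, "the instrument has eight questions"
    assert all(s in (0, 0.5, 1) for s in question_scores)
    return 100.0 * sum(question_scores) / len(question_scores)

pre  = [0, 0, 0.5, 1, 0, 0, 1, 0.5]    # e.g., one student's pre-test scores
post = [1, 1, 0.5, 1, 0.5, 1, 1, 1]    # the same student's post-test scores
print(test_percentage(pre), test_percentage(post))   # 37.5 87.5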

Table 2. The short-answer questions for Engineering Management pre- and post-test (n=57)
No. Question
Q1 Describe what critical path is in project management.

Q2 Define acronym RFP in engineering management.


Q3 What are the characteristics of a project? What are the three key project constraints?

Q4 Explain the need for lifelong learning and describe skills required of a lifelong learner

Q5 Describe the term multiplier, which is commonly used in the consulting business.

Q6 What are the consequences of uncompensated scope creep?

Q7 Explain the role of a leader and list important leadership principles and attitudes
Q8 What is the difference between Quality Control and Quality Assurance?

The pre-test and post-test questions are strategically mapped to course goals and learning objectives. Each of the eight course goals is linked to a specific student learning outcome as part of the Department Assessment process. A list of 23 student learning outcomes was developed and aligned with the ASCE Body of Knowledge (BOK) 2 [7-9]. Course
embedded indicators are used to measure overall student performance through an assignment or
test question aligned with a student learning outcome, largely focusing on professional
skills. Currently, the pre-test and post-test results are stand-alone measures of student
knowledge growth, which are not used to support the Department Assessment process.

Results and Discussion

Research Question 1: To what degree do junior or senior Civil Engineering majors at The
Citadel have exposure to engineering management prior to the introductory engineering
management course?
Figure 1 shows students’ mean score (in percent) on each question of the pre-test. The pre-test mean of overall scores ranged from 13% to 19% across sections. The pre-test scores for individual students ranged from zero to 46%. All students scored below 10% on Questions Q1, Q2, Q5, and Q6. Student performance below the 50% level on every pre-test question is extremely poor, indicating little to no prior exposure to these concepts. The highest pre-test score was on Question 7 (explain the role of a leader and list important leadership principles and attitudes), an important theme that students went on to master in the Engineering Management course. The lowest scores on the pre-test were on Question 1 (critical path in project management), Question 2 (defining the acronym RFP), Question 5 (describe the term multiplier), and Question 6 (consequences of uncompensated scope creep).

The pre-test standard deviation of overall scores ranged from 8% to 37% (Figure 2). The pre-test standard deviation for each question, a measure of student performance variation, is also shown in Figure 2 and ranged from zero to 30%. The pre-test standard deviations for Questions 1 through 8 ranged from 10% to 13%, 0 to 19%, 22% to 30%, 26% to 29%, 0 to 11%, 13% to 19%, 13% to 23%, and 21% to 25%, respectively.

[Bar chart omitted in text extraction; vertical axis: Mean (%); horizontal axis: Q1-Q8 and total; series: 2018 Section 1 (n = 19), 2018 Section 2 (n = 24), 2017 (n = 14).]

Figure 1. Mean score for each question on the pre-test

[Bar chart omitted in text extraction; vertical axis: Standard Deviation (%); horizontal axis: Q1-Q8 and Total; series: 2018 Section 1 (n = 19), 2018 Section 2 (n = 24), 2017 (n = 14).]

Figure 2. Standard deviation of each pre-test question


Research Question 2: What do the students gain in conceptual understanding about engineering
management from the beginning of the course to the end?

The same short-answer test shown in Table 2 was administered on the last day of the term to assess knowledge gained as a result of the course educational experience. Figures 3 and 4 illustrate the mean and standard deviation of scores on the post-test in this study. The post-test mean and standard deviation of overall scores across Questions 1-8 ranged from 50% to 84% and from 10% to 23%, respectively.

Based on the results summarized in Figures 1 and 3, students made considerable progress in gaining an improved conceptual understanding of engineering management concepts through completion of the engineering management course. Measured gains in students’ conceptual understanding were 37% and 69% for 2018 and 2017, respectively. There was an increase from an overall average percentage correct of 16% on the pre-test to 61% on the post-test, across all eight questions and a sample size of 57 students.
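
As a minimal sketch of this calculation (Python, with invented per-student percentages rather than the study data), the overall gain is simply the post-test mean minus the pre-test mean for the same group of students.

# Sketch: overall learning gain as post-test mean minus pre-test mean.
# Per-student percentages below are illustrative only.
pre_scores  = [13, 19, 25, 6, 31]      # pre-test percent scores
post_scores = [56, 63, 75, 50, 69]     # the same students' post-test percent scores

pre_mean  = sum(pre_scores) / len(pre_scores)
post_mean = sum(post_scores) / len(post_scores)
print(f"mean gain = {post_mean - pre_mean:.1f} percentage points")   # 43.8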

[Bar chart omitted in text extraction; vertical axis: Mean (%); horizontal axis: Q1-Q8 and Total; series: 2018 Section 1 (n = 19), 2018 Section 2 (n = 24), 2017 (n = 14).]

Figure 3. Mean score for each question on the post-test

[Bar chart omitted in text extraction; vertical axis: Standard Deviation (%); horizontal axis: Q1-Q8 and Total; series: 2018 Section 1 (n = 19), 2018 Section 2 (n = 24), 2017 (n = 14).]

Figure 4. Standard deviation for each question on the post-test


A statistical analysis was conducted on all pre-test and post-test data to detect changes in students’ understanding of the engineering management concepts over the course of the semester. Comparison of the pre- and post-test scores was completed using the paired t-test at the five percent level of significance, and the results are shown in Table 3. The next step in analyzing pre- to post-test gains was to evaluate changes in correct responses for individual questions. The paired-sample t-test was conducted for each question to test for statistically significant differences between pre- and post-test scores. Comparison of student performance across all sections showed that students performed similarly on each question and on overall score when measuring conceptual understanding from pre-test to post-test (see Table 3). For the 2017 section, all eight questions showed a statistically significant difference between the pre- and post-tests (all p < 0.001). The comparison of sections showed that students in Section 1 did not show significant gains on Question 3, the characteristics of a project and key project constraints, with p > 0.05 (see Table 3). Furthermore, students in Section 2 did not show significant gains on Questions 6 and 8.
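
A sketch of this per-question comparison using SciPy's paired t-test is shown below; the score arrays are hypothetical placeholders, and the paper does not specify the software actually used for the analysis.

# Sketch of the paired t-test at the 5% significance level for one question.
# Pre/post arrays are invented placeholders, not the study's data.
from scipy import stats

pre_q3  = [0, 0.5, 0, 0, 0.5, 0, 0, 0.5, 0, 0]     # one section's Q3 pre-test scores
post_q3 = [0.5, 1, 0.5, 0, 1, 0.5, 1, 0.5, 1, 0]   # the same students' post-test scores

t_stat, p_value = stats.ttest_rel(post_q3, pre_q3)
print(f"paired t = {t_stat:.2f}, p = {p_value:.4f}, significant at 0.05: {p_value < 0.05}")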

In an evaluation of post-test results for the combined 24 question responses (eight questions across three sections), 21 of 24 demonstrated statistically significant differences between paired pre-test and post-test results (all p < 0.001), equating to an 87 percent effectiveness for identified course concepts over the three section offerings of engineering management from 2017 to 2018.

Table 3. Paired t-test results for the pre- and post-test scores

Measure    Section 1 (df = 18)               Section 2 (df = 23)               Section 3 (df = 13)
           Mean Diff (%)  Paired t  p-value  Mean Diff (%)  Paired t  p-value  Mean Diff (%)  Paired t  p-value
Total      37             7.04      < 0.001  37             10.57     < 0.001  69             26.60     < 0.001
Q1         73             8.32      < 0.001  86             15.2      < 0.001  89             15.7      < 0.001
Q2         24             2.46      < 0.001  54             6.03      < 0.001  79             7.78      < 0.001
Q3         13             1.09      > 0.05   29             4.37      < 0.001  60             5.67      < 0.001
Q4         40             6.43      < 0.001  46             6.26      < 0.001  86             8.0       < 0.001
Q5         31             3.31      < 0.001  25             3.14      < 0.001  89             15.7      < 0.001
Q6         40             4.03      < 0.001  13             1.54      > 0.05   68             6.03      < 0.001
Q7         37             4.38      < 0.001  38             4.98      < 0.001  88             7.32      < 0.001
Q8         36             3.44      < 0.001  6              0.83      > 0.05   71             8.27      < 0.001


Course Embedded Indicator Data and Analysis

Embedded Indicator data from 2017 and 2018 used for course assessment is tabulated and
presented in Table 4. The following observations are noted:
 Relatively small variations exist in the data tabulations for these eight course outcomes over the two-year analysis period, with an overall student performance of 82.5%.
 A possible explanation for the lower performing outcomes is the difficulty of conveying complicated engineering business and project management nuances, needed for successful future careers, to undergraduate students who have little experience or knowledge of real-world situations. Also, students knew in advance that they would receive a grade for Embedded Indicator questions, which are typically implemented on tests and/or major projects, while the pre-test and post-test were not administered as graded assignments.

Table 4. Summary of engineering management embedded indicator outcomes

Outcome                        Bloom's   Embedded Indicator Tool   Mean 2017 (n = 40)   Mean 2018 (n = 45)
Lifelong learning              3         Homework                  77                   85.8
Project Management             3         Final Exam                76                   78.8
Inter-disciplinary Teamwork    3         CATME                     83                   89.2
Business Concepts              4         Proposal                  55                   87.6
Communication                  4         Presentation              90                   92.7
Public Policy                  2         Final Exam                85                   78.9
Ethical Responsibility         2         Assignment                86                   76.7
Leadership                     2         Final Exam                88                   88.4

A comparison of the mean post-test and Embedded Indicator results is provided in Figure 5. Post-test combined results trailed Embedded Indicator combined results by an average of 21% over the two-year period. Concept inventories are widely credited as a viable means to provide reliable data and to positively influence pedagogical practices [21]. Results from the post-test concept inventory are helpful in distinguishing between student learning and performance [22].
[Bar chart omitted in text extraction; vertical axis: Mean (%); horizontal axis: Year (2017, 2018); series: Post-test, Embedded Indicators.]

Figure 5. A comparison of post-test and embedded indicators results

Conclusions
Based on the pre- and post-test results, students experienced measurable gains in conceptual understanding of engineering management concepts during the course, across all three sections taught in 2017 and 2018, a total sample size of 57 students. There was an increase from an overall average percentage correct of 16% on the pre-test to 61% on the post-test, over the eight questions posed to students. In an evaluation of post-test results for the combined 24 question responses, 21 of 24 demonstrated statistically significant differences between paired pre-test and post-test results (all p < 0.001), equating to an 87% effectiveness for identified course concepts. Because the post-test was unannounced, the results provide a true measure of student learning. The pre-test to post-test changes in overall scores were influenced by the variety of pedagogical techniques used for the course, as described in this paper.

Future research should expand use of the pre-test and post-test instrument to subsequent engineering management course offerings, increasing the sample size for improved insight into student learning and performance across this important course subject matter. Additionally, post-test results should be compared with Fundamentals of Engineering subject-area results and corresponding Senior Exit Survey questions that map to program outcomes of interest. Lastly, course professors should continue to identify and administer teaching techniques, instructional methods, and strategic assignments and exercises to continue to improve student learning and performance.

References
[1] R. Unal, C. Keating, P. Kauffmann, and W. Peterson, “Engineering Management: The Minor of Choice,” Proceedings of the American Society of Engineering Education Annual Conference, Montreal, Canada, 2002.

[2] P. Dunn, B. Pearce, “Introducing Project Management to Senior Civil Engineering Students,” Proceedings of
the American Society of Engineering Education Annual Conference, Chicago, IL, 2006.
[3] E.T. Pascarella and P.T. Terenzini, How College Affects Students: Volume 2, A Third Decade of Research, 2005.

[4] D. Merino, “A Proposed Engineering Management Body of Knowledge (Embok)” Proceedings of the American
Society of Engineering Education Annual Conference, Chicago, IL 2006.

[5] S. Murray and S. Raper, “Encouraging Lifelong Learning for Engineering Management Undergraduates,” Proceedings of the American Society of Engineering Education Annual Conference, Honolulu, Hawaii, 2007.

[6] W. Davis, K. Bower, R. Welch, D. Furman, “Developing and Assessing Student’s Principled Leadership Skills:
to achieve the Vision for Civil Engineers in 2025,” Proceedings of the 120th. American Society for Engineering
Education Annual Conference, Atlanta, GA, 2013.

[7] The Vision for Civil Engineering in 2025, American Society of Civil Engineers, Reston, VA, June 2006.

[8] Achieving the Vision for Civil Engineering in 2025: A Roadmap for the Profession, American Society of Civil
Engineers, Reston, VA, 2009.

[9] Civil Engineering Body of Knowledge for the 21st Century: Preparing the Civil Engineer for the Future, Second Edition, Committee on Academic Prerequisites for Professional Practice, American Society of Civil Engineers, Reston, VA, 2008.

[10] S.G. Walesh, "The Raise The Bar Effort: Charting The Future By Understanding The Path To The Present –
The BOK and Lessons Learned," Proceedings of the American Society for Engineering Education Annual
Conference, Austin, TX, 2012.

[11] C. Bonwell and J. Eison, Active learning: Creating excitement in the classroom. Washington, D.C.: Jossey-Bass, 1991.

[12] C. Meyers and T. Jones, Promoting active learning: Strategies for the college classroom. San Francisco: Jossey-Bass, 1993.

[13] S.T. Ghanat, J. Kaklamanos, K. Ziotopoulou, I. Selvaraj, and D. Fallon, “A Multi-Institutional Study of Pre- and Post-Course Knowledge Surveys in Undergraduate Geotechnical Engineering Courses,” Proceedings of ASEE, New Orleans, LA, 2016.

[14] N.A. Whitman, and J.D. Fife, Peer Teaching: To Teach Is To Learn Twice. ASHE-ERIC Higher Education
Report No. 4. 1988.

[15] S.T. Ghanat, and W. Davis, “Pedagogical Techniques Employed in an Engineering Management Course,”
proceedings of ASEE-SE, Daytona Beach, FL, 2018.

[16] T.A. Angelo, and K.P. Cross, Classroom Assessment Techniques A Handbook for College Teachers: 2nd ed,
Jossey-Bass Publishers, San Francisco, CA, 1993.

[17] A. Snider, and M. Schnurer, Many sides, Debate across the curriculum. New York: International Debate
Education Association, Upper Saddle River, N.J, 1999.

[18] A. Freeley and D. Steinberg, Argumentation and Debate: Critical thinking for reasoned decision making, 11th ed., Belmont, CA, 2005.

[19] S.T. Ghanat, J. Kaklamanos, I. Selvaraj, C. Walton-Macaulay, and M. Sleep, “Assessment of students’ prior knowledge and learning in an undergraduate foundation engineering course,” Proceedings of the American Society for Engineering Education Annual Conference and Exposition, Columbus, Ohio, 25–28 June 2017.
[20] T. Reed-Rhoads, and P.K. Imbrie, “Concept inventories in engineering education,” School of Engineering
Education, Purdue University.

[21] A. Madsen, S.B. McKagan, and E.C. Sayre, “Best practices for administering concept inventories,” The Physics
Teacher, vol. 55, no. 9, pp. 530-536, 2017.

[22] M. Delucchi, “Measuring student learning in social statistics: A pretest-posttest study of knowledge gain,”
Teaching Sociology, vol. 42, no. 3, pp. 231-239, 2014.

[23] R.C. Borresen, “Success in Introductory Statistics with Small Groups.” College Teaching 38(1):26–28, 1990.

[24] M. Delucchi, “Assessing the Impact of Group Projects on Examination Performance in Social Statistics.”
Teaching in Higher Education 12(4):447–60, 2007.

[25] D.V. Perkins, and R.N. Saris, “A ‘Jigsaw Classroom’ Technique for Undergraduate Statistics Courses.”
Teaching of Psychology 28(2):111–13, 2001.

[26] S.Yamarik, “Does Cooperative Learning Improve Student Learning Outcomes?” Journal of Economic
Education 38:259–77, 2007.

[27] P. Baker, “Does the Sociology of Teaching Inform Teaching Sociology?” Teaching Sociology 12(3):361–75,
1985.

[28] J. Chin, “Is There a Scholarship of Teaching and Learning in Teaching Sociology? A Look at Papers from 1984
to 1999.” Teaching Sociology 30(1):53–62, 2002.

[29] B. Lucal, C. Albers J. Ballantine, J. Burmeister-May, J. Chin, S. Dettmer, and S. Larson. “Faculty Assessment
and the Scholarship of Teaching and Learning: Knowledge Available/Knowledge Needed.” Teaching Sociology
31(2):146–61, 2003.
