Innovative Teaching Methods
Douglas G. Schmucker
The Pennsylvania State University
Abstract
This paper describes several innovative teaching methods that the author has implemented in four courses in order to
increase student involvement in the lessons. These methods include questioning techniques, physical
demonstrations, team-oriented in-class exercises using toolkits developed by the author, and lesson presentation
techniques. The methods have been significantly inspired by the T4E teaching model, which was developed at the
USMA and whose NSF-sponsored short course the author attended.
Student data both before and after the implementation are included along with faculty assessments. Comments from
other assistant professors who have implemented various aspects of the model are also included as are the author's
anecdotes. In the three semesters of implementation, the author has observed improved student performance as
measured by written exams in addition to positive student and peer evaluations.
1. Introduction
One challenge faced by the author since entering the engineering education profession has been
learning how to use the lesson time as a catalyst for student learning rather than simply as a time
for transmitting information. To help address this challenge, the author attended the one-week NSF-
sponsored short course Teaching Teachers to Teach Engineering (T4E) held at the United States
Military Academy1,2. The first year (Summer, 1996), the author was a participant; the second
year (Summer, 1997), he was a consultant.
This paper describes several innovative teaching methods, including the T4E methods and others
inspired by T4E. The general goal was to increase student involvement in the lesson, with the
underlying premise that increased involvement brings increased student learning. The specific
objectives of the methods all served this goal.
The implementation of these methods at Penn State required the development of a large
number of training aids, re-organization of course material, and extensive lesson preparation.
Basic descriptions of the methods and training aids are provided here along with references for
more details. Student response to the author's instructional approach is provided both before and
after the implementation. The available data includes both university- and instructor-administered
evaluation surveys. Comments from peer reviews are included in addition to those from other
Penn State professors who have implemented various aspects of the T4E model. The paper
closes with interpretation of the data and student comments as well as author anecdotes.
2. Teaching Methods
The T4E teaching model has been a significant source of inspiration for the author and hence will
be briefly described. Detailed descriptions of the motivation behind the model and its various
aspects have been published both in hard copy1,2 and electronically3,4 (on the world wide web). The
T4E model is first and foremost about effective communication. It is a well-defined plan by
which to engage students directly in the lesson presentation. This plan is organized via a well-
defined structure and executed via well-defined presentation skills and techniques. These
include specific questioning techniques, physical demonstrations, and planned exercises. This
paper focuses first on descriptions of individual techniques and then on a method for planning
the use of and effectively executing those techniques within a typical lesson.
2.1 Questioning Techniques
An obvious method by which to directly engage (or involve) students in the lesson is by asking
questions. One challenge for the inexperienced (and sometimes experienced) teacher is how to
select and ask “good” questions. That is, how does one select and ask short, clear, unambiguous,
and non-trivial questions that are also not too difficult?
One helpful approach for selecting a “good” question is to first think about the technique used to
deliver it. The following model characterizes eight techniques, evenly divided into two
categories: Basic and Intermediate. The category refers more to the skill required in using the
technique than to the difficulty of the question itself. The four Basic techniques are the Jump
Ball, the Basic question, the Choir, and the Volunteer; the four Intermediate techniques are the
Misdirected question, the Blind question, the Expert, and the Expert -- Not.
The four Basic Questioning Techniques are relatively self-explanatory. The Basic
question requires that individuals be known either by name or some other distinguishing
characteristic. It is a good fall-back technique to use if no student initially responds to a Jump
Ball question (one thrown open to anyone in the class). The Choir is a useful technique for
eliciting a quick response from the entire class,
particularly for a question that activates or reinforces prior knowledge. The technique conveys
that the answer should be readily known by all and that the question is not a trick question.
The Volunteer question runs the risk of being accompanied by many seconds of silence (the
“null” response) while the instructor patiently waits for a response (and students avert their
eyes!). It is also, however, a wonderful opportunity for the instructor to probe the students'
understanding of the lesson material. For if there is no response, the instructor knows that
something may be wrong. Perhaps the question was too difficult or incomprehensible. One
follow-up technique for the “null” response is to ask the students to show by a raised hand if they
understood the question. Or, ask who did not understand. With this latter approach, it is
imperative that the instructor then select an individual who didn't raise their hand (thus implying
that they understood the question). Frequently, the student really did not understand the question
and did not want to acknowledge it. If this is the case, then either the question can be re-stated or
the answer can be explained. It is particularly important, however, that whoever was finally
“selected” to answer the question be given an opportunity to acknowledge their new
understanding of the question or that they be given a follow-up question to which they can
successfully respond.
The Intermediate questions are designed to keep the students “on their toes.” The Misdirected
Question refers to looking at one student while actually asking a different student the question.
The Blind Question is one in which an individual (or everybody) is asked a question while the
instructor's “back” is to the class, erasing the chalkboard, for instance. The Expert question
is directed to a student who has previously demonstrated expertise. For example, once a student has
successfully answered a question, he or she becomes the “expert” for that lesson or even for that
semester. The Expert -- Not technique reverses this scenario and is used to interject humor but
must be used carefully so as to not offend the student. An opportunity to use this technique is
often found when a student provides an erroneous answer to a trivial question ... particularly if
that student has correctly answered that question in a previous lesson. Another opportunity is
when someone asks a question that has just been asked and answered during that lesson.
These questioning techniques should be used in a pre-planned fashion and with some variety.
They require, however, that the instructor take the risk of stopping his/her “lecture” and actually
communicate with the students. To be a successful questioner, one eventually needs to learn the
students’ names. For class sizes beyond 30 students, learning every student’s name may initially
appear to be insurmountable. Two useful techniques are to identify students at first by their
clothing (while beginning to learn names) and to use a seating chart. The author was initially
reluctant to use a seating chart for fear of negative student response. However, when he used one
during the Fall 1997 semester for two different classes, not one complaint was heard from 93 students.
The strategy of how one weaves these questions into a lesson plan will be discussed later.
2.2 Physical Demonstrations
The nature of most engineering disciplines lends itself to using physical models to demonstrate
lesson topics. This author firmly believes that physical models are an essential part of a balanced
engineering curriculum. The expense, both in finances and in time, of developing the models
can be daunting, however. Indeed, the “cost” of the traditional laboratory is one justification the
author has heard for developing computer simulation/animation programs. A
personal concern for the author, though, is whether students retain or even develop
an understanding of the real physical behavior behind such “computerized” models. This author
asserts that computer models are best used as a supplement rather than as a replacement for
physical models.
Alternatives to the traditional laboratory include seminars where hands-on demonstrations are the
focal point of active exercises and in-class demonstrations5. The structural engineering
curriculum is ideally suited for such an approach. Indeed, the building in which the classroom is
located becomes itself a model that can be discussed during the lesson. Student feedback that
will be discussed later indicates that students particularly enjoy problems that relate to on-
campus facilities --- facilities that the students can “get their hands on.”
After being inspired by the extensive collection at the USMA during the 1996 T4E workshop, the
author developed an extensive number of physical models, demonstration models, and training
aids; the number exceeds 30 at the time of this writing. The most interesting models are
discussed in Meyer et al.6 and Schmucker7; still more may be viewed on the web4. Three
principal objectives guided the development of both the USMA and Penn State models.
One model that the author has developed is the Structural Engineering Toolkit4,7 (SET) shown in
Figure 1. The SET was created specifically to facilitate “spur-of-the-moment”
modeling ideas. The SET is composed of rods and connectors obtained from the commercially
available children's toy K'Nex. The color-coded rods and connectors are easily and quickly
assembled to produce a wide variety of models that represent structural elements and/or entire
structural systems. The resulting models are quite durable and, with imagination, can also be
used to demonstrate specific modes of structural behavior.
The SET also facilitates both in-class and out-of-class exercises that supplement classical and
numerical techniques. Each SET was sized to accommodate exercises for teams of two to four
students. The in-class exercises tend to be focused on understanding relatively simple structural
concepts and/or behavior whereas the out-of-class exercises permit more detailed exploration.
Figure 1: Structural Engineering Toolkit (SET)
The SET is also ideal for design projects where the students must conceptualize and visualize the
project, propose alternative solutions, and even post-analyze the performance of their constructed
design.
2.3 Board Notes
The method that pulls the above techniques together and enables them to be effective is a
lesson preparation device known as Board Notes8. This device is a clever organizational tool and
was rated as one of the most valuable items of the T4E workshop by the participants2.
Board Notes are a specially formatted set of lesson notes consisting of boxes that represent a
segment of the chalkboard. An example sheet of Board Notes for Lesson 8 of the introductory
structural analysis course CE240 is shown in Fig. 2. Within the boxes, the instructor records the
information in the precise manner that it will be placed on the chalkboard (or overhead, PowerPoint
slide, etc.) during the lesson presentation. Some versions of the Board Notes have space
available for the instructor to place reminders, instructions to self, etc. The Board Notes offer
several advantages:
• One is able to see the entire presentation, making it easier for the instructor to see the relationships between
various lesson topics during the lesson in the way that the students will see them;
• The Board Notes are a map of the lesson. Should the instructor become lost or confused during the
presentation, it is easy to recover by glancing at the “picture” on the Board Notes;
• The Board Notes free the instructor from being concerned with what information will be put on the board and
where it will go. Hence, the instructor can focus on the students and how they are receiving the information;
• The Board Notes are self-pacing. For this author, eight to ten boards constitute a 50-minute lesson;
• The Board Notes provide a fairly accurate image of what ends up in students’ notes.
An important aspect not conveyed in Figure 2 is the use of color. Colored chalk is an important
part of the lesson presentation, not merely for aesthetic reasons but also for its pedagogical value.
The different colors represent the hierarchy of ideas. In the author's classes, blue is always used
for major points and green for sub-points. Interim equations are done in white whereas
more important equations are in yellow. For graphics, the colors are invaluable in providing
clarity. For example, an undeflected structure is drawn in white, forces are always drawn in red,
and deflected shapes in yellow. An additional benefit is that the colors add a distinctive feel
to the class.
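For readers who prefer a concrete representation, the following minimal Python sketch (a hypothetical illustration; the names COLOR and Board and the sample content are assumptions of this description, not the author's actual materials) encodes the two organizational ideas just described: one box per chalkboard segment, and a fixed mapping from the role of an item to its chalk color.

```python
from dataclasses import dataclass, field

# The color convention described above: chalk color encodes the hierarchy of ideas.
COLOR = {
    "major point": "blue",
    "sub-point": "green",
    "interim equation": "white",
    "key equation": "yellow",
    "undeflected structure": "white",
    "force": "red",
    "deflected shape": "yellow",
}

@dataclass
class Board:
    """One box on a Board Notes sheet: a single chalkboard segment."""
    items: list = field(default_factory=list)  # (chalk color, text) pairs

    def add(self, role: str, text: str) -> None:
        self.items.append((COLOR[role], text))

# For this author, eight to ten boards constitute a 50-minute lesson.
lesson = [Board() for _ in range(9)]
lesson[0].add("major point", "Review: equilibrium of rigid bodies")
lesson[0].add("key equation", "sum F = 0, sum M = 0")
```

The point of the sketch is only that both the board-by-board layout and the color hierarchy are fixed before class, so that nothing about placement or emphasis must be decided while teaching.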
3. Assessment
3.1 Student Response
Three types of student data and comments were collected: formal university-administered
evaluations, formal instructor-administered evaluations, and formal and informal student
comments. For the purpose of this paper, the distinction between university- and instructor-
administered evaluations is defined by the amount of participation that the instructor has with the
administration of the evaluation. For the university-administered evaluations, students hand
out, collect, and submit the forms for processing. For the instructor-administered evaluations, the
author hands out and collects the forms and submits them for processing.
At the end of each semester, the university administers the Student Rating of Teaching
Effectiveness (SRTE) evaluation. This data has been collected each semester since the author
entered engineering education (Fall 1995). The SRTE data represents the most consistent data
that the author has with which to track the “before” and “after” T4E implementation effects. The
SRTE results and formal student comments are not provided to the instructors in the Civil and
Environmental Engineering Department until the middle of the next semester.
The College of Engineering Instructional Services at Penn State has developed an evaluation
form similar to the SRTE called the Engineering Feedback on Teaching Evaluation (EFTE).
This is an instructor-administered (but university processed) evaluation and has been in broad use
since the Fall 1997 semester. The results of the EFTE are immediately available and hence can
be used to make mid-semester adjustments rather than waiting for future courses.
The author collected EFTE data immediately after each of two interim exams (the morning after
the evening exam) and at the end of the semester; hence there are three EFTE data sets. The
semester-end data was collected in addition to the SRTE data. Informal student comments have
been collected by the author each semester since Fall 1995, but neither the questions nor the
timing of those evaluations was uniform.
The SRTE evaluations ask the student to respond on a rating scale (1 = lowest rating, 4 =
average rating, 7 = highest rating). The EFTE evaluations provide statements to which the
student responds with a degree of (dis)agreement (1 = strong disagreement, 3 = undecided, and
7 = strong agreement). Note that the average, or neutral, value on the SRTE scale is higher than
that of the EFTE (4 versus 3, respectively). This may be significant when comparing results
between the two instruments. In the author's opinion, the EFTE is the more useful data set
because, as will be shown, the student responds to more specific statements than on the SRTE.
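Because the two instruments place their neutral points differently (4 versus 3), identical raw averages do not mean the same thing on both scales. Below is a minimal sketch of one way to compare them, assuming a simple piecewise-linear rescaling that pins each scale's neutral value to zero; the function and sample scores are illustrative assumptions, not part of either evaluation instrument.

```python
def normalize(score, lo=1.0, neutral=4.0, hi=7.0):
    """Map a rating onto [-1, 1], placing the scale's neutral point at 0.

    Ratings above the neutral value scale linearly into (0, 1];
    ratings below it scale linearly into [-1, 0).
    """
    if score >= neutral:
        return (score - neutral) / (hi - neutral)
    return (score - neutral) / (neutral - lo)

# The same raw average reads differently on the two scales:
srte = normalize(5.5, neutral=4.0)  # SRTE (neutral = 4) -> 0.500
efte = normalize(5.5, neutral=3.0)  # EFTE (neutral = 3) -> 0.625
print(f"SRTE 5.5 -> {srte:.3f}, EFTE 5.5 -> {efte:.3f}")
```

Under this mapping, a raw average of 5.5 is a somewhat stronger result on the EFTE than on the SRTE, which is why raw averages from the two instruments should not be compared directly.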
SRTE and EFTE questions selected for presentation are detailed in Tables 1 and 2 of the
Appendix, respectively. Questions were selected for discussion based upon commonality
between the SRTE and EFTE and upon relevance to the methods described in this paper.
Additional EFTE questions of particular relevance will also be discussed.
Response to selected SRTE questions is shown in Figure 3. The questions selected here are
similar to several EFTE questions. The responses to those EFTE questions (only Fall 1997 data
available) correlate significantly with the SRTE data in Figure 3 and hence are not shown. For
reference, the six SRTE questions plotted in Figure 3 are A3, A4, B1, B5, B7, and B10; their
topics are given in Table 1 of the Appendix.
Figure 3: Selected SRTE Data Before and After T4E Implementation (selection is based upon
similar questions from the EFTE; vertical axis: average response for the semester, 1 to 7;
plotted questions: A3, A4, B1, B5, B7, B10; horizontal axis: semesters FA ’95 through FA ’97)
The data for these six questions shows a generally increasing trend with the exception of the
Spring 1997 semester. Note in particular the exceptionally high ratings for instructor preparation
(SRTE B5 and EFTE 19); these averages are all above 6.0. Also important is that the Fall 1997
SRTE data are all above 5.5. Note that the Fall 1997 data includes only the CE240 course. The
author also taught CE447, a senior-level elective course in structural analysis; the Fall 1997
semester was the first time that the author taught this course. Nearly all of its averages were
approximately one full point lower than the CE240 scores. Written student comments indicate
that the primary basis for the lower rating was the difficult and time-consuming course project.
Although many of the students enjoyed the project and felt that it was a valuable experience,
most appeared to feel that it needed to be significantly restructured to fit better within the course
objectives and that more weight should be given to the project in the course grade to reflect the
significant effort required.
It is the opinion of the author that the first two (of three) interim exams during Spring 1997
were a contributing factor in the decline in student ratings during this semester, as observed in
Fig. 3. This is not to say that the author believes that students only give good ratings to
instructors who give easy exams. Rather, the author had particular difficulty in creating exams
for this particular class that were within the reach of the class and also satisfied the author's
opinion of appropriate technical competency. In students’ words, the exams were “too difficult.”
Yet, overall the ratings are generally positive.
Figure 4 shows a similar trend toward improved evaluations for the remaining
SRTE data. Again, there is a dip during the Spring 1997 class. The morale of the Spring 1997
class clearly suffered during the early portion of the semester, and sadly this appears to be
reflected in question B2 (positive atmosphere for learning), where the average drops from 5.9 to
5.4 between the Fall 1996 and Spring 1997 semesters. For the Fall 1997 CE240 course, we
observe an increase to 6.2 for this question, despite a second exam that was quite difficult for a
majority of the class.
Figure 4: SRTE Data for Remaining Questions (vertical axis: average response for the semester,
1 to 7; plotted questions: B2, B3, B4, B6, B8, B9; horizontal axis: semesters FA ’95 through
FA ’97, before and after T4E)
Somewhat curiously, Fig. 4 shows a downward trend for question B8 (flexibility of
teaching methods to accommodate individual differences) after the implementation of the T4E.
This is initially disappointing given that part of the goal of this implementation is to appeal to
different learning styles. The averages are still relatively positive (4.4 for Spring 1997, where 4
means an “average” rating). It would be helpful if follow-up questionnaires were used to
pinpoint the causes of the students’ responses; this points out one of the failings of
end-of-semester evaluations. The response from Fall 1997 suggests that this trend has reversed,
but it does bear watching.
Clearly, there is insufficient data to draw definitive conclusions about the role of T4E in these
trends. However, the generally positive trend and the student comments discussed below
suggest that the students perceive that the author’s teaching is improving.
The EFTE data for selected questions during the Fall 1997 semester are shown in Fig. 5. This
data represents one course, a junior-level structural analysis course. The first two sets of data
were collected 12 hours after each respective exam. Hence, these should be considered to be
somewhat of a “lower bound.” The last set of data was collected during the last week of the
semester. The overall student response is quite favorable. An understandable dip
occurs after the second exam as a reflection of exam difficulty. Once again, the students felt that
time was a factor, although performance as measured by the exam scores was satisfactory.
Figure 5: Remaining EFTE Data (average response by question number, including questions 24,
25, and 30, at three evaluation points: after Exam 1, after Exam 2, and at the end of the semester)
Of particular note in this figure are the responses to questions 4, 23, 24, and 25. Question 4
specifically addresses whether “the method of presenting information in class enhances my
learning.” The end of the semester average was 5.84. Recall that this area was a potential
concern given the Spring 1997 SRTE data. The end of the semester averages for questions 23,
24, and 25 were 5.45, 5.82, and 5.89, respectively. These questions each address specific
presentation techniques. “The instructor should continue to ask questions in the manner he does;
use the presentation style that he does; implement boardwork the way he does,” respectively.
The data with respect to physical models was not included here because the wording of the
statement varied each time. At the end of the semester, given the alternative of spending money
on physical models or on computer software, the students chose the physical models (4.86/7).
Recall that a 3 is neutral for the EFTE.
Student comments have correlated well with both the SRTE and EFTE data. Typical comments
over the past five semesters have been:
“The design projects were interesting, especially the Kunkle Lounge project because you could
see the structure you were evaluating.” (Kunkle Lounge is a facility adjacent to the classroom
building.)
“I find myself at attention almost all of the time in this class due to the instructor.”
“Some of the lectures were interesting especially when visual aids were used.”
“The board presentation, the use of colored chalk to emphasize different forces and deflections.”
(Response to “What did you like best?”)
“Dr. Schmucker always had a positive attitude and was always willing to listen to students, used
student interaction.”
“It wasn't boring like most of my classes. The presentations were interesting (“wacky fun
noodle”) and helpful.”
“The clear organization of lectures.” (Response to “What did you like best?”)
By far, the item least liked about the author's courses over those five semesters has
been the exams. Students clearly feel that they are too long, too difficult, and/or not over
appropriate material. The evaluation data clearly supports these comments but also illustrates
that, although exam difficulty may have a partial effect on the average response, the overall
evaluation of the instructor has not been seriously marked down. The one particular exception
may be the Spring 1997 semester.
Overall, it is the opinion of the author that, although there is room for improvement, the
students are reacting favorably to the implementation of these teaching techniques. This is
further supported by numerous verbal comments by the students outside of class and after the
course is completed.
3.2 Peer Comments
A total of six peer reviews have been performed: three during the first semester of the author's
teaching career and three during the fifth semester. The T4E implementation began during the
third semester. For no particular reason, each peer review occurred during the same course:
CE240 Fundamentals of Structural Analysis. The first-semester evaluations were all positive and
commented that the author appeared to have good interaction with the students and was well
organized. Note from the SRTE data, however, that some of the lowest student marks are from
this same semester. The fifth-semester evaluations were also positive. Two of these evaluations
were a standard part of the promotion and tenure process. The third evaluation was performed at
the request of the author.
Comments are listed below with the semester in which they were collected shown in parentheses.
“This is Mr. Schmucker’s first class of his first semester. My sense is that he enjoys
teaching and interacting with students, and he appears to have good teaching instincts.”
(Fall 1995)
“He showed confidence and the lecture was effective. I think he has a good start for his teaching
career. Continue to enhance the friendly relationship with students in the classroom.” (Fall
1995)
“Dr. Schmucker has obviously put a great deal of effort into the preparation and presentation of
this introductory course. The students appear to appreciate this effort.” (Fall 1997)
“The preparation for this lecture was, I'm certain, intense. The instructor used the physical
model to elicit from the class: (1) a conceptualization of problem (forces, deflections); (2) an
estimate of weights, forces, moments, etc.; (3) a solution which was found by volunteers who
acted as official calculators for class.” (Fall 1997)
“The interaction with the students was nothing short of outstanding. The students were all (50+)
involved, interested and responsive as the engineering problem was defined, analyzed and
solved. ... I know the students enjoyed the participatory approach, and I enjoyed it as well.”
(Fall 1997)
“This was a well-organized and fast-paced class, yet (he) was very effective at maintaining
student participation through questions, discussions, and demonstrations. (He) exhibited
excellent technical knowledge, his communication skills are terrific, and he was in touch with
the student level of competence and progress.” (Fall 1997)
The above teaching techniques and other aspects of the T4E model have been presented to the
Department of Civil and Environmental Engineering at Penn State and at a Penn State College of
Engineering Teaching Workshop. For both audiences, the participants have mentioned that they
particularly like the Board Notes. Said one colleague, “Having the board notes certainly
increased the clarity and organization of what went to the chalk board .... The board notes freed
me from having to think about where items in my notes would be placed on the chalk board.”
4. Closing Remarks
The methods and techniques discussed within this paper are certainly not the only ones available
to effective teachers. However, for this author, they have provided a specific approach by which
to interact with students and begin to understand their approach to the material. These techniques
have also provided a directed avenue by which to interject the author's energy and enthusiasm
for engineering into the classroom.
The T4E model has been particularly helpful in all of these respects. The well-defined structure
of the model (only a part of which was discussed here) provides significant flexibility for
personal style. In many ways, it is a compendium of well-established teaching principles. Its
focus, however, is directed towards specific techniques of how to accomplish the effective
(student-based) communication that is so highly desired. It is in this task-oriented approach that
the author finds much value in the model.
Appendix
Table 1: SRTE Questions
Number Question
A3 Rate the overall quality of this course.
A4 Rate the overall quality of the instructor.
B1 Rate the clarity of the instructor’s explanations.
B2 Rate the instructor’s skill in maintaining a positive atmosphere for learning.
B3 Rate the adequacy of the instructor’s knowledge of the subject matter.
B4 Rate the instructor’s ability to convey his/her experiences with the subject matter.
B5 Rate the instructor in terms of his/her preparation for class.
B6 Rate the effectiveness of the instructor in demonstrating the significance of the subject matter.
B7 Rate the effectiveness of exams in testing understanding and not memorization.
B8 Rate the flexibility of teaching methods used to accommodate individual differences.
B9 Rate the effectiveness of the instructor’s presentations.
B10 Rate the instructor’s skill in encouraging students to think.
References
1. Conley, C.H., Samples, J.W., and Lenox, T.A., “Teaching Teachers to Teach Engineering,” Proceedings, 1996
ASEE Annual Conference, ASEE, June 1996.
2. Samples, J.W., Costello, M.F., Conley, C.H., Lenox, T.A., and Ressler, S.J., “Teaching Teachers to Teach
Engineering: A Year Later,” Proceedings, 1997 ASEE Annual Conference, ASEE, 1997, Session 2230.
5. Kresta, S.M., “Hands-on Demonstrations: An Alternative to Full Scale Lab Experiments,” Journal of
Engineering Education, vol. 87, no. 1, 1998, pp. 7-9.
6. Meyer, K.F., Ressler, S.J., and Lenox, T.A., “Visualizing Structural Behavior: Using Physical Models in
Structural Engineering Education,” Proceedings, 1996 ASEE Annual Conference, ASEE, 1996, Session 3515.
7. Schmucker, D.G., “Models, Models, Models: The Use of Physical Models to Enhance the Structural Engineering
Experience,” Proceedings, 1998 ASEE Annual Conference, ASEE, 1998, Session 3515.
8. Ressler, S.J., Meyer, K.F., and Lenox, T.A., “A Teaching Methodology that Works! Organizing a Class,”
Proceedings, 1996 ASEE Annual Conference, ASEE, 1996, Session 1675.
DOUGLAS G. SCHMUCKER
Dr. Schmucker is an Assistant Professor in the Department of Civil and Environmental Engineering at The
Pennsylvania State University. He begins an Assistant Professor position at Valparaiso University in Fall 1998. He
graduated from Valparaiso University with a
B.S.C.E. in 1990 and earned M.S. and Ph.D. degrees from Stanford University in 1991 and 1996, respectively. He
has taught courses in structural analysis, introductory structural steel design, and structural dynamics.