E-Assessment for Engineering Drawing
6 & 7 SEPTEMBER 2018, DYSON SCHOOL OF DESIGN ENGINEERING, IMPERIAL COLLEGE, LONDON,
UNITED KINGDOM
ABSTRACT
An effective method by which students learn the fundamentals of drawing practice is hand drawing
part and general assembly engineering drawings. These drawings are then marked to assess the
students’ knowledge and understanding. However, when the student cohort grows to several hundred, the time taken to mark the detail within a portfolio of drawings becomes significant.
Simply reducing the number of drawings would deprive students of learning how to draw a wider
range of components and assemblies, so does not represent an ideal solution.
This paper discusses how an online assessment is used in conjunction with existing hand drawings to
reduce the time taken to mark coursework. The online test comprises five different question types, including matching pairs, multiple-choice, and numerical, which are used to mark different aspects of
drawing practice. The paper describes why these question types are used and how they can test for
specific knowledge and understanding, which then need not be reassessed in the portfolio of hand
drawings. Also discussed is the importance of the portfolio of engineering drawings as an effective
means for developing students’ technical drawing skills.
1 INTRODUCTION
This paper discusses the experience of using an online test or e-assessment as part of the engineering
design teaching in the first year of an MEng degree. The MEng degree programmes are accredited by
engineering institutions, such as the IMechE, and therefore must adhere to the Output Standards of the
UK-SPEC (UK Standard for Professional Engineering Competence), which is adopted by QAA
(Quality Assurance Agency for Higher Education) as the subject benchmark statement for engineering
[1]. Our teaching of engineering design within the Design unit should be mindful of “Design at this
level is the creation and development of an economically viable product, process or system to meet a
defined need…Graduates will need the knowledge, understanding and skills to:” meet the specific
learning outcomes for Design, which is one of the five engineering-specific areas of learning an
accredited degree must demonstrate. This paper focuses on three of the six terms of Interpretation for Design, namely knowledge, understanding and skills, when introducing students to drawing practice for the first time in semester one of the first year. Once students have this underpinning technical drawing ability, they design a product in semester two to meet a defined need.
The assessment strategy for the Design unit prior to the e-assessment was based entirely on
coursework. Students would develop their drawing skills by creating engineering drawings of both
individual parts and an assembly that together formed a portfolio. The portfolio is only one of the
design assessments in the first year, but it is the main focus of this paper. The portfolio is an effective
means of developing drawing skills because each drawing exercise is chosen to increase the technical
breadth of student knowledge, and in so doing provides many areas of the drawing standard on which to assess and give specific feedback. However, when the student cohort grows to several hundred, the time taken to mark the detail within a portfolio of drawings becomes significant. The longer marking takes, the longer students must wait before feedback can be given.
This is undesirable because feedback is essential for student learning, showing where to improve; if feedback is delayed, students are deprived of the opportunity to apply their learning in the following exercise or assessment.
2 ASSESSMENT PLANNING
The objective of reducing the time taken to mark the portfolio of engineering drawings was not a
simple case of just reducing the number of drawings. It was important to recognize that the majority of
students have no drawing experience whatsoever, and therefore, an appropriate teaching strategy is to
gradually step up their learning with several drawing exercises. To do this, the semester is planned to
give students a new drawing exercise each week, so they can repeatedly practise applying their knowledge and deepen their understanding, while also broadening their technical ability as each consecutive drawing increases in difficulty. Furthermore, keeping a higher number of drawings gives educators the flexibility to break down certain areas of the drawing standard [2], formerly BS 308-1:1993, into smaller chunks of learning across several assessments, making them easier for students to digest.
This initial mapping supports three decisions. Firstly, if the long-established process of students learning by completing a series of drawings is to change, then it is important to know where the different aspects of these drawings develop the terms of Interpretation and enable students to meet the learning outcomes of the Design unit. When the objective is to reduce marking time, it would be easy simply to reduce the number of drawings, but this might leave unacceptable gaps in the grid in table 1, which shows how the Design unit’s content maps across the terms of Interpretation. The second decision is how the existing summative assessment should change from the format where detailed feedback is given on each and every engineering drawing. This style of feedback is valuable in the
learning process because it pinpoints areas within the portfolio with specific information describing
how each student can improve. Unfortunately, this comprehensive feedback at a detailed level in each
drawing is very time-consuming for large cohorts. The third and final decision is to identify which drawing specifics are relatively time-consuming to mark, and whether or not they could be tested instead by an e-assessment.
Table 2. Use of online question types to test UK-SPEC’s terms of Interpretation

                                       Knowledge    Understanding
  Moodle      MCQ                          X
  question    Numerical question                          X
  types       Yes/No                       X
              Matching pair               (X)             X
              Single word                  X
Comparing both tables, a useful time-saving option is to use the numerical question type for marking tolerances. Two of the drawings in the portfolio require students to calculate tolerances from the
Limits and Fits standard, BS4500A. A correct numerical answer in the e-assessment is only achieved
if students understand how to interpret the alphanumeric reference and correctly calculate the upper
and lower values. The answers in Moodle are set to recognize both upper and lower tolerance values,
and the full upper and lower tolerance dimensions. The matching pair question type presents students with two lists of components and then poses a question which, if understood, leads to correct pairs being formed between the lists. However, before pairs can be formed, students need knowledge of the individual components to begin with, which is why table 2 also shows a bracketed ‘X’ in the knowledge column. Using a mixture of all these question types, the e-assessment is formed of fifty questions, and students are given one hour to complete it.
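
As an illustration of the marking logic behind the numerical question type, the following Python sketch computes limit dimensions and accepts an answer in either of the two recognized forms. The fit class and deviation values are illustrative placeholders, not figures quoted from BS4500A, and all names are hypothetical rather than Moodle’s own.

    # Sketch of marking a Limits and Fits numerical answer.
    # NOTE: deviation values are illustrative placeholders for a BS4500A
    # look-up; they demonstrate the arithmetic, not the standard's data.

    # fit class -> (lower deviation, upper deviation) in mm
    DEVIATIONS = {"H7": (0.000, 0.021)}

    def limits(nominal, fit_class):
        """Lower and upper limit dimensions for a nominal size and fit class."""
        lo_dev, up_dev = DEVIATIONS[fit_class]
        return round(nominal + lo_dev, 3), round(nominal + up_dev, 3)

    def is_accepted(answer, nominal, fit_class):
        """Accept either a tolerance value or a full limit dimension,
        mirroring how the Moodle answers are set to recognize both forms."""
        lo_dev, up_dev = DEVIATIONS[fit_class]
        lower, upper = limits(nominal, fit_class)
        return any(abs(answer - a) < 1e-6 for a in (lo_dev, up_dev, lower, upper))

    print(limits(25.0, "H7"))               # (25.0, 25.021)
    print(is_accepted(25.021, 25.0, "H7"))  # True: full upper dimension
    print(is_accepted(0.021, 25.0, "H7"))   # True: upper tolerance value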
3 DISCUSSION
Introducing the e-assessment into the first year Design unit has enabled the existing summative
assessment of students completing a portfolio of drawings to be retained, whilst reducing the time it
takes to mark. However, such introductions are not always well received, because online tests are
strongly associated with multiple-choice questions and, as Scoulter [in 5] argues ‘…they promote
memorization and factual recall and do not encourage (or test for) high-level cognitive processes.’
The e-assessment here does comprise multiple-choice questions, and whilst recall may be sufficient for some, other questions require more than recall: for example, testing students’ understanding of 3rd angle projection requires them to choose a specific elevation from several possible answers. This style of question is repeated with elevations of different components, which reduces the likelihood of guessing correct answers. The e-assessment uses another question structure, called numerical, which requires students to perform a calculation and enter a number corresponding to, for example, an upper tolerance for a given nominal diameter. This kind of assessment agrees with other researchers, such as Johnstone & Ambusaidi [in 5], who believe higher cognitive levels of learning can be evaluated by MCQs, depending on how the tests are constructed. The mapping of
question types to the UK-SPEC’s terms of Interpretation, shown in table 2, is useful in recognizing
these two different views of how these assessments test for knowledge and/or understanding.
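
The effect of repeating elevation questions on guessing can be shown with a simple worked figure; the option and repetition counts below are illustrative assumptions, not taken from the test itself.

    # Illustrative arithmetic: the chance of guessing every question in a
    # set of independent MCQs falls geometrically with repetition.
    n_options = 5   # possible elevations offered per question (assumed)
    n_repeats = 4   # similar questions on different components (assumed)

    p_all_correct = (1 / n_options) ** n_repeats
    print(f"{p_all_correct:.4f}")  # 0.0016, i.e. 0.16%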
The UK-SPEC’s terms of Interpretation for Design also require students to develop the skill of
engineering drawing. This is supported by the (existing) coursework where they create a portfolio of
drawings in which they should apply tolerances, create sections, and draw a range of dimensions in
both part and assembly engineering drawing formats.
The main reason for using an e-assessment is the advantage of automatic marking. However, there are other advantages, one of which is the ability to give feedback. The feedback in Moodle may vary from
a single mark up to detailed explanations geared specifically to the answer. The test here is set to return just the overall result as a percentage, calculated from the different weightings set for each question, which approximately reflect levels of difficulty. It might be that in the future the e-assessment is
developed to include annotated feedback, but presently even the mark, revealed at the end of the test, gives some feedback on learning as well as a summative assessment to a cohort who would otherwise have to wait several weeks for the portfolio to be marked.
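
A minimal sketch of how such a weighted overall percentage could be computed is given below; the weights and scores are invented for illustration, not the test’s actual settings.

    # Sketch: overall result as a percentage from per-question weightings.
    def overall_percentage(scores_and_weights):
        """scores: fraction of the question's marks gained (0..1);
        weights: the weighting set for that question."""
        total = sum(w for _, w in scores_and_weights)
        gained = sum(s * w for s, w in scores_and_weights)
        return 100.0 * gained / total

    # e.g. an easier question (weight 1) answered correctly and a harder
    # numerical question (weight 2) answered incorrectly:
    print(round(overall_percentage([(1.0, 1), (0.0, 2)]), 1))  # 33.3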
The Moodle software has many settings for the online test. A couple of settings used here are
‘question bank’ and ‘overrides’. The ‘question bank’ is a store of questions from which the test is
constructed. Furthermore, several banks may be created, enabling questions to be grouped into categories; this is the structure of the e-assessment here, as shown in figure 1.
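
To make the categorised structure concrete, the sketch below draws a test at random from several banks; the category names and draw counts are hypothetical, and the code imitates, rather than reproduces, Moodle’s behaviour.

    # Sketch: assembling a test from categorised question banks so each
    # student can receive a different selection and ordering.
    import random

    banks = {
        "projection": ["proj_q1", "proj_q2", "proj_q3", "proj_q4"],
        "tolerances": ["tol_q1", "tol_q2", "tol_q3"],
        "sections":   ["sec_q1", "sec_q2", "sec_q3"],
    }

    def build_test(draws, seed=None):
        """Draw a set number of questions at random from each category."""
        rng = random.Random(seed)
        test = []
        for category, n in draws.items():
            test.extend(rng.sample(banks[category], n))
        rng.shuffle(test)  # randomise the question order as well
        return test

    print(build_test({"projection": 2, "tolerances": 1, "sections": 1}, seed=1))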
The ‘overrides’ setting allows students who require additional time in assessments to have their allowance pre-set within the Moodle software, ensuring the summative assessment is fair for all learners on a time basis.
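
In effect, an override replaces the default time limit for a named student, as the brief sketch below illustrates; the identifiers and allowances are invented, and this is an analogy for the Moodle setting rather than its implementation.

    # Sketch: per-student time-limit overrides replacing the quiz default.
    DEFAULT_TIME_MIN = 60                               # the one-hour test
    overrides = {"student_042": 75, "student_108": 90}  # pre-set allowances

    def time_allowed(student_id):
        """Return the minutes a given student has for the test."""
        return overrides.get(student_id, DEFAULT_TIME_MIN)

    print(time_allowed("student_042"))  # 75
    print(time_allowed("student_007"))  # 60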
If there is a drawback or note of caution, it would be the time needed to create the online test. There is a learning curve with any new software, particularly in knowing how to navigate the software’s many settings, preparing illustrations within questions for a design context, and thoroughly pre-testing before going live.
4 CONCLUSIONS
This paper describes how an e-assessment has been created as a summative assessment within a
Design unit on an accredited MEng degree. The learning of drawing practice within the first year
Design unit is mapped against the UK-SPEC’s terms of Interpretation, and this mapping is used as a means of reflecting on how effective the e-assessment is in requiring more from students than just memory recall.
The motivation for the e-assessment came from the rise in cohort size and the associated increase in
time to mark and give feedback on Design coursework. The restructuring of the coursework’s
assessment criteria and retention of the portfolio of drawings are explained to show how the marking
time is reduced whilst preserving the opportunity for students to develop their skills for creating
engineering drawings.
The e-assessment was created within the VLE, Moodle, and is shown to offer several useful features
that include deterring plagiarism by various randomizing settings, and increasing fairness across
student learning requirements by the use of ‘overrides’ for automatically adjusting time allowances.
REFERENCES
[1] Engineering Council. The Accreditation of Higher Education Programmes, UK Standard for
Professional Engineering Competence, Third Edition, 2014.
[2] BS 8888:2017. Technical product documentation and specification.
[3] Reeves T.C. Alternative assessment approaches for online learning environments in higher
education, Journal of Educational Computing Research, 2000, Vol. 23 (1), 101-111.
[4] Buzzetto-More N. and Alade A.J. Best practices in e-assessment, Journal of Information
Technology Education, 2006, Vol. 5.
[5] Nicol D. E-assessment by design: using multiple-choice tests to good effect, Journal of
Further and Higher Education, 2007, Vol. 31 (1).