
INTERNATIONAL CONFERENCE ON ENGINEERING AND PRODUCT DESIGN EDUCATION

6 & 7 SEPTEMBER 2018, DYSON SCHOOL OF DESIGN ENGINEERING, IMPERIAL COLLEGE, LONDON,
UNITED KINGDOM

AN E-ASSESSMENT FOR ENGINEERING DRAWING


Rod VALENTINE
University of Bath, UK

ABSTRACT
An effective method by which students learn the fundamentals of drawing practice is hand drawing part and general assembly engineering drawings. These drawings are then marked to assess the students’ knowledge and understanding. However, when the student cohort grows to several hundred, the time taken to mark the detail within a portfolio of drawings becomes significant. Simply reducing the number of drawings would deprive students of learning how to draw a wider range of components and assemblies, and so does not represent an ideal solution.
This paper discusses how an online assessment is used in conjunction with existing hand drawings to reduce the time taken to mark coursework. The online test comprises five different question types, such as matching pairs, multiple-choice and numerical, which are used to mark different aspects of drawing practice. The paper describes why these question types are used and how they can test for specific knowledge and understanding, which then need not be reassessed in the portfolio of hand drawings. Also discussed is the importance of the portfolio of engineering drawings as an effective means of developing students’ technical drawing skills.

Keywords: Technical drawing, online testing, undergraduate, engineering design

1 INTRODUCTION
This paper discusses the experience of using an online test, or e-assessment, as part of the engineering design teaching in the first year of an MEng degree. The MEng degree programmes are accredited by engineering institutions, such as the IMechE, and therefore must adhere to the Output Standards of the UK-SPEC (UK Standard for Professional Engineering Competence), which is adopted by the QAA (Quality Assurance Agency for Higher Education) as the subject benchmark statement for engineering [1]. Our teaching of engineering design within the Design unit should be mindful that “Design at this level is the creation and development of an economically viable product, process or system to meet a defined need…Graduates will need the knowledge, understanding and skills to” meet the specific learning outcomes for Design, which is one of the five engineering-specific areas of learning an accredited degree must demonstrate. This paper focuses on three of the six terms of Interpretation, namely knowledge, understanding and skills, for Design when introducing students to drawing practice for the first time in semester one of the first year. Once students have this underpinning technical drawing ability, they go on, in semester two, to design a product to meet a defined need.
The assessment strategy for the Design unit prior to the e-assessment was based entirely on
coursework. Students would develop their drawing skills by creating engineering drawings of both
individual parts and an assembly that together formed a portfolio. The portfolio is only one of the
design assessments in the first year, but it is the main focus of this paper. The portfolio is an effective means of developing drawing skills because each drawing exercise is chosen to increase the technical breadth of students’ knowledge, and in so doing provides many areas of the drawing standard against which to assess and give specific feedback. However, when the student cohort grows to several hundred, the time taken to mark the detail within a portfolio of drawings becomes significant. The longer it takes to mark, the longer it is before feedback can be given. This is undesirable because feedback is essential for student learning, showing students where to improve, and, if delayed, it deprives them of the opportunity to apply that learning in the following exercise or assessment.

2 ASSESSMENT PLANNING
The objective of reducing the time taken to mark the portfolio of engineering drawings was not simply a case of reducing the number of drawings. It was important to recognize that the majority of students have no drawing experience whatsoever, and therefore an appropriate teaching strategy is to gradually step up their learning with several drawing exercises. To do this, the semester is planned to give students a new drawing exercise each week, so they can repeatedly practise applying their knowledge and deepen their understanding, while also broadening their technical ability as the difficulty of each consecutive drawing increases. Furthermore, keeping a higher number of drawings gives educators the flexibility to break down certain areas of the drawing standard [2], formerly BS308-1:1993, into smaller chunks of learning across several assessments, making it easier for students to digest.

2.1 Identifying e-Assessment areas


In order to identify which content of the Design unit is suitable for the e-assessment, a review of learning outcomes against the UK-SPEC’s ‘interpretations’ for Design was carried out. A sample of the first-year Design unit’s content is mapped against the ‘interpretations’ of skills, understanding and knowledge, and is shown in Table 1.
Table 1. Mapping learning outcomes against UK-SPEC’s terms of Interpretation

Skills                           | Knowledge                                           | Understanding
Create part drawings             | Different types of dimension                        | How to dimension
Create general assemblies        | Parts list & balloon referencing                    | How to construct geometry
Draw with different line weights | Feature representation: thread, spline, gear, etc.  | Rules of drawing projection
Write clear annotations          | Cross sections                                      | Rules of sectioning
Calculate tolerances             | Limits & fits, linear tolerances                    | How to use BS4500A

This initial mapping supports three decisions. Firstly, if the long-established process of students learning by completing a series of drawings is to change, then it is important to know where the different aspects of these drawings develop the terms of Interpretation and enable students to meet the learning outcomes of the Design unit. When the objective is to reduce marking time, it would be easy simply to reduce the number of drawings, but this might leave unacceptable gaps in the grid in Table 1, which shows how the Design unit’s content maps across the terms of Interpretation. The second decision is how the existing summative assessment should change from the format where detailed feedback is given on each and every engineering drawing. This style of feedback is valuable in the learning process because it pinpoints areas within the portfolio with specific information describing how each student can improve. Unfortunately, this comprehensive feedback at a detailed level in each drawing is very time-consuming for large cohorts. The third and final decision is to identify which drawing specifics are relatively time-consuming to mark and whether or not they could instead be tested by an e-assessment.

2.2 e-Assessment structure


The University’s virtual learning environment (VLE) is Moodle. Within Moodle is an option to add a ‘quiz’, otherwise known as an online test, which is built from several available question types. Table 2 shows the question types used in the Design e-assessment and which of the terms of Interpretation they address.

Table 2. Use of online question types to test UK-SPEC’s terms of Interpretation

Moodle question type | Knowledge | Understanding
MCQ                  | X         |
Numerical            |           | X
Yes/No               | X         |
Matching pair        | (X)       | X
Single word          | X         |

Comparing the two tables, a useful time-saving option is to use the numerical question type for marking tolerances. Two of the drawings in the portfolio require students to calculate tolerances from the Limits and Fits standard, BS4500A. A correct numerical answer in the e-assessment is only achieved if students understand how to interpret the alphanumeric reference and correctly calculate the upper and lower values. The answers in Moodle are set to recognize both the upper and lower tolerance values and the full upper and lower tolerance dimensions. The matching pair question type presents students with two lists of components and poses a question which, if understood, leads to correct pairings between the lists. However, before pairs can be formed, students need knowledge of the individual components, which is why Table 2 also has a bracketed ‘X’ in the knowledge column. Using a mixture of all these question types, the e-assessment is formed of fifty questions and students are given one hour to complete it.
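
To make the marking logic concrete, the sketch below (purely illustrative, not the Moodle implementation) shows how a numerical check could accept either the deviations or the full limit dimensions for a fit such as 25 H7; the deviation values and acceptance margin are assumptions to be verified against BS4500A.

```python
# Illustrative sketch only: how an auto-marker might accept either the
# deviations or the full limit dimensions for a tolerance question.
# The 25 H7 values (0 / +0.021 mm) are assumed for illustration and
# should be checked against the BS4500A tables.

NOMINAL = 25.000                       # nominal diameter in mm
LOWER_DEV, UPPER_DEV = 0.000, 0.021    # assumed H7 deviations in mm
MARGIN = 0.0005                        # numerical acceptance margin

def close(a, b, margin=MARGIN):
    return abs(a - b) <= margin

def mark_answer(lower, upper):
    """True if the pair matches either the deviations or the limit dimensions."""
    as_deviations = close(lower, LOWER_DEV) and close(upper, UPPER_DEV)
    as_limits = close(lower, NOMINAL + LOWER_DEV) and close(upper, NOMINAL + UPPER_DEV)
    return as_deviations or as_limits

if __name__ == "__main__":
    print(mark_answer(0.000, 0.021))    # deviations entered -> True
    print(mark_answer(25.000, 25.021))  # limit dimensions entered -> True
    print(mark_answer(25.000, 25.033))  # wrong upper value -> False
```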

2.3 Coursework structure


In order to reduce the marking time of the coursework, its submission format (i.e. the portfolio of drawings) now needs to change. However, it is important to recall the benefits of the portfolio and, where possible, retain them. ‘Portfolios have been widely accepted as assessment methods for decades in fields such as art, architecture and engineering…portfolios enable faculty to judge interim steps and draft products that were involved in the completion of the task or course of study’ [3]. Here, support is given to the wider range of information generated by students in a portfolio, and to using web-based resources to preserve that content in the form of a digital portfolio. The suggestion is for the digital portfolio to be live, with students making annotations via electronic journals or reflections.
Whichever form the assessment takes, it must be able to measure how well students address the unit’s
outcome. Popper [in 4] supports portfolios as a means of assessing learning outcome achievement
and, interestingly, also useful for diagnosing curriculum deficiencies that require improvement.
Clearly, a portfolio is an effective means of developing and capturing drawing skills within a Design
course, so it was decided not to change the quantity or type of drawings. Consequently, the tutorial activity which supports the drawing exercises also remains unchanged. However, the change comes in
how the assessment criteria are written. Rather than mark all of the drawings for accurate geometric
representation, dimensions, sectioning, tolerancing, etc., it was decided to place one or two general
learning outcomes on the early drawings and, in effect, mark them on whether the general outcome is
met or not. The detailed marking comes in the final drawing, which, arguably, may be fairer as it gives
those new to drawing time to familiarize themselves with the subject and not be penalized for errors of
drawing detail in their early learning.
The new arrangement of learning outcomes across the portfolio begins with the first drawing being
assessed for correct projection. There is detail in the geometry of each elevation, but this is for
students to practise without penalty and quickly complete so that they are then ready for the next
drawing. The second learning outcome against which drawings are assessed is correct projection and
dimensioning. The outcomes throw a spotlight on different areas of engineering drawing, another one of which is tolerancing, up to the final drawing, where it is assessed fully. The early portfolio drawings are part drawings where students practise specific areas of technical drawing, and the final drawing is an assembly. The distribution of marks changes from the existing portfolio, where each drawing has an equal weighting, to one where the final assembly drawing is worth 60% and the others share the remaining 40% equally. This broadly reflects the distribution of time spent marking the portfolio of drawings, where the overall time is now much reduced.
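
As a minimal illustration of this weighting, the sketch below combines marks under the 60/40 split described above; the number of early part drawings (five) and the individual marks are assumptions for the sake of example.

```python
# Minimal sketch of the portfolio weighting described above.
# The 60% / 40% split follows the text; five early part drawings and the
# marks themselves are hypothetical.

def portfolio_mark(part_marks, assembly_mark):
    """Combine marks (each 0-100) into an overall portfolio percentage."""
    part_weight = 0.40 / len(part_marks)             # early drawings share 40% equally
    weighted_parts = sum(m * part_weight for m in part_marks)
    return weighted_parts + assembly_mark * 0.60     # final assembly carries 60%

if __name__ == "__main__":
    early_drawings = [70, 65, 80, 75, 60]   # hypothetical marks for five part drawings
    final_assembly = 68                     # hypothetical mark for the assembly drawing
    print(round(portfolio_mark(early_drawings, final_assembly), 1))  # -> 68.8
```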

3 DISCUSSION
Introducing the e-assessment into the first-year Design unit has enabled the existing summative assessment, in which students complete a portfolio of drawings, to be retained, whilst reducing the time it takes to mark. However, such introductions are not always well received, because online tests are strongly associated with multiple-choice questions and, as Scoulter [in 5] argues, ‘…they promote memorization and factual recall and do not encourage (or test for) high-level cognitive processes.’ The e-assessment here does comprise multiple-choice questions, and whilst recall may be sufficient for some, other questions require more than recall; for example, in testing students’ understanding of third angle projection, they require students to choose a specific elevation from several possible answers. This style of question is repeated with elevations of different components, which reduces the likelihood of guessing correct answers. The e-assessment also uses the numerical question type, which requires students to perform a calculation and enter a number corresponding to, for example, an upper tolerance for a given nominal diameter. This kind of assessment tends to agree with other researchers, such as Johnstone & Ambusaidi [in 5], who believe higher cognitive levels of learning can be evaluated by MCQs because it depends on how the tests are constructed. The mapping of question types to the UK-SPEC’s terms of Interpretation, shown in Table 2, is useful in recognizing these two different views of how such assessments test for knowledge and/or understanding.
The UK-SPEC’s terms of Interpretation for Design also require students to develop the skill of
engineering drawing. This is supported by the (existing) coursework where they create a portfolio of
drawings in which they should apply tolerances, create sections, and draw a range of dimensions in
both part and assembly engineering drawing formats.
The main reason for using an e-assessment is the advantage of automatic marking. However, there are other advantages, one of which is the ability to give feedback. The feedback in Moodle may vary from a single mark up to detailed explanations geared specifically to the answer. The test here is set to return just the overall result as a percentage, once it has applied the different weightings set for each question, which approximately reflect their levels of difficulty. It might be that in the future the e-assessment is developed to include annotated feedback, but presently even the mark, revealed at the end of the test, gives some feedback on learning as well as a summative result to a cohort who would otherwise have to wait several weeks for the portfolio to be marked.
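
The sketch below is a simple illustration of such a weighted overall percentage; the per-question weights and marks are hypothetical, and Moodle performs the equivalent calculation itself from the maximum mark assigned to each question.

```python
# Illustrative sketch of a weighted overall percentage, as described above.
# Question marks and maxima are hypothetical.

def overall_percentage(results):
    """results: list of (mark_awarded, max_mark) pairs, one per question."""
    awarded = sum(mark for mark, _ in results)
    maximum = sum(max_mark for _, max_mark in results)
    return 100.0 * awarded / maximum

if __name__ == "__main__":
    # e.g. easier questions worth 1 mark, harder numerical questions worth 2
    results = [(1, 1), (0, 1), (2, 2), (1, 2), (1, 1)]
    print(round(overall_percentage(results), 1))  # -> 71.4
```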
The Moodle software has many settings for the online test. A couple of settings used here are
‘question bank’ and ‘overrides’. The ‘question bank’ is a store of questions from which the test is
constructed. Furthermore, several banks may be created, enabling questions to be grouped into categories; this is the structure of the e-assessment here, shown in Figure 1.

Figure 1. e-Assessment questions structured into banks


These banks of questions have two particularly useful functions. Firstly, they help to quickly identify the areas of the unit’s content being tested. Secondly, if the number of questions created in each bank is greater than the number needed for the actual e-assessment, then it is possible to randomize which questions individual students answer. This helps to discourage blind plagiarism. The other setting, ‘override’, is useful for adjusting the time allowed for individual students. Students who are registered for additional time in assessments can have their allowance pre-set within the Moodle software, ensuring the summative assessment is fair to all learners on a time basis.
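
For illustration only, the sketch below mimics the effect of drawing random questions from category banks so that different students receive different selections; the bank names, question counts and draw size are hypothetical, and in practice Moodle's own random-question setting does this within the quiz.

```python
# Minimal sketch of per-student randomization from category banks, as
# described above. Bank names and counts are hypothetical.
import random

banks = {
    "Projection":   [f"proj_q{i}" for i in range(1, 16)],   # 15 questions per bank
    "Dimensioning": [f"dim_q{i}"  for i in range(1, 16)],
    "Sectioning":   [f"sect_q{i}" for i in range(1, 16)],
    "Tolerancing":  [f"tol_q{i}"  for i in range(1, 16)],
}

def build_test(banks, per_bank=10, seed=None):
    """Draw a fixed number of questions from each bank for one student."""
    rng = random.Random(seed)
    paper = []
    for name, questions in banks.items():
        paper.extend(rng.sample(questions, per_bank))  # no repeats within a bank
    rng.shuffle(paper)
    return paper

if __name__ == "__main__":
    # Different seeds (students) receive different question selections.
    print(build_test(banks, per_bank=10, seed=101)[:5])
    print(build_test(banks, per_bank=10, seed=202)[:5])
```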
If there is a drawback or note of caution, it would be the time needed to create the online test. There is a learning curve with any new software, particularly in knowing how to navigate the many settings, preparing illustrations within questions for a design context, and pre-testing thoroughly before going live.

4 CONCLUSIONS
This paper describes how an e-assessment has been created as a summative assessment within a
Design unit on an accredited MEng degree. The learning of drawing practice within the first year
Design unit is mapped against the UK-SPEC’s terms of Interpretation and used as a means of reflecting on how effective the e-assessment is in requiring more learning from students than just memory recall.
The motivation for the e-assessment came from the rise in cohort size and the associated increase in the time to mark and give feedback on Design coursework. The restructuring of the coursework’s
assessment criteria and retention of the portfolio of drawings are explained to show how the marking
time is reduced whilst preserving the opportunity for students to develop their skills for creating
engineering drawings.
The e-assessment was created within the VLE, Moodle, and is shown to offer several useful features, including deterring plagiarism through randomization settings and increasing fairness across student learning requirements through the use of ‘overrides’ to automatically adjust time allowances.

REFERENCES
[1] Engineering Council. The Accreditation of Higher Education Programmes, UK Standard for Professional Engineering Competence, Third Edition, 2014.
[2] BS8888:2017 Technical product documentation and specification.
[3] Reeves T.C. Alternative assessment approaches for online learning environments in higher education, Journal of Educational Computing Research, 2000, Vol. 23 (1), 101-111.
[4] Buzzetto-More N. and Alade A.J. Best Practices in e-Assessment, Journal of Information Technology Education, 2006, Vol. 5.
[5] Nicol D. E-assessment by design: using multiple-choice tests to good effect, Journal of Further and Higher Education, 2007, Vol. 31 (1).
