
Automatic Invigilation and Evaluation of Answer Scripts Using NLP

Jobin George, Computer Science Engineering, Jyothi Engineering College, Thrissur, India ([email protected])
Joyel Jose K R, Computer Science Engineering, Jyothi Engineering College, Thrissur, India ([email protected])
Jishnu Dev, Computer Science Engineering, Jyothi Engineering College, Thrissur, India ([email protected])
Mohit Mohan K, Computer Science Engineering, Jyothi Engineering College, Thrissur, India ([email protected])
Dr. Swapna B Sasi, Computer Science Engineering, Jyothi Engineering College, Thrissur, India ([email protected])

Abstract— Automated Review and Assessment of Response Scripts Using NLP is a web-based tool created to make evaluating descriptive test responses more efficient. While modern digital systems mostly concentrate on objective examinations, traditional examination methods rely on human grading, which takes a lot of time and manpower. This system overcomes these restrictions by utilizing Natural Language Processing (NLP) to compare responses to a database of pre-written responses through methods including keyword extraction, synonym matching, and semantic analysis. To provide accurate and context-aware evaluation, important processing steps include sentiment analysis, stop-word elimination, and stemming. The technology improves evaluation efficiency, scalability, and fairness by reducing manual grading and offering immediate feedback. It promotes a digital, accessible, and effective examination framework by empowering educational institutions to administer, oversee, and assess descriptive exams more successfully.

Index Terms— Automated Answer Evaluation, Natural Language Processing, Descriptive Exam Assessment, AI-Based Grading, Keyword Extraction, Semantic Analysis, Sentiment Analysis, Educational Technology, Digital Examination, Scalable Assessment Systems.

I. INTRODUCTION

The Automated Review and Assessment of Response Scripts system modernizes examinations through the integration of NLP-based grading and automated proctoring. It ensures safe and scalable examinations by doing away with manual oversight, subjective grading, and inefficiencies. Online registration, notifications, and exam taking are all available to students, and AI-powered invigilation uses behavior analysis and facial recognition to identify instances of cheating.

The system uses natural language processing (NLP) to compare descriptive responses to standard responses by examining keywords, synonyms, and semantic context. Strategies like stemming and stop-word elimination minimize human error, enhance assessment reliability, and help guarantee impartial, consistent, and fair scoring.

Students can understand their performance right away through instant feedback, while teachers gain from quicker grading and more insightful information about students' development. Automated scoring also reduces the inconsistencies that are frequently found in hand grading.

The system's high scalability allows universities to safely administer large exams. By adopting this technology, education becomes more efficient, uniform, and accessible, and assessments are brought in line with the demands of contemporary digital learning.
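The preprocessing and comparison steps described above can be sketched in a few lines. This is a minimal illustration, not the authors' implementation: the stop-word list, the naive suffix-stripping stemmer (a stand-in for a real stemmer such as Porter's), and all function names are assumptions for the sake of the example.

```python
# Toy stop-word list; a real system would use a fuller set.
STOP_WORDS = {"the", "a", "an", "is", "are", "of", "to",
              "and", "in", "by", "their", "its"}

def stem(word: str) -> str:
    # Naive suffix stripping stands in for a real stemmer (e.g. Porter).
    for suffix in ("ing", "ed", "es", "s"):
        if word.endswith(suffix) and len(word) > len(suffix) + 2:
            return word[: -len(suffix)]
    return word

def preprocess(text: str) -> set[str]:
    # Lowercase, strip punctuation, drop stop-words, then stem.
    tokens = [t.strip(".,;:!?") for t in text.lower().split()]
    return {stem(t) for t in tokens if t and t not in STOP_WORDS}

def keyword_overlap(student: str, model: str) -> float:
    """Fraction of model-answer keywords found in the student answer."""
    model_kw = preprocess(model)
    if not model_kw:
        return 0.0
    return len(preprocess(student) & model_kw) / len(model_kw)

model = "Stemming reduces words to their root forms."
student = "Stemming reduces a word to its root form."
score = keyword_overlap(student, model)  # singular/plural forms still match
```

Because both answers reduce to the same stemmed keyword set, the overlap is full despite differing word forms, which is the point of the stemming and stop-word steps.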
II. RELATED WORKS

Automated answer evaluation has advanced significantly, utilizing deep learning and natural language processing (NLP) techniques. The usefulness of NLP-driven grading systems for evaluating student responses was investigated by Burstein et al. [1], who emphasized the significance of linguistic elements in automated scoring. Their research showed how grading accuracy might be increased by syntactic and semantic analysis. The significance of feature engineering in text evaluation was also highlighted by Heilman and Madnani [2], who looked into the use of machine learning approaches, such as regression models, for assessing descriptive answers.

Semantic similarity analysis has also been applied in recent advancements to improve answer grading. In order to assess open-ended responses, Sukkarieh et al. [3] presented a model that makes use of latent semantic analysis (LSA), demonstrating its capacity to capture conceptual understanding that goes beyond keyword matching. Additionally, Zhang et al. [4] investigated the use of transformer models for essay grading, which greatly increased the accuracy of evaluating contextual relevance and linguistic coherence.

The resilience of automated grading systems has also been enhanced by developments in natural language processing techniques. Chali and Hasan [5] addressed the problem of varied phrase patterns in student responses by proposing a system that combines synonym-based scoring with keyword matching. In a similar vein, Riordan et al. [6] showed how deep learning models, specifically LSTMs and attention mechanisms, can be used to better comprehend the semantic depth of responses and improve scoring consistency.

Building on these advancements, our study presents an Automated Descriptive Answer Evaluation System that grades student responses using Siamese architectures and transformer-based models (RoBERTa). To offer a thorough assessment method, our system combines sentiment classification, stemming, connection-word removal, and semantic similarity analysis. Through the use of keyword-based scoring methods and a uniform answer repository, our approach manages sentence-structure variations with ease while maintaining assessment accuracy.

By combining adaptive learning frameworks and automated feedback production, our method not only complements but also advances current research. This hybrid approach improves the fairness and dependability of response evaluation while increasing the scalability of digital examinations.

III. PROPOSED METHOD

1. Worldwide Accessibility
• Enables candidates to take descriptive exams remotely without geographical limitations.
• Utilizes NLP to automate answer evaluation, reducing manual grading efforts.

2. Exam Notification
• Sends email notifications upon successful authentication, ensuring candidates receive exam details.
• Includes essential information such as exam date, time, and login credentials for easy access.

3. Standard Answer Repository
• Stores model answers along with explanations, definitions, and key concepts for comparison.
• Facilitates structured evaluation by ensuring consistency in answer assessment.

4. Keyword Matching
• Extracts significant keywords and synonyms from student responses.
• Compares extracted terms with the standard answer to determine accuracy and relevance.

5. Major Steps in Answer Processing
• Connection Word Elimination: Removes non-essential words (e.g., articles, prepositions) to focus on key content.
• Stemming and Semantic Analysis: Converts words to their root forms and evaluates meaning rather than exact word matches.

6. Scoring Mechanism
• Assigns scores based on the number of matched keywords and their contextual relevance.
• Uses weighted scoring to give partial marks for partially correct or well-structured responses.
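One way to read the scoring mechanism described above is as a weighted combination of keyword overlap and semantic similarity. The sketch below illustrates that idea only: the 60/40 weights, the function names, and the bag-of-words cosine (standing in for the RoBERTa-based similarity the system actually uses) are all assumptions, not the paper's implementation.

```python
from collections import Counter
from math import sqrt

def keyword_fraction(student: str, model: str) -> float:
    """Fraction of the model answer's words present in the student answer."""
    kw = set(model.lower().split())
    return len(set(student.lower().split()) & kw) / len(kw) if kw else 0.0

def cosine_similarity(a: str, b: str) -> float:
    """Bag-of-words cosine; a toy stand-in for transformer similarity."""
    va, vb = Counter(a.lower().split()), Counter(b.lower().split())
    dot = sum(va[w] * vb[w] for w in va)
    norm = (sqrt(sum(c * c for c in va.values()))
            * sqrt(sum(c * c for c in vb.values())))
    return dot / norm if norm else 0.0

def weighted_score(keyword_frac: float, semantic_sim: float,
                   w_kw: float = 0.6, w_sem: float = 0.4,
                   max_marks: float = 10.0) -> float:
    """Weighted scoring: partial marks for partially correct answers."""
    return round(max_marks * (w_kw * keyword_frac + w_sem * semantic_sim), 2)

model = "stemming converts words to their root forms"
student = "stemming converts words to root forms quickly"
marks = weighted_score(keyword_fraction(student, model),
                       cosine_similarity(student, model))
```

A partially correct answer thus earns partial marks rather than zero, which is the behavior item 6 above calls for; the weights would in practice be tuned per question.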
Fig. 1. Flow Chart

ACKNOWLEDGMENT

We would like to sincerely thank our guide, Ms. Anumol C S, Assistant Professor, CSE, JECC, for all of her help, support, and encouragement during this research. Her knowledge and insights have greatly influenced the direction of our work.

Additionally, we would like to express our gratitude to the faculty and staff of JECC's Computer Science and Engineering Department for providing the tools and a supportive environment needed to finish this project effectively.

We appreciate everyone's efforts in helping to make this project a success.

REFERENCES

[1] T. M. Shetty and S. J. Rao, "Evaluating descriptive answers with a hybrid machine learning approach," pp. 30–35, 2022.
[2] H. L. Zhang and X. Y. Chen, "Utilizing transformer models for automated grading of student responses," pp. 20–25, 2021.
[3] J. E. Nalavade, "Descriptive answer evaluation system using natural language processing," pp. 5–10, 2023.
[4] M. F. Bashir, "Subjective answers evaluation using machine learning and natural language processing," pp. 90–95, 2021.
[5] L. M. Chandrapati, "Automated system for checking works with free response using intelligent tutor's comment analysis in engineering education," pp. 35–40, 2020.
[6] P. V. Prasad, N. Krishna, and T. Jacob, "AI chatbot using Web Speech API and Node.js," pp. 360–362, 2022.
[7] P. S. Devi, S. Sarkar, T. S. Singh, L. D. Sharma, C. Pankaj, and K. R. Singh, "An approach to evaluating subjective answers using BERT model," pp. 1–4, 2022.
[8] T. R. Gupta and R. S. Mehta, "Machine learning-based grading of descriptive answers using NLP," pp. 15–20, 2021.
[9] Gunawansyah, "Automated essay scoring using natural language processing and text mining method," pp. 1–3, 2020.
[10] A. Shylesh, "Automated answer script evaluation using deep learning," pp. 50–55, 2023.
[11] A. A. Tambe, "Automated essay scoring system with grammar score analysis," pp. 70–75, 2022.
[12] M. A. Sayeed, "Automated descriptive answer grading using reference-based models," pp. 10–15, 2022.
