Report - 3
MRS. V VIJAYALAKSHMI
(Assistant Professor, Department of Networking and Communications)
BACHELOR OF TECHNOLOGY
in
COMPUTER SCIENCE AND ENGINEERING
with specialization in CYBER SECURITY
MAY 2025
Department of Networking and Communications
We hereby certify that this assessment complies with the University's Rules and Regulations relating
to academic misconduct and plagiarism, as listed on the University website, in the Regulations, and in the
Education Committee guidelines.
We confirm that all the work contained in this assessment is my / our own except where indicated,
and that we have met the following conditions:
DECLARATION:
We are aware of and understand the University's policy on academic misconduct and plagiarism, and
we certify that this assessment is my / our own work, except where indicated by referencing, and that
we have followed the good academic practices noted above.
DATE:
ii
ACKNOWLEDGEMENT
iii
SRM INSTITUTE OF SCIENCE AND TECHNOLOGY
BONAFIDE CERTIFICATE
Certified that the 18CSP108L project report titled “Autonomous Proctoring System” is the bonafide
work of Shreyas Rao V.S (RA211103001016), Bavishyath K.R (RA2111030010174), Abhishek
Devagudi Reddy (RA2111030010164), and AbhayKrishna B S (RA2111030010168), who carried out
the project work under my supervision. Certified further that, to the best of my knowledge, the work
reported herein does not form part of any other project report or dissertation on the basis of which a degree or
award was conferred on an earlier occasion on this or any other candidate.
iv
ABSTRACT
As remote learning grows rapidly, online testing must be designed to protect academic integrity. Most proctoring
systems today rely on either intensive human supervision or static AI models, which tend to be invasive, difficult
to scale, or prone to false positives. This report describes ProctorPro, an autonomous proctoring system that
addresses these drawbacks by combining machine learning, real-time behavioural analysis, and privacy-preserving
techniques. It is built as a modular system comprising continuous video and audio capture, intelligent anomaly
detection, real-time reporting, and biometric authentication. Using facial recognition, gaze tracking, and audio
signal analysis, the system detects abnormal behaviour, impersonation, and environmental irregularities.
Compared with traditional solutions, ProctorPro adapts to different testing contexts, reduces false-positive alarms,
and supports equitable evaluation by learning contextual cues and students' habits over time. To limit privacy
risks, the software employs data minimization, real-time encryption, and federated learning, so that private data
does not need to be stored centrally and is less exposed to misuse. Edge computing is also used to reduce latency
and bandwidth consumption, allowing the system to scale to large examinations. Experimental results show that
ProctorPro detects potential cheating behaviour with a 94.5% detection rate while maintaining a 3.2% false-positive
rate. The system was also stress tested under simulated multi-user conditions, poor network connectivity, and
staged cheating scenarios, which confirmed its reliability and robustness. The report further explores future
developments such as blockchain-based exam records, VR-based testing environments, and multimodal biometric
verification. Taken together, ProctorPro balances automation, ethical safeguards, and human supervision, offering
an advanced solution for educational institutions seeking to promote trust, transparency, and integrity in their
online examinations.
v
TABLE OF CONTENTS
vi
2.2 Sprint 2 23
2.2.1 Comparative Analysis of Models 23
2.2.2 Functional Document 28
2.2.3 Architecture Document 31
2.2.4 UI Design 33
2.2.5 Functional Test Cases 34
2.2.6 Committed vs Completed User Stories 35
2.2.7 Sprint Retrospective 36
2.3 Sprint 3 36
2.3.1 Sprint Goal with User Stories of Sprint 3 37
2.3.2 Functional Document 41
2.3.3 Architecture Diagram 44
2.3.4 UI Design 46
2.3.5 Functional Test Cases 47
2.3.7 Committed vs Completed User Stories 48
2.3.8 Sprint Retrospective 49
3 RESULT AND DISCUSSIONS 50
3.1 Project Outcomes 50
4 CONCLUSION AND FUTURE ENHANCEMENT 51
5 REFERENCES 55
10 APPENDIX 57
A CONFERENCE PUBLICATION
B SAMPLE CODE
C PLAGIARISM REPORT
vii
LIST OF FIGURES
viii
2.13 UI Design 33
2.20 UI Design 46
ix
LIST OF TABLES
2.2 Functional Test Case of Sprint 1 20
2.3 Sprint Retrospective for Sprint 1 23
2.4 Detailed User Stories of Sprint 2 23
2.5 Functional Test Case for Sprint 2 34
2.6 Sprint Retrospective for Sprint 2 36
2.7 Detailed User Stories for Sprint 3 37
2.8 Functional Test Case of Sprint 3 47
2.9 Sprint Retrospective for Sprint 3 49
x
ABBREVIATIONS
xi
CHAPTER 1
INTRODUCTION
Driven by the demand for secure, effective, and scalable testing tools, a number of
artificial-intelligence-based remote proctoring technologies have already emerged. The
currently existing AI-based remote online proctoring tools, however, have considerable
drawbacks: they can be overly intrusive, biased with respect to race and gender, or simply incapable of
recognizing behaviour that actually indicates misconduct. Legitimate actions, such as looking away
from the screen or shifting position, have been reported by students as triggering
anxiety and, in some cases, undue penalties. All of this motivates an intelligent
autonomous proctoring system: ProctorPro. Designed with empathy and a strong
emphasis on integrity, ProctorPro is premised on technologies such as
machine learning, computer vision, and behavioural analysis, enabling it to identify
genuinely suspicious behaviour while minimizing false alarms.
1
1.2 Motivation
This project addresses a pressing and growing concern in formal education: how to effectively
uphold academic integrity in an increasingly digital and remote world.
With the rapid shift to online learning, accelerated by the COVID-19 pandemic, assessments and
evaluations are no longer held in a typical classroom. Students take major tests from home,
possibly across borders and time zones, and in environments that differ widely from one another.
This setting offers tremendous flexibility and equity of choice. At the same time,
it introduces new risks: cheating, impersonation, and the loss of meaningful, fair assessment for
everyone examined. Traditional methods of proctoring cannot keep up. Human proctors do not
scale, and the mental load of monitoring hundreds of students by webcam is enormous.
Early AI-based proctoring solutions are now receiving backlash for being too
invasive, unreliable, and ethically questionable. Some students feel uneasy, and some systems
treat a student's need to look away to think, or an unavoidable background noise, as
dishonest behaviour, which not only creates stress but undermines trust. The gap these
challenges reveal is the need for a more intelligent and humane way of supporting
assessment in online courses.
2
1.3 Sustainable Development Goal of the Project
The Autonomous Proctoring Software project represents not just a technological innovation;
it's a step toward a smarter, fairer, and more accessible future for education. Besides
contributing to SDG 16: Peace, Justice and Strong Institutions, this solution addresses several
areas of SDG 9: Industry, Innovation and Infrastructure, and targets SDG 4: Quality Education.
This project fills the gap between academic integrity and digital accessibility, making it a
responsible technological solution to some of today's most important global educational
issues.
Ensure inclusive and equitable quality education and promote lifelong learning
opportunities for all. In a rapidly digitalizing world, education must adapt to stay inclusive
and credible. Online learning has increased access for millions, especially for those in remote
or poorly served areas. Access is only part of what is needed; credibility and fairness in
assessments also matter. Online education should not be treated as second-class, since the students who
choose this route are often those who most need flexibility. This project contributes toward SDG 4
through:
Trusted Student Assessment: wherever students may be learning remotely, assessors can trust
that the evaluation process is safe, monitored, and reliable.
Levelling the Playing Field: instead of invasive techniques or biased assumptions, which risk
marginalizing students from different backgrounds, abilities, or environments, the system
harnesses behaviour-based analysis.
Lowering the Barriers to Entry: the system is built for off-the-shelf scalability with a low-bandwidth
footprint, so it also functions well in developing nations with limited internet infrastructure,
making quality educational assessment possible for more of the global population.
3
How it contributes: Encouraging Ethical AI Advancement: ethical AI models derive
insights from student data, combined with data minimization and encryption to protect privacy.
Supporting Infrastructure for Remote Learning: the platform fosters a more sustainable, distributed
exam infrastructure with less reliance on costly hardware or centralized systems. Fostering
Technological Equity: it supports technological innovation in education that can be tailored to the
unique needs of urban and rural users.
Lowering Exam Bias and Procedural Errors: unlike human proctors, the trained AI can
pinpoint genuine deviations while disregarding normal behaviour such as glancing away to
think or shifting into a more relaxed posture, thus dropping false flags.
Building Institutional Trust: it offers schools and universities a method that complies with
an ethical and legal framework (GDPR, student consent, data transparency).
Supporting Student Rights: storing only limited-use data and deleting real-time footage after
analysis preserves the dignity of students' digital identity and limits long-term risk.
Broader Impact: In a world where the march of digital transformation frequently outstrips the
pace of regulation and ethics, this project also serves as a beacon of ethical technology
progress. It integrates and streamlines education, technology, and justice. Whether it is helping
a student from a remote village to take a certified exam or empowering a teacher to conduct
fair assessments without the fear of cheating, the solution touches lives in so many ways.
On the surface this is a software solution, but at its core it is a step towards an
education system that is inclusive, secure, and sustainable.
4
1.4 Product Vision Statement
1.4.1 The Vision
We envision a world in which all students, regardless of where they are located, who they are,
or where they study, are able to take tests with fairness and confidence, without feeling that they
are under constant observation, scrutiny, or suspicion. In that world, online testing is not
second best, but secure, safe, and liberating. ProctorPro is our solution for beginning to create
such a world. ProctorPro is more than proctoring software. It is a trust-enabling platform driven
by technologies such as artificial intelligence, behaviour analysis, and real-time
monitoring, enabling a new paradigm of ethical, intelligent, and scalable exam monitoring rather than
control. It strikes the right balance between the needs of academic integrity and empathy, privacy, and
user comfort. We are not here to build an all-seeing surveillance system. We are here to humanize
exam monitoring, to give teachers confidence, and to give students the respect
they deserve in an increasingly digital age.
Remote learning has grown by leaps and bounds, but the way we administer exams has not.
Present proctoring systems are either overly intrusive or unreliable.
These legacy systems create suspicion, fear, and even guilt-ridden dropouts among students who
are unable to defend themselves until their names are cleared. Teachers, meanwhile, are
saddled with false positives and left without systems that offer actionable intelligence. We
believe that technology must work for humanity, not the other way around. We are working
diligently to bridge this gap with a smart, fair, and humane product built on the following principles:
● Trust: Trust underlies any learning environment. Our software doesn't merely
detect; it earns the trust of learners and educators by being transparent,
explainable, and equitable.
5
● Empathy: Students are not machines. They move around, they think aloud, and sometimes
they look away from the screen. Our system learns the difference between suspicious
behaviour and normal human behaviour, minimizing false alarms and stress.
● Scalability: We envision this platform being used by schools, universities, and
certifying organizations globally. ProctorPro is built to monitor hundreds of
students simultaneously without sacrificing performance or integrity.
● Privacy-First Design: In a world where information can be so easily misused,
ProctorPro ensures privacy is never an afterthought. Real-time encryption, minimal data
retention, and responsible AI practices are ingrained in its very fabric.
● Inclusivity: Our vision covers every student: students with disabilities, students on
low-end devices, and students in unstable settings. The system adapts, not the students.
6
1.5 Product Goal
The ultimate objective of ProctorPro is to provide an intelligent, secure, and student-centric solution
through which online assessments can uphold academic integrity while respecting the user's
experience, privacy, and accessibility.
The product is intended to rebuild the trust with which an institution places its
students into an online examination. It does so by using artificial intelligence to monitor how a
student behaves during an examination, replacing rigid rules and removing the need for
permanent human supervision. ProctorPro identifies potential violations as they occur in real
time, but understands that a glance at the clock or a nervous fidget is not a sign of academic
dishonesty. The machine-learning-based solution is context-aware rather than merely
activity-based. It not only allows reliable observation but also supports monitoring of online
assessments with hundreds of concurrent users without any drop in performance. All of this
assures institutional trust, facilitating a seamless testing experience along with an audit report
detailing student behaviour.
A critical component of our objective is the focus on privacy and ethics. Every aspect of
ProctorPro's services, including facial recognition, screen recording, and anomaly detection, is
governed by compliance with formal data protection requirements. Our strict
standards for real-time encryption, limited data retention, and consent-based usage make the
solution as much about student rights and privacy as it is about academic integrity. The focus
on students is much more than posturing to institutions: the product is intended to alleviate the
anxiety typically associated with AI proctoring through clear, actionable feedback and an
easy-to-use interface that accommodates diverse situations, including accessibility needs, device
variations, and connectivity issues. To summarize, ProctorPro's product objective is to deliver a
scalable, privacy-centric, and compassionate remote proctoring solution that protects the academic
integrity of online education while respecting the unique needs and human dignity of every learner.
It is a progressive solution that reconciles safety and trust in the digital education space.
7
1.6 Product Backlog
#US1 As a new user, I need to easily register for the platform so that I can gain access
to its features as per my access level.
#US2 As a user, I need the ability to choose a theme, such as the system default theme,
a light theme, or a dark theme.
#US3 As a user, I need MFA and biometric authentication for login and registration.
#US4 As a user, I want to engage in forums and group activities so that I can share and
collaborate with others.
#US5 As a user, I want the interface to be simple and clean so that I can easily use
the software.
#US6 As a user, I want to create and browse skill-sharing listings so that I can offer
my skills to others and discover opportunities to learn new skills from others.
#US7 As a user, I want peer-to-peer communication between the student and the teacher.
#US8 As a student, I must receive my results as soon as I complete the exam.
#US9 If there are any internet issues during the exam, the exam must resume once
the network is restored.
8
The product backlog of the Autonomous Proctoring Software was configured using the MS Planner
Agile board, as shown in the following Figure 1.1. The product backlog consists of
the complete user stories of the AI-based e-learning application.
Each user story consists of the necessary parameters, such as MoSCoW prioritization, functional and
non-functional parameters, and detailed acceptance criteria with linked tasks.
9
1.7 Product Release Plan
The following Figure 1.2 depicts the release plan of the project
10
CHAPTER 2
2.1 Sprint 1
2.1.1 Sprint Goal with User Stories of Sprint 1
The goal of the first sprint is to establish foundational components, define system roles, prepare
the architecture, and initiate biometric authentication.
The following Table 2.1 presents the detailed user stories of Sprint 1.
US #1 As a new user, I need to easily register for the platform so that I can gain
access to its features as per my access level.
US #2 As a user, I need the ability to choose a theme, such as the system default
theme, a light theme, or a dark theme.
US #3 As a user, I need MFA and biometric authentication for login and registration.
US #4 As a user, I want to engage in forums and group activities so that I can share
and collaborate with others.
The Planner Board representation of these user stories is shown below in Figures 2.1, 2.2 and 2.3.
11
Figure 2.1 User story for user registration
12
Figure 2.2 User story for Theme and font selection
13
Figure 2.3 User story for developer
14
2.1.2 Functional Document
2.1.2.1 Introduction
The goal of the Proctor Pro project's first sprint is to build a strong functional and technical
foundation for the autonomous proctoring system. Establishing essential elements such as
role-based access control, facial biometric configuration, secure user registration, and the
preliminary exam scheduling framework is the primary focus of this sprint. This is a
foundational sprint, ensuring that all users are properly authenticated, privacy concerns are
addressed from the start, and the system is constructed to accommodate future features in a
scalable, modular fashion.
Establishing the baseline infrastructure needed for a credible and safe remote examination
platform is Sprint 1's main objective. This includes making sure that:
15
● Exam Setup: Instructors or administrators can set up future exams, create basic options
(date/time/duration), and select registered students.
● Privacy Notification: Any monitoring or recording elements will not commence until
students have viewed a privacy policy and affirmatively consented.
2.1.2.4 Features
Feature 1: Registration
● Description: Allows new users to register with an email address and establish
a secure password.
● User Story: US001 - As a new user, I want to register with email/password and
access the platform.
access the delivery platform.
Feature 2: Role Assignment
● Description: The admin can assign user roles (i.e., Student, Proctor,
Admin) with their corresponding access rights and system capabilities (a minimal sketch of
this idea follows at the end of this feature list).
● User Story: US003 - As an admin user, I want to assign user roles that define
identifiable access rights corresponding to access levels.
16
Feature 3: Privacy Policy Consent
● User Story: US005 - As a user, I want to read and consent to the privacy policy prior
to sitting an exam.
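The role-assignment feature above can be thought of as a simple mapping from roles to permitted actions. The following is a minimal Python sketch of that idea; the role names, permission names, and helper function are illustrative assumptions, not the project's actual implementation.

```python
# Minimal sketch of role-based access control. Role names, permission names,
# and the helper function are illustrative assumptions, not the project's
# actual implementation.
from enum import Enum


class Role(Enum):
    STUDENT = "student"
    PROCTOR = "proctor"
    ADMIN = "admin"


# Each role maps to the set of actions it may perform.
PERMISSIONS = {
    Role.STUDENT: {"take_exam", "view_own_results"},
    Role.PROCTOR: {"view_live_dashboard", "review_flags"},
    Role.ADMIN: {"assign_roles", "schedule_exam", "export_reports"},
}


def is_allowed(role: Role, action: str) -> bool:
    """Return True if the given role is permitted to perform the action."""
    return action in PERMISSIONS.get(role, set())


# Example: an admin may assign roles, a student may not.
assert is_allowed(Role.ADMIN, "assign_roles")
assert not is_allowed(Role.STUDENT, "assign_roles")
```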
2.1.2.7 Assumption
1. User Base is Technically Capable of Basic Onboarding: We assume students, admins and
proctors have a device with internet access and can comfortably register an account, log in,
and understand basic on-page messages about what they need to do. Users are assumed to have
a basic working knowledge of webcam-enabled technology and can therefore begin the
onboarding process.
2. Device Has a Webcam and Microphone: For facial biometric registration and
authentication to work, we assume users have devices (laptop, tablet, or phone) with
webcams in good working order. The webcam will be used in future sprints for
live monitoring, but in Sprint 1 it is used solely for the registration face-capture
process.
3. Facial Recognition is Limited to Identity Authentication: During Sprint 1, facial
recognition is used only for user identity authentication; it will later be used for pre-exam
validation purposes. There is no real-time tracking or monitoring, and this phase does not
include AI-based behavioural analysis.
4. Biometric Data is Secured and Managed Ethically: We assume facial data collected
during registration is kept in a secure, encrypted format and stored in a way that complies with
privacy legislation (e.g., GDPR). Consent is always obtained before biometric data
is collected.
1. Authentication Service
2. Biometric Service
17
● Validates the face prior to the exam (a minimal sketch of this step follows below)
● Interacts with the frontend via the WebRTC/webcam API
● Each of the services communicates with the others via secure REST APIs
● Ensures modularity and isolation of critical services
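As an illustration of the pre-exam face validation step, the sketch below compares the face captured at registration with a live webcam capture. The open-source face_recognition library is used purely as a stand-in, since the report does not name the actual face-matching library behind the Biometric Service; the tolerance value is likewise an assumption.

```python
# Minimal sketch of the pre-exam face validation step. The face_recognition
# library is only a stand-in; the report does not name the face-matching
# library actually used by the Biometric Service.
import face_recognition


def validate_identity(registered_photo: str, live_capture: str,
                      tolerance: float = 0.6) -> bool:
    """Compare the face registered at onboarding with a live webcam capture."""
    known = face_recognition.face_encodings(
        face_recognition.load_image_file(registered_photo))
    candidate = face_recognition.face_encodings(
        face_recognition.load_image_file(live_capture))
    if not known or not candidate:
        return False  # no face found in one of the images: fail closed
    # compare_faces returns one boolean per known encoding
    return bool(face_recognition.compare_faces(
        [known[0]], candidate[0], tolerance=tolerance)[0])
```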
18
2.1.4 UI DESIGN
19
2.1.5 Functional Test Cases
20
2.1.6 Daily Call Progress
21
2.1.7 Committed Vs Completed User Stories
22
2.1.8 Sprint Retrospective
2.2 SPRINT 2
2.2.1 Sprint Goal with User Stories of Sprint 2
The goal of the second sprint is to introduce real-time, AI-powered monitoring capabilities such as
facial re-verification, gaze and head-movement tracking, multiple-face detection, and audio-based
anomaly alerts.
The following Table 2.4 presents the detailed user stories of Sprint 2.
23
The Planner Board representation of these user stories is shown below in Figures 2.8, 2.9 and 2.11.
24
Figure 2.8 User story for Facial Re-Verification
25
Figure 2.9 User Stories for Gaze and Head Movement
26
Figure 2.11 Audio Anomaly Detection
27
2.2.2 Functional Document
2.2.2.1. Introduction
Sprint 2 marks the most significant transition in the development of ProctorPro. So far we have
been in the system setup phase; in Sprint 2 we begin to build the actual AI-powered monitoring
capabilities. In this sprint we introduce real-time exam monitoring features such as facial
re-verification, gaze tracking, multiple-face detection, and audio-based anomaly alerts. These
features are designed with accuracy, privacy, and user-centricity in mind.
The idea is to build intelligence into the system so it can make contextual decisions during
exams without imposing too much on the proctors or negatively impacting students' comfort.
To summarize, the main purpose of Sprint 2 is to build a smart monitoring layer into the platform
that can observe a student's behaviour during an exam and intelligently flag inappropriate
behaviour. Unlike traditional means of monitoring students during exams, the solution proposed
in this sprint does not rely on rigid rules: the AI does not make fixed assumptions; it learns from
patterns and the evolving typical movements and actions of students.
The behavioural monitoring and real-time alert workflows included in Sprint 2 are as follows:
● Facial Re-Verification: Before and during an exam, the system checks periodically to
ensure the same person is present. This will help avoid issues with impersonation.
● Eye Gaze & Head Movement Monitoring: The AI keeps an eye on head and
eye movement and can detect when a student is frequently looking
away, which may raise an alert.
28
● Multiple Face Detection: The system searches for extra faces in the camera feed, which
could indicate that someone is assisting the student (a minimal sketch of this check
follows after this list).
● Background Voice Monitoring: Audio is constantly being scanned for whispers and
background conversation with sound analysis tools.
● Anomaly Detection & Real-Time Alerts: Anomaly detection will flag behavior in
real-time and notify the assigned proctor in a subtle manner.
These processes happen quietly in the background, without distracting the student and with
the instructor's knowledge.
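The multiple-face check above can be sketched on the server side as counting the faces detected in each sampled webcam frame. The snippet below uses MediaPipe's Python face detector as an illustrative stand-in for the report's in-browser MediaPipe/TensorFlow.js pipeline; the flag names and confidence threshold are assumptions.

```python
# Minimal sketch of the multiple-face check on one sampled webcam frame,
# using MediaPipe's Python face detector as a stand-in for the in-browser
# MediaPipe/TensorFlow.js pipeline. Flag names and threshold are assumptions.
from typing import Optional

import cv2
import mediapipe as mp


def count_faces(frame_bgr) -> int:
    """Return the number of faces MediaPipe detects in a BGR webcam frame."""
    with mp.solutions.face_detection.FaceDetection(
            model_selection=0, min_detection_confidence=0.5) as detector:
        results = detector.process(cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2RGB))
    return len(results.detections) if results.detections else 0


def check_frame(frame_bgr) -> Optional[str]:
    """Raise a flag when zero or more than one face is visible (US008)."""
    faces = count_faces(frame_bgr)
    if faces == 0:
        return "no_face_visible"
    if faces > 1:
        return "multiple_faces_detected"
    return None  # exactly one face: nothing to flag
```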
2.2.2.5. Features
Feature 1: Facial Re-Verification
● Description: Verifies that the student's face before and during the exam matches the
registered biometric data.
● User Story: US006 - As a student, I want the application to perform facial re-verification
during the exam, to avoid impersonation.
Feature 2: Eye Gaze and Head Movement Tracking
● Description: Uses AI to check whether the student is regularly looking away from the screen.
● User Story: US007 - As a system, I need to track gaze and head movement in order
to detect distractions.
Feature 3: Multiple Face Detection
● Description: Alerts when more than one face is observed in the camera feed.
● User Story: US008 - As a proctor, I want a notification if multiple faces are detected
on screen.
29
Feature 5: ML-Based Behaviour Anomaly Detection
● Description: Learns the expected behaviour of a student in real time and flags deviations
from expected norms as potentially suspicious (a minimal sketch of this idea follows below).
● User Story: US010 - As an AI model, I want to track unusual pattern deviations.
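One simple way to realize the per-student anomaly idea above is to keep a rolling baseline of a behavioural metric and flag large deviations from the student's own norm. The metric, window size, and threshold below are assumptions for illustration, not the project's trained model.

```python
# Minimal sketch of per-student anomaly scoring: keep a rolling baseline of a
# behavioural metric (here, seconds of gaze-away per minute) and flag large
# deviations from that student's own norm. Window size and threshold are
# illustrative assumptions, not the project's trained model.
from collections import deque
from statistics import mean, pstdev


class BehaviourBaseline:
    def __init__(self, window: int = 30, z_threshold: float = 3.0):
        self.samples = deque(maxlen=window)   # recent per-minute measurements
        self.z_threshold = z_threshold

    def observe(self, value: float) -> bool:
        """Record one measurement; return True if it deviates strongly from
        the student's own recent baseline."""
        flagged = False
        if len(self.samples) >= 10:           # wait for some history first
            mu, sigma = mean(self.samples), pstdev(self.samples)
            if sigma > 0 and abs(value - mu) / sigma > self.z_threshold:
                flagged = True
        self.samples.append(value)
        return flagged


# Example: a student who usually glances away 4-6 s/min suddenly looks away 40 s.
baseline = BehaviourBaseline()
for v in [4.0, 5.0, 6.0] * 5:
    baseline.observe(v)
print(baseline.observe(40.0))  # True: far outside this student's normal range
```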
2.2.2.7 Assumption
1. Users Will Give Camera and Microphone Access: We expect that students will grant webcam
and microphone access when they first commence the assessment. This is crucial for the
system's real-time face tracking, gaze detection, and audio anomaly and interruption detection,
and is essential to enable the core functionality of Sprint 2.
2. Exams Will Be Taken in a Stable, Controlled Environment: We work on the assumption that
students will take their assessments in a reasonably quiet, private place, where the webcam has
a clear view of the candidate's face and ambient noise can be picked up cleanly by the system.
Very dark rooms, loud environments, and other people moving in and out can all interfere with
the accuracy of the monitoring tools.
3. AI Models Have at Least a Behavioural Baseline: The anomaly detection engine must assume
that it has some prior data (or default patterns) of normal exam behaviour to compare against.
While the system will continue to learn over time, it needs an initial behavioural baseline from
which the first flagging and detection logic can be derived.
4. Proctors Can Review Alerts in Real Time: The system is built to offload some of the proctor's
manual burden, but it expects that a human proctor is available to scan alerts, examine the live
stream for high-risk flags, and take action when necessary.
5. All Sensitive Data is Treated Securely and Respectfully: This sprint assumes that all
monitoring data, such as facial images, audio waveforms, and alert logs, is encrypted and
handled in keeping with privacy standards (e.g., GDPR). It also assumes that data is not
retained indefinitely unless needed for review or legal purposes.
30
6. The Real-Time Communication Channels are Stable: Socket-based alerts and live dashboards
require stable client-server communication. This sprint assumes that exam sessions are run
under reasonable internet connectivity and latency, so real-time alerts arrive without delay.
7. Students are Not Notified of Each Detection in Real Time: To avoid anxiety or manipulation
of the monitoring process, it is assumed that students do not receive an alert for every potential
detection the system makes. Alerts instead go to the proctor, who can decide whether to act on
the alert or simply dismiss it.
1. Monitoring Engine:
● Tracks directional gaze, facial features, and the presence of additional faces in the
webcam feed.
● Utilizes TensorFlow.js and MediaPipe for in-browser processing.
● Built using Socket.IO to deliver alert notifications to the proctor interface in real time.
● Notifications are stored with a timestamp and a brief log of the notification.
● Applies AES-256 encryption to all data streams for the duration of the exam (a minimal
encryption sketch follows after this list).
● No raw audio or video is ever retained; only encrypted clipboard metadata and
snapshots are kept.
31
● On-device processing minimizes lag and maximizes privacy for these parts of the
workflow.
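As a concrete illustration of the AES-256 requirement above, the sketch below encrypts a monitoring-event payload with AES-256-GCM before it leaves the student's machine. The key handling, event fields, and use of Python's cryptography package are assumptions; the report's browser build would do the equivalent in JavaScript.

```python
# Minimal sketch of encrypting a monitoring-event payload with AES-256-GCM
# before it leaves the student's machine. Key handling/distribution is out of
# scope here and the event fields are assumptions.
import json
import os

from cryptography.hazmat.primitives.ciphers.aead import AESGCM

key = AESGCM.generate_key(bit_length=256)   # 256-bit session key
aead = AESGCM(key)


def encrypt_event(event: dict) -> bytes:
    """Serialize and encrypt one alert/metadata event; the nonce is prepended."""
    nonce = os.urandom(12)                   # 96-bit nonce, unique per message
    return nonce + aead.encrypt(nonce, json.dumps(event).encode(), None)


def decrypt_event(blob: bytes) -> dict:
    nonce, ciphertext = blob[:12], blob[12:]
    return json.loads(aead.decrypt(nonce, ciphertext, None))


sealed = encrypt_event({"type": "gaze_deviation", "risk": "medium", "t": 1712})
print(decrypt_event(sealed)["type"])         # gaze_deviation
```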
Data Flow
2. Server Session:
● The flag is logged in the server database, and a notification with context (timestamp,
type of flag, and risk level) is sent to the assigned proctor (a minimal sketch of this flow
follows below).
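A minimal sketch of that server-side flow, assuming a python-socketio server and a SQLite store, is shown below; the event name, table layout, and room naming are illustrative assumptions rather than the project's actual schema.

```python
# Minimal sketch of the server-side flow: persist the flag, then push a
# context-rich alert to the assigned proctor's dashboard over Socket.IO.
# The event name, table layout, and room naming are assumptions.
import sqlite3
import time

import socketio

sio = socketio.Server(cors_allowed_origins="*")
db = sqlite3.connect("flags.db", check_same_thread=False)
db.execute("CREATE TABLE IF NOT EXISTS flags (ts REAL, student TEXT, "
           "kind TEXT, risk TEXT)")


def raise_flag(student_id: str, kind: str, risk: str, proctor_id: str) -> None:
    """Log the flag, then notify the assigned proctor in real time."""
    ts = time.time()
    db.execute("INSERT INTO flags VALUES (?, ?, ?, ?)",
               (ts, student_id, kind, risk))
    db.commit()
    sio.emit("proctor_alert",
             {"timestamp": ts, "student": student_id, "type": kind, "risk": risk},
             room=f"proctor:{proctor_id}")    # proctor joins this room at login
```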
3. Proctor Interface:
32
2.2.4 UI DESIGN
33
2.2.5 Functional Test Cases
34
2.2.6 Committed Vs Completed User Stories
35
2.2.7 Sprint Retrospective
2.3 Sprint 3
2.3.1 Sprint Goal with User Stories of Sprint 3
The goal of the third sprint is to make the system fully functional in live exam scenarios: a
real-time proctoring dashboard, manual review of flagged incidents, seamless session recovery,
and exam-integrity reporting. The following Table 2.7 presents the detailed user stories of Sprint 3.
36
#US3 As a student, I want the system to automatically reconnect after an internet
drop, so I do not lose progress on my exam or get unfairly penalized.
#US4 As an admin, I want to produce reports of all flagged activity during the exam,
so I can provide summaries to instructors, institutions, or compliance officers
#US5 As a user, I want to modify my interface settings (e.g., turning on dark mode or
changing font size), so I can stay focused and comfortable during exam time.
37
Figure 2.15 User Stories for Proctor Dashboard
38
Figure 2.16 User Stories for Flag Review and Manual Control
39
Figure 2.18 User Stories for Report Exporting
40
2.3.2 Functional Document
2.3.2.1 Introduction
Sprint 3 is the final and most consequential part of ProctorPro's development. Building on the
security established in Sprint 1 and the intelligent monitoring integrated in Sprint 2, Sprint 3
focuses on scaling the features, optimizing the user experience, and preparing the platform for use
in live exams. Our focus is on building a real-time proctoring dashboard, manual review of flagged
incidents, seamless session recovery if a connection drops, and exam-integrity reporting.
The features delivered in Sprint 3 are intended to give instructors control during a large-scale
remote exam and to give students a seamless and transparent experience.
The main focus of Sprint 3 is to make the system completely functional in exam scenarios. In the
end, the goal of this sprint is to make ProctorPro deployable, with functionality that is powerful
yet practically applicable.
This sprint brings several real-time workflows for live exams:
● Live Proctoring Dashboard: Proctors have access to a grid view of all students, and are
alerted visually to any suspicious activities
● Incident Review and Management: Proctors can click on a flagged alert, review the audio or
video snippet evidence, and mark it as reviewed, dismissed, or escalated for further review.
● Session Recovery: If a student is disconnected for any reason, the system automatically
re-authenticates them and returns them to their live session (a minimal sketch of this
mechanism follows after this list).
41
● Report Generation: Once the exam is over, proctors and admins can export a full report
that includes, among other things, time-stamped alerts, proctor notes, and system
recommendations.
● User Interface and Experience Updates: Students get a clean, focused user interface with
an optional dark mode, live status indicators, and as few distractions as possible.
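The session-recovery workflow above can be sketched as checkpointing the exam state on the server and restoring it after the student re-authenticates. The field names, in-memory store, and grace period below are illustrative assumptions; a production build would persist this state in a database.

```python
# Minimal sketch of session recovery: exam state is checkpointed server-side
# so a disconnected student can re-authenticate and resume where they left
# off. The in-memory store, field names, and grace period are assumptions.
import time
from typing import Dict, Optional

sessions: Dict[str, dict] = {}   # session_id -> last saved exam state


def checkpoint(session_id: str, question_index: int, answers: dict,
               remaining_seconds: int) -> None:
    """Called periodically (and on disconnect) to persist progress."""
    sessions[session_id] = {
        "question_index": question_index,
        "answers": dict(answers),
        "remaining_seconds": remaining_seconds,
        "saved_at": time.time(),
    }


def resume(session_id: str, grace_period: int = 300) -> Optional[dict]:
    """After reconnection and re-authentication, restore state if the drop
    was within the grace period; otherwise escalate to the proctor."""
    state = sessions.get(session_id)
    if state is None:
        return None
    if time.time() - state["saved_at"] > grace_period:
        return None   # offline too long: requires proctor review instead
    return state
```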
2.3.2.5. Features
Feature 1: Live Proctoring Dashboard
● Description: Proctors can see all active exams on a single dashboard, with real-time
updates, alerts, and live video previews.
● Purpose: To give proctors a top-level view that remains manageable for short-staffed and
strained teams.
Feature 2: Flag Review and Manual Control
● Description: Proctors can review and confirm flagged alerts, view short video clips,
leave comments, and override decisions if necessary.
● Purpose: To ensure fairness and give the proctor the final say over what counts as
suspicious behaviour.
Feature 3: Session Recovery
● Description: If a student is disconnected, the system retains the session state and
resumes it when they reconnect.
● Purpose: To support students with unstable or poor internet and prevent accidental
disqualifications.
Feature 4: Report Generation
● Description: Admins can create and download integrity reports in PDF or CSV format
that capture a summary of alerts, timestamps, and outcomes (a minimal CSV export
sketch follows below).
● Purpose: To provide transparency and documentation for audit or academic review.
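As an illustration of the report-generation feature above, the sketch below writes a CSV integrity report with one row per flagged event. The column names mirror the fields described in this section (timestamps, alert type, outcome) but are assumptions rather than the project's actual schema; a PDF export would follow the same structure.

```python
# Minimal sketch of exporting an exam-integrity report as CSV. Column names
# mirror the fields described above (timestamps, alert type, outcome) but are
# assumptions rather than the project's actual schema.
import csv
from datetime import datetime, timezone


def export_report(flags, path="integrity_report.csv"):
    """Write one row per flagged event, including the proctor's decision."""
    with open(path, "w", newline="") as fh:
        writer = csv.DictWriter(
            fh, fieldnames=["timestamp", "student", "type", "risk", "outcome"])
        writer.writeheader()
        for flag in flags:
            row = dict(flag)
            row["timestamp"] = datetime.fromtimestamp(
                flag["timestamp"], tz=timezone.utc).isoformat()
            writer.writerow(row)
    return path


export_report([{"timestamp": 1714989000, "student": "RA2111030010174",
                "type": "multiple_faces", "risk": "high",
                "outcome": "dismissed"}])
```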
42
Feature 5: Final UI Improvements and Accessibility features
2.3.2.7 Assumptions
1. Students Have Reasonably Stable Internet Connections.
We assume that the vast majority of students will have fairly stable internet. Although the
platform has session auto-reconnect, frequent or longer disconnections could still affect the
continuity of an exam or the accuracy of the monitoring process.
2. Proctors Are Trained in Understanding and Using the Live Dashboard and Review.
We assume that instructors or proctors using the system have had basic training or an
onboarding session. They will understand how to interpret alerts, how to use the manual review
panel, and how to create reports.
3. The Backend Can Handle Temporary Media Storage at Scale.
We assume that the backend infrastructure can handle and scale the storage of temporary video
snapshots and logs, with peak storage demand expected during high-volume exam sessions with
many concurrent users.
4. Detection Thresholds Have Been Tuned Using Sprint 2 Data.
It is assumed that the system has been trained, or configured with data from Sprint 2, to reduce
false positives and improve detection accuracy. The alert thresholds for gaze deviation, audio
issues, and multiple-face alerts have already been set.
5. Institutions Accept System-Generated Integrity Reports.
We assume that academic institutions or exam boards are comfortable accepting the integrity
reports generated by the system (time-stamped logs, summaries of raised alerts, proctor
comments, etc.) as valid supporting documents.
43
6. UI Customization for Focus not Avoidance
We expect options to toggle features such as themes or text size to be used appropriately by
students, for accessibility or comfort. We anticipate that students will not use them to hide
notifications or avoid supervision.
7. A Hybrid Human-AI Model.
We envision this system operating as a hybrid model: the AI detects anomalies and humans make
the final decisions about events. The AI was never designed to be judge and jury; it is a
smart assistant.
8. Exams Have Fixed Schedules.
We assume that exams are scheduled with a start and end time, rather than being open-ended or
rolling exams for which students create their own schedules. This is important for tracking,
synchronization, and log generation.
This sprint finalizes the platform’s architecture by focusing on scalability, performance, and
modular communication between services.
● WebSocket-powered streaming of video and alerts without page refreshes.
● Proctors can apply filters to alerts by severity or student ID.
44
4. Reporting Engine
5. UI/UX Engine
1. Student Side:
2. Server Side:
3. Proctor Side:
4. Reporting:
45
2.3.4 UI DESIGN
46
2.3.5 Functional Test Cases
47
2.3.7 Committed Vs Completed User Stories
48
2.3.8 Sprint Retrospective
Table 2.9 Sprint Retrospective for Sprint 3
49
CHAPTER 3
RESULTS AND DISCUSSION
3.1 Project Outcomes
50
CHAPTER 4
CONCLUSION & FUTURE ENHANCEMENTS
After many long months of planning, coding, testing, so much learning, and sometimes even
re-planning, we were ecstatic to finally launch ProctorPro.
And finally, here we were, with our own software package. But it wasn't just any software
package. We were responding to an urgent, real-world issue in education: how do we
make online exams feel as legitimate as in-person exams? How do we create something that
preserves academic honesty without making the exam feel cold and robotic?
With each sprint, each stand-up meeting, and each bug we squashed, we realized we were one
step closer to answering that question.
In Sprint 1 we concentrated on doing things the right way and doing right by the
students: registration, facial recognition onboarding, and roles. We were conscientious about the
engagement and safety of the students. We wanted student engagement to be high, while keeping
things simple enough not to overwhelm them. We focused on flows, design, and explaining to
students what action the software was taking and why.
Sprint 2 is where the system got smart, literally. This was the intelligence: ProctorPro watched,
learned, and alerted. Gaze tracking, audio monitoring, multiple-face detection; each was tested
not just to see whether it worked technically, but to see whether it was fair. We were intent on
ensuring students did not feel judged, on supporting the honest and discouraging dishonest
behaviour, without sounding like Big Brother.
Then it was on to Sprint 3—the last lap. The polish, the dashboard, the report output, the
resiliency. This is where everything came together as a live ecosystem. Proctors were watching
students in real time, they were reviewing student behaviours, they were exporting clean
reports, and students were able to recover from dropped Internet connections without alarm.
We felt we had built something of real utility.
Our pride doesn’t stem from the features or tech stack; the pride always comes from the
philosophy behind the platform. We built ProctorPro with empathy, we asked how it would be
to be watched and how it would be to do the watching. We sought to bridge that emotional
chasm, using respect in the design, transparent alerts, and intelligent assistance, not judgment.
51
This was a project that taught us a lot: about AI, about ethics, about working in teams, and
about what it takes to really ship something of value that solves real problems with technology.
It made us better developers, better collaborators and, honestly, better listeners.
For us, ProctorPro isn't the end of a project; it's the start of a process of building technology that
feels human. That is what we hope it will inspire, wherever it goes next.
FUTURE ENHANCEMENTS
1. Voice Biometrics for Identity Verification
2. Adaptive Behavioural Thresholds
● Why it is important: Currently, ProctorPro uses fixed thresholds for flagging student
behaviours (e.g., looking away for too long). However, not all students behave in the
same way.
● How it works: The adaptive AI would learn a student's natural habits over time. For
instance, one student may look away when in thought, while another student
may softly read their questions aloud. Instead of continuously auto-flagging these
behaviours, the system would learn what is normal for them.
● Student benefit: Fewer false alerts. Builds trust. Monitoring feels tailored to the
student rather than like constant judgement.
3. Mobile Compatibility
● Why it is important: Many students, particularly in developing regions, lack laptops
but have smartphones.
52
● How it helps: Making ProctorPro mobile-compatible would allow students to take
exams from their phones or tablets, while retaining most of the core components such
as face tracking, alerts, and exam integrity.
● Benefit for students: Access to exams for everyone, regardless of available devices or
economic circumstances.
4. LMS Integration
● Why it matters: Currently, students and instructors need to switch between
ProctorPro and the tools they already use, e.g., Moodle, Google Classroom, or Microsoft
Teams.
● How it helps: Deeper integration with the LMS would let instructors set up and observe
an exam from their existing tools, and students could launch ProctorPro immediately
without logging into another system.
● Benefit for instructors: Smoother workflows, less technical friction, and simpler adoption
across schools and colleges.
5. Multi-Language and Cultural Adaptation
● Why it matters: ProctorPro may be used globally, by students from different cultures,
backgrounds, and language groups.
● How it helps: Providing multi-language support, and adapting behavioural thresholds
to accommodate cultural differences would increase the inclusivity of the platform and
mitigate misunderstandings. For example, in many cultures, avoiding direct eye contact
is respectful - not suspicious.
● Student Benefit: Less anxiety. Better understanding. A platform that is designed for
them.
6. Blockchain-Secured Exam Records
● Why it is important: In high-stakes exams, any doubt about tampering with logs or
reports can compromise the whole exam process.
● How it helps: With blockchain, all logs (face-match logs, alerts, timestamps) would be
immutable and locked after the exam. This is ideal for competitive exams,
government exams, and certifications.
● Institutional benefit: Legal-grade evidence and audit trails, and better credibility.
53
7. Live Proctor-Student Communication
● Why it matters: There are times when students don’t even notice something is wrong.
Perhaps their webcam slipped, or they leaned too far out of frame.
● How it helps: A chat or voice notification could be built-in to enable the proctor to say,
“Please adjust your camera,” or “You are partially out of frame,” without stopping the
exam or generating a report.
● Student benefit: Instant feedback rather than passive flags. It feels human rather
than like an imposing authority.
8. Gamification of Exam Integrity
● Why it is important: Monitoring can be daunting, and students can feel like they are
being scrutinized.
● How it helps: Gamifying the experience with “Clean Session” badges or “Focus Streak”
records can encourage honest behaviour in a fun way. Rewards, not just
punishments, can change behaviour.
● Student advantage: Motivation, not fear. A relaxed exam climate that still encourages
integrity.
54
CHAPTER 5
REFERENCES
[1] A. Smith and B. Jones, “Biometric Methods in Remote Proctoring Systems,” IEEE Trans.
Educ., vol. 10, no. 2, pp. 104-112, March 2020.
[2] C. Lee, D. Kumar, and F. Patel, “Machine Learning Approaches for Detecting Anomalous
Behavior in Online Exams,” in Proc. IEEE Int. Conf. on E-Learning, 2021, pp. 235-240.
[3] Ahmed, M., and Parvez, M. (2023). AI-driven proctoring systems for online assessments:
A comparative study. Journal of Educational Technology, 15(2), 112-130.
[4] Kumar, R., and Singh, P. (2022). Deep learning approaches for automated exam proctoring.
IEEE Transactions on Learning Technologies, 14(4), 295-308.
[5] Patel, S., and Dey, A. (2021). Ensuring integrity in online exams: AI-based proctoring
methods. International Conference on E-Learning Innovations, 2021.
[6] Chen, L., and Wu, Y. (2020). A hybrid AI-human model for effective online exam
supervision. Journal of Artificial Intelligence in Education, 28(3), 223-241.
[7] Hassan, R., and Al-Mutairi, A. (2023). Ethical implications of AI-based proctoring in online
education. Computers and Education Review, 45, 78-92.
[8] Johnson, T., and Parker, M. (2022). Security and fairness concerns in automated online
proctoring. Cybersecurity in Education Journal, 9(1), 102-119.
[9] Sanchez, R., and Gomez, A. (2023). Real-time gaze tracking and behavioral analysis for
online exam monitoring. International Journal of Computer Vision and Education, 17(1), 56-
72.
[10] Li, Z., Wang, J. (2021). Deepfake detection for AI-based online proctoring. Journal of AI
Ethics and Security, 6(2), 88-104.
[11] Brown, K., Adams, C. (2022). The role of machine learning in adaptive online proctoring.
Advances in AI Education, 11(4), 137-152.
[12] Miller, D., Roberts, E. (2021). Addressing bias and fairness in AI-driven online exam
proctoring. AI Society Journal, 36(3), 309-325.
55
[13] Chaturvedi, K., Gupta, R., and Tripathi, R. (2021). AI-based Online Exam Proctoring
System. International Journal of Educational Technology, 18(2), 55-70.
[14] Mukherjee, S., and Sharma, R. (2023). Machine Learning Techniques in Remote Exam
Proctoring: A Comparative Study. IEEE Transactions on Learning Technologies, 16(1), 112-128.
[15] Patel, N., and Singh, A. (2022). Enhancing Academic Integrity Using AI-Based Proctoring.
Journal of E-Learning Research, 12(3), 45-63.
[16] Zhou, J., and Wang, Y. (2023). Deep Learning for Cheating Detection in Online Exams.
Neural Networks and Learning Systems, 29(2), 77-92.
[17] Kumar, D., and Raj, P. (2021). Evaluating Automated Proctoring Systems for Large-Scale
Online Assessments. Journal of Digital Education, 15(2), 88-105.
[18] Lee, S., and Kim, H. (2022). Ethical Considerations in AI-Based Online Proctoring.
Education & Ethics Journal, 19(3), 35-52.
[19] Chen, X., and Li, Z. (2023). Privacy Issues in AI-Powered Remote Exam Proctoring. IEEE
Privacy & Security Conference, 4(2), 98-114.
[20] Bhatia, R., and Agarwal, P. (2021). Facial Recognition and AI in Proctoring: Challenges
and Solutions. Computers & Education Journal, 10(2), 67-83.
[22] Jain, S., and Mehta, K. (2022). Exploring Biometric Authentication for Online Proctored
Exams. Computers in Human Behavior, 15(4), 23-41.
[23] Gupta, P., and Sharma, L. (2022). AI-Driven Plagiarism Detection in Online Exams.
Advances in Artificial Intelligence, 17(2), 75-94.
[24] Miller, D., and Williams, R. (2021). Bias in Automated Proctoring Systems: A Critical
Analysis. Journal of AI Ethics, 16(3), 112-134.
56
Appendix
57
B. Sample Codes