
A Learning Analytics Dashboard for Improved Learning Outcomes and Diversity in Programming Classes

Iris Groher1,a, Michael Vierhauser2,b and Erik Hartl1
1 Johannes Kepler University Linz, Institute of Business Informatics, Software Engineering, Linz, Austria
2 University of Innsbruck, Department of Computer Science, Innsbruck, Austria

Keywords: Learning Objectives, Assurance of Learning, Learning Analytics, Dashboard, Diversity.

Abstract: The increased emphasis on competency management and learning objectives in higher education has led to
a rise in Learning Analytics (LA) applications. These tools play a vital role in measuring and optimizing
learning outcomes by analyzing and interpreting student-related data. LA tools furthermore provide course
instructors with insights on how to refine teaching methods and material and address diversity in student
performance to tailor instruction to individual needs. This tool demonstration paper introduces our Learning
Analytics Dashboard, designed for an introductory Python programming course. With a focus on gender
diversity, the dashboard analyzes graded Jupyter Notebooks to provide insights into student performance
across assignments and exams. An initial assessment of the dashboard, applying it to our Python programming
course in the previous year, has provided us with interesting insights and information on how to further improve
our class and teaching materials. We present the dashboard’s design, features, and outcomes while outlining
our plans for its future development and enhancement.

1 INTRODUCTION

In recent years, the systematic management of competencies and learning objectives has gained widespread popularity, particularly in higher education (Malhotra et al., 2023; Bergsmann et al., 2015). In this context, Learning Analytics (LA) has become a major endeavor and a means to track and analyze learning outcomes and the achievement of competencies. LA is primarily concerned with the measurement, collection, analysis, and interpretation of data related to students and their learning in order to understand and ultimately optimize learning outcomes (Scheffel et al., 2014).

The importance of applications that support teaching and learning has increased significantly in recent years, due in no small part to the COVID-19 pandemic. Through the use of technologies such as online platforms, virtual learning environments, and learning management systems (LMS), huge amounts of data are generated, which creates opportunities for measuring the learning success of students and, when necessary, positively influencing learning outcomes through targeted intervention (Vieira et al., 2018).

Making use of LA can support both students and educators in many different ways. For students, it can help to personalize the learning path and enhance the learning experience. Students can further use LA to monitor their own progress, and the individual feedback gained can help them understand their strengths and weaknesses and make improvements. Educators can use LA to improve the quality of their courses and refine their teaching methods, course materials, and support services. They can identify difficulties of their students and potential drop-outs, measure course engagement and assessment performance, and measure learning objectives to ensure that their courses meet the required standards. LA can further play a crucial role in supporting diversity in higher education by helping educators identify, understand, and address disparities in student performance, engagement, and outcomes.

As a step towards LA in programming education, and to foster diversity analysis, we have created a Learning Analytics Dashboard for our introductory Python programming course. The dashboard takes graded Jupyter Notebooks (Johnson, 2020) from assignments or exams as an input and provides statistical analyses and visualizations of assignments and

a https://orcid.org/0000-0003-0905-6791
b https://orcid.org/0000-0003-2672-9230

618
Groher, I., Vierhauser, M. and Hartl, E.
A Learning Analytics Dashboard for Improved Learning Outcomes and Diversity in Programming Classes.
DOI: 10.5220/0012735000003693
Paper published under CC license (CC BY-NC-ND 4.0)
In Proceedings of the 16th International Conference on Computer Supported Education (CSEDU 2024) - Volume 2, pages 618-625
ISBN: 978-989-758-697-2; ISSN: 2184-5026
Proceedings Copyright © 2024 by SCITEPRESS – Science and Technology Publications, Lda.

their individual exercises or tasks. Furthermore, we have put specific emphasis on the gender diversity aspect, allowing us to drill down into submissions and gain valuable insights into how well certain tasks were performed by different groups of students. Our main goal was, for us as educators and course instructors, to gain insight into the challenges and difficulties our students have with the different topics covered in our Python course. As we were facing a gender gap with respect to course performance and drop-outs in the previous semesters, which has also been frequently reported as a major issue in computer science classes (Marquardt et al., 2023; Rubio et al., 2015; Groher et al., 2022), we wanted to find out if, where, and to what extent female students might face increased difficulties in our course.

In this tool demonstration paper, we present the initial version of the dashboard, its application in our programming course, and the insights and findings we gained when using the LA capabilities of the dashboard. We also report on our current and future plans to further expand the capabilities of the dashboard.

The remainder of this paper is structured as follows. In Section 2 we provide a brief introduction to the topic of LA in programming education and related tools, as well as to our introductory Python programming course and the main requirements that stem from this course, which guided the initial development of our dashboard. In Section 3 we then present the dashboard and its features, with a concrete application use case in Section 4. Finally, in Section 5 we discuss enhancements and additional features we are planning to add as part of our ongoing work, and conclude.

2 BACKGROUND AND COURSE SETTING

In this section, we present the background and tools related to our work, a brief overview of our introductory programming course, and the requirements for our LA dashboard derived from our experiences.

2.1 Learning Analytics in Programming Education

Learning analytics has already been successfully applied in programming education. López-Pernas and Saqr (López-Pernas and Saqr, 2021) combine data from different sources, such as learning management systems and programming assessment tools, to identify learning patterns among students. The programming learning platform Artemis integrates competency-based learning to generate personalized learning paths for individual students (Sölch et al., 2023). Other work analyzes the IDE usage patterns of students to gain insights into their skills and performance (Ardimento et al., 2019). Utamachant et al. (Utamachant et al., 2023) assess student engagement levels and identify at-risk students through learning activity gaps. In general, LA has been a growing field in recent years, with active research and a slew of tools on the commercial market. Moreover, established LMSs have integrated analytics capabilities into their platforms. For example, Moodle, as an open-source platform, provides analytics capabilities via a plug-in extension (Moodle, 2023). Moodle Analytics provides several different models (static and ML-based) for generating statistics about, for example, drop-out risks, activities that are due for submission, and further predictive models. In this context, Mwalumbwe and Mtebe (Mwalumbwe and Mtebe, 2017) conducted a study with the intent to develop an LA tool and analyze data from Moodle LMSs. Focusing on students as a target user group, Peraić and Grubišić (Peraić and Grubišić, 2022) have presented a "Learning Analytics Dashboard for students" (LAD-s), providing visualizations of student success and engagement as well as predictive analytics capabilities.

Woodclap (Woodclap, 2023) is another platform that focuses on virtual classrooms, facilitating communication with students on smartphones, messaging, and real-time interaction while monitoring student engagement and providing feedback on teaching techniques. Moreno-Medina et al. (Moreno-Medina et al., 2023) used this setting with chemical engineering students, in combination with gamification strategies, to assess and improve student participation and motivation. Krusche and Berrezueta-Guzman (Krusche and Berrezueta-Guzman, 2023) provide an interactive learning environment for programming classes, fostering iterative performance enhancement through real-time feedback mechanisms. However, the platform does not integrate support for diversity analysis and is limited to task-level analysis.

While existing systems offer valuable functionalities for course management and related analysis, they lack specific support for assignment-level and task-level analysis of programming courses. Also, support for diversity analysis is often limited. This motivated us to develop a customized dashboard for our setting.


2.2 Course Setting

We started our introductory Python programming course in 2021, as part of a new university-wide digitalization initiative in which all study programs (including non-technical/CS-related ones) should gain some familiarity with programming and algorithmic thinking. As part of this, we took over the programming education for business students, particularly business administration and economics.

With a total of 6 ECTS, the course is split into a weekly, slide-based lecture with additional live-coding sessions, and a corresponding weekly exercise where students apply the concepts from the previous lecture by solving examples during class and as part of their homework. Pair programming is applied during the exercise, whereby students work together on programming tasks covering the topic of the lecture. Additionally, homework assignments consisting of 5-6 individual tasks are distributed, which have to be completed and submitted by the students within one week. Tutors manually correct the assignments, give feedback, and assign points to the tasks of the assignments. Students are graded based on the points they receive for the weekly assignments and an exam at the end of the semester.

The main challenge, in this case, was that, compared to a computer science study program, where one can expect a certain level of technical (and mathematical) background, the students participating in our courses are quite diverse, with different educational backgrounds and prior knowledge related to programming. For most students, our course was the first time they had written code and/or executed a program written by themselves.

For this purpose, we opted for Python as a programming language, instead of Java – which is the standard language for programming education in CS courses – in conjunction with Jupyter Notebooks. The weekly assignments are distributed as Jupyter Notebooks and the students submit their solutions as notebooks in Moodle. The final exam at the end of the semester is conducted with the CodeRunner (Lobb, Richard and Hunt, Tim, 2023) plugin in Moodle.

2.3 Stakeholders & Requirements

In higher education, effective utilization of LA plays an important role in ensuring effective curriculum management and enhancing student outcomes. This is especially important for lecturers and course instructors who engage directly with the course content and the students. While the initial design of our platform primarily targets the needs of these educators, we also foresee expanding the platform's capabilities to incorporate the needs of students and program managers in the future. For now, we defined the following requirements for our dashboard:

R1. Course Management. A fundamental requirement revolves around the management of courses. This includes functionalities to set course settings, such as determining the start and end dates, entering the number of assignments, defining requirements such as the number of submissions and points for passing the course, and setting the number of students enrolled in the course.

R2. Document Management. The dashboard should allow for the seamless upload of graded Jupyter Notebooks, adhering to a defined JSON format.

R3. Analytical Insight into Assignments. To track course progress and ensure equitable assessment, there is a need to provide analytics about the number of submissions per assignment over the semester.

R4. Student Data Management. This encompasses the ability to manage pertinent student data, including names, IDs, gender, and details about the enrolled study program.

R5. Descriptive Statistics on Performance. For an in-depth analysis of student performance, educators require a distribution of points per assignment over the semester for all students. Additionally, a separate analysis filtered by gender, visualized using box plots to depict variability and central tendency, is needed. For each assignment, a detailed breakdown into the number of submissions, average points for the collective student body, and an analysis separated by gender is necessary. An average effort metric further illuminates the students' engagement levels. A similar granularity of insights is required for each task and exam.

R6. Individual Student Analytics. For personalized feedback and support, each student's profile should be enriched with their performance metrics, including points per assignment, average points, pass status, number of submissions, and other relevant details.

R7. Export Capability. Recognizing the diverse uses of such data, there should be a provision for exporting student-specific data for further analysis or reporting purposes.
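To illustrate the kind of analysis behind R5 and R7, the per-assignment, per-gender descriptive statistics could be computed roughly as follows. This is a minimal sketch under stated assumptions: the flat table layout and the column names (`assignment`, `gender`, `points`) are invented for illustration and are not the dashboard's actual schema (which is, moreover, implemented in JavaScript rather than Python).

```python
import pandas as pd

def performance_stats(df: pd.DataFrame) -> pd.DataFrame:
    """Per-assignment descriptive statistics, split by gender (cf. R5).

    Expects one row per graded submission with columns
    'assignment', 'gender', and 'points' (hypothetical layout).
    """
    return (
        df.groupby(["assignment", "gender"])["points"]
          .agg(submissions="count", mean="mean", median="median", spread="std")
          .reset_index()
    )

# A tiny synthetic example: two assignments, four students.
submissions = pd.DataFrame({
    "assignment": [1, 1, 1, 1, 2, 2],
    "gender":     ["f", "m", "f", "m", "f", "m"],
    "points":     [8.0, 7.0, 9.0, 6.0, 5.0, 4.0],
})
stats = performance_stats(submissions)

# Export for further analysis or reporting (cf. R7).
csv_export = stats.to_csv(index=False)
```

The same grouped table is what a box plot per assignment and gender would be drawn from; exporting it as CSV mirrors the dashboard's export capability.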


Figure 1: Learning Analytics Dashboard: Assignment-level analysis view.

3 ANALYSIS DASHBOARD

In this section, we present details of our Learning Analytics Dashboard, first providing an overview of its main features in this first iteration prototype, and then a brief overview of the technical details of the implementation. A short demonstration video of the main dashboard features1 is available online.

3.1 Features

The requirements of our initial implementation were driven by the need to gain insights into students' ability to successfully solve assignments, the submission rates of individual assignments and constituent tasks, and identifying potential gender gaps (cf. R3, R5) – hence largely by our own requirements. In the following, we provide a brief overview of the functionality and show examples in the dashboard (cf. Fig. 1 and Fig. 2).

• General Analysis and Trends of Assignments. In our previous programming classes, we have often experienced drop-outs in the middle of the semester or even towards the end of the course. Therefore, one of our main requirements was to get a better overview of individual assignments (handed out on a weekly basis), and whether there was a steady number of handed-in assignments (and successfully completed tasks within the assignment) or a noticeable decline in submissions over time. Fig. 1 provides an overview of the main view of the dashboard. The top part [A] provides information about the raw submission numbers for each assignment and allows us to easily identify if submission numbers are declining for a particular assignment, or steadily over time. Additionally, the lower part [B] provides an overview of the results for each assignment, i.e., the points achieved by students, and the spectrum (box plot) of the results. This helps us to identify exercises that might be particularly difficult (where students have received fewer points) or potential effects of different educational backgrounds (where we have a broad spectrum of points achieved).

• Detailed Analysis on Assignment and Task Level. While analysis on the assignment level can provide some valuable insights into the overall course, it does not provide sufficient detail on how students handle individual assignments and the topics, and ultimately learning objectives, associated with these assignments (and their constituent tasks). The second analysis level (cf. Fig. 2), therefore, is concerned with drilling down into individual assignments and the tasks that are part of the assignment (cf. R6). The top part [C] again provides an overview of submission numbers and results, whereas the bottom part [D] goes into further detail for each of the tasks. For each task, we get detailed insights into how well the students performed, in terms of the points achieved. The "point-per-point" visualization (grouped bar charts) provides detailed insights into the distribution of points for individual tasks and allows us to identify tasks that might be potentially too difficult or complex.

Figure 2: Learning Analytics Dashboard: Task-level analysis view.

• Gender Analysis. A cross-cutting concern for all analysis activities is the aspect of gender. Based on our previous, multi-year experience of offering basic programming classes for various different study programs, we have observed gender gaps in several of our courses. Research in this area has shown that precautionary measures and actions can (at least partially) rectify such issues, for example, by providing appropriate teaching material and assignments (Schmitz and Nikoleyczik, 2009; Spieler and Slany, 2018).

• Automated Analysis. One of our key requirements was to facilitate automated analysis of graded notebooks, while retaining the manual grading of assignments performed by tutors. Tutors do not only grade assignments and tasks, but also provide individual feedback about how well a problem was solved, and give hints and samples when a task could not be completed. With weekly assignments, the workload for tutors is already quite high, and we did not want to burden them with additional requirements (e.g., entering results in yet another tool – i.e., our dashboard). Instead, we opted for an automated parser that reads out assignment/task points (which are entered in a structured manner) and stores them as JSON information in the meta-data of the notebook (cf. R2). This information is then used for subsequent analysis in the dashboard. As a positive side-effect, this also decouples the dashboard from the specific structure/format of the assignments and allows for updated/changed Jupyter notebooks in the future, as long as the grading information is provided in the predefined JSON format. This further contributes to the generalizability of the dashboard, with potential applications to other (types of) programming classes (cf. further discussion in Section 5).

• Other Capabilities. Besides the main analysis capabilities, additional functionality is related to the ability to define courses with respective course settings (e.g., the number of students in the course, grading schemes, and number of assignments) (cf. R1). Additionally, as establishing interfaces to existing university systems where student data is stored is typically challenging, we added the ability to store basic student information (e.g., names, gender, study program); to ensure data privacy, it is stored only locally on university premises (cf. R4). In conjunction with this, we also added the ability to export results (cf. R7) in a standard CSV format, to enable grading information to be fed back to the existing university grading system.

3.2 Implementation

To facilitate easy access and availability for a broad range of users, we decided to implement our Learning Analytics Dashboard as a web application using JavaScript. The components of the dashboard are structured in a 3-tier architecture – presentation, logic, and data layer – with central data storage. For this purpose, we use a PostgreSQL database where information on courses and the results extracted from the Jupyter notebooks are stored. As the dashboard uses sensitive data concerning students and learning outcomes, we refrained from using cloud services and instead deployed our application as containers using Docker2. The core implementation, presentation, and logic use Node.js3, Next.js4, and React5.

1 https://github.com/TeachingAndLearningSciences/resources
2 https://www.docker.com
3 https://nodejs.org/en
4 https://nextjs.org
5 https://react.dev
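The automated parsing step described above could be sketched along the following lines, reading a notebook file (which is plain JSON) and extracting grading information from its metadata. Note that the metadata key (`grading`) and its layout are assumptions made for illustration; the paper's actual predefined JSON format is not specified here and may differ.

```python
import json
import os
import tempfile

def read_grading(notebook_path):
    """Read a .ipynb file and extract per-task points from its metadata.

    Assumes a hypothetical metadata layout:
    {"grading": {"tasks": [{"name": ..., "points": ...}, ...]}}
    """
    with open(notebook_path, encoding="utf-8") as fp:
        notebook = json.load(fp)
    tasks = notebook.get("metadata", {}).get("grading", {}).get("tasks", [])
    return {"tasks": tasks, "total": sum(t.get("points", 0) for t in tasks)}

# Demonstrate on a tiny synthetic notebook written to a temp file.
example = {
    "cells": [],
    "metadata": {"grading": {"tasks": [
        {"name": "loops", "points": 3},
        {"name": "pandas", "points": 2.5},
    ]}},
    "nbformat": 4,
    "nbformat_minor": 5,
}
with tempfile.NamedTemporaryFile("w", suffix=".ipynb", delete=False) as fp:
    json.dump(example, fp)
    path = fp.name
result = read_grading(path)
os.unlink(path)
```

Because the grading data travels inside the notebook's metadata rather than in a separate tool, tutors can keep their existing workflow, and the dashboard stays decoupled from the cell-level structure of the assignments.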


4 APPLICATION EXAMPLE & DISCUSSION

As an initial assessment of the usefulness of our Learning Analytics Dashboard application, we used it to analyze one iteration of our Python course in the summer semester of 2022. In this section, we present the insights and findings and further discuss the limitations and potential threats that need to be taken into account.

4.1 Analyzed Python Course

As part of this initial validation, we used the dashboard to analyze students' performance in 10 homework programming assignments throughout the course. The assignments covered a range of topics, from variables and data types to advanced modules like NumPy and Pandas, as well as object orientation. Fig. 3 (top) shows the number of submissions for each assignment and the distribution of points received for all students. The bottom part shows the detailed results for two tasks of Assignment 9. The analysis of the assignments with the help of our dashboard revealed several key insights.

Figure 3: Results from our analysis – Trends in Assignment Submission/Points (top) and detailed results for tasks (Pandas and Matplotlib) in the modules assignment (bottom).

• Submission Trends. While there was a slight decrease in the number of submissions for Assignments 9 and 10, we did not observe a significant drop-out. The lower numbers for Assignments 9 and 10 can be attributed to the fact that only 8 out of 10 submissions were mandatory.

• Assignment Metrics. For the first three assignments, covering basic concepts, data types, and simple programs, we observed a high average score and little spread. However, further into the semester and with increasing complexity, from Assignment 4 onward, we observed a lower average score and a higher spread in points.

• Assignment Drill-Down. The ability to further analyze the constituent tasks of an assignment also revealed some interesting insights, particularly for assignments where we already observed a significant spread in points. In particular, for Assignment 9, which covered the topic of modules, including Math, NumPy, Matplotlib, and Pandas, we observed the largest variability in scores. Notably, one-third of the students scored less than 1 out of 3 points in the Pandas task, whereas half of the students reached the maximum score of 3 (cf. Fig. 3 – bottom part). For Assignment 10, which focused on object orientation, the median score increased, indicating better understanding compared to the previous assignment.

• Gender Gap. Throughout the semester and across all assignments, in contrast to our initial assumptions, no significant gender-based performance gap was detected. We explicitly designed this course in an inclusive way based on our previous findings in an introductory Java course. Further analysis of exam results and time spent on homework assignments, however, is required to further confirm the gender equality in our Python course.

4.2 Implications and Limitations

Using the visualization capabilities of the dashboard, we were able to identify issues during the semester pertaining to the assignments and the topics covered in the class. Parts of these findings were later – at the end of the semester – also confirmed through a questionnaire we sent out to students, in which we specifically asked about the issues/challenges they experienced and about improvements for the class. Assignment 9, covering different modules like Pandas and Matplotlib, seemed to be particularly difficult for the students in our course. We, therefore, developed additional material and planned an extra lesson for the following semester. Also, the rule that students only need to submit 8 out of 10 exercises leads to many students dropping the last two exercises (covering modules and object orientation). As a result, we plan


to change this rule in future semesters especially the be interpreting the dashboard’s data. Contextualizing
topic of modules is needed in courses of subsequent the data with qualitative insights is also recommended
semesters. to avoid simplistic or misleading conclusions.
• Limitations. This preliminary validation does not Future work should focus on addressing these is-
capture other potential factors affecting student per- sues through a combination of technical safeguards,
formance, such as attendance, participation in tuto- policies, and user education to ensure that the dash-
rials, or specific educational backgrounds hence, we board serves as an effective, ethical, and secure edu-
can only draw limited conclusions about the learning cational tool.
outcomes of the course. However, the primary pur-
pose was to assess the usefulness of our dashboard
and the initial set of visualizations and statistical anal- 5 CONCLUSION AND FUTURE
yses that are provided. Furthermore, we so far only
covered one semester, but after initial positive results, WORK
our future plans to extend and apply the dashboard to
subsequent iterations and other programming classes The rapid development of LA in higher education em-
(cf. Section 5) will provide us with additional data and phasizes the need for systematic management of com-
relevant stakeholders for our tool. petencies and learning objectives. In this tool demon-
stration paper, we introduced an innovative Learning
Analytics Dashboard specifically designed for an in-
4.3 Discussion troductory Python programming course. This dash-
board not only aims to assist educators in pedagogi-
While our analytics dashboard offers many possibil-
cal decisions but also focuses on the critical area of
ities for enhancing Python programming education,
gender diversity within the course setting.
it also raises several concerns that require attention.
Our initial application which we used for our own
One of the most important issues regarding the im-
analysis, provided a series of valuable insights into
plementation of our Learning Analytics Dashboard is
student performance and engagement, pointing out
the concern for student privacy. The dashboard col-
specific challenges regarding the topics of modules
lects and analyzes various types of data, including as-
in Python. We could not detect a significant gender
signment points, task points, and gender information.
gap and drop-out rates in our course. These insights,
While this data is important for educational insights,
even with our initial prototype, already demonstrated
it also raises questions about the confidentiality and
the power of LA as not just a reactive tool for un-
anonymity of student information. Ensuring that the
derstanding student performance, but as a proactive
data is securely stored and accessed only by autho-
mechanism that allows for targeted interventions to
rized personnel is vital. Additionally, the dashboard
enhance educational equality.
must comply with relevant data protection regulations to ensure student privacy.

Ethical considerations extend beyond data privacy. The gender analysis feature, for instance, could inadvertently preserve stereotypes or biases if not carefully designed and interpreted. There is also the ethical question of how the data should be used. For example, should low performance of students trigger an automatic alert to educational staff, or should the data only serve as an analytical tool for course improvement? Balancing data utility and ethical considerations is crucial in this case.

The risk of data misinterpretation is inherent in any analytics tool. In the educational context, incorrect interpretation of the dashboard's data could lead to misplaced educational interventions. For example, a gender-based performance gap in assignment points might be wrongly attributed to pedagogical issues when external factors could be influencing the data. Therefore, it is essential to provide adequate training for lecturers and program managers who will use the dashboard.

Future work will expand on these initial successes. We plan to enhance the dashboard's capabilities to include more diversified analytics features, potentially adding support for analyzing different educational backgrounds. We further plan to add support for competency management, the establishment of links between competencies and assignment and exam tasks, and the analysis of competency coverage of the tasks and the competency achievements of students in the course. We are currently also working on support to increase the degree of automation. This includes a dedicated grading-support plug-in for Visual Studio for tutors, which generates the JSON data and automatically uploads notebooks to the dashboard when graded. Furthermore, we plan on going beyond Jupyter notebook-based Python courses and on supporting analysis capabilities over multiple semesters. The current version of our dashboard focuses on educators as our primary stakeholders. In the future, we will also provide views for students to monitor their progress in the course.
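To illustrate the kind of workflow such a grading-support plug-in enables, the following is a minimal sketch of how a per-notebook grading record could be assembled as JSON and posted to a dashboard endpoint. The field names, function names, and endpoint URL are illustrative assumptions, not the actual schema or API of the dashboard described in this paper.

```python
import json
import urllib.request

# Hypothetical grading record a plug-in might emit for one graded notebook.
# All field names below are assumptions for illustration purposes.
def build_grading_record(student_id, assignment, points, max_points):
    """Assemble a minimal grading record for one submitted notebook."""
    return {
        "student_id": student_id,
        "assignment": assignment,
        "points": points,
        "max_points": max_points,
    }

def upload_record(record, endpoint="https://dashboard.example.org/api/grades"):
    """POST the record as JSON to a (hypothetical) dashboard REST endpoint."""
    data = json.dumps(record).encode("utf-8")
    req = urllib.request.Request(
        endpoint, data=data, headers={"Content-Type": "application/json"}
    )
    with urllib.request.urlopen(req) as resp:  # network call; may raise
        return resp.status

# Build a record locally; the actual upload would run after grading.
record = build_grading_record("s123456", "assignment-05", 18.5, 20)
print(json.dumps(record))
```

In practice, the plug-in would invoke `upload_record` automatically once a tutor finishes grading, so that the dashboard receives fresh data without any manual export step.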

