
Measures for recommendations based on past students’ activity

Michal Huptych (1,2), Michal Bohuslavek (1,3), Martin Hlosta (1), Zdenek Zdrahal (1,2)

1 Knowledge Media Institute, The Open University, Walton Hall, Milton Keynes, MK7 6AA, UK
2 CIIRC, Czech Technical University, Zikova street 1903/4, Prague, 166 36, Czech Republic
3 Faculty of Mechatronics, Informatics and Interdisciplinary Studies, Technical University of Liberec, Studentska 1402/2, 461 17 Liberec 1, Czech Republic

{michal.huptych; martin.hlosta; z.zdrahal}@open.ac.uk, [email protected]

ABSTRACT

This paper introduces two measures for the recommendation of study materials based on students’ past study activity. We use records from the Virtual Learning Environment (VLE) and analyse the activity of previous students. We assume that the activity of past students represents patterns, which can be used as a basis for recommendations to current students.

The measures we define are Relevance, which describes the expected VLE activity derived from previous students of the course, and Effort, which represents the actual effort of individual current students. Based on these measures, we propose a composite measure, which we call Importance.

We use data from the previous course presentations to evaluate the consistency of students’ behaviour. We use the correlation of the defined measures Relevance and Average Effort to evaluate the behaviour of two different student cohorts, and the Root Mean Square Error to measure the deviation of the Average Effort from individual student Effort.

CCS Concepts

• Applied computing → Education; E-learning; Distance learning;

Keywords

Learning strategy; Recommendation; Student Retention; Learning Analytics; Relevance; Effort

LAK ’17, March 13-17, 2017, Vancouver, BC, Canada
© 2017 Copyright held by the owner/author(s). Publication rights licensed to ACM.
ISBN 978-1-4503-4870-6/17/03, $15.00
DOI: http://dx.doi.org/10.1145/3027385.3027426

1. INTRODUCTION

Data and metadata generated by e-learning systems can be fed back to various education-related tasks, such as the evaluation of learning materials and the design of new materials [4], predictions of student performance [13][16], recommendation of learning materials and the creation of personalised study plans ([8][2][4][11]).

In order to improve student learning, it is necessary to know which learning activities lead the students towards success. In the case of an online environment with large amounts of materials, this might be difficult to establish manually. However, there are several techniques that allow the data to be processed in an automated way. The important step is to specify a strategy that can be used for the description and representation of the data.

Recommender systems provide information, items of interest or services to the user according to the user’s activities and preferences. This paper presents a new approach to recommender design. Recommenders evaluate user behaviour and preferences and offer the user the most appropriate learning resource. There are different recommender techniques [1][12] implemented in a number of recommender systems [7][15]. According to [12], these techniques can be divided into four categories:

• Collaborative techniques construct recommendations from the behaviour and results of similar learners. Similarity is usually calculated from the VLE activities of the recommendation recipient and other learners in the present or past courses. A detailed description of collaborative recommenders can be found in [12][3][5][2].

• Content-based techniques use only information about the users and their histories for the recommendation [12]. Typical problem-solving methods are Case Based Reasoning and attribute-based techniques, which derive the recommendations from the learner profile [1][6][18][12][14].

• Matrix/tensor factorization techniques consist of the decomposition of a tensor into factors. The recommendation calculates the factorization from the known tensor values and uses the product of the factors to obtain the vector of unknown values. For details see [16].

• Association rules are machine learning techniques for discovering dependence patterns in data. The recommender mines rules from the activities of learners in the past to recommend activities to the current learner. Examples of association rules used for recommendation are in [10][17].
The approach presented in this paper draws from the collaborative techniques and association rules. We evaluate the VLE activities of successful students in the previous presentation, compare them with those of the currently supported learner, and recommend activities that should decrease the differences between the two. Some topics introduced earlier in the study plan are prerequisites for ones presented later; e.g. knowing the HTML language is necessary to understand the design of web applications. These dependencies are reflected in the learner behaviour and could be discovered from the measures introduced in this paper.
2. PROBLEM DESCRIPTION

At the Open University (OU), courses (modules) usually take about 40 weeks and are offered to students in a number of consecutive years. Each module has a study plan which breaks the course content down into Blocks. Each Block presents a different topic taught in the course and is further divided into Parts (1 week long). Thus, Block 1 Part 1 refers to week 1, Block 1 Part 2 to week 2, etc. The study plan usually does not significantly change between presentations. Study materials are provided in the Virtual Learning Environment (VLE) and therefore student clicks can be recorded. Each click has a ‘semantic label’ called activity type, which indicates the kind of interaction with the VLE. Examples of activity types are forum, resource, ou-content or quiz. Clicks on different activity types have different information content; resource is a page with text in PDF, and therefore one click provides access to all the underlying content. On the other hand, ou-content refers to the study materials represented in usually highly structured HTML, and the number of VLE accesses represents student effort quite well. Key study materials in modules are represented as ou-content, and for this reason we analyse clicks labelled as ou-content, both in the previous, already completed presentation and in the current one.

Each Block in the study plan has an associated expected study time. However, since student VLE interactions in the previous presentations are recorded, the real effort required for understanding each topic can be measured in terms of the average number of clicks of successful students on the corresponding web pages. We assume that the performance of students who passed the previous presentation well approximates the effort required at present.

The problem addressed in this paper is how to use ou-content VLE activities of the previous presentation and VLE data collected from current students to design a personalised study recommender that navigates students through the study plan.

Measuring time-on-task is not simple [9]. In our case the approximation by the number of clicks is sufficient.

3. RECOMMENDATION STRATEGY

The recommendation strategy is constructed from the relevance of the study material and the learners’ activity. These concepts are formally defined in the following sections.

3.1 Capturing study materials’ relevance

Relevance is defined as a normalized difference of the average cumulative student activity a, measured by the cumulative number of clicks on a specific study activity, between two consecutive weeks w − 1 and w:

R(w, a) = \frac{\sum_{i=1}^{w} c_p(i, a) - \sum_{i=1}^{w-1} c_p(i, a)}{\sum_{i=1}^{N} c_p(i, a)},   (1)

where c_p(i, a) is the number of clicks for the activity a in week i; \sum_{i=1}^{w} c_p(i, a) and \sum_{i=1}^{w-1} c_p(i, a) are the cumulative clicks from the beginning (week 1) to weeks w and w − 1, respectively; and \sum_{i=1}^{N} c_p(i, a) is the cumulative sum until the last week N of the previous presentation. Hence:

• Relevance is always non-negative: \forall w \forall a, R(w, a) \ge 0,

• the sum of Relevance for each activity over all weeks is 1: \forall a, \sum_w R(w, a) = 1,

• the Relevance of each activity is the same for all students.

An example of the cumulative clicks for 5 selected activities is shown in Figure 1; the corresponding relevance is shown in Figure 2.
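Because the numerator of (1) is a difference of consecutive cumulative sums, it reduces to the click count of week w alone; Relevance is simply each week’s share of the activity’s total clicks. A minimal numpy sketch of this computation (the matrix layout and names are our illustration, not from the paper):

```python
import numpy as np

def relevance(prev_clicks):
    """Relevance R(w, a) of eq. (1) for all weeks and activities.

    prev_clicks: (N, A) array of the average weekly clicks c_p(i, a) of
    successful students in the previous presentation (N weeks, A activities).
    The difference of cumulative sums in the numerator of (1) is just the
    weekly count, so each column is normalised by its total.
    """
    totals = prev_clicks.sum(axis=0, keepdims=True)  # sum_i c_p(i, a)
    return prev_clicks / totals  # entries >= 0, each column sums to 1
```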
3.2 Capturing learners’ activity

Further, we need a measure that captures the activity of the learner in the VLE and that can be related to Relevance. Therefore, we create a measure Effort and define it as:

E(w, a) = \frac{\sum_{i=1}^{w} c_c(i, a) - \sum_{i=1}^{w-1} c_c(i, a)}{\sum_{i=1}^{N} c_p(i, a)},   (2)

where c_c(i, a) is the number of clicks for activity a in week i from the current student; c_p(i, a) is the number of clicks for activity a in week i from the previous presentation; \sum_{i=1}^{w-1} c_c(i, a) and \sum_{i=1}^{w} c_c(i, a) are the cumulative clicks for the given activity from the beginning of the current presentation to weeks w − 1 and w, respectively; and \sum_{i=1}^{N} c_p(i, a) is the number of cumulative clicks until the last week of the previous presentation. Hence:

• the sum of the Effort for each activity over all weeks satisfies one of the following cases:

\forall a, \sum_w E(w, a) \text{ is }
\begin{cases}
< 1 & \text{if } \sum_{i=1}^{N} c_p(i, a) > \sum_{i=1}^{w} c_c(i, a), \\
= 1 & \text{if } \sum_{i=1}^{N} c_p(i, a) = \sum_{i=1}^{w} c_c(i, a), \\
> 1 & \text{if } \sum_{i=1}^{N} c_p(i, a) < \sum_{i=1}^{w} c_c(i, a),
\end{cases}   (3)

• Effort is given for each student individually,

• Average Effort is given as the average of Effort over all students.

Thus, the Effort represents an approximation of the progress in the given activity for an individual student. An example of Effort is shown in Figure 3.

Relevance and Effort, formalized by (1) and (2), capture our intuition of transferring the past experience (Relevance) to the behaviour of the current student (Effort).
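Under the same illustrative layout as the Relevance sketch above, Effort differs only in the normaliser: the current student’s weekly clicks are divided by the previous cohort’s totals, which is why the weekly values need not sum to 1, as in (3). A sketch:

```python
import numpy as np

def effort(curr_clicks, prev_clicks):
    """Effort E(w, a) of eq. (2) for one current student.

    curr_clicks: (W, A) array of the student's weekly clicks c_c(i, a) so far.
    prev_clicks: (N, A) weekly clicks c_p(i, a) of the previous presentation.
    Normalising by the previous presentation's totals means the column sums
    can end up below, equal to, or above 1 -- the three cases of eq. (3).
    """
    totals_prev = prev_clicks.sum(axis=0, keepdims=True)  # sum_i c_p(i, a)
    return curr_clicks / totals_prev
```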
Figure 1: Average number of cumulative clicks in time. Average cumulative clicks (0-80) plotted from October 2014 to July 2015 for the activities Block 1 Part 1, Block 1 Part 4, Block 2 Part 2, Block 3 Part 2 and Block 4 Part 2.

Figure 2: Relevance derived from the cumulative clicks. Relevance (0.0-0.3) plotted over weeks −5 to 40 for the same five activities.

3.3 Recommendation

Thus, we propose a recommender strategy that outputs for each activity a in week w its Importance, defined as:

I(w, a) = R(w - 1, a) - E(w - 1, a),   (4)

where R(w − 1, a) and E(w − 1, a) are the Relevance and the Effort for the given activity in the previous week, respectively. Thus, the Importance combines the Relevance of an activity in the previous week with the Effort of the student for that activity.
4. EVALUATION

We can empirically evaluate the similarity between students’ behaviour in the current and the previous presentation. We use the 2014 presentation for computing the Relevance and the 2015 presentation for retrieving the learners’ Effort.

In both presentations, we select only successful students. We disregard the failed/withdrawn students because previous research [13] shows that VLE behaviour is the discriminative factor between successful and unsuccessful students. From the previous presentation we selected 1,062 students and from the current one 922 students. We focus only on the activity types for which we know that repeated clicking is relevant, i.e. ou-content.

The Relevance and the Effort are both positive for all activities and weeks. If we use the Average Effort (over all students) in particular weeks, we can postulate that the Relevance and the Average Effort should be correlated. To measure the similarity, we use Pearson’s correlation.

Figure 4 shows that the Relevance of the educational activities in the previous presentation is similar to the Effort in the current presentation across all the weeks for successful students. This means that a) the behaviour of the successful students does not change from the previous to the current presentation, and b) the use of the Effort value will recommend the activity which should allow the learner to achieve similar results as the successful students in the topics where they are lagging behind.
Figure 3: Example of the Effort. Effort (0.0-0.3) plotted over weeks −5 to 40 for the activities Block 1 Part 1, Block 1 Part 4, Block 2 Part 2, Block 3 Part 2 and Block 4 Part 2.

Figure 4: Correlation matrix for Relevance of previous presentation and Average Effort for current presentation. Both axes list the 25 activities from Block 1 Part 1 to Block 6 Part 1 (Relevance on one axis, Average Effort on the other); the colour scale shows the correlation from −1.0 to 1.0.

To show the deviation of the Average Effort from the individual Efforts, we use the Root Mean Square Deviation (RMSD; definition in [16]). The RMSD for the selected particular activities is shown in Table 1.
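Both evaluation statistics are standard; a sketch with numpy/scipy under the same illustrative data layout as above (per-activity weekly series; all names are ours):

```python
import numpy as np
from scipy.stats import pearsonr

def consistency(relevance_prev, avg_effort_curr):
    """Pearson correlation between the Relevance series of the previous
    presentation and the Average Effort series of the current one,
    both of shape (weeks,) for a single activity."""
    r, _ = pearsonr(relevance_prev, avg_effort_curr)
    return r

def rmsd(individual_efforts, avg_effort):
    """Root Mean Square Deviation of individual Efforts, shape
    (students, weeks), from the Average Effort, shape (weeks,)."""
    return np.sqrt(np.mean((individual_efforts - avg_effort) ** 2))
```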
Table 1: RMSD of average Effort and particular individual Efforts

Activity name    RMSD    Activity name    RMSD
Block 1 Part 1   0.14    Block 3 Part 5   0.15
Block 1 Part 2   0.13    Block 4 Part 1   0.10
Block 1 Part 3   0.11    Block 4 Part 2   0.11
Block 1 Part 4   0.12    Block 4 Part 3   0.12
Block 1 Part 6   0.11    Block 4 Part 4   0.14
Block 2 Part 1   0.17    Block 4 Part 5   0.14
Block 2 Part 2   0.16    Block 5 Part 1   0.12
Block 2 Part 4   0.17    Block 5 Part 2   0.12
Block 2 Part 5   0.18    Block 5 Part 3   0.14
Block 3 Part 1   0.14    Block 5 Part 4   0.16
Block 3 Part 2   0.11    Block 5 Part 5   0.12
Block 3 Part 3   0.14    Block 6 Part 1   0.14
Block 3 Part 4   0.12    –                –

Dependencies between topics are shown in Figure 2. For example, though the highest relevance of Block 1 Part 1 is in about week 1 of the presentation, the topic is obviously also relevant in weeks 7 and 8. Similar dependencies exist between other topics.
5. CONCLUSIONS AND FUTURE WORK

In this work, we propose a novel strategy for personalized study recommendation that utilises the information from the successful students in the previous presentation. We define two measures, Relevance and Effort, which describe past students’ behaviour and the current student’s effort, respectively. Further, we define the theoretical principle of the recommendation based on these two measures, which we call Importance.

We use the historical VLE activity to evaluate our concept by correlating Relevance and Effort, which represents the consistency of students’ behaviour between both presentations. The result shows a strong correlation (mean ± std = 0.94 ± 0.05) between the activities of previous and current students. We interpret this finding as confirmation that the successful students have an important and significant pattern of learning.

Currently, we are enriching the OU Analyse system with the proposed recommender and we are planning to evaluate its impact on students’ behaviour.
6. REFERENCES

[1] G. Adomavicius and A. Tuzhilin. Toward the next generation of recommender systems: A survey of the state-of-the-art and possible extensions. IEEE Trans. on Knowl. and Data Eng., 17(6):734–749, April 2005.
[2] A. R. Anaya, M. Luque, and M. Peinado. A visual recommender tool in a collaborative learning experience. Expert Systems with Applications, 45:248–259, March 2016.
[3] J. Bobadilla, F. Serradilla, and A. Hernando. Collaborative filtering adapted to recommender systems of e-learning. Knowledge-Based Systems, 22(4):261–265, May 2009.
[4] M.-I. Dascalu, C.-N. Bodea, M. N. Mihailescu, E. A. Tanase, and P. O. de Pablos. Educational recommender systems and their application in lifelong learning. Behaviour & Information Technology, 35(4):290–297, January 2016.
[5] M.-I. Dascalu, C.-N. Bodea, A. Moldoveanu, A. Mohora, M. Lytras, and P. O. de Pablos. A recommender agent based on learning styles for better virtual collaborative learning experiences. Computers in Human Behavior, 45:243–253, 2015.
[6] H. Drachsler, H. G. Hummel, and R. Koper. Personal recommender systems for learners in lifelong learning networks: the requirements, techniques and model. International Journal of Learning Technology, 3(4):404–423, July 2008.
[7] H. Drachsler, K. Verbert, O. C. Santos, and N. Manouselis. Panorama of recommender systems to support learning. In F. Ricci, L. Rokach, and B. Shapira, editors, Recommender Systems Handbook. Springer US, Boston, MA, 2015.
[8] G. Durand, N. Belacel, and F. LaPlante. Graph theory based model for learning path recommendation. Information Sciences, 251:10–21, December 2013.
[9] V. Kovanović et al. Does time-on-task estimation matter? Implications on validity of learning analytics findings. Journal of Learning Analytics, 2(3):81–101, February 2016.
[10] E. García, C. Romero, S. Ventura, and C. de Castro. An architecture for making recommendations to courseware authors using association rule mining and collaborative filtering. User Modeling and User-Adapted Interaction, 19(1):99–132, February 2009.
[11] A. Garrido, L. Morales, and I. Serina. On the use of case-based planning for e-learning personalization. Expert Systems with Applications, 60:1–15, October 2016.
[12] A. Klašnja-Milićević, M. Ivanović, and A. Nanopoulos. Recommender systems in e-learning environments: a survey of the state-of-the-art and possible extensions. Artificial Intelligence Review, 44(4):571–604, December 2015.
[13] J. Kuzilek, M. Hlosta, D. Herrmannova, Z. Zdrahal, and A. Wolff. OU Analyse: analysing at-risk students at the Open University. Learning Analytics Review, LAK15-1:1–16, March 2015.
[14] J. Liu, P. Dolan, and E. R. Pedersen. Personalized news recommendation based on click behavior. In Proceedings of the 15th International Conference on Intelligent User Interfaces, pages 31–40. ACM, February 2010.
[15] J. Lu, D. Wu, M. Mao, W. Wang, and G. Zhang. Recommender system application developments: A survey. Decision Support Systems, 74:12–32, June 2015.
[16] N. Thai-Nghe, L. Drumond, A. Krohn-Grimberghe, and L. Schmidt-Thieme. Recommender system for predicting student performance. Procedia Computer Science, 1(2):2811–2819, 2010.
[17] F.-H. Wang and H.-M. Shao. Effective personalized recommendation based on time-framed navigation clustering and association mining. Expert Systems with Applications, 27(3):365–377, October 2004.
[18] Y. J. Yang and C. Wu. An attribute-based ant colony system for adaptive learning object recommendation. Expert Systems with Applications, 36(2):3034–3047, March 2009.
