Chapter II Final

This study reviews various learning techniques to assess their impact on the academic performance of senior high school students. It emphasizes the importance of understanding how different methods, such as elaborative interrogation and project-based learning, can enhance student engagement, comprehension, and retention of knowledge. The findings highlight the need for effective implementation and support for these learning strategies in educational settings.

CHAPTER II

REVIEW OF RELATED LITERATURE

The purpose of this study is to determine whether the learning practices perceived by students have a significant effect on the academic performance of senior high school students.

Reviewing the Learning Techniques

(e.g., Sternberg & Williams; Woolfolk, 2007). Despite the promise of some of the

methods, many of these textbooks did not provide adequate coverage, including up-to-

date evaluations of their efficacy, analyses of their generalizability, and potential

limitations. As a result, we conducted a literature review for each of the learning methods

listed to determine the generalizability of their advantages across four categories of

variables: materials, learning conditions, student characteristics, and criterion tasks. The

selection of these categories was influenced by Jenkins' (1979) model; Marsh & Butler,

in press, provide an illustration of its application in educational settings. Materials refer to the content that students are expected to learn, remember, or comprehend. Learning conditions refer to aspects of the context in which students interact with the to-be-learned materials. These conditions include aspects of the learning environment

itself (such as classroom noise versus quietness), but they mostly relate to how a learning

technique is used. When students are studying, a technique might be used once or many

times (a variable called dosage), or it might be used when students are reading or

listening to the material that needs to be learned. Several student characteristics could also influence the effectiveness of a given learning technique. For instance, younger students in the early grades may benefit less from a technique than students who are

more advanced. The effectiveness of a particular method may also be affected by

students' fundamental cognitive abilities, such as their capacity for working memory or

general fluid intelligence. In an educational setting, domain knowledge refers to the substantive, relevant knowledge a student brings to a lesson. Students may

need to have domain knowledge in order to use some of the listed learning methods. For

example, in order for them to use imagery while reading a text, they need to know the

things and ideas the words refer to so they can create internal images of them. Self-

explanation and elaborative interrogation are two methods that involve answering "why"

questions about a particular concept (for example, "Why would particles of ice rise up

within a cloud?"). Students who have some domain knowledge about a topic may also

find it easier to use these techniques. Domain knowledge might enhance the benefits of summarization and highlighting as well. However, although having some domain knowledge will

help students as they begin to learn new content in a particular domain, it is not necessary

for most learning methods. It is of the utmost importance to assess how well each method

of learning holds up over extended retention periods and how well it applies to various

criterion tasks. The objective performance of students on a variety of criterion tasks is

typically the basis for our reviews and recommendations. The specific types of

knowledge that are tapped by the criterion tasks vary. Some tasks are designed to test

students' memory for information. Some, like "Explain the difference between classical

conditioning and operant conditioning," target students' comprehension while others, like

"How would you apply operant conditioning to train a dog to sit down?", target students'

application of knowledge. Indeed, Bloom and colleagues classified learning objectives into six categories, ranging from memory (or knowledge) and comprehension of facts to their application, analysis, synthesis, and evaluation (B. S. Bloom, Engelhart, Furst, Hill, & Krathwohl, 1956; L. W. Anderson & Krathwohl, 2001, provide an updated taxonomy).

We emphasize studies that have measured students' comprehension, application, and

transfer of knowledge in addition to demonstrating improved memory for target material

when discussing how the techniques affect criterion performance. Although acquiring factual knowledge is not considered the primary or ultimate goal of education, we believe that efforts to increase student knowledge retention are necessary for achieving other instructional goals: applying fundamental ideas, facts, or concepts can be challenging, if not impossible, if one does not retain them. Students

who have forgotten the fundamentals of algebra won't be able to use them to solve

problems or build upon them when learning calculus (or physics, economics, or other

subjects related to it), and students who have forgotten what operant conditioning is

likely won't be able to use it to solve behavioral issues. We are not recommending that

students memorize facts in a rote manner; instead, we are recognizing the crucial

connection between the capacity to comprehend and apply a concept and memory for that

concept. This monograph aims to encourage students to use the right method or methods

of learning to achieve a given instructional objective. The keyword mnemonic, for example, focuses primarily on improving students' factual memory, whereas self-explanation focuses more on improving comprehension. Other learning

strategies may also improve both memory and comprehension (for example, practice

testing). Hence, our review of each learning technique describes how it can be used, its effectiveness for producing long-term retention and comprehension, and its breadth of efficacy across the categories of variables listed.

Reviewing the Learning Methods

In the following series of reviews, we consider the evidence for each

learning method's effectiveness. A brief explanation of the method and an explanation of

why it is anticipated to enhance student learning precede each review. After that, we

discuss the technique's generalizability (in terms of learning conditions, materials, student

characteristics, and criterion tasks), draw attention to any research on the method that has

been carried out in representative educational contexts, and address any issues that have

been identified for putting the method into practice. Accordingly, the reviews are largely modular: these topics are organized under corresponding headers within each of the ten reviews, so readers do not have to read the monograph in its entirety to find the most relevant information. We provide an overall evaluation of each technique in terms of its relative utility (low, moderate, or high) at the conclusion of each review. If teachers and

students are not already doing so, they should think about using high-utility techniques

because their effects are robust and generalize widely. Techniques might have been designated as low or moderate utility for any of several reasons. For instance, a method might have been deemed low utility because its effects are restricted to a small portion of the subject matter that students are required to learn; the method might be useful in

some situations if used in the right places, but in comparison to the other methods, its

limited generalizability makes it less useful. A method might also have received a low or moderate utility rating because it showed promise but there was insufficient evidence to support a higher rating. Aufschnaiter et al. (2016) conducted a mixed-methods study to investigate the perceptions and practices of expert educators on active

learning strategies in higher education. The study involved a survey of 93 experts from

different fields of study and analysis of interviews with 22 of them. The findings revealed

that the experts had a positive attitude towards active learning, viewing it as a means to

enhance student engagement, critical thinking, and problem-solving skills. However, they

identified various barriers to the adoption of active learning, including limited time and

resources, resistance from faculty members, and lack of training for instructors. The

study contributes to the body of research on active learning strategies by providing

insights from expert educators in different fields. The findings suggest that while active

learning is widely recognized as an effective means of enhancing learning outcomes,

there are still significant challenges to its implementation. Therefore, the study highlights

the need for more support and resources for educators to effectively implement active

learning strategies in higher education. Other studies have also examined the efficacy of

active learning strategies in higher education. Freeman et al. (2014) conducted a meta-

analysis of 225 studies and found that active learning strategies were associated with

improved student performance in STEM courses. Furthermore, a study by Prince (2004)

reviewed 22 case studies and found that active learning strategies increased student

motivation and engagement, facilitated deeper learning, and improved critical thinking

skills. In summary, Aufschnaiter et al.’s (2016) study on the perceptions and practices of

expert educators supports the notion that active learning strategies are effective in

fostering student engagement, critical thinking, and problem-solving skills. However, it

also recognizes the barriers that exist to implementing these strategies, emphasizing the

need for more support and resources for educators. Project-based learning (PBL) has

become increasingly popular in higher education as a teaching method that promotes

active and collaborative learning. According to Yilmaz (2017), PBL involves students

working on real-world problems and developing solutions through collaboration,

research, and critical thinking. In recent years, there has been growing interest in

evaluating the outcomes of PBL in higher education and determining measures of its

effectiveness. Several studies have investigated the impact of PBL on student outcomes

and measures. For example, Savery and Duffy (2015) found that PBL has a positive

effect on student engagement, critical thinking skills, and academic achievement. In

another study, Hung and Jonassen (2015) reported that students who participated in PBL

showed higher levels of creativity, problem-solving ability, and teamwork skills.

Moreover, Kirschner et al. (2018) argued that PBL can enhance students' motivation,

self-efficacy, and metacognitive skills. Despite the growing interest in PBL, some

researchers have expressed concerns about its implementation and evaluation. For

instance, Van Ginkel et al. (2019) highlighted the need for clear learning goals, well-

designed assessment methods, and support for students and teachers to ensure the success

of PBL. In addition, Micari and Fitchett (2018) suggested that more research is needed to

evaluate the long-term effects of PBL on students' career readiness and professional

development. Another benefit of PBL is the enhancement of collaboration and teamwork

skills. PBL often involves working in teams, which can help students develop important

collaboration and teamwork skills. PBL was also found to improve communication skills.

Students in PBL environments are often required to communicate their ideas and findings

to others, leading to improved communication skills. Overall, the literature suggests that

PBL can be an effective teaching method in higher education, leading to positive



outcomes for students in terms of skills, knowledge, and motivation. However, there is a

need for careful planning and evaluation of PBL, with attention to factors such as

learning goals, assessment methods, and support for students and teachers.

General Description of the Benefits of Elaborative Interrogation

In one of the earliest systematic studies of elaborative interrogation, Pressley, McDaniel, Turnure, Wood, and Ahmad (1987) presented college students with a list of sentences, each describing the action of a particular man. In the elaborative-interrogation group, participants were prompted to explain why that particular man performed the action; a second group received an explanation for each sentence, while a third group simply read each sentence aloud. On a final test, participants were prompted to recall which man performed each action.

Collapsing across experiments, the accuracy of the elaborative-interrogation group was

approximately 72%, compared to approximately 37% in each of the other two groups.

This group significantly outperformed the other two groups. Seifert (1993) found that the

average effect sizes from this and other studies that were similar ranged from 0.85 to

2.57. The key to elaborative interrogation, as shown above, is getting students to come up

with an explanation for a fact that has been explicitly stated. The form of the explanatory

prompt varies slightly from study to study; examples include "Why does it make sense that…?", "How can this be?", and simply "Why?" Nonetheless, the majority of studies have used prompts following this general format. The

prevailing theoretical account of elaborative-interrogation effects is that elaborative interrogation enhances learning by supporting the integration of new information with existing prior knowledge. During elaborative interrogation, students presumably activate schemata; these schemata, in turn, aid in the organization of new information, making it easier to retrieve (Willoughby & Wood, 1994, p. 140). Students must also be able to

differentiate among related facts in order to be accurate when identifying or using the learned information, even though the integration of new facts with prior knowledge may facilitate the organization of that information (Hunt, 2006). Note that the

majority of elaborative-interrogation prompts explicitly or implicitly encourage

processing of both similarities and differences between related entities (such as why a

fact would be true in one province versus other provinces), which is consistent with this

account. As we highlight below, processing of similarities and differences among to-be-learned facts also accounts for findings that elaborative-interrogation effects are often larger when elaborations are precise rather than imprecise, when prior knowledge is higher rather than lower (consistent with research showing that prior knowledge enhances memory by facilitating distinctive processing; e.g., Rawson & Van Overschelde, 2008), and when elaborations are self-generated rather than provided (a finding consistent with research showing that distinctiveness effects depend on self-generating item-specific cues). Issues for implementation. One merit of elaborative interrogation is that it apparently requires minimal training. Before beginning the

main task, students in the majority of studies that reported elaborative-interrogation

effects were given brief instructions and practiced generating elaborations for three or

four practice facts (sometimes, but not always, with feedback about the quality of the

elaborations). In some studies, students were not provided with any practice or illustrative examples prior to the main task. Moreover, elaborative interrogation appears to be relatively reasonable with respect to time demands. Nearly all studies set reasonable limits on the amount of time allotted for reading a fact and for generating an elaboration (e.g., 15 seconds allotted per fact). In one of the few studies permitting self-paced learning, the time-on-task difference between the elaborative-interrogation and reading-only groups was relatively small (32 minutes vs. 28 minutes; B. L. Smith et al.,

2010). Finally, because the prompts used across studies are similar, it is easy to tell students what kinds of questions they should use to elaborate on facts while studying. Having said that, one limitation noted earlier is that elaborative

interrogation may only be applicable to specific factual statements. According to

Hamilton (1997), "when focusing on a list of factual sentences, elaborative interrogation

is fairly prescribed." However, it is unclear where to direct the "why" questions when

focusing on more complex outcomes (p. 308). For instance, when learning about a complex causal process or system (e.g., the digestive system), the appropriate grain size for elaborative interrogation is an open question (e.g., should a prompt focus on an entire system or on a smaller part of it?). In

addition, students will need to identify their own target facts when elaborating on facts

embedded in longer texts, whereas the facts to be elaborated are clear when working with

fact lists. As a result, students may require some instruction regarding the kinds of

content in which elaborative interrogation may be fruitfully applied. Dosage is also of

concern with lengthier text, with some evidence suggesting that elaborative-interrogation

effects are substantially diluted (Callender & McDaniel, 2007) or even reversed (Ramsay,

Sperling, & Dornisch, 2010) when elaborative-interrogation prompts are administered

infrequently.
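As an aside for readers unfamiliar with the standardized effect sizes cited in this review (such as the 0.85 to 2.57 range attributed to Seifert, 1993), a Cohen's d can be computed from group means and a pooled standard deviation. The sketch below is purely illustrative: the means, standard deviations, and sample sizes are hypothetical, not values taken from the studies reviewed.

```python
import math

def cohens_d(mean_1, mean_2, sd_1, sd_2, n_1, n_2):
    """Standardized mean difference between two independent groups.

    The pooled standard deviation weights each group's variance by
    its degrees of freedom (n - 1).
    """
    pooled_var = ((n_1 - 1) * sd_1 ** 2 + (n_2 - 1) * sd_2 ** 2) / (n_1 + n_2 - 2)
    return (mean_1 - mean_2) / math.sqrt(pooled_var)

# Hypothetical recall accuracy (proportion correct) for an
# elaborative-interrogation group versus a reading-only control.
d = cohens_d(mean_1=0.72, mean_2=0.37, sd_1=0.20, sd_2=0.20, n_1=30, n_2=30)
print(round(d, 2))  # prints 1.75
```

By convention, d values around 0.2, 0.5, and 0.8 are described as small, medium, and large, which places the effect sizes reported for elaborative interrogation at the large end of the scale.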

Overall Assessment of Elaborative Interrogation

We give elaborative interrogation a moderate utility

rating. Although the applicability of elaborative interrogation to material that is longer or

more complex than fact lists remains a concern, elaborative interrogation effects have

been demonstrated across a relatively broad range of factual topics. Concerning student characteristics, effects of elaborative interrogation have been consistently reported for students at least as young as upper elementary age, although some evidence suggests that the benefits of elaborative interrogation may be limited for students with low levels of domain knowledge. Concerning criterion tasks, measures of associative

memory administered after short delays show that elaborative interrogation effects are

firmly established. However, more research is needed to draw firm conclusions regarding

the extent to which elaborative interrogation enhances comprehension or the extent to

which elaborative interrogation effects persist across longer delays. Elaborative

interrogation's effectiveness in representative educational contexts would also benefit a

from additional research. In aggregate, the requirement for additional examination to lay

out the generalizability of elaborative-cross examination impacts is essentially why this

procedure didn't get a high-utility rating Self-explanation A general explanation of self-

explanation and the reasons why it ought to be effective. Berry (1983) conducted the

pivotal study on self-explanation and used the was on card-selection task to investigate

the effects of self-explanation on logical reasoning. A student might be given four cards
16

with the numbers on them for this task, and they might be asked to choose which cards

they should flip to test the rule "if a card has A on one side, it has 3 on the other side" (an

instance of the more general. Understudies were first requested to settle a substantial

launch from the standard (e.g., flavor of jam on one side of a container and the deal cost

on the other); accuracy was almost null. After that, they were given a set of concrete

problems involving the application of the rule as well as other logical rules and a brief

explanation of how to solve it. One group of students was asked to self-explain while

solving each problem in the set of concrete practice problems by explaining why they chose or

didn't choose each card. After another group of students had completed all of the set's

problems, they were then asked to explain how they had solved them. A control group of

students never received any prompts to self-explain. In each of the three groups, accuracy

on the practice problems was at least 90%. However, the two self-explanation groups

performed significantly better than the control group when the logical rules were applied

to a set of abstract problems presented in a subsequent transfer test. In a second experiment, another control group was explicitly told about the logical connection between the upcoming abstract problems and the concrete practice problems they had just solved, but they did not fare any better (28 percent). Having students explain some aspect of their

processing during learning is the central component of self-explanation, as shown above.
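The card-selection logic that Berry's participants had to articulate can be made concrete with a short sketch. This is an illustrative aside, not code from the reviewed studies; the function name and the card set are invented for the example. For a rule of the form "if P, then Q," only the P card and the not-Q cards can falsify the rule, so those are the ones worth flipping.

```python
def cards_to_flip(visible_faces, p="A", q="3"):
    """Return the cards worth turning over to test "if P on one side, then Q on the other."

    Only a card that could pair P with a non-Q face can falsify the rule:
    the card showing P, and any number card that is not Q. The Q card and
    other letter cards cannot disconfirm the rule, so flipping them is wasted effort.
    """
    flips = []
    for face in visible_faces:
        if face == p:                       # P: its hidden side must be Q
            flips.append(face)
        elif face.isdigit() and face != q:  # not-Q: its hidden side must not be P
            flips.append(face)
    return flips

print(cards_to_flip(["A", "D", "3", "7"]))  # prints ['A', '7']
```

Explaining why the "3" card need not be flipped (confirming instances cannot falsify the rule) is exactly the kind of justification the self-explanation prompts were designed to elicit.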

Self-explanation may support the integration of new information with existing prior

knowledge, which is consistent with fundamental theoretical assumptions regarding the

associated method of elaborative interrogation. However, the prompts used to elicit self-explanations have been much less consistent across studies than those used in the elaborative-interrogation literature. Depending on the variant of the prompt used, the specific mechanisms underlying self-explanation effects may differ somewhat. The main dimension along which self-explanation prompts differ is the extent to which they are content-free versus content-specific. For instance, many studies have used prompts that do not explicitly mention any particular content from the materials to be learned (e.g., "Explain what the sentence means to you. That is, what new information does the sentence provide for you, and how does it relate to what you already know?"). On the other end of the spectrum, many studies have used prompts that are considerably more content-specific.

How Widespread Are the Effects of Self-Explanation?

Learning conditions. In addition to self-explanation, a number of studies have

manipulated other aspects of learning conditions. Self-explanation, for instance, was

found to be effective when accompanied by either direct instruction or discovery

learning, according to Rittle-Johnson (2006). In terms of potential moderating factors,

Berry (1983) included a group of participants who self-explained after each problem was

solved rather than while the problem was being solved. Compared to no self-explanation,

retrospective self-explanation did improve performance, but the effects were less

pronounced than those of concurrent self-explanation. Another moderating variable may concern the extent to which provided explanations are made available to students. Schworm

and Renkl (2006) found that when students had access to explanations, self-explanation

effects were significantly reduced. This is likely because students made few attempts to

answer the explanation prompts before consulting the information provided (see also Aleven & Koedinger, 2002). Student characteristics. Both younger and older

students have been shown to benefit from self-explanation. Indeed, at least as many

studies involving younger learners as undergraduates have been conducted in self-

explanation research, which has relied significantly less on samples of college students

than the majority of other literatures. Self-explanation effects have been shown to be

beneficial for kindergarteners in a number of studies, as well as for elementary, middle,

and high school students. The extent to which the effects of self-explanation generalize

across various levels of prior knowledge or ability has not been sufficiently investigated,

in contrast to the breadth of age groups examined. Concerning knowledge level, a few studies have used pretests to select participants with relatively low levels of knowledge or task experience, but no research has systematically examined self-explanation effects as a function of knowledge level. Concerning ability level, Chi, de Leeuw, Chiu, and LaVancher (1994) examined the effects of self-explanation on learning from an expository text about the circulatory system among the participants in their sample who had received the highest and lowest scores on a measure of general aptitude and found gains of similar magnitude in each group. Didierjean and Cauzinille-Marmèche (1997), on the other hand,

examined algebra-problem solving in a sample of ninth-grade students with either low or

intermediate algebra skills and discovered that self-explanation effects were only

observed in students with lower skills. Further work is needed to establish the generality of self-explanation effects across these important individual-difference dimensions. Materials. One strength of the self-explanation literature is that effects have been demonstrated not only across various materials within a task domain but also across multiple task domains. In addition to the logical-reasoning problems Berry (1983) used, self-explanation has been shown to help students solve other kinds of logic puzzles. Self-explanation

has also been shown to make it easier to solve a variety of math problems, such as

elementary-level mathematical-equivalence problems, algebraic formulas, and geometric

theorems for older students, and simple addition problems for kindergarteners. Self-

explanation improved student teachers' evaluations of the usefulness of practice problems

for classroom instruction as well as their ability to solve problems. Self-explanation has

also assisted younger students in overcoming a variety of misconceptions, enhancing

their comprehension of concepts such as number conservation, which explains that the

number of objects in an array does not change when the positions of those objects in the

array change, and principles of balance (such as the fact that not all objects balance on a

fulcrum at their center point). Self-explanation has also improved children's pattern learning and adults' learning of endgame strategies in chess. Although

the majority of studies on self-explanation have focused on procedural or problem-

solving tasks, a number of studies have found self-explanation effects for learning from

text, including short narratives and longer expository texts. Thus, self-explanation seems

to have a wide range of applications. Criterion tasks. Given the variety of tasks and domains investigated, it may come as no surprise that self-explanation effects have been demonstrated on a broad range of criterion measures. Free

recall, cued recall, fill-in-the-blank tests, associative matching, and multiple-choice tests

that tap explicitly stated information have all shown self-explanation effects in some

studies. Effects have also been shown on measures of comprehension. Effects in representative educational contexts. The results of two studies in which participants were asked to learn course-relevant content are at least suggestive that self-explanation can enhance learning in educational contexts. In a

study by Schworm and Renkl (2006), students in a teacher-education program learned how to develop example problems for use in their classrooms by studying samples of well-designed and poorly designed example problems in a computer program. On each trial, students in a self-explanation group were prompted to explain why one of two examples was more effective than the other, whereas students in a control group were not prompted to self-explain. On each trial, half of the participants in each group had the option to examine the explanations provided by the experimenter. The self-explanation group performed better than the control group on an immediate test in which participants selected and created example problems. However, this effect was limited to students who had not been able to view the provided explanations, presumably because students made minimal attempts to self-explain before consulting the provided information. R. M. F.

Wong et al. (2002) presented tenth-grade students in a geometry class with a theorem from the course textbook that had not yet been studied in class. During the initial learning session, students were asked to think aloud while studying the relevant material, which included the theorem, an illustration of its proof, and an example of an application of the theorem to a problem. Half of the students were explicitly prompted to self-explain after every one or two lines of new information (e.g., "What parts of this page are new to me? What does the statement mean? Is there anything I still don't understand?"), whereas students in the control group were simply asked to think aloud while studying, with no specific instructions. One week later, all students received a brief review of the theorem and took the final test the following day. Self-explanation did not improve performance on near-transfer questions but did improve performance on far-transfer questions. Implementation issues. As previously noted, one strength of the self-explanation strategy is its broad applicability across a variety of tasks and content domains. Moreover, in almost all of the studies reporting significant effects of self-explanation, participants were given minimal instructions and little to no practice with self-explanation before completing the experimental task. Thus, most students apparently can benefit from self-explanation with minimal training. Nevertheless, some students may require more instruction to implement self-explanation effectively. Didierjean and Cauzinille-Marmèche (1997) conducted a study in which ninth-graders with poor algebra skills

received little instruction before engaging in self-explanation when attempting to solve

algebraic problems. Students produced far more paraphrases than explanations when

think-aloud protocols were examined. Several studies have reported positive correlations between final-test performance and both the quantity and quality of explanations generated by students during learning, further suggesting that the benefits of self-explanation may be enhanced by teaching students how to implement the technique effectively (for examples of training methods, see Ainsworth & Burcham, 2007; R. M. F. Wong et al., 2002). However, in at least some of these studies, students with greater domain knowledge may have produced self-explanations of higher quality; if this is the case, the students who performed the worst

may not have benefited from additional training in the technique. The effectiveness of

self-explanation will be significantly affected by the results of an investigation into the


22

relationship between self-explanation skill and domain knowledge. The time

requirements for self-explanation and the extent to which self-explanation effects may

have been caused by more time spent on task remain a significant issue. Sadly, the

majority of studies involving self-paced practice did not report participants' time on task,

and few studies compared self-explanation conditions to control conditions involving

other strategies or activities.



References

Aleven, V., & Koedinger, K. R. (2002). An effective metacognitive strategy: Learning by doing and explaining with a computer-based Cognitive Tutor. Cognitive Science, 26, 147–179.

Anderson, L. W., & Krathwohl, D. R. (Eds.). (2001). A taxonomy for learning,

teaching and assessing: A revision of Bloom’s taxonomy of educational objectives:

Complete edition. New York, NY: Longman.

Aufschnaiter, S., Zimmermann, B., Harder, M., Nückles, M., & Wilde, M. (2016). Fostering effective learning strategies in higher education—a mixed-methods study on expert perspectives. International Journal of STEM Education, 3(1), 1–15.

Berry, D. C. (1983). Metacognitive experience and transfer of logical reasoning.

Quarterly Journal of Experimental Psychology, 35A, 39–49.

Bloom, B. S., Engelhart, M., Furst, E. J., Hill, W., & Krathwohl, D. R. (1956).

Taxonomy of educational objectives, Handbook I: Cognitive domain. New York, NY:

Longman.

Callender, A. A., & McDaniel, M. A. (2007). The benefits of embedded question

adjuncts for low and high structure builders. Journal of Educational Psychology, 99, 339–

348.

Chi, M. T. H., de Leeuw, N., Chiu, M.-H., & LaVancher, C. (1994). Eliciting self-

explanations improves understanding. Cognitive Science, 18, 439–477.



Didierjean, A., & Cauzinille-Marmèche, E. (1997). Eliciting self-explanations improves problem solving: What processes are involved? Current Psychology of Cognition, 16, 325–351.

Freeman, S., Eddy, S. L., McDonough, M., Smith, M. K., Okoroafor, N., Jordt, H., & Wenderoth, M. P. (2014). Active learning increases student performance in science, engineering, and mathematics. Proceedings of the National Academy of Sciences, 111(23), 8410–8415.

Hung, W., & Jonassen, D. H. (2015). A review of concept maps: Their utility in teaching, learning and assessment. The Royal Society of Chemistry.

Hunt, R. R. (2006). The concept of distinctiveness in memory research. In R. R. Hunt & J. B. Worthen (Eds.), Distinctiveness and memory (pp. 3–25). New York, NY: Oxford University Press.

Kirschner, P. A., Sweller, J., & Clark, R. E. (2006). Why minimal guidance during instruction does not work: An analysis of the failure of constructivist, discovery, problem-based, experiential, and inquiry-based teaching. Educational Psychologist, 41(2), 75–86.

Kharb, P., Samanta, P. P., Jindal, M., & Singh, V. (2013). The learning styles and the preferred teaching-learning strategies of first year medical students. National Library of Medicine.

Miller, L. L. (1984). Increasing reading efficiency (5th ed.). Holt, Rinehart and Winston.

Marsh, E. J., & Butler, A. C. (in press). Memory in educational settings. In D.

Reisberg (Ed.), Oxford handbook of cognitive psychology.



Micari, M., & Fitchett, P. G. (2018). PBL works, but what does it work on? A meta-analysis of the focused effects of problem-based learning in higher education. Journal of Educational Psychology, 103(4), 550–562.

Pressley, M., McDaniel, M. A., Turnure, J. E., Wood, E., & Ahmad, M. (1987).

Generation and precision of elaboration: Effects on intentional and incidental learning.

Journal of Experimental Psychology: Learning, Memory, and Cognition, 13, 291–300.

Prince, M. (2004). Does active learning work? A review of the research. Journal of Engineering Education, 93(3), 223–231.

Ramsay, C. M., Sperling, R. A., & Dornisch, M. M. (2010). A comparison of the

effects of students’ expository text comprehension strategies. Instructional Science, 38,

551–570.

Rawson, K. A., & Van Overschelde, J. P. (2008). How does knowledge promote

memory? The distinctiveness theory of skilled memory. Journal of Memory and

Language, 58, 646–668.

Rittle-Johnson, B. (2006). Promoting transfer: Effects of self-explanation and

direct instruction. Child Development, 77, 1–15.

Savery, J. R., & Duffy, T. M. (1995). Problem based learning: An instructional model and its constructivist framework. Educational Technology, 35(5), 31–38.

Van Ginkel, S., Gulikers, J., & Biemans, H. (2019). Enhancing authentic PBL through assessment. Education Sciences, 9(4), 264.



Schworm, S., & Renkl, A. (2006). Computer-supported example-based learning: When instructional explanations reduce self-explanations. Computers & Education, 46, 426–445.

Smith, B. L., Holliday, W. G., & Austin, H. W. (2010). Students’ comprehension

of science textbooks using a question-based reading strategy. Journal of Research in

Science Teaching, 47, 363–379.

Sternberg, R. J., & Williams, W. M. (2010). Educational psychology (2nd ed.).

Upper Saddle River, NJ: Pearson.

Willoughby, T., & Wood, E. (1994). Elaborative interrogation examined at encoding and retrieval. Learning and Instruction, 4, 139–149.

Wong, R. M. F., Lawson, M. J., & Keeves, J. (2002). The effects of self-

explanation training on students’ problem solving in high school mathematics. Learning

and Instruction, 12, 233–262.

Yilmaz, R. (2017). Project-based learning: An overview. International Journal of Educational Technology in Higher Education, 14(1), 1–16.
