Marleen F. Westerveld · Rebecca M. Armstrong · Georgina M. Barton

Reading Success in the Primary Years
An Evidence-Based Interdisciplinary Approach to Guide Assessment and Intervention

Marleen F. Westerveld, Griffith University, Southport, QLD, Australia
Rebecca M. Armstrong, University of Queensland, St Lucia, QLD, Australia
Georgina M. Barton, University of Southern Queensland, Springfield Central, QLD, Australia
This Springer imprint is published by the registered company Springer Nature Singapore Pte Ltd. The registered company address is: 152 Beach Road, #21-01/04 Gateway East, Singapore 189721, Singapore.
Preface
The overall aim of the Reading Success project described in this book is the systematic and accurate identification of students' reading profiles within a primary school setting in Australia, to better support the reading outcomes of those students who struggle with reading. We embraced an education–speech pathology collaborative practice model (see Archibald, 2017), in line with Speech Pathology Australia's clinical guideline for speech pathologists working in literacy (Serry et al., 2016). In this study, we adopted a range of evidence-based frameworks and approaches, including the Simple View of Reading (Gough & Tunmer, 1986) and a response-to-intervention approach (Jimerson, Burns, & VanDerHeyden, 2015). The study involved a strong partnership between all stakeholders, including academic researchers in speech pathology and education and research end-users at the school.
In an attempt to bridge the research-to-practice gap, we present a five-step assessment-to-intervention process for implementing an interdisciplinary and evidence-based approach to the identification of, and support for, students at risk for or experiencing difficulties in reading in a school setting. We share participants' perspectives gathered throughout the study, including the views of leaders, teachers, students, and support staff, such as speech pathologists, within the school context. As our aim in this project was not to describe or evaluate current literacy practices, we did not systematically gather information on classroom literacy practices. However, we provide an easily accessible overview of a range of evidence-based and theoretically driven initiatives aimed at enhancing the reading success of primary (or elementary) aged students in schools. Upon implementing our five-step assessment-to-intervention framework, schools will be able to (i) identify which students are in need of extra support to facilitate reading success; (ii) create individual speech-to-print profiles for students who show difficulties in reading comprehension; and (iii) initiate targeted support for students with reading difficulties that is firmly based on the assessment results (Munro, 2017).
Throughout the book, there is a clear focus on how the team worked collaboratively across different year levels, including the decision-making processes used, such as rich dialogue with the leadership team and teachers.
Abstract This chapter explains what skills are needed for reading success. It highlights the Simple View of Reading and its core components, word recognition and language comprehension, and how it may be used as a guiding framework across different year levels to describe students' reading abilities and/or difficulties, plan subsequent intervention, and monitor progress. We describe the underlying spoken and written language skills needed for successful reading comprehension, including both cognitive and sociocultural approaches to making meaning. We share how creating speech-to-print profiles for those students who need further support in reading comprehension may inform instructional practice, using a multi-tiered system of support approach. The importance of an evidence-based, collaborative, interdisciplinary approach to identifying and supporting students experiencing difficulties in reading is highlighted. Finally, an explanation of the importance of providing research-based reading supports at increasingly intense levels matched to the student's needs is provided.
1.1 Introduction
Reading is a complex human behaviour. The ultimate goal of all reading is reading comprehension, the ability to gain meaning from text. The importance of successful reading comprehension for academic success, socio-emotional well-being, and employment outcomes is indisputable (McGeown, Duncan, Griffiths, & Stothard, 2015), and it is therefore no surprise that a wealth of research exists investigating not only how students acquire and develop their reading skills, but also what successful intervention looks like when students struggle in their reading development (e.g. Amendum, Bratsch-Hines, & Vernon-Feagans, 2018; Clarke, Snowling, Truelove, & Hulme, 2010; Snowling & Hulme, 2011). Unfortunately, a research-to-practice gap still exists, with many professionals involved in the teaching of reading unsure how to bridge that gap. We agree with Tunmer and Hoover (2019) that a clearly defined theoretical framework that underpins assessment and intervention will assist reading professionals in developing the competencies needed to better understand why some children struggle with reading and how to provide targeted support and intervention. This chapter therefore starts by introducing some theoretical frameworks to better understand what is involved in successful reading comprehension.
A theoretical framework that has received much attention in the last 30 years is the Simple View of Reading (SVR) (Gough & Tunmer, 1986). Unlike the construction-integration model (Kintsch, 1988), it does not attempt to explain the cognitive underpinnings of the reading process, but it outlines which core components are involved in reading comprehension. The SVR holds that reading comprehension is the product of word recognition and language comprehension. The significant aspect of the SVR is that reading is conceptualised as a product of both components rather than a sum. If one component is poor or non-existent, reading comprehension competency will not be fully achieved. Word recognition (WR) can be defined as the ability to accurately and efficiently decode the written words on a page, whereas language comprehension (LC) refers to the ability to derive meaning from spoken language.
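Gough and Tunmer expressed this multiplicative relationship formally. Writing R for reading comprehension, D for decoding (word recognition), and C for language comprehension, with each component expressed as a proficiency between 0 (no skill) and 1 (perfect skill):

```latex
R = D \times C, \qquad 0 \le D \le 1, \quad 0 \le C \le 1
```

Because the components multiply rather than add, R = 0 whenever either D = 0 or C = 0: a student who cannot decode at all, or who cannot comprehend spoken language at all, cannot comprehend text, however strong the other component may be.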
1.3 Emergent Literacy Skills

Children's literacy learning starts at birth, long before children commence their formal reading education at school. During this period, also referred to as the emergent literacy stage (Justice, 2006), children are typically engaged in a range of literacy-related activities, such as: (a) shared book reading with their parents or early childhood educators; (b) exposure to environmental print (e.g. M for McDonald's); (c) access to writing utensils to 'write' their name; and (d) other literate practices appropriate for their sociocultural surroundings, such as oral storytelling. The development of emergent literacy skills may be influenced by environmental factors such as family trauma, displacement, and/or economic hardship or affluence. Regardless, research demonstrates that the following five emergent literacy skills are strongly predictive of future reading success (National Early Literacy Panel, 2008), even after controlling for socio-economic status and cognition (IQ): alphabet knowledge, phonological awareness, rapid automatic naming of (a) digits and (b) objects, emergent name writing, and phonological memory. Five additional skills were identified that showed strong correlations but needed some further research, including print concepts and oral language (vocabulary and grammar). Table 1.1 provides an overview of this terminology. Using the SVR as a framework, these emergent literacy skills may be conceptualised as the code-related skills that are needed for future word recognition and the meaning-related skills required for successful language comprehension (NICHD, 2005).
Using the SVR as a theoretical framework, students who are unable to develop adequate reading comprehension skills can be categorised into three main groups: (1) students with dyslexia are those who show significant word reading difficulties in the absence of language comprehension problems; (2) students with specific comprehension difficulties are those who show adequate word recognition skills, but significant language comprehension difficulties; and (3) students with a mixed reading difficulties profile (in the past referred to as garden-variety poor readers) are those who show weaknesses across both word recognition and language comprehension (Spear-Swerling, 2016). These three reader groups are discussed in a little more detail below:
Dyslexia or Specific Word Reading Difficulties These reading difficulties stem
from phonological processing weaknesses (including phonological awareness, rapid
automatic naming [RAN], and/or phonological memory). Students who show this
reading profile have adequate vocabulary and language comprehension skills. These
students will demonstrate reading comprehension difficulties due to their weaknesses
in accurate and/or fluent word recognition skills. It is estimated that approximately
5–10% of students will show dyslexia (Catts, Hogan, & Adlof, 2005; Shaywitz,
Shaywitz, Fletcher, & Escobar, 1990).
Specific Comprehension Difficulties These students will demonstrate reading
comprehension difficulties, despite adequate word recognition skills (accuracy and
fluency). These students will often demonstrate language comprehension weaknesses
across vocabulary, grammar, and oral narrative/higher-order language skills (Catts
et al., 2006; Woolley, 2011). In addition, these students may lack a strategic approach
to comprehension of text (Spear-Swerling, 2015). Research has shown that up to 17%
of students may show this type of reading difficulty.
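The three-group categorisation amounts to a simple decision rule over the two SVR components. The sketch below is purely illustrative: the function name and the boolean "adequate" judgements are our own, and in practice adequacy would be determined against standardised assessment cut-off scores, not a yes/no flag.

```python
def classify_reader(word_recognition_adequate: bool,
                    language_comprehension_adequate: bool) -> str:
    """Illustrative SVR-based reader categorisation (after Spear-Swerling, 2016).

    Both inputs are hypothetical pass/fail judgements that would, in practice,
    be derived from standardised word recognition and language comprehension
    assessments.
    """
    if word_recognition_adequate and language_comprehension_adequate:
        return "typical reader"
    if not word_recognition_adequate and language_comprehension_adequate:
        return "dyslexia / specific word reading difficulties"
    if word_recognition_adequate and not language_comprehension_adequate:
        return "specific comprehension difficulties"
    return "mixed reading difficulties"
```

For example, a student with weak word recognition but adequate language comprehension falls into the dyslexia profile, while weakness in both components yields the mixed profile.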
Table 1.1 Overview of (emergent) literacy skills with definitions (NELP, 2008) and examples of assessment tasks

• Code-related emergent literacy skills: the skills that are strongly predictive of word recognition, with accompanying examples of assessment tasks.
The main focus of this book is on the cognitive skills underpinning reading development, but we recognise that other factors play an important role in the student's reading development, such as the sociocultural environment (Marjanovič-Umek, Fekonja-Peklaj, Sočan, & Tašner, 2015). Although the SVR has been well validated, reading comprehension itself is a complex multidimensional process (Catts, 2018). Further, Snow (2002) states that:

reading comprehension is the process of simultaneously extracting and constructing meaning through interaction and involvement with written language. We use the words extracting and constructing to emphasize both the importance and the insufficiency of the text as a determinant of reading comprehension. Comprehension entails three elements:
• The reader who is doing the comprehending
• The text that is to be comprehended
• The activity in which comprehension is a part. (p. 11)

When investigating reading comprehension, we thus need to consider what the reader brings to the process, including the reader's reading experience, world knowledge, motivation, and self-perception, as well as their overall cognitive and linguistic (oral language) skills. We acknowledge the need to take a holistic approach and include consideration of the purpose of reading and the type of texts students engage with.
Although not the main focus of this book, the importance of reading motivation and self-concept should not be underestimated. Motivation for reading develops early in life, when children start building their emergent literacy skills. As outlined by Katzir, Lesaux, and Kim (2009), research has shown that parents' identification of pleasure as a reason for reading predicted motivation for reading in their young school-aged children. Moreover, early success or difficulty in learning to read is linked to reading self-concept and motivation. In other words, if children perceive themselves to perform well, i.e. experience success in reading, they will be motivated to challenge themselves and attempt more difficult tasks. On the other hand, if children have challenges in learning to read, their reading self-concept may weaken and these children may lose motivation in reading-related tasks. Chapman and Tunmer (1995) conceptualised reading self-concept as comprising three components: perceptions of competence, perceptions of difficulty or ease of reading tasks, and attitudes towards reading, and developed the Reading Self-Concept Scale (RSCS). Researchers applying this scale found that reading self-concept relates positively to reading comprehension in primary school-aged students (Chapman & Tunmer, 1995; see also Chapman & Tunmer, 2003), even after controlling for the children's verbal ability (based on a verbal IQ test) and their word reading ability (Katzir et al., 2009). In the Reading Success project, we measured students' reading self-concept using the RSCS and report the results in Chap. 4.
It would be remiss not to mention reading for pleasure (Cremin, Mottram, Collins, Powell, & Safford, 2014). Students need to be given the time to make choices about what they read, particularly in areas of personal interest. It is important not only that children are aware of the benefits of learning to read in relation to academic success, but also that reading can be a 'delightful' and 'desirable' endeavour (Cremin, 2007). Much research points to the importance of developing positive classroom environments with effective communities of readers (Cremin et al., 2014) that encourage imaginative thought and playfulness. Further, the social aspect of reading, when supported, results in stronger engagement and consequently higher achievement across all aspects of schooling (Ivey, 2014).
1.6 Teaching Reading During the Early Years of Schooling
There is clear evidence that the best way to teach children how to effectively learn to read is by explicitly teaching the following five key ingredients: (1) phonics; (2) phonemic awareness; (3) fluency (the ability to recognise words automatically); (4) vocabulary; and (5) comprehension (i.e. understanding what is read) (National Early Literacy Panel, 2008). It is also important to note that oral language plays an important role in learning to read, and that creative and innovative approaches to learning, such as play-based learning, strongly support oral language development. During the first years of schooling, the emphasis will be on students learning to read, which should at the very least involve systematic phonics instruction (Hempenstall, 2016). Phonics-based approaches teach children how to match phonemes (sounds) to graphemes (letters). In a synthetic phonics approach, children are systematically introduced to letter sounds (alphabet knowledge) before being taught how to blend these sounds together into syllables and words. It thus utilises a part-to-whole approach: sounding out each letter (e.g. /m/ /o/ /p/) and blending (synthesising) these phonemes into a word (mop). Although it goes beyond the scope of this book to provide a detailed overview of this approach, readers are referred to freely accessible publications, such as Hempenstall (2016): Read about it: Scientific evidence for effective teaching of reading (see also Parker, 2018), or the Massey University Early Literacy Research Project (https://www.educationcounts.govt.nz/publications/schooling/early-literacy-research-project). It is estimated that, when using an evidence-based approach to early literacy instruction, 90–95% of students will develop accurate and fluent word recognition, with approximately 5–10% of children demonstrating a profile of dyslexia (i.e. specific word reading difficulties), depending on the specific diagnostic criteria that are used (see Al Otaiba, Gillespie Rouse, & Baker, 2018, for a discussion).
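The part-to-whole logic of synthetic phonics (sound out each grapheme, then blend) can be sketched in a few lines. The grapheme–phoneme table below is a tiny invented subset for illustration only, not a real phonics programme, and real English orthography is far less regular than a one-letter-one-sound mapping suggests.

```python
# Toy grapheme-to-phoneme correspondences (illustrative subset only;
# real programmes also teach digraphs, trigraphs, etc.).
GPC = {"m": "m", "o": "ɒ", "p": "p", "s": "s", "a": "æ", "t": "t"}

def sound_out(word: str) -> list[str]:
    """Part step: convert each letter to its corresponding phoneme."""
    return [GPC[letter] for letter in word]

def blend(phonemes: list[str]) -> str:
    """Whole step: synthesise the individual phonemes into a word form."""
    return "".join(phonemes)

# e.g. blend(sound_out("mop")) synthesises /m/ /ɒ/ /p/ into "mɒp"
```

The two functions mirror the two instructional moves: `sound_out` is the letter-by-letter decoding, and `blend` is the synthesis into a spoken word.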
Early success in learning to read is paramount. Apart from the fact that challenges in reading may affect a student's reading self-concept and motivation (e.g. Chapman & Tunmer, 1995), reading ability in first grade is linked to reading ability 10 years later, even after accounting for cognitive ability (Cunningham & Stanovich, 1997). Moreover, failure to develop automatic and fluent word recognition during the early years of schooling will result in less exposure to more complex written texts containing more literate vocabulary (e.g. abstract terms and mental state verbs) and more complex grammatical structures, and conforming to more advanced text schemas such as exposition or persuasion. This reduced exposure may in turn hamper the development of more advanced spoken language skills (see the case study of James in Chap. 6). This phenomenon, where the 'rich get richer and the poor get poorer', has been coined the Matthew effects in reading (Stanovich, 2009), and highlights the importance of evidence-based reading tuition coupled with early identification and timely remediation of reading difficulties.
Table 1.2 Speech-to-Print Profile (adapted from Gillon, 2004, with permission from the author)

Spoken language
• Underlying representations: vocabulary knowledge, syntax, morphology, phonology, text structure (narrative, expository, persuasion)

Phonological processing
• Phonological awareness: syllable level, onset–rime level, phoneme level
• Storage and retrieval: non-word repetition, multisyllabic word repetition, rapid naming

Written language
• Rule/concept knowledge: print concepts; grapheme–phoneme correspondences (single letters, digraphs, trigraphs, diphthongs)
• Word level: word recognition (regular word reading, irregular word reading, non-word reading), spelling
• Text level: reading accuracy, reading comprehension, reading fluency/rate, writing
Reform or change in schools takes time and requires embedded processes in order for sustainable practices to occur. Educational practices and improvements over the past few decades have been swift and enduring (Fullan, 2012), creating extensive pressure on teachers to adapt and adopt new approaches in the classroom. It has been argued that, for change within educational contexts, whole-school approaches can be effective in ensuring sustained and positive results (Pendergast et al., 2015). According to Hoare, Bott, and Robinson (2017), "whole-school approaches, being multi-component, collective and collaborative action in the school community, provide optimal opportunity for creating sustained positive change, as opposed to unidimensional, standalone programmes" (p. 59). It is therefore important that whole-school, community approaches are implemented for positive change to result. This means that all parties are involved and invested in making the change work.
Whole-school approaches are driven by strong leadership (Ainscow & Sandill, 2010; Fullan, Hill, & Crévola, 2006; Hill & Crévola, 1997); however, such leadership needs to ensure positive school cultures and quality student outcomes. Barton and McKay (2016) note that "the beliefs of leaders and teachers play a significant role in how they respond to students who experience difficulties with learning" (p. 164), in particular in relation to reading outcomes. As such, Barton and McKay (2016) offer a model that has the students at its core (see Fig. 1.1). Significant others in students' lives included in the model are teachers, principals and leadership teams, support staff (such as learning support teachers, speech pathologists, and other specialists), parents/carers, family, and other community members. All stakeholders are needed to provide the support and encouragement for student success. In the Reading Success project, we recognised the importance of hearing all perspectives (see also Chap. 7).
1.9 Evidence-Based Practice from Multiple Perspectives
Within this complex space are the big six of reading instruction: phonics, phonological awareness, fluency, comprehension, vocabulary, and oral language (Konza, 2014). Also, as mentioned previously, many other aspects of students' lives impact on learning, including students' self-worth, motivation and engagement, cultural and social issues, peers, family and relationships, self-regulation, and affect (see Fig. 1.2).
The model incorporates the nature of a whole school's culture, and this includes staff within the school as well as community members. Any approach adopted needs to be well communicated to all, and the benefits that students can gain from people's experience need to be valued. Additionally, the beliefs of teachers, leaders, and support staff in relation to students' capabilities need to be taken into account (see Chap. 4 for more details). If these beliefs stem from deficit views, positive outcomes are less likely (Barton & McKay, 2016). In this book, we therefore aim to explore some of these aspects in the school context in which we worked.
1.10 Summary
This chapter defined the Simple View of Reading (SVR) and explained how the SVR can be used as a conceptual framework for categorising both skilled and struggling readers. Learning to read is a complex process that requires both word recognition and language comprehension skills; difficulties in either of these skills, or in their underlying components, may result in difficulties when learning to read. The chapter introduced a range of terms that are frequently used in Australian classrooms, including response to intervention (RtI) and multi-tiered systems of support. The use of an evidence-based framework, in combination with the adoption of common terminology, is important for developing a collaborative approach to the identification and remediation of students at risk for, or experiencing, difficulties in reading. Ensuring early reading success, and encouraging approaches that support students to enjoy the reading process and share their love of reading with their peers, teachers, and family, will foster lifelong positive reading behaviours.
References
Ainscow, M., & Sandill, A. (2010). Developing inclusive education systems: The role of organisational cultures and leadership. International Journal of Inclusive Education, 14(4), 401–416. https://doi.org/10.1080/13603110802504903.
Al Otaiba, S., Gillespie Rouse, A., & Baker, K. (2018). Elementary grade intervention approaches
to treat specific learning disabilities, including dyslexia. Language, Speech, and Hearing Services
in Schools, 49(4), 829–842. https://doi.org/10.1044/2018_LSHSS-DYSLC-18-0022.
Amendum, S. J., Bratsch-Hines, M., & Vernon-Feagans, L. (2018). Investigating the efficacy of a
web-based early reading and professional development intervention for young English learners.
Reading Research Quarterly, 53(2), 155–174. https://doi.org/10.1002/rrq.188.
Australian Government. (2005). Teaching reading: A guide to the report and recommendations for
parents and carers. Author.
Australian Government. (2015). Action now: Classroom ready teachers. Teacher education
ministerial advisory group report. Author.
Barton, G., & McKay, L. (2016). Conceptualising a literacy education model for junior secondary
students: The spatial and reflective practices of an Australian school. English in Australia, 51(1),
37–45.
Cain, K., & Oakhill, J. (1996). The nature of the relationship between comprehension skill and the
ability to tell a story. British Journal of Developmental Psychology, 14, 187–201.
Catts, H. W. (2018). The simple view of reading: Advancements and false impressions. Remedial
and Special Education, 39(5), 317–323. https://doi.org/10.1177/0741932518767563.
Catts, H. W., Adlof, S. M., & Ellis Weismer, S. (2006). Language deficits in poor comprehenders: A
case for the simple view of reading. Journal of Speech, Language, and Hearing Research, 49(2),
278–293.
Catts, H. W., Hogan, T. P., & Adlof, S. M. (2005). Developmental changes in reading and reading
disabilities. In H. W. Catts & A. G. Kamhi (Eds.), Connections between language and reading
disabilities (pp. 25–40). Mahwah, NJ: Lawrence Erlbaum.
Catts, H. W., Hogan, T. P., & Fey, M. E. (2003). Subgrouping poor readers on the basis of individual differences in reading-related abilities. Journal of Learning Disabilities, 36(2), 151–164. https://doi.org/10.1177/002221940303600208.
Chapman, J. W., & Tunmer, W. E. (1995). Development of young children's reading self-concepts: An examination of emerging subcomponents and their relationship with reading achievement. Journal of Educational Psychology, 87(1), 154–167. https://doi.org/10.1037/0022-0663.87.1.154.
Chapman, J. W., & Tunmer, W. E. (2003). Reading difficulties, reading-related self-perceptions,
and strategies for overcoming negative self-beliefs. Reading & Writing Quarterly, 19(1), 5–24.
https://doi.org/10.1080/10573560308205.
Clarke, P. J., Snowling, M. J., Truelove, E., & Hulme, C. (2010). Ameliorating children’s reading-
comprehension difficulties: A randomized controlled trial. Psychological Science, 21(8), 1106–
1116.
Cremin, T. (2007). Revisiting reading for pleasure: Delight, desire and diversity. In K. Goouch &
A. Lambirth (Eds.), Understanding phonics and the teaching of reading: A critical perspective
(pp. 166–190). Berkshire, UK: McGraw Hill.
Cremin, T., Mottram, M., Collins, F. M., Powell, S., & Safford, K. (2014). Building communities
of engaged readers: Reading for pleasure. London: Routledge.
Crystal, D. (1987). Towards a “bucket” theory of language disability: Taking account of interaction
between linguistic levels. Clinical Linguistics and Phonetics, 1, 7–22.
Cunningham, A. E., & Stanovich, K. E. (1997). Early reading acquisition and its relation to reading
experience and ability 10 years later. Developmental Psychology, 33(6), 934–945.
Denton, C. A. (2012). Response to Intervention for reading difficulties in the primary grades: Some answers and lingering questions. Journal of Learning Disabilities, 45(3), 232–243. https://doi.org/10.1177/0022219412442155.
Fuchs, D., & Fuchs, L. S. (2006). Introduction to response to intervention: What, why, and how
valid is it? Reading Research Quarterly, 41(1), 93–99. https://doi.org/10.1598/RRQ.41.1.4.
Fullan, M. (2012). Change forces: Probing the depths of educational reform. London: Routledge.
Fullan, M., Hill, P., & Crévola, C. (2006). Breakthrough. Thousand Oaks, CA: Corwin Press.
Gillon, G. T. (2004). The speech to print profile. Christchurch, New Zealand: Author. Profile available from https://www.canterbury.ac.nz/media/documents/education-and-health/gail-gillon—phonological-awareness-resources/resources/speechprintprofile2004.pdf.
Gillon, G. T. (2018). Phonological awareness: From research to practice (2nd ed.). New York: The
Guilford Press.
Gough, P. B., & Tunmer, W. E. (1986). Decoding, reading, and reading disability. Remedial and
Special Education, 7(1), 6–10. https://doi.org/10.1177/074193258600700104.
Hempenstall, K. (2016). Read about it: Scientific evidence for effective teaching of reading (CIS Research Report). Australia: Centre for Independent Studies.
Hill, P., & Crévola, C. (1997). Key features of a whole-school, design approach to literacy teaching
in schools. Australia: University of Melbourne.
Hoare, E., Bott, D., & Robinson, J. (2017). Learn it, Live it, Teach it, Embed it: Implementing a
whole school approach to foster positive mental health and wellbeing through Positive Education.
International Journal of Wellbeing, 7(3), 56–71.
Ivey, G. (2014). The social side of engaged reading for young adolescents. The Reading Teacher,
68(3), 165–171.
Justice, L. M. (Ed.). (2006). Clinical approaches to emergent literacy intervention. San Diego, CA:
Plural Publishing.
Katzir, T., Lesaux, N. K., & Kim, Y.-S. (2009). The role of reading self-concept and home literacy practices in fourth grade reading comprehension. Reading and Writing, 22(3), 261–276. https://doi.org/10.1007/s11145-007-9112-8.
Kintsch, W. (1988). The role of knowledge in discourse comprehension: A construction integration
model. Psychological Review, 95(2), 163–182.
Konza, D. (2014). Teaching reading: Why the “Fab five” should be the “Big six”. Australian Journal
of Teacher Education (Online), 39(12), 153–169.
Lonigan, C. J., Burgess, S. R., & Schatschneider, C. (2018). Examining the simple view of reading with elementary school children: Still simple after all these years. Remedial and Special Education, 39(5), 260–273. https://doi.org/10.1177/0741932518764833.
Marjanovič-Umek, L., Fekonja-Peklaj, U., Sočan, G., & Tašner, V. (2015). A socio-cultural perspective on children's early language: A family study. European Early Childhood Education Research Journal, 23(1), 69–85. https://doi.org/10.1080/1350293X.2014.991096.
McGeown, S. P., Duncan, L. G., Griffiths, Y. M., & Stothard, S. E. (2015). Exploring the relationship
between adolescents’ reading skills, reading motivation and reading habits. Reading and Writing,
28(4), 545–569. https://doi.org/10.1007/s11145-014-9537-9.
Mullis, I. V. S., Martin, M. O., Foy, P., & Drucker, K. T. (2012). PIRLS 2011 international results in reading. Chestnut Hill, MA: TIMSS & PIRLS International Study Center.
Nation, K., & Norbury, C. F. (2005). Why reading comprehension fails: Insights from developmental disorders. Topics in Language Disorders, 25(1), 21–32.
National Early Literacy Panel. (2008). Developing early literacy: Report of the National Early
Literacy Panel. Washington, DC: National Institute for Literacy.
NICHD. (2005). Pathways to reading: The role of oral language in the transition to reading.
Developmental Psychology, 41(2), 428–442.
No Child Left Behind Act. (2002). 20 U.S.C. § 6319.
Oakhill, J. V., & Cain, K. (2012). The precursors of reading ability in young readers: Evidence from
a four-year longitudinal study. Scientific Studies of Reading, 16(2), 91–121.
Oakhill, J., Hartt, J., & Samols, D. (2005). Levels of comprehension monitoring and working memory in good and poor comprehenders. Reading and Writing, 18(7), 657–686. https://doi.org/10.1007/s11145-005-3355-z.
Parker, S. (2018). Reading instruction and phonics. Theory and practice for teachers. Boston, MA:
Royce-Kotran.
Pendergast, D., Main, K., Barton, G., Kanasa, H., Geelan, D., & Dowden, T. (2015). The Education
Change Model as a vehicle for reform: Shifting Year 7 and implementing Junior Secondary in
Queensland. Australian Journal of Middle Schooling.
Shaywitz, S. E., Shaywitz, B. A., Fletcher, J. M., & Escobar, M. D. (1990). Prevalence of reading disability in boys and girls: Results of the Connecticut Longitudinal Study. JAMA, 264(8), 998–1002. https://doi.org/10.1001/jama.1990.03450080084036.
Snow, C. (2002). Reading for understanding: Toward an R&D program in reading comprehension.
Santa Monica, CA: Rand Corporation.
Snowling, M. J., & Hulme, C. (2011). Evidence-based interventions for reading and language
difficulties: Creating a virtuous circle. British Journal of Educational Psychology, 81(1), 1–23.
https://doi.org/10.1111/j.2044-8279.2010.02014.x.
Spear-Swerling, L. (2015). The power of RTI and reading profiles: A blueprint for solving reading
problems. Baltimore, MD: Brookes.
Spear-Swerling, L. (2016). Common types of reading problems and how to help children who have
them. The Reading Teacher, 69(5), 513–522.
Stanovich, K. E. (2009). Matthew effects in reading: Some consequences of individual differences
in the acquisition of literacy. Journal of Education, 189(1–2), 23–55.
References 17
Stark, H. L., Snow, P. C., Eadie, P. A., & Goldfeld, S. R. (2016). Language and reading instruction
in early years’ classrooms: the knowledge and self-rated ability of Australian teachers. Annals of
Dyslexia, 66(1), 28–54. https://doi.org/10.1007/s11881-015-0112-0.
Stein, N., & Glenn, C. (1979). An analysis of story comprehension in elementary school children.
In R. O. Freedle (Ed.), New directions in discourse processing (Vol. 2, pp. 53–120). Norwood,
NJ: Ablex.
Thompson, R., & Whimper, L. A. (2010). Exposure to family violence and reading level of early
adolescents. Journal of Aggression, Maltreatment & Trauma, 19(7), 721–733. https://doi.org/10.
1080/10926771003781347.
Tunmer, W. E., Chapman, J. W., & Prochnow, J. E. (2006). Literate cultural capital at school entry
predicts later reading achievement: A seven year longitudinal study. New Zealand Journal of
Educational Studies, 41(2), 183–204.
Tunmer, W. E., & Hoover, W. A. (2019). The cognitive foundations of learning to read: A framework
for preventing and remediating reading difficulties. Australian Journal of Learning Difficulties,
24(1), 75–93. https://doi.org/10.1080/19404158.2019.1614081.
Verhoeven, L., & van Leeuwe, J. (2012). The simple view of second language reading throughout
the primary grades. Reading and Writing, 25(8), 1805–1818. https://doi.org/10.1007/s11145-
011-9346-3.
Westerveld, M. F., Gillon, G. T., & Moran, C. (2008). A longitudinal investigation of oral narra-
tive skills in children with mixed reading disability. International Journal of Speech-Language
Pathology, 10(3), 132–145. https://doi.org/10.1080/14417040701422390.
Woolley, G. (2011). Reading comprehension: Assisting children with learning difficulties. Dor-
drecht, NLD: Springer.
Open Access This chapter is licensed under the terms of the Creative Commons Attribution 4.0
International License (http://creativecommons.org/licenses/by/4.0/), which permits use, sharing,
adaptation, distribution and reproduction in any medium or format, as long as you give appropriate
credit to the original author(s) and the source, provide a link to the Creative Commons license and
indicate if changes were made.
The images or other third party material in this chapter are included in the chapter’s Creative
Commons license, unless indicated otherwise in a credit line to the material. If material is not
included in the chapter’s Creative Commons license and your intended use is not permitted by
statutory regulation or exceeds the permitted use, you will need to obtain permission directly from
the copyright holder.
Part I
The Reading Success Project
Chapter 2
Methodology
Abstract This chapter describes the methodology used in the Reading Success
project. We start by briefly describing the school’s context, including the student
cohorts who were involved in the project. The specific qualitative data collection
methods relating to existing school initiatives for improving the reading performance
of their students are then described. The second half of this chapter outlines the
assessment process, including a brief explanation of the types of assessments that
can be used (formal vs. informal; standardised vs. criterion-referenced) and how to
interpret these types of assessment results. We then describe the five-step assessment
process we used to identify students who showed challenges in aspects of their
spoken and/or written language skills and to determine which students might benefit
from specific intervention to address their reading needs. An overview of the specific
assessment tasks that were used to determine the students’ performance at each step
of the process is included along with a clear rationale for each assessment and an
explanation of how these assessment results map on to the speech-to-print profile.
Finally, consideration is given to progress monitoring practices.
The Australian school involved in the Reading Success project identified the assess-
ment and intervention practices for reading as a major area for school improvement.
At the time of completing the project, there were 674 students (50% males/50%
females) enrolled at the school. These enrolments included 6% of students who
identified as Indigenous and 31% of students who had a language background
other than English. The Index of Community Socio-Educational Advantage (ICSEA)
scale (Australian Curriculum, Assessment and Reporting Authority [ACARA], n.d.)
was used as an indicator of the demographic composition of the school. ICSEA
is a scale developed by ACARA and used to provide an indication of the level of
educational advantage of students attending the school. An overall ICSEA value is
calculated for each school, based on factors including parents' occupation and education, the school's geographical location, and the proportion of Indigenous student enrolment.
The specific data collection methods for both the qualitative aspect of this project
and the data pertaining to reading achievement are described in further detail below.
To obtain information about the educators' perspectives on their school, as well as
their views on the enablers and/or inhibitors of the school's reading programmes
and approaches to the teaching of reading, a number of qualitative data
sets were collected to ensure richness of the data (Lambert & Loiselle, 2008):
a productive iterative process whereby an initial model of the phenomenon guided the explo-
ration of individual accounts and successive individual data further enriched the conceptu-
alisation of the phenomenon; identification of the individual and contextual circumstances
surrounding the phenomenon, which added to the interpretation of the structure of the phe-
nomenon; and convergence of the central characteristics of the phenomenon across focus
groups and individual interviews, which enhanced trustworthiness of findings. (p. 228)
Therefore, the qualitative data included in the Reading Success project were collected
through interviews and/or focus groups with teachers (at the beginning and at the end
of the project), interviews with the leadership team, and a student focus group as well
as individual student interviews. Information was also collected about the reading
programmes the school implemented over the course of the project. In addition,
school planning documents and policies were included in the data set.
Educators The interview questions for teachers and leaders firstly focused on
their demographics—how long they had been teaching, what their qualifications
were, and what year levels they had taught previously including any other teaching
experiences. Remaining questions explored what the educators’ perspectives were
about their school as well as what they felt were the enablers and/or inhibitors of
the school’s reading programmes and approaches to the teaching of reading. The
post-interviews also investigated what staff members felt about the Reading Success
project including its impact on student learning. Prompt questions for both the pre-
and post-interviews can be found in Appendix 1. The questions were based on a
previous project conducted by Barton and McKay (2014, 2016a, 2016b) that explored
another school’s approach to teaching reading in the secondary context.
Students The student focus group and subsequent interviews involved students
in the reading to learn cohort (i.e. Year 4 students). Three of these students rated
themselves the lowest on the Reading Self-Concept Scale (Chapman & Tunmer,
1995). Initially, the students were invited as a whole group to participate in a focus
group to talk about reading, how they perceive themselves as readers and what might
help or inhibit their reading. While the students participated in this focus group, the
researcher asked them to create a collage using cut-up pictures from a range of
magazines. The purpose of this collage was to support the students in feeling
comfortable talking about themselves with the researcher. The researcher utilised a method
known as re-collage (Barton, 2020), whereby participants create an artwork that
expresses deeper messages about a topic, in this case reading. Arts-based research
methods have been known to support participants’ abilities to express their feelings
about focused topics. For students of this age, the researchers were interested not
only in asking for their opinions orally, but also in using a stimulus artwork created
by the participants as a basis for discussion.
Allowing the students to reflect on themselves as readers through the art had the
potential to uncover aspects of themselves as readers that were not previously known.
Leitch (2006), for example, explained that “through the creation of images in relation
to self, new meanings, previously unaware, unvoiced, unexpressed, half-understood
came to be significant and capable of being incorporated into the participants’ social
and/or emotional understanding of themselves, to the point that new actions or direc-
tions could be taken in their lives” (p. 566). In a similar vein, McNiff (2008) showed
how arts-based and creative approaches to reflecting on a topic often resulted in more
“meaningful insights [that] often come by surprise, unexpectedly and even against
the will of the creator” (p. 40). Utilising modes, other than language, created the
opportunity for the students to feel safe and express different ways of knowing or
being (Liamputtong & Rumbold, 2008).
Analytical approaches All interviews and focus groups were audio recorded and
then transcribed. Participants were provided with the transcripts in order for member
checking to occur. The team then analysed the transcripts by firstly identifying any
codes (interesting factors) throughout the initial reading of documents. Subsequently,
the team re-read each transcript to ensure no code was left unidentified. Finally, the
codes were clustered into identified themes (Braun & Clarke, 2006). The identified
themes from the educator and student perspectives are described in detail in Chaps. 4
and 7.
Comparisons of performance across the three different subscales can also be conducted.
Student performance on this scale is described in Chap. 4.
Fig. 2.1 Bell curve. (Source Wikimedia Commons, retrieved from https://commons.wikimedia.
org/wiki/Category:Normal_distribution#/media/File:Normal_distribution_and_scales.gif)
The standard deviation represents the average difference of scores from the mean
score, capturing the variability that exists in
test scores amongst different test takers. It is the combination of information from
the mean and standard deviation of a test that allows decisions to be made about
when a child’s score falls far enough from the mean to warrant the decision that it
is significantly different from the norm. Raw scores (i.e. the total number of items
the student got correct on a standardised test) need to be converted to allow for
meaningful comparisons to norms provided for each test. These comparisons result
in the creation of standard scores (such as z-scores or scaled scores) or percentile
ranks which can be used to provide an indication of how far from the mean the child
performs on a test in comparison with the normative data for that assessment. The
interpretation of these three types of scores (z-scores, scaled scores, and percentile
ranks) is provided below.
Z-scores A z-score of 0 indicates average performance (see Fig. 2.1), with a nega-
tive score indicating performance below the mean and a positive score falling above
the mean. A student’s performance with a z-score below −1 indicates the child’s
performance is more than 1 standard deviation below the mean and is subsequently
interpreted as performing ‘below expectations’.
Scaled Scores The interpretation of scaled scores, or standard scores, is usually
based on the mean score that is assigned for each test. For example, it is common for
tests to assign a mean score of 100 (i.e. z-score of 0), and scores less than 85 indicate
a child’s performance is more than 1 standard deviation below the mean, and thus
‘below expectations’.
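The score conversions just described can be sketched in a few lines. The normative mean and standard deviation below are illustrative values only, not taken from any particular test:

```python
from statistics import NormalDist

def raw_to_z(raw, norm_mean, norm_sd):
    # Distance of the raw score from the normative mean, in SD units.
    return (raw - norm_mean) / norm_sd

def z_to_standard(z, mean=100.0, sd=15.0):
    # Re-express a z-score on a standard-score scale (commonly mean 100, SD 15).
    return mean + z * sd

def z_to_percentile(z):
    # Percentage of the normative sample expected to score at or below z.
    return NormalDist().cdf(z) * 100

# Hypothetical test normed at mean 40, SD 8; a raw score of 30 gives:
z = raw_to_z(30, norm_mean=40, norm_sd=8)   # z = -1.25
standard = z_to_standard(z)                 # 81.25, i.e. below 85
below_expectations = z < -1                 # more than 1 SD below the mean
```

Note that a z-score of −1 corresponds to a standard score of 85 and roughly the 16th percentile, which is why a standard-score cut-off of 85 captures approximately the bottom 16% of same-age peers.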
2.5 Choosing a Reading Comprehension Test
As stated previously, the aim of learning to read is to read accurately and fluently
with comprehension. To determine the level of reading ability, it is therefore impor-
tant to use a formal reading assessment that adequately captures a student’s read-
ing skills across reading accuracy, reading fluency, and reading comprehension. As
summarised by Westerveld (2009), when choosing a reading comprehension test, the
following needs to be taken into consideration: (1) test format (i.e. cloze vs ques-
tion–answer format; oral or silent); (2) passage dependency (can the students answer
the questions without reading the passage); and (3) test-taking strategies (i.e. stu-
dent initiative, involvement, and item completion rate; Bornholt, 2002; Keenan &
Betjemann, 2006; Keenan, Betjemann, & Olson, 2008).
Unfortunately, not all reading tests adequately measure reading comprehension. In
fact, some assessments may fail to identify reading comprehension deficits (Bowyer-
Crane & Snowling, 2005). For example, student performance based on tests such as
the Neale Analysis of Reading Ability (NARA; Neale, 1999) and the York Assessment
of Reading for Comprehension (YARC; Snowling et al., 2012) depends on both
word recognition and language comprehension skills, as students are asked to answer
questions after reading a passage. In contrast, performance on cloze reading tasks (i.e.
student reads a sentence or passage and fills in the blank) has been shown to be mostly
dependent on adequate word recognition ability. Based on these results, Westerveld
(2009) advised using a reading test that requires the student to read passages
and answer open-ended questions after reading, or administering an individual
test of language comprehension in addition to a reading test
(Nation & Snowling, 1997).
Another consideration in the Australian context is to choose a test that has been
normed on the Australian population. Because schooling systems differ, including
in the age at which students start formal reading tuition (Cunningham & Carroll,
2011), standardised tests that have been normed overseas may not adequately identify
Australian students with reading difficulties. Both the NARA and the YARC have been normed on Australian
students. A recent study by Colenbrander, Nickels, and Kohnen (2016) compared
student performance on the NARA versus the YARC. Students attended Australian
primary schools and were in grades 3–6. Interestingly, more students were classified
with a ‘specific comprehension difficulty’ profile on the NARA than on the YARC,
but no differences were found for classification of ‘specific word reading difficulties’
between the tests. Closer inspection of these results, however, showed that only 9%
of the variance in YARC scores could be explained by a child’s ability to decode
words; this was 21% for the NARA. Taken together, the authors concluded that the
YARC may be a better test to indicate reading comprehension difficulties for students
who have word recognition difficulties. Furthermore, the authors recommended using
a separate decoding task (a word or non-word reading measure) and cautioned that diagnosis
of a reading comprehension difficulty should not be based on the results of a single
assessment. For the current study, based on previous research, we administered the
YARC to determine students’ reading performance across reading accuracy, reading
fluency, and reading comprehension.
Fig. 2.2 Five-step assessment to intervention process based on the Simple View of Reading. Note
RC = Reading comprehension; RA = reading accuracy; RR = reading rate; LC = language
comprehension. # Step 3: further assessment may also include students whose RC was within
expectations but whose RA was below expectations (see Step 3 below)
recognition (Step 4). Finally, the information obtained from Steps 1–4 was used to
guide intervention practices as part of Step 5. This assessment to intervention process
is shown in Fig. 2.2.
The following section details these five assessment steps including description
of the specific assessments administered as part of the Reading Success project.
We acknowledge that there are many other assessments available to assess reading
and spoken language skills of school-aged children and recommend collaboration
with all professionals involved in assessment and intervention for students at risk
for or experiencing difficulties in reading, including speech pathologists. The tests
that were used in this project are described in detail below; however, a list of other
suitable tests is provided as further examples in Appendix 2.
For this project, the Australian edition of the York Assessment of Reading for Com-
prehension, Primary (YARC; Snowling et al., 2012), was used as a formal (standard-
ised), norm-referenced assessment of reading ability. The YARC yields measures of
reading accuracy and reading comprehension as well as reading rate. This diagnostic
reading assessment was administered individually, either by a speech pathologist or
by a teacher, with all students in the project. As per the manual, the student was asked
to read aloud two passages (at a level suitable to the student’s reading ability), and
the time taken to read each passage was recorded. If the student produced a reading
error, the correct production was immediately provided to the student by the exam-
iner. At the conclusion of each passage, the student was asked a series of literal and
inferential comprehension questions, and the student was able to refer to the text to
answer the questions if they needed. Following the completion of two passages, sev-
eral scores were calculated: (a) the number of reading errors was totalled to provide
an overall reading accuracy (RA) score; (b) the time taken to read each passage was
summed to provide an overall reading rate (RR); (c) the total number of questions
answered correctly was calculated to provide a reading comprehension (RC) score.
These raw scores were then converted to standard scores to allow for comparisons
with Australian normative data. As per the test guidelines, for students in the Year 1
cohort, RR was not calculated if the student exceeded 16 reading errors on the level
1 passage. These test scores (RA, RR, and RC) can be converted using a free online
score conversion tool.
In the Reading Success project, we used a standard score (SS) cut-off of 85
as an indicator of ‘below expectations’, which corresponds to students scoring in
the bottom 16% of the same-age population (see Fig. 2.1). Using this cut-off, the
cohort was divided into those students who needed further assessment and those who
performed within expectations for their age/stage of schooling.
For students who scored below expectations on the RC subtest of the YARC, consis-
tent with the Simple View of Reading, we investigated their reading accuracy skills
and considered their language comprehension skills.
(i) Check Reading Accuracy First, student performance on the YARC RA sub-
test was checked to determine whether challenges in RA were contributing to
RC difficulties. Students who demonstrated RA SS < 85 were considered to
demonstrate challenges in RA.
(ii) Check Language Comprehension The next step was to screen the lan-
guage comprehension skills of students who scored below expectations
in RC. In this project the Understanding Spoken Paragraphs subtest
from the standardised, norm-referenced assessment, the Clinical Evaluation
of Language Fundamentals, fourth or fifth edition (CELF-4/CELF-5; Semel,
Wiig, & Secord, 2006; Wiig, Semel, & Secord, 2017) was administered. As per
the test manual, students were asked to listen to spoken passages and then answer
questions about the content of the passages. The subtest was administered as per
the assessment manual guidelines, and a total raw score was calculated and then
converted to a scaled score. In the current project, a scaled score below 7, i.e. more than
1 standard deviation below the mean (equivalent to a standard score below 85), was used
to indicate a student performing 'below expectations' in language comprehension.
The CELF is a restricted test (i.e. can only be administered by a speech pathologist)
and is relatively expensive if the only aim of using it is to determine if the student’s
RC difficulties stem from underlying problems in LC. An alternative way of assessing
LC is to orally administer two paragraphs of the YARC. The YARC Passage Reading
Test—Primary comprises two parallel tests of graded passages (A and B), each of
which is accompanied by a set of eight comprehension questions. Therefore, if form
A has been used to test the student’s RC, form B can be used to test the student’s
LC. In this instance, the examiner (e.g. teacher or speech pathologist) reads two
passages (level equivalent to the student’s age or year of schooling) out loud and
asks the student to answer the comprehension questions afterwards. The student’s
performance can be evaluated by scoring the total number of questions answered
correctly across the two passages and converting this raw score to a standard score.
Although this test was not normed for this purpose, a standard score of 85 or above
would potentially indicate satisfactory language comprehension performance.
The results from Steps 1 and 2 were then used to determine whether students’
poor performance on the RC subtest of the YARC could be explained by poor RA,
poor LC, or both.
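A minimal sketch of this Step 1–2 decision logic, assuming scores have already been converted as described above (YARC standard scores, mean 100/SD 15; CELF scaled scores, mean 10/SD 3). The return labels are our illustrative shorthand, not the project's terminology:

```python
def explain_poor_rc(ra_standard_score, lc_scaled_score):
    """For a student whose YARC RC standard score is below 85, flag which
    component(s) may explain it. Cut-offs follow the project: standard
    score < 85 for RA, scaled score < 7 for LC (each 1 SD below the mean)."""
    poor_ra = ra_standard_score < 85
    poor_lc = lc_scaled_score < 7
    if poor_ra and poor_lc:
        return "poor RA and poor LC"
    if poor_ra:
        return "poor RA"
    if poor_lc:
        return "poor LC"
    return "neither (further assessment needed)"
```

For example, a student with an RA standard score of 80 but an LC scaled score of 9 would be flagged as having poor RA only, pointing towards word-level assessment in Step 3.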
Next, for students who performed below expectations in RA on the YARC (i.e.
showed difficulties accurately reading passages), further assessment was conducted
to determine possible underlying reasons for their RA difficulties. It is important
to note that assessment at this stage should include students who demonstrate poor
RA and RC (both SS < 85), as well as those students whose RC SS ≥ 85 but
who show difficulties in RA (i.e. SS < 85). This is an important consideration as
these students, who demonstrate decoding difficulties but age-appropriate RC, may
have oral language skills that mask difficulties in RC, particularly in
the early years of schooling. Therefore, further assessment of these students’ word
recognition skills and consideration of targeted intervention around decoding may
be required.
Assessments of the following print-related skills were conducted: (a) single word
reading, (b) orthographic knowledge, and (c) phonological awareness.
The Castles and Coltheart Word Reading Test—Second Edition (CC-2; Castles,
et al., 2009) was administered as a formal, standardised, norm-referenced measure
of single word reading ability. This test was selected as it has Australian normative
data, is freely available, and is an efficient measure of word reading across regular,
irregular, and nonsense words. This test is untimed and assesses a student's ability
to read aloud single regular, irregular, and nonsense words presented in isolation.
Orthographic Knowledge
The Letter Sound Test (LeST; Larsen, Kohnen, Nickels, & McArthur, 2015)
assessed orthographic knowledge. This freely available, formal, norm-referenced
assessment was chosen to assess a student’s ability to sound out single letters
and letter combinations (including consonant and vowel digraphs). The test has
been normed on Australian students (kindergarten to grade 3). This test was
administered online (www.motif.org.au) in a one-to-one setting and took
approximately 5–10 min. Students were shown 51 graphemes (single letters and
letter combinations) in isolation and were asked to produce the target sound.
After the administration of the test, a total raw score (/51) was calculated and
converted to a z-score as per the test guidelines. In the current study, students with a
z-score less than −1 were classified as performing below expectations with their
orthographic knowledge, including the students attending Year 4 who were older
than the norming sample.
Phonological Awareness
non-word repetition. The CTOPP-2 is a restricted test (i.e. only allied health and
special education professionals can administer this test).
Table 2.2 provides an overview of the assessments that were used in Steps 1–3.
Speech-to-print profiles were created for students who performed below expectations
on the RC subtest of the YARC. Information from all assessment measures (from
Steps 1 to 3) was mapped on to the speech-to-print profiles for each student. It is
paramount that this step of the assessment process is undertaken in collaboration
with all stakeholders involved in the teaching and remediation of reading, including
teachers, speech pathologists, and learning support teachers. This collaborative process allows
for the identification of strengths and difficulties in the core components of language
comprehension and word recognition. Furthermore, this team approach provides an
avenue for identifying if any further assessments should be completed and by whom,
or whether some of this information has already been collected as part of routine
school-wide assessment processes. For example, as shown in Table 2.2, Step 4 may
require a comprehensive evaluation of a student’s spoken language skills, which
may involve a referral to the speech pathologist. The speech pathologist may, for
example, administer the CELF-5 and/or collect a spontaneous language sample in a
narrative (e.g. Profile of Oral Narrative Ability [PONA]; Westerveld & Vidler, 2016)
or expository context (Heilmann & Malone, 2014). For other students, more detailed
information may be needed regarding a student’s writing and spelling skills, which
may well be available from the classroom teacher. Completing the profile will then
promote discussion about appropriate intervention planning (see Step 5).
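One simple way to collate the results of Steps 1–3 ahead of that team discussion is a per-student record pairing each measure with its cut-off. The field names and scores below are hypothetical, and the published speech-to-print profile contains considerably more detail:

```python
# Hypothetical scores for one student; each entry pairs a score with the
# project's cut-off for that measure (scores below the cut-off are flagged).
results = {
    "reading_comprehension (YARC RC, SS)":   {"score": 78.0, "cutoff": 85.0},
    "reading_accuracy (YARC RA, SS)":        {"score": 81.0, "cutoff": 85.0},
    "language_comprehension (CELF, scaled)": {"score": 9.0,  "cutoff": 7.0},
    "word_reading (CC-2, z)":                {"score": -1.4, "cutoff": -1.0},
    "orthographic_knowledge (LeST, z)":      {"score": -0.6, "cutoff": -1.0},
}

# Flag each component as below or within expectations for discussion.
profile = {
    measure: ("below expectations" if entry["score"] < entry["cutoff"]
              else "within expectations")
    for measure, entry in results.items()
}
```

A record like this makes it easy for the team to see at a glance which components warrant further assessment or targeted intervention.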
The next step was to discuss how appropriate and targeted intervention could be
provided for students with identified reading difficulties, based on their speech-to-
print profiles. For example, students who showed a profile of dyslexia would benefit
from intervention aimed at improving their phonological processing and
word recognition skills, whereas students who demonstrated specific comprehension
deficits would benefit from a programme aimed at improving their language com-
prehension skills. In this project, the speech-to-print profiles were discussed with the
school team to: (1) trial appropriate interventions for a selection of students within the
school setting, and (2) allow for the school team to plan and continue to implement
regular classroom instruction. Detailed information about the interventions imple-
mented and evaluated as part of the Reading Success project is provided in Chap. 5.
Case samples are presented in Chap. 6.
Table 2.2 An overview of the areas assessed and the assessment methods used at each step

Step 1. Reading comprehension (text level)
Assessment used: YARC—form A
Skill: Student's ability to answer questions from a passage of text that they had read
Type of assessment: Standardised, norm-referenced (standard scores)

Step 2(i). Reading accuracy (text level)
Assessment used: YARC—form A
Skill: Student's ability to accurately read aloud a passage of text
Type of assessment: Standardised, norm-referenced (standard scores)

Step 2(ii). Language comprehension (text level)
Assessment used: CELF-4/CELF-5 Understanding Spoken Paragraphs, or YARC—form B
Skill: Student's ability to answer comprehension questions after listening to short passages
Type of assessment: Standardised, norm-referenced (scaled scores)

Step 3. Word recognition (word level)
Assessment used: CC-2 (Motif)
Skill: Student's ability to read aloud single words of different word types, including regular, irregular, and nonsense words
Type of assessment: Standardised, norm-referenced (z-scores)

Step 3. Grapheme–phoneme correspondences
Assessment used: LeST (Motif)
Skill: Student's ability to produce the sound corresponding to a letter/letter combination
Type of assessment: Standardised, norm-referenced (z-scores)

Step 3. Phonological awareness
Assessment used: SPAT-R
Skill: Student's ability to complete syllable, rhyme, and phoneme identification tasks
Type of assessment: Standardised, norm-referenced (percentile ranks)

Step 3. Phonological awareness
Assessment used: CTOPP-2
Skill: Phonological processing, including phoneme awareness, rapid automatic naming, and non-word repetition
Type of assessment: Standardised, norm-referenced (standard scores)

(continued)
2.7 Progress Monitoring
The school utilised several routine data collection methods for monitoring reading
outcomes across the school years. The following section describes these specific
methods; however, it is important to note that other Australian schools would have
different processes for measuring reading outcomes. At the school where the Reading
Success project was undertaken, it was reported by the school leadership team that
the assessments administered change throughout primary school based on student
year level and reading level. In the early foundational years of schooling, the PM
Benchmark Reading Assessment Resource Kit (Smith, Nelley, & Croft, 2009) was
used to measure reading skills. Once the students exceeded the maximum reading
level on the PM Benchmark (i.e. level 30), teachers administered the Prose Reading
a description of the outcomes for the Year 4–5 cohort across these assessments
following the relevant intervention programme.
2.9 Summary
This chapter provided an overview of the methodology that was used in the Read-
ing Success project to (1) understand teacher perspectives around the teaching of
reading, (2) identify students who show challenges in their reading accuracy and/or
comprehension skills, and (3) evaluate student reading self-concept and motivation.
It explained how the Simple View of Reading drives the reading assessment process
and provided details of the theoretically based assessments that were used, includ-
ing an overview of how to interpret different types of assessment results. Finally,
the importance of ongoing monitoring of student performance was highlighted as
students move from the 'learning to read' to the 'reading to learn' phase.
6. What aspects of the school’s philosophy and planning impact on these pro-
grammes?
7. What programmes are currently in place that aim to improve literacy outcomes
for students?
8. What are the current perceptions of the success as well as areas in need of
improvement of these programmes?
9. What future programmes and approaches to improving success in literacy
learning might be considered in your school?
Post-interview prompt questions
1. Firstly, can you tell me how you were involved in the project?
2. What are your thoughts on how the project went?
3. What do you think worked well in this project?
4. What do you think could be improved?
5. Have you noticed any differences in your students? For example, more effective
at reading, need more support?
6. What have you learnt from the project? For example, do you feel your skills
related to the teaching of reading have improved? If so, how? And if not, why?
7. What do you think would help more in relation to the teaching of reading? And
your students’ reading skills?
Sutherland YOS 1–4 Australia Identification and Norm-referenced • Administered by: Neilson (2003)
Phonological manipulation of speech-language
Awareness syllables, rhyme, pathologists or education
Test—Revised and phonemes, and specialist
(SPAT-R) includes tests of
non-word reading
and spelling;
15 min
(continued)
41
42
Spoken language assessments (continued)

Comprehensive Test of Phonological Processing—2nd Edition (CTOPP-2)
• Age range: 4;0–24;11
• Country where normed: USA
• Areas assessed + approx. administration time: phonological processing, phonological memory, rapid automatic naming; 40 min
• Type of assessment: norm-referenced
• Administered by: schools and speech-language pathologists
• Reference: Wagner et al. (2013)

Lindamood Auditory Conceptualisation Test—Third Edition (LAC-3)
• Age range: 5;0–18;11
• Country where normed: USA
• Areas assessed + approx. administration time: ability to perceive and conceptualise speech sounds; 20–30 min
• Type of assessment: criterion-referenced
• Administered by: speech-language pathologists or occupational therapists
• Reference: Lindamood and Lindamood (2004)

The Children's Test of Non-word Repetition
• Age range: 4;0–8;0
• Country where normed: Britain
• Areas assessed + approx. administration time: assesses non-word repetition skills only; 15 min
• Type of assessment: norm-referenced
• Administered by: not specified
• Reference: Gathercole et al. (1994)

TILLS—non-word repetition subtest
• Age range: as above
• Country where normed: as above
• Areas assessed + approx. administration time: subtest assessing non-word repetition; 5–10 min
• Type of assessment: as above
• Administered by: as above
• Reference: as above
2 Methodology
Clinically Useful Words
• Country where normed: Australia
• Areas assessed + approx. administration time: assesses multisyllabic word repetition skills only; 5 min
• Type of assessment: informal
• Administered by: not specified
• Available from: https://speech-language-therapy.com/index.php?option=com_content&view=article&id=46
• Reference: James (2006)

The Single-Word Test of Polysyllables
• Age range: 4;0+
• Country where normed: Australia
• Areas assessed + approx. administration time: assesses multisyllabic word repetition skills
• Type of assessment: informal
• Administered by: not specified
• Reference: Gozzard et al. (2006)
Appendix 2: Sample of Assessment Tools to Complete …
Written language assessments

Castles and Coltheart—2nd Edition (CC2)
• Age range: 6;0–11;6
• Country where normed: Australia
• Areas assessed + approx. administration time: regular, irregular, and nonsense words; 10 min
• Type of assessment: norm-referenced
• Available from: https://www.motif.org.au

Word level—spelling

South Australian Spelling Test (SAST)
• Age range: 6;0–15;0
• Country where normed: Australia
• Areas assessed + approx. administration time: assessment of single word spelling; 15–30 min
• Type of assessment: norm-referenced
• Administered by: educators
• Reference: Westwood (2005)
Test of Written Spelling—Fifth Edition
• Age range: 6;0–18;0
• Country where normed: USA
• Areas assessed + approx. administration time: single word spelling; 20 min; group or individual administration
• Type of assessment: norm-referenced
• Administered by: allied health or special education professional
• Reference: Larsen et al. (2013)

Diagnostic Spelling Test for Irregular Words
• Age range: YOS 1–7
• Country where normed: Australia
• Areas assessed + approx. administration time: spelling of irregular words; 10–30 min
• Type of assessment: norm-referenced
• Available from: https://www.motif.org.au

Diagnostic Spelling Test for Non-words
• Age range: YOS 1–7
• Country where normed: Australia
• Areas assessed + approx. administration time: non-word spelling; 10–30 min
• Type of assessment: norm-referenced
• Available from: https://www.motif.org.au
TILLS—non-word spelling test
• Age range: as above
• Country where normed: as above
• Areas assessed + approx. administration time: subtest assessing non-word spelling; 10 min
• Type of assessment: as above
• Administered by: as above
• Reference: as above
Text level
York Assessment of Reading for Comprehension (YARC)
• Age range: 5;0–12;0
• Country where normed: UK; Australia
• Areas assessed + approx. administration time: accuracy, rate, and comprehension; individual; 10–15 min
• Type of assessment: norm-referenced
• Administered by: school psychologists, speech pathologists, and specialist teachers
• Reference: Snowling et al. (2012)

Gray Oral Reading Test—5th Edition (GORT)
• Age range: 6;0–23;11
• Country where normed: USA
• Areas assessed + approx. administration time: accuracy, rate, and comprehension; 20–30 min
• Type of assessment: norm-referenced
• Administered by: allied health or special education professional
• Reference: Wiederholt and Bryant (2012)
References
Australian Council for Educational Research. (2018). Progressive achievement tests in reading
(PAT-R). Australia: Author.
Australian Curriculum Assessment and Reporting Authority [ACARA]. (n.d.). ICSEA for
principals. Retrieved from https://www.myschool.edu.au/more-information/information-for-principals-and-teachers/icsea-for-principals/.
Australian Government. (2016). National assessment program-literacy and numeracy (NAPLAN).
Retrieved from https://www.nap.edu.au/.
Barton, G. (2020). Recollage as tool for self-care: Reflecting multimodally on first five years in the
academy through Schwab's lines of flight. Qualitative Research Journal, 20(1), 49–62.
https://doi.org/10.1108/QRJ-04-2019-0039.
Barton, G., & McKay, L. (2014). Impact report for Nerang state high school: A collaborative,
community initiative. Mt Gravatt: Griffith University.
Barton, G., & McKay, L. (2016a). Adolescent learners and reading: Exploring a collaborative,
community approach. Australian Journal of Language and Literacy, 39(2), 162–175.
Barton, G., & McKay, L. (2016b). Conceptualising a literacy education model for junior secondary
students: The spatial and reflective practices of an Australian school. English in Australia, 51(1),
37–45.
Bornholt, L. J. (2002). An analysis of children’s task strategies for a test of reading comprehension.
Contemporary Educational Psychology, 27(1), 80–98.
Bowyer-Crane, C., & Snowling, M. J. (2005). Assessing children’s inference generation: What
do tests of reading comprehension measure? British Journal of Educational Psychology, 75,
189–201.
Braun, V., & Clarke, V. (2006). Using thematic analysis in psychology. Qualitative Research in
Psychology, 3(2), 77–101. https://doi.org/10.1191/1478088706qp063oa.
Cartwright, K. B., Marshall, T. R., & Wray, E. (2016). A longitudinal study of the role of reading
motivation in primary students' reading comprehension: Implications for a less simple view of
reading. Reading Psychology, 37(1), 55–91. https://doi.org/10.1080/02702711.2014.991481.
Castles, A., Coltheart, M., Larsen, L., Jones, P., Saunders, S., & McArthur, G. M. (2009). Assessing
the basic components of reading: A revision of the Castles and Coltheart test with new norms
(CC2). Retrieved from www.motif.org.au.
Chapman, J. W., & Tunmer, W. E. (1995). Development of young children's reading self-concepts:
An examination of emerging subcomponents and their relationship with reading achievement.
Journal of Educational Psychology, 87(1), 154–167. https://doi.org/10.1037//0022-0663.87.1.154.
Colenbrander, D., Nickels, L., & Kohnen, S. (2016). Similar but different: Differences in comprehension
diagnosis on the Neale Analysis of Reading Ability and the York Assessment of Reading
for Comprehension. Journal of Research in Reading. https://doi.org/10.1111/1467-9817.12075.
Cunningham, A., & Carroll, J. (2011). Age and schooling effects on early literacy and phoneme
awareness. Journal of Experimental Child Psychology, 109(2), 248–255.
https://doi.org/10.1016/j.jecp.2010.12.005.
Dunn, D. M. (2018). Peabody picture vocabulary test (5th ed.). Minneapolis, MN: Pearson
Assessments.
Gathercole, S., Willis, C., Baddeley, A., & Emslie, H. (1994). The children’s test of nonword
repetition: A test of phonological working memory. Memory, 2, 103–127.
Gillam, R., & Pearson, N. (2017). Test of narrative language (2nd ed.). Austin, TX: Pro-Ed.
Gozzard, H., Baker, E., & McCabe, P. (2006). Children’s productions of polysyllabic words.
ACQuiring Knowledge in Speech, Language and Hearing, 8(3), 113–116.
Heilmann, J., & Malone, T. O. (2014). The rules of the game: Properties of a database of expository
language samples. Language, Speech, and Hearing Services in Schools, 45(4), 277–290.
James, D. G. H. (2006). Hippopotamus is so hard to say: Children’s acquisition of polysyllabic
words (Doctoral thesis). Sydney, Australia: University of Sydney.
48 2 Methodology
Katzir, T., Lesaux, N. K., & Kim, Y.-S. (2009). The role of reading self-concept and home literacy
practices in fourth grade reading comprehension. Reading and Writing, 22(3), 261–276.
https://doi.org/10.1007/s11145-007-9112-8.
Keenan, J. M., & Betjemann, R. S. (2006). Comprehending the Gray oral reading test without
reading it: Why comprehension tests should not include passage-independent items. Scientific
Studies of Reading, 10(4), 363–380.
Keenan, J. M., Betjemann, R. S., & Olson, R. K. (2008). Reading comprehension tests vary in
the skills they assess: Differential dependence on decoding and oral comprehension. Scientific
Studies of Reading, 12(3), 281–300.
Lambert, S. D., & Loiselle, C. G. (2008). Combining individual interviews and focus groups to
enhance data richness. Journal of Advanced Nursing, 62(2), 228–237.
Larsen, S., Hammill, D., & Moats, L. (2013). Test of written spelling (5th ed.). Austin, TX: Pro-Ed.
Larsen, L., Kohnen, S., Nickels, L., & McArthur, G. (2015). The Letter-Sound Test (LeST): A
reliable and valid comprehensive measure of grapheme–phoneme knowledge. Australian Journal
of Learning Difficulties, 20(2), 129–142. https://doi.org/10.1080/19404158.2015.1037323.
Leitch, S. (2006). Prosperity for all in the global economy: World class skills. Final report of the
Leitch review of skills. London: The Stationery Office.
Liamputtong, P., & Rumbold, J. (2008). Knowing differently: Arts-based and collaborative research
methods. New York: Nova Science Publishers.
Lindamood, P. C., & Lindamood, P. (2004). Lindamood auditory conceptualization test-third
edition. Austin, TX: Pro-Ed.
McNiff, S. (2008). Art-based research. In J. G. Knowles & A. L. Cole (Eds.), Handbook of the arts
in qualitative research: Perspectives, methodologies, examples, and issues (pp. 29–40). Thousand
Oaks, CA: Sage.
Nation, K., & Snowling, M. (1997). Assessing reading difficulties: The validity and utility of current
measures of reading skill. British Journal of Educational Psychology, 67, 359–370.
Neale, M. D. (1999). Neale analysis of reading ability (3rd ed.). Melbourne, VIC: Australian Council
for Educational Research.
Neilson, R. (2003). Sutherland phonological awareness test—revised (SPAT-R). Jamberoo,
NSW: Author.
Nelson, N. W., Plante, E., Helm-Estabrooks, N., & Hotz, G. (2016). Test of integrated language
and literacy skills (TILLS). Baltimore, MD: Brookes.
Parkin, C., & Parkin, C. (2011). PROBE 2: Reading comprehension assessment. Wellington, NZ:
Triune Initiatives.
Semel, E., Wiig, E. H., & Secord, W. A. (2006). Clinical evaluation of language fundamentals—
fourth edition—Australian (Standardised ed.). Marrickville: Harcourt Assessment.
Smith, A., Nelley, E., & Croft, D. (2009). PM Benchmark reading assessment resources (AU/NZ).
Melbourne: Cengage Learning Australia.
Snowling, M. J., Stothard, S. E., Clarke, P., Bowyer-Crane, C., Harrington, A., Truelove, E., &
Hulme, C. (2012). York assessment of reading for comprehension (YARC), (Australian ed.).
London: GL Assessment.
Torgesen, J. K., Wagner, R. K., & Rashotte, C. A. (2012). Test of word reading efficiency (2nd ed.).
Austin, TX: Pro-Ed.
Wagner, R. K., Torgesen, J. K., & Rashotte, C. A. (2013). The comprehensive test of phonological
processing—second edition (CTOPP-2). Austin, TX: Pro-Ed.
Westerveld, M. (2009). Measuring reading comprehension ability in children: Factors influencing
test performance. ACQuiring Knowledge in Speech, Language and Hearing, 11(2), 81–84.
Westerveld, M. F., & Gillon, G. T. (2010). Profiling oral narrative ability in young school-aged
children. International Journal of Speech-Language Pathology, 12(3), 178–189.
https://doi.org/10.3109/17549500903194125.
Westerveld, M. F., Gillon, G. T., & Boyd, L. (2012). Evaluating the clinical utility of the profile of
oral narrative ability in 4-year-old children. International Journal of Speech-Language Pathology,
14(2), 130–140. https://doi.org/10.3109/17549507.2011.632025.
Westerveld, M. F., & Vidler, K. (2016). Spoken language samples of Australian children in conversation,
narration and exposition. International Journal of Speech-Language Pathology, 18(3),
288–298. https://doi.org/10.3109/17549507.2016.1159332.
Westwood, P. (2005). Spelling: Approaches to teaching and assessment (2nd ed.). Melbourne:
Australian Council for Educational Research.
Wiederholt, J. L., & Bryant, B. (2012). Gray oral reading tests—fifth edition. Austin, TX: Pro-Ed.
Wiig, E. H., Semel, E., & Secord, W. A. (2017). Clinical evaluation of language fundamentals
Australian and New Zealand (5th ed.). Bloomington, MN: NCS Pearson.
Wolf, M., & Denckla, M. B. (2005). RAN/RAS: Rapid automatized naming and rapid alternating
stimulus tests. Austin, TX: Pro-Ed.
Open Access This chapter is licensed under the terms of the Creative Commons Attribution 4.0
International License (http://creativecommons.org/licenses/by/4.0/), which permits use, sharing,
adaptation, distribution and reproduction in any medium or format, as long as you give appropriate
credit to the original author(s) and the source, provide a link to the Creative Commons license and
indicate if changes were made.
The images or other third party material in this chapter are included in the chapter’s Creative
Commons license, unless indicated otherwise in a credit line to the material. If material is not
included in the chapter’s Creative Commons license and your intended use is not permitted by
statutory regulation or exceeds the permitted use, you will need to obtain permission directly from
the copyright holder.
Part II
The Reading Success Project (Results)
Chapter 3
Reading Success Results Across the Year Groups
Abstract This chapter describes the Reading Success results across the year groups
and compares the students’ performance on the York Assessment of Reading for
Comprehension (YARC) with results from school-based reading measures. For stu-
dents in their learning to read phase (Year 1), we also report their Year 2 results.
For students in their reading to learn phase of schooling (Year 4), we only report
their reading results at the end of that school year. All results are described using
our five-step assessment to intervention framework, based on the Simple View of
Reading. We demonstrate how the assessment results can be used to create speech-
to-print profiles for students who can be categorised as demonstrating: (a) specific
word reading difficulties; (b) mixed reading difficulties; (c) specific comprehension
difficulties; and (d) non-specified reading difficulties. Based on these profiles, we
suggest which students may benefit from supplemental intervention (i.e. Tier 2 or
Tier 3).
© The Author(s) 2020
M. F. Westerveld et al., Reading Success in the Primary Years,
https://doi.org/10.1007/978-981-15-3492-8_3

All Year 1 students were assessed on the York Assessment of Reading for Comprehension
(YARC; Snowling et al., 2012) in Term 4 of the school year. As a group,
these students in the 'learning to read' phase of development scored within normal
limits on the three YARC scales (see Table 3.1): reading accuracy (RA), reading rate
(RR), and reading comprehension (RC). It should be noted that reading rate can only
be calculated if students complete levels 1 and 2 of the YARC (as opposed to the
beginner level and level 1).
Step 2: Further Assessment of Students Who Scored Below Expectations on RC
As shown in Table 3.1, when applying a standard score cut-off of 85, 26 students
scored below expectations on the RC component of the YARC.
i. We first checked if students’ poor performance in RC could be explained by
their challenges in RA. Only 6 of the 26 students demonstrated poor RA on the
YARC.
ii. The next step was ascertaining the listening comprehension (LC) skills for the 26
students who scored below expectation in their RC ability. One student left the
school between assessment points, and therefore LC data were only available for
25 students. It was found that 8 of these 25 students scored below expectations
on the language comprehension task (Understanding Spoken Paragraphs subtest
of the CELF-5; Wiig, Semel, & Secord, 2017).
After checking RA and LC skills, there were 16 students who demonstrated RC
difficulties and performed within expectations for both LC and RA. Therefore, these
students were classified as showing non-specified reading difficulties. Of these 16
students, 6 students demonstrated low RR (SS < 85), and 4 students had RR not
calculated.
The flow chart in Fig. 3.1 provides a visual representation of the reading profiles
for all students in Year 1, including the 26 students who performed below expectations
for RC, together with their skills in RA and LC.
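The decision logic applied in Steps 1 and 2 can be sketched as a simple rule. This is an illustrative sketch only, not the authors' own procedure: the function name and the exact cut-offs (standard score below 85; language comprehension scaled score below 7) are assumptions made for the example.

```python
# Illustrative sketch of the Simple View of Reading profile classification.
# Assumed cut-offs: standard scores (mean 100, SD 15) below 85 and scaled
# scores (mean 10, SD 3) below 7 count as "below expectations".

def classify_profile(rc_ss: int, ra_ss: int, lc_scaled: int) -> str:
    """Classify a reading profile from reading comprehension (RC),
    reading accuracy (RA), and listening comprehension (LC) scores."""
    if rc_ss >= 85:
        return "typical"
    poor_ra = ra_ss < 85          # word-level reading difficulty?
    poor_lc = lc_scaled < 7       # spoken language comprehension difficulty?
    if poor_ra and poor_lc:
        return "mixed reading difficulties"
    if poor_ra:
        return "specific word reading difficulties"
    if poor_lc:
        return "specific comprehension difficulties"
    return "non-specified reading difficulties"

# Student S303 (RC SS 83, RA SS 81, LC scaled score 10; Table 3.2) falls into
# the specific word reading difficulties category under these cut-offs.
print(classify_profile(83, 81, 10))
```

Applied to the scores reported later in this chapter, the same rule reproduces the four profile categories used throughout.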
Step 3: Further Assessment of Word Recognition Skills
Further assessments of the 6 students with poor RA were conducted. These results
showed that 5 of the 6 students with RA scores below expectations also showed orthographic
difficulties on the LeST (Larsen, Kohnen, Nickels, & McArthur, 2015), i.e.
z-score < −1.0. Only 2 of the 6 students performed below expectations for phonological
awareness (score below the 25th percentile) on the Sutherland Phonological
Awareness Test—Revised (SPAT-R: Neilson, 2003). The single word reading
assessment (CC-2; Castles, Coltheart, Larsen, Jones, Saunders, & McArthur 2009)
showed strengths and difficulties across the 6 students, but all students performed
below expectations on at least one word type (i.e. regular, irregular, and/or nonsense
words).

Table 3.1 Student performance on the York Assessment of Reading for Comprehension (n = 93)

                   Reading accuracy (RA)   Reading rate (RR)   Reading comprehension^a (RC)
                   n = 93                  n = 80              n = 92^b
Mean (SD)          101.47 (13.17)          95.27 (13.88)       93.78 (14.10)
Range              70–130                  70–122              70–127
n (%) below SS80   4 (4.3%)                10 (10.8%)          14 (15.1%)
n (%) below SS85   8 (8.6%)                22 (23.7%)          26 (28%)

^a There was no significant difference between reading comprehension scores based on gender (p = 0.599) or E/ALD (p = 0.452) when applying the SS85 cut-off; ^b 1 student had poor RA at the beginner level, which meant RC could not be administered and is therefore missing data

3.1 Learning to Read Cohort Results

Fig. 3.1 Results from Steps 1 and 2 from the assessment to intervention process. One student had
left the school by step 2
The assessment results for the 26 students who performed below expectations for
RC were compiled to provide an overall picture of their strengths and weaknesses in
the different domains assessed. The reading results for each reading profile for these
26 students are described below and shown in Tables 3.2, 3.3, 3.4, and 3.5.
Specific Word Reading Difficulties Only 1 student (S303) presented with a spe-
cific word reading difficulties (SWRD) profile. As shown in Table 3.2, this student
showed difficulties in RA at both the text level and single word level. However, the stu-
dent’s RA difficulties did not seem to arise as a result of weaknesses in orthographic
knowledge or PA skills.
Mixed Reading Difficulties Table 3.3 outlines the assessment results for the five
students with mixed reading difficulties profiles at the end of Year 1. As shown, these
students had difficulties with both RA and LC. Based on further assessment of their
reading skills, it was also found that all 5 students demonstrated difficulties in single
word reading, as well as orthographic knowledge. Only 2 (S40 and S17) of the 5
students showed difficulties with phonological awareness.
Given the challenges across WR and LC, we recommend that students with this
type of profile receive supplemental intervention at a Tier 2 (or Tier 3) level of support.
Table 3.2 Assessment results for students with a specific word reading difficulties (SWRD) profile in Year 1

Code   Gender  E/ALD  RC SS  RA SS  RR SS  LC SS  REG^a   IR^a    NW^a    OK^a   PA
S303   F       N      83     81     NC     10     −1.16   −1.37   −0.36   0.95   WNL

Note E/ALD English as an additional language or dialect; RC SS reading comprehension standard score; RA SS reading accuracy standard score; RR SS reading rate standard score; LC SS language comprehension scaled score; PA phonological awareness; REG regular words; IR irregular words; NW nonsense words; OK orthographic knowledge; NC not calculated/completed; WNL within normal limits; ^a z-scores reported
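The assessments above mix three score metrics: standard scores (mean 100, SD 15), scaled scores (mean 10, SD 3), and z-scores (mean 0, SD 1). All three express the same thing, distance from the normative mean in SD units, as this small illustrative sketch (not from the original study) shows:

```python
# Standard scores, scaled scores, and z-scores all express how far a student's
# result sits from the normative mean, in standard-deviation units.

def to_z(score: float, mean: float, sd: float) -> float:
    """Convert any normed score to a z-score (SDs from the mean)."""
    return (score - mean) / sd

# A standard score of 85 and a scaled score of 7 both sit exactly 1 SD below
# the mean, matching the cut-offs applied throughout this chapter.
print(to_z(85, 100, 15), to_z(7, 10, 3))
```

This is why a standard-score cut-off of 85, a scaled-score cut-off of 7, and a z-score cut-off of −1.0 identify the same region of the normal distribution.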
Table 3.3 Assessment results for students with a mixed reading difficulties profile in Year 1
Code Gender E/ALD RC SS RA SS RR SS LC SS Single word reading OKa PA
Table 3.4 Assessment results for students with a specific comprehension difficulties (SCD) profile in Year 1

Code  Gender  E/ALD  RC SS  RA SS  RR SS  LC SS
S93   M       N      < 70   88     NC     6
S95   M       N      75     86     75     6
S20   M       Y      80     86     81     5

Note E/ALD English as an additional language or dialect; RC SS reading comprehension standard score; RA SS reading accuracy standard score; RR SS reading rate standard score; LC SS language comprehension scaled score; NC not calculated/completed
Table 3.5 Assessment results for Year 1 students with a non-specified reading difficulties profile

Code  Gender  E/ALD  RC SS  RA SS  RR SS  OK^a   LC SS  Possible reasons for low RC
S27   M       Y      82     100    101           8      Comprehension monitoring
S48   F       N      80     107    98            8      Comprehension monitoring
S308  M       N      80     97     93            10     Comprehension monitoring
S92   F       N      74     107    87            8      Comprehension monitoring
S31   M       Y      84     90     87            9      Comprehension monitoring
S16   F       N      < 70   118    113           10     Comprehension monitoring
S22   M       Y      74     94     80            10     Poor reading fluency
S23   F       Y      < 70   90     NC            11     Poor reading fluency
S71   M       Y      80     100    80            10     Poor reading fluency
S49   F       N      < 70   86     NC     −1.8   8      Poor reading fluency; poor single word reading (regular)
S45   M       N      83     88     NC     −2.0   10     Poor reading fluency; poor single word reading
S37   F       N      83     85     NC     −1.4   9      Poor reading fluency; poor single word reading (regular and nonsense words)
S75   M       N      74     87     80     −1.0   7      Poor reading fluency; poor single word reading (and language borderline)
S30   F       N      72     89     77            12     Poor reading fluency; single word reading WNL

^a z-scores reported
The results for the ‘learning to read’ cohort show that when using the YARC as a
standardised test of reading performance, using SS85 as a cut-off, approximately 28%
of students were considered at risk, based on their RC performance in the learning to
read phase of development. Using our step-by-step approach based on the SVR as a
framework, four different reading profiles were identified at the end of Year 1. The
students’ performance on the YARC assessment was compared with the results from
school-based reading measures (i.e. PM Benchmark Reading Assessments). We will
discuss these results in the following section.
During the early school years, the school used the PM Benchmark Reading Assessments
(Smith, Nelley, & Croft, 2009; see Chap. 2). Based on this individually administered
test, students were categorised as average or below average (using level 16
as the benchmark for satisfactory performance). In this section, we compare the stu-
dents’ performances on the assessments administered as part of the Reading Success
project with the results from the PM Benchmark, a school-based reading measure.
Table 3.7 Reading data of the 13 Year 1 students who passed PM Benchmark but who performed
below expectations on the YARC RC subtest
Code Gender E/ALD PM RC SS RR SS RA SS USP Reading profile
S30 F N 21 72 77 89 12 NS
S37 F N 16 83 NC 85 9 NS
S31 M Y 21 84 87 90 9 NS
S48 F Y 21 80 98 107 8 NS
S27 M Y 24 82 101 100 8 NS
S22 M Y 16 74 80 94 10 NS
S16 F N 24 < 70 113 118 10 NS
S308 M N 24 80 93 97 10 NS
S92 F N 18 74 87 107 8 NS
S71 M Y 17 80 80 100 10 NS
S29 M Y 16 83 NC 88 NA NA
S38 M N 21 < 70 NC 84 4 Mixed
S20 M Y 19 80 81 86 5 SCD
Note E/ALD English as an additional language or dialect; PM PM Benchmark Reading Assessment
in level of achievement; SS standard score with 85–115 indicating performance within age
expectations; USP Understanding Spoken Paragraphs subtest of the CELF-5; NS non-specified;
NC not calculated; NA not available; SCD specific comprehension difficulties
These comparisons are discussed further below. It is important to note, however, that the PM level is based
on a child’s performance across decoding and comprehension skills, whereas the
YARC reports a student’s skills separately across these two domains. Thus, while
the following compares these assessments in terms of identification of children at risk
of reading difficulties, the differences across the assessments must be acknowledged
and considered when interpreting these comparisons.
Table 3.7 outlines the assessment data for the 13 students who performed within
average expectations for their year level on the PM Benchmark, but failed to meet
expectations on the YARC RC component. As shown, 10 of these 13 students had
profiles consistent with ‘non-specified’ reading difficulties at the end of Year 1.
As shown in Table 3.6, a further 4 students showed average RC skills on the YARC
at the end of Year 1 but failed to meet the benchmark for PM. These students’ results
are shown in Table 3.8.
Table 3.8 Reading data of four Year 1 students who performed below expectations for PM
Benchmark but who performed within normal limits on the YARC RC subtest
Code Gender E/ALD PM RC SS RR SS RA SS Reading profile
S9 M N 15 98 76 99 Typical
S87 M Y 11 104 83 92 Typical
S299 M N 14 86 87 89 Typical
S32 M N 9 93 NC 81 Typical
Note E/ALD English as an additional language or dialect; PM PM Benchmark Reading Assessment
in level of achievement; RC reading comprehension; SS standard score with 85–115 indicating
performance within age expectations; RR reading rate; RA reading accuracy; NC not calculated
There was a strong correlation between students' performance on the PM Benchmark
(level) and their performance on the YARC RA (SS), r = 0.81 (p < 0.001).
The results from this comparison are shown in Table 3.9.
As shown in Table 3.9, 81 students met expectations on both assessments (87%).
However, 12 students (13%) only met expectations on one of the tests. Only one stu-
dent performed well on the PM Benchmark (level 21) but did not reach expectations
in RA on the YARC (S38; see Table 3.7). This student showed a mixed reading dif-
ficulties profile. The breakdown of assessment results for the remaining 11 students
who performed below expectations on the PM Benchmark but showed satisfactory
performance in RA on the YARC is shown in Table 3.10. The three students with
'typical' Year 1 reading profiles based on the YARC RA (S9, S87, S299) also appear
in Table 3.8, which lists the students who performed WNL on the YARC RC but
below expectations on the PM Benchmark.
Table 3.10 Reading data of 11 Year 1 students who performed below expectations on the PM
Benchmark, but within expectations for RA on the YARC
Code Gender E/ALD PM RC SS RR SS RA SS Reading profile
S9 M N 15 98 76 99 Typical
S87 M Y 11 104 83 92 Typical
S299 M N 14 86 87 89 Typical
S93 M N 7 70 NC 88 SCD
S95 F N 12 75 75 86 SCD
S79 F N 10 84 75 88 NS
S45 M N 14 83 NC 88 NS
S98 F N 15 84 75 86 NS
S75 M N 13 74 80 87 NS
S49 F N 7 70 NC 86 NS
S23 F Y 11 70 NC 90 NS
Note E/ALD English as an additional language or dialect. PM PM Benchmark Reading Assessment
in level of achievement. RC reading comprehension; SS standard score with 85–115 indicating
performance within age expectations; RR reading rate; RA reading accuracy; NC not calculated;
SCD specific comprehension difficulties; NS non-specified
All students were closely monitored during Year 2 with some students obtain-
ing supplemental intervention targeting their areas of difficulties (i.e. WR, LC, or
both, depending on their reading profiles). Most of the students were reassessed for
monitoring purposes in Term 4 of Year 2, and these results are discussed below.
A total of 70 students (75% of the original cohort of 93) were available (with
parental consent) for follow-up assessments one year later (Year 2). Four students
had left the school, and parent/caregiver consent forms were not returned for 19
students. Of these 19 students, five had been ‘flagged’ as at risk, based on their
performance in RC (see Fig. 3.1). The results of the Year 2 YARC performances are
displayed in Table 3.11.
When investigating these students’ performance on the YARC in Year 1, the
following results were found:
3.3 Follow-up Results One Year Later—Year Two
Table 3.11 Student performance on the York Assessment of Reading for Comprehension (n = 70) in Year Two

                   Reading accuracy   Reading rate   Reading comprehension
                   n = 70             n = 68         n = 70
Mean (SD)          100.99 (12.2)      100.68 (14.1)  99.86 (13.1)
Range              75–130             70–124         73–130
n (%) below SS85   4 (5.7%)           10 (14.7%)     6 (8.6%)
• Of the 4 students (S40, S49, S17, and S45) who scored below SS85 in RA in
Year 2, two students showed challenges in RA in Year 1 (S40 and S17), and all
four had shown difficulties with reading rate as well as significant difficulties in
orthographic knowledge (z-scores < −2.0).
• Of the 10 students who scored below SS85 in RR in Year 2, all 10 students had
also shown challenges in RR in Year 1.
• Of the 6 students (S40, S23, S303, S82, S49, and S17) who scored below SS85
in RC in Year 2, five students showed challenges in RC in Year 1 (SS < 85) and
3 students showed challenges in RA in Year 1. All students except S82 had been
flagged ‘of concern’ in Year 1. Unfortunately, no further assessment results are
available for S82, as this student was away when we conducted our follow-up
testing.
Using our five-step assessment to intervention framework, 28% of students at the end
of Year 1 were identified as having reading difficulties. Using the SVR as a guide,
four different reading profiles emerged. When comparing these results with school-based
reading measures, reasonably high correlations between the PM Benchmark
data and student performance on the YARC RC and/or YARC RA were found, with
more than 80% agreement in the classification of average versus below-average
readers on both assessments. Based on these results, the PM Benchmark may be a suitable
on both assessments. Based on these results, the PM Benchmark may be a suitable
reading assessment during the early years of schooling, although we recommend
using a higher cut-off for the PM Benchmark and monitoring those students who end
up on level 16 or 17, particularly for reading fluency. For example, teachers may
administer the Test of Word Reading Efficiency 2nd Edition (TOWRE-2; Torgesen,
Wagner, & Rashotte, 2012), which assesses a student’s fluency in sight word reading
and phonetic decoding skills and takes only 5 min.
When the students were reassessed one year later, many students showed a marked
improvement. It was promising to see that at the end of Year 2 (third year of school-
ing), only 3 students showed significant reading difficulties; two of these students
demonstrated a mixed reading difficulties profile (S40 and S17) and one (S49) showed
66 3 Reading Success Results Across the Year Groups
specific word recognition difficulties. These students require focused, more inten-
sive support for WR. Further support is also indicated for all students who continued
to struggle in reading comprehension. Chapter 5 provides examples of interventions
that specifically target the language comprehension skills needed for successful read-
ing comprehension. Finally, the importance of fluent decoding (as measured through
RR) should not be underestimated. Fluency in word recognition is needed to obtain
automaticity in reading skills to allow students to move from their learning to read
to reading to learn stage of schooling.
As a group, the students in the reading to learn phase of development (i.e. Year
4 cohort) scored within normal limits on the three YARC scales. These results are
shown in Table 3.12.
Table 3.12 Year 4 student performance on the York Assessment of Reading for Comprehension (n = 77)

                   Reading accuracy   Reading rate   Reading comprehension^a
                   n = 77             n = 76         n = 77
Mean (SD)          91.97 (12.73)      88.18 (11.77)  88.26 (11.82)
Range              70–117             70–111         70–122
n (%) below SS80   15 (19.5%)         18 (23.7%)     23 (29.9%)
n (%) below SS85   23 (29.9%)         29 (38.2%)     33 (42.9%)

^a There was no significant difference between reading comprehension scores based on gender (χ2(1) = 0.070, p = 0.792) or E/ALD (χ2(1) = 0.210, p = 0.647) using SS85 as cut-off
3.5 Reading to Learn Cohort Results
Further assessments of single word reading were conducted with the 18 students with
poor RA on the YARC. Of these 18 students with RA scores below expectations, a
total of 17 showed difficulties in single word reading on the CC-2 across at least one subtest.
Fig. 3.2 Results from Steps 1 and 2 from the assessment to intervention process. One student was
unavailable for testing at step 2
The assessment results for the 33 students who performed below expectations for
RC were compiled to provide an overall picture of their strengths and weaknesses in
the different domains assessed. The reading results for these 33 students are shown
in Tables 3.13, 3.14, 3.15, and 3.16.
Specific Word Reading Difficulties Ten students presented with a SWRD profile.
As shown in Table 3.13, all students demonstrated weaknesses in RA at both the text
level and single word level. Seven of these students also had underlying orthographic
knowledge difficulties.
Mixed Reading Difficulties Table 3.14 outlines the assessment results for the
seven students with MRD profiles. As shown, these students demonstrated difficulties
with both RA and LC. Further assessment of their reading skills showed that all but one of these students also demonstrated difficulties in single word reading.
These students should be considered for supplemental intervention at a Tier 2 or Tier
3 level of support, targeting both WR and LC skills.
Specific Comprehension Difficulties Table 3.15 presents the reading results for
students identified with a specific comprehension difficulties profile. These students
showed difficulties in comprehension, regardless of the modality (spoken or written).
Non-specified Reading Difficulties Finally, there were 9 students who performed
poorly in RC, but showed adequate performance in RA (on the YARC) and on a
language comprehension measure. The results are shown in Table 3.16. As previously
described in Chap. 1, reasons for poor performance in RC could include difficulties
in comprehension monitoring and working memory capacity (e.g. the students who
showed difficulty with RR and/or single word reading, including S37, S23, and S41).
In drawing together the findings from assessment for the ‘reading to learn’ cohort, it
was evident that approximately 43% of these Year 4 students (n = 33) demonstrated
RC skills below expectations (applying SS85 cut-off). Using our step-by-step assess-
ment process, based on the SVR as a framework, four different reading profiles were
identified, with 7 students showing MRD, 6 showing SCD, 10 showing SWRD, and
9 students demonstrating a non-specified reading deficit. Our assessment battery
seemed effective in helping us identify these different reading profiles. Using a lan-
guage comprehension task allowed us to differentiate between students with dyslexia
(who showed poor reading comprehension due to their challenges in word recogni-
tion) and those whose difficulties in language comprehension contributed to their low
scores in reading comprehension. Further assessment of students who demonstrated
Table 3.13 Assessment results for students with specific word reading difficulties (SWRD) profiles in Year 4

Code  Gender  E/ALD  RC SS  RA SS  RR SS  LC SS  Single word reading      Orthographic
                                                 REGa    IRa     NWa      knowledgea
S26b  M       Y      71     81     80     8      −1.53   −0.56   −2.35    −1.90
S46   M       Y      73     70     70     12     −2.48   −2.12   −2.62    −2.32
Table 3.14 Assessment results for students with mixed reading difficulties profiles in Year 4

Code  Gender  E/ALD  RC SS  RA SS  RR SS  LC SS  Single word reading
                                                 REGa    IRa     NWa
S67   M       N      <70    <70    NC     3      −2.97   −3.09   −3.09
S66   F       N      <70    <70    <70    2      −2.30   −2.03   −3.09
S02   F       N      <70    71     <70    5      −2.31   −1.37   −2.42
S09   M       Y      70     83     99     3      0.12    −0.81   0.33
S07   F       Y      73     77     <70    5      −1.85   −1.09   −2.10
S05   F       Y      77     71     85     3      −1.72   −0.76   −1.62
S61   F       N      84     83     80     6      −1.81   −1.25   −1.48
Note E/ALD English as an additional language or dialect; RC SS reading comprehension standard score; RA SS reading accuracy standard score; RR SS reading
rate standard score; LC SS language comprehension scaled score; REG regular words; IR irregular words; NW nonsense words; NC not calculated/completed;
a z-score values reported
Table 3.15 Assessment results for students with specific comprehension difficulties (SCD) reading profiles in Year 4

Code  Gender  E/ALD  RC SS  RA SS  RR SS  LC SS  Single word reading
                                                 REGa    IRa     NWa
Table 3.16 Assessment results for students with non-specified reading profiles in Year 4

Code  Gender  E/ALD  RC SS  RA SS  RR SS  LC SS  Single word reading
                                                 REGa    IRa     NWa
S37   M       N      81     85     75     10     −0.41   −1.77   −0.11
S71   F       N      77     97     93     8      NC      NC      NC
S60   F       Y      79     103    95     8      −0.73   −0.76   −0.38
S50   F       N      79     89     93     9      −0.73   0.41    −1.30
S17   F       N      73     91     77     10     −0.41   −0.21   −0.83
S57   M       Y      79     89     97     10     −0.14   −0.83   2.34
S23   M       N      73     91     82     11     −0.73   −1.09   −1.04
S65   F       N      84     95     87     8      NC      NC      NC
S41   M       N      79     93     82     12     −0.35   −1.09   −1.14
Note E/ALD English as an additional language or dialect; RC SS reading comprehension standard score; RA SS reading accuracy standard score; RR SS reading
rate standard score; LC SS language comprehension scaled score; REG regular words; IR irregular words; NW non-words; NC not completed; a z-scores
reported
poor performance in word recognition at text level (i.e. SWRD and MRD profiles)
showed that all but one of these students demonstrated weaknesses in single word
reading (non-word reading and regular word reading) using the CC-2; 70% of the
students with SWRD showed significant difficulties with orthographic knowledge.
Students’ speech-to-print profiles may now be used to identify which skills should be
targeted in intervention, whether at Tier 2 or Tier 3 within an RtI framework. The interventions provided on the basis of these reading profiles are described in detail in Chap. 5.
The students’ performance on the YARC subtests was then compared to the data
obtained as part of the school-based reading assessments. As described in Chap. 2,
the PAT-R (Australian Council for Educational Research, 2018) was routinely used
at the school to assess the reading skills of the Year 4 students. Data from 64 students
were available.
Table 3.17 Year 4 (n = 64) students identified with reading difficulties on the RC component of the YARC versus the PAT-R

                              PAT-R
                              Average  Below average
YARC RC  WNL                  31       4
         Below expectations   12       17
Note PAT-R Progressive Achievement Tests in Reading; YARC RC York Assessment for Reading
Comprehension Reading Comprehension; WNL within normal limits. Shading represents the
students shown in Table 3.18
Table 3.18 Performance of the 12 students who passed the PAT-R but performed below expectations
on the YARC RC (shaded), as well as the 4 students who failed the PAT-R but performed WNL on
the YARC RC
Code  Gender  E/ALD  PAT-R  RC SS  RA SS  RR SS  Reading profile
S07   F       Y      4      73     77     70     Mixed
S11   F       N      5      77     83     80     SWRD
S26   M       Y      4      71     81     80     SWRD
S33   F       N      4      79     74     70     SWRD
S36   F       N      4      83     78     80     SWRD
S40   F       N      5      83     72     77     NC
S48   M       N      6      83     85     94     SCD
S49   F       N      5      75     93     90     SCD
S50   F       N      4      79     89     93     NS
S56   M       N      5      83     97     93     SCD
S60   F       Y      7      79     103    95     NS
S65   F       N      4      84     95     87     NS
S04   F       N      2      85     91     71     Typical – low RR
S22   F       N      2      102    83     76     Typical – low RA/RR
S34   M       N      2      94     87     71     Typical – low RR
S69   F       N      2      90     101    94     Typical
Note Standard scores reported for the YARC; scaled scores reported for the PAT-R; PAT-R
Progressive Achievement Tests in Reading; RC reading comprehension; RA reading accuracy;
RR reading rate; Mixed mixed reading difficulties; SWRD specific word reading difficulties; NC
non conclusive; SCD specific comprehension difficulties; NS non-specific reading difficulties
Approximately 43% of the reading to learn cohort demonstrated poor reading skills
at the end of Year 4 (i.e. their fifth year of schooling). In applying the five-step assess-
ment to intervention process, four different reading profiles were identified. When
comparing students’ performance on the YARC RC and YARC RA with the PAT-R,
between 68 and 75% agreement in classification of average versus below average
readers was found. Closer inspection of the comparisons showed no clear pattern,
although students with specific word recognition difficulties (dyslexia) were likely to
perform well on the PAT-R. We suggest that the PAT-R may not be sufficiently sensitive to reading challenges and recommend that the school consider an alternative monitoring tool.
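The reported classification agreement can be checked directly against the counts in Table 3.17; the following minimal Python sketch reproduces the RC comparison (the RA comparison, not tabulated here, yields the lower end of the 68–75% range):

```python
# Classification agreement between YARC RC and PAT-R (counts from Table 3.17).
# Rows: YARC RC (WNL, below expectations); columns: PAT-R (average, below average).
counts = [[31, 4],
          [12, 17]]

total = sum(sum(row) for row in counts)  # 64 students assessed on both tools
agree = counts[0][0] + counts[1][1]      # students classified alike by both tools
print(f"agreement: {agree}/{total} = {agree / total:.0%}")  # 48/64 = 75%
```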
This chapter presented the results of the assessments conducted as part of the Reading
Success project and described the comparisons between these assessment results
and school-based data. Assessment results were reported for two distinct cohorts
of children: (i) those in the learning to read phase and (ii) those in the reading
to learn phase of development. It was shown that at the end of Year 1, 28% of
the learning to read cohort demonstrated difficulties in RC. However, when these
students were followed up 12 months later, only 9% showed persistent difficulties
in RC at the end of Year 2, with the cohort showing improvements in all aspects
of reading development. When considering the assessment results for the reading to
learn cohort, approximately 43% of the students demonstrated poor RC skills at the
end of Year 4. We showed that, guided by the SVR as a framework, we could identify the different underlying causes contributing to the students’ RC challenges, highlighting distinct reading profiles. Based on the findings reported in this chapter,
we urge school teams to implement targeted interventions based on each student’s
reading profile. Finally, this chapter showed that while the PM Benchmark may be
a suitable reading assessment during the early years of schooling, the PAT-R lacked
sensitivity in detecting reading challenges in older readers.
References
Australian Council for Educational Research. (2018). Progressive achievement tests in reading
(PAT-R). Australia: Author.
Castles, A., Coltheart, M., Larsen, L., Jones, P., Saunders, S., & McArthur, G. M. (2009). Assessing
the basic components of reading: A revision of the Castles and Coltheart test with new norms
(CC2). Retrieved from www.motif.org.au.
Crystal, D. (1987). Towards a bucket theory of language disability: Taking account of interaction
between linguistic levels. Clinical Linguistics and Phonetics, 1, 7–22.
Larsen, L., Kohnen, S., Nickels, L., & McArthur, G. (2015). The Letter-Sound Test (LeST): A
reliable and valid comprehensive measure of grapheme–phoneme knowledge. Australian Journal
of Learning Difficulties, 20(2), 129–142. https://doi.org/10.1080/19404158.2015.1037323.
Neilson, R. (2003). Sutherland phonological awareness test—revised (SPAT-R) (Revised ed.).
Jamberoo, NSW: Author.
Semel, E., Wiig, E. H., & Secord, W. A. (2006). Clinical evaluation of language fundamentals—
fourth edition—Australian (4th ed.). Marrickville: Harcourt Assessment.
Smith, A., Nelley, E., & Croft, D. (2009). PM benchmark reading assessment resources (AU/NZ).
Melbourne: Cengage Learning Australia.
Snowling, M. J., Stothard, S. E., Clarke, P., Bowyer-Crane, C., Harrington, A., Truelove, E., &
Hulme, C. (2012). York assessment of reading for comprehension (YARC), (Australian ed.).
London: GL Assessment.
Torgesen, J. K., Wagner, R. K., & Rashotte, C. A. (2012). Test of word reading efficiency 2 (TOWRE-
2). Austin, TX: Pro-Ed.
Wiig, E. H., Semel, E., & Secord, W. A. (2017). Clinical evaluation of language fundamentals
Australian and New Zealand (5th ed.). Bloomington, MN: NCS Pearson.
Open Access This chapter is licensed under the terms of the Creative Commons Attribution 4.0
International License (http://creativecommons.org/licenses/by/4.0/), which permits use, sharing,
adaptation, distribution and reproduction in any medium or format, as long as you give appropriate
credit to the original author(s) and the source, provide a link to the Creative Commons license and
indicate if changes were made.
The images or other third party material in this chapter are included in the chapter’s Creative
Commons license, unless indicated otherwise in a credit line to the material. If material is not
included in the chapter’s Creative Commons license and your intended use is not permitted by
statutory regulation or exceeds the permitted use, you will need to obtain permission directly from
the copyright holder.
Chapter 4
Reading Self-Concept and Student
Perceptions
Abstract This chapter describes the results of the Year 4 student responses to the
Reading Self-Concept Scale (RSCS; Chapman & Tunmer, 1995). It then outlines three
case studies that include data from interviews carried out with students involved in
the programme. These students were selected on the basis of their low responses on
the RSCS, regardless of their performance in reading. One member of the research
team invited the students to participate firstly in a focus group where the students
created a collage that represented themselves as a reader and also depicted how
they felt about reading. The students were then invited to talk to the researcher in an
individual interview with the collage as a stimulus for discussion and the development
of strengths, weaknesses, opportunities, and threats chart. Findings showed that the
students enjoyed reading for relaxation at home but felt the reading practices at
school were not always as engaging as they could be for them. The chapter outlines
a number of recommendations for schools and teachers in relation to the students’
feedback.
4.1 Introduction
All Year 4 students who participated in the Reading Success project were asked
30 questions from the RSCS (Chapman & Tunmer, 1995) to measure their reading
self-concept. As shown in Table 4.1, overall, the students scored between 3 and 4
on all three subscales (on a scale from 1 [no, never] to 5 [yes, always]), indicating generally positive perceptions regarding difficulty, competence, and attitude towards
reading. These results are very similar to those reported by Chapman and Tunmer
(1995). Closer inspection of individual student results showed that only 7 of the 78
students scored < 3 on the total RSCS. Ten students perceived reading to be difficult
(scores < 3 for difficulty); 13 students perceived themselves as not very competent in
reading (scores < 3 for competence); 8 students scored < 3 when asked about their
attitude towards reading. We investigated correlations between the subscales and found that the difficulty factor correlated with competence at r = 0.67 (p < 0.001), with attitude at r = 0.26 (p = 0.023), and with the total scale at r = 0.80 (p < 0.001); the attitude factor correlated with competence at r = 0.40 (p < 0.001) and with the total scale at r = 0.86 (p < 0.001).
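The subscale relationships reported here are ordinary Pearson product-moment correlations. As an illustrative standard-library sketch (the scores below are hypothetical, not the study data):

```python
from math import sqrt

def pearson_r(x, y):
    """Pearson product-moment correlation between two equal-length lists."""
    n = len(x)
    mean_x, mean_y = sum(x) / n, sum(y) / n
    cov = sum((a - mean_x) * (b - mean_y) for a, b in zip(x, y))
    sd_x = sqrt(sum((a - mean_x) ** 2 for a in x))
    sd_y = sqrt(sum((b - mean_y) ** 2 for b in y))
    return cov / (sd_x * sd_y)

# Hypothetical per-student subscale means on the 1-5 RSCS response scale
difficulty = [3.2, 2.8, 4.0, 3.6, 2.2, 3.9, 3.1, 4.4]
competence = [3.0, 2.5, 4.2, 3.8, 2.1, 4.1, 3.3, 4.5]
print(round(pearson_r(difficulty, competence), 2))
```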
We also investigated the links between students’ performance on the RSCS and
their performance on the Reading Comprehension subtest from the York Assessment
of Reading for Comprehension (YARC; Snowling et al., 2012). In our cohort of Year
4 students, there was a positive but weak correlation between reading comprehension
and the total self-reported reading self-concept score (r = 0.24, n = 77, p = 0.037).
This meant that students with higher RC scores also reported higher perceptions
of their reading self-concept. When looking at the individual components of the
RSCS, there was a statistically significant relationship between YARC RC score
and self-perceptions in terms of difficulty (r = 0.40, n = 77, p < 0.001). That is, students with higher RC scores tended to perceive reading as easier. However, there was no significant correlation between reading
comprehension performance on the YARC and self-concept in terms of ‘attitude’
towards reading (r = 0.012, n = 77, p = 0.918) and perceptions of competence
in reading (r = 0.143, n = 77, p = 0.215). Finally, we compared the subgroups of students who performed within expectations on the YARC RC (SS > 85) and those who performed below expectations. As shown in Table 4.1,
students who performed below expectations on the YARC RC (see also Chap. 3),
found reading more difficult (t(75) = −4.272, p < 0.001) and perceived themselves as less competent (t(75) = −2.609, p = 0.011), but there were no group differences in attitude towards reading (t(75) = 0.282, p = 0.779).
To better understand the students’ perceptions around reading and to gain insight into
their reading experiences, a focus group was carried out with three students from Year
4 (pseudonyms: Daisy [S03], Lily [S33], and Tiana [S36]), who rated themselves the
lowest on the Reading Self-Concept Scale. The students were invited to participate
in a small group to create their own individual collage by using pre-cut images from
magazines. The purpose of this activity was for the students to feel comfortable with
the researcher initially in a small group. In the focus group, the main questions were
(see also Chap. 2): (1) Tell me about you as a reader; and (2) Tell me about your
experiences in reading. The students then returned individually at different times
to talk to the researcher about their collage and themselves as a reader as well as
reading in general. In this interview, a SWOT (strengths, weaknesses, opportunities,
and threats) analysis was completed by each student, in which they talked about their own strengths and weaknesses as well as what would help and/or hinder their progress
in reading. The questions used in each section of the SWOT analysis are shown in
Table 4.2.
The students all stated that they liked reading. They all felt it helped them to
relax and also provided an opportunity to not feel bored. Daisy liked to read with her
sister. The students also shared their areas of interest in terms of the distinct topics
and genres they liked to read.
I like reading as you can just pick up a random book on any topic. It is relaxing. I also like
reading because I can read if I get bored. I like reading about different cultures. I like reading
French books. I am reading the book Chocolat with my mum. (Daisy)
I like reading because you sometimes you feel like you just have to get away and sometimes
it just takes you to a different world. I just like reading any type of book. I read magazines,
newspapers, I like looking at the photos in the magazines, especially Kim Kardashian. (Lily)
I like all sorts of books – the most I like is about plants and animals. (Tiana)
The students also talked about reading a wide range of text types including chapter
books, magazines, information on their iPads, and non-fiction books such as those about natural disasters.
Reading Strategies
The students spoke about what they did when they had challenges reading some
of the texts they engaged with. Interestingly, the three students discussed quite a number of different strategies, including skipping unfamiliar words and seeing if the text still makes sense; using dictionaries; having an adult sound out difficult words; writing down the words, practising the spelling, and finding out the meaning; and asking for help.
I like being a reader but sometimes I don’t like reading cos there are really tricky words that
I can’t read and I don’t understand and I can’t sound out. I have trouble with really long
words. (Lily)
I normally just skip the words but it makes sense to me. Mum says sound it out aloud but I
don’t like to do that because I am actually a really shy person. I had an incident where I read
out aloud…(Daisy)
Each student said that they were placed in a specific reading group at school. They explained that the reading groups were organised by proficiency level and included the red group, the green group, the pink group, and the blue group. Interestingly, although the reading groups changed once a term depending on student progress, the students all said that they had been in the same group all year, despite believing they had probably improved enough to be moved up into the next group.
The students then completed their collages and attended an interview one at a time.
Lily’s (S33) performance on the Reading Self-Concept Scale indicated very low
scores on ease of reading/difficulty and competence (1.8 and 2.2, respectively), but a higher score on attitude (3.6), indicating she perceived reading to be difficult and herself as not very competent, yet maintained a positive attitude and liked reading. Lily’s
performance on the YARC showed significant difficulties in reading accuracy (SS
74) and reading rate (SS < 70), affecting her reading comprehension (SS 79). Her
language comprehension was within normal limits (SS 11 on the Understanding
Spoken Paragraphs subtest of the CELF-4; Semel, Wiig, & Secord, 2006). Lily
demonstrated a specific word reading difficulties profile.
Lily talked about her collage (see Fig. 4.1) by saying she liked to read about animals, in particular dogs, and liked cartoons. She also said she found reading relaxing,
especially if she was reading outside and on the weekends. Lily liked to read chapter
books and mentioned the Dork Diaries and books by author Paul Jennings. She talked
about the reading groups in her classroom. These included the gold group, who needed more practice; the silver group, who did not read bigger books; the blue group, who had confidence; and the turquoise group, who were more than average.
Lily described a range of strategies for reading in her interview, including reading in her head, reading aloud, sounding out words, and guessing the meaning of the story. She said her strengths as a reader related to being good at guessing or making inferences, and to reading both in her head and aloud.
Sounding out, reading a sentence and then come back to the word to try and figure it out
myself because it’s hard to understand some people when they put it in a way so I try to figure
it out myself, and most of the time that helps because sometimes other people use these big
words that I don’t understand so I can help myself with it because I just don’t understand
other people what they say, so usually I help myself with it and if I don’t I usually go to like
my friends and if they can’t help me I go to the teacher or my mum.
Lily mentioned that she mainly asked her mum for help, as her dad was asleep a lot due to having to go to work at 3 am. She also said she was really busy with scouts, soccer, and other activities, and her dad also liked sports. This meant that most of her afternoons were busy with extra-curricular activities. Lily believed that her reading would improve
through:
Confidence in myself, so if I’m doing something, if I can just believe in myself and have
courage…
I could improve by looking at the pictures to help me figure out the words and asking for
some help if I don’t know…
She said she did not like to make mistakes as she was worried about other students’
reactions.
Lily’s SWOT is shown in Table 4.3, in which the researcher recorded Lily’s comments as well as her own observations.
Daisy, in her interview, explained that her bedroom had just been renovated and
they used pastel colours to create a relaxed feeling. As reading made her feel relaxed
she chose images that used the same colour palette (see Fig. 4.2). The other items, such as the cake, macarons, and the hanging chair, all reminded her of how reading was relaxing and how she liked to read. The gems meant that she was achieving great
things with her reading.
Daisy also talked about being a dancer who was at dance 24/7. This experience influenced the types of books she read, including ones with costumes. She also read about cooking and food, as her mum had a Thermomix. Reading made her hungry, so she usually had snacks when she read. She said reading made her feel relaxed:
I like reading books about mysteries, like I like reading, like I’ve just finished this book
about these girls went and lived in a palace and they kicked out because there was a special
thing in the palace that they always wanted, they’re like thieves cause they stole something
in the palace, so they stole like a jewel from the palace, but there were two jewels that they
wanted and so they had to keep going back and getting the jewels like without them getting
kicked out…
I looked at it because it’s kind of like a costume without the like tulle or whatever…and it’s
kind of like a costume and I like to read books about dance as well…at the moment I’m
reading Chocolat and I, so me and my mum are reading that, and then by myself I’m reading
the Maddie Diaries by Maddie Seigler which is really good and I like it.
In these comments, we can see that Daisy is able to recount the narratives from the
books she was reading at the time.
At school, Daisy said she was part of the pink reading group, but she thought she should be in the blue group. Her group had not changed throughout the entire year:
I think I’ve improved a lot with my reading, but like I would have liked to have moved up to
the blue group but I don’t know if I’m ready for that yet, like I can read really, like I’ve got
the fluency, my teacher says I’ve got fluency, I’ve got expression, I’ve got like, I can connect
to the text, she said I’ve got all that but I don’t know if she thinks I’m ready to go up to the
blue group…
I think because blue group is where all the really smart kids are…and they’re really good
readers so I don’t think I’m up to their standards yet, I think I’m in the middle of their
standards…
Daisy’s comments showed that she was well aware of reading levels and what these
levels entail. She could effectively gauge what type of reader she was and where she
sat in relation to her peers. She was also aware of the areas of reading she found
difficult and some strategies that could assist.
I think the words I find difficult because I’m only starting now to understand some words,
like the words are a bit hard for me… Sometimes I have trouble understanding the meaning
of the story…Unless I talk to mum about it… I don’t understand the meaning of this story
because a lot of the books I’m reading mum has read and I don’t understand the meaning of
the story.
I like getting criticism and feedback to help me grow because my mum said if you get
criticism then you know that your teacher or whatever is, you know that you’ve got someone
who is trying to help you do it so I like getting that.
When asked whether she felt she had a voice at school in terms of choice, Daisy said she would prefer to have the option of reading a wider variety of texts at school, such as magazines and newspapers. She commented on feeling behind in
her work in other areas such as maths, as the topics changed quickly particularly due
to NAPLAN.
I try and ask for help, because I’m a year behind everyone, I can’t keep up with the work
as easy. I need, like usually we go on one topic, we learn that for a week and then the next
topic, but I understand because we missed so much of our learning in NAPLAN and we have
to do that, but I’m falling behind a bit in my maths and everything. (Daisy)
Daisy’s SWOT is shown in Table 4.4, in which the researcher recorded Daisy’s comments as well as her own observations.
Such a metaphor was quite powerful for Tiana, as reading at school was, for her, the opposite. In Year 1, she was asked to read aloud to the class but was very shy. When she did not say anything for about 30 seconds, everyone laughed at her. This experience negatively impacted Tiana’s reading confidence.
The memories in Year 1 just keep coming back to me. When I read in a circle – I can’t read
out aloud – I don’t like people hearing me. I like hiding behind someone. […] I didn’t speak
for 30 seconds and then everyone laughed at me. I am trying to get over it but it is really
hard to get over it.
Tiana talked about how she really did not like being singled out to read words or do spelling; it made her feel ‘really anxious’.
I don’t like people hearing me – even if you are in a corner. I get really anxious when it
comes to spelling…I only like to do things when everyone does it I don’t like being singled
out. I like reading to our buddies.
In relation to meeting challenges with reading, Tiana talked about feeling frustrated
sometimes but then she would usually find an easier book to read to feel better.
Normally I feel like this with reading. Sometimes I just rage quit and get really angry and
throw the book, I have some anger issues. Yeah I normally get a book that is easier to read.
Like my dad’s old yellow cat and dog book. It is about this friend who found a dog wandering
in the forest. They make stuff together and talk together.
She also used a number of other strategies such as sounding out chunks of words
that were difficult and ones she might experience when reading science texts.
Sounding out chunks of the words and I use the strategy of seeing a word inside a word. I
normally use that or if I have real trouble I skip the word and then come back to it so I know
what it will mean…It is normally the long words. I really like science. Mum and dad said if
you really like science we can go to a space place. There are really tricky words in science.
Tiana’s SWOT is shown in Table 4.5, in which the researcher recorded Tiana’s comments as well as her own observations.
reading self-concept, which may affect their reading motivation and hence hamper
their reading performance. To better understand the views and perceptions of some of
these students, we shared findings from three of the Year 4 students who participated
in a focus group as well as individual interviews. They provided interesting feedback
related to their self-perception as readers including the need to have choice in what
they read, to receive support to improve their confidence towards reading, and to
have the option of being able to change reading groups when possible. Generally,
the students enjoyed reading outside of school but found reading practices in school
stressful at times and often uninteresting. These findings are useful for teachers,
the leadership team, and parents in relation to approaches to improving students’
reading self-concept.
References
Cartwright, K. B., Marshall, T. R., & Wray, E. (2016). A longitudinal study of the role of reading
motivation in primary students’ reading comprehension: Implications for a less simple view of
reading. Reading Psychology, 37(1), 55–91. https://doi.org/10.1080/02702711.2014.991481.
Chapman, J. W., & Tunmer, W. E. (1995). Development of young children’s reading self-concepts:
An examination of emerging subcomponents and their relationship with reading achievement.
Journal of Educational Psychology, 87(1), 154–167. https://doi.org/10.1037//0022-0663.87.
1.154.
Semel, E., Wiig, E. H., & Secord, W. A. (2006). Clinical evaluation of language fundamentals—
fourth edition—Australian (4th ed.). Marrickville: Harcourt Assessment.
Snowling, M. J., Stothard, S. E., Clarke, P., Bowyer-Crane, C., Harrington, A., Truelove, E., &
Hulme, C. (2012). York assessment of reading for comprehension (YARC), (Australian ed.).
London: GL Assessment.
Open Access This chapter is licensed under the terms of the Creative Commons Attribution 4.0
International License (http://creativecommons.org/licenses/by/4.0/), which permits use, sharing,
adaptation, distribution and reproduction in any medium or format, as long as you give appropriate
credit to the original author(s) and the source, provide a link to the Creative Commons license and
indicate if changes were made.
The images or other third party material in this chapter are included in the chapter’s Creative
Commons license, unless indicated otherwise in a credit line to the material. If material is not
included in the chapter’s Creative Commons license and your intended use is not permitted by
statutory regulation or exceeds the permitted use, you will need to obtain permission directly from
the copyright holder.
Chapter 5
Intervention Initiatives Across Three
Levels of Instruction
The National Reading Panel (2000) stressed the importance of vocabulary instruction for reading
success, and hence building teacher capacity in explicit and systematic vocabulary
instruction contributes to a language-rich teaching and learning environment.
Research shows that the ability to acquire and express spoken vocabulary is
key to improving and sustaining reading comprehension. The size of vocabulary, that
is, the number and variety of words that students know, is a significant predictor
of reading comprehension in the middle and secondary years of schooling, and of
broader academic and vocational success (Beck, McKeown, & Kucan, 2013; Clarke,
Truelove, Hulme, & Snowling, 2014). Students who lack adequate vocabulary have
difficulty getting meaning from what they read. As a result, they may read less
because they find reading difficult. Weak word recognition skills (including phonemic
awareness, phonics, and fluency) also widen the gap between how much good
and poor readers read and how much new vocabulary they encounter. As a result, poor
readers learn fewer words because they are not reading widely enough to encounter
and learn new words.
Given this reciprocal relationship between reading and vocabulary growth, and
the difficulties faced by struggling readers particularly relating to exposure, explicit
instruction in vocabulary is considered one important intervention approach. Explicit
or robust vocabulary teaching provides explanations of word meaning, across varied
contexts, as well as multiple opportunities to explore and apply the words, which, in
turn, can add substantially to the vocabulary growth of all students (Beck, McKeown,
& Kucan, 2008). This teaching assists students to grow as readers and thinkers in both
fiction and non-fiction, develops a deeper understanding of the words and concepts
students are partially aware of, nurtures understanding of new concepts, increases
reading comprehension, and enhances both oral and written communication skills
(Allen, 1999). For this reason, Robust Vocabulary Instruction was conducted at the
Tier 1 level of classroom support to facilitate vocabulary knowledge for all students.
What it means to ‘know’ a word is not a simple notion. Word learning is incremental,
that is, understanding a word is usually partial at first and grows with repeated
exposures. Dale and O’Rourke (1986) conceptualised word learning as being along
a continuum, ranging from never having seen or heard the word before, to having
a deep knowledge of the word and its different meanings, and the ability to use the
word confidently and accurately in speaking and writing contexts.
As outlined by Beck et al. (2013), research findings point to the need to create
classrooms that support and encourage sophisticated word usage through a rich oral
language environment characterised by:
• Multiple encounters (modelling and practice) in a variety of contexts;
• Rich and extensive opportunities to practise using new words that promote deep
processing and more complex levels of understanding;
• Ample structured reviews to revisit learned words within and across lessons;
• Numerous opportunities to learn and reinforce vocabulary through wide independent reading;
• Nurturing an appreciation for words and how they are used; and
• Explicitly taught word meanings using clear, consistent, and understandable
language.
The current project adopted the vocabulary instruction model proposed by Beck
et al. (2013), in which vocabulary is classified into three tiers according to a word’s
frequency of use, complexity, and meaning. This classification of words into ‘tiers’
is based on the premise that not all words have equal importance when it comes to
recommended instructional time.
Tier 1 words, or basic words, are words that usually develop without help, even
if slowly. These words are seldom explicitly taught. Words such as ‘dog’, ‘red’, and
‘big’ would be classified as Tier 1 words, for example. Tier 2 words, or interesting
words, are very important because of the role they play in literacy. Tier 2 words
are the words that characterise written text—but are not so common in everyday
conversation. What this means is that learners are less likely to be exposed to these
words during everyday conversations. The opportunities to learn Tier 2 words come
mainly from interaction with text. Because gaining meaning from written context is
more difficult than gaining meaning from oral contexts, learners are less likely to learn
Tier 2 words on their own in comparison with the words of everyday oral language.
For example, words such as ‘fortunate’, ‘ordinary’, ‘wonderful’, and ‘plead’ would
be classified as Tier 2 words. Tier 3 words are generally subject- or domain-specific
and as such are not in frequent use. These words are best learned if and when a specific
need arises. ‘Isotope’, ‘conduit’, and ‘beaker’ are examples of Tier 3 words.
In choosing words for instruction, McGregor and Duff (2015) suggested asking the
following questions:
• Is this word more likely to occur in written language than in spoken language?
• Would this word occur across various subject areas?
• Can the word be explained in student-friendly terms? A student-friendly explanation involves explaining a word’s meaning in everyday, connected language.
Words selected for explicit instruction should be drawn from the curriculum, from
texts or books read in class, or from assessment materials. The words targeted in the
Reading Success project were Tier 2 words drawn from Curriculum into the Classroom
(C2C) texts, selected jointly by the speech pathologist and classroom teacher. These
C2C materials included texts or books that would be read in class throughout the
school term.
Intervention All school staff (across all year levels), including the leadership
team, attended a 1-hour professional development (PD) session on Robust Vocabu-
lary Instruction at the beginning of the school year. This PD covered the relationship
between vocabulary and reading comprehension, Beck’s tiers of vocabulary, word
selection, student-friendly definitions, and the steps in Robust Vocabulary Instruction.
The Reading Success project then focused on Robust Vocabulary Instruction being
implemented within the Year 5 cohort. As part of this project, two Year 5 classrooms
were ‘control classes’, meaning the teachers received the whole-school training at the
beginning of the school year but did not actively complete the programme in their
classrooms. The speech pathologist and the class teacher implemented Robust Vocabulary
Instruction within two ‘intervention classes’ each week.
The intervention was provided at the whole-class level and included adhering to
the nine steps for introducing a new word (see Appendix for a lesson plan example).
Each week, six words were introduced. In the first four weeks, the speech pathologist
introduced 3–4 words and the teacher observed. The teacher then completed the
remaining 2–3 introductions and the follow-up activities for all words throughout the
week, with one follow-up session being demonstrated by the speech pathologist. Follow-up
activities aimed to provide the students with opportunities to: (1) use the Robust
word in multiple contexts, (2) create links with other words, (3) use the Robust word
in a sentence to show its meaning, and (4) ensure repeated exposure to the word. For
example, the teacher may (1) ask the student to think of a person or job that relates to
the word; (2) ask the students to think of a word that has the opposite meaning;
(3) provide an object or topic and ask the student to make a sentence related to the
topic and containing the Robust word; or (4) ask the student to describe the Robust
word so that other students can guess what it is. For more examples of activities, please refer to the two books authored by
Beck and colleagues (Beck et al., 2008, 2013). From week 5 onwards, the teachers
were responsible for introducing all vocabulary words and completing the follow-up
activities for the words during the week. To support fidelity of implementation across
classrooms, an observation checklist was developed and completed by the speech
pathologist in each classroom during the implementation of the programme.
Assessments All students participated in pre- and post-testing of 12 Tier 2 words,
in the first week (week 1) and final week (week 10) of each school term. The words
included in the testing comprised both Tier 2 words that would be targeted within the
Robust Vocabulary Instruction that term (as described above) and also control words
that would not be explicitly taught. The list of Tier 2 words that were included in
the pre- and post-testing for Term 1 is shown in Table 5.1. Students were provided
with a sheet of paper that included the target word, and space for writing their own
definition of the target word and a sentence containing that word. Students were then
given the following instructions:
Please write your name at the top of the sheet.
I will read each word and then I would like you to write what you think the word means in
the first box. The meaning could be just one word that means the same as the word I read, or
it could be a few words that explain the meaning. I would then like you to write a sentence
that uses the word (read example). This is not a spelling test and it doesn’t matter if you spell
words incorrectly in your sentences. Just have a go. If I read the next word and you haven’t
finished your sentence, you can come back to it later. You might not know some of these
words, so I would just like you to do your best and have a go.
Table 5.1 The list of 12 Tier 2 words that were used for pre- and post-assessment

No | Word | Meaning | Sentence
Ex. | Delighted | Very happy | I was delighted when my mum bought my favourite cake for my birthday
1 | Reliable | |
2 | Guarantee | |
3 | Diversity | |
4 | Attitude | |
5 | Genuine | |
6 | Raucous | |
7 | Imposing | |
8 | Colony | |
9 | Dwellers | |
10 | Menace | |
11 | Prevent | |
12 | Vulnerable | |
Table 5.2 Score guide for robust vocabulary pre- and post-assessments

Points | Description | Relation to robust vocabulary instruction
0 | Not able to provide a sentence or definition that reflects the meaning of the word | Never heard it before / Heard it, but don’t know what it means
1 | Can provide a sentence that demonstrates some knowledge of the word. Has some difficulties defining the word | In context, I know it has something to do with…
2 | Can provide a sentence and definition which reflects the true meaning of the word | Know it and use it
Student responses for each of the 12 words were scored from 0 to 2 according to the
procedure outlined in Table 5.2. A total pre- and post-test score was then tallied for
each student.
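The tallying step can be sketched in a few lines. The snippet below (Python; the rubric scores shown are hypothetical examples, though the word list follows Table 5.1) sums a student’s 0–2 scores into a total:

```python
# Rubric scores (0-2, per Table 5.2) for one student's pre-test.
# The scores here are hypothetical, not project data.
pre_scores = {"reliable": 1, "guarantee": 0, "diversity": 0, "attitude": 2,
              "genuine": 1, "raucous": 0, "imposing": 0, "colony": 1,
              "dwellers": 0, "menace": 1, "prevent": 2, "vulnerable": 0}

total = sum(pre_scores.values())
max_score = 2 * len(pre_scores)  # 12 words x 2 points each
print(f"{total}/{max_score}")
```

The same tally is repeated at post-test, giving one pre score and one post score per student.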
Table 5.3 Group performance on target and control words prior to and following the intervention

Groups | Target words pre | Target words post | Effect size | Control words pre | Control words post | Effect size
Intervention classes | 3.53 (2.0), range 0–9 | 6.85 (2.4), range 1–12 | 2.023* | 3.18 (2.1), range 0–8 | 4.1 (2.1), range 0–8 | 0.582*
Control classes | 3.47 (1.7), range 0–7 | 4.61 (1.8), range 1–7 | 0.753* | 3.0 (1.4), range 1–7 | 3.25 (1.8), range 0–8 | 0.190

Note Max score is 12. * significant improvement in performance from pre to post (p < 0.05)
This means the maximum score for this task was 24 (12 for the target words and 12 for
the control words). Table 5.3 provides an overview of the results. To determine if
the differences between the two groups were clinically significant, i.e. observable
in the classroom, effect sizes were calculated and reported using Cohen’s d (Cohen,
1988). Following Cohen’s guidelines, d = 0.2 is considered a ‘small’ effect size,
0.5 represents a ‘medium’ effect size, and 0.8 represents a ‘large’ effect size. As
shown in Table 5.3, although both groups showed a significant improvement on the
target words following the school term, the intervention classes showed a much larger
improvement, with a very large effect size. Only the intervention classes showed a
significant improvement on the control words, with a large effect size.
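The Cohen’s d calculation behind these comparisons can be sketched in a few lines. The snippet below (Python; illustrative only, since the chapter does not report which pooled-variance formula was used, so the result will not exactly reproduce Table 5.3) computes d from two means and standard deviations and applies Cohen’s rule-of-thumb labels:

```python
import math

def cohens_d(mean_1: float, sd_1: float, mean_2: float, sd_2: float) -> float:
    """Cohen's d from summary statistics, assuming equal group sizes.

    Uses the simple pooled SD sqrt((sd_1^2 + sd_2^2) / 2); other variants
    (e.g. correcting for the pre-post correlation) give different values.
    """
    pooled_sd = math.sqrt((sd_1 ** 2 + sd_2 ** 2) / 2)
    return (mean_2 - mean_1) / pooled_sd

def label(d: float) -> str:
    """Cohen's (1988) rule-of-thumb interpretation."""
    d = abs(d)
    if d >= 0.8:
        return "large"
    if d >= 0.5:
        return "medium"
    if d >= 0.2:
        return "small"
    return "negligible"

# Pre/post means (SDs) for the intervention classes' target words (Table 5.3)
d = cohens_d(3.53, 2.0, 6.85, 2.4)
print(f"d = {d:.2f} ({label(d)})")
```

Even with this simpler formula, the pre-to-post change for the intervention classes’ target words comes out well above the 0.8 threshold for a large effect.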
5.1.3 Discussion
The results from this small-scale investigation showed that a Robust Vocabulary
Approach, using an integrated service delivery model in which speech pathologists
and teachers work collaboratively to support vocabulary instruction, was effective in
enhancing student performance on a vocabulary task in which students were asked
to demonstrate their understanding of Tier 2 words. Not only did the students in
the intervention classrooms show better performance than their peers in the control
classes on the target words post-intervention, they also demonstrated better performance
on the control words. These results suggest that Robust Vocabulary Instruction may indeed
‘kindle a lifelong fascination with words’ (Beck et al., 2013).
Based on the students’ performance on the YARC reading accuracy subtest (Snowl-
ing et al., 2012), as outlined in Chap. 2 (step 3), combined with their performance on
tasks tapping the skills needed for efficient word recognition, 12 students from
the Year 4 cohort were invited to participate in an intervention programme aimed at
improving their word recognition skills. Students were assessed using the Lindamood
Auditory Conceptualization Test (LAC-3; Lindamood & Lindamood, 2004), which measures
the student’s ability to discriminate speech sounds (phonemes) and to analyse the
number and order of sounds in
spoken patterns. Students are asked to demonstrate their knowledge by matching the
number and the colour of small blocks to the number and patterns of sounds (e.g.
show me /d//d/: two blocks, same colour; show me /m//ch//ch/, three blocks, first
block a different colour to the next two blocks). According to the LAC manual, at the
start of Year 5, the minimum recommended score is 86. All students scored below the
minimum recommended score. There were no differences in performance between
the intervention and the control group.
Students completed a six-week programme comprising two sessions per week: one
30-minute individual session and one 60-minute group session (three students). Each
session covered two components: (1) phonological processing and (2) orthographic
knowledge. Although all students completed similar activities (as described below),
the specific phonemes or phoneme combinations that were targeted in both compo-
nents of the intervention were based on the students’ performance on the LeST. The
two components of the intervention will now be discussed in more detail below.
Phonological processing This programme explicitly targeted students’ phonological
processing skills and was firmly based on Gillon and Dodd’s (1995) work, in
which ten students with significant reading difficulties participated. The programme
itself is based on the Lindamood Phoneme Sequencing Programme, now into its
fourth edition (LiPS; Lindamood & Lindamood, 2011), which systematically teaches
students to segment, manipulate, and blend the speech sounds (phonemes) in syl-
lables and words. We used Gillon and Dodd’s adapted version of this programme
by using traditional letter names to teach the students to encode sounds in syllables
(using simple syllable sets, simple syllable chains, and complex syllables). We used
the metalinguistic approach recommended in the LiPS programme which drew the
students’ attention to changes in syllables by explicitly describing these during the
activities. The sessions involved reading and writing (of real and nonsense words) to
ensure a transfer of segmentation and blending skills to reading and writing. Finally,
as per the process outlined in the LiPS manual, the sessions included the teaching
of some basic spelling rules (magic /e/; the /c/ in reading; two vowels go walking).
Figure 5.1 provides an overview of the six-week programme.
Progress tracking sheets were used to monitor progress and to ensure students
only moved on to the next level (simple syllables, CVC, varied, shifts) when they
achieved 70–80% success at a certain level. Once a student was proficient with the
coloured blocks, i.e. could quickly and accurately work through all levels using the
coloured blocks and described the changes that were made with confidence, the
student would only work with letter tiles. Use of the progress tracking sheets allowed
for an individualised approach.
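The progression rule embodied in the tracking sheets can be expressed as a simple mastery check. A minimal illustration (Python; the level names follow the text, but the function and data layout are hypothetical, not part of the programme materials):

```python
# Ordered levels used in the progress tracking sheets
LEVELS = ["simple syllables", "CVC", "varied", "shifts"]

def next_level(current: str, correct: int, attempted: int,
               threshold: float = 0.8) -> str:
    """Return the level a student should work on next.

    A student moves on only after reaching the accuracy threshold at the
    current level (the programme used 70-80%; 80% is assumed here).
    """
    accuracy = correct / attempted if attempted else 0.0
    idx = LEVELS.index(current)
    if accuracy >= threshold and idx + 1 < len(LEVELS):
        return LEVELS[idx + 1]
    return current  # stay at this level and keep practising

print(next_level("CVC", 9, 10))  # 90% accuracy: advance
print(next_level("CVC", 6, 10))  # 60% accuracy: stay
```

Keeping the check per student is what makes the approach individualised: two students in the same group can sit at different levels in the same session.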
Week Activities
Week 1 1. Tracking speech sound changes – coloured blocks, simple syllables (VC
and CV).
2. Tracking speech sound changes – letter tiles, simple syllables.
3. Writing and reading simple syllables (game).
Week 2 1. Tracking speech sound changes with blocks, move from VC and CV to
CVC and Varied.
2. Tracking speech sound changes with letter tiles as above.
3. Writing and reading simple syllables (game).
Week 3 1. Tracking speech sound changes with blocks, varied syllables, using a
timer.
2. Tracking speech sound changes with letter tiles – aiming for 80%
accuracy on CVC before moving to CCVC.
3. Writing and reading CVC words (game).
Week 4 1. Tracking speech sound changes with blocks, some students move on to
shifts (e.g. ask to aks). Use a timer.
2. Tracking speech sound changes with letter tiles, using individualised
tracking sheets to match students’ orthographic knowledge.
3. Real word reading (CVC) – cut out and put face up on the desk. Read as
many as you can in 1 min.
4. Real word and non word (CVC) spelling – write on cards.
5. Teach the magic /e/ rule: write the pairs on cards (e.g. bit – bite).
6. Board / bingo game – using the cards.
Week 5 1. Tracking speech sounds with blocks (only those students who had not
yet achieved this level). Use a timer.
2. Tracking speech sound changes with tiles.
3. Real word reading in 1 min (using cards).
4. Simple spelling (CVC) – write on cards.
5. Teach the ‘C in Reading’ rule – write on cards.
6. Reading the cards (game).
Week 6 1. Tracking speech sound changes with blocks – improve your time.
2. Tracking speech sound changes with letter tiles – improve your time.
3. Real word reading (include magic /e/ and soft /c/ rules).
4. Teach the two vowels go walking rule – create cards.
5. Game – read the cards.
Fig. 5.1 Overview of the six-week programme targeting phonological processing skills
Table 5.5 Mean group performance before and after intervention (SD in brackets)

Measures | Intervention pre | Intervention post | Control pre | Control post
Orthographic knowledge: LeST raw score | 32 (5.4) | 46.3 (3.7) | 33.3 (6.7) | 34.8 (7.8)
Single word reading: CC-2 regular words raw score | 24.7 (6.9) | 28.5 (6.5) | 24.8 (7.5) | 25.3 (7.6)
Single word reading: CC-2 irregular words raw score | 16.8 (2.6) | 17.8 (1.7) | 17.2 (3.2) | 19.2 (5.3)

Note Shading indicates significant progress from pre- to post-intervention p < 0.05; # n = 5 as
missing non-word data for B39
All students were re-assessed after the six weeks of intervention on measures of
orthographic knowledge (LeST), single word reading (CC-2), and reading accuracy
(YARC RA). Repeated measures t-tests were used to calculate changes in perfor-
mance from pre- to post-intervention. As shown in Table 5.5, the students in the
intervention group demonstrated significantly greater gains (p < 0.05) on measures
of letter-sound knowledge and single word reading (regular and non-words). No
significant group differences were found on measures of irregular word reading or
reading accuracy.
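A repeated measures (paired) t-test of this kind can be computed directly from the pre-post difference scores. A brief sketch (Python standard library; the scores shown are hypothetical, not the project’s data):

```python
import math
from statistics import mean, stdev

def paired_t(pre, post):
    """Paired-samples t statistic and degrees of freedom.

    t = mean(d) / (sd(d) / sqrt(n)), where d holds the post-minus-pre
    difference scores and df = n - 1.
    """
    diffs = [b - a for a, b in zip(pre, post)]
    n = len(diffs)
    t = mean(diffs) / (stdev(diffs) / math.sqrt(n))
    return t, n - 1

# Hypothetical raw scores for six students (not the project's data)
pre = [30, 32, 28, 35, 31, 33]
post = [44, 47, 41, 50, 45, 46]
t, df = paired_t(pre, post)
print(f"t({df}) = {t:.2f}")
```

Converting the t statistic to a p-value requires the t distribution (e.g. via scipy.stats), which is omitted from this stdlib-only sketch.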
To determine if the differences between the two groups were clinically significant,
i.e. observable in the classroom, effect sizes were calculated and reported using
Cohen’s d (Cohen, 1988). Following Cohen’s guidelines, d = 0.2 is considered a
‘small’ effect size, 0.5 represents a ‘medium’ effect size, and 0.8 represents a ‘large’
effect size. As shown in Fig. 5.2, although both groups demonstrated progress over
time, the intervention group made significantly more progress than the control
group on measures of orthographic knowledge, regular word, and non-word reading.
5.2.3 Discussion
[Figure: bar chart of effect sizes (Cohen’s d) for the intervention and control groups on the LeST, CC-2 regular words, CC-2 irregular words, CC-2 non-words, and YARC Reading Accuracy; plotted values range from 0.095 to 4.914.]
Fig. 5.2 Effect size comparison (Cohen’s d) of the progress made in the intervention versus control
group following six weeks of intervention
Although the students in the intervention group had made significant progress in
single word reading (improving more than 1 z-score), closer inspection showed that
many of the students still scored significantly below expectations (i.e. z-score < −1).
These results raise important questions regarding the timing of the interven-
tion (Year 5). Research suggests intervention targeting word recognition difficul-
ties is more effective during the earlier school grades than during the later years
(see Wanzek & Vaughn, 2007). Early detection of reading difficulties is possible as
long as sensitive and specific assessment tasks are used (see Chap. 3). By using an
RtI approach, students showing early signs of dyslexia (i.e. fail to make satisfac-
tory progress in word recognition despite high-quality classroom instruction) can
then receive timely intervention. Another important issue to consider is the dura-
tion of the intervention. Although little is known about the exact dosage needed to
effect more significant changes in reading accuracy, research suggests more exten-
sive intervention is needed than the six weeks we provided as part of the Reading
Success project (see Al Otaiba, Gillespie Rouse, & Baker, 2018). Finally, it is not
clear what the active ingredients were as the intervention contained a combination
of phonological processing and orthographic knowledge tasks, using both examiner-
and computer-assisted instructional methods.
Similar to the Clarke et al. (2010) study, all sessions contained a range of evidence-
based techniques, including “comprehension monitoring, cooperative learning,
graphic organizers for story structure training, question answering and generat-
ing, summarisation, and multiple-strategy teaching” (p. 1108). Each session had the
same structure and contained the four components of vocabulary, graphic organiser,
reciprocal teaching, (figurative language, if applicable), and spoken expository (see
Table 5.7). All sessions adhered to the following principles: (a) rich interaction and
high-quality contextualised discussion; (b) integrating opportunities for relating
material to personal experiences; and (c) exploration of vocabulary and spoken expository
Table 5.6 Student performance prior to (end of Year 4) and following the intervention (mid Year 5)

Groups | Age | Pre: YARC RA SS | Pre: YARC RC SS | Pre: YARC RC AE | Pre: CELF-4 USP SS | Post: YARC RC SS | Post: YARC RC AE | Post: CELF-4 USP SS
Intervention
S31 | 9;03 | 97 | 84 | 7;10 | 6 | 79 | 7;08 | 3
S74 | 10;03 | 95 | 83 | 8;03 | 3 | 101 | 11;01 | 7
S49 | 9;09 | 93 | 75 | 7;01 | 6 | 96 | 10;0 | 7
S57 | 10;03 | 89 | 79 | 7;10 | 10 | 88 | 9;03 | 12
Mean (SD) | | | 80.3 (4.1) | 7;11 (0;06) | 6.25 | 91.0 (9.6) | 9;06 (1;04) |
YARC RC AE pre-post: p = 0.08, g = 1.27; YARC RC SS pre-post: p = 0.163, g = 1.06
Control
S68 | 10;03 | 93 | 75 | 7;05 | 5 | <70 | 7;01 | 3
S56 | 9;11 | 97 | 83 | 7;10 | 6 | 85 | 8;09 | 5
S48 | 10;03 | 85 | 83 | 8;03 | 5 | 74 | 7;08 | 10
S23 | 9;06 | 91 | 73 | 6;08 | 11 | <70 | 5;07 | 11
Mean (SD) | | | 78.5 (5.3) | 7;07 (0;08) | | 74.8 (7.1) | 7;04 (1;02) |
YARC RC AE pre-post: p = 0.65, g = 0.138; YARC RC SS pre-post: p = 0.200, g = 0.43

Note YARC York Assessment of Reading for Comprehension, reading accuracy (RA), reading comprehension (RC); USP Understanding Spoken Paragraphs from the Clinical Evaluation of Language Fundamentals (CELF-4); SS standard score; AE age equivalence; g = Hedges’ g
Table 5.7 Session overview of expository language intervention programme (adapted from Clarke
et al., 2010)

Duration | Component | Session 1 | Session 2
3 min | Introduction | | Summary
5 min | Vocabulary | Introduce a Tier 2 word, related to the topic, using a Robust Vocabulary Approach (Beck et al., 2013). Learning additional words through context | Ask students to come up with a synonym and a sentence to demonstrate understanding
10 min | Graphic organiser | Introduce the passage, article, or YouTube clip. Explain the type of expository it is and explain why | Repeat
10 min | Reciprocal teaching | Listen to the expository passage and write down keywords on the graphic organiser. Students raise their hand if they do not understand a concept, sentence, or passage. Model this behaviour. Question generation activity: what, when, how, where, who, why, can, which? Activate background knowledge and develop visual representations; use reciprocal teaching strategies | Keywords or key phrases only. Use a game with students drawing question cards they need to answer
5 min | Figurative language | If applicable. Explore non-literal language such as jokes, riddles, and idioms | Repeat
10–20 min | Spoken expository | Using the graphic organiser, one student provides an oral summary; others provide feedback. Scaffolded by the SLP. Practice is key | Students take turns and provide each other with feedback. Students are given a graphic organiser each with a pencil so they can tick if the other student has included the concepts/descriptors
3 min | Plenary | Wrap-up. Today we looked at… |
through varied games and activities, as well as worksheets. For further details see
also Clarke et al. (2014).
The following five types of expository discourse were targeted during the intervention:
(1) description, (2) procedure sequence, (3) comparison, (4) cause and effect,
and (5) problem and solution. Each type of discourse had an accompanying graphic
organiser, which was printed on A3 paper and used during the session. Graphic organ-
isers were sourced online, for example, by performing a Google search or by visiting
the www.readwritethink.org website. All five types of expository discourse were
briefly introduced during the first session, and students were informed they would
focus on a different type each week. The topic or content of the expository passages
was matched to the topics that were covered in the classroom during those six weeks.
For example, Unit 3.1 focused on ‘the Riddle of the Black Panther–The Search’
(Education Services Australia). As a consequence, the first expository session focused on
the Black Panther (What’s a Black Panther, Really? National Geographic, 2015).
Other sessions included comparing rugby union to soccer, why native goannas are
dying (invaders), and flying foxes.
In week 6, the content of the previous five weeks was reviewed by introducing a
topic (in this case ‘flying foxes’) and asking students what types of expository text
they could think of for the same topic, using the graphic organisers as prompts.
All students were re-assessed after the intervention on Form A of the YARC to investi-
gate their reading comprehension performance (standard score and age equivalence).
Repeated measures t-tests were used to calculate changes in performance from pre-
(i.e. at the end of Year 4) to post-intervention. As shown in Table 5.6, the students
in the intervention group demonstrated larger gains in reading comprehension than
the control group who participated in the regular classroom activities, as shown by
the effect size (Hedges’ g, whereby a small effect [cannot be discerned by the naked
eye] = 0.2; medium effect = 0.5; large effect [can be seen by the naked eye] = 0.8).
Although the progress in reading comprehension made by the intervention group
was not statistically significant (p > 0.05), the intervention group’s performance
showed a large effect size for both their standard score and their age equivalence
(i.e. > 1 standard deviation change) compared to the control group who showed < 0.4
standard deviation change. These results indicate the experiment was underpowered
(very small sample size), but also suggest the students in the intervention group made
noticeable gains in reading comprehension following the intervention.
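Hedges’ g is Cohen’s d with a correction for small samples, which matters here with only four students per group. A minimal sketch (Python; the widely used approximation J = 1 - 3/(4*df - 1) is assumed, so the result is illustrative rather than an exact reproduction of the table):

```python
import math

def hedges_g(m1: float, sd1: float, n1: int,
             m2: float, sd2: float, n2: int) -> float:
    """Hedges' g: Cohen's d with a small-sample bias correction.

    The pooled SD weights each group's variance by its degrees of
    freedom; the correction J = 1 - 3 / (4*df - 1) shrinks d toward
    zero for small samples (df = n1 + n2 - 2).
    """
    df = n1 + n2 - 2
    pooled_sd = math.sqrt(((n1 - 1) * sd1 ** 2 + (n2 - 1) * sd2 ** 2) / df)
    d = (m2 - m1) / pooled_sd
    correction = 1 - 3 / (4 * df - 1)
    return d * correction

# Intervention group's pre vs. post YARC RC standard scores (Table 5.6)
g = hedges_g(80.3, 4.1, 4, 91.0, 9.6, 4)
print(f"g = {g:.2f}")
```

With these summary statistics the sketch gives g of roughly 1.26, consistent with the large pre-post effects reported for the intervention group.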
5.3.3 Discussion
The results from this pilot study showed the potential effectiveness of a short intensive
intervention aimed at enhancing students’ expository structure knowledge. Although
replication is needed with larger numbers, our results are in line with those from
previous studies (e.g. Clarke et al., 2010).
In response to the high literacy needs of the middle year students, the school implemented
a supplementary whole-class oral language and emergent literacy intervention
in the foundation year of schooling. This programme was based on Read It
Again-PreK (Justice & McGinty, 2010), which is freely available online (see reference
list), but was adapted, with permission, for the local context. The adapted version
is called Read It Again—FoundationQ! (Department of Education, Training and
Employment, 2013), and can be downloaded for free from the same website (see
reference list).
Read It Again is firmly based on current research regarding how adults can support
children’s language and literacy development using systematic and explicit instruc-
tion presented in highly meaningful literacy events such as storybook reading. A
key feature of Read It Again is the repeated use of children’s storybooks as a way
to enhance language and literacy development. Studies indicate that repeated book
reading influences both story-related vocabulary and story-related comprehension
and that the average effect size for the relationship between repeated book reading
and outcomes is larger when a book is read four or more times (Trivette, Simkus,
Dunst, & Hamby, 2012; Zucker, Cabell, Justice, Pentimonti, & Kaderavek, 2013).
FoundationQ! is aligned to the Australian curriculum for the foundation (first)
year of schooling and can be delivered as either a Tier 1 (differentiated) or Tier 2
(focused) teaching strategy within a response-to-intervention model. In this project,
FoundationQ! was delivered as a Tier 1 strategy across all Prep (foundation year
of schooling) classrooms. As explained in Chap. 1, differentiating instruction is a
critical feature of a response-to-intervention model and advocates for active planning
for student differences to ensure that every student is engaged, participating, and
learning successfully (Goddard, Goddard, & Tschannen-Moran, 2007; Tomlinson,
2000).
Read It Again—FoundationQ! incorporates six differentiation strategies (three
Too Easy strategies and three Too Hard strategies) that educators can use to scaf-
fold students’ performance on similar tasks or activities. To enhance teachers’ use
of the differentiated instructional strategies, a capacity-building model including
coaching by the speech pathologist was implemented. The provision of coaching by
speech pathologists is a natural extension of their consultative and collaborative ser-
vices within a response-to-intervention framework and has been found to be effective
when combined with in-service workshops (Milburn, Weitzman, Greenberg, &
Girolametto, 2014). The professional development programme employed a
combination of a 1.5 h workshop that explained the intent, content, and structure of
the programme and provided opportunities to identify and apply the Too Easy and
Too Hard differentiation strategies; and individual coaching sessions incorporating
demonstration lessons, scheduled observations, and instructional feedback.
To increase teachers’ awareness of individual student needs and to assist teach-
ers in differentiating instruction, Read It Again—FoundationQ! includes a student
progress checklist (see the website for a copy), which measures individual students’
progress against the learning objectives in each of the four domains specific to Foun-
dationQ!. The checklists are administered at three separate points (after week 2, week
12, and week 21) during the 30-week intervention period. Development of skills is
rated by teachers as:
– Acquiring: student never or occasionally demonstrates the skill
– Building: student often demonstrates the skill, but is not yet consistent and/or
requires assistance, or
– Competent: student consistently demonstrates the skill
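As an illustration of how these checkpoint ratings might inform differentiation decisions, the sketch below flags students who remain at the lowest rating with no upward movement. The student IDs, ratings, and flagging rule are invented for illustration and are not part of Read It Again—FoundationQ!.

```python
# Hypothetical sketch of how the three checkpoint ratings (after weeks 2,
# 12, and 21) might be tallied to flag students for the Too Hard
# differentiation strategies; IDs, ratings, and the rule are invented.
RATING_ORDER = {"Acquiring": 0, "Building": 1, "Competent": 2}

# ratings per student for one learning objective, in checkpoint order
checklist = {
    "S01": ["Acquiring", "Building", "Competent"],
    "S02": ["Acquiring", "Acquiring", "Acquiring"],
    "S03": ["Building", "Building", "Building"],
}

def needs_support(ratings):
    """Flag students not yet Competent who have made no upward movement."""
    scores = [RATING_ORDER[r] for r in ratings]
    return scores[-1] < RATING_ORDER["Competent"] and scores[-1] <= scores[0]

flagged = [s for s, r in checklist.items() if needs_support(r)]
print(flagged)  # ['S02', 'S03']
```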
[Fig. 5.3 bar chart: data labels (% within normal limits) 94.3, 91.5, and 85.5 for the Year 2 post-RIA cohort and 69.2, 60.3, and 56.4 for the Year 4 pre-RIA cohort, across the YARC reading comprehension (RC), reading accuracy (RA), and reading rate (RR) measures]
Fig. 5.3 Cohort (Year 2 vs. Year 4) performance on the YARC pre- and post-implementation of Read
It Again (RIA): percentage of students performing within normal limits in reading comprehension,
reading accuracy, and reading rate
We compared the performance of the Year 4 cohort (n = 78), who had not participated in the
Read It Again—FoundationQ! programme (see Chap. 3 for specific cohort results), with that of
the Year 2 cohort (n = 69; see Chap. 3 for specific cohort results when these students were in
Year 1) on the YARC. On paper, these cohorts were similar: the Year 4 cohort comprised 6%
Indigenous students and 31% students with English as a second language (ESL; school ICSEA
score 1005), while the Year 2 cohort comprised 6% Indigenous students and 35% ESL students
(ICSEA, 2013).
As shown in Fig. 5.3, a significantly higher percentage of students in Year 2
performed within normal limits (i.e. SS ≥ 85) on the YARC reading comprehension
(91.5% vs. 57.1%). Similar results were seen for reading accuracy and reading rate.
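The within-normal-limits criterion used here (SS ≥ 85 on a test with mean 100 and SD 15, i.e. no more than 1 SD below the mean) can be sketched as follows. This is our own illustration, not the project's analysis code, and the cohort scores are hypothetical.

```python
# Minimal sketch of the within-normal-limits percentage reported in
# Fig. 5.3; the standard scores below are hypothetical.
def pct_within_normal_limits(standard_scores, cutoff=85):
    """Percentage of scores at or above the cutoff (SS >= 85, i.e. no
    more than 1 SD below the mean on a mean-100, SD-15 scale)."""
    wnl = sum(1 for ss in standard_scores if ss >= cutoff)
    return 100 * wnl / len(standard_scores)

hypothetical_rc_scores = [92, 101, 84, 110, 95]
print(pct_within_normal_limits(hypothetical_rc_scores))  # 80.0
```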
5.4.3 Discussion
This chapter presented the results from four evidence-based intervention initiatives.
In the reading-to-learn cohort, Robust Vocabulary Instruction was provided at the
whole-class level, with results showing larger gains in vocabulary knowledge in the
intervention classes compared to the control classes. The other two intervention
initiatives for this cohort involved students identified with specific areas of
weakness in spoken and/or written language skills impacting their
reading comprehension performance. Expository structure intervention was provided
to a group of students with specific comprehension difficulties, whereas a group of
students with specific word recognition difficulties participated in an orthographic
knowledge and phonological processing intervention programme. Results from these
interventions were positive but modest, highlighting the importance of early identi-
fication of reading difficulties to enable more timely intervention. In the
learning-to-read cohort, Read It Again—FoundationQ! was implemented at the whole-school level
in all Prep classes. Although our research design did not allow for firm conclusions
regarding the effectiveness of this initiative, cohort mapping results showed a sig-
nificant improvement in reading performance from before to after the
implementation of Read It Again—FoundationQ! within the school. Implementation
of the five-step assessment framework introduced in this book will now be needed
to identify those students who need additional support, as our previous research has
shown that implementation of this type of whole-class supplementary oral language
and emergent literacy intervention alone may not be sufficient for long-term reading
success (Lathouras, Westerveld, & Trembath, 2019).
Introduction to a Tier 2 target word: misconception
Text: 'What's a Black Panther, Really?' by Liz Langley. Target word: misconception.

1. Text or Activity. Teacher reads the text to the students or completes another Key Learning Area activity.
2. Context. Teacher contextualises the word within the text/activity: "The article we are about to read is about a misconception."
3. Repeat. Students repeat the word: "What's the word?" (Misconception.)
4. Friendly Definition. Teacher provides a friendly definition of the word: "A misconception is an idea that is not correct. You could also call it a myth."
5. Teacher Examples. Teacher provides examples in contexts other than the one in the text/activity: "There are many fears and misconceptions about cancer"; "I think people have the misconception that restaurants make a lot of money"; "It is a misconception that you have to wait 24 h before filing a missing person report."
6. Repeat. Students repeat the word: "What's the word?" (Misconception.)
7. Interaction. Teacher creates a situation/activity where the students 'interact' with the word: "I'm going to say some things and if you think they are a misconception that kids could have, say 'misconception'; if you think they are correct, say nothing":
   – Chocolate milk comes from brown cows
   – The sky is blue
   – 1 + 1 = 2
   – If you dig a really deep hole in the ground, you will get to China
8. Students' Examples. Students provide some of their own examples: "Do you think you have ever had a misconception? You could say, I had a misconception when I thought…"
9. Repeat. Teacher says the friendly definition and students say the word again: "What's the word that means you have an idea about something that turns out not to be correct?" (Misconception.)
References
Al Otaiba, S., Gillespie Rouse, A., & Baker, K. (2018). Elementary grade intervention approaches
to treat specific learning disabilities, including dyslexia. Language, Speech, and Hearing Services
in Schools, 49(4), 829–842. https://doi.org/10.1044/2018_LSHSS-DYSLC-18-0022.
Allen, J. (1999). Words, words, words: Teaching vocabulary in Grades 4-12. Portland, ME:
Stenhouse Publishers.
Beck, I. L., McKeown, M. G., & Kucan, L. (2008). Solving problems in the teaching of literacy.
Creating robust vocabulary: Frequently asked questions and extended examples. New York, NY:
Guilford Press.
Beck, I., McKeown, M., & Kucan, L. (2013). Bringing words to life—Robust vocabulary instruction
(2nd ed.). New York: Guilford Press.
Castles, A., Coltheart, M., Larsen, L., Jones, P., Saunders, S., & McArthur, G. M. (2009). Assessing
the basic components of reading: A revision of the Castles and Coltheart test with new norms
(CC2). Retrieved from www.motif.org.au.
Clarke, P. J., Snowling, M. J., Truelove, E., & Hulme, C. (2010). Ameliorating children’s reading-
comprehension difficulties: A randomized controlled trial. Psychological Science, 21(8), 1106–
1116.
Clarke, P. J., Truelove, E., Hulme, C., & Snowling, M. (2014). Developing reading comprehension
(1st ed.). Hoboken: Wiley.
Cohen, J. (1988). Statistical power analysis for the behavioral sciences (2nd ed.). Hillsdale, NJ:
Erlbaum.
Dale, E., & O’Rourke, J. (1986). Vocabulary building: A process approach. Columbus, OH:
Zaner-Bloser.
Dawson, N., & Ricketts, J. (2017). The role of semantic knowledge in learning to read exception
words. Perspectives of the ASHA Special Interest Groups, 2(1), 95–104. https://doi.org/10.1044/
persp2.SIG1.95.
Department of Education, Training and Employment. (2013). Read It Again-FoundationQ! Queens-
land, Australia: Queensland Department of Education, Training and Employment. https://
earlychildhood.ehe.osu.edu/research/practice/read-it-again-prek/.
Gillon, G., & Dodd, B. (1995). The effects of training phonological, semantic, and syntactic pro-
cessing skills in spoken language on reading ability. Language, Speech, and Hearing Services in
Schools, 26(1), 58–68.
Gillon, G., McNeill, B., Scott, A., Denston, A., Wilson, L., Carson, K., et al. (2019). A better start
to literacy learning: Findings from a teacher-implemented intervention in children’s first year at
school. Reading and Writing. https://doi.org/10.1007/s11145-018-9933-7.
Goddard, Y. L., Goddard, R. D., & Tschannen-Moran, M. (2007). A theoretical and empirical
investigation of teacher collaboration for school improvement and student achievement in public
elementary schools. Teachers College Record, 109(4), 877–896.
Hiebert, E. H., & Kamil, M. L. (Eds.). (2005). Teaching and learning vocabulary: Bringing research
to practice. Mahwah, NJ: Erlbaum.
Justice, L., & McGinty, A. (2010). Read it again-PreK! A preschool curriculum supplement to
promote language and literacy foundations. Columbus, OH: Crane Center for Early Childhood
Research and Policy. https://earlychildhood.ehe.osu.edu/research/practice/read-it-again-prek/.
Larsen, L., Kohnen, S., Nickels, L., & McArthur, G. (2015). The Letter-Sound Test (LeST): A
reliable and valid comprehensive measure of grapheme–phoneme knowledge. Australian Journal
of Learning Difficulties, 20(2), 129–142. https://doi.org/10.1080/19404158.2015.1037323.
Lathouras, M., Westerveld, M., & Trembath, D. (2019). Longitudinal reading outcomes in response
to a book-based, whole class intervention for students from diverse cultural, linguistic and socio-
economic backgrounds. Australian Journal of Learning Difficulties, 24, 147–161. https://doi.org/
10.1080/19404158.2019.1640755.
Lennox, M., Westerveld, M. F., & Trembath, D. (2018). Evaluating the effectiveness of PrepSTART
for promoting oral language and emergent literacy skills in disadvantaged preparatory students.
International Journal of Speech-Language Pathology, 20(2), 191–201. https://doi.org/10.1080/
17549507.2016.1229030.
Lindamood, P. C., & Lindamood, P. (2004). Lindamood auditory conceptualization test: Examiner’s
manual. Austin, TX: Pro-Ed.
Lindamood, P. D., & Lindamood, P. D. (2011). Lindamood phoneme sequencing program for
reading, spelling, and speech (4th ed.). USA: Pro-Ed.
McGregor, K., & Duff, D. (2015). Promoting diverse and deep vocabulary development. In T. A.
Ukrainetz (Ed.), School-age language intervention: Evidence-based practices (pp. 247–278).
Austin, TX: Pro-Ed.
Milburn, T. F., Girolametto, L., Weitzman, E., & Greenberg, J. (2014). Enhancing preschool edu-
cators’ ability to facilitate conversations during shared book reading. Journal of Early Childhood
Literacy, 14(1), 105–140.
National Early Literacy Panel. (2008). Developing early literacy: Report of the National Early
Literacy Panel. Washington, DC: National Institute for Literacy.
National Reading Panel. (2000). Teaching children to read: An evidence-based assessment of the
scientific research literature on reading and its implications for reading instruction (NIH
Publication No. 00-4769). Washington, DC: U.S. Government Printing Office.
110 5 Intervention Initiatives Across Three Levels of Instruction
Semel, E., Wiig, E. H., & Secord, W. A. (2006). Clinical evaluation of language fundamentals—
fourth edition—Australian (4th ed.). Marrickville: Harcourt Assessment.
Snowling, M. J., Stothard, S. E., Clarke, P., Bowyer-Crane, C., Harrington, A., Truelove, E., &
Hulme, C. (2012). York assessment of reading for comprehension (YARC), (Australian ed.).
London: GL Assessment.
Tomlinson, C. A. (2000). What is differentiated instruction? In C. M. Callahan & H. L.
Hertberg-Davis (Eds.), Fundamentals of gifted education: Considering multiple perspectives
(pp. 287–300). New York, NY: Routledge.
Trivette, C. M., Simkus, A., Dunst, C. J., & Hamby, D. W. (2012). Repeated book reading and
preschoolers’ early literacy development. Center for Early Literacy Learning, 5(5), 1–13.
Wanzek, J., & Vaughn, S. (2007). Research-based implications from extensive early reading
interventions. School Psychology Review, 36(4), 541–561.
Zucker, T. A., Cabell, S. Q., Justice, L. M., Pentimonti, J. M., & Kaderavek, J. N. (2013). The role of
frequent, interactive prekindergarten shared reading in the longitudinal development of language
and literacy skills. Developmental Psychology, 49(8), 1425–1439.
Open Access This chapter is licensed under the terms of the Creative Commons Attribution 4.0
International License (http://creativecommons.org/licenses/by/4.0/), which permits use, sharing,
adaptation, distribution and reproduction in any medium or format, as long as you give appropriate
credit to the original author(s) and the source, provide a link to the Creative Commons license and
indicate if changes were made.
The images or other third party material in this chapter are included in the chapter’s Creative
Commons license, unless indicated otherwise in a credit line to the material. If material is not
included in the chapter’s Creative Commons license and your intended use is not permitted by
statutory regulation or exceeds the permitted use, you will need to obtain permission directly from
the copyright holder.
Chapter 6
Case Studies
Abstract In this chapter, we provide three case studies of students with different
reading profiles. We demonstrate how using the five-step assessment to intervention
approach explained in Chap. 2 assists in creating a detailed profile of each student’s
strengths and weaknesses in spoken and written language skills that are needed
for successful reading comprehension. We highlight the importance of collabora-
tion between professionals involved in the identification of students who experience
difficulties in reading, to avoid duplication of assessments and to ensure targeted
intervention can be provided by the most relevant professional at the required tier of
intervention within a response-to-intervention (RtI) model.
James (S46: not his real name; age 9 years, 8 months) attended Year 4 and had been at
the school since Prep (foundation year). His enrolment form identified him as speaking
English as an additional language or dialect, but his teachers reported no concerns
about his command of the English language. However, his
teachers were concerned about his reading comprehension skills and had noticed dif-
ficulties in his spelling too (see Fig. 6.1). James’ NAPLAN results (https://www.nap.
edu.au/results-and-reports/how-to-interpret) were at the national minimum standard
when he was tested in Year 3, particularly for writing and for grammar and punctuation
(reading band 3; writing band 2; spelling band 3; grammar and punctuation band 2).
James performed at age level on the PROBE 2 (Parkin & Parkin, 2011); however, his
performance on the PAT-R (Australian Council for Educational Research, 2018) was
stanine 2, indicating performance well below expectations. When administering the
Reading Self-Concept Scale (Chapman & Tunmer, 1995), James’ response indicated
average perceptions of difficulties, competence, and attitude. We investigated James’
reading skills using our five-step assessment to intervention approach, as described
in Chap. 2 (Fig. 2.2).
Further assessment of word recognition, using the Castles and Coltheart test of single
word reading (CC2; Castles, Coltheart, Larsen, Jones, Saunders, & McArthur, 2009),
showed severe difficulties in single word reading across regular, irregular, and
nonsense words (all z < −2.0). James also performed poorly in orthographic knowledge
(z = −2.32), using Year 3 norms of the Letter-Sound Test (LeST; Larsen, Kohnen,
Nickels, & McArthur, 2015).
6.1 Case Study 1: James 113
A speech-to-print profile was created based on the assessment results from Steps
1 to 3. The speech pathologist administered the Rapid Automatic Naming task of
the CELF-4 which showed performance within normal limits. In addition, informa-
tion was gathered from the classroom teacher, including James’ performance on the
Single Word Spelling Test (SWST; Sacre & Masterson, 2000) as shown in Fig. 6.1
(age equivalence 6 years, 9 months). James also participated in a curriculum-based
assessment in English, based on the novel Rowan of Rin by Emily Rodda. For this
assessment, students had to explain how the author of this novel represents the main
character in an important event. Students had to select an important event, complete
several scaffolded tasks, including writing a draft, before producing a final copy
of their response, which the teacher marked as not satisfactory for James’ level of
schooling (see Fig. 6.2). Because of the teacher’s concerns about James’ ability to
answer some of the scaffolded questions and create a coherent response, the speech
pathologist also administered an expository task (Heilmann & Malone, 2014) in
which James was asked to explain his favourite game or sport. James chose to explain
how to play soccer. The speech pathologist noticed James did not make effective use
of the planning sheet in that he did not write down any keywords on page 1, but
chose to draw a picture instead (Fig. 6.3). Although he used long sentences, he had
difficulty formulating complex ideas and his explanation lacked cohesion (i.e. “little
discernible order to topics; much jumping between topics; and abrupt transitions
between topics”). The final speech-to-print profile is shown in Fig. 6.4.
Fig. 6.4 James’ speech-to-print profile (Adapted from Gillon, 2004, with permission from the
author). Note for interpretation of scores, see Fig. 2.1 (Bell curve). Shading: white = not tested;
light grey = within normal limits/no concerns; dark grey = significant difficulties
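The shading convention described in the note can be expressed as a small classification sketch. The z ≥ −1 cutoff (equivalent to SS ≥ 85 on a mean-100, SD-15 scale) is our assumption based on the within-normal-limits convention used elsewhere in the book, not a rule stated by the authors.

```python
# Sketch of the speech-to-print profile shading logic; the z >= -1 cutoff
# is our assumption (SS >= 85 within normal limits), not a stated rule.
def z_from_standard_score(ss, mean=100, sd=15):
    """Convert a standard score to a z-score."""
    return (ss - mean) / sd

def profile_shading(z):
    """Map a z-score (or None for untested skills) to a shading label."""
    if z is None:
        return "white: not tested"
    if z >= -1:
        return "light grey: within normal limits"
    return "dark grey: significant difficulties"

print(profile_shading(z_from_standard_score(84)))  # dark grey: significant difficulties
```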
Based on the assessment results, it is clear that James has a profile of dyslexia (i.e.
specific word recognition difficulties). James demonstrates reading comprehension
difficulties due to his weaknesses in accurate and fluent word recognition skills, even
though he shows adequate language comprehension skills. As explained in Chap. 1,
these reading difficulties generally stem from phonological processing weaknesses;
in James’ case, he showed particular challenges with phoneme awareness tasks, but
demonstrated age-appropriate performance in rapid automatic naming (i.e. phono-
logical retrieval). James’ difficulties in reading accuracy at sentence-level on the
YARC were confirmed at word level on the CC-2, with significant difficulties on
regular, irregular, and non-word reading. Moreover, James struggled on the LeST;
closer inspection showed poor performance in naming of short vowels, digraphs, and
diphthongs.
Worth mentioning is James’ poor performance on the expository generation task,
with limited use of complex sentences and poor cohesion. As shown in Fig. 6.3,
James did not make efficient use of the planning sheet to organise his explanation of
how to play the game/sport of his choice, soccer. James’ difficulties on this task
most likely result from reduced exposure to complex written materials due to his
persistent word recognition difficulties (see also Chap. 1 for a discussion). Consider-
ing the emphasis on expository text from Year 4 of schooling (Snyder & Caccamise,
2010), this places James at high risk of facing challenges in most academic subjects,
including not only English but also history and social studies.
Although James’ teachers had been concerned about his reading skills, he had not
been identified with specific word recognition difficulties. It seems likely that James’
strong language comprehension skills masked his reading accuracy/word recogni-
tion difficulties during the early years of schooling. Early identification of James’
significant difficulties in word recognition would have prompted early intervention.
Considering the evidence that early intervention is critical for children with dyslexia
to avoid long-term challenges in academic achievement and socio-emotional well-
being, the importance of the routine use of sensitive reading assessment tools cannot
be overstated.
Hannah (S02: not her real name) attended Year 4 when she became involved in the
Reading Success project. Hannah (age 9 years, 10 months) spoke English as her first
and only language and had attended the school since Prep (foundation year). Hannah
had been referred to the speech pathologist because of concerns about her spoken
language skills when she was in Prep but had not been verified with speech-language
impairment (SLI). Hannah’s teachers were still concerned about her oral language
skills and had become increasingly concerned about her reading skills. Hannah’s
NAPLAN results were at the national minimum standard when she was tested in
Year 3, particularly for writing and spelling (reading band 3; writing band 2; spelling
band 2; grammar and punctuation band 3). Hannah’s performance on the PAT-R was
stanine 2 (5th percentile), indicating performance well below expectations. Hannah’s
PROBE-2 results were not available. When administering the Reading Self-Concept
Scale (Chapman & Tunmer, 1995), Hannah’s responses indicated very low self-
ratings for reading difficulties and reading competence, but a high rating on reading
attitude. We investigated Hannah’s reading skills using our five-step assessment to
intervention approach, as described in Chap. 2 (Fig. 2.2).
Further assessment of word recognition using the CC2 showed severe difficulties in
single word reading across regular (z = −2.31), irregular (z = −1.37), and nonsense words.
Based on the assessment results, it is clear that Hannah has a profile of mixed read-
ing difficulties. In other words, her reading comprehension difficulties stem from
significant weaknesses in word recognition and language comprehension. It is inter-
esting to note that Hannah performed within normal limits on phoneme awareness.
This may reflect the fact that she had received phonological awareness interven-
tion from the speech pathologist in Year 2. It is not clear whether this intervention
included activities aimed at improving Hannah’s grapheme knowledge (i.e. letter-
sound correspondences), particularly more complex ones. Further inspection of her
LeST results showed mastery of the single letters of the alphabet (except for /i/ and /x/),
but difficulty with most digraphs (e.g. /ng/, /gn/, and /igh/ ) as well as diphthongs. It
Fig. 6.5 Hannah’s speech-to-print profile (Adapted from Gillon, 2004, with permission from the
author). Note For interpretation of scores, see Fig. 2.1 (Bell curve). Shading: white = not tested;
light grey = within normal limits/no concerns; dark grey = significant difficulties
is also not clear if this intervention specifically included spelling and reading tasks
aimed at improving fluency (automaticity). It is of concern that Hannah showed a
relatively low self-concept when we asked her questions regarding reading using the
Self-Concept Scale (Chapman & Tunmer, 1995), indicating she found reading diffi-
cult and that she was not very good at it. In contrast, she scored higher when asked
about her attitudes towards reading, highlighting she quite liked reading. As shown
in Chap. 4, it is important to consider the students’ self-perceptions when initiating
intervention to ensure they are engaged in the process. In Hannah’s case, a SWOT
analysis would have provided invaluable information.
Considering Hannah demonstrated significant spoken language difficulties on
standardised tests of language ability (the CELF-4 and the TNL), the speech pathol-
ogist transcribed Hannah’s alien story (from the TNL), using Systematic Analy-
sis of Language Transcripts, New Zealand/Australia version (SALT-NZ/AU; Miller,
Gillon, & Westerveld, 2017) to perform a more detailed language sample analysis
(Miller, Andriacchi, Nockerts, Westerveld, & Gillon, 2016). As shown in Fig. 6.6,
Hannah’s story was short, and showed a few grammatical errors, such as incorrect
use of an article, referential pronoun, and noun-verb agreement. When comparing
her performance to age-matched peers from the TNL database using SALT-NZ/AU,
Hannah showed difficulties in sentence complexity (low mean length of utterance; MLU),
semantics (a low number of different words), and grammatical accuracy (a high percentage
of utterances with errors). The SALT-NZ/AU database standard
measures report is shown in Fig. 6.7 (with areas of difficulty highlighted in grey).
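A minimal sketch of how three of these language sample measures might be computed from a SALT-style transcript. This is our simplification (MLU counted in words rather than morphemes, and only the [ew] error code handled), not the SALT software's actual algorithm; the utterances are shortened from Hannah's transcript in Fig. 6.6.

```python
# Our simplified illustration of three SALT-style measures; the real
# SALT-NZ/AU software computes MLU in morphemes and applies the full set
# of transcription conventions, whereas this sketch counts words and
# recognises only the [ew] (error at word level) code.
import re

# utterances shortened from Hannah's transcript in Fig. 6.6
transcript = [
    "there was a[ew] alien ship what[ew] just landed.",
    "and they were coming out.",
    "and the little girl was leading the dog.",
]

def analyse(utterances):
    total_words = 0
    utterances_with_errors = 0
    different_words = set()
    for utt in utterances:
        if "[ew]" in utt:
            utterances_with_errors += 1
        cleaned = utt.replace("[ew]", "")           # strip error codes
        words = re.findall(r"[a-z']+", cleaned.lower())
        total_words += len(words)
        different_words.update(words)
    n = len(utterances)
    return {
        "mlu_words": total_words / n,               # mean length of utterance (words)
        "ndw": len(different_words),                # number of different words
        "pct_utts_with_errors": 100 * utterances_with_errors / n,
    }

result = analyse(transcript)
print(result["mlu_words"], result["ndw"])  # 7.0 18
```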
$ Child, Examiner
+ Language: English
+ ParticipantId: Hannah
+ Gender: F
+ CA: 9;6
+ Grade: 4
+ Context: Nar
+ Subgroup: TNL
+ Aliens
C there was a[ew] alien ship what[ew] just landed.
C and they were coming out.
C they were : a mum, a kid, a dog .
C and a dad.
C : and another girl.
C and there's a boy and a girl in the corner.
C and (the boy :02) the girl was trying to drag the boy.
C and the aliens[ew] girl : was waving.
C and the : girl : was shouting get the man to come out and say hello.
C and the little girl was leading the dog.
C and they looked like they were gonna Camp.
C that's it.
Fig. 6.6 Hannah’s SALT transcript of the alien story (TNL, Gillam & Pearson, 2004). Note C =
child; ew = error at word level; :02 = pause of 2 s
Bill (S38: not his real name) attended Year 1 of his local primary school. Bill (age
7 years, 1 month) had attended the school since the commencement of Prep (foun-
dation year) the previous year. His enrolment form identified him as only speaking
English in the home environment. At the end of Year 1, Bill demonstrated reading
skills that were considered to be ‘within expectations’ for his Year level, with a PM
Benchmark (Smith, Nelley, & Croft, 2009) level of 21 (with level 16 considered
satisfactory at the end of Year 1). As part of the Reading Success project, we inves-
tigated Bill’s reading skills using our five-step assessment to intervention approach,
as described in Chap. 2 (Fig. 2.2).
6.2 Case Study 3: Bill 121
On the YARC, Bill showed mild difficulties in reading accuracy (SS = 84; age equivalent
5 years, 10 months); his errors comprised 14.3% mispronunciations, 71.4% substitutions,
and 14.3% refusals.
We were unable to calculate reading rate as the beginner level passage is not timed
and Bill exceeded the maximum number of reading accuracy errors on Level 1 of
the YARC.
Step 2ii: Further Assessment of Students Who Scored Below Expectations:
Check LC
Bill showed language comprehension skills well below expectations (SS = 4), using
the Understanding Spoken Paragraphs subtest from the CELF-5 (Wiig et al., 2017).
Further assessment of word recognition, using the CC-2 showed difficulties in single
word reading across regular (z-score = −1.22), and nonsense words (z-score = −
1.03). However, Bill demonstrated satisfactory skills in irregular word reading (z
= −0.73). Bill performed within expectations on the SPAT-R (Neilson, 2003); however,
he showed difficulties in his orthographic knowledge on the LeST (z-score = −1.04).
A speech-to-print profile was created, based on the assessment results from Steps 1
to 3. The final speech-to-print profile is shown in Fig. 6.8.
Based on the assessment results, it is evident that Bill has a reading profile most
consistent with mixed reading difficulties, that is, Bill demonstrated reading accu-
racy below expectations as well as difficulties with his language comprehension. In
looking more closely at Bill’s word recognition skills, it was evident that he had dif-
ficulties with his single word reading, including regular and nonsense words on the
CC-2. He also showed poor orthographic knowledge. However, an area of strength
Fig. 6.8 Bill’s speech-to-print profile (Adapted from Gillon, 2004, with permission from the
author). Note For interpretation of scores, see Fig. 2.1 (Bell curve). Shading: white = not tested;
light grey = within normal limits/no concerns; dark grey = significant difficulties
for Bill included reading irregular words at the single word level, which reflects a
strength in sight word reading. He also showed adequate phonological awareness
skills when we administered the first seven subtests of the SPAT-R.
Bill would benefit from further assessment of his spoken language skills by the
speech pathologist to determine whether his difficulties on the Understanding Spoken
Paragraphs subtest of the CELF-5 stemmed from difficulties at word- and sentence-
level. We would also want to check his ability to use spoken language at the discourse
level, for example, to tell or retell a fictional narrative (see Chap. 2 for an overview
of relevant assessment tasks). In addition, we would ask the teacher for classroom
examples of Bill’s written work, including the most recent results of a spelling test.
It is of concern that Bill’s reading difficulties had not been identified by the school-
based reading assessment, PM Benchmark. This assessment was administered at a
similar point in the school year to the YARC. As outlined in Chap. 3, Bill is one of the
13 students (14% of the Year 1 cohort) who performed within typical limits on the PM
Benchmark but showed significant difficulties on the YARC reading comprehension
subtest.
Following the completion of the assessment, as part of the Reading Success project,
Bill’s classroom teacher gave him access to the Reading Doctor app (in a whole-class
setting, without one-on-one support) to target his orthographic knowledge.
Re-assessment on the LeST following this intervention indicated continued difficul-
ties with orthographic knowledge (z-score = −1.62). Closer inspection of time spent
on the app showed Bill was given access for only 56 min, reaching level 6 (out of 10).
This is considerably less than the time spent by the students in Year 5 (see Chap. 5),
indicating the importance of future research investigating the dosage effects of this
type of intervention.
In terms of other support, Bill’s narrative comprehension and production skills
were assessed by the speech pathologist in Term 1, Year 2. He subsequently received
small-group intervention targeting his narrative skills in Terms 2 and 3 of Year 2.
These small groups were run by the speech pathologist at the school with support from
a teacher aide. Following this intervention and on re-assessment with the YARC at
the end of Year 2 (i.e. one year later), Bill demonstrated improvements in his reading
accuracy (now SS 88) and reading comprehension (SS <70 to SS 86). Bill’s reading
rate continued to reflect difficulties (RR 76), though, suggesting he was still decoding
at a slower rate than expected for his age. Considering the importance of fluent word
recognition for reading comprehension, it is important Bill’s reading skills are closely
monitored.
In this chapter, we shared three case examples to demonstrate the usefulness of our
five-step assessment to intervention framework (based on the Simple View of Read-
ing) in: (a) determining which students may need further assessment of their spoken
and/or written language skills, (b) understanding an individual student’s strengths
and weaknesses in the skills needed for successful reading comprehension, and (c)
selecting specific targets for intervention. Both Year 4 students James and Hannah
had performed at or above the benchmark on the NAPLAN in Year 3; both students
showed difficulties on the PAT-R. However, further inspection revealed very different
reading profiles, with significant implications for intervention and progress monitor-
ing practices. Our case example Bill emphasised the importance of identification of
reading difficulties during the early years of schooling and how timely intervention
may assist early reading success. As outlined in Chap. 1, using the stepped assessment
framework and its corresponding speech-to-print profile will thus encourage
collaborative practice, not only by ensuring there is no duplication of assessments,
but also by promoting a shared understanding between all professionals involved in
the teaching of reading, with the aim of delivering timely and effective instructional
practices within a multi-tiered systems of support approach.
References
Al Otaiba, S., Gillespie Rouse, A., & Baker, K. (2018). Elementary grade intervention approaches
to treat specific learning disabilities, including dyslexia. Language, Speech, and Hearing Services
in Schools, 49(4), 829–842. https://doi.org/10.1044/2018_LSHSS-DYSLC-18-0022.
Applebee, A. (1978). The child’s concept of story. Chicago: University of Chicago Press.
Australian Council for Educational Research. (2018). Progressive achievement tests in reading
(PAT-R). Australia: Author.
Australian Curriculum Assessment and Reporting Authority [ACARA]. (2012). The Australian
Curriculum—English. Retrieved from www.australiancurriculum.edu.au.
Castles, A., Coltheart, M., Larsen, L., Jones, P., Saunders, S., & McArthur, G. M. (2009). Assessing
the basic components of reading: A revision of the Castles and Coltheart test with new norms
(CC2). Retrieved from www.motif.org.au.
Chapman, J. W., & Tunmer, W. E. (1995). Development of young children’s reading self-concepts:
An examination of emerging subcomponents and their relationship with reading achievement.
Journal of Educational Psychology, 87(1), 154–167. https://doi.org/10.1037//0022-0663.87.
1.154.
Gillam, S. L., & Gillam, R. B. (2016). Narrative discourse intervention for school-aged children
with language impairment: Supporting knowledge in language and literacy. Topics in Language
Disorders, 36(1), 20–34. https://doi.org/10.1097/TLD.0000000000000081.
Gillam, R. B., & Pearson, N. A. (2004). Test of narrative language. Austin, TX: Pro-ed.
Heilmann, J., & Malone, T. O. (2014). The rules of the game: Properties of a database of expository
language samples. Language, Speech, and Hearing Services in Schools, 45(4), 277–290.
Hughes, D., McGillivray, L., & Schmidek, M. (1997). Guide to narrative language: Procedures for
assessment. Eau Claire, WI: Thinking Publications.
Larsen, L., Kohnen, S., Nickels, L., & McArthur, G. (2015). The Letter-Sound Test (LeST): A
reliable and valid comprehensive measure of grapheme–phoneme knowledge. Australian Journal
of Learning Difficulties, 20(2), 129–142. https://doi.org/10.1080/19404158.2015.1037323.
Miller, J. F., Andriacchi, K., Nockerts, A., Westerveld, M., & Gillon, G. (2016). Assessing language
production using SALT software. A clinician’s guide to language sample analysis. New Zealand—
Australia version (2nd ed.). Middleton, WI: SALT Software.
Miller, J. F., Gillon, G. T., & Westerveld, M. F. (2017). Systematic analysis of language transcripts
(SALT), New Zealand/Australia Version 18 [computer software]. Madison, WI: SALT Software
LLC.
Milosky, L. M. (1987). Narratives in the classroom. Seminars in Speech and Language, 8(4), 329–
343.
Neilson, R. (2003). Sutherland phonological awareness test—revised (SPAT-R) (Revised ed.).
Jamberoo, NSW: Author.
Parkin, C., & Parkin, C. (2011). PROBE 2: Reading comprehension assessment. Wellington, NZ:
Triune Initiatives.
Sacre, L., & Masterson, J. (2000). Single word spelling test (SWST). Windsor, Berkshire: NFER-
Nelson.
Semel, E., Wiig, E. H., & Secord, W. A. (2006). Clinical evaluation of language fundamentals—
fourth edition—Australian (4th ed.). Marrickville: Harcourt Assessment.
Smith, A., Nelley, E., & Croft, D. (2009). PM benchmark reading assessment resources (AU/NZ).
Melbourne: Cengage Learning Australia.
Snowling, M. J., Stothard, S. E., Clarke, P., Bowyer-Crane, C., Harrington, A., Truelove, E., &
Hulme, C. (2012). York assessment of reading for comprehension (YARC), (Australian ed.).
London: GL Assessment.
Snyder, L., & Caccamise, D. (2010). Comprehension processes for expository text: Building mean-
ing and making sense. In M. A. Nippold & C. M. Scott (Eds.), Expository discourse in children,
adolescents, and adults (pp. 13–39). NY: Psychology Press.
Wagner, R. K., Torgesen, J. K., & Rashotte, C. A. (2013). The comprehensive test of phonological
processing—Second Edition (CTOPP-2). Austin, TX: Pro-ed.
Westerveld, M. F., & Gillon, G. T. (2008). Oral narrative intervention for children with mixed
reading disability. Child Language Teaching and Therapy, 24(1), 31–54.
Wiig, E. H., Semel, E., & Secord, W. A. (2017). Clinical evaluation of language fundamentals
Australian and New Zealand (5th ed.). Bloomington, MN: NCS Pearson.
Open Access This chapter is licensed under the terms of the Creative Commons Attribution 4.0
International License (http://creativecommons.org/licenses/by/4.0/), which permits use, sharing,
adaptation, distribution and reproduction in any medium or format, as long as you give appropriate
credit to the original author(s) and the source, provide a link to the Creative Commons license and
indicate if changes were made.
The images or other third party material in this chapter are included in the chapter’s Creative
Commons license, unless indicated otherwise in a credit line to the material. If material is not
included in the chapter’s Creative Commons license and your intended use is not permitted by
statutory regulation or exceeds the permitted use, you will need to obtain permission directly from
the copyright holder.
Part III
Findings from Teacher Interviews
and Recommendations
Chapter 7
Feedback
Abstract In this chapter we share results from the qualitative data collected prior to
(phase 1) and after completion (phase 2) of the Reading Success project. Information
from staff interviews revealed some broader issues related to a whole-school approach
to the teaching of reading. Findings showed several common themes, including the
importance of positive and collaborative learning environments; the value of consistent
language and communication across the whole school in relation to literacy education,
and more specifically the teaching of reading; and the benefit of having varied
approaches demonstrated for the purpose of professional learning.
7.1 Phase 1 Staff Interviews
Initially, we were interested in knowing what the teachers and leaders thought about
the literacy practices already being implemented in the school. We, therefore, carried
out a number of interviews with staff involved in the teaching of reading at the school
to find out what approaches and/or programmes were being used in the teaching of
reading and whether the staff thought these were effective or not. It is important to
note that these interviews were carried out at the beginning of the school’s reform in
literacy and reading instruction.
As outlined in Chap. 2, questions included: (1) What programmes are currently in
place that aim to improve literacy (and more specifically reading) outcomes for
students?; (2) What are the current perceptions of the success as well as areas
in need of improvement of these programmes?; and (3) What future programmes
and approaches to improving success in literacy learning might be considered in
your school? We also gathered demographic information about the teachers, which is
displayed in Table 7.1.
Further, the questions were semi-structured and focused on the approaches used
within the school to improve literacy learning, and more specifically reading out-
comes. Questions centred on the context of the school, the programmes being used
for literacy, and what the teachers felt about the effectiveness of such methods. We
also asked what the teachers thought should be enhanced and how this could happen.
All interviews were transcribed and member checked (Koelsch, 2013; Oliver,
Serovich, & Mason, 2005). All team members read the transcripts and noted
recurring codes and, subsequently, themes (de Casterle, Gastmans, Bryon, & Denier, 2012).
After initial themes were identified, one team member took the lead on further analysis
using an inductive thematic approach (Braun & Clarke, 2006). A number of themes
were identified in these data, including:
• The importance of a positive learning environment;
• A consistent language for literacy learning;
• Varied activities as well as a targeted approach;
• Cultural diversity.
All staff interviewed commented extremely positively on the school’s culture and
learning environment. They unanimously felt that the ‘culture’ of the school had
greatly improved over the past three years since the implementation of the Positive
Behaviour Learning programme, called RISE. Having a consistent approach that was
visible throughout the school via posters and slogans was making a difference. Staff
believed that the students were willing to learn as a result and that parents were also
aware of the school’s overarching philosophy. This, in turn, had also improved the
reputation of the school in the community.
It’s actually a really great school. The kids are really good. They’re very positive. They love
to learn which is great. You notice a huge change in their attitude towards school when they
hear RISE. So if you all of a sudden say what’s RISE, they go, Respect, Independence,
Safety, Effort and it’s one of those things that’s now drilled into them and they take it home
with them which is really good too. There’s a lot of visuals around the school which is great.
It’s a constant reminder for them. Lisa
Many of the teachers commented on the wide range of cultures represented in
the student cohort. They felt that this diversity contributed to the positive
learning environment established in the school. However, they also mentioned that
a number of students come and go regularly which makes teaching them difficult.
Overall, they felt the school had a good reputation in the community, which had not
always been the case.
Probably low to mid socio-economic backgrounds, varied culture. Fairly, transient I think.
Marie
I would suggest it is a school that is very multi-cultural but also very accepting. So we’re a
bit of a mixed bag. We’ve got high flyers, low flyers, different cultures all the way through.
I actually think that it’s a school that’s really progressing forward. […] I think a lot of effort
has gone into focusing on where the children have come from and then making them progress
forward. Polly
All staff commented on a whole-school literacy approach that had been developed
collaboratively over the past few years. The importance of a consistent language for
literacy learning was highlighted by each interviewee. All staff believed that the six
strategies for reading comprehension, for example, impacted positively on students’
learning, as they enabled teachers to focus on other areas of instruction; children were
able to recite the strategies quickly and move on to the next task. The reading and writing
strategies, as well as differentiation and behaviour management plans as outlined in
the school’s improvement priorities (Table 7.2), were certainly having an effect.
The school recognised that professional development (PD) was an important com-
ponent of all priority areas. Korthagen (2017) noted that both new and traditional
approaches to PD are needed for teachers. He highlighted how a lot of “teacher
learning takes place unconsciously and involves cognitive, emotional and motiva-
tional dimensions” (p. 387). Others have also revealed that professional development
needs to be varied and not always top-down (Bahr, Dole, Bahr, Barton, & Davies,
2007; Hargreaves & Ainscow, 2015) and should take into account all teachers’ needs.
Bahr et al. (2007) offered a model of effective PD that notes the importance of teach-
ers being able to choose the type of professional learning that suits their needs best.
Further, the staff felt valued and that their voice was being heard.
In the last three years … new ideas coming out and they are monitored properly. Teachers are
given time like this to sit down and discuss things and the peer observation we can go to other
classes where teachers sit through what they’re doing and then pick up something from there.
Then they come to your class and they pick up something. There’s a lot of opportunities.
There’s a lot of time factor given to us to collate all this information.
Generally, the staff felt supported in their work in the teaching of reading and noted
that there were a number of improvements that had been happening over the past
year, since the implementation of the school’s reading plan.
Our school’s very focussed at the moment in terms of reading. Literacy is a really big push for
us, especially reading. We do a lot of data meetings and collaborative meetings based around
reading. We’ve got a huge data wall that we use to track how the kids are going in terms
of the age appropriate level and their year level targets. We are just constantly focussing on
reading and trying to improve it. Lisa
A couple of years ago we developed a framework to try and have consistent language and
consistent practices. It takes time to implement. It takes time to up-skill teachers and it
takes time – that’s continuous. But it takes time also to monitor it and to offer support
and coaching, etc. Part of it was around consistent strategies in oral reading, as such and
strategising comprehension. So there are two elements to it. Geoff
Another area mentioned consistently by staff was not only the targeted approach
involving the six strategies but also the fact that their literacy plans consisted of varied
activities for the students. Even though staff were asked to implement three literacy
blocks per week that included guided reading activities, they were also able to develop
and implement a range of pedagogical approaches to ensure all students’ needs were
met. Included in these activities were:
• Learning support lessons;
• Small group work;
• Explicit instruction;
• Time for reading;
• Guided reading;
• Comprehension skills;
• Text variance;
• Literacy blocks.
While the variety of activities was seen as important, some staff found there
was limited flexibility due to time constraints and the requirement to run three literacy
blocks each week. Further, some staff felt the need for more professional development
on guided reading, and that there were simply too many expectations as to what to use
in the classroom. Teachers still wanted more experience with particular teaching
strategies for comprehension, in particular inferential comprehension.
When staff described their school, they were all well aware of the range of cultural
diversity present in the student cohort—with 33% of students from ESL/EAL back-
grounds and 6% Indigenous. While they mentioned this as a challenge, they also
believed it was a strength of the school. Interestingly, some of the staff were also from
culturally diverse backgrounds and had extensive knowledge of different pedagogical
approaches that supported diverse students’ learning needs. Two staff members, for
example, spoke about the importance of having a range of bilingual resources such
as early readers in other languages for the children. They also recognised the need
to acknowledge what the children bring to school with them from home in terms
of experience. Similarly, some staff mentioned the need to improve the school’s
partnership with the community via a range of ways.
7.2 Phase 2 Staff Interviews
After the project was completed, a number of interviews were conducted by an
independent research assistant with the staff who were available, as arranged by
the deputy principal. Post-interviews involved Lisa, Polly, Marie, and Georgia. The
same analytical process was carried out in this phase as in Phase 1, whereby codes
were initially identified and then the codes were clustered into several themes. The
themes in this phase of the project were identified as: communication, the importance
of demonstration, a variety of strategies, and high expectations.
7.2.1 Communication
A strong theme from the second round of interviews was communication. All of
the classroom teachers indicated that there was limited communication about the
project’s final results. They all felt that knowing the results of the children who
participated in the intervention as well as how they improved would have been good.
There wasn’t a lot of communication…I didn’t understand really in the end how… I guess
if it was an improvement I would have liked to have seen the results. The post results for the
kids that I had for those two years, because they are now in year six, and I’ve got no idea
how they’ve progressed. (Lisa)
The teachers were also concerned about how the communication about the process
itself was sometimes confusing. For example, Marie and Polly commented:
I think one of the things that we had the most difficulty with to be honest was just the way
that we were all communicating with each other about what was going on. (Marie)
I don’t really know anything else about the project … They were just being taken out to be
tested with the speech pathologist in the last month or so… I know very little about it except
that [it had] something that had to do with the university and that they were getting data and
that sort of thing (Polly)
Additionally, the teachers indicated that they would have liked to know what the
students were receiving in the small-group instruction. The teachers were, however,
aware of whether or not they were in the control group. They also knew about
some of the measures such as the YARC, testing sound-letter correspondence, and
the assessments that the speech pathologist was doing. Despite this, they all felt
that knowing more about the results of the overall project as well as what activities
the students were engaging in would have contributed to a more effective approach
across the school.
I think there was some improvement shown from the students that went, but for the rest of
the class, I don’t know that it had a huge impact. Like I said, we being the control group, we
didn’t really get much to support I guess. The kids were being taken out of the classroom to
do it, so it was hard to know what skills they were being shown. (Georgia)
One positive aspect of communication about the project was that the
teachers observed that there was now a consistent language being used across the
school in relation to the teaching of reading. This included the guided reading sessions
as well as the robust vocabulary instruction. The teachers all felt these were both
good approaches but more time reading and further understanding by the students
about reading as a process would enhance the practice overall.
Robust vocab is definitely a strength that we’ve got across the school. You talk about robust
vocab to kids [and they] know exactly what you’re talking about. Pretty much every class in
the school have now got RVI walls. I think it just makes it come to the forefront of teachers’
minds before they teach things. (Lisa)
7.2.2 The Importance of Demonstration
Another theme arising from the second round of interviews was the importance of
demonstrated practice. All of the teachers commented on how powerful it was for
them to see someone else demonstrate and support their practice in the classroom
context. This was particularly apparent with the Robust Vocabulary Instruction method.
The teachers, however, would have liked to have seen a similar approach to some of
the other activities related to the teaching of comprehension and decoding.
And I think as a school we’ve just taken on some skills from the reading and we’ve actually
applied it across the whole school level, so including smaller fluency groups, including
smaller vocab groups, doing robust vocab in the whole school I think has really improved as
well, so I think that’s improved my skills, but also just a whole skill level. I thought it was
really good because it was really good to be able to teach the robust vocab instruction and
have someone set that example for us. (Marie)
Marie commented that having someone else show them what to do directly in the
classroom with their students was extremely helpful. This also meant that the teach-
ers were directly involved rather than, as previously explained, the students just
being taken out of the class without the teachers knowing what activities they were
doing. The teachers indicated that they would like more opportunities for someone
to show them explicitly how to improve reading through a number of methods in
their classroom spaces.
Another example of demonstrated practice was the student SWOT inter-
views and speech-to-print profiles. The teachers agreed that these data sets told
them specifically what the students needed once they were back in their mainstream
classrooms.
And I also thought it was really good that we had the interviews [and profiles] because we
could actually see specifically what the kids needed in reading. It told us a bit more than
just the general test, the diagnostic assessments that we would do with them in terms of
PROBE and things like that. It actually broke it down into their phonemic ability and their
phonological awareness, which was really good. (Marie)
The professional learning gained from direct experience was an important notion for
the teachers and links with the theme of communication. It also aligns with the next
theme of being able to utilise a range of strategies to improve reading outcomes.
7.2.3 A Variety of Strategies
As in the first round of interviews, using a variety of strategies to improve reading
results was endorsed by the staff. These strategies, it was also revealed, needed
to be communicated and understood across the whole school. Teachers mentioned
some of the consistent approaches that were expected to be implemented such as
guided reading, robust vocabulary, and a number of comprehension strategies such
as skimming, scanning, and summarising. The interviews revealed teachers were
using a range of strategies, but they wanted to know more so they could support their
students further:
I think any focus on literacy, and vocabulary, and comprehension is a good thing. I think it
did help staff across the school, because as a school we have actually adopted the specific
instruction of robust vocab. It makes us think when we’re doing whole unit planning, about the
language involved, and the demands that are going to be needed from the kids to comprehend
the task. (Lisa—second interview)
We have a vocab group that we do when we’re reading, and that’s before they start to read.
We’ve got a fluency group that we’ve got going. We obviously have independent and guided
reading and things like that that we do. And then there’s also a small group that comes up
and admin actually works with them in terms of their phonemic awareness for those smaller
groups, which is really good. And then as a whole class I do robust vocab for those texts that
we do in the classroom, so it’s really good. (Marie)
Despite the teachers having new knowledge about a variety of strategies, it was
clear that more direct links between the classroom work and the Reading Success
project were needed. As highlighted, a whole-school approach needs a comprehen-
sive management plan that includes all methods being connected in meaningful ways.
Unfortunately, both time and money were mentioned as inhibitors to such practices.
We’re looking at different strategies and we do reading groups and we’ve just started doing
inferential questioning…We have a designated time when the children split into their differ-
entiated groups reading and that’s what I’ve been doing all year…They’ve all improved…I
have six groups in my class and I have one group that go with one teacher aide for half an
hour, and they’re doing vocabulary and a cold read. After that they go to the fluency group
the next day and in fluency group they are annotating the passage and after that they come
to me and that’s when I do the harder questions with them and we look at all the things that
they’ve annotated and vocab that they don’t know and then we do a really deep read and that
rotates and keeps rotating. (Polly)
As described by Polly, the teachers now had a bank of approaches they could use to
support their students’ reading progress. These approaches, however, seemed not to
relate clearly to the Reading Success programme.
7.2.4 High Expectations
An additional theme the teachers regularly mentioned was the importance of having
high expectations and rigorous approaches to teaching reading. The teachers’ dis-
course focused on how they were committed to supporting improvements in the stu-
dents’ overall learning, which required them to expect hard work and high achieve-
ment levels. They felt that the project did ensure that gains were made, particularly
for the students who may have been struggling more than others.
Just to try, and lift the rigor that’s in a guided reading lesson. And then the second program
is, we actually have our leadership team coming in at least twice a term to actually watch,
observe, and give specific feedback on our guided reading. The children know exactly what
to expect from a guided reading lesson, and teachers are heavily supported in that process.
(Lisa)
And trying to find time and ways to do a more rigorous work, because in the half an hour,
you can only do so much…I think it’s more important that they’re I guess having more time
to look more deeply at the reading. (Georgia)
Finally, the staff believed that, as the students were demonstrating improvements, this
needed to be reported to the whole school and community. Further, the leadership
team acknowledged how teachers had learnt more about the teaching of reading, and
in particular the teaching of vocabulary. Not sharing these successes could potentially
make teachers and students feel their efforts are unrewarded. This sentiment also
relates to the need for clearer communication of the results of the project generally.
When interviewing the leadership team, it became clear that involvement in the Reading
Success project had been a positive experience. When asked what the school had
learned from their involvement in the project and whether it had changed the way
they approached the teaching of reading, Geoff commented that:
The Reading Success Project confirmed that our school is on the right track in terms of
the targeted teaching of reading. It confirmed and highlighted the importance [of] diagnosing
individuals’ needs and early and ongoing targeted intervention. It highlighted the fact that we
need to invest more time into focusing on comprehension and as a result we have introduced
‘Text Dependent Questioning’ to help in this area.
The leadership team had appreciated the additional human and financial resources
that were offered as part of the project which assisted the school in conducting
the specific reading assessments and in providing the additional targeted reading
interventions. However, there was acknowledgement of the need for ongoing pro-
fessional development for teachers to help them accurately diagnose areas of need
and to develop specific teaching strategies to help make improvements.
Nonetheless, the leadership team acknowledged the challenges faced during the
project in “finding a balance between the project’s needs and the school’s needs”.
Although Geoff said there were many positives to come out of being involved in the
project, the school’s main challenges in implementing the model related to time
and resources:
There is also a need to ensure teachers have the knowledge, skills and time to implement
additional testing and to provide ongoing targeted teaching as a result of this. This requires a
significant amount of time (ongoing) and resources. Developing a sustainable model is one
of the challenges we face at a school level.
This chapter has shared findings from the staff interviews, prior to and following the
school’s involvement in the Reading Success project. Common themes within the
staff interviews showed the importance of clear and consistent communication across
the whole school in relation to approaches adopted towards teaching reading more
generally, as well as the Reading Success project more specifically. The teachers, in
particular, found demonstrated practice on a range of approaches to teaching reading
extremely beneficial.
It became clear from the teacher interviews that communication regarding the spe-
cific results of the Reading Success project (both assessment and interventions) did
not always filter back to all the teachers, even though several professional development
sessions were held with small teams of teachers involved in the project. Achieving
whole-school reform is difficult, as acknowledged in previous research (Ful-
lan, 2007; McLaughlin & Talbert, 2006), particularly in relation to literacy learning
and the teaching of reading (Barton & McKay, 2016). Despite such challenges, it
is important that all staff, students, and the community understand the particular
approach being adopted and why. Chap. 8 will summarise what we have learned
from the Reading Success project and how we may ensure these findings are shared
with all stakeholders involved in ensuring reading success for all children.
References
Bahr, N., Dole, S., Bahr, M., Barton, G., & Davies, K. (2007). Longitudinal evaluation of the
effectiveness of professional development strategies. Queensland: University of Queensland &
Bond University.
Barton, G., & McKay, L. (2016). Conceptualising a literacy education model for junior secondary
students: The spatial and reflective practices of an Australian school. English in Australia, 51(1),
37–45.
Braun, V., & Clarke, V. (2006). Using thematic analysis in psychology. Qualitative Research in
Psychology, 3(2), 77–101. https://doi.org/10.1191/1478088706qp063oa.
de Casterle, B. D., Gastmans, C., Bryon, E., & Denier, Y. (2012). QUAGOL: A guide for qualitative
data analysis. International Journal of Nursing Studies, 49(3), 360–371.
Fullan, M. (2007). The new meaning of educational change (4th ed.). London: Routledge.
Hargreaves, A., & Ainscow, M. (2015). The top and bottom of leadership and change. Phi Delta
Kappan, 97(3), 42–48.
Koelsch, L. E. (2013). Reconceptualizing the member check interview. International Journal of
Qualitative Methods, 12(1), 168–179.
Korthagen, F. (2017). Inconvenient truths about teacher learning: Towards professional development
3.0. Teachers and Teaching, 23(4), 387–405. https://doi.org/10.1080/13540602.2016.1211523.
McLaughlin, M. W., & Talbert, J. E. (2006). Building school-based teacher learning communi-
ties: Professional strategies to improve student achievement (Vol. 45). New York, NY: Teachers
College Press.
Oliver, D. G., Serovich, J. M., & Mason, T. L. (2005). Constraints and opportunities with interview
transcription: Towards reflection in qualitative research. Social Forces, 84(2), 1273–1289.
Chapter 8
Implications and Transferability
to Other School Contexts
Abstract This chapter starts by outlining what we have learned from the Reading
Success project. Based on a summary of the main findings from the Reading Success
project, we then consider ways in which the Reading Success project findings and
practices shared in this book may be transferable to other contexts. We offer recom-
mendations for best practice for schools wanting to investigate their current practices
in the identification of, and support for, students experiencing difficulties in learning
to read. We also describe some strategies that can be used by schools when striving
for scalability. To finish, we provide recommendations on how the approach taken
in this study might be adapted and adopted within other school contexts.
8.1 Introduction
It is indisputable that the teaching of reading is high stakes for all schools. Stu-
dents’ educational success relies on their ability to read fluently and to comprehend
a range of complex texts. Implementing the proposed interdisciplinary assessment
and monitoring framework to assist in the identification of students experiencing
difficulties in reading may require some schools to change their practice. Change
is very much a constant in education, and it is important that these practices can
be sustained over time (Mioduser, Nachmias, Forkosh-Baruch, & Tubin, 2004).
Pendergast, Main, Barton, Kanasa, Geelan, and Dowden (2015) discussed educational reform that targets improvement in student outcomes, and argued that educational change becomes increasingly complex due to differing federal, state, and local systemic expectations. We understand that schools already have mandated literacy progress monitoring practices in place, such as NAPLAN at the federal level,
and specific reading assessments at the regional level (see Chap. 2). One major challenge identified in the literature in relation to sustaining best practice is that schools differ from each other (i.e. no one size fits all) and can themselves change rapidly. As suggested by Barton and Woolley (2017), to plan for change,
the importance of an in-depth context analysis should not be underestimated (see
also Thomson, 2010). In the next few sections, we will first provide an overview of
our take-home messages from the project, before briefly considering scalability and
sustainability of our proposed framework.
The overall aim of the Reading Success project was to implement, within the target school, a systematic process for identifying reading difficulties across the primary school years, using an evidence-based multidisciplinary approach. We believed that
the introduction of an assessment and monitoring framework that was firmly based on
the Simple View of Reading (Gough & Tunmer, 1986) would not only facilitate the
early identification of reading difficulties, but would also enable timely and targeted
intervention to address these challenges in reading. Moreover, creating speech-to-
print profiles (see Table 1.2) for students who demonstrated challenges in reading
served different purposes: (a) to highlight the underlying spoken language skills that
are needed for written language (Gillon, 2018), (b) to encourage interdisciplinary
collaborative practice in collecting assessment data, and (c) to ensure consistent
language was used across the school.
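The framework underpinning these profiles, the Simple View of Reading, is commonly summarised as the product of two component skill sets (notation follows Gough & Tunmer, 1986):

```latex
% Simple View of Reading (Gough & Tunmer, 1986)
% R: reading comprehension, D: decoding, C: linguistic (listening) comprehension
% Each component is necessary but not sufficient: if D = 0 or C = 0, then R = 0.
R = D \times C
```

Because the relationship is multiplicative rather than additive, a weakness in either decoding or language comprehension is enough to compromise reading comprehension, which is why speech-to-print profiles assess both sides.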
The vision of the Department of Education is that every student in Queensland state
schools receives the support they need to belong to the school community, engages
purposefully in learning, and experiences academic and social success. Students
experience inclusive education when they can access and fully participate in learning
alongside their similar-aged peers, supported by reasonable adjustments and teaching
strategies tailored to meet their individual needs. As highlighted in Chap. 7, it was
clear that the school staff were highly committed to improving literacy learning
outcomes for all students. It was evident that over the past three years or so the
school had made great progress due to a number of strategies implemented around
positive behaviour and reading. The school staff were extremely happy to be working at this school and commented positively on the nature of the school and its students. This is a credit to all staff and families at the school. Results from the interview data were overwhelmingly positive, while also indicating room for improvement.
The results from our project showed that the implementation of this assessment
model was possible within a school context and successful in highlighting which
students would benefit from more intensive reading interventions within a response-
to-intervention framework (Fuchs & Fuchs, 2006). However, several issues came to
light which may affect scalability, including time and resources, communication, and
the need for professional development.
8.2 Take-Home Messages From the Reading Success Project
8.2.1 Choice of Assessments
To meet the learning needs of all students, the Department has a commitment to
a whole-school approach to quality teaching and improving student achievement.
A whole-school approach directs support to different levels of need in response to
student achievement data and is based on the premise that every student can learn
and should have the opportunity to demonstrate progress on an ongoing basis. Imple-
mentation of a whole-school approach requires gathering and analysis of data that
reflects departmental, regional, and school priorities and demonstrates the integral
link between curriculum, interventions, and student outcomes.
At the differentiated level within a whole-school approach, cohort mapping pro-
vides the information teachers require to adequately differentiate reading instruction
to meet the learning needs of all students within the curriculum. Teachers in schools
have a range of cohort mapping tools available to them, such as Early Start for Prep–Year 2 and the National Literacy Learning Progression tool for Prep–Year 10 (www.australiancurriculum.edu.au). It is up to the school team involved in the teaching of reading to decide which tools best suit their school's needs and are sensitive to early difficulties in reading performance. For young school-age students,
these tools should relate to the development of the following six elements as discussed
in Chap. 1 (see also Fig. 1.2): phonological awareness, phonics and phonemic aware-
ness, oral language, fluency, vocabulary, and comprehension (Hempenstall, 2016;
Konza, 2014).
To gather more accurate information about a student's learning rate and level of achievement, both individually and in comparison with a peer group, a combination of cohort mapping and progress monitoring is recommended. Progress monitoring ensures that all students, including high-achieving students, are appropriately engaged, challenged, and extended through class activities designed to meet students' learning needs, levels of readiness, interest, and motivation. Progress monitoring is
used at a differentiated level to monitor response-to-intervention and may involve
re-administering the cohort mapping tool or the use of curriculum-based measures.
Progress monitoring is also used at focused and intensive levels but may be admin-
istered more frequently and will usually reflect the varying types of intervention
individual students are receiving.
It became clear during the Reading Success Project that the school staff admin-
istered a wide range of assessments on a routine basis, for reporting, as well as
progress monitoring purposes. Administering the York Assessment of Reading for Comprehension (YARC; Snowling et al., 2012) on a routine basis may be too time-consuming. However, the results
from this assessment greatly assisted in the early identification of students with dif-
ferent profiles of reading strengths and challenges. To determine if administering
this specific test on a routine basis was needed within the school context, we con-
ducted some comparisons with routinely used reading assessment tasks. Our results
(Chap. 3) showed high correlations between student performance on the PM Benchmark Reading Assessments (Smith, Nelley, & Croft, 2009) and the YARC reading
comprehension subtest during the early years of schooling, with excellent specificity
and sensitivity data (i.e. > 80% of students were correctly identified). Our results
also suggest that using a higher cut-off score on the PM Benchmark combined with
a reading fluency task (e.g. TOWRE-2; Torgesen, Wagner, & Rashotte, 2012) may
improve the sensitivity of this assessment.
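The sensitivity and specificity referred to here follow their standard screening definitions: sensitivity is the proportion of genuinely at-risk readers the screen flags, and specificity is the proportion of typically developing readers it correctly leaves unflagged. The sketch below uses entirely hypothetical scores and names (not Reading Success project data, and `screen_accuracy` is our own illustrative function) to show how raising a cut-off score trades specificity for sensitivity.

```python
# Illustrative only: hypothetical screening scores, not project data.
# Sensitivity = true positives / (true positives + false negatives)
# Specificity = true negatives / (true negatives + false positives)

def screen_accuracy(scores, truly_at_risk, cutoff):
    """Flag students scoring below `cutoff` as at risk, then compare the
    flags against a reference-standard classification of who is at risk."""
    tp = fn = tn = fp = 0
    for student, score in scores.items():
        flagged = score < cutoff
        if student in truly_at_risk:
            tp += flagged       # correctly flagged
            fn += not flagged   # missed by the screen
        else:
            fp += flagged       # over-referred
            tn += not flagged   # correctly passed
    sensitivity = tp / (tp + fn) if (tp + fn) else 0.0
    specificity = tn / (tn + fp) if (tn + fp) else 0.0
    return sensitivity, specificity

# Hypothetical benchmark scores for eight students.
scores = {"A": 12, "B": 15, "C": 18, "D": 20, "E": 22, "F": 25, "G": 26, "H": 30}
at_risk = {"A", "B", "C", "E"}  # per a reference-standard assessment

low = screen_accuracy(scores, at_risk, cutoff=17)   # misses C and E
high = screen_accuracy(scores, at_risk, cutoff=23)  # catches all four, over-refers D
```

With the lower cut-off, two at-risk students slip through (sensitivity 0.5); the higher cut-off catches all four at the cost of one over-referral (specificity drops to 0.75). This mirrors the trade-off behind recommending a higher PM Benchmark cut-off combined with a fluency task.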
A different picture emerged when comparing student performance on the Pro-
gressive Achievement Tests in Reading (PAT-R; Australian Council for Educational
Research, 2018) to their performance on the YARC. About 20% of students were
missed when using the PAT-R as a benchmark for reading competency, with these
students showing very different reading profiles, despite adequate performance on
the PAT-R. Therefore, use of the PAT-R on a routine basis may not only misidentify some students as competent readers, but its results will also fail to guide further assessment or to suggest which specific skills to address in intervention (see case studies in Chap. 6 for more information).
Taken together, our results show the importance of careful consideration of the
routine assessments that are used to identify strengths and weaknesses in reading
performance. As time is a precious resource, we recommend only administering
high-quality assessments that are sensitive to ability and progress and will inform
intervention.
A Team Approach to Differential Diagnosis
8.2.2 Communication
Positive outcomes of reform are possible if the whole school is committed to making
a change in practice happen. In order to do this, effective and ongoing communication
is necessary. As became clear in Chap. 7, the lack of direct communication between
the research team and the teachers affected the teachers’ feelings towards the project.
During the course of the project, we mainly interacted with the leadership team
through regular meetings, although several small-scale professional development
and information-sharing events were held. Often in schools, leadership teams, which include principals, deputy principals, and lead literacy or master teachers, make the decisions on which approaches to the teaching of reading are used in all classrooms. Known as a top-down model, this approach frequently results in teachers taking up mandated approaches rather than having a voice in change, with an increased chance of failure (Hargreaves & Ainscow, 2015).
Based on the teachers' feedback, we suggest that schools considering adopting the evidence-based reading assessment and monitoring approach presented in this book develop a comprehensive communication plan across the whole school. This should include sharing and celebrating the results from projects such as the Reading Success project, as well as other professional learning and development activities that have impacted positively on teachers' practices. In addition, we acknowledge the importance of taking a 'leading from the middle' approach (Hargreaves & Ainscow, 2015) that involves schools in an entire district, including the community. As Hargreaves and Ainscow argued, when such an approach is taken, schools will be better able: (1) to respond to local needs and diversities; (2) take collective responsibility for all students' and each other's success; (3) exercise initiative rather than implementing other people's initiatives; (4) integrate their own efforts with broad system priorities; and (5) establish transparency of participation and results (p. 44).
A large body of literature suggests that coaching, that is, individual teaching interactions between an experienced mentor and a learner, enhances teachers' capacity to make changes in their practice, and that coaching within classrooms is a high-yield strategy.
8.3 Scalability
Additional strategies are needed when moving from implementation in a single setting to multiple settings; these may address staffing mix, professional roles, and the physical and organisational structures that support innovation. A major challenge in
scaling up new practices or models is to identify the policy that facilitates imple-
mentation of evidence-based practice, minimises barriers to implementation, and
promotes the innovations at a state-wide or national level (Fixsen, Blase, Metz, &
van Dyke, 2013).
References
Australian Council for Educational Research. (2018). Progressive achievement tests in reading
(PAT-R). Australia: Author.
Barton, G., & Woolley, G. (2017). Developing literacy in the secondary classroom. UK: Sage.
Catts, H. W., McIlraith, A., Bridges, M. S., & Nielsen, D. C. (2017). Viewing a phonological deficit
within a multifactorial model of dyslexia. Reading and Writing, 30, 613–629. https://doi.org/10.
1007/s11145-016-9692-2.
Fixsen, D., Blase, K., Metz, A., & Van Dyke, M. (2013). Statewide implementation of evidence
based programs. Exceptional Children, 79(3), 213–230.
Fixsen, D. L., Blase, K. A., Naoom, S. F., & Wallace, F. (2009). Core implementation components.
Research on Social Work Practice, 19(5), 531–540.
Fuchs, D., & Fuchs, L. S. (2006). Introduction to Response to Intervention: What, why, and how
valid is it? Reading Research Quarterly, 41(1), 93–99. https://doi.org/10.1598/RRQ.41.1.4.
Gillon, G. T. (2018). Phonological awareness: From research to practice (2nd ed.). New York: The
Guilford Press.
Gough, P. B., & Tunmer, W. E. (1986). Decoding, reading, and reading disability. Remedial and
Special Education, 7(1), 6–10. https://doi.org/10.1177/074193258600700104.
Graner, P. S., & Deshler, D. D. (2012). Improving outcomes for adolescents with learning disabilities.
In B. Wong & D. L. Butler (Eds.), Learning about learning disabilities (4th ed., pp. 299–323).
US: Academic Press.
Gray, S., Fox, A. B., Green, S., Alt, M., Hogan, T. P., Petscher, Y., et al. (2019). Working memory
profiles of children with dyslexia, developmental language disorder, or both. Journal of Speech,
Language, and Hearing Research, 62(6), 1839–1858. https://doi.org/10.1044/2019_JSLHR-L-
18-0148.
Hanushek, E. A., Kain, J. F., Markman, J. M., & Rivkin, S. G. (2003). Does peer ability affect
student achievement? Journal of Applied Psychometrics, 18, 527–544.
Hargreaves, A., & Ainscow, M. (2015). The top and bottom of leadership and change. Phi Delta
Kappan, 97(3), 42–48.
Hempenstall, K. (2016). Read about it: Scientific evidence for effective teaching of reading CIS
research report. Australia: Centre for Independent Studies.
Justice, L. M., Logan, J. A., Lin, T. J., & Kaderavek, J. N. (2014). Peer effects in early childhood
education: Testing the assumptions of special-education inclusion. Psychological Science, 25(9),
1722–1729.
Konza, D. (2014). Teaching reading: Why the Fab five should be the Big six. Australian Journal of
Teacher Education (Online), 39(12), 153–169.
Mioduser, D., Nachmias, R., Forkosh-Baruch, A., & Tubin, D. (2004). Sustainability, scala-
bility and transferability of ICT-based pedagogical innovations in Israeli schools. Education,
Communication & Information, 4(1), 71–82.
Pendergast, D., Main, K., Barton, G., Kanasa, H., Geelan, D., & Dowden, T. (2015). The education
change model as a vehicle for reform: Shifting year 7 and implementing junior secondary in
Queensland. Australian Journal of Middle Schooling, 15(2), 4–18.
Ryan, M., & Barton, G. (2019). Literacy teachers as reflexive agents? Enablers and constraints. The
Australian Educational Researcher. https://doi.org/10.1007/s13384-019-00349-9.
Smith, A., Nelley, E., & Croft, D. (2009). PM benchmark reading assessment resources (AU/NZ).
Melbourne: Cengage Learning Australia.
Snowling, M. J., Stothard, S. E., Clarke, P., Bowyer-Crane, C., Harrington, A., Truelove, E., &
Hulme, C. (2012). York assessment of reading for comprehension (YARC), (Australian ed.).
London: GL Assessment.
Thomson, P. (2010). Whole School Change: A literature review. Newcastle upon Tyne, UK:
Creativity, Culture and Education.
Tiernan, B., Casserly, A. M., & Maguire, G. (2018). Towards inclusive education: Instructional practices to meet the needs of pupils with special educational needs in multi-grade settings. International Journal of Inclusive Education. https://doi.org/10.1080/13603116.2018.1483438.
Tomlinson, C., Brimijoin, K., & Narvaez, L. (2008). The differentiated school: Making revolutionary
changes in teaching and learning. Alexandria, VA: ASCD.
Torgesen, J. K., Wagner, R. K., & Rashotte, C. A. (2012). Test of word reading efficiency 2 (TOWRE-
2). Austin, TX: Pro-Ed.
Tschannen-Moran, M., & Johnson, D. (2011). Exploring literacy teachers’ self-efficacy beliefs:
Potential sources at play. Teaching and Teacher Education, 27, 751–761. https://doi.org/10.1016/
j.tate.2010.12.005.
Open Access This chapter is licensed under the terms of the Creative Commons Attribution 4.0
International License (http://creativecommons.org/licenses/by/4.0/), which permits use, sharing,
adaptation, distribution and reproduction in any medium or format, as long as you give appropriate
credit to the original author(s) and the source, provide a link to the Creative Commons license and
indicate if changes were made.
The images or other third party material in this chapter are included in the chapter’s Creative
Commons license, unless indicated otherwise in a credit line to the material. If material is not
included in the chapter’s Creative Commons license and your intended use is not permitted by
statutory regulation or exceeds the permitted use, you will need to obtain permission directly from
the copyright holder.