The school receives Title I funding based on the number of students who receive free/reduced lunch, and it receives Title III money in order to assist its ELLs. The combination of Title I and Title III monies was used to cover the cost of staff and materials needed for the camp. A daily schedule was created around literacy components incorporated each day, and each week featured a different STEM theme. The purpose of this study is to examine the short-term effects of the summer literacy camp, which could then lead to potential positive long-term effects.
Conclusion
The research shows that in order for students to get better at reading, they must read. Increasing the frequency and time spent practicing the act of reading leads to gains in reading achievement (Allington, 2006; Guthrie et al., 2004). Because of an extended summer hiatus from learning, students experience summer learning loss. This gap widens for students each year as it compounds over time. The wide reading gaps that we see in later years trace back to out-of-school time during the early elementary years (Alexander et al., 2007).
This summer learning loss has a greater impact on students who are economically disadvantaged. Through Title I and Title III grant monies, as well as community partnerships, a summer literacy camp was created for economically disadvantaged students and ELLs to help mitigate summer learning loss. The NGSS were designed to address diversity and equity issues, so it made sense to create a summer learning experience for students that incorporated rich daily literacy experiences infused with well-planned STEM activities. Through this study, the effectiveness of the summer literacy camp in mitigating summer learning loss is evaluated.
CHAPTER 3: METHODOLOGY
The researcher’s purpose in this chapter is to identify and describe the quantitative
procedures used to examine the research questions surrounding summer learning loss.
The remainder of the chapter is organized into sections that will present the data
collection, analysis methods, and procedures used to carry out this study. First, the
rationale for the research approach is described, followed by an explanation of why the
research setting and research sample were chosen. The data collection and analysis
methods are justified, and finally the trustworthiness and limitations of the study are
discussed.
Research Questions
1. How do students who attended the summer literacy camp compare to those who were invited to the camp but did not attend in regard to reading levels and aimswebPlus scores?
2. Did the summer literacy camp impact the following groups of students: students from a low SES home and students who are identified as ELLs (English Language Learners)?
The research questions for this study were approached from a quantitative perspective, identified based on the need to explain why something occurs. The purpose of this research is to identify why gaps exist in student learning for certain groups of students. This study is quantitative because it is based on data that were collected from students who attended the summer literacy camp and students who were invited to attend but did not attend.
Research Setting/Context
The study involves students who attend one of four elementary schools in a
suburban school district on Long Island in New York State. The school consists of 602
students, of which 51% are male and 49% are female. The demographics include 35%
White, 44% Hispanic or Latino, 11% Black or African American, 7% Asian or Native
Hawaiian or other Pacific Islander, and 3% Multiracial students. Additionally, 54% of the
students come from poverty, 16% are English language learners, 13% have either an IEP
or a 504 plan, and 5% are homeless. It is important to note that this school is an outlier to
the district and does not represent the overall demographics of the district, as illustrated in Table 4.
The site school was chosen because of the high number of economically
disadvantaged students and ELLs. It is the only school that hosted a summer literacy
camp utilizing funds from Title I and Title III monies. The summer literacy camp was attended by 78 students entering grades one through five, of whom 37 were English Language Learners and many were economically disadvantaged. Students were invited to the
summer literacy camp based on their Spring reading levels. Classroom teachers and
reading teachers made recommendations and then students were categorized into List A
and List B based on their reading levels. Invitations were sent out to List A. As students from List A declined, invitations were sent out to students from List B until each class was filled.
The camp was created based on a vision shared by the building principal, the
director of ENL and the science director in hopes of preventing summer slide and
increasing positive relationships in order for students to have a more positive school
experience. Planning meetings were held from February through June in order to prepare
for the camp. An internal posting advertised the program and asked for interested
applicants to apply. The posting called for classroom teachers, ENL teachers, a library
media specialist, a social worker, and a STEAM consultant. Interested candidates applied
and interviews were conducted. Five classroom teachers, five ENL teachers, a library media specialist, a social worker, and a STEAM consultant were hired. Five classes were set up, one each for students entering grades one through five. Each class had two teachers: a classroom teacher and an ENL teacher. Also on staff were the library media specialist, the social worker, and the STEAM consultant. Teachers met for two days of professional learning before the camp started to gather materials and
plan their lessons. The camp took place over five weeks and students attended Monday
through Thursday from 9:00 AM – 12:00 PM. A schedule was created by each classroom
teacher that included breakfast, reading workshop, read aloud, science, recess, and lunch.
Students received breakfast and lunch during the literacy camp through a partnership with
Long Island Harvest. Students received 10 books on their independent reading level that
they used during the literacy camp and then were able to keep and add to their home
library. Through a partnership with KPMG’s Family for Literacy, students received an
additional four books to add to their libraries. As shown in Figure 4, KPMG collaborates with First Book through its KPMG's Family for Literacy program, with a mission to end child illiteracy by putting new books into the hands of children in need.
First Book believes that education is the best way out of poverty for children in need and
aims to remove barriers to quality education for all kids by making things like high-
quality books and educational resources affordable to its member network. KPMG
collects donations that are deposited into their First Book account which allows them to
purchase and donate books to schools in need. This donation allowed the participants in the summer literacy camp to add to their home libraries.
Classroom libraries were also purchased with Title I funds around the science
themes: what is the role of a scientist, human impact on the environment, weather, and
forces and motion. These themes were chosen by the science director because they spiral through the elementary program. Classes went to the library each week where the
library media specialist and the STEAM consultant had a hands-on science activity
planned that went along with the theme for that week. Teachers did a read aloud each day,
modeling for students the strategies that they would use during independent reading time.
During reading workshop, teachers met with small groups of students for guided reading
or strategy groups. While they met with small groups, the rest of the students were
independently reading. Book bags were sent home each day in hopes that students would
also read each night. Parents were invited in during the first week of the camp to meet the
teachers and hear about what would be done in school and how they could support that
work at home. All of these pieces were put in place in hopes of mitigating summer
learning loss.
Data collection for this study included a quasi-experimental design using two
groups, one that received treatment (attended literacy camp) and the other that did not
(those who were invited but did not attend). This research could shed light on the student
achievement gap that widens each year for students. With these findings, next steps can
be determined and implemented in order to provide a blueprint for districts to use to provide a summer learning experience that could close the student achievement gap for struggling students.
The overall sample consists of 129 students broken down into two groups to be
studied: the group who received treatment and the group who did not. The treatment
group consists of 78 students entering grades 1-5 who attended the summer literacy camp; 37 of them were English Language Learners. Fifty-one students were invited to attend the summer literacy camp but declined the invitation. Of those 51, 34 were economically disadvantaged and 13 were English Language Learners. This group was considered the control group that did not receive the treatment.
The data source for this quantitative study consists of reading data that were
recorded by classroom teachers and collected by the district. Teachers record reading
levels in September, December, March and June in the district’s student management
system, Infinite Campus. These reading levels are tracked over time by the building
principal using a spreadsheet that also houses the aimsWebPlus scores. aimsWebPlus is
the universal screener that is used by the district and is administered to students in
March and June. Official benchmarks are collected using the Fountas and Pinnell
Benchmark Assessment System for Levels A-J. Using the Fountas and Pinnell
Benchmark Assessment System, teachers are able to identify the instructional and
independent reading levels of all students and document student progress through one-on-
one formative and summative assessments. Teachers use Jennifer Serravallo’s Complete
Comprehension Assessment kit to assess reading level K and above. The district switches
over at Level K because the Complete Comprehension kit assesses students on an entire
book that is similar to those that they will encounter on their level. Teachers record each student's level in the district's student management system.
In addition to reading levels, student performance is reported three times per year
with aimswebPlus. The aimswebPlus scores are compared to established cut scores and
national and/or local norms. aimswebPlus uses both timed curriculum-based measures
(CBMs) and untimed standards-based measures to assess skills and inform instruction.
aimswebPlus gives subtests for word reading fluency (WRF) for kindergarten and first
grade, oral reading fluency (ORF) for grades 1-3, and vocabulary and reading
comprehension for grades 2-5. Students receive a composite score for reading in the form
of a percentile that is locally and nationally normed. For the purpose of this study,
reading levels and aimswebPlus scores will be compared from Spring 2019 to Fall 2019.
Research Ethics
Data that were collected for this study were preexisting data that the school
collects each year. Teachers had no knowledge of the study or data analyses. The summer
literacy camp would have taken place regardless of this study. Students who participated
in the summer literacy camp were invited based on their reading levels and enrolled by
their parents.
Using IBM SPSS, the data were screened using descriptive statistics. A one-way
between-subjects ANOVA was performed to make sure the covariate (pretest) did not
vary across the groups (treatment group versus non-treatment group). A repeated
measures ANOVA was conducted to compare the reading levels of students who attended
the summer literacy camp versus the control group who did not receive treatment. The
same procedures were repeated with the aimswebPlus reading composite scores.
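The analyses above were carried out in IBM SPSS. As a rough illustration only, the same two steps (the pretest balance check and the repeated measures ANOVA) could be sketched in Python as below; the file name and the columns id, group, time, and score are assumptions for the example, not the study's actual data file.

```python
# Illustrative sketch (not the study's SPSS procedure) of the screening and RQ1 analysis.
import pandas as pd
from scipy import stats
from statsmodels.stats.anova import AnovaRM

# Hypothetical long-format data: one row per student per testing occasion.
scores = pd.read_csv("reading_scores_long.csv")  # columns: id, group, time, score

# 1. Screen the pretest: one-way between-subjects ANOVA to confirm that the
#    spring (pretest) scores did not differ across the treatment and control groups.
pre = scores[scores["time"] == "spring"]
f_val, p_val = stats.f_oneway(
    pre.loc[pre["group"] == "camp", "score"],
    pre.loc[pre["group"] == "control", "score"],
)
print(f"Pretest balance check: F = {f_val:.2f}, p = {p_val:.3f}")

# 2. Repeated measures ANOVA with time (spring vs. fall) as the within-subject factor.
rm_results = AnovaRM(data=scores, depvar="score", subject="id", within=["time"]).fit()
print(rm_results)
```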
For the second research question, a mixed repeated measures ANOVA was conducted to examine the camp's effects on students from low SES homes and English language learners. The main goal of the repeated measures ANOVA is to test whether a score changes over time as a result of random fluctuations or if there is evidence for a systematic effect. Researchers can also add between-subjects effects into their models, such as grouping variables or covariates. The test this researcher used includes a grouping variable and is often called a one-within one-between ANOVA: the within effect of time could influence everyone within the sample, while the between effect of group describes differences between two or more groups (Field, 2005). The summer literacy camp was the treatment or intervention condition
compared to the control group with measurements taken at two times, before and after
treatment. This kind of analysis allows researchers to see if scores changed as a result of
the treatment, but also compare the changes over time between a group who should have
shown a change (the treatment group) and one who should not have changed (the control
group). This wrinkle in the design can help account for threats to internal validity.
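A minimal sketch of this one-within, one-between analysis, again in Python with the pingouin package rather than SPSS and with the same assumed column names:

```python
# Sketch of the one-within (time) one-between (group) ANOVA described above.
import pandas as pd
import pingouin as pg

scores = pd.read_csv("reading_scores_long.csv")  # columns: id, group, time, score

aov = pg.mixed_anova(
    data=scores,
    dv="score",       # reading level or aimswebPlus composite
    within="time",    # spring vs. fall (before and after treatment)
    between="group",  # camp (treatment) vs. control
    subject="id",
)
print(aov)  # reports the main effects of time and group and the time x group interaction
```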
Issues of Trustworthiness
Differences among student experiences over the summer can influence reading levels.
Students who read at home over the summer are less likely to experience summer learning loss than students who do not. The quality of the classroom experience that students had over the summer is also a variable, as each pair of teachers planned and delivered their own lessons.
During the school year, each classroom teacher administers their own reading benchmark assessments, and differences among testers leave room for error. It should also be noted that there are two different testers
from spring to fall as the students change grade levels and teachers.
Limitations of the study include the sample size and the length of the study. To
get a more robust dataset and to be able to make better generalizations, this study should
be completed in different schools in several school districts over the course of several
years.
Researcher Role
The researcher is the building principal of the Title I school where the
intervention took place. This study is important to see if the allocation of Title I and Title
III monies is worth spending on a summer literacy camp. The researcher will look at the preexisting reading data to determine whether the camp was an effective use of those funds.
Conclusion
A Summer Literacy Camp was formed using Title I and Title III monies as an
intervention to help mitigate summer learning loss. In addition to the grant money,
community partnerships with Long Island Harvest and KPMG provided food and books
for the students who attended the camp. Reading was the primary subject addressed along
with a specific integration of science. While this chapter outlined the camp used as the
intervention, the next chapter will go over the results of the data analysis to see if the intervention was effective in mitigating summer learning loss.
CHAPTER 4: RESULTS
This chapter contains the analysis of the results as outlined in Chapter 3. The
chapter presents the findings broken down and discussed by research question. As stated
in Chapter 1, this study examined the spring and fall reading scores of students to see if a
summer literacy intervention could help mitigate summer learning loss. This chapter
presents analyses of differences in reading scores based on students who attended the
camp as well as students who were invited to attend but did not. Data were also examined
to see if the camp had a greater impact on ELLs as well as students identified as
economically disadvantaged. Reading levels and aimswebPlus scores from Spring 2019
to Fall 2019 were analyzed. A repeated measures ANOVA was used to compare the
pretest/posttest results to see if the intervention was able to help mitigate summer
learning loss. A mixed repeated measures ANOVA was used to see the effect the camp had on ELLs and economically disadvantaged students.
Results/Findings
Participants in this study were 129 students in grades K-4; the sample included students who were economically disadvantaged and students identified as ELLs. Fifty-one students did not participate in the summer literacy camp and became the control group. Of the 51 students, 34 were economically disadvantaged and 13 were English Language Learners.
Table 5
Means and Standard Deviations by Variable for Measure 1 and Measure 2
RQ1: How do students who attended the summer literacy camp compare to those
who were invited to the camp but did not attend in regard to reading levels and aimsWeb
Plus scores?
The researcher first used a repeated measures ANOVA to see if the intervention
had an impact on reading scores. Prior to running the Repeated Measures ANOVA, the
researcher calculated the difference between the pretest total and the posttest total. A box
plot revealed five outliers. The researcher removed the outliers from the sample because
they made up less than 5% of the sample. A histogram produced from the cleaned data
revealed a normal distribution. The Repeated Measures ANOVA indicated that there were no gains in reading levels from June to September, as reported using the Fountas and Pinnell and Complete Comprehension benchmarks, for those who attended the summer literacy camp (treatment group) versus those who did not attend (control group), with a p value of .54.
Because the p value was greater than .05, the difference was not statistically significant.
Based on these results, students who attended the camp did not grow in reading levels nor
was there a decrease in reading levels. These results will be further discussed in Chapter
5.
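As an illustration of the cleaning step described above, the change score, the conventional 1.5 x IQR boxplot rule for outliers, and the histogram check could be sketched as follows (the study performed these steps in SPSS; the file and column names here are assumptions):

```python
# Compute the spring-to-fall change, drop boxplot outliers, and inspect normality.
import pandas as pd
import matplotlib.pyplot as plt

wide = pd.read_csv("reading_scores_wide.csv")  # columns: id, group, spring, fall
wide["change"] = wide["fall"] - wide["spring"]

q1, q3 = wide["change"].quantile([0.25, 0.75])
iqr = q3 - q1
is_outlier = (wide["change"] < q1 - 1.5 * iqr) | (wide["change"] > q3 + 1.5 * iqr)
cleaned = wide[~is_outlier]
print(f"Removed {is_outlier.sum()} outliers out of {len(wide)} students")

# Histogram of the cleaned change scores to check for an approximately normal shape.
cleaned["change"].plot(kind="hist", bins=15, title="Spring-to-fall change (cleaned)")
plt.show()
```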
The aimsWebPlus scores for those who attended the camp had a p value of .05, indicating that attending the camp had a positive effect on the aimswebPlus scores. For RQ1, the repeated measures ANOVA indicated that attending the camp did not have a significant effect on reading levels, but it did have an effect on aimswebPlus scores, which are standardized and normed. Since the two dependent variables showed differing results, both measures are examined further in Chapter 5.
RQ2 asked, did the summer literacy camp impact the following groups of
students: students from a low SES home and students who are identified as ELLs? The researcher answered this question with a Two-Way (Mixed) Repeated Measures ANOVA. This test compares the mean differences between groups that have been split on two factors, where one factor is within subjects and the other factor is between subjects, to look at the main effects and the interactions. Before running the Two-Way Repeated Measures ANOVA, the researcher checked the data for normality and examined the before/after results for the two groups of students, ELLs and those with low SES. In this
case, the test for homogeneity of variance does not apply as a required assumption. The
applicable assumption is Mauchly's test of sphericity. Mauchly's test evaluates sphericity, a condition where the variances of the differences between all possible pairs of within-subject conditions are equal. Sphericity can be evaluated when there are three or more levels of a repeated measures factor. However, since there are only two levels in this design, the sphericity assumption is met by default. A Mixed Repeated Measures ANOVA was then used to investigate the effects of a summer literacy camp on aimsWebPlus scores and the interactions. A mixed ANOVA is used to compare means when the design includes both between-groups variables and repeated measures variables. The two between-groups variables for this research question are English Language Learner (ELL) and Free and Reduced Lunch (low SES).
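For readers who want a concrete picture of the normality screening mentioned above, a Shapiro-Wilk test on the spring-to-fall change scores for each subgroup is one common approach; the sketch below assumes hypothetical columns ell and low_ses coded 0/1 and is not the study's SPSS procedure.

```python
# Shapiro-Wilk normality check of change scores for each subgroup of interest.
import pandas as pd
from scipy import stats

wide = pd.read_csv("reading_scores_wide.csv")  # columns: id, ell, low_ses, spring, fall
wide["change"] = wide["fall"] - wide["spring"]

for label, flag in [("ELL", "ell"), ("Low SES", "low_ses")]:
    subgroup = wide.loc[wide[flag] == 1, "change"]
    w_stat, p_val = stats.shapiro(subgroup)
    print(f"{label}: W = {w_stat:.3f}, p = {p_val:.3f}")
```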
The Mixed Repeated Measures ANOVA showed that there was an overall change
in aimswebPlus scores for all students from June to September with a p value of .015.
The Mixed Repeated Measures ANOVA done for the aimsWebPlus scores indicated that
there was no significant main effect for those who attended the summer literacy camp
with a p value of .91. It also showed that the intervention did not have a significant
impact on either group of students. The summary table of repeated measures effects in
the ANOVA with corrected F-values is below. The F ratio is the ratio of two mean
square values. If the null hypothesis is true, the expectation is for the F to have
a value close to 1.0 most of the time. A large F ratio means that the variation among
group means is more than one would expect to see by chance. The output is split into
sections for each of the effects in the model and their associated error terms.
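In symbols, each F value in a summary table of this kind is an effect's mean square divided by its associated error mean square:

\[ F = \frac{MS_{\text{effect}}}{MS_{\text{error}}} = \frac{SS_{\text{effect}}/df_{\text{effect}}}{SS_{\text{error}}/df_{\text{error}}} \]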
Table 6
Summary of Repeated Measures Effects (Measure: AimScores); columns: Source, Type III Sum of Squares, df, Mean Square, F, Sig.
There was no significant difference in reading scores when looking at the aimsWebPlus scores for those who attended the camp who were ELLs or low SES. Because the p value was greater than .05, the difference was not statistically significant. The research hypothesis is therefore not supported, and the null hypothesis is retained. The difference in the reading scores based on the intervention was not significantly different for the ELL or low SES groups compared to non-ELL and non-low-SES students.
The data show that the summer literacy camp was less effective than time in causing the increase in aimswebPlus scores. However, the marginal means show a large increase for those attending camp, as shown in Table 7. The confidence intervals explain why the difference is not statistically significant: the ranges of the confidence intervals overlap because the standard deviations are so large.
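For reference, a 95% confidence interval around a marginal mean takes the general form

\[ \bar{x} \pm t_{0.975,\,df}\,\frac{s}{\sqrt{n}}, \]

so the large standard deviations reported here widen each interval and make overlap between the groups more likely.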
Table 7
Estimated Marginal Means with 95% Confidence Intervals (Lower Bound, Upper Bound)
Conclusion
The researcher hypothesized that students who received the intervention would
score better in terms of reading levels and aimswebPlus scores than students who were
invited to attend the camp but did not. The aimswebPlus scores showed an increase in
scores for those who attended versus those who did not. Reading levels did not show an
increase, but they also did not decrease. The null hypothesis is rejected, as there was an increase in aimswebPlus scores for students who attended the camp.
The researcher hypothesized that there would be a difference in reading scores based
on the different groups of students. There was no significant change in reading scores for
economically disadvantaged or ELLs who attended the camp. The null hypothesis is not
rejected. The difference in the reading scores based on the intervention was not
significantly different for the groups of ENL or Free and Reduced groups compared to
non-ENL/Free and Reduced groups of students. The data show that there was an impact on students who attended the camp. The researcher believes that students benefitted even more than the data suggest. This will be further discussed in Chapter 5.
CHAPTER 5: DISCUSSION
The main purpose of this study was to examine the impact of a summer literacy
camp intervention on reading scores from spring to fall to see if the camp was effective.
In addition, the researcher wanted to see if the summer literacy camp had a greater impact
on ELLs and students who are economically disadvantaged. This chapter discusses the
results from Chapter 4 and their connection to existing research. In addition, the chapter
discusses the impact of these conclusions on future professional practice and research.
Implications of Findings
Research Question 1: How do students who attended the summer literacy camp
compare to those who were invited to the camp but did not attend in regard to reading levels and aimswebPlus scores?
Finding: Students who attended the literacy camp saw an increase in their
aimswebPlus scores versus those who did not. The aimswebPlus scores for those who
attended the camp increased significantly from spring to fall. Because the standard
deviations were so large, the ranges of the confidence intervals overlapped, causing the group difference to fall short of statistical significance in the mixed analysis. There was no significant difference in reading levels, as assessed with the Fountas and Pinnell and Complete Comprehension benchmarks, between students who attended the summer literacy camp and those who did not. Reading levels for most students remained the same
from the spring to fall benchmarks regardless of whether they attended the camp or not.
While the data did not show growth on reading levels for students who attended the
summer literacy camp, it did show that levels for most students remained the same. This
indicated that there was no learning loss for most students from June to September.
Research Question 2: Did the summer literacy camp impact the following groups
of students: students from a low SES home and students who are identified as ELLs (English Language Learners)?
Finding: The literacy camp intervention did not have a greater impact on those groups. There was no significant change in reading scores on either assessment for students who were economically disadvantaged or identified as ELLs.
While the data collected allowed for analyzing changes in reading scores, the
researcher hypothesized that there would be a change in reading scores because the
theoretical framework for the summer literacy camp was based on Vygotsky’s theory of
constructivism, which argues that cognitive abilities are socially guided and constructed.
Vygotsky’s (1978) theory asserts three major themes including Social Interaction, the
More Knowledgeable Other (MKO), and the Zone of Proximal Development (ZPD).
Cognitive development stems from social interactions and guided learning within the zone of proximal development (Wertsch & Tulviste, 1992). Students who attended the literacy camp engaged in social
interactions throughout the day. Students worked together in reading partnerships and
small groups. They took part in hands-on science experiments in small groups.
Students also interacted with several teachers throughout the day who would be
considered the MKO with respect to reading and science. Each class had a classroom teacher and an ENL teacher who worked together and often created smaller student-to-teacher ratios in the class. Students also worked with a library media specialist as well as a STEAM consultant teacher twice a week, who were able to lend their expertise by creating hands-on learning activities that helped students solidify their learning in science. Teachers worked with students in small groups, instructing them at their instructional level.
The Zone of Proximal Development refers to the skills that are too difficult for a
child to master on his/her own but can be done with guidance and encouragement.
Teachers met with students in small groups for guided reading in which teachers
worked at the students’ instructional level. The instructional level is in the Zone of
Proximal Development. The instructional reading level is the highest level at which
a reader is not independent, but has adequate background knowledge for a topic, and can
access text quickly and with no or few errors (Fountas & Pinnell, 1996).
Figure. Vygotsky's three major themes: Social Interaction, the More Knowledgeable Other (MKO), and the Zone of Proximal Development (ZPD). On their own, learners' cognitive development is limited to a certain range; full cognitive development is reached within the ZPD.
Calkins also believes that it is important for students to work in the Zone of Proximal Development (ZPD), which is one of Vygotsky's themes of constructivism. The ZPD is the
distance between a learner’s ability to perform a task under adult guidance and/or with
peer collaboration and their ability to solve the problem independently. This is why
Calkins promotes the importance of students reading at their independent reading level
and teachers teaching students at their instructional level, which is in the Zone of Proximal Development. Independent reading at the independent reading level, guided reading at the instructional level, reading aloud, social interaction, and student choice are all part of the balanced literacy approach that was utilized in the summer literacy camp and that falls in line with Vygotsky's three major themes of constructivism. Although reading levels did not increase for students who attended the camp, the researcher believes that the students also benefitted from the social interactions they had, the teachers who supported them, and being instructed at their instructional level, which falls within the Zone of Proximal Development.
In addition to the impact on reading scores for those who attended, the researcher
also believes that the camp had a positive impact on students for other reasons as well.
Students who attended the camp received meals, books to keep at home, and social
interaction that they would not have received had they not attended the camp. Further
research needs to be done in order to see if there are long term effects for students who attend such programs.
Reading
The data show statistically significant evidence that students who attended the summer literacy camp benefitted more than those who did not. The aimswebPlus scores for those who attended the camp were higher in the fall than those who did not
attend. When looking at reading levels, students who attended the camp did not have an
increase in reading levels as shown by their benchmark assessments, but they did not
decrease either, showing that there was no summer learning loss. Summer learning loss
refers to the loss of knowledge and academic skills over summer months when students
are out of school, and is widely recognized as a pervasive and significant problem in
United States education (Zaromb et al., 2014). Students who attended the summer
literacy camp grew on their aimswebPlus reading scores and mostly remained on the
same reading level from spring to fall. The researcher believes that this shows that the
summer camp did have an impact and if continued can help in closing the gap for
students who attend. Zaromb et al. (2014) found that the negative effects of summer learning loss increase with students' grade levels, thus compounding the issue each year and never giving students the time they need to close their achievement gap. It is important for students to have opportunities like this early on in order to give them the support that they need to help close the achievement gap.
Summer learning loss varies with respect to grade level, subject matter, and
socioeconomic status (Alexander et al., 2007; Cooper et al., 1996). While the data show
that there was no significant growth for economically disadvantaged students, it also
shows that there was no loss either. Research on summer learning loss has provided evidence that the reading achievement of economically disadvantaged students slides back a few months every summer (Allington et al., 2010). Students who
participated in the camp received books to take home and keep in order to give them
access to books over the summer. A small set of studies reports that simply supplying
poor students with books over the summer results in improved reading achievement
(Allington et al., 2010). The researcher believes that students who are economically
disadvantaged benefitted from attending the camp by having the opportunity to read in
school and then had access to books to take home to continue reading over the summer.
Students who attended the camp also received breakfast and lunch in school, which helped address food insecurity over the summer.
While the reading levels did not grow for ELLs over the summer, they did remain
the same. aimswebPlus scores grew for ELLs, but the growth did not prove to be statistically significant. A study done by Michael Kieffer showed that children who entered school with limited English proficiency had reading achievement below that of their monolingual English-speaking peers through fifth grade
(Kieffer, 2010). These findings suggest that students who enter school with limited
English proficiency or score low on early literacy measures never catch up. Research
shows that it is possible to predict in early childhood who is at risk for later reading difficulties, so early intervention is needed in order to close the gap for ELLs. Attending the summer literacy camp allowed all of the
students to maintain the reading skills that they left school with in the spring. These
students did not have to spend the first two months of the next school year making up lost
ground. They were able to start the new school year where they left off, which over time
can help to close the gap, and at the very least prevent the gap from widening for these
students.
Limitations of the study included the sample size and the length of the study.
Ultimately, to get a more robust dataset and be able to make better generalizations, this
study should be replicated in multiple schools and across several school districts.
Differences among student experiences over the summer can influence reading levels.
Students who read at home over the summer are less likely to experience summer learning loss than students who do not. The quality of the classroom experience that students had over the summer is also a variable, as each pair of teachers planned and delivered their own lessons.
During the school year, each classroom teacher administers their own reading benchmark assessments, and differences among testers leave room for error. It should also be noted that there are two different testers from spring to fall as the students change grade levels and teachers. Furthermore,
teachers administer benchmark reading assessments earlier in the spring in order to give
students the opportunity to receive instruction and practice with their independent reading
level before summer break. Students are then given time to get back into their reading
routine once school starts again in September. Benchmark assessments are not given until
the end of September. This could account for the lack of movement in either direction for most students' reading levels.
The impetus for this research was to examine the effectiveness of a summer
literacy camp intervention in order to help mitigate summer learning loss. In future, other
school districts may want to consider realigning use of their Title I and Title III funds in
order to create a summer literacy camp for their students. The summer literacy camp was
effective for ELLs, who benefitted from the additional support. The researcher believes that the camp was a worthwhile use of Title I and Title III funds.
While the intervention for this study focused on literacy, it could be beneficial to
add mathematics to a summer program since there is research that summer learning loss
has a greater effect on mathematics. Students who attend a summer literacy camp can
benefit academically, but equally as important, they can benefit socially and emotionally
through social interactions, consistent meals, and books to read at home. School leaders
are urged to consider partnering with organizations like Long Island Harvest and KPMG
in order to help support their most vulnerable students. It is important for interventions
like this to start early in order to mitigate summer learning loss and help close the gap for
struggling students.
The conclusions of this study can form a foundation for other studies that more
deeply examine interventions that mitigate summer learning loss. Future work should
include more schools across multiple school districts. This study should be replicated
with a larger sample that spans geographic areas including rural, urban, and suburban districts and students of varying economic statuses. The intervention should expand to include mathematics, and future research should examine the relationship between a summer literacy camp intervention and students' attitudes towards school. There is
research that supports the notion that students who have a positive attitude toward school
do better in school. Attending a summer literacy camp intervention that promotes social
interaction, provides steady meals, and offers books to read at home should have a positive impact on students' attitudes towards school. Researchers should also examine the relationship between attending a summer literacy camp and academic achievement over time in order to determine if summer
programs can help to close the achievement gap for struggling students. A longitudinal
study tracking students over time would give a more accurate picture of overall gains and
losses than a one-time analysis of student data. Students who participate in a summer
literacy camp should be tracked using the same assessments given on the same dates in the spring and in the fall.
Lastly, future researchers should examine the effect the summer literacy camp had
on academic vocabulary, specifically in the area of science. This summer literacy camp
incorporated the NGSS because they elevate expectations for students’ language and
literacy development across the content areas and raise the bar linguistically and academically for all learners, especially ELLs. The NGSS were also used because of the way the topics spiral through the grade levels and disciplines, allowing students to make connections among science ideas. The topics were
carefully chosen to help build students’ prior knowledge in order to better prepare them
for the upcoming school year. Future research should examine the impact an intervention like this has on students' academic vocabulary in science.
Conclusion
Districts need to provide for their struggling students over the summer to help
mitigate summer learning loss. The Title I school in which the summer literacy camp
took place recognized a need to make changes to its summer program in order to mitigate
summer learning loss and close the gap for some of its most vulnerable students. Title I
and Title III monies were reallocated and a new summer literacy camp was implemented
in the summer of 2019. The summer literacy camp had classroom teachers and ENL
teachers teamed together to create small teacher to student ratios. There was a STEAM
theme each week and a STEAM coordinator and library media specialist who worked
together to create meaningful and engaging hands-on science experiments that would
prepare students with the academic vocabulary needed for upcoming science units in their
September grade level. A social worker was also on-site to continue the support that
many of the students received during the school year. Students who participated received
breakfast and lunch each day as well as books to take home to help build a home library
through generous donations from organizations with which the school partnered. Students
worked together and played together building relationships with each other and their
teachers. The researcher believes that students who attended the summer literacy camp
benefitted in even more ways than what was shown through the data.
Since the data that were analyzed did not have all of the expected results, the
process of looking more closely at the data points and the benchmark assessments in
reading added information that can help shape future studies to better determine the
impact the camp had. In future studies, it would be more helpful to use only one
benchmark reading assessment instead of the two different ones that were used for the
different reading levels. The researcher recommends using the Fountas and Pinnell
benchmark reading assessments for all reading levels instead of using the Complete
Comprehension Benchmark Assessment for levels K and above. Giving the assessment at
the end of June and again at the beginning of September would give the researcher more
accurate data to look at in conjunction with the aimswebPlus data. It also might be even
more effective to have the same tester, preferably a trained reading specialist, administer the benchmark. These steps could provide more reliable data points in order to truly assess the impact of the summer literacy camp.
The data that were analyzed showed that the summer literacy camp intervention
did not impact the reading levels of those who attended. Although the levels did not show
growth, they also did not show loss. Most students remained on the same reading level in
the fall where they had left off in the spring, showing that there was no summer learning loss.
The researcher believes that if the benchmarking were done differently, the results might
look different and show a greater impact. The data show that aimswebPlus scores grew
significantly for all who attended the summer literacy camp. Furthermore, the data
showed that ELLs who attended the camp grew more than ELLs who did not.
The researcher also believes that the students further benefitted from the books
and meals that they received. Many students suffer from food insecurity over the
summer. Students who attended the camp ate two meals and many brought leftovers
home. Students who attended the camp received about 10 books altogether that were
theirs to keep at the end of the program, giving them access to books at home. Finally,
students benefitted from the support they received from their teachers, the social
interactions that took place, and the academic support that they received in science to prepare them for the upcoming school year.
The summer literacy camp became virtual in the summer of 2020. We are hopeful
that it will take place in 2021 so that we can continue to study and refine it. Going through
the process of this study validated the researcher’s belief in the importance of a summer
literacy camp intervention, especially for ELLs. Research must continue to identify
causes and possible solutions to the achievement gaps we see in our most vulnerable
students. School and district leaders must continue to look at the funds we are given for
these students to initiate systemic reform practices in order to help close these gaps for
good.
REFERENCES
Alexander, K. L., Entwisle, D. R., & Olson, L. S. (2007). Lasting consequences of the
Allington, R. L. (2012). What really matters for struggling readers: Designing research-
Allington, R. L., McCuiston, K., & Billen, M. (2014). What research says about text
Allington, R. L., McGill-Franzen, A., Camilli, G., Williams, L., Graff, J., Zeig, J., . . .
Anderson, R. C., Wilson, P. T., & Fielding, L. G. (1988). Growth in reading and how
children spend their time outside of school. Reading Research Quarterly, 23, 285-303.
Armbruster, B. B., Lehr, M. A., & Osborn, M. (2008). The research building blocks for
teaching children to read: Putting reading first. Jessup, MD: National Institute
for Literacy.
Beach, K. D., Philippakos, Z. A., McIntyre, E., Mraz, M., Pilonieta, P., & Vintinner, J. P.
black and hispanic students in elementary school. Reading & Writing Quarterly,
34(3), 263-280.
Beck, I., McKeown, M., & Kucan, L. (2002). Bringing words to life: Robust vocabulary
Borman, G. D., Benson, J., & Overman, L. T. (2005). Families, schools, and summer
Bowers, L. M., & Schwarz, I. (2018). Preventing summer learning loss: results of a
summer literacy program for students from low-SES homes. Reading & Writing
Brozo, W., & Sutton Flynt, E. (2008). Motivating students to read in the content
Macmillan.
Calkins, L. (1999). Let the words work their magic. Instructor, 25-30.
Castles, A., Rastle, K., & Nation, K. (2018). Ending the reading wars: reading acquisition
from novice to expert. Psychological Science in the Public Interest, 19(1), 5-51.
Catalano, R. F., Haggerty, K. P., Oesterle, S., Fleming, C. B., & Hawkins, D. (2004). The
Ceci, S. J., & Papierno, P. B. (2005). The rhetoric and reality of gap closing. American
Chaudry, A., Wimer, C., Macartney, S., Frohlich, L., Campbell, C., Swenson, K., . . .
Hauan, S. (2016). Poverty in the United States: 50-year trends and safety net
Cheuk, T. (2016). Discourse practices in the new standards. Electronic Journal of Science
Education.
Cipielewski, J., & Stanovich, K. E. (1992). Predicting growth in reading ability from
89.
Cooper, H., Nye, B., Charlton, K., & Greathouse, S. (1996). The effects of summer
Cooper, H., Valentine, J. C., Charlton, K., & Melson, A. (2003). The effects of modified
quantitative and qualitative research (5th ed.). Upper Saddle River, NJ: Pearson
Education, Inc.
Cunningham, A. E., & Stanovich, K. E. (1998). What reading does for the mind.
Davies, B., & Kerry, T. (1999). Improving student learning through calendar change.
De Laet, S., Colpin, H., Vervoort, E., Doumen, S., Van Leeuwen, K., Goossens, L., &
30.
Downey, D. B., vonHippel, P. T., & Broh, B. A. (2004). Are schools the great equalizer?
Cognitive inequality during the summer months and the school year. American
Dzaldov, B. S., & Peterson, S. (2005). Book leveling and readers. The Reading Teacher,
59(3), 222-229.
Entwisle, D. R., Alexander, K. L., & Olson, L. S. (2001). Keep the faucet flowing:
Fountas, I. C., & Pinnell, G. S. (1996). Guided reading: Good first teaching for all
Frazier, J. (1998). Extended year schooling and achievement: Can 30 days make a
difference? Houston, Texas: paper presented to the National Association for Year
Friedman, H. H., Hampton-Sosa, W., & Friedman, L. W. (2014). Solving the mega-crisis
Gambrell, L. B., & Morrow, L. M. (2019). Best practices in literacy instruction (6th ed.).
Gambrell, L., Mandel Morrow, L., & Pressley, M. (2003). Best practices in literacy
Guthrie, J. T., Schafer, W. D., & Huang, C. (2001). Benefits of opportunity to read and
Hayes, D. P., & Grether, J. (1983). The school year and vacations: When do students
Helman, L. A., & Burns, M. K. (2008). What does oral language have to do with it?
Heyns, B. (1978). Summer learning and the effect of schooling. New York, NY:
Academic Press.
Hinton, P., Brownlow, C., & McMurray, I. (2004). SPSS Explained. Routledge.
Januszyk, R., Miller, E. C., & Lee, O. (2016). Addressing student diversity and equity:
The Next Generation Science Standards are leading a new wave of reform. The
Kim, J. S., & Quinn, D. M. (2014). The effects of summer reading on low-income
431.
Kirkland, L. D., Camp, D., & Manning, M. (2008/2009, Winter). Changing the face of
Lesaux, N. K. (2012). Reading and reading instruction for children from low-income and
Manyak, P. C., & Bauer, E. B. (2009). English vocabulary instruction for English
McLeod, S. (2018, August 5). Lev Vygotsky's Sociocultural Theory. Retrieved from
University Press.
Miller, E., Lauffer, H. B., & Messina, P. (2014). NGSS for English language learners:
Moats, L. (2007, January). Whole-language high jinks: How to tell when "scientifically-
Mraz, M., & Rasinski, T. V. (2007, May). Summer reading loss. International Reading
Education Commission.
New York State Education Department. (2019). TITLE III, Part A: English Language
http://www.nysed.gov/common/nysed/files/2017-18-title-iii-allowable-
unallowable-.pdf
Okhee, L., Miller, E. C., & Januszyk, R. (2014). Next generation science standards: All
Rasinski, T., Blachowicz, C., & Lems, K. (2012). Fluency instruction; Research-based
247-284.
Smith, D. (2006). On the shoulders of giants: Leaders in the field of literacy as teacher
https://www2.ed.gov/policy/elsec/leg/esea02/pg1.html
White, T. G., Kim, J. S., Kingston, H. C., & Foster, L. (2014). Replicating the effects of a
Zaromb, F., Adler, R. M., Bruce, K., Attali, Y., & Rock, J. (2014). Using no-stakes
educational testing to mitigate summer learning loss: a pilot study. Princeton, NJ: