A Quantitative Study of the Effects of a Summer Literacy Camp to Help Mitigate Summer Learning Loss for Students Entering Grades 1-5

The document describes a quantitative study that examines the effects of a summer literacy camp on mitigating summer learning loss for students entering grades 1-5. The study compares reading levels and test scores of students who attended the camp to those who were invited but did not attend. It also looks at effects on low-SES students and English language learners. The methodology section provides details on the research questions, design, setting, participants and data collection and analysis methods used in the study.

A QUANTITATIVE STUDY ON THE EFFECTS OF A SUMMER LITERACY

CAMP TO HELP MITIGATE SUMMER LEARNING LOSS FOR STUDENTS


ENTERING GRADES 1-5

A dissertation submitted in partial fulfillment


of the requirements for the degree of

DOCTOR OF EDUCATION

to the faculty of the

DEPARTMENT OF ADMINISTRATIVE AND INSTRUCTIONAL LEADERSHIP

of

THE SCHOOL OF EDUCATION

at

ST. JOHN’S UNIVERSITY


New York
by
Patricia O’Regan

Date Submitted: November 10, 2020 Date Approved: January 21, 2021

_____________________________ ______________________________
Patricia O’Regan Anthony J. Annunziato, Ed.D.

students who receive free/reduced lunch. The school receives Title III money in order to assist the ELLs. The combination of Title I and Title III monies was used to cover the cost of staff and materials needed for the camp. A daily schedule was created based on literacy components incorporated each day, and each week featured a different STEM theme. The purpose of this study is to examine the short-term effects of the camp in order to determine whether it is possible to mitigate summer learning loss through this summer literacy camp, which could then lead to potential positive long-term effects.

Conclusion

The research shows that in order for students to get better at reading, they must

read. Increasing the frequency and time spent practicing the act of reading leads to

increases in reading achievement by developing accuracy, fluency, and comprehension

(Allington, 2006; Guthrie et al., 2004). Because of an extended summer hiatus from

learning, students experience summer learning loss. This gap widens for students each year as it compounds over time. The wide reading gaps that we see in later years trace back to out-of-school time during the early elementary years (Alexander et al., 2007). Summer learning loss has a greater impact on students who are economically disadvantaged. Through Title I and Title III grant monies, as well as community partnerships, it is possible to create opportunities for economically disadvantaged students and ELLs that help mitigate summer learning loss. The NGSS were designed to address

diversity and equity issues, so it would make sense to create a summer learning

experience for students that incorporated rich daily literacy experiences infused with

well-planned STEM activities. Through this study, the effectiveness of a summer literacy

camp is evaluated to see the effects it has on mitigating summer learning loss.

CHAPTER 3: METHODOLOGY

The researcher’s purpose in this chapter is to identify and describe the quantitative

procedures used to examine the research questions surrounding summer learning loss.

The remainder of the chapter is organized into sections that will present the data

collection, analysis methods, and procedures used to carry out this study. First, the

rationale for the research approach is described, followed by an explanation of why the

research setting and research sample were chosen. The data collection and analysis

methods are justified, and finally the trustworthiness and limitations of the study are

discussed.

Research Questions

The study was guided by the following questions:

1. How do students who attended the summer literacy camp compare to those who were invited to the camp but did not attend in regard to reading levels and aimswebPlus scores?

2. Did the summer literacy camp impact the following groups of students: students from a low SES home and students who are identified as ELLs (English Language Learners)?

Rationale for Research Approach

The research questions for this study were approached from a quantitative

research design. According to Creswell (2015), in quantitative research a problem is

identified based on the need to explain why something occurs. The purpose of this

research is to identify why gaps exist in student learning for certain groups of students.

This study is quantitative because it is based on data that were collected from students

who attended the summer literacy camp and students who were invited to attend but did

not. More specifically, the research design is quasi-experimental, as explained by

Creswell (2015) and illustrated in Tables 2 and 3.

Table 2

Creswell's Quasi-Experimental Research Design

Pre- and Posttest Design (time runs left to right)

Select Control Group         Pretest    No Treatment             Posttest
Select Experimental Group    Pretest    Experimental Treatment   Posttest

Table 3

Quasi-Experimental Research Design as Applied to this Study

Pre- and Posttest Design (time runs left to right)

Select Control Group:        Pretest:           No Treatment:            Posttest:
51 Non-participants          aimswebPlus and    Declined Invitation to   aimswebPlus and
                             F&P or CC          Summer Literacy Camp     F&P or CC

Select Experimental Group:   Pretest:           Treatment:               Posttest:
78 Participants              aimswebPlus and    Attended Summer          aimswebPlus and
                             F&P or CC          Literacy Camp            F&P or CC

Research Setting/Context

The study involves students who attend one of four elementary schools in a

suburban school district on Long Island in New York State. The school consists of 602

students, of which 51% are male and 49% are female. The demographics include 35%

White, 44% Hispanic or Latino, 11% Black or African American, 7% Asian or Native

Hawaiian or other Pacific Islander, and 3% Multiracial students. Additionally, 54% of the

students come from poverty, 16% are English language learners, 13% have either an IEP

or a 504 plan, and 5% are homeless. It is important to note that this school is an outlier to

the district and does not represent the overall demographics of the district, as illustrated

in Table 4 taken from the 2017 NYS School Report Card.

Table 4

Demographics of Study District and Host School (number/percent)

                                  District        District        Summer Literacy Camp
                                  High School     Middle School   Host Elementary School
Enrollment                        1870 Students   1282 Students   602 Students
Male/Female                       53%/47%         54%/46%         51%/49%
American Indian/Alaskan Native    5/0%            2/0%            2/0%
Black                             106/6%          79/6%           66/11%
Hispanic/Latino                   325/17%         253/20%         263/44%
Asian or Native Hawaiian/
  Pacific Islander                82/4%           51/4%           40/7%
White                             1336/71%        872/68%         209/35%
Multiracial                       24/1%           24/2%           22/4%
English Language Learners         103/6%          67/5%           98/16%
Students with Disabilities        284/15%         230/18%         79/13%
Economically Disadvantaged        664/36%         457/36%         325/54%
Homeless                          55/3%           48/4%           33/5%

The site school was chosen because of the high number of economically

disadvantaged students and ELLs. It is the only school that hosted a summer literacy

camp utilizing funds from Title I and Title III monies. The summer literacy camp was

made up of 78 students in grades K-4. Of those 78 students, 64 were economically



disadvantaged and 37 were English Language Learners. Students were invited to the

summer literacy camp based on their Spring reading levels. Classroom teachers and

reading teachers made recommendations and then students were categorized into List A

and List B based on their reading levels. Invitations were sent out to List A. As students

from List A declined, invitations were sent out to students from List B until each class

had at least 14 students in it.

The camp was created based on a vision shared by the building principal, the

director of ENL and the science director in hopes of preventing summer slide and

increasing positive relationships in order for students to have a more positive school

experience. Planning meetings were held from February through June in order to prepare

for the camp. An internal posting advertised the program and asked for interested

applicants to apply. The posting called for classroom teachers, ENL teachers, a library

media specialist, a social worker, and a STEAM consultant. Interested candidates applied

and interviews were conducted. Five classroom teachers, five ENL teachers, a library

media specialist, a social worker, and a STEAM consultant were hired.

Five classes were set up, one each for students entering grades one through five.

Each class had two teachers: a classroom teacher and an ENL teacher. Also on staff were

a library/media specialist, a STEAM consultant, a social worker, and a nurse. Teachers

met for two days of professional learning before the camp started to gather materials and

plan their lessons. The camp took place over five weeks and students attended Monday

through Thursday from 9:00 AM – 12:00 PM. A schedule was created by each classroom

teacher that included breakfast, reading workshop, read aloud, science, recess, and lunch.

Students received breakfast and lunch during the literacy camp through a partnership with

Long Island Harvest. Students received 10 books on their independent reading level that

they used during the literacy camp and then were able to keep and add to their home

library. Through a partnership with KPMG's Family for Literacy, students received an additional four books to add to their libraries. As shown in Figure 4, KPMG collaborates with First Book through KPMG's Family for Literacy, with a mission to eradicate child illiteracy by putting new books into the hands of children in need. First Book believes that education is the best way out of poverty for children in need and aims to remove barriers to quality education for all kids by making things like high-quality books and educational resources affordable to its member network. KPMG collects donations that are deposited into its First Book account, which allows it to purchase and donate books to schools in need. This donation allowed the participants in the summer literacy camp to build and add to their home libraries.

Figure 4. KPMG’s Collaboration with First Book

Classroom libraries were also purchased with Title I funds around the science

themes: what is the role of a scientist, human impact on the environment, weather, and

forces and motion. These themes were chosen by the science director because of their

spiral through the elementary program. Classes went to the library each week where the

library media specialist and the STEAM consultant had a hands-on science activity

planned that went along with the theme for that week. Teachers did a read aloud each day

modeling for students the strategies that they would use during independent reading time.

During reading workshop, teachers met with small groups of students for guided reading

or strategy groups. While they met with small groups, the rest of the students were

independently reading. Book bags were sent home each day in hopes that students would

also read each night. Parents were invited in during the first week of the camp to meet the

teachers and hear about what would be done in school and how they could support that

work at home. All of these pieces were put in place in hopes of mitigating summer

learning loss.

Data collection for this study included a quasi-experimental design using two

groups, one that received treatment (attended literacy camp) and the other that did not

(those who were invited but did not attend). This research could shed light on the student

achievement gap that widens each year for students. With these findings, next steps can

be determined and implemented in order to provide a blueprint for districts to use to

provide a summer learning experience that could close the student achievement gap for

students, especially those from poverty and ELLs.

Research Sample and Data Sources

The overall sample consists of 129 students broken down into two groups to be

studied: the group who received treatment and the group who did not. The treatment

group consists of 78 students entering grades 1-5 who attended the summer literacy

camp. Of those 78 students, 64 were economically disadvantaged and 37 were English

Language Learners. Fifty-one students were invited to attend the summer literacy camp

but declined the invitation. Of those 51, 34 were economically disadvantaged and 13

were English Language Learners. This group was considered the control group that did

not receive treatment.

The data source for this quantitative study consists of reading data that were

recorded by classroom teachers and collected by the district. Teachers record reading

levels in September, December, March and June in the district’s student management

system, Infinite Campus. These reading levels are tracked over time by the building

principal using a spreadsheet that also houses the aimswebPlus scores. aimswebPlus is the universal screener used by the district and is administered to students in September, January, and June.

Data Collection Methods

Teachers in this school report student reading levels in September, December,

March and June. Official benchmarks are collected using the Fountas and Pinnell

Benchmark Assessment System for Levels A-J. Using the Fountas and Pinnell

Benchmark Assessment System, teachers are able to identify the instructional and

independent reading levels of all students and document student progress through one-on-

one formative and summative assessments. Teachers use Jennifer Serravallo’s Complete

Comprehension Assessment kit to assess reading level K and above. The district switches

over at Level K because the Complete Comprehension kit assesses students on an entire

book that is similar to those that they will encounter on their level. Teachers record each

student’s independent reading level through Infinite Campus.



In addition to reading levels, student performance is reported three times per year

with aimswebPlus. The aimswebPlus scores are compared to established cut scores and

national and/or local norms. aimswebPlus uses both timed curriculum-based measures

(CBMs) and untimed standards-based measures to assess skills and inform instruction.

aimswebPlus gives subtests for word reading fluency (WRF) for kindergarten and first

grade, oral reading fluency (ORF) for grades 1-3, and vocabulary and reading

comprehension for grades 2-5. Students receive a composite score for reading in the form

of a percentile that is locally and nationally normed. For the purpose of this study,

reading levels and aimswebPlus scores will be compared from Spring 2019 to Fall 2019.

Research Ethics

Data that were collected for this study were preexisting data that the school

collects each year. Teachers had no knowledge of the study or data analyses. The summer

literacy camp would have taken place regardless of this study. Students who participated

in the summer literacy camp were invited based on their reading levels and enrolled by

their parents.

Data Analysis Methods

Using IBM SPSS, the data were screened using descriptive statistics. A one-way

between-subjects ANOVA was performed to make sure the covariate (pretest) did not

vary across the groups (treatment group versus non-treatment group). A repeated

measures ANOVA was conducted to compare the reading levels of students who attended

the summer literacy camp versus the control group who did not receive treatment. The

same procedures were repeated with the aimswebPlus reading composite scores. From these data, the summer gain/loss scores were analyzed.
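The pretest screening step described above can be sketched as follows. This is an illustrative Python sketch using scipy rather than SPSS, and the score arrays are hypothetical stand-ins for the study's aimswebPlus pretest data, not the actual values.

```python
import numpy as np
from scipy import stats

# Hypothetical pretest scores (stand-ins for the study's aimswebPlus pretests).
treatment_pre = np.array([17.0, 12.5, 22.0, 9.0, 30.5, 15.0, 20.0, 11.5])
control_pre = np.array([25.0, 18.0, 31.0, 14.5, 27.0, 20.5, 16.0, 29.5])

# One-way between-subjects ANOVA: does the covariate (pretest) differ by group?
f_stat, p_value = stats.f_oneway(treatment_pre, control_pre)
print(f"Pretest balance check: F = {f_stat:.2f}, p = {p_value:.3f}")

# With only two groups, the one-way ANOVA is equivalent to an
# independent-samples t-test: F equals t squared.
t_stat, t_p = stats.ttest_ind(treatment_pre, control_pre)
```

A nonsignificant p value here would indicate that the two groups started from comparable pretest levels, which is the condition the screening is meant to verify.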



For the second research question, a mixed repeated measures ANOVA was

conducted but disaggregated for economically disadvantaged students and English

language learners. The main goal of the repeated measures ANOVA is to test whether a score changes over time as a result of random fluctuation or whether there is evidence of a systematic effect. The repeated measures ANOVA allows researchers to incorporate different

effects into their models, such as grouping variables or covariates. The test this researcher

used includes a grouping variable often called a one-within one-between ANOVA, which

refers to the within effect of time, which could influence everyone within the sample, and

the between effect of group, which describes differences between two or more groups

(Field, 2005). The summer literacy camp was the treatment or intervention condition

compared to the control group with measurements taken at two times, before and after

treatment. This kind of analysis allows researchers to see if scores changed as a result of

the treatment, but also compare the changes over time between a group who should have

shown a change (the treatment group) and one who should not have changed (the control

group). This wrinkle in the design can help account for threats to internal validity, such as

maturation and testing effects (Girden, 1992).
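The one-within one-between design described above can be sketched in Python. With only two time points, the time-by-group interaction of a mixed repeated measures ANOVA reduces to a between-groups test on gain scores (posttest minus pretest); this sketch uses scipy and simulated data, not the study's actual scores.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)

# Hypothetical pre/post scores for a treatment and a control group
# (stand-ins for the spring/fall aimswebPlus composites).
treat_pre = rng.normal(18, 5, 30)
treat_post = treat_pre + rng.normal(6, 4, 30)   # treatment group gains on average
ctrl_pre = rng.normal(18, 5, 30)
ctrl_post = ctrl_pre + rng.normal(0, 4, 30)     # control group shows no average gain

# With two time points, the time x group interaction is equivalent to a
# between-groups one-way ANOVA on the gain scores.
treat_gain = treat_post - treat_pre
ctrl_gain = ctrl_post - ctrl_pre
f_interaction, p_interaction = stats.f_oneway(treat_gain, ctrl_gain)
print(f"Time x group interaction: F = {f_interaction:.2f}, p = {p_interaction:.4f}")
```

Comparing gains between groups is exactly the "wrinkle" the text describes: the control group's change over the same interval absorbs maturation and testing effects, so any remaining difference in gains is attributable to the treatment.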

Issues of Trustworthiness

Possible threats to internal validity include time, history, and instrumentation. Differences among student experiences over the summer can influence reading levels. Students who read at home over the summer are less likely to experience summer learning loss than students who do not. The quality of the

classroom experience that students had over the summer is also a variable as each pair of

teachers created their own lesson plans.



During the school year, each classroom teacher administers their own reading

benchmark assessment to obtain a student’s independent reading level. Although there

are specific protocols in place to administer these benchmark assessments, 27 different

testers leave room for error. It should also be noted that there are two different testers

from spring to fall as the students change grade levels and teachers.

Limitations of the Study

Limitations of the study include the sample size and the length of the study. To

get a more robust dataset and to be able to make better generalizations, this study should

be completed in different schools in several school districts over the course of several

years.

Researcher Role

The researcher is the building principal of the Title I school where the

intervention took place. This study is important for determining whether the allocation of Title I and Title III monies to a summer literacy camp is worthwhile. The researcher will look at the

data and let the results guide the decision making.

Conclusion

A Summer Literacy Camp was formed using Title I and Title III monies as an

intervention to help mitigate summer learning loss. In addition to the grant money,

community partnerships with Long Island Harvest and KPMG provided food and books

for the students who attended the camp. Reading was the primary subject addressed along

with a specific integration of science. While this chapter outlined the camp used as the

intervention, the next chapter will go over the results of the data analysis to see if the

camp was successful.



CHAPTER 4: RESULTS
This chapter contains the analysis of the results as outlined in Chapter 3. The

chapter presents the findings broken down and discussed by research question. As stated

in Chapter 1, this study examined the spring and fall reading scores of students to see if a

summer literacy intervention could help mitigate summer learning loss. This chapter

presents analyses of differences in reading scores based on students who attended the

camp as well as students who were invited to attend but did not. Data were also examined

to see if the camp had a greater impact on ELLs as well as students identified as

economically disadvantaged. Reading levels and aimswebPlus scores from Spring 2019

to Fall 2019 were analyzed. A repeated measures ANOVA was used to compare the

pretest/posttest results to see if the intervention was able to help mitigate summer

learning loss. A mixed repeated measures ANOVA was used to see the effect the camp

had on the different groups of students.

Results/Findings

Participants in this study were 129 students in grades K-4. Of those 129 students,

78 participated in the Summer Literacy Camp. Of those 78 students, 64 were

economically disadvantaged and 37 were English Language Learners; 35 students were both economically disadvantaged and ELLs. Fifty-one students did not participate in the summer literacy camp and became the control group. Of the 51 students, 34 were economically disadvantaged and 12 were English Language Learners.



Table 5

Means and Standard Deviations of Group Scores on aimswebPlus

                 Measure 1 (Spring)          Measure 2 (Fall)
Variable         Mean      St. Deviation     Mean      St. Deviation
Attend           17.16     15.48             23.46     24.60
Not Attend       22.70     21.02             20.65     17.35
ENL              15.78     17.34             21.28     23.84
Not ENL          21.46     18.05             23.07     21.03
Low SES          18.16     18.48             21.48     22.97
Not Low SES      23.35     15.40             25.65     18.47

RQ1: How do students who attended the summer literacy camp compare to those

who were invited to the camp but did not attend in regard to reading levels and aimswebPlus scores?

The researcher first used a repeated measures ANOVA to see if the intervention

had an impact on reading scores. Prior to running the Repeated Measures ANOVA, the

researcher calculated the difference between the pretest total and the posttest total. A box

plot revealed five outliers. The researcher removed the outliers from the sample because

they made up less than 5% of the sample. A histogram produced from the cleaned data

revealed a normal distribution. The Repeated Measures ANOVA indicated that there

were no gains in terms of reading levels from June to September when looking at reading

levels reported using the Fountas and Pinnell and Complete Comprehension reading

assessments. The Repeated Measures ANOVA showed no significant change in reading

levels from June to September based on those who attended the summer literacy camp

(treatment group) versus those who did not attend (control group) with a p value of .54.

Because the p value was greater than .05, the difference was not statistically significant.

Based on these results, students who attended the camp neither gained nor lost reading levels. These results will be further discussed in Chapter 5.
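The outlier screening described above can be sketched in Python. A box plot flags points beyond 1.5 times the interquartile range from the quartiles (Tukey's rule); the gain scores below are hypothetical illustrations, not the study's data.

```python
import numpy as np

# Hypothetical gain scores (posttest minus pretest); the last two values
# are deliberately extreme so the boxplot rule flags them.
gains = np.array([2.0, -1.5, 0.5, 3.0, 1.0, -0.5, 2.5, 0.0, 1.5, -2.0, 14.0, -12.0])

# Tukey's rule: a box plot flags points beyond 1.5 * IQR from the quartiles.
q1, q3 = np.percentile(gains, [25, 75])
iqr = q3 - q1
lower, upper = q1 - 1.5 * iqr, q3 + 1.5 * iqr

outliers = gains[(gains < lower) | (gains > upper)]
cleaned = gains[(gains >= lower) & (gains <= upper)]
print("flagged outliers:", outliers)  # flags 14.0 and -12.0
```

After removal, a histogram of `cleaned` would be inspected for approximate normality, as the study does before running the repeated measures ANOVA.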

The aimswebPlus scores for those who attended the camp had a p value of .05, suggesting that attending the camp had a positive effect on the aimswebPlus scores. For RQ1, the repeated measures ANOVA indicated that attending the camp did not have a significant effect on reading levels but did have an effect on aimswebPlus scores, which are standardized and normed. Since the two dependent variables showed differing results, the evidence does not uniformly support the null hypothesis.

RQ2 asked, did the summer literacy camp impact the following groups of

students: students from a low SES home and students who are identified as ELLs? The

researcher used a Split-Plot ANOVA also known as a Two-Way Mixed Repeated

Measures ANOVA. This test compares the mean differences between groups that have

been split on two factors where one factor is a within subjects and the other factor is

between subjects to look at the main effects and the interactions. Before running the

Two-Way Repeated Measures ANOVA, the researcher checked the data for normality

and homogeneity of variance. The repeated measures ANOVA compared the before/after results for the two groups of students, ELLs and those with low SES. In this case, the test for homogeneity of variance does not apply as a required assumption; the applicable assumption is sphericity, assessed with Mauchly's test. Mauchly's test is a statistical test used to validate a repeated measures analysis of variance (ANOVA). Sphericity is an important assumption of a repeated measures ANOVA: it is the condition where the variances of the differences between all possible pairs of within-subject conditions are equal. Sphericity can only be violated when there are three or more levels of a repeated measures factor; since there are only two levels in this ANOVA, sphericity is met by definition.

A 2 x 2 mixed factorial ANOVA with repeated measures was conducted to investigate the effects of the summer literacy camp on aimswebPlus scores and the interactions. ANOVA is used to compare means when there are two or more independent variables. The ANOVA is mixed because there is a mixture of between-groups and repeated measures variables. The two between-groups variables for this research question are English Language Learner (ELL) status and Free and Reduced Lunch (low SES) status. The repeated measures are the spring and fall aimswebPlus assessments.

The Mixed Repeated Measures ANOVA showed that there was an overall change

in aimswebPlus scores for all students from June to September with a p value of .015.

The Mixed Repeated Measures ANOVA for the aimswebPlus scores indicated that

there was no significant main effect for those who attended the summer literacy camp

with a p value of .91. It also showed that the intervention did not have a significant

impact on either group of students. The summary table of repeated measures effects in

the ANOVA with corrected F-values is below. The F ratio is the ratio of two mean

square values. If the null hypothesis is true, the expectation is for the F to have

a value close to 1.0 most of the time. A large F ratio means that the variation among

group means is more than one would expect to see by chance. The output is split into

sections for each of the effects in the model and their associated error terms.

Table 6

ANOVA Source Table of aimswebPlus Gain/Loss Scores

Measure: AimScores

Source               Type III Sum of Squares   df     Mean Square   F      Sig.
Time                 497.66                    1.00   497.66        6.08   .02
Attend               1.15                      1.00   1.15          0.01   .91
ENL                  526.33                    1.00   526.33        6.44   .01
Low SES              363.58                    1.00   363.58        4.44   .04
Attend and ENL       102.57                    1.00   102.57        1.25   .27
Attend and Low SES   189.24                    1.00   189.24        2.31   .13
ENL and Low SES      410.39                    1.00   410.40        5.01   .03

For RQ2, the mixed repeated measures ANOVA indicated no significant difference in reading scores when looking at the aimswebPlus scores for those who attended the camp who were ELLs or low SES. Because the p value was greater than .05, the difference was not statistically significant, and the null hypothesis is retained. The difference in reading scores based on the intervention was not significantly different for the ELL or low-SES groups compared to non-ELLs and students not from low-SES homes.

The data show that time, rather than the summer literacy camp itself, accounted for the increase in aimswebPlus scores. However, the marginal means show a large increase for those attending camp, as shown in Table 7. The confidence intervals tell the story of why the difference is not statistically significant: their ranges overlap because the standard deviations are so large.

Figure 5. Marginal means of aimswebPlus scores from spring to fall.

Table 7

Confidence Intervals

                                                                95% Confidence Interval
Intervention             aimswebPlus              Mean     Std. Error   Lower Bound   Upper Bound
Did Not Attend           Measure 1: Spring Test   17.539   2.076        13.421        21.657
Literacy Camp            Measure 2: Fall Test     18.082   2.319        13.482        22.682
Attended Literacy Camp   Measure 1: Spring Test   18.112   3.535        11.100        25.124
                         Measure 2: Fall Test     26.364   3.949        18.530        34.198
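The overlap can be checked directly from the Table 7 means and standard errors. This is an illustrative Python sketch; the 1.96 multiplier is a normal-approximation assumption (SPSS uses a t critical value, so the computed bounds differ slightly from the table's).

```python
# Fall aimswebPlus means and standard errors from Table 7.
not_attend_mean, not_attend_se = 18.082, 2.319
attend_mean, attend_se = 26.364, 3.949

# Approximate 95% confidence intervals (mean +/- 1.96 * SE; the exact SPSS
# bounds use a t critical value, so they differ slightly).
z = 1.96
not_attend_ci = (not_attend_mean - z * not_attend_se, not_attend_mean + z * not_attend_se)
attend_ci = (attend_mean - z * attend_se, attend_mean + z * attend_se)

# The intervals overlap, which is consistent with the group difference
# falling short of clear statistical significance.
overlap = not_attend_ci[1] >= attend_ci[0] and attend_ci[1] >= not_attend_ci[0]
print(not_attend_ci, attend_ci, overlap)
```

The large standard error of the attending group (3.949) is what widens its interval enough to overlap the non-attending group's, despite the sizable difference in fall means.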

Conclusion

The researcher hypothesized that students who received the intervention would

score better in terms of reading levels and aimswebPlus scores than students who were

invited to attend the camp but did not. The aimswebPlus scores showed an increase in

scores for those who attended versus those who did not. Reading levels did not show an

increase, but they also did not decrease. The null hypothesis is rejected as there was an

impact on students who attended the camp.

The researcher hypothesized that there would be a difference in reading scores based

on the different groups of students. There was no significant change in reading scores for

economically disadvantaged or ELLs who attended the camp. The null hypothesis is not

rejected. The difference in reading scores based on the intervention was not significantly different for the ENL or Free and Reduced Lunch groups compared to non-ENL and non-Free and Reduced Lunch groups of students. The data show that there was an impact on students who attended the camp. The researcher believes that students benefitted even more than the data suggest. This will be further discussed in Chapter 5.

CHAPTER 5: DISCUSSION
The main purpose of this study was to examine the impact of a summer literacy

camp intervention on reading scores from spring to fall to see if the camp was effective.

In addition, the researcher wanted to see if the summer literacy camp had a greater impact

on ELLs and students who are economically disadvantaged. This chapter discusses the

results from Chapter 4 and their connection to existing research. In addition, the chapter

discusses the impact of these conclusions on future professional practice and research.

Implications of Findings

Research Question 1: How do students who attended the summer literacy camp

compare to those who were invited to the camp but did not attend in regard to reading

levels and aimswebPlus scores?

Finding: Students who attended the literacy camp saw an increase in their

aimswebPlus scores versus those who did not. The aimswebPlus scores for those who

attended the camp increased significantly from spring to fall. Because the standard deviations were so large, the ranges of the confidence intervals overlapped, yielding a p value of .05, which is just at the threshold of significance. Reading

levels as assessed with the Fountas and Pinnell and Complete Comprehension

Assessments showed no statistical significance between students who attended the

literacy camp and those who did not. Reading levels for most students remained the same

from the spring to fall benchmarks regardless of whether they attended the camp or not.

While the data did not show growth on reading levels for students who attended the

summer literacy camp, it did show that levels for most students remained the same. This

indicated that there was no learning loss for most students from June to September.

Research Question 2: Did the summer literacy camp impact the following groups

of students: students from a low SES home and students who are identified as ELLs

(English Language Learners)?

Finding: The literacy camp intervention did not have a greater impact on those

who are economically disadvantaged or ELLs. There was no significant change in reading scores on either assessment for students who were economically disadvantaged or ELLs who attended the camp.

The researcher hypothesized that reading scores would improve because the theoretical framework for the summer literacy camp was based on Vygotsky's theory of constructivism, which argues that cognitive abilities are socially guided and constructed.

Vygotsky’s (1978) theory asserts three major themes including Social Interaction, the

More Knowledgeable Other (MKO), and the Zone of Proximal Development (ZPD).

Cognitive development stems from social interaction during guided learning within the zone of proximal development as children and their partners co-construct knowledge

(Wertsch & Tulviste, 1992). Students who attended the literacy camp engaged in social

interactions throughout the day. Students worked together in reading partnerships and

small groups. They took part in hands-on science experiments in small groups in which

they discussed their predictions and their findings.

Students also interacted with several teachers throughout the day who would be

considered the MKO in respect to reading and science. Each class had a classroom

teacher and an ENL teacher who worked together, often creating smaller student-to-teacher ratios in the class. Students also worked with a library media specialist as well

as a STEAM consultant teacher twice a week, both of whom lent their expertise by creating hands-on learning activities to solidify students' learning in science. Teachers worked with students in small groups, instructing them at their Zone of Proximal Development.

The Zone of Proximal Development refers to the skills that are too difficult for a

child to master on his/her own but can be done with guidance and encouragement.

Teachers met with students in small groups for guided reading in which teachers

worked at the students’ instructional level. The instructional level is in the Zone of

Proximal Development. The instructional reading level is the highest level at which

a reader is not independent but has adequate background knowledge for a topic and can access text quickly with few or no errors (Fountas & Pinnell, 1996).

Figure 6. Vygotsky's social theory model. [Figure: Vygotsky's three themes, Social Interaction, the More Knowledgeable Other (MKO), and the Zone of Proximal Development (ZPD), move a learner from a limited range of cognitive development toward full cognitive development.]

Calkins also believes that it is important for students to work in the Zone of Proximal Development (ZPD), which is one of Vygotsky's themes of constructivism. The ZPD is the

distance between a learner’s ability to perform a task under adult guidance and/or with

peer collaboration and their ability to solve the problem independently. This is why

Calkins promotes the importance of students reading at their independent reading level

and teachers teaching students at their instructional level, which is in the Zone of

Proximal Development. Reading workshop, including reading independently at the independent reading level, guided reading at the instructional level, reading aloud, social interaction, and student choice, is part of the balanced literacy approach utilized in the summer literacy camp, an approach that aligns with Vygotsky's three major themes of constructivism. In addition to the increase in reading scores on aimswebPlus for those

who attended the camp, the researcher believes that the students also benefitted from

the social interactions they had, the teachers who supported them, and being instructed at their instructional level, which falls within the Zone of Proximal Development.

In addition to the impact on reading scores for those who attended, the researcher

also believes that the camp had a positive impact on students for other reasons as well.

Students who attended the camp received meals, books to keep at home, and social

interaction that they would not have received had they not attended the camp. Further

research needs to be done in order to determine whether there are long-term effects for students who

attend a summer literacy camp intervention.



Figure 7. Conceptual framework for summer literacy camp intervention. [Diagram: students reading below level enter a summer literacy camp, funded by Title I and Title III money and centered on reading and STEM, leading to short- and long-term effects.]

Relationship to Prior Research

The data show a statistically significant difference indicating that students who attended the summer literacy camp benefitted more than those who did not. The aimswebPlus scores for those who attended the camp were higher in the fall than those of students who did not

attend. When looking at reading levels, students who attended the camp did not have an

increase in reading levels as shown by their benchmark assessments, but they did not

decrease either, showing that there was no summer learning loss. Summer learning loss

refers to the loss of knowledge and academic skills over summer months when students

are out of school, and is widely recognized as a pervasive and significant problem in

United States education (Zaromb et al., 2014). Students who attended the summer

literacy camp grew on their aimswebPlus reading scores and mostly remained on the

same reading level from spring to fall. The researcher believes that this shows that the

summer camp did have an impact and, if continued, can help to close the gap for

students who attend. Zaromb et al. (2014) found that the negative effects of summer break increase as students' grade levels rise, compounding the issue each year and never giving students the time they need to close their achievement gap. It is important for students to have opportunities like this early on in order to give them the support that they need to help close the achievement gap.

Summer learning loss varies with respect to grade level, subject matter, and

socioeconomic status (Alexander et al., 2007; Cooper et al., 1996). While the data show that there was no significant growth for economically disadvantaged students, they also show that there was no loss. Research on summer learning loss has provided

reliable evidence that the reading achievement of economically disadvantaged students

slides back a few months every summer (Allington et al., 2010). Students who

participated in the camp received books to take home and keep in order to give them

access to books over the summer. A small set of studies reports that simply supplying

poor students with books over the summer results in improved reading achievement

(Allington et al., 2010). The researcher believes that students who are economically disadvantaged benefitted from attending the camp by having the opportunity to read in school and by having access to books to take home to continue reading over the summer.

Students who attended the camp also received breakfast and lunch in school, which the

researcher believes had a positive impact on the students.

While the reading levels did not grow for ELLs over the summer, they did remain the same. aimswebPlus scores grew for ELLs, but the growth was not statistically significant. Kieffer (2010) found that children who entered kindergarten with limited proficiency in English continued to demonstrate reading achievement below that of their monolingual English-speaking peers through fifth grade. These findings suggest that students who enter school with limited

English proficiency or score low on early literacy measures never catch up. Research

shows that it is possible to predict in early childhood who is at risk for later reading

difficulties (Lesaux, 2012). This is why it is crucial to intervene as early as possible in

order to close the gap for ELLs. Attending the summer literacy camp allowed all of the

students to maintain the reading skills that they left school with in the spring. These

students did not have to spend the first two months of the next school year making up lost

ground. They were able to start the new school year where they left off, which over time

can help to close the gap, and at the very least prevent the gap from widening for these

students.

Limitations of the Study

Limitations of the study included the sample size and the length of the study.

Ultimately, to get a more robust dataset and be able to make better generalizations, this

study should be replicated in multiple schools and across several school districts.

Two possible threats to internal validity are history and instrumentation.

Differences among student experiences over the summer can influence reading levels.

Students who read at home over the summer are less likely to experience summer learning loss than students who do not. The quality of the

classroom experience that students had over the summer is also a variable as each pair of

teachers created their own lesson plans.

During the school year, each classroom teacher administers their own reading

benchmark assessment to obtain a student’s independent reading level. Although there

are specific protocols in place to administer these benchmark assessments, 27 different

testers leave room for error. It should also be noted that there are two different testers

from spring to fall as the students change grade levels and teachers. Furthermore,

teachers administer benchmark reading assessments earlier in the spring in order to give

students the opportunity to receive instruction and practice with their independent reading

level before summer break. Students are then given time to get back into their reading

routine once school starts again in September. Benchmark assessments are not given until

the end of September. This could account for the lack of movement in either direction for

the reading levels.

Recommendations for Future Practice

The impetus for this research was to examine the effectiveness of a summer

literacy camp intervention in order to help mitigate summer learning loss. In the future, other

school districts may want to consider realigning use of their Title I and Title III funds in

order to create a summer literacy camp for their students. The summer literacy camp was

effective for ELLs who benefitted from the additional support. The researcher believes

the camp also benefitted economically disadvantaged students as it provided books at

home and two meals a day.

While the intervention for this study focused on literacy, it could be beneficial to

add mathematics to a summer program, since research indicates that summer learning loss has a greater effect on mathematics. Students who attend a summer literacy camp can

benefit academically, but equally as important, they can benefit socially and emotionally

through social interactions, consistent meals, and books to read at home. School leaders

are urged to consider partnering with organizations like Long Island Harvest and KPMG

in order to help support their most vulnerable students. It is important for interventions

like this to start early in order to mitigate summer learning loss and help close the gap for

struggling students.

Recommendations for Future Research

The conclusions of this study can form a foundation for other studies that more

deeply examine interventions that mitigate summer learning loss. Future work should

include more schools across multiple school districts. This study should be replicated

with a larger sample that spans geographic areas including rural, urban, and suburban

school districts across the US that have an extended summer break.

In order to generalize findings to all elementary grade levels and students of

varying economic statuses, the intervention should expand to include mathematics and

this study should be replicated with analysis of mathematics scores. In addition, a

qualitative piece should be added to examine the relationship between attending a

summer literacy camp intervention and students' attitudes towards school. Research supports the notion that students who have a positive attitude toward school perform better academically. Attending a summer literacy camp intervention that promotes social interaction and provides steady meals and books to read at home should have a positive effect on how students feel about school.

Future researchers should examine the relationship between attending a summer

literacy camp and academic achievement over time in order to determine if summer

programs can help to close the achievement gap for struggling students. A longitudinal

study tracking students over time would give a more accurate picture of overall gains and

losses than a one-time analysis of student data. Students who participate in a summer

literacy camp should be tracked using the same assessments given on the same dates in

the spring and fall from kindergarten through fifth grade.

Lastly, future researchers should examine the effect the summer literacy camp had

on academic vocabulary, specifically in the area of science. This summer literacy camp

incorporated the NGSS because they elevate expectations for students’ language and

literacy development across the content areas and raise the bar linguistically and

academically for all learners, especially ELLs. The NGSS were also used because of the

explicitness of crosscutting concepts that connect interrelated ideas across science

disciplines, allowing students to make connections among science ideas. The topics were

carefully chosen to help build students’ prior knowledge in order to better prepare them

for the upcoming school year. Future research should examine the impact an intervention

like this has on students’ achievement in science.

Conclusion

Districts need to provide for their struggling students over the summer to help

mitigate summer learning loss. The Title I school in which the summer literacy camp

took place recognized a need to make changes to its summer program in order to mitigate

summer learning loss and close the gap for some of its most vulnerable students. Title I

and Title III monies were reallocated and a new summer literacy camp was implemented

in the summer of 2019. The summer literacy camp paired classroom teachers with ENL teachers to create small student-to-teacher ratios. There was a STEAM

theme each week and a STEAM coordinator and library media specialist who worked

together to create meaningful and engaging hands-on science experiments that would

prepare students with the academic vocabulary needed for upcoming science units in their

September grade level. A social worker was also on-site to continue the support that

many of the students received during the school year. Students who participated received

breakfast and lunch each day as well as books to take home to help build a home library

through generous donations from organizations with which the school partnered. Students

worked together and played together building relationships with each other and their

teachers. The researcher believes that students who attended the summer literacy camp

benefitted in even more ways than what was shown through the data.

Since the data that were analyzed did not yield all of the expected results, a closer examination of the data points and the reading benchmark assessments added information that can help shape future studies to better determine the impact the camp had. In future studies, it would be more helpful to use only one

benchmark reading assessment instead of the two different ones that were used for the

different reading levels. The researcher recommends using the Fountas and Pinnell

benchmark reading assessments for all reading levels instead of using the Complete

Comprehension Benchmark Assessment for levels K and above. Giving the assessment at

the end of June and again at the beginning of September would give the researcher more

accurate data to look at in conjunction with the aimswebPlus data. It also might be even

more effective to have the benchmark administered by the same tester, preferably a trained reading

specialist. These steps could provide more reliable data points in order to truly assess the

impact the intervention had.

The data that were analyzed showed that the summer literacy camp intervention

did not impact the reading levels of those who attended. Although the levels did not show

growth, they also did not show loss. Most students remained on the same reading level in the fall as in the spring, showing that there was no summer learning loss.

The researcher believes that if the benchmarking were done differently, the results might look different and show a greater impact. The data show that aimswebPlus scores grew

significantly for all who attended the summer literacy camp. Furthermore, the data

showed that ELLs who attended the camp grew more than ELLs who did not.

The researcher also believes that the students further benefitted from the books

and meals that they received. Many students suffer from food insecurity over the

summer. Students who attended the camp ate two meals and many brought leftovers

home. Students who attended the camp received about 10 books altogether that were

theirs to keep at the end of the program, giving them access to books at home. Finally,

students benefitted from the support they received from their teachers, the social

interactions that took place, and the academic support that they received in science to

help prepare them for the next school year.

The summer literacy camp became virtual in the summer of 2020. The researcher is hopeful that it will take place in 2021 so that the program can continue to be studied and refined. Going through

the process of this study validated the researcher’s belief in the importance of a summer

literacy camp intervention, especially for ELLs. Research must continue to identify

causes and possible solutions to the achievement gaps seen in our most vulnerable students. School and district leaders must continue to look at the funds they are given for these students and initiate systemic reform practices in order to help close these gaps for good.

APPENDIX A

Institutional Review Board Approval Memo



REFERENCES

Alexander, K. L., Entwisle, D. R., & Olson, L. S. (2007). Lasting consequences of the

summer learning gap. American Sociological Review, 72, 167-180.

Allington, R. L. (2012). What really matters for struggling readers: Designing research-

based programs (3rd ed.). Boston: Allyn and Bacon.

Allington, R. L., McCuiston, K., & Billen, M. (2014). What research says about text

complexity and learning to read. The Reading Teacher, 1-10.

Allington, R. L., McGill-Franzen, A., Camilli, G., Williams, L., Graff, J., Zeig, J., . . .

Nowak, R. (2010). Addressing summer reading setback among economically

disadvantaged elementary students. Reading Psychology, 31, 411-427.

Anderson, R. C., Wilson, P. T., & Fielding, L. G. (1988). Growth in reading and how

children spend their time outside of school. Reading Research Quarterly, 23, 285-

303.

Armbruster, B. B., Lehr, M. A., & Osborn, M. (2008). The research building blocks for

teaching children to read: Putting reading first. Jessup, MD: National Institute

for Literacy.

Beach, K. D., Philippakos, Z. A., McIntyre, E., Mraz, M., Pilonieta, P., & Vintinner, J. P.

(2018). Effects of a summer reading intervention on reading skills for low-income

Black and Hispanic students in elementary school. Reading & Writing Quarterly,

34(3), 263-280.

Beck, I., McKeown, M., & Kucan, L. (2002). Bringing words to life: Robust vocabulary

instruction. New York: Guilford Press.



Borman, G. D., Benson, J., & Overman, L. T. (2005). Families, schools, and summer

learning. The Elementary School Journal, 106(2), 131-149.

Bowers, L. M., & Schwarz, I. (2018). Preventing summer learning loss: results of a

summer literacy program for students from low-SES homes. Reading & Writing

Quarterly, 34(2), 99-116.

Brozo, W., & Sutton Flynt, E. (2008). Motivating students to read in the content

classroom: Six evidence-based principles. The Reading Teacher, 62(2), 172-174.

Bull, B. L. (2008). Social justice in education: An introduction. New York: Palgrave

Macmillan.

Calkins, L. (1991). Living between the lines. Portsmouth, NH: Heinemann.

Calkins, L. (1999). Let the words work their magic. Instructor, 25-30.

Castles, A., Rastle, K., & Nation, K. (2018). Ending the reading wars: reading acquisition

from novice to expert. Psychological Science in the Public Interest, 19(1), 5-51.

Catalano, R. F., Haggerty, K. P., Oesterle, S., Fleming, C. B., & Hawkins, D. (2004). The

importance of bonding to school for healthy development: findings from the

social development research group. Journal of School Health, 74(7), 252-261.

Ceci, S. J., & Papierno, P. B. (2005). The rhetoric and reality of gap closing. American

Psychologist, 60, 149-160.

Chaudry, A., Wimer, C., Macartney, S., Frohlich, L., Campbell, C., Swenson, K., . . .

Hauan, S. (2016). Poverty in the United States: 50-year trends and safety net

impacts. Washington, DC: Office of Human Services Policy.



Cheuk, T. (2016). Discourse practices in the new standards. Electronic Journal of Science

Education, 20(3), 92-107.

Cipielewski, J., & Stanovich, K. E. (1992). Predicting growth in reading ability from

children's exposure to print. Journal of Experimental Child Psychology, 54, 74-

89.

Coleman, J. (1966). Equality of educational opportunity. Washington, DC: US

Department of Health, Education, and Welfare, Office of Education.

Cooper, H., Nye, B., Charlton, K., & Greathouse, S. (1996). The effects of summer

vacation on achievement test scores: A narrative and meta-analytic review.

Review of Educational Research, 66(3), 227-268.

Cooper, H., Valentine, J. C., Charlton, K., & Melson, A. (2003). The effects of modified

school calendars on student achievement and on school and community attitudes.

Review of Educational Research, 73(1), 1-52.

Creswell, J. (2015). Educational research: Planning, conducting, and evaluating

quantitative and qualitative research (5th ed.). Upper Saddle River, NJ: Pearson

Education, Inc.

Cunningham, A. E., & Stanovich, K. E. (1998). What reading does for the mind.

American Educator, 1-8.

Davies, B., & Kerry, T. (1999). Improving student learning through calendar change.

School Leadership & Management, 19(3), 359-371.

De Laet, S., Colpin, H., Vervoort, E., Doumen, S., Van Leeuwen, K., Goossens, L., &

Verchueren, K. (2015). Developmental trajectories of children's behavioral



engagement in late elementary school: both teachers and peers matter.

Developmental Psychology, 51(9), 1292-1306.

DeLuca, E. (2010). Unlocking academic vocabulary. The Science Teacher, 27-32.

Desimone, L. (1999). Linking parent involvement with student achievement: Do race and income matter? Journal of Educational Research, 11-30.

Downey, D. B., vonHippel, P. T., & Broh, B. A. (2004). Are schools the great equalizer?

Cognitive inequality during the summer months and the school year. American

Sociological Review, 69, 613-635.

Dzaldov, B. S., & Peterson, S. (2005). Book leveling and readers. The Reading Teacher,

59(3), 222-229.

Entwisle, D. R., Alexander, K. L., & Olson, L. S. (2001). Keep the faucet flowing:

Summer learning and home environment. American Educator, 25(3), 11-15.

Field, A. (2005). Discovering statistics using SPSS. Sage Publications.

Fountas, I. C., & Pinnell, G. S. (1996). Guided reading: Good first teaching for all

children. Portsmouth, NH: Heinemann.

Frazier, J. (1998). Extended year schooling and achievement: Can 30 days make a

difference? Houston, Texas: paper presented to the National Association for Year

Round Educational Conference.

Friedman, H. H., Hampton-Sosa, W., & Friedman, L. W. (2014). Solving the mega-crisis

in education: Concrete, cost-effective solutions. Journal of Educational

Technology, 10(4), 6-17.



Gambrell, L. B. (2011). Seven rules of engagement: What's most important to know

about motivation to read. Reading Teacher, 172-178.

Gambrell, L. B., & Morrow, L. M. (2019). Best practices in literacy instruction (6th ed.).

New York: The Guilford Press.

Gambrell, L., Mandel Morrow, L., & Pressley, M. (2003). Best practices in literacy

instruction (2nd ed.). New York: Guilford Publications.

Girden, E. (1992). ANOVA: Repeated measures. Newbury Park, CA: Sage.

Glines, D. (1998). Year round education: Creating a philosophical rationale-present to

2000. [Paper Presentation]. National Association for Year Round Education,

Houston, Texas. 14-18.

Guthrie, J. T., Schafer, W. D., & Huang, C. (2001). Benefits of opportunity to read and

balanced instruction on the NAEP. The American Journal of Educational

Research, 94(3), 145-162.

Hayes, D. P., & Grether, J. (1983). The school year and vacations: When do students

learn? Cornell Journal of Social Relations, 17(1), 56-71.

Helman, L. A., & Burns, M. K. (2008). What does oral language have to do with it?

Helping young English-language learners acquire a sight word vocabulary. The

Reading Teacher, 62(1), 14-19.

Heyns, B. (1978). Summer learning and the effect of schooling. New York, NY:

Academic Press.

Hinton, P., Brownlow, C., & McMurray, I. (2004). SPSS Explained. Routledge.

Januszyk, R., Miller, E. C., & Lee, O. (2016). Addressing student diversity and equity:

The Next Generation Science Standards are leading a new wave of reform. The

Science Teacher, 47-50.

Kieffer, M. J. (2010). Socioeconomic status, English proficiency, and late-emerging

reading difficulties. Educational Researcher, 484-486.

Kim, J. S., & Quinn, D. M. (2014). The effects of summer reading on low-income

children's literacy achievement from kindergarten to grade 8: a meta-analysis of

classroom and home interventions. Review of Educational Research, 83(3), 386-431.

Kirkland, L. D., Camp, D., & Manning, M. (2008/2009, Winter). Changing the face of

summer programs. Childhood Education, 82(3), 96-101.

Lesaux, N. K. (2012). Reading and reading instruction for children from low-income and

non-English speaking households. The Future of Children, 22(2), 73-88.

Manyak, P. C., & Bauer, E. B. (2009). English vocabulary instruction for English

learners. The Reading Teacher, 63(2), 174-176.

McLeod, S. (2018, August 5). Lev Vygotsky's Sociocultural Theory. Retrieved from

Simply Psychology: https://www.simplypsychology.org/vygotsky.html

McMillen, B. J. (2001). A statewide evaluation of academic achievement in year-round

schools. The Journal of Educational Research, 95(2), 67-74.

McMullen, S. C., & Rouse, K. E. (2012). The impact of year-round schooling on

academic achievement: evidence from mandatory school calendar conversions.

American Economic Journal: Economic Policy, 4(4), 230-252.



Miller, D. (2001). Principles of social justice. Cambridge, Massachusetts: Harvard

University Press.

Miller, E., Lauffer, H. B., & Messina, P. (2014). NGSS for English language learners:

From theory to planning to practice. Science and Children, 55-59.

Moats, L. (2007, January). Whole-language high jinks: How to tell when "scientifically-

based reading instruction" isn't. Washington: Thomas B. Fordham Institute.

Retrieved from ERIC.

Mraz, M., & Rasinski, T. V. (2007, May). Summer reading loss. International Reading

Association, 60(8), 784-789.

National Education Commission. (1994). Prisoners of time. Washington: National

Education Commission.

New York State Education Department. (2019). TITLE III, Part A: English Language

Learners and Immigrant Students. Retrieved from

http://www.nysed.gov/common/nysed/files/2017-18-title-iii-allowable-unallowable-.pdf

Lee, O., Miller, E. C., & Januszyk, R. (2014). Next generation science standards: All standards, all students. Journal of Science Teacher Education, 25, 223-233.

Rasinski, T., Blachowicz, C., & Lems, K. (2012). Fluency instruction: Research-based

best practices. New York: The Guilford Press.

Rawls, J. (1999). A theory of justice: Revised edition. Cambridge, Massachusetts: The

Belknap Press of Harvard University Press.

Rawls, J. (2001). Justice as fairness: A restatement. Cambridge, Massachusetts: The Belknap Press of Harvard University Press.



Slavin, R. E., & Cheung, A. (2005). A synthesis of research on language of reading

instruction for English language learners. Review of Educational Research, 75(2),

247-284.

Smith, D. (2006). On the shoulders of giants: Leaders in the field of literacy as teacher

advocates. Language Arts Journal of Michigan, 22(1), 63-68.

U.S. Department of Education. (2019). Title I - Improving the Academic Achievement of

the Disadvantaged. Retrieved from

https://www2.ed.gov/policy/elsec/leg/esea02/pg1.html

Vygotsky, L. (1978). Mind in society: The development of higher psychological

processes. Cambridge, Massachusetts: Harvard University Press.

Wertsch, J. V., & Tulviste, P. (1992). LS Vygotsky and contemporary developmental

psychology. Developmental Psychology, 28, 548-557.

White, T. G., Kim, J. S., Kingston, H. C., & Foster, L. (2014). Replicating the effects of a

teacher-scaffolded voluntary summer reading program: The role of poverty.

Reading Research Quarterly, 49(1), 5-30.

Winters, W. (1995). A review of recent studies related to the achievement of students

enrolled in year round education programs. San Diego, CA: National

Association for Year Round Education.

Zaromb, F., Adler, R. M., Bruce, K., Attali, Y., & Rock, J. (2014). Using no-stakes

educational testing to mitigate summer learning loss: a pilot study. Princeton, NJ:

Educational Testing Service.


Vita

Name: Patricia O'Regan

Baccalaureate Degree: Bachelor of Arts, Molloy College, Rockville Centre, New York; Major: Math / Elementary Education

Date Graduated: May 1996

Other Degrees and Certificates: Master of Science, LIU / CW Post, Brookville, New York; Major: Reading; Professional Certificate in School District and Building Administration, Stony Brook University, Stony Brook, New York (2003)

Date Graduated: December 2000
