Program Performance Report
IDEA: Special Education Grants for Infants and Families
IDEA: Special Education Grants to States
IDEA: Special Education Parent Information Centers
IDEA: Special Education Personnel Preparation
IDEA: Special Education Preschool Grants
IDEA: Special Education State Personnel Development Grants
IDEA: Special Education Technical Assistance and Dissemination
IDEA: Special Education Technology and Media Services
MVHAA: Education for Homeless Children and Youths
VTEA: Tech-Prep Education State Grants
VTEA: Vocational Education National Programs
VTEA: Vocational Education State Grants
GOAL 3
ESEA: Alcohol Abuse Reduction
ESEA: Character Education
ESEA: Close-Up Fellowships
ESEA: Elementary and Secondary School Counseling
ESEA: Exchanges with Historic Whaling and Trading Partners
ESEA: Mentoring Program
ESEA: Physical Education Program
ESEA: Safe and Drug-Free Schools and Communities Other National Programs
ESEA: Safe and Drug-Free Schools and Communities State Grants
GOAL 4
ESEA: Indian Education National Activities
ESRA: Research in Special Education
ESRA: Research, Development and Dissemination
ESRA: Statistics
RA: National Institute on Disability and Rehabilitation Research
GOAL 5
AEFLA: Adult Basic and Literacy State Grants
AEFLA: Adult Education National Leadership Activities
AEFLA: National Institute for Literacy
ATA: Assistive Technology Alternative Financing
ATA: Assistive Technology Programs
EDA: Gallaudet University
EDA: National Technical Institute for the Deaf
HEA: AID Developing Hispanic-Serving Institutions
HEA: AID Minority Science and Engineering Improvement
HEA: AID Strengthening Alaska Native and Native Hawaiian Serving Institutions
HEA: AID Strengthening Historically Black Colleges and Universities
HEA: AID Strengthening Historically Black Graduate Institutions
HEA: AID Strengthening Institutions
HEA: AID Strengthening Tribally Controlled Colleges and Universities
HEA: B.J. Stupak Olympic Scholarships
HEA: Byrd Honors Scholarships
HEA: College Assistance Migrant Program
HEA: Demonstration Projects to Ensure Quality Higher Education for Students with Disabilities
HEA: Fund for the Improvement of Postsecondary Education
HEA: Gaining Early Awareness and Readiness for Undergraduate Programs (GEAR-UP)
HEA: Graduate Assistance in Areas of National Need (GAANN)
HEA: International Education and Foreign Language Studies Domestic Programs
HEA: International Education and Foreign Language Studies Institute for International Public Policy
HEA: Javits Fellowships
HEA: SFA Federal Pell Grants
HEA: Student Aid Administration
HEA: TRIO Educational Opportunity Centers
HEA: TRIO McNair Postbaccalaureate Achievement
HEA: TRIO Student Support Services
HEA: TRIO Talent Search
HEA: TRIO Upward Bound
HEA: Underground Railroad Program
HKNCA: Helen Keller National Center for Deaf-Blind Youths and Adults
MECEA: International Education and Foreign Language Studies Overseas Programs
RA: Client Assistance State Grants
RA: Independent Living Centers
RA: Independent Living Services for Older Blind Individuals
RA: Independent Living State Grants
RA: Migrant and Seasonal Farmworkers
RA: Projects with Industry
RA: Protection and Advocacy of Individual Rights
RA: Supported Employment State Grants
RA: Vocational Rehabilitation Demonstration and Training Programs
RA: Vocational Rehabilitation Grants for Indians
RA: Vocational Rehabilitation Recreational Programs
RA: Vocational Rehabilitation State Grants
RA: Vocational Rehabilitation Training
USC: Howard University
VTEA: Tribally Controlled Postsecondary Vocational and Technical Institutions
GOAL 6
DEOA: Office for Civil Rights
DEOA: Office of Inspector General
ONGOING PROGRAMS WITHOUT FY 2006 PLANS
ESEA: Indian Education Grants to Local Educational Agencies
ESRA: National Assessment
HEA: Child Care Access Means Parents in School
HEA: SFA Federal Direct Student Loans
HEA: SFA Federal Family Education Loan Program & Liquidating
HEA: SFA Federal Perkins Loans
HEA: SFA Federal Supplemental Educational Opportunity Grants
HEA: SFA Federal Work-Study
HEA: SFA Leveraging Educational Assistance Partnerships
GOAL 2
APEB: American Printing House for the Blind
FY 2006 Program Performance Report
Strategic Goal 2
Direct Appropriation
APEB, Title 20, Section 101 et seq.
Document Year 2006 Appropriation: $17,572
Measure 1.2 of 10: The percentage of APH advisory committee members who agree that APH's
educational materials are appropriate, timely, and high quality and allow blind students to benefit
more fully from their educational programs. (Desired direction: increase)
Year  Target  Actual (or date expected)  Status
1999 100 100 Target Met
Measure 1.3 of 10: The percentage of consumers who agree that APH's educational materials
are appropriate, timely, and high quality and allow blind students to benefit more fully from their
educational programs. (Desired direction: increase)
Year  Target  Actual (or date expected)  Status
1999 90 Measure not in place
2000 95 100 Target Exceeded
2001 95 97 Target Exceeded
2002 95 96 Target Exceeded
2003 95 100 Target Exceeded
2004 95 99 Target Exceeded
2005 95 96 Target Exceeded
2006 96 98 Target Exceeded
2007 96 (October 2007) Pending
2008 96 (October 2008) Pending
Source. American Printing House for the Blind, survey of consumers.
Frequency of Data Collection. Annual
Explanation. The survey instrument used by APH was constructed with the input of an external research
firm and was designed to measure the levels of customer/consumer satisfaction with each of the factors.
The survey was available on the APH Web site. This makes it easily available for response by individuals
who are not on a specific mailing list, but who are encouraged to respond through invitations on listservs
and in various newsletters and announcements. The web-based format also provides accessibility to
visually impaired individuals who require alternate media.
Measure 1.4 of 10: The percentage of teachers who agree that APH's educational materials are
appropriate, timely, and high quality and allow blind students to benefit more fully from their
educational programs. (Desired direction: increase)
Measure 1.5 of 10: The percentage of APH trustees who agree that the performance of
students and their participation in educational programs improves as a result of the availability of
educational materials provided by APH. (Desired direction: increase)
Year  Target  Actual (or date expected)  Status
1998 98 Measure not in place
1999 98 98 Target Met
2000 99 97 Did Not Meet Target
2001 99 97 Did Not Meet Target
2002 99 100 Target Exceeded
2003 99 99.5 Target Exceeded
2004 99 100 Target Exceeded
2005 99 99.5 Target Exceeded
2006 99 99 Target Met
2007 99 (October 2007) Pending
2008 99 (October 2008) Pending
Source. American Printing House for the Blind, survey of ex officio trustees.
Frequency of Data Collection. Annual
Explanation. The survey instrument used by APH was constructed with the input of an external research
firm and was designed to measure the levels of customer/consumer satisfaction with each of the factors.
The survey was available on the APH Web site. This makes it easily available for response by individuals
who are not on a specific mailing list, but who are encouraged to respond through invitations on listservs
and in various newsletters and announcements. The web-based format also provides accessibility to
visually impaired individuals who require alternate media.
Measure 1.6 of 10: The percentage of teachers (of students who are visually impaired) who
agree that the performance of students and their participation in educational programs improves
as a result of the availability of educational materials provided by APH. (Desired direction: increase)
Measure 1.7 of 10: The percentage of students who attain identified concepts or skills during
the field testing of American Printing House for the Blind early childhood products. (Desired
direction: increase)
Year  Target  Actual (or date expected)  Status
2005 Set a Baseline Not Collected Not Collected
2006 Maintain a Baseline Not Collected Not Collected
Source. American Printing House for the Blind, records on testing of new products.
Frequency of Data Collection. Annual
Explanation. The indicator is being deleted.
Measure 1.8 of 10: The percentage of students who attain identified concepts or skills during
the field testing of American Printing House for the Blind low vision products. (Desired direction:
increase)
Year  Target  Actual (or date expected)  Status
2005 Set a Baseline Not Collected Not Collected
2006 Maintain a Baseline Not Collected Not Collected
Source. American Printing House for the Blind, records on testing of new products.
Frequency of Data Collection. Annual
Explanation. The indicator is being deleted.
Measure 1.9 of 10: The percentage of students who attain identified concepts or skills during
the field testing of American Printing House for the Blind products for those with multiple
disabilities. (Desired direction: increase)
Year  Target  Actual (or date expected)  Status
2005 Set a Baseline Not Collected Not Collected
2006 Maintain a Baseline Not Collected Not Collected
Source. American Printing House for the Blind, records on testing of new products.
Frequency of Data Collection. Annual
Explanation. The indicator is being deleted.
Measure 1.10 of 10: The percentage of students who attain identified concepts or skills during
the field testing of American Printing House for the Blind tactile graphics products. (Desired
direction: increase)
Year  Target  Actual (or date expected)  Status
2005 Set a Baseline Not Collected Not Collected
2006 Maintain a Baseline Not Collected Not Collected
Source. American Printing House for the Blind, records on testing of new products.
Frequency of Data Collection. Annual
Explanation. The indicator is being deleted.
Measure 2.2 of 3: The percentage of new American Printing House for the Blind products
judged to be of high relevance and utility for the target audience. (Desired direction: increase)
Year  Target  Actual (or date expected)  Status
2006 Set a Baseline Not Collected Not Collected
Measure 2.3 of 3: The absolute value of the difference between the percentage of APHB
products sold that are new products and 15 percent. (Desired direction: decrease)
Year  Target  Actual (or date expected)  Status
2001 4.3 Measure not in place
2002 3.2 Measure not in place
2003 8.5 Measure not in place
2004 3.3 Measure not in place
2005 0.4 Measure not in place
2006 3 (November 2006) Pending
2007 3 (October 2007) Pending
2008 3 (October 2008) Pending
Source. U.S. Department of Education, American Printing House for the Blind, annual report.
Frequency of Data Collection. Annual
Explanation. The intent of this measure is to keep the percentage of APHB products sold that are new
products between 12 percent and 18 percent; a percentage below 12 percent or above 18 percent is
considered not meeting the target. The percentage is calculated as the number of new APHB product
sales divided by total product sales. To align with the automated VPS, the measure is phrased as the
absolute value of how far the actual percentage is above or below the ideal value of 15 percent.
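As an illustrative sketch of this calculation (the function and variable names below are not from the report), the measure can be computed from sales counts as follows:

```python
def new_product_deviation(new_product_sales, total_product_sales, ideal_pct=15.0):
    """Absolute deviation of the new-product share of sales from the ideal share.

    The share is new-product sales divided by total product sales, expressed
    as a percentage; the measure is how far that share falls above or below
    the 15 percent ideal, so a smaller value is better.
    """
    share = 100.0 * new_product_sales / total_product_sales
    return abs(share - ideal_pct)

# Shares of 12 percent and 18 percent both yield a deviation of 3, the
# edges of the acceptable 12-18 percent range described above.
print(new_product_deviation(12, 100))    # 3.0
print(new_product_deviation(18, 100))    # 3.0
print(new_product_deviation(150, 1000))  # 0.0 (exactly the 15 percent ideal)
```

Under this phrasing, a reported target of 3 corresponds to keeping the new-product share within the 12-18 percent band.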
Program Goal: To support access and equity in public schools and help school
districts solve equity problems in education related to race, sex,
and national origin.
Objective 1 of 1: Provide high-quality technical assistance and training to public school
districts in addressing equity in education.
Measure 1.1 of 3: The percentage of customers of Equity Assistance Centers that develop,
implement, or improve their policies and practices in eliminating, reducing, or preventing
harassment, conflict, and school violence. (Desired direction: increase)
Year  Target  Actual (or date expected)  Status
2005 Set a Baseline Not Collected Not Collected
2006 Set a Baseline 66 Target Met
2007 67 (July 2007) Pending
2008 68 (July 2008) Pending
2009 69 (July 2009) Pending
Source. U.S. Department of Education, Office of Elementary and Secondary Education, Training and
Advisory Services, Customer Satisfaction Survey.
Frequency of Data Collection. Annual
Data Quality. Only 112 (48%) of all 233 Equity Assistance Center customers responded to the 2006
survey. The low response rate may be because this was an initial start-up year for these Equity
Assistance Centers, and the survey addressed technical assistance provided only during the first eight
months (September 2005-April 2006).
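As a quick check, the 48 percent response rate follows directly from the counts above:

```python
respondents = 112  # customers who answered the 2006 survey
customers = 233    # all Equity Assistance Center customers surveyed
response_rate = round(100 * respondents / customers)  # percentage, rounded
print(response_rate)  # 48
```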
Explanation. As a result of Equity Assistance Center (EAC) technical assistance, 64 percent of
respondents reported that they had developed or revised, or were in the process of developing or
revising, their policy on sexual and ethnic harassment, and 65 percent reported that they had
implemented, or were implementing, that policy. Likewise, 68 percent reported developing or revising,
and 66 percent implementing, their policy on safe schools and school climate.
Measure 1.2 of 3: The percentage of customers of Equity Assistance Centers that develop,
implement, or improve their policies and practices ensuring that students of different race, sex,
and national origin have equitable opportunity for high-quality instruction. (Desired direction:
increase)
Year  Target  Actual (or date expected)  Status
2005 Set a Baseline Not Collected Not Collected
2006 Set a Baseline 71 Target Met
Measure 1.3 of 3: The percentage of customers who report that the products and services they
received from the Equity Assistance Centers are of high usefulness to their policies and
practices. (Desired direction: increase)
Year  Target  Actual (or date expected)  Status
2005 Set a Baseline Not Collected Not Collected
2006 Set a Baseline 85 Target Met
2007 86 (July 2007) Pending
2008 87 (July 2008) Pending
2009 88 (July 2009) Pending
Source. U.S. Department of Education, Office of Elementary and Secondary Education, Training and
Advisory Services, Customer Satisfaction Survey.
Frequency of Data Collection. Annual
Data Quality. Only 112 (48%) of all 233 Equity Assistance Center customers responded to the 2006 survey.
The low response rate may be because this was an initial start-up year for these Equity Assistance
Centers, and the survey addressed technical assistance provided only during the first eight months
(September 2005-April 2006).
Explanation. Eighty-five percent of respondents reported that the products and services received
from the EACs from September 2005 to April 2006 were "always useful," compared with 15 percent who
reported that the products and services were "sometimes useful"; none reported that the products
and services were "never useful."
Source. U.S. Department of Education, OESE, 21st Century Community Learning Centers, grantee
performance reports.
Frequency of Data Collection. Annual
Data Quality. Data is supplied by grantees and partially validated by States. 2005 data reported for
7,834 centers; 2004 data reported for 3,539 centers.
Explanation.
Measure 1.2 of 14: The percentage of middle or high school 21st Century regular program
participants whose mathematics grades improved from fall to spring. (Desired direction:
increase)
Year  Target  Actual (or date expected)  Status
2000 36 Measure not in place
2001 37 Measure not in place
2002 45 37 Did Not Meet Target
2003 45 36 Did Not Meet Target
2004 45 38 Made Progress From Prior Year
2005 45 36.78 Did Not Meet Target
2006 46 (December 2006) Pending
2007 47 (October 2007) Pending
2008 47.5 Pending
2009 48 Pending
2010 48.5 Pending
Source. U.S. Department of Education, OESE, 21st Century Community Learning Centers, grantee
performance reports.
Frequency of Data Collection. Annual
Data Quality. Data is supplied by grantees and partially validated by States. 2005 data reported for
7,834 centers.
Explanation. The shortfall against the targets is being addressed through the 21st CCLC Technical
Support Contract, which is developing toolkits and other resources to help grantees provide
high-quality academic enrichment programming.
Measure 1.3 of 14: The percentage of all 21st Century regular program participants whose
mathematics grades improved from fall to spring. (Desired direction: increase)
Year  Target  Actual (or date expected)  Status
2000 39 Measure not in place
2001 40 Measure not in place
2002 45 39 Did Not Meet Target
2003 45 40 Made Progress From Prior Year
2004 45 41 Made Progress From Prior Year
2005 45 38.82 Did Not Meet Target
Source. U.S. Department of Education, OESE, 21st Century Community Learning Centers, grantee
performance reports.
Frequency of Data Collection. Annual
Data Quality. Data is supplied by grantees and partially validated by States. 2005 data reported for
7,834 centers; 2004 data reported for 3,539 centers.
Explanation. The shortfall against the targets is being addressed through the 21st CCLC Technical
Support Contract, which is developing toolkits and other resources to help grantees provide
high-quality academic enrichment programming.
Measure 1.4 of 14: The percentage of elementary 21st Century regular program participants
whose English grades improved from fall to spring. (Desired direction: increase)
Year  Target  Actual (or date expected)  Status
2000 45 Measure not in place
2001 46 Measure not in place
2002 45 44 Did Not Meet Target
2003 45 45 Target Met
2004 45 47 Target Exceeded
2005 45 42.18 Did Not Meet Target
2006 46 (December 2006) Pending
2007 47 (October 2007) Pending
2008 47.5 Pending
2009 48 Pending
2010 48.5 Pending
Source. U.S. Department of Education, OESE, 21st Century Community Learning Centers, grantee
performance reports.
Frequency of Data Collection. Annual
Data Quality. Data is supplied by grantees and partially validated by States. 2005 data reported for
7,834 centers; 2004 data reported for 3,539 centers.
Explanation. The shortfall against the targets is being addressed through the 21st CCLC Technical
Support Contract, which is developing toolkits and other resources to help grantees provide
high-quality academic enrichment programming.
Measure 1.5 of 14: The percentage of middle or high school 21st Century regular program
participants whose English grades improved from fall to spring. (Desired direction: increase)
Year  Target  Actual (or date expected)  Status
2000 37 Measure not in place
2001 39 Measure not in place
2002 45 39 Did Not Meet Target
2003 45 37 Did Not Meet Target
2004 45 41 Made Progress From Prior Year
2005 45 39.79 Did Not Meet Target
2006 46 (December 2006) Pending
2007 47 (October 2007) Pending
2008 47.5 Pending
2009 48 Pending
2010 48.5 Pending
Source. U.S. Department of Education, OESE, 21st Century Community Learning Centers, grantee
performance reports.
Frequency of Data Collection. Annual
Data Quality. Data is supplied by grantees and partially validated by States. 2005 data reported for
7,834 centers; 2004 data reported for 3,539 centers.
Explanation. The shortfall against the targets is being addressed through the 21st CCLC Technical
Support Contract, which is developing toolkits and other resources to help grantees provide
high-quality academic enrichment programming.
Measure 1.6 of 14: The percentage of all 21st Century regular program participants whose
English grades improved from fall to spring. (Desired direction: increase)
Year  Target  Actual (or date expected)  Status
2000 41 Measure not in place
2001 43 Measure not in place
2002 45 42 Did Not Meet Target
Source. U.S. Department of Education, OESE, 21st Century Community Learning Centers, grantee
performance reports.
Frequency of Data Collection. Annual
Data Quality. Data is supplied by grantees and partially validated by States. 2005 data reported for
7,834 centers; 2004 data reported for 3,539 centers.
Explanation. The shortfall against the targets is being addressed through the 21st CCLC Technical
Support Contract, which is developing toolkits and other resources to help grantees provide
high-quality academic enrichment programming.
Measure 1.7 of 14: The percentage of elementary 21st Century regular program participants
who improve from not proficient to proficient or above in reading on state assessments.
(Desired direction: increase)
Year  Target  Actual (or date expected)  Status
2005 30.72 Measure not in place
2006 BL+1% (December 2006) Pending
2007 BL+2% (October 2007) Pending
2008 BL+3% (October 2008) Pending
Source. U.S. Department of Education, OESE, 21st Century Community Learning Centers, grantee
performance reports.
Frequency of Data Collection. Annual
Data Quality. Data supplied by grantees.
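The "BL+n%" targets above are expressed relative to a baseline (BL). As a sketch, assuming "BL+1%" means one percentage point above the 2005 baseline of 30.72 (the report does not spell out the convention), the targets would work out as:

```python
baseline = 30.72  # 2005 actual, taken as the baseline (BL)
offsets = {2006: 1, 2007: 2, 2008: 3}  # percentage points above the baseline
targets = {year: round(baseline + pts, 2) for year, pts in offsets.items()}
print(targets)  # {2006: 31.72, 2007: 32.72, 2008: 33.72}
```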
Measure 1.8 of 14: The percentage of middle/high school 21st Century regular program
participants who improve from not proficient to proficient or above in mathematics on state
assessments. (Desired direction: increase)
Year  Target  Actual (or date expected)  Status
2005 27.2 Measure not in place
Measure 1.9 of 14: The percentage of elementary 21st Century regular program participants
with teacher-reported improvement in homework completion and class participation. (Desired
direction: increase)
Year  Target  Actual (or date expected)  Status
2004 66.71 Measure not in place
2005 75.8 Measure not in place
2006 70 (December 2006) Pending
2007 75 (September 2007) Pending
2008 77 (October 2008) Pending
Source. U.S. Department of Education, OESE, 21st Century Community Learning Centers, grantee
performance reports.
Frequency of Data Collection. Annual
Data Quality. Data supplied by grantees.
Explanation. For 2006 we are considering this a new measure because this program is no longer a
Federal discretionary program but is now administered by states. As a result, a different data
collection instrument is now being used. These two changes mean that the data collected before 2004
are not comparable with data for 2004 and beyond.
Measure 1.10 of 14: The percentage of middle and high school 21st Century program
participants with teacher-reported improvement in homework completion and class participation.
(Desired direction: increase)
Year  Target  Actual (or date expected)  Status
2004 70 Measure not in place
2005 72.02 Measure not in place
2006 70 (December 2006) Pending
2007 75 (October 2007) Pending
2008 77 (October 2008) Pending
Source. U.S. Department of Education, OESE, 21st Century Community Learning Centers, grantee
performance reports.
Frequency of Data Collection. Annual
Data Quality. Data supplied by grantees.
Explanation. For 2006 we are considering this a new measure because this program is no longer a
Federal discretionary program but is now administered by states. As a result, a different data
collection instrument is now being used. These two changes mean that the data collected before 2004
are not comparable with data for 2004 and beyond.
Measure 1.11 of 14: The percentage of all 21st Century regular program participants with
teacher-reported improvement in homework completion and class participation. (Desired
direction: increase)
Year  Target  Actual (or date expected)  Status
2004 68.75 Measure not in place
2005 74.98 Measure not in place
2006 70 (December 2006) Pending
2007 75 (October 2007) Pending
2008 77 (October 2008) Pending
Source. U.S. Department of Education, OESE, 21st Century Community Learning Centers, grantee
performance reports.
Frequency of Data Collection. Annual
Data Quality. Data supplied by grantees.
Explanation. For 2006 we are considering this a new measure because this program is no longer a
Federal discretionary program but is now administered by states. As a result, a different data
collection instrument is now being used. These two changes mean that the data collected before 2004
are not comparable with data for 2004 and beyond.
Measure 1.12 of 14: The percentage of elementary 21st Century participants with teacher-
reported improvements in student behavior. (Desired direction: increase)
Year  Target  Actual (or date expected)  Status
2004 61.2 Measure not in place
2005 71.48 Measure not in place
2006 67 (December 2006) Pending
2007 75 (October 2007) Pending
2008 75 (October 2008) Pending
Source. U.S. Department of Education, OESE, 21st Century Community Learning Centers, grantee
performance reports.
Frequency of Data Collection. Annual
Data Quality. Data supplied by grantees.
Explanation. For 2006, we are treating this as a new measure because this program is no longer a Federal discretionary program but rather is administered by states. As a result, a different data collection instrument is now being used. These two changes mean that data collected before 2004 are not comparable with data for 2004 and beyond.
Measure 1.13 of 14: The percentage of middle and high school 21st Century participants with
teacher-reported improvements in student behavior. (Desired direction: increase)
Year  Target  Actual (or date expected)  Status
Measure 1.14 of 14: The percentage of all 21st Century participants with teacher-reported
improvements in student behavior. (Desired direction: increase)
Year  Target  Actual (or date expected)  Status
2004 64.08 Measure not in place
2005 71.08 Measure not in place
2006 67 (December 2006) Pending
2007 75 (October 2007) Pending
2008 75 (October 2008) Pending
Source. U.S. Department of Education, OESE, 21st Century Community Learning Centers, grantee
performance reports.
Frequency of Data Collection. Annual
Data Quality. Data are supplied by grantees and partially validated by states. 2005 data were reported for 7,834 centers; 2004 data were reported for 3,539 centers.
Explanation. For 2006, we are treating this as a new measure because this program is no longer a Federal discretionary program but rather is administered by states. As a result, a different data collection instrument is now being used. These two changes mean that data collected before 2004 are not comparable with data for 2004 and beyond.
Objective 2 of 3: 21st Century Community Learning Centers will offer high-quality enrichment
opportunities that positively affect student outcomes such as school
attendance and academic performance, and result in decreased disciplinary
actions or other adverse behaviors.
Measure 2.1 of 3: The percentage of 21st Century Centers reporting emphasis in at least one
core academic area. (Desired direction: increase)
Year  Target  Actual (or date expected)  Status
2000 85 97 Target Exceeded
This is being addressed through the 21st CCLC Technical Support Contract to develop toolkits and other
resources to help grantees provide high-quality academic enrichment programming.
Measure 2.2 of 3: The percentage of 21st Century Centers offering enrichment and support
activities in technology. (Desired direction: increase)
Year  Target  Actual (or date expected)  Status
2000 85 70 Made Progress From Prior Year
2001 85 79 Made Progress From Prior Year
2002 85 80.6 Made Progress From Prior Year
2003 85 81.3 Made Progress From Prior Year
2004 85 65.6 Did Not Meet Target
2005 85 67.34 Made Progress From Prior Year
2006 85 (December 2006) Pending
2007 85 Pending
Source. U.S. Department of Education, OESE, 21st Century Community Learning Centers, grantee
performance reports.
Frequency of Data Collection. Annual
Data Quality. Data are supplied by grantees and partially validated by states. 2005 data were reported for 7,834 centers; 2004 data were reported for 3,539 centers.
Explanation. This is being addressed through the 21st CCLC Technical Support Contract to develop toolkits and other resources to help grantees provide high-quality academic enrichment programming.
Measure 2.3 of 3: The percentage of 21st Century Centers offering enrichment and support
activities in other areas. (Desired direction: increase)
Year  Target  Actual (or date expected)  Status
2000 85 97 Target Exceeded
2001 85 95 Target Exceeded
2002 85 96 Target Exceeded
2003 85 95.9 Target Exceeded
2004 85 92.57 Target Exceeded
2005 100 95.19 Made Progress From Prior Year
2006 100 (December 2006) Pending
2007 100 (October 2007) Pending
2008 100 (October 2008) Pending
Source. U.S. Department of Education, OESE, 21st Century Community Learning Centers, grantee
performance reports.
Frequency of Data Collection. Annual
Data Quality. Data are supplied by grantees and partially validated by states. 2005 data were reported for 7,834 centers; 2004 data were reported for 3,539 centers. However, performance on some indicators might be closer to target than the data suggest. The high number of first-time centers reporting data suggests a possible lack of understanding of the definitions of terms associated with the indicators (e.g., enrichment). This is being addressed through greater emphasis on support to grantees for reporting through the 21st CCLC Analytic Support Contract.
Explanation.
This is being addressed through the 21st CCLC Technical Support Contract to develop toolkits and other
resources to help grantees provide high-quality academic enrichment programming.
Source. U.S. Department of Education, OESE, 21st Century Community Learning Centers, grantee
performance reports.
Frequency of Data Collection. Other
Measure 3.2 of 2: The percentage of SEAs that submit complete and accurate data on 21st
Century program performance measures in a timely manner. (Desired direction: increase)
Year  Target  Actual (or date expected)  Status
2005 85 Measure not in place
2006 Set a Baseline (December 2006) Pending
2007 90 Pending
Source. U.S. Department of Education, OESE, 21st Century Community Learning Centers, online data
collection system.
Frequency of Data Collection. Other
Data Quality. Based on states certifying data as complete and accurate by the APR deadline.
Note that the target for 2002 should be 22,000 and the target for 2003 should be 30,000.
Measure 1.2 of 5: The number of Advanced Placement tests taken by minority (Hispanic, Black,
Native American) public school students nationally. (Desired direction: increase)
Year  Target  Actual (or date expected)  Status
2004 267,608 Measure not in place
2005 315,203 Measure not in place
Measure 1.3 of 5: The percentage of Advanced Placement tests passed (tests receiving scores
of 3-5) by low-income public school students nationally. (Desired direction: increase)
Year  Target  Actual (or date expected)  Status
2005 37.5 Measure not in place
2006 38.5 38.1 Made Progress From Prior Year
2007 38.6 (August 2007) Pending
2008 39.2 (August 2008) Pending
2009 39.8 (August 2009) Pending
2010 40.2 (August 2010) Pending
Source. The College Board, Freeze File Report.
Frequency of Data Collection. Annual
Data Quality. The Freeze File Report is a mid-year data file of Advanced Placement exams taken in May
of that year and provides basic student demographic characteristics.
Explanation. The College Board considers a test "mastered" if it receives a score of 3, 4, or 5 on a scale of 1 to 5.
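As an illustrative sketch of how a pass percentage like this measure's is computed (the score list below is hypothetical, not College Board data):

```python
# Hypothetical AP score list for illustration; real figures come from
# the College Board Freeze File Report, not from this sketch.
def percent_passed(scores: list[int]) -> float:
    """Percentage of tests 'mastered', i.e. scored 3, 4, or 5 on the 1-5 scale."""
    passed = sum(1 for s in scores if 3 <= s <= 5)
    return 100 * passed / len(scores)

print(percent_passed([1, 2, 3, 4, 5]))  # three of five tests score 3-5 -> 60.0
```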
Measure 1.4 of 5: The number of Advanced Placement tests passed (tests receiving scores of
3-5) by low-income public school students nationally. (Desired direction: increase)
Year  Target  Actual (or date expected)  Status
2005 79,800 Measure not in place
2006 90,009 95,350 Target Exceeded
2007 99,000 (August 2007) Pending
2008 103,728 (August 2008) Pending
2009 113,194 (August 2009) Pending
2010 126,660 (August 2010) Pending
Source. The College Board, Freeze File Report.
Frequency of Data Collection. Annual
Data Quality. The Freeze File Report is a mid-year data file of Advanced Placement exams taken in May
of that year and provides basic student demographic characteristics.
Measure 1.5 of 5: The ratio of Advanced Placement and International Baccalaureate tests taken
in public high schools served by API grants to the number of seniors enrolled at those schools.
(Desired direction: increase)
Year  Target  Actual (or date expected)  Status
2006 BL+1% (May 2007) Pending
2007 BL+1% (September 2008) Pending
2008 BL+1 (September 2009) Pending
2009 BL+1 (September 2010) Pending
2010 BL+1 (September 2011) Pending
Source. U.S. Department of Education, Office of Elementary and Secondary Education, Advanced
Placement, Grantee Performance Report.
Frequency of Data Collection. Annual
Explanation. This measure is being used for the first time in 2006. The FY 2005 data will be used as the
baseline.
Program Goal: To help meet the unique educational needs of Alaska Natives and
to support the development of supplemental educational
programs to benefit Alaska Natives.
Objective 1 of 1: Support supplemental educational programs to benefit Alaska Natives.
Measure 1.1 of 3: The dropout rate of Alaska Native and American Indian middle and high school students. (Desired direction: decrease)
Year  Target  Actual (or date expected)  Status
2005 2.2 Measure not in place
2006 2 5.5 Did Not Meet Target
2007 1.8 Pending
Source. U.S. Department of Education, Alaska Native Education Equity Program, grantee performance
report.
Frequency of Data Collection. Annual
Data Quality. 5 of 23 grantees reported data on this measure.
Explanation. The program did not meet its target. The measure is not representative of the program, and the program office is considering changing it.
Measure 1.2 of 3: The percentage of Alaska Native children participating in early learning and
preschool programs who improve on measures of school readiness. (Desired direction:
increase)
Year  Target  Actual (or date expected)  Status
2004 Set a Baseline Not Collected Not Collected
2005 Set a Baseline 76.4 Target Met
2006 80 84.2 Target Exceeded
2007 85 Pending
Source. U.S. Department of Education, Alaska Native Education Equity Program, grantee performance
report.
Frequency of Data Collection. Annual
Data Quality. 5 of 23 grantees reported on this measure.
Explanation. The program exceeded its target.
Source. U.S. Department of Education, Alaska Native Education Equity Program, grantee performance
report.
Frequency of Data Collection. Annual
Data Quality. 8 of 23 grantees reported on this measure.
Explanation. The program did not meet its target. The measure is not representative of the program, and the program office is considering changing it.
Program Goal: To help ensure that all program participants meet challenging
state academic content standards in the arts.
Objective 1 of 1: Activities supported with federal funds will improve the quality of standards-
based arts education for all participants.
Measure 1.1 of 8: The percentage of students participating in arts models programs who
demonstrate higher achievement in mathematics than those in control or comparison groups.
(Desired direction: increase)
Year  Target  Actual (or date expected)  Status
2005 BL+1PP (February 2007) Pending
2006 999 Pending
2007 999 (November 2008) Pending
2008 BL+5% (November 2009) Pending
Source. U.S. Department of Education, Arts in Education Program Grantee Performance Report;
independent evaluation.
Frequency of Data Collection. Annual
Data Quality. Data are self-reported by grantees.
Explanation. The FY 2004 data will be used as the baseline. The FY 2006 target is the previous year's
actual plus 1 percentage point.
Measure 1.2 of 8: The percentage of students participating in arts models programs who
demonstrate higher achievement in reading than those in control or comparison groups.
(Desired direction: increase)
Year  Target  Actual (or date expected)  Status
2005 BL+1PP (February 2007) Pending
2006 999 (November 2007) Pending
2007 999 (November 2008) Pending
2008 BL+5PP (November 2009) Pending
Source. U.S. Department of Education, Arts in Education Program Grantee Performance Report;
independent evaluation.
Frequency of Data Collection. Annual
Data Quality. Data are self-reported by grantees.
Measure 1.3 of 8: The total number of students who participate in standards-based arts
education sponsored by the VSA and JFK Center for Performing Arts. (Desired direction:
increase)
Year  Target  Actual (or date expected)  Status
2006 BL+2PP (February 2007) Pending
2007 BL+4PP (February 2008) Pending
2008 BL+6PP (February 2009) Pending
Source. U.S. Department of Education, Arts in Education Program Grantee Performance Report.
Frequency of Data Collection. Annual
Data Quality. Data are self-reported by grantees.
Measure 1.4 of 8: The number of low income students who participate in standards-based arts
education sponsored by the VSA and JFK Center for Performing Arts. (Desired direction:
increase)
Year  Target  Actual (or date expected)  Status
2005 BL+1PP (February 2007) Pending
2006 BL+2PP (February 2008) Pending
2007 BL+4PP (February 2009) Pending
2008 BL+6PP (February 2010) Pending
Source. U.S. Department of Education, Arts in Education Program Grantee Performance Report.
Frequency of Data Collection. Annual
Data Quality. Data are self-reported by grantees.
Measure 1.5 of 8: The number of students with disabilities who participate in standards-based
arts education sponsored by Very Special Arts and the John F. Kennedy Center for Performing
Arts. (Desired direction: increase)
Year  Target  Actual (or date expected)  Status
2005 BL+1PP (February 2007) Pending
2006 BL+2PP (February 2008) Pending
2007 BL+4PP (February 2009) Pending
2008 BL+6PP (February 2009) Pending
Source. U.S. Department of Education, Arts in Education Program Grantee Performance Report.
Frequency of Data Collection. Annual
Data Quality. Data are self-reported by grantees.
Measure 1.7 of 8: The percentage of teachers participating in the Very Special Arts programs
who receive professional development that is sustained and intensive. (Desired direction:
increase)
Year  Target  Actual (or date expected)  Status
2006 Set a Baseline (February 2007) Pending
2007 BL+1PP (February 2008) Pending
2008 BL+2PP (February 2009) Pending
Source. U.S. Department of Education, Arts in Education Program Grantee Performance Report.
Frequency of Data Collection. Annual
Data Quality. Data are self-reported by grantees.
Measure 1.8 of 8: The percentage of teachers participating in the Professional Development for
Arts Educators program who receive professional development that is sustained and intensive.
(Desired direction: increase)
Year  Target  Actual (or date expected)  Status
2006 Set a Baseline (February 2007) Pending
2007 BL+1PP (February 2008) Pending
2008 BL+2PP (February 2009) Pending
Source. U.S. Department of Education, Arts in Education Program Grantee Performance Report.
Frequency of Data Collection. Annual
Data Quality. Data are self-reported by grantees.
Measure 1.2 of 8: The number of charter schools in operation around the nation. (Desired
direction: increase)
Year  Target  Actual (or date expected)  Status
1997 428 Measure not in place
1998 790 Measure not in place
1999 1,100 Measure not in place
2000 2,060 1,700 Made Progress From Prior Year
2001 2,667 2,110 Made Progress From Prior Year
Measure 1.3 of 8: The percentage of fourth grade charter school students who are achieving at
or above proficient on state assessments in reading. (Desired direction: increase)
Year  Target  Actual (or date expected)  Status
2006 Set a Baseline (January 2007) Pending
2007 BL+1PP Pending
2008 BL+2PP Pending
Measure 1.4 of 8: The percentage of fourth grade students in charter schools who are achieving
at or above proficient on state assessments in mathematics. (Desired direction: increase)
Year  Target  Actual (or date expected)  Status
2006 Set a Baseline (January 2007) Pending
2007 BL+1PP Pending
2008 BL+2PP Pending
Measure 1.5 of 8: The percentage of eighth grade charter school students who are achieving at
or above proficient on state assessments in reading. (Desired direction: increase)
Year  Target  Actual (or date expected)  Status
2006 Set a Baseline (January 2007) Pending
2007 BL+1PP Pending
2008 BL+2PP Pending
Measure 1.6 of 8: The percentage of eighth grade students in charter schools who are
achieving at or above proficient on state assessments in mathematics (Desired direction:
increase)
Year  Target  Actual (or date expected)  Status
2006 Set a Baseline (January 2007) Pending
2007 BL+1PP Pending
2008 BL+2PP Pending
Measure 1.7 of 8: The federal cost per student in a 'successful' charter school (defined as a
school in operation for three or more years). (Desired direction: decrease)
Year  Target  Actual (or date expected)  Status
2006 Set a Baseline (November 2006) Pending
2007 999 Pending
2008 999 Pending
Measure 1.8 of 8: The ratio of funds leveraged by states for charter facilities to funds awarded
by the Department under the State Charter School Facilities Incentive Grant Program. (Desired
direction: increase)
Year  Target  Actual (or date expected)  Status
2004 1.82 Measure not in place
2005 2.52 Measure not in place
2006 2.7 3.7 Target Exceeded
2007 3.1 (July 2007) Pending
2008 3.5 (July 2008) Pending
Source. U.S. Department of Education, Charter Schools Grantee Performance Report.
Frequency of Data Collection. Annual
Explanation. The leveraging ratio is the total funds available (the federal grant and the state match)
divided by the federal grant for a specific year.
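The leveraging-ratio arithmetic described above can be sketched as follows (the $10M federal grant and $27M state match are hypothetical figures, chosen only to reproduce a 3.7 ratio):

```python
# Hypothetical dollar amounts for illustration; actual grant and match
# figures are reported in the Charter Schools Grantee Performance Report.
def leveraging_ratio(federal_grant: float, state_match: float) -> float:
    """Total funds available (federal grant plus state match) divided by the federal grant."""
    total_available = federal_grant + state_match
    return total_available / federal_grant

# A $27M state match on a $10M federal grant gives a ratio of 3.7.
print(leveraging_ratio(10_000_000, 27_000_000))  # -> 3.7
```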
Source. U.S. Department of Education, Office of Elementary and Secondary Education, Consolidated
State Performance Report.
Frequency of Data Collection. Annual
Explanation. The FY 2004 data were used as the baseline.
Measure 1.2 of 2: The percentage of CSR schools meeting state targets in reading/language
arts. (Desired direction: increase)
Year  Target  Actual (or date expected)  Status
2004 67 Measure not in place
2005 68 (January 2007) Pending
2006 68 Pending
2007 68 Pending
Source. U.S. Department of Education, Office of Elementary and Secondary Education, Consolidated
State Performance Report.
Frequency of Data Collection. Annual
Source. U.S. Department of Education, National Center for Education Statistics, National Longitudinal
Survey of Schools.
Frequency of Data Collection. Annual
Data Quality. Data are taken from a nationally representative sample of Title I schools; data are not
available for all Title I schools. Because data are based on self-reports, it is difficult to judge the extent to
which reform programs are comprehensive and research based. An examination of school documents on
a subsample of Title I schools will allow some indication of the quality of comprehensive school reform
efforts in Title I schools in general.
Because the data collected for 2002 and 2003 were unusable, they are indicated as not collected.
Explanation. Increasing numbers of Title I schools are implementing research-based school reform models to improve curriculum and instruction. The Comprehensive School Reform Demonstration Program is meeting its purpose of increasing awareness of and support for comprehensive school reform among states, districts, and schools, and it serves as a catalyst for using Title I funds in schoolwide programs to support the adoption of research-based comprehensive school reform programs. The student achievement data collected at CSR schools for 2002 and 2003 were found to be incomplete and inconsistent and were not used. A contractor worked with states to complete the data collection process for 2004-06 and to provide quality assurance.
Measure 2.2 of 3: The percentage of a random sample of all products and services that receive
audience ratings for usefulness of "high and above" on a field survey. (Desired direction:
increase)
Year  Target  Actual (or date expected)  Status
2006 Set a Baseline (December 2006) Pending
2007 BL+1% Pending
Source. U.S. Department of Education, Office of Elementary and Secondary Education, Comprehensive
School Reform, ratings of products and services.
Frequency of Data Collection. Annual
Measure 2.3 of 3: The percentage of new research projects funded by the CSR Quality
Initiatives program that are deemed to be of high relevance to education practice as determined
by a review panel of practitioners. (Desired direction: increase)
Year  Target  Actual (or date expected)  Status
Source. U.S. Department of Education, Office of Elementary and Secondary Education, Comprehensive
School Reform, ratings by review panel of practitioners.
Frequency of Data Collection. Annual
Source. U.S. Department of Education, annual and final performance reports grantee submissions.
Frequency of Data Collection. Annual
Explanation.
Though grantees prior to the FY 2005 cohort submitted data on this measure on a voluntary basis, it was not required, and the PPVT-III was not the only assessment used. FY 2005 grantees are the first cohort required to submit data for this measure using the PPVT-III for the 2006 performance year. The baseline for this measure will be available in March 2007.
Source. U.S. Department of Education, annual and final performance reports grantee submissions.
Frequency of Data Collection. Annual
Explanation.
Grantees were not required to submit data for this measure until the FY 2005 grantee cohort. Baseline data will be available in March 2007 for the 2006 performance year.
Source. U.S. Department of Education, annual and final performance reports, grantee submissions.
Frequency of Data Collection. Annual
Explanation. Though grantees prior to the FY 2005 cohort submitted data on this measure, it was not
required. FY 2005 grantees are the first cohort required to submit data for this measure. Baseline data will
be available in March 2007.
Source. U.S. Department of Education, Early Reading First Program, annual performance report.
Frequency of Data Collection. Annual
Data Quality. The Peabody Picture Vocabulary Test-Third Edition (PPVT-III) is nationally normed and has been validated internally and correlated with other measures of cognitive development.
Explanation. In SY 2003-04, Early Reading First preschool children took a Peabody Picture Vocabulary Test-III pre-test and, after the year of Early Reading First intervention, a post-test. The post-test scores were compared to the national norms provided by the test publisher.
Measure 1.2 of 3: The number of letters Early Reading First children can identify as measured
by the PALS Pre-K Upper Case Alphabet Knowledge subtask. (Desired direction: increase)
Year  Target  Actual (or date expected)  Status
2004 Set a Baseline 15 Target Met
2005 16 16 Target Met
2006 17 Pending
2007 18 Pending
2008 20 Pending
Source. U.S. Department of Education, Early Reading First Program, performance report.
Frequency of Data Collection. Annual
Data Quality. Not all Early Reading First grantees use the PALS Pre-K Upper Case Alphabet Knowledge
subtask to measure alphabet knowledge. Data collected represent the sample of grantees who use the
PALS Pre-K Upper Case Alphabet Knowledge subtask. Early Reading First grantees will be encouraged
to use the PALS Pre-K Upper Case Alphabet Knowledge subtask as the measure of alphabet knowledge.
Explanation. The PALS Pre-K Upper Case Alphabet Knowledge subtask is a measure of alphabet
knowledge that will be administered to ERF preschool children, with scores reported in the ERF Performance Report. It has been demonstrated to have a strong positive correlation with the Woodcock-Johnson Letter-Word Identification test.
Measure 1.3 of 3: The percent of 4-year old children participating in Early Reading First
programs who achieve significant learning gains on the Peabody Picture Vocabulary Test-III
(PPVT-III). (Desired direction: increase)
Year  Target  Actual (or date expected)  Status
2006 Set a Baseline (March 2007) Pending
2007 BL+1% Pending
2008 BL+1% Pending
Source. U.S. Department of Education, Early Reading First Program, annual performance report.
Frequency of Data Collection. Annual
Explanation. The Peabody Picture Vocabulary Test-Third Edition (PPVT) is a nationally normed test
which has been validated internally and correlated with other measures of cognitive development.
Source. U.S. Department of Education, Native Hawaiian Education Program, grantee performance
report.
Measure 1.2 of 3: The percentage of students participating in the Education for Native
Hawaiians program who meet or exceed proficiency standards in mathematics, science, or
reading. (Desired direction: increase)
Year  Target  Actual (or date expected)  Status
2004 Set a Baseline Not Collected Not Collected
2005 Set a Baseline 82 Target Met
2006 83.64 (August 2006) Pending
2007 85.31 Pending
Source. U.S. Department of Education, Native Hawaiian Education Program, grantee performance
report.
Frequency of Data Collection. Annual
Measure 1.3 of 3: The percentage of teachers involved with professional development activities
that address the unique education needs of Native Hawaiians. (Desired direction: increase)
Year  Target  Actual (or date expected)  Status
2004 Set a Baseline Not Collected Not Collected
2005 Set a Baseline 89.3 Target Met
2006 91 (August 2006) Pending
2007 92.82 Pending
Source. U.S. Department of Education, Native Hawaiian Education Program, grantee performance
report.
Frequency of Data Collection. Annual
Program Goal: To help limited English proficient students learn English and
reach high academic standards.
Objective 1 of 3: Improve the English proficiency and academic achievement of students
served by the Language Acquisition State Grants program.
Measure 1.1 of 7: The number of States that have demonstrated the alignment of English
language proficiency (ELP) assessment with ELP standards. (Desired direction: increase)
Year  Target  Actual (or date expected)  Status
2004 16 Measure not in place
2005 10 (January 2007) Pending
2006 50 (January 2008) Pending
2007 52 (January 2009) Pending
2008 52 (January 2010) Pending
Source. U.S. Department of Education, Consolidated State Performance Report; EDEN when available.
Frequency of Data Collection. Annual
Explanation. All 52 entities (50 states, the District of Columbia and Puerto Rico) are providing
information regarding aligned English language proficiency assessments under NCLB. States are counted
as having demonstrated progress in alignment if they explain how their current ELP assessment is being
aligned with ELP standards.
FY 2006 Program Performance Report
41 11/14/2006
U.S. Department of Education
Measure 1.2 of 7:
The number of States reporting that their English language proficiency standards are aligned
with State academic content standards.
Measure 1.3 of 7:
The percentage of LEAs receiving Title III services making AYP for limited English proficient
students.
Measure 1.4 of 7:
Measure 1.5 of 7: The percentage of limited English proficient students receiving Title III
services who have achieved English language proficiency. (Desired direction: increase)
Year  Target  Actual (or date expected)  Status
2005 23 Measure not in place
2006 29 (January 2007) Pending
2007 58 (January 2008) Pending
2008 67 (January 2009) Pending
2009 87 (January 2010) Pending
2010 92 (January 2011) Pending
Source. U.S. Department of Education, Consolidated State Performance Report and Office of English
Language Acquisition Title III Biennial Evaluation Reports.
Frequency of Data Collection. Annual
Measure 1.6 of 7:
The percentage of States being monitored on-site each year that resolve Title III compliance
findings within twelve months of notification.
Measure 1.7 of 7:
The average number of days for States receiving Title III funds to make subgrants to
subgrantees.
The percentage of preservice teachers served under the National Professional Development
Program who are placed in an instructional setting serving LEP students within one year of
graduation.
The percentage of projects funded under the Native American/Alaska Native Children in School
Program that increase LEP student academic achievement as measured by state academic
content assessments.
After review, the dates when data are expected for this report were corrected to reflect changes in the time needed to process the annual grant performance reports.
The percentage of projects funded under the Native American/Alaskan Native Children in
School Program that increase the level of English language proficiency of participating LEP
students as measured by performance on the state English language proficiency (ELP)
assessment or the state approved local ELP assessment.
Program Goal: To help break the cycle of poverty and illiteracy by improving the
educational opportunities of the nation's low-income families
through a unified family literacy program that integrates early
childhood education, adult literacy and adult basic education,
and parenting education.
Objective 1 of 1: The literacy of participating families will improve.
Measure 1.1 of 5: The percentage of Even Start adults who achieve significant learning gains
on measures of reading/English language acquisition, as measured by the Comprehensive Adult
Student Assessment System (CASAS) and the Tests of Adult Basic Education (TABE). (Desired
direction: increase)
Year  Target  Actual (or date expected)  Status
2003 Set a Baseline 70 Target Met
2004 70.7 60.5 Did Not Meet Target
2005 71.4 63.8 Made Progress From Prior Year
2006 72.1 Pending
2007 70.9 Pending
Source. U.S. Department of Education, Consolidated State Performance Report, grantee submissions.
Frequency of Data Collection. Annual
Explanation. Demonstrated an increase from the previous year.
Measure 1.2 of 5: The percentage of Even Start adults with a high school completion goal who
earn a high school diploma. (Desired direction: increase)
Year  Target  Actual (or date expected)  Status
2003 Set a Baseline 59 Target Met
2004 59.6 44.6 Did Not Meet Target
2005 60.2 47.2 Made Progress From Prior Year
2006 60.8 Pending
2007 60.8 Pending
2008 61 Pending
Source. U.S. Department of Education, Consolidated State Performance Report, grantee submissions.
Frequency of Data Collection. Annual
Data Quality. Definitions of high school diploma vary across participating programs.
Explanation. Demonstrated an increase from the previous year.
Measure 1.3 of 5: The percentage of Even Start adults with a goal of General Equivalency
Diploma (GED) attainment who earn a GED. (Desired direction: increase)
Year  Target  Actual (or date expected)  Status
2003 Set a Baseline 44.6 Target Met
2004 44.4 80.2 Target Exceeded
2005 44.9 57.9 Target Exceeded
2006 45.3 Pending
2007 45.3 Pending
2008 48 Pending
Source. U.S. Department of Education, Consolidated State Performance Report, grantee submissions.
Frequency of Data Collection. Annual
Data Quality. Some states are not able to distinguish between a high school diploma and a GED.
Explanation. 2008 target is draft.
Measure 1.4 of 5: The percentage of Even Start children who are entering kindergarten and
Source. U.S. Department of Education, Consolidated State Performance Report, grantee submissions.
Frequency of Data Collection. Annual
Data Quality. The basis for including children and determining the significance of gains varied among states. The PPVT measure was not required until 2005-06; hence a limited number of states reported.
Explanation. 2008 target is draft.
Measure 1.5 of 5: The number of letters Even Start children can identify, as measured by the
PALS Pre-K Uppercase Letter Naming Subtask. (Desired direction: increase)
Year  Target  Actual (or date expected)  Status
2005 Set a Baseline Not Collected Not Collected
2006 BL+1 Pending
2007 BL+1 Pending
2008 BL+1.5 Pending
Source. U.S. Department of Education, Consolidated State Performance Report, grantee submissions.
Frequency of Data Collection. Annual
Data Quality. The PALS measure was not required until 2005-06; only two states reported usable results. Data are expected to be available in 2007.
Explanation. The PALS Pre-K Uppercase Letter Naming Subtask is a measure that has been validated
using a statewide sample of typically developing children.
Source. U.S. Department of Education, Excellence in Economic Education annual grantee performance
report.
Frequency of Data Collection. Annual
Several FY 2006 applications selected for field reviews required extended time to resolve data issues.
This resulted in payment delays for those applicants. A technical adjustment in the method of calculating
this measure also partially explains the decline from 2005. For FY 2007 payments, the Department will
further focus on making initial payments in a timely manner.
Source. U.S. Department of Education, Impact Aid Basic Support payments, Impact Aid data system.
Frequency of Data Collection. Annual
Source. U.S. Department of Education, Office of Elementary and Secondary Education, Impact Aid
Construction, grantee reports.
Frequency of Data Collection. Other
Explanation. This new efficiency measure is intended to track programmatic efficiency by reducing the
amount of time it takes to process the formula construction grant payments under Section 8007(a) of the
Impact Aid Program. The target for FY 2006 is 7/31/2006; the target for FY 2007 is 6/30/2007; the target
for FY 2008 is 5/31/2008; the target for FY 2009 is 4/30/2009.
Measure 1.3 of 3: The average number of days elapsed between the initial Impact Aid
discretionary construction award and the LEAs' awarding of contracts. (Desired direction:
decrease)
Year Target Actual Status
(or date expected)
2006 250 (December 2007) Pending
2007 250 Pending
Source. U.S. Department of Education, Office of Elementary and Secondary Education, Impact Aid
Construction, GAPS reports.
Frequency of Data Collection. Annual
Explanation. This is a new efficiency measure for FY 2006.
Program Goal: To assist local school districts that have lost a portion of their
local tax base because of federal ownership of property.
Objective 1 of 1: Manage Section 8002 Payments for Federal Property to disburse funds
accurately and efficiently under the statutory formula.
Measure 1.1 of 2: The percentage of eligible Section 8002 applicants reviewed during the year.
Measure 1.2 of 2: The percentage of initial payments to eligible LEAs under Impact Aid
Payments for Federal Property that are made by the end of the second quarter. (Desired
direction: increase)
Year Target Actual Status
(or date expected)
2006 75 1.5 Made Progress From Prior Year
2007 67 Pending
2008 75 (December 2008) Pending
Source. U.S. Department of Education, Office of Elementary and Secondary Education, Impact Aid
Construction, program reports.
Frequency of Data Collection. Annual
Program Goal: To improve teacher and principal quality and increase the number
of highly qualified teachers in the classroom and highly qualified
principals and assistant principals in schools.
Objective 1 of 2: Show an annual increase in the percentage of classes taught by highly
qualified teachers.
Measure 1.1 of 4: The percentage of core academic classes in high-poverty schools taught by
highly qualified teachers. (Desired direction: increase)
Year Target Actual Status
(or date expected)
2003 74 Measure not in place
2004 85.7 Measure not in place
2005 90 89.5 Made Progress From Prior Year
2006 95 (December 2007) Pending
2007 100 (December 2008) Pending
2008 100 (December 2009) Pending
Source. U.S. Department of Education, Consolidated State Performance Report.
FY 2006 Program Performance Report
55 11/14/2006
U.S. Department of Education
Frequency of Data Collection. Annual
Data Quality. Data for 2004-05 are for all States; 2003-04 data were updated from the FY 2005 submission. The value is for elementary schools only; the value for secondary schools is 84.4. Data are reported separately because elementary grades count classes as self-contained while secondary grades count classrooms as departmentalized. 2004 data are combined for elementary and secondary.
Explanation. This program essentially met the FY 2005 target. This was achieved through extensive technical assistance and intensive monitoring; two monitoring visits were made to most States in this period.
Measure 1.2 of 4: The percentage of core academic classes in low poverty schools taught by
highly qualified teachers. (Desired direction: increase)
Year Target Actual Status
(or date expected)
2004 91 Measure not in place
2005 90 95 Target Exceeded
2006 95 (May 2007) Pending
2007 100 (December 2008) Pending
2008 100 (December 2009) Pending
Source. U.S. Department of Education, Consolidated State Performance Report.
Frequency of Data Collection. Annual
Data Quality. Data for 2004-05 are for all States; 2003-04 data were updated from the FY 2005 submission. The value is for elementary schools only; the value for secondary schools is 91.8. Data are reported separately because elementary schools count classes as self-contained while secondary schools count classrooms as departmentalized. 2004 data are combined for elementary and secondary.
Explanation. This program met the FY 2005 target. This was achieved through extensive technical assistance and intensive monitoring; two monitoring visits were made to most States in this period.
Measure 1.3 of 4: The percentage of core academic classes in elementary schools taught by
highly qualified teachers. (Desired direction: increase)
Year Target Actual Status
(or date expected)
2003 Set a Baseline 85 Target Met
2004 89 90.6 Target Exceeded
2005 90 93 Target Exceeded
2006 95 (December 2007) Pending
2007 100 (January 2008) Pending
Data Quality. Data for 2004-05 are for all States; 2003-04 data were updated from the FY 2005 submission.
Explanation. This program met the FY 2005 target. This was achieved through extensive technical assistance and intensive monitoring; two monitoring visits were made to most States in this period.
Measure 1.4 of 4: The percentage of core academic classes in secondary schools taught by
highly qualified teachers. (Desired direction: increase)
Year Target Actual Status
(or date expected)
2003 Set a Baseline 80 Target Met
2004 85 88.3 Target Exceeded
2005 85 89 Target Exceeded
2006 92 (December 2007) Pending
2007 100 (December 2008) Pending
2008 100 (December 2009) Pending
Source. U.S. Department of Education, Consolidated State Performance Report, grantee submissions.
Frequency of Data Collection. Annual
Data Quality.
Data for 2004-05 are for all States; 2003-04 data have been updated since the FY 2005 submission.
Explanation. This program met the FY 2005 target. This was achieved through extensive technical assistance and intensive monitoring; two monitoring visits were made to most States in this period.
Program Goal: To improve the teaching and learning of gifted and talented
students through research, demonstration projects, personnel
training, and other activities of national significance.
Objective 1 of 1: Develop models for cultivating the talents of students who are economically
disadvantaged, are limited English proficient, or have disabilities.
Measure 1.1 of 3: The number of Javits Gifted and Talented Education project designs for
effective professional development focusing on gifted and talented education with average
reviewer ratings for quality of high and above. (Desired direction: increase)
Year Target Actual Status
(or date expected)
2005 Set a Baseline 90 Target Met
2006 BL+1% (December 2007) Pending
2007 BL+2% Pending
Source. U.S. Department of Education, Office of Elementary and Secondary Education, Javits Gifted and
Talented Education Program, grantee performance report.
Frequency of Data Collection. Annual
Data Quality. Of the 10 FY 2002 projects reviewed, 5 were priority one grants (5-year projects) and 5 were priority two grants (3-year projects).
Measure 1.2 of 3: The number of new evidence-based Javits Gifted and Talented Education
project designs with average reviewer ratings for quality of high and above. (Desired direction:
increase)
Year Target Actual Status
(or date expected)
2005 Set a Baseline 70 Target Met
2006 BL+1% (December 2007) Pending
2007 BL+2% Pending
Source. U.S. Department of Education, Office of Elementary and Secondary Education, Javits Gifted and
Talented Education Program, grantee performance report.
Frequency of Data Collection. Annual
Data Quality. Of the 10 FY 2002 projects reviewed, 5 were priority one grants (5-year projects) and 5 were priority two grants (3-year projects).
Measure 1.3 of 3: The number of Javits Gifted and Talented Education projects with significant
Source. U.S. Department of Education, Office of Elementary and Secondary Education, Javits Gifted and
Talented Education Program, evaluations.
Frequency of Data Collection. Annual
Data Quality. Of the 10 FY 2002 projects reviewed, 5 were priority one grants (5-year projects) and 5 were priority two grants (3-year projects).
Source. U.S. Department of Education, Literacy Through School Libraries Program, grantee annual performance report. U.S. Department of Education, Institute of Education Sciences (IES), Schools and Staffing Survey (SASS). U.S. Department of Education, Institute of Education Sciences (IES), National Center for Education Statistics, 2005 program evaluation.
Frequency of Data Collection. Annual
Target Context. It is unclear whether data are available in EDFacts or in annual reports.
Source. U.S. Department of Education, Literacy Through School Libraries Program, grantee annual performance report. U.S. Department of Education, Institute of Education Sciences (IES), Schools and Staffing Survey (SASS). U.S. Department of Education, Institute of Education Sciences (IES), National Center for Education Statistics, 2005 program evaluation.
Frequency of Data Collection. Other
Explanation. Data were not collected for 2004-05; they will be collected for 2005-06.
Objective 2 of 2: Magnet school students meet their state's academic achievement standards.
Measure 2.2 of 2: The percentage of cohort 1 magnet schools that meet the state's adequate
yearly progress standard. (Desired direction: increase)
Year Target Actual Status
(or date expected)
2005 Set a Baseline (October 2006) Pending
2006 BL+1PP (October 2007) Pending
2007 BL+1PP (October 2008) Pending
2008 Set a Baseline (October 2009) Pending
Source. U.S. Department of Education, Magnet Schools Program Performance Report. State
assessments required by NCLB. State educational agencies.
Frequency of Data Collection. Other
Data Quality. Data are frequently late in being released.
Explanation. Cohort 1 was established in SY 2004-05, and cohort 2 will be established in SY 2007-08.
The FY 2007 target for cohort 1 is the FY 2006 actual level plus 1 percentage point.
Program Goal: To improve the quality of mathematics and science teachers and
increase both the number of highly qualified math and science
teachers and the achievement of students participating in
Mathematics and Science Partnerships programs.
Objective 1 of 2: To increase the number of highly qualified mathematics and science
teachers in schools participating in Mathematics and Science Partnership
(MSP) programs.
Measure 1.1 of 3: The percentage of K-5 teachers in MSP schools who significantly increase
their knowledge of mathematics and science. (Desired direction: increase)
Year Target Actual Status
(or date expected)
2004 Set a Baseline Pending
2005 Set a Baseline (September 2007) Pending
2006 BL+20% (September 2008) Pending
2007 BL+21% (September 2009) Pending
Source. U.S. Department of Education, Office of Elementary and Secondary Education, Mathematics and
Science Partnerships, program annual reports.
Frequency of Data Collection. Annual
Data Quality. This number reflects the number of teachers who participated in high-quality professional development as part of MSP-funded projects. Most projects report gains based on pretest-posttest results; however, the data and analyses do not lend themselves to rigorous tests of statistical significance. The FY 2004 target is to establish a baseline. The target for FY 2005 is the baseline plus 20%.
Measure 1.2 of 3: The percentage of highly qualified high school (grades nine through twelve)
teachers in MSP schools. (Desired direction: increase)
Year Target Actual Status
(or date expected)
2004 Set a Baseline Pending
2005 Set a Baseline (September 2007) Pending
2006 BL+20% Pending
2007 BL+21% Pending
Source. U.S. Department of Education, Office of Elementary and Secondary Education, Mathematics and
Science Partnerships, program evaluation; U.S. Department of Education, Office of Elementary and
Secondary Education, Mathematics and Science Partnerships, program annual reports.
Frequency of Data Collection. Annual
Measure 1.3 of 3: The percentage of highly qualified middle school (grades six through eight)
teachers in MSP schools. (Desired direction: increase)
Year Target Actual Status
(or date expected)
2004 Set a Baseline Not Collected Not Collected
2005 Set a Baseline (April 2007) Pending
2006 BL+20% (September 2007) Pending
2007 BL+21% (September 2008) Pending
Measure 2.2 of 2: The percentage of students in MSP classrooms scoring at proficient or above
in science on state assessments. (Desired direction: increase)
Year Target Actual Status
(or date expected)
2006 Maintain a Baseline (September 2007) Pending
2007 Maintain a Baseline (September 2008) Pending
Source. U.S. Department of Education, Office of Elementary and Secondary Education, Mathematics and
Science Partnerships, program annual reports.
Frequency of Data Collection. Annual
Explanation. The FY 2005 data will be used as baseline.
Measure 1.2 of 12: The number of states that reported results for reading proficiency of
elementary school migrant students. (Desired direction: increase)
Year Target Actual Status
(or date expected)
1997 15 Measure not in place
1998 18 Measure not in place
1999 19 Measure not in place
2000 26 Measure not in place
Explanation. The annually set state targets for 2002 through 2008 project an increase in the number of
states that report state assessment results in reading for migrant students in elementary school.
Measure 1.3 of 12: The number of states meeting an annually set performance target in reading
at the middle school level for migrant students. (Desired direction: increase)
Year Target Actual Status
(or date expected)
1997 3 Measure not in place
1998 6 Measure not in place
1999 4 Measure not in place
2000 2 Measure not in place
2001 7 Measure not in place
2002 9 6 Did Not Meet Target
2003 11 10 Made Progress From Prior Year
2004 15 10 Did Not Meet Target
2005 17 (December 2006) Pending
2006 19 (July 2007) Pending
2007 21 (July 2008) Pending
2008 23 (July 2009) Pending
Source. U.S. Department of Education, Consolidated State Performance Report.
Data Quality. Information that directly measures the impact of the Title I, Migrant Education Program is
not available. However, each state has its own assessment to measure and determine student
proficiency. Student achievement across the states cannot be compared directly, but the results for
migrant students can be tracked over time, providing the state proficiency levels and assessments'
content remain consistent and the disaggregation of assessment data by subgroup is accurate. This
measure will have greater validity and reliability over time as state assessment systems stabilize, include
all migrant students in testing, and properly disaggregate and report results.
Explanation. The annually set state targets for 2002 through 2008 project the number of states that
attain a performance threshold of 50 percent or more of middle school level migrant students at the
proficient or advanced level in reading. Once 80 percent of all states have met the performance threshold
of 50 percent of migrant students at or above the proficient level, the performance threshold will be raised
in increments of 5 percent and the annually set state targets will project an increase in the number of
states meeting the new threshold. The progress of states in moving toward a target can be viewed by
examining the number of states that have increased the percentage of migrant students at the proficient
or advanced level in reading from 2003 to 2004. By that measure, 9 of 14 states increased the percentage proficient or above in grade six, 8 of 13 states in grade seven, and 20 of 30 states in grade eight.
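The threshold-adjustment rule described in the Explanation above (raise the 50 percent proficiency threshold by 5 points once 80 percent of states meet it) can be sketched as follows. This is a minimal illustration, not the Department's actual procedure; the function name and sample percentages are invented for the example.

```python
def update_threshold(pct_proficient_by_state, threshold=50.0, step=5.0):
    """Apply the report's stated rule: count the states meeting the current
    proficiency threshold, and once 80 percent of states meet it, raise the
    threshold by 5 points. Returns (states meeting threshold, new threshold)."""
    met = sum(1 for pct in pct_proficient_by_state.values() if pct >= threshold)
    if met >= 0.8 * len(pct_proficient_by_state):
        threshold += step
    return met, threshold

# Hypothetical data: percent of migrant students proficient in reading, by state.
sample = {"A": 55, "B": 62, "C": 48, "D": 71, "E": 53}
met, new_threshold = update_threshold(sample)
# 4 of 5 states (80 percent) meet the 50 percent threshold, so it rises to 55.
```

The mirrored dropout-rate rule later in this report works the same way, except the threshold is decreased in 5 percent increments.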
Measure 1.4 of 12: The number of states that reported results for reading proficiency of middle
school migrant students. (Desired direction: increase)
Year Target Actual Status
(or date expected)
1997 15 Measure not in place
1998 18 Measure not in place
1999 18 Measure not in place
2000 23 Measure not in place
2001 21 Measure not in place
2002 25 27 Target Exceeded
2003 29 43 Target Exceeded
2004 32 43 Target Exceeded
2005 34 (December 2006) Pending
2006 36 (July 2007) Pending
2007 45 (July 2008) Pending
2008 47 (July 2009) Pending
Source. U.S. Department of Education, Consolidated State Performance Report.
Frequency of Data Collection. Annual
Data Quality. Each state has its own assessment to measure and determine student proficiency. This
measure will have greater validity and reliability over time as state assessment systems stabilize, include
all migrant students in testing, and properly disaggregate and report results.
Explanation. The annually set state targets for 2002 through 2008 project an increase in the number of
states that report state assessment results in reading for migrant students in middle school.
Measure 1.5 of 12: The number of states meeting an annually set performance target in
mathematics at the elementary school level for migrant students. (Desired direction: increase)
Year Target Actual Status
(or date expected)
1997 5 Measure not in place
1998 9 Measure not in place
1999 6 Measure not in place
2000 7 Measure not in place
2001 10 Measure not in place
2002 12 6 Did Not Meet Target
2003 14 16 Target Exceeded
2004 18 19 Target Exceeded
2005 20 (December 2006) Pending
2006 22 (July 2007) Pending
2007 24 (July 2008) Pending
2008 26 (July 2009) Pending
Measure 1.6 of 12: The number of states that reported results for mathematics proficiency of
elementary school migrant students. (Desired direction: increase)
Year Target Actual Status
(or date expected)
1997 15 Measure not in place
1998 18 Measure not in place
1999 19 Measure not in place
2000 25 Measure not in place
2001 23 Measure not in place
2002 27 29 Target Exceeded
2003 32 42 Target Exceeded
2004 36 46 Target Exceeded
2005 38 (December 2006) Pending
2006 40 (June 2007) Pending
2007 45 (July 2008) Pending
2008 47 (July 2009) Pending
Source. U.S. Department of Education, Consolidated State Performance Report.
Data Quality. Each state has its own assessment to measure and determine student proficiency. This
measure will have greater validity and reliability over time as state assessment systems stabilize, include
all migrant students in testing, and properly disaggregate and report results.
Explanation. The annually set state targets for 2002 through 2008 project an increase in the number of
states that report state assessment results in mathematics for migrant students in elementary school.
Measure 1.7 of 12: The number of states meeting an annually set performance target in
mathematics for middle school migrant students. (Desired direction: increase)
Year Target Actual Status
(or date expected)
Measure 1.8 of 12: The number of states that reported results for mathematics proficiency of
middle school migrant students. (Desired direction: increase)
Year Target Actual Status
(or date expected)
1997 15 Measure not in place
1998 18 Measure not in place
1999 18 Measure not in place
2000 22 Measure not in place
2001 20 Measure not in place
2002 24 27 Target Exceeded
2003 28 43 Target Exceeded
2004 32 45 Target Exceeded
2005 34 (December 2006) Pending
2006 36 (July 2007) Pending
2007 45 (July 2008) Pending
2008 47 (July 2009) Pending
Measure 1.9 of 12: The number of states meeting an annually set performance target for
dropout rate for migrant students. (Desired direction: increase)
Year Target Actual Status
(or date expected)
2004 Set a Baseline (December 2006) Pending
2005 BL+1 (December 2006) Pending
2006 BL+2 (July 2007) Pending
2007 BL+3 (July 2008) Pending
2008 BL+4 (July 2009) Pending
Source. U.S. Department of Education, Consolidated State Performance Report.
Frequency of Data Collection. Annual
Data Quality. Information that directly measures the impact of the Title I, Migrant Education Program is
not available. However, each state must report an annual dropout rate for students leaving school.
Variation in the calculation of dropout rates may limit the validity of comparisons across the states.
However, the results for migrant students can be tracked over time, provided that state procedures for
calculating dropout rates remain consistent and the disaggregation of dropout data by subgroup is
accurate. This measure will have greater validity and reliability over time as state procedures for
calculating and reporting dropout rates stabilize, include all migrant students appropriately in the
calculations, and properly disaggregate and report results.
Explanation. The annually set state targets for 2004 through 2008 project the number of states that attain a performance threshold of 50 percent or fewer migrant students who drop out of school. Once 80 percent of all states have met the performance threshold of 50 percent or fewer migrant students who drop out of school, the performance threshold will be decreased in increments of 5 percent, and the annually set state targets will project an increase in the number of states meeting the new threshold.
Measure 1.10 of 12: The number of states that reported results for dropout rate of migrant
students. (Desired direction: increase)
Year Target Actual Status
(or date expected)
2004 Set a Baseline (December 2006) Pending
2005 BL+1 (December 2006) Pending
2006 BL+2 (July 2007) Pending
2007 BL+3 (July 2008) Pending
2008 BL+4 (July 2009) Pending
Source. U.S. Department of Education, Consolidated State Performance Report.
Data Quality. Each state must report an annual dropout rate for students leaving school. This measure
will have greater validity and reliability over time as state procedures for calculating and reporting dropout
rates stabilize, include all migrant students appropriately in the calculations, and properly disaggregate
and report results.
Explanation. The annually set state targets for 2004 through 2008 project an increase in the number of
states that report dropout rates for migrant students.
Measure 1.11 of 12: The number of states meeting an annually set performance target for high
school graduation of migrant students. (Desired direction: increase)
Year Target Actual Status
(or date expected)
2004 Set a Baseline (December 2006) Pending
2005 BL+1 (December 2006) Pending
2006 BL+2 (July 2007) Pending
2007 BL+3 (July 2008) Pending
2008 BL+4 (July 2009) Pending
Source. U.S. Department of Education, Consolidated State Performance Report.
Frequency of Data Collection. Annual
Data Quality. Information that directly measures the impact of the Title I, Migrant Education Program is
not available. However, each state must report an annual graduation rate for students who graduate from
a public high school with a diploma. This measure will have greater validity and reliability over time as
state procedures for disaggregating and reporting all migrant students who graduate stabilize.
Explanation. The annually set state targets for 2004 through 2008 project the number of states that
attain a performance threshold of 50 percent or more migrant students graduating from high school. Once
80 percent of all states have met the performance threshold of 50 percent or more migrant students
graduating from high school, the performance threshold will be increased in increments of 5 percent and
the annually set state targets will project an increase in the number of states meeting the new threshold.
Measure 1.12 of 12: The number of states that reported results for high school graduation of
migrant students. (Desired direction: increase)
Year Target Actual Status
(or date expected)
2004 Set a Baseline (December 2006) Pending
2005 BL+1 (December 2006) Pending
2006 BL+2 (July 2007) Pending
2007 BL+3 (July 2008) Pending
2008 BL+4 (July 2009) Pending
Source. U.S. Department of Education, Consolidated State Performance Report.
Frequency of Data Collection. Annual
Data Quality. Each state must report an annual graduation rate for students who graduate from a public
high school with a diploma. This measure will have greater validity and reliability over time as state
procedures for disaggregating and reporting all migrant students who graduate stabilize.
Explanation. The annually set state targets for 2004 through 2008 project an increase in the number of
states that report graduation rates for migrant students.
Source. Academy for Educational Development-derived tests; the National Assessment of Educational
Progress (NAEP) Test of Writing.
Frequency of Data Collection. Annual
Data Quality. National Writing Project sites measure effectiveness using different instruments so data are
difficult to aggregate.
Measure 1.2 of 2: The percentage of students of National Writing Project (NWP) trained
teachers who demonstrate clear control of the writing conventions of usage, mechanics, and
spelling. (Desired direction: increase)
Year Target Actual Status
(or date expected)
2005 Set a Baseline (June 2006) Pending
2006 BL+1% Pending
2007 BL+2% Pending
Source. Academy for Educational Development-derived tests; the National Assessment of Educational
Progress Test of Writing.
Frequency of Data Collection. Annual
Data Quality. National Writing Project sites measure effectiveness using different instruments so data are
difficult to aggregate.
Program Goal: To ensure that neglected and delinquent children and youth will
have the opportunity to meet the challenging state standards
needed to further their education and become productive
members of society.
Objective 1 of 1: Neglected or delinquent (N or D) students will improve academic and
vocational skills needed to further their education.
Measure 1.1 of 3: The percentage of neglected or delinquent students obtaining a diploma or
diploma equivalent. (Desired direction: increase)
Year Target Actual Status
(or date expected)
2003 Set a Baseline 8 Target Met
2004 8.4 Not Collected Not Collected
2005 8.8 9.31 Target Exceeded
2006 8.8 (June 2007) Pending
2007 10.25 Pending
Measure 1.2 of 3: The percentage of neglected or delinquent students earning high school
course credits. (Desired direction: increase)
Year Target Actual Status
(or date expected)
2005 41.52 Measure not in place
2006 Set a Baseline (June 2007) Pending
2007 Maintain a Baseline Pending
Source. U.S. Department of Education, Consolidated State Performance Report, grantee submissions.
Frequency of Data Collection. Annual
Measure 1.3 of 3: The percentage of neglected or delinquent students who improve academic
skills as measured on approved and validated measures. (Desired direction: increase)
Year Target Actual Status
(or date expected)
2003 Set a Baseline Not Collected Not Collected
2004 Set a Baseline Not Collected Not Collected
2005 Set a Baseline 17.9 Target Met
2006 Maintain a Baseline (June 2007) Pending
2007 Maintain a Baseline Pending
Source. U.S. Department of Education, Consolidated State Performance Report, grantee submissions.
Frequency of Data Collection. Annual
Data Quality. Data from state assessments will be disaggregated at the state agency level and reported
for schools that receive Title I, Part D, funds.
Measure 1.2 of 15: The percentage of grade 2 economically disadvantaged students in Reading
First schools who meet or exceed proficiency in reading on Reading First measures of reading
fluency. (Desired direction: increase)
Year Target Actual Status
(or date expected)
2004 33 Measure not in place
2005 39 Measure not in place
2006 35 (February 2007) Pending
2007 41 (February 2008) Pending
2008 43 (February 2009) Pending
Source. U.S. Department of Education, Reading First Annual Performance Report, grantee submissions.
Frequency of Data Collection. Annual
Measure 1.3 of 15: The percentage of grade 2 limited English proficient students in schools
participating in Reading First programs who meet or exceed proficiency on Reading First
measures of reading fluency. (Desired direction: increase)
Year Target Actual Status
Measure 1.4 of 15: The percentage of grade 2 African American students in Reading First schools who meet or exceed proficiency on Reading First outcome measures of reading fluency. (Desired direction: increase)
Year Target Actual Status
(or date expected)
2004 34 Measure not in place
2005 37 Measure not in place
2006 36 (February 2007) Pending
2007 39 (February 2008) Pending
2008 41 (February 2009) Pending
Source. U.S. Department of Education, Reading First Annual Performance Report, grantee submissions.
Frequency of Data Collection. Annual
Measure 1.5 of 15: The percentage of grade 2 Hispanic students in Reading First schools who
meet or exceed proficiency on Reading First measures of reading fluency. (Desired direction:
increase)
Year Target Actual Status
(or date expected)
2004 30 Measure not in place
2005 39 Measure not in place
2006 32 (February 2007) Pending
2007 41 (February 2008) Pending
2008 43 (February 2009) Pending
Source. U.S. Department of Education, Reading First Annual Performance Report, grantee submissions.
Frequency of Data Collection. Annual
Measure 1.6 of 15: The percentage of grade 2 students with disabilities in Reading First schools
who meet or exceed proficiency on Reading First measures of reading fluency. (Desired
direction: increase)
Year Target Actual Status
(or date expected)
2004 17 Measure not in place
2005 23 Measure not in place
2006 19 (February 2007) Pending
Measure 1.7 of 15: The percentage of grade 3 students in Reading First schools who meet or
exceed proficiency on Reading First measures of reading fluency. (Desired direction: increase)
Year Target Actual Status
(or date expected)
2004 36 Measure not in place
2005 39 Measure not in place
2006 38 (February 2007) Pending
2007 41 (February 2008) Pending
2008 43 (February 2009) Pending
Source. U.S. Department of Education, Reading First Annual Performance Report, grantee submissions.
Frequency of Data Collection. Annual
Measure 1.8 of 15: The number of states reporting an increase in the percentage of grade 1
students in Reading First schools who meet or exceed proficiency on Reading First measures of
reading comprehension. (Desired direction: increase)
Year Target Actual Status
(or date expected)
2004 2 Measure not in place
2005 14 Measure not in place
2006 5 (February 2007) Pending
2007 19 (February 2008) Pending
2008 24 (February 2009) Pending
Source. U.S. Department of Education, Reading First Annual Performance Report, grantee submissions.
Frequency of Data Collection. Annual
Measure 1.9 of 15: The number of states reporting an increase in the percentage of grade 2
economically disadvantaged students in Reading First schools who meet or exceed proficiency
on Reading First measures of reading comprehension. (Desired direction: increase)
Year Target Actual Status
(or date expected)
2004 4 Measure not in place
2005 14 Measure not in place
2006 7 (February 2007) Pending
2007 19 (February 2008) Pending
2008 24 (February 2009) Pending
Source. U.S. Department of Education, Reading First Annual Performance Report, grantee submissions.
Frequency of Data Collection. Annual
Measure 1.11 of 15: The number of states reporting an increase in the percentage of grade 2
African American students in Reading First schools who meet or exceed proficiency on Reading
First measures of reading comprehension. (Desired direction: increase)
Year Target Actual Status
(or date expected)
2004 5 Measure not in place
2005 16 Measure not in place
2006 10 (February 2007) Pending
2007 21 (February 2008) Pending
2008 26 (February 2009) Pending
Source. U.S. Department of Education, Reading First Annual Performance Report, grantee submissions.
Frequency of Data Collection. Annual
Measure 1.12 of 15: The number of states reporting an increase in the percentage of grade 2
Hispanic students in Reading First schools who meet or exceed proficiency on Reading First
measures of reading comprehension. (Desired direction: increase)
Year Target Actual Status
(or date expected)
2004 5 Measure not in place
2005 9 Measure not in place
2006 10 (February 2007) Pending
2007 15 (February 2008) Pending
2008 20 (February 2009) Pending
Source. U.S. Department of Education, Reading First Annual Performance Report, grantee submissions.
Frequency of Data Collection. Annual
Measure 1.13 of 15: The number of states reporting an increase in the percentage of grade 2 students with disabilities in Reading First schools who meet or exceed proficiency on Reading First measures of reading comprehension. (Desired direction: increase)
Measure 1.14 of 15: The number of states reporting an increase in the percentage of grade 3
students in Reading First schools who meet or exceed proficiency on Reading First measures of
reading comprehension. (Desired direction: increase)
Year Target Actual Status
(or date expected)
2004 7 Measure not in place
2005 19 Measure not in place
2006 12 (February 2007) Pending
2007 24 (February 2008) Pending
2008 29 (February 2009) Pending
Source. U.S. Department of Education, Reading First Annual Performance Report, grantee submissions.
Frequency of Data Collection. Annual
Measure 1.15 of 15: The number of states reporting an increase in the percentage of grade 3
students who score at or above proficient on state assessments in reading. (Desired direction:
increase)
Year Target Actual Status
(or date expected)
2004 21 Measure not in place
2005 27 Measure not in place
2006 15 (February 2007) Pending
2007 32 (February 2008) Pending
2008 37 (February 2009) Pending
Source. U.S. Department of Education, Consolidated State Performance Report, grantee submissions.
Frequency of Data Collection. Annual
Program Goal: The Ready-To-Learn television program will enhance the learning
strategies of preschool and early elementary school children.
Objective 1 of 2: Develop, produce, and distribute high-quality televised educational
programming for preschool and early elementary school children.
Measure 1.1 of 1: The percentage of Ready-To-Learn children's television programming
deemed to be of high quality. (Desired direction: increase)
Year Target Actual Status
(or date expected)
2006 Set a Baseline (October 2007) Pending
2007 BL+1% (October 2008) Pending
2008 BL+1% (October 2009) Pending
Source. U.S. Department of Education, Office of Innovation and Improvement, independent review panel.
Frequency of Data Collection. Annual
Source.
Data are reported by the States. A total of 3,356 LEAs nationwide received an SRSA award in each of
fiscal years 2002 through 2004. States reported AYP data for 3,217 of these LEAs; 3,053 of the LEAs
made AYP in 2004-05 and 164 failed to do so.
Target Context. The Department's goal for the SRSA program is that all participating LEAs meet their
State's definition of AYP in each school year. The initial annual AYP target for LEAs participating in SRSA
was one percentage point over the established baseline. The baseline data now in place from 2005 show
that 95 percent of SRSA LEAs made AYP. To reach 100 percent by 2014, the Department proposes a
one-percentage-point increase every 2 years over the 95-percent baseline in the percentage of SRSA
LEAs that make AYP.
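As an arithmetic sketch only (Python, using the counts reported above for 2004-05), the 95-percent baseline and the proposed biennial target schedule work out as follows:

```python
# Share of SRSA LEAs that made AYP in 2004-05, per the counts reported above.
leas_reporting = 3217
leas_making_ayp = 3053
baseline = round(100 * leas_making_ayp / leas_reporting)  # rounds to 95 percent

# Proposed schedule: one percentage point every 2 years over the 95-percent baseline.
targets = {year: baseline + (year - 2005) // 2 for year in range(2005, 2015, 2)}
print(baseline)  # 95
print(targets)   # {2005: 95, 2007: 96, 2009: 97, 2011: 98, 2013: 99}
```

Under a strict one-point-per-biennium schedule the 2013 target is 99 percent, so the 100-percent goal implies a final step in 2014.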
Explanation. By statute, any SRSA LEA that has received funding for three years and fails to make AYP
may use program funds only to carry out improvement activities authorized under Section 1116 of the
Elementary and Secondary Education Act of 1965, as amended by the No Child Left Behind Act of 2001.
The 2004-2005 school year was the third year of the program. Any LEA that failed to make AYP during
the 2004-2005 school year may use SRSA funds only to carry out improvement activities in subsequent
fiscal years until it makes AYP.
Measure 2.2 of 4: The percentage of students enrolled in LEAs participating in the Small, Rural
School Achievement (SRSA) program who score proficient or better on States’ assessments in
mathematics in each year through the 2013-2014 academic year. (Desired direction: increase)
Year Target Actual Status
(or date expected)
2007 999 (August 2008) Pending
2008 999 (August 2009) Pending
2009 999 (August 2010) Pending
2010 999 (August 2011) Pending
2011 999 (August 2012) Pending
2012 999 (August 2013) Pending
2013 999 Pending
2014 100 Pending
Measure 2.4 of 4: Percentage of students enrolled in LEAs participating in the Rural and Low-
Income School (RLIS) program who score proficient or better on States’ assessments in
mathematics in each year through the 2013-2014 academic year. (Desired direction: increase)
Year Target Actual Status
(or date expected)
2007 999 (August 2008) Pending
2008 999 (August 2009) Pending
2009 999 (August 2010) Pending
2010 999 (August 2011) Pending
2011 999 (August 2012) Pending
2012 999 (August 2013) Pending
2013 999 Pending
2014 100 Pending
Source. U.S. Department of Education, Consolidated State Performance Report, grantee submissions.
U.S. Department of Education, EDFacts/EDEN, grantee submissions.
U.S. Department of Education, Office of Elementary and Secondary Education, grantee submissions.
Frequency of Data Collection. Annual
Data Quality. States report 2004-05 data in their Consolidated State Performance Report, and as part of
the annual Rural Education data collections.
Explanation. Only districts eligible for the Small Rural Schools Achievement (SRSA) Program are eligible
to utilize the Rural Education Achievement Program flexibility authority. In 2004-05, a total of 4,780 LEAs
nationwide were eligible for REAP-Flex; 2,694 LEAs made use of the authority and 2,086 did not.
Despite outreach to States, professional education organizations, and districts, the Department has not
been able to increase the percentage of eligible school districts using the Rural Education Achievement
Program flexibility authority, which suggests that there is no unmet demand among non-participating
districts.
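The utilization rate implied by these counts can be checked with a few lines of Python (figures taken from the paragraph above):

```python
# REAP-Flex eligibility and use in 2004-05, per the counts reported above.
eligible_leas = 4780
used_authority = 2694
did_not_use = 2086
assert used_authority + did_not_use == eligible_leas  # counts are internally consistent

utilization_pct = round(100 * used_authority / eligible_leas, 1)
print(utilization_pct)  # 56.4 -- roughly 56 percent of eligible districts used REAP-Flex
```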
Source. U.S. Department of Education, Office of Elementary and Secondary Education, School Dropout
Prevention Program, grantee performance report.
Frequency of Data Collection. Annual
Explanation. FY 2006 reports SEA-level data, reflecting a change in the focus of the program. The
dropout rate is an average across the twenty-four grantees as reported in their initial applications. The
grantees use the NCES definition of dropout rate. The performance targets for dropout rates are based
on data from the first cohort of grantees.
Source. U.S. Department of Education, Office of Elementary and Secondary Education, School Dropout
Prevention Program, grantee performance report.
Frequency of Data Collection. Annual
Data Quality. Data are self-reported.
Objective 3 of 3: Support effective programs that identify youth who have dropped out of
school and encourage them to reenter school and complete their secondary
education.
Measure 3.1 of 1: The percentage of students reentering schools who complete their secondary
education. (Desired direction: increase)
Year Target Actual Status
(or date expected)
2006 5 Pending
2007 5 Pending
Source. U.S. Department of Education, Office of Elementary and Secondary Education, School Dropout
Prevention Program, grantee performance report.
Frequency of Data Collection. Annual
Data Quality. Data are self-reported by grantees through a performance report.
Explanation. FY 2006 reports SEA-level data, reflecting a change in the focus of the program. The
dropout rate is an average across the two grantees as reported in their initial applications. The grantees
use the NCES definition of dropout rate. The performance targets for dropout rates are based on data
from the first cohort of grantees.
Program Goal: To increase the number of new, certified principals and assistant
principals, and to improve the skills of current practicing
principals and assistant principals, all serving in high-need
schools in high-need LEAs.
Objective 1 of 2: To recruit, prepare, and support teachers and individuals from other fields to
become principals, including assistant principals, in high-need schools in high-
need LEAs.
Measure 1.1 of 2: The percentage of those enrolled in Cohort 2 of the School Leadership Program
who become certified as principals and assistant principals. (Desired direction: increase)
Year Target Actual Status
(or date expected)
2006 Set a Baseline (November 2006) Pending
2007 BL+25% (November 2007) Pending
2008 BL+50% (November 2008) Pending
Source. U.S. Department of Education, School Leadership Program performance report.
Frequency of Data Collection. Annual
Data Quality. Each grantee uses its own method of recording and reporting data and inconsistencies
exist.
Explanation. These data are reported by cohort, depending on the project year. Each grant is for three
years. Twenty-six grants were awarded for Cohort 2 in FY 2005. For Cohort 2, data will be collected in
project years 2006, 2007, and 2008.
Measure 1.2 of 2: The percentage of Cohort 2 School Leadership program completers earning
certification as a principal or assistant principal who are employed in those positions in high-
need schools in high-need local educational agencies (LEAs). (Desired direction: increase)
Year Target Actual Status
(or date expected)
2006 Set a Baseline (November 2006) Pending
2007 BL+1% (November 2007) Pending
2008 BL+2% (November 2008) Pending
Source. U.S. Department of Education, School Leadership Program performance report.
Frequency of Data Collection. Annual
Data Quality. Each grantee uses its own method of recording and reporting data and inconsistencies
exist.
Explanation. These data are reported by cohort, depending on the project year. Each grant is for three
years. Twenty grants were awarded for Cohort 1 in FY 2002, and data were collected in project years
2004 and 2005. The 2004 actual value for Cohort 1 was 38. Twenty-six grants were awarded for Cohort 2
in FY 2005. For Cohort 2, data will be collected in project years 2006, 2007, and 2008.
FY 2006 Program Performance Report
89 11/14/2006
U.S. Department of Education
Measure 1.2 of 5: The percentage of students scoring at or above proficient on state reading
assessments. (Desired direction: increase)
Year Target Actual Status
(or date expected)
2001 65.7 Measure not in place
2003 66.7 54.9 Made Progress From Prior Year
2004 70 54 Did Not Meet Target
2005 74 (February 2007) Pending
2006 78 (February 2008) Pending
2007 BL+1 (February 2009) Pending
2008 BL+2 (February 2010) Pending
2009 BL+3 (February 2011) Pending
2010 BL+3 (February 2012) Pending
Source. U.S. Department of Education, Office of Elementary and Secondary Education, Smaller Learning
Communities, program performance report.
Frequency of Data Collection. Annual
Measure 1.5 of 5: The percentage of students in high schools receiving Smaller Learning
Communities grants who graduate from high school (based on 9th grade enrollment). (Desired
direction: increase)
Year Target Actual Status
(or date expected)
2001 59.2 Measure not in place
2003 60.2 56.6 Made Progress From Prior Year
2004 63 85.98 Target Exceeded
2005 66 (February 2007) Pending
2006 69 (February 2008) Pending
2007 BL+0.25 (February 2009) Pending
2008 BL+0.25 (February 2010) Pending
2009 BL+0.25 (February 2011) Pending
2010 BL+0.25 (February 2012) Pending
Source. U.S. Department of Education, Office of Elementary and Secondary Education, Smaller Learning
Communities, program performance report.
Frequency of Data Collection. Annual
Measure 1.2 of 3: The percentage of program participants who become teachers in schools
with 25 percent or more American Indian and Alaska Native students. (Desired direction:
increase)
Year Target Actual Status
(or date expected)
2005 23 (December 2006) Pending
2007 23 Pending
Source. U.S. Department of Education, Office of Indian Education, project performance reports; U.S.
Department of Education, National Center for Education Statistics, Schools and Staffing Survey and
National Longitudinal Survey of Schools.
Frequency of Data Collection. Biennial
Measure 1.3 of 3: The percentage of program participants who receive full state licensure.
(Desired direction: increase)
Year Target Actual Status
(or date expected)
2005 75 (December 2006) Pending
2007 75 Pending
Source. U.S. Department of Education, Office of Indian Education, project performance reports.
Frequency of Data Collection. Annual
Data Quality. The sample size is small, and it is costly to add supplemental samples to data collection
programs. The national sample under-represents this population in the sample count. The program plans
to monitor the number of American Indian and Alaska Native students through LEAs' reporting on
program effectiveness in their Annual Performance Reports.
Measure 2.2 of 5: The percentage of 3- to 4-year-old American Indian and Alaska Native
children achieving educationally significant gains on a prescribed measure of cognitive skills and
conceptual knowledge, including mathematics, science, and early reading based on curriculum
benchmarks. (Desired direction: increase)
Year Target Actual Status
(or date expected)
2005 46 (December 2006) Pending
2006 46 (December 2007) Pending
2007 46 (December 2008) Pending
Source. U.S. Department of Education, Office of Indian Education, project performance report.
Frequency of Data Collection. Annual
Data Quality. Office of Indian Education performance report data are supplied by grantees. Substantial
variation will exist in curriculum benchmarks and assessments.
Measure 2.3 of 5: The percentage of 3- to 4-year-old American Indian and Alaska Native
children achieving educationally significant gains on a prescribed measure of social development
that facilitates self-regulation of attention, behavior, and emotion based on curriculum
benchmarks. (Desired direction: increase)
Year Target Actual Status
(or date expected)
2005 46 (December 2006) Pending
2006 46 (December 2007) Pending
2007 46 (December 2008) Pending
Source. U.S. Department of Education, Office of Indian Education, project performance report.
Frequency of Data Collection. Annual
Data Quality. Office of Indian Education performance report data are supplied by grantees. Substantial
variation will exist in curriculum benchmarks and assessments.
Measure 2.4 of 5: The percentage of high school American Indian and Alaska Native students
successfully completing (as defined by a passing grade) challenging core courses (English,
mathematics, science, and social studies). (Desired direction: increase)
Year Target Actual Status
(or date expected)
2005 46 (December 2006) Pending
2006 46 (December 2007) Pending
Measure 2.5 of 5: The percentage of American Indian and Alaska Native students participating
in the program who have college assessment scores (ACT, SAT, PSAT) as high as or higher than
the district average. (Desired direction: increase)
Year Target Actual Status
(or date expected)
2005 46 (December 2006) Pending
2006 46 (December 2007) Pending
2007 46 (December 2008) Pending
Source. U.S. Department of Education, Office of Indian Education, project performance report.
Frequency of Data Collection. Annual
Data Quality. Office of Indian Education performance report data are supplied by grantees. Substantial
variation may exist in methods used to assess student performance.
Explanation. Data collection for this program began in FY 2004.
Program Goal: To improve student learning and teaching through the use of
emerging mobile technologies.
Objective 1 of 1: To improve the quality of technology-based applications in core academic
subjects developed through the Star Schools program.
Measure 1.1 of 1: The percentage of Star Schools technology-based applications in core
academic subjects deemed to be of high quality. (Desired direction: increase)
Year Target Actual Status
(or date expected)
2006 Set a Baseline (October 2007) Pending
2007 BL+1% (October 2008) Pending
2008 BL+2% (October 2009) Pending
Source. U.S. Department of Education, Office of Innovation and Improvement, research and expert panel
review.
Frequency of Data Collection. Annual
Explanation. In FY 2006, most Star Schools grantees will be in year one of new multi-year awards.
Source. U.S. Department of Education, Standards and Assessment external peer review process; Title I
review processes; staff recommendations; and approval decision by the Secretary.
Explanation. Fifty States have either been fully approved or given approval with recommendation,
approval expected, or approval pending status. Two States were not approved but have requested
reconsideration; depending on the outcome of the reconsideration, their status could change.
Each state has developed a schedule by which its reading/language arts assessments for grades 3-8 and
high school will be developed, field tested, and submitted to the Department for review and approval prior
to implementation. The Department developed the Standards and Assessment External Review process
to review and approve the state assessments and conducted its first peer review in early 2005.
States are required to have their reading/language arts assessments in place by SY 2005-06. The 2006
performance target of 52 reflects the compliance of the 50 states, Puerto Rico, and the District of
Columbia.
Measure 1.2 of 6: The number of states (including DC and PR) that have mathematics
assessments that align with the state's academic content standards for all students in grades
three through eight and in high school. (Desired direction: increase)
Year Target Actual Status
(or date expected)
Source. U.S. Department of Education, Standards and Assessment external peer review process; Title I
review processes; staff recommendations; approval decision by the Secretary.
Frequency of Data Collection. Annual
Explanation. Fifty States have either been fully approved or given approval with recommendation,
approval expected, or approval pending status. Two States were not approved but have requested
reconsideration; depending on the outcome of the reconsideration, their status could change.
Each state has developed a schedule by which its mathematics assessments for grades 3-8 and high
school will be developed, field tested, and submitted to the Department for review and approval prior to
implementation. The Department developed the Standards and Assessment External Peer Review
process to review and approve the state assessments and conducted its first peer review in early 2005.
States are required to have their mathematics assessments in place by SY 2005-06. The 2006
performance target of 52 reflects the compliance of the 50 states, Puerto Rico, and the District of
Columbia.
Measure 1.3 of 6: The number of states (including DC and PR) that have science assessments
that align with the state's academic content standards for all students in each grade span
(grades 3 through 5, 6 through 8, and high school). (Desired direction: increase)
Year Target Actual Status
(or date expected)
2004 Set a Baseline 0 Target Met
2005 18 0 Did Not Meet Target
2006 15 (December 2007) Pending
2007 25 (December 2008) Pending
2008 52 (December 2009) Pending
Source. U.S. Department of Education, Standards and Assessment external peer review process; Title I
review processes; staff recommendations; and approval decision by the Secretary.
Frequency of Data Collection. Annual
Explanation. Because this is not a requirement until SY 2007-08, there are no data for 2006.
Each state has developed a schedule by which its science assessments in each grade span (3-5, 6-8,
and high school) will be developed, field tested, and submitted to the Department for review and approval
prior to implementation. The Department developed the Standards and Assessment External Review
process to review and approve the state assessments. No state submitted its science assessments for
review in 2004 or 2005. States are required to have their science assessments in place by SY 2007-08.
The 2008 performance target of 52 reflects the compliance of the 50 states, Puerto Rico, and the District
of Columbia.
Measure 1.4 of 6: The number of states (including DC and PR) that have completed field testing
of the required assessments in reading/language arts. (Desired direction: increase)
Year Target Actual Status
(or date expected)
Source. U.S. Department of Education, Consolidated State Performance Report grantee submissions;
state Web sites.
Frequency of Data Collection. Annual
Explanation. Field testing is a prerequisite for implementation of new assessments.
Measure 1.5 of 6: The number of states (including DC and PR) that have completed field testing
of the required assessments in mathematics. (Desired direction: increase)
Year Target Actual Status
(or date expected)
2003 16 Measure not in place
2004 20 Measure not in place
2005 30 47 Target Exceeded
2006 52 52 Target Met
2007 52 Pending
2008 52 Pending
Source. U.S. Department of Education, Consolidated State Performance Report grantee submissions;
state Web sites.
Frequency of Data Collection. Annual
Explanation. Field testing is a prerequisite for implementation of new assessments.
Measure 1.6 of 6: The number of states (including DC and PR) that have completed field testing
of the required assessments in science. (Desired direction: increase)
Year Target Actual Status
(or date expected)
2003 18 Measure not in place
2004 19 Measure not in place
2005 24 Measure not in place
2006 20 26 Target Exceeded
2007 52 Pending
2008 52 Pending
Source. U.S. Department of Education, Consolidated State Performance Report grantee submissions;
state Web sites.
Frequency of Data Collection. Annual
Explanation. Field testing is a prerequisite for implementation of new assessments.
Program Goal: To support state and local programs that are a continuing source
of innovation and educational improvement.
Objective 1 of 2: To encourage states to use flexibility authorities in ways that will increase
student achievement.
Measure 1.1 of 4: The percentage of districts targeting Title V funds to Department-designated
strategic priorities that achieve AYP. (Desired direction: increase)
Year Target Actual Status
(or date expected)
2003 Set a Baseline 65 Target Met
2004 68 69 Target Exceeded
2005 69 69 Target Met
2006 70 (August 2007) Pending
2007 71 (August 2008) Pending
2008 72 (August 2009) Pending
2009 73 (August 2010) Pending
Source. U.S. Department of Education, Consolidated State Performance Report.
Frequency of Data Collection. Annual
Data Quality. A district could be counted under more than one of the four strategic priority areas, and
therefore could be counted multiple times. ED has requested OMB's approval to correct this question and
eliminate duplicate counts in the Consolidated State Performance Report for 2005-06 and future years.
Explanation. Strategic priorities include activities that (1) support student achievement and enhance
reading and math, (2) improve the quality of teachers, (3) ensure that schools are safe and drug free, and
(4) promote access for all students. A comparison of AYP achievement rates for districts that used 20
percent or more of Title V-A funds for the four strategic priorities (69 percent made AYP) versus districts
that did not (54 percent) suggests that using Title V-A funds for the strategic priorities makes a positive
difference.
Measure 1.2 of 4: The percentage of districts not targeting Title V funds that achieve AYP.
(Desired direction: increase)
Year Target Actual Status
(or date expected)
2003 Set a Baseline 55 Target Met
2004 58 49 Did Not Meet Target
Measure 1.3 of 4: The percentage of combined funds that districts use for the four Department-
designated strategic priorities. (Desired direction: increase)
Year Target Actual Status
(or date expected)
2005 Set a Baseline 91 Target Met
2006 92 (August 2007) Pending
2007 93 (August 2008) Pending
2008 94 (August 2009) Pending
2009 95 (August 2010) Pending
Source. U.S. Department of Education, Consolidated State Performance Report.
Frequency of Data Collection. Annual
Data Quality. Each State reported the percentage of Title V-A funds that its LEAs used for the four
strategic priorities. The median across States is 90.55 percent. ED has requested approval from OMB to
ask States to report actual dollar amounts instead of percentages in the Consolidated State Performance
Report for 2005-06 and future years, so that we can report the LEA Title V-A funds for the four strategic
priorities divided by the total Title V-A LEA funds.
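To illustrate why dollar amounts are preferable to state-reported percentages, here is a hypothetical Python sketch (the state names and dollar figures are invented for illustration only): a median of per-state percentages and a dollar-weighted aggregate can diverge when allocations differ in size.

```python
from statistics import median

# Hypothetical (priority dollars, total Title V-A dollars) per state -- not actual data.
state_reports = {
    "State A": (900_000, 1_000_000),  # 90 percent of a large allocation
    "State B": (45_000, 50_000),      # 90 percent of a small allocation
    "State C": (990_000, 1_000_000),  # 99 percent
}

# Current CSPR approach: summarize the state-reported percentages.
median_pct = median(100 * p / t for p, t in state_reports.values())

# Proposed approach: priority dollars divided by total dollars across states.
aggregate_pct = (100 * sum(p for p, _ in state_reports.values())
                 / sum(t for _, t in state_reports.values()))

print(round(median_pct, 2), round(aggregate_pct, 2))  # 90.0 vs 94.39
```

The dollar-weighted figure reflects how the money was actually spent, while the median treats a small and a large allocation as equally important.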
Explanation. Strategic priorities include activities that (1) support student achievement and enhance
reading and math, (2) improve the quality of teachers, (3) ensure that schools are safe and drug free, and
(4) promote access for all students. The 2005 data (91 percent) are the baseline. The target for 2006 is
the baseline plus 1 percentage point (92 percent), and data will be available in August 2007.
Measure 1.4 of 4: The percentage of participating LEAs that complete a credible needs
assessment. (Desired direction: increase)
Year Target Actual Status
(or date expected)
2005 Set a Baseline 100 Target Met
2006 100 (August 2007) Pending
2007 100 (August 2008) Pending
2008 100 (August 2009) Pending
2009 100 (August 2010) Pending
In 2006, the program office developed and began a series of innovative, virtual monitoring visits using
videoconferencing to gather the comprehensive information needed for multiple programs at significantly
lower cost and with greater efficiency than traditional on-site visits. Because 2006 was a developmental
year for virtual monitoring visits and follow-up activities, it was unrealistic to establish a baseline. Instead,
the program office will use ED's standard of 45 days as the target for 2007 and future years.
Measure 2.2 of 2: The percentage of States that respond satisfactorily within 30 days to findings
in their State Grants for Innovative Programs monitoring reports. (Desired direction: increase)
Year Target Actual Status
(or date expected)
2006 Set a Baseline Not Collected Not Collected
2007 Set a Baseline (September 2007) Pending
2008 999 (September 2008) Pending
2009 999 (September 2009) Pending
Source. U.S. Department of Education, Office of Elementary and Secondary Education, State Grants for
Innovative Programs, program office records.
Frequency of Data Collection. Annual
Explanation. In 2006, the program office developed and began a series of innovative, virtual monitoring
visits using videoconferencing to gather the comprehensive information needed for multiple programs at
significantly lower cost and with greater efficiency than traditional on-site visits. Because 2006 was a
developmental year for virtual monitoring visits and follow-up activities, it was unrealistic to establish a
baseline.
There are no issues. Beginning with SY 2004-05 reporting, CSPR data are submitted electronically by
States using EDEN/EDFacts.
Explanation.
22 out of the 34 States that tested 4th grade students in reading in both 2004 and 2005 reported an
increase in the percentage of these students scoring proficient or above on state assessments in
reading/language arts. Although the target of 25 was narrowly missed, the shortfall partly reflects the fact
that only 34 States tested 4th graders in this subject in both years; a solid majority of those 34 States
showed increases.
Also, beginning next year the reading/language arts measure will change from a State-level indicator to
two student-level indicators.
Prior to the 2005-06 school year, States were required to test only once during grades three through five
and once during grades six through nine.
Explanation.
30 out of the 42 States that tested 8th grade students in mathematics in both 2004 and 2005 reported an
increase in the percentage of these students scoring proficient or above on state assessments in
mathematics. The target of 25 was exceeded, even though only 42 States tested 8th graders in this
subject in both years.
Also, beginning next year the mathematics measure will change from a State-level indicator to two
student-level indicators.
Source.
U.S. Department of Education, tracking of the dates of State monitoring visits and the dates that reports
are delivered to the State.
Explanation. Original measure 2.1 (Making AYP: The number of States that report an increase in schools
making AYP) has been replaced by an efficiency measure as part of the PART process.
Measure: The average number of business days used to complete State monitoring reports.
Baseline: 46.3
Target:
Actual: September 2006
Measure 1.3 of 6: The percentage of Transition to Teaching (TTT) teachers of record who teach
in high-need schools in high-need LEAs for at least three years (2002 grantee cohort). (Desired
direction: increase)
Year Target Actual Status
(or date expected)
2006 Set a Baseline (November 2006) Pending
Measure 1.4 of 6: The percentage of all Transition to Teaching (TTT) participants who become
teachers of record (TOR) in high-need schools in high-need LEAs (2004 grantee cohort).
(Desired direction: increase)
Year Target Actual Status
(or date expected)
2005 65 Measure not in place
2006 40 (November 2006) Pending
2007 45 (November 2007) Pending
2008 55 (November 2008) Pending
2009 75 (November 2009) Pending
Source. U.S. Department of Education, Transition to Teaching Program Grantee Performance Report.
Frequency of Data Collection. Annual
Data Quality. In 2005, grantees from the 2004 cohort participated in the Transition to Teaching program's
pilot of a uniform reporting system that improved data consistency by creating consistent definitions of
terms but required outside contractors to manage (the online report was one part of the TTT program
evaluation). In 2006 the program began to use the Department's standard performance reporting form
(524B) for all grantees. While an improvement over the Department's previous narrative performance
reporting formats, the 524B still allows grantees to report data inconsistently from one another. In
response to recommendations from the Program Assessment Rating Tool (PART) process, in which the
TTT program participated in spring 2005, TTT staff worked in 2006 to verify previously reported grantee
data to ensure consistency and accuracy. Data in this report have been updated to reflect this
verification. While not a formal measure of validation, the 2004 grantees will also be responsible for
providing, in 2007, a three-year interim evaluation demonstrating progress over the first three years of the
grant. As in 2005 with the 2002 grantees, this interim evaluation may provide a validation of the actual
annual performance data for the 2004 grantees.
Explanation. The calculation is the cumulative number of teachers of record in high-need schools/LEAs
for the program divided by the total number of TTT participants for the program. Note that the data
reported for this cohort are for the first year of program implementation. Only a portion of grantees had
Measure 1.6 of 6: The percentage of Transition to Teaching (TTT) teachers of record who teach
in high-need schools in high-need LEAs for at least three years (2004 grantee cohort).
(Desired direction: increase)
Year Target Actual Status
(or date expected)
2008 Set a Baseline (November 2008) Pending
2009 BL+1% (November 2009) Pending
Source. U.S. Department of Education, Transition to Teaching Program Grantee Performance Report.
ESEA: Troops-to-Teachers
FY 2006 Program Performance Report
Strategic Goal 2
Discretionary
ESEA, Title II, Part C-1-A
Document Year 2006 Appropriation: $14,645
CFDA 84.815: Troops to Teachers
Measure 1.3 of 3: The percentage of Troops to Teachers participants who remain in teaching for
three or more years after placement in a teaching position in a high-need LEA. (Desired
direction: increase)
Year Target Actual Status
(or date expected)
2005 80 88 Target Exceeded
2006 80 (December 2006) Pending
2007 80 (December 2007) Pending
2008 80 (December 2008) Pending
Source. U.S. Department of Education, Troops to Teachers Program Grantee Performance Report.
Frequency of Data Collection. Annual
Explanation. "Participants" are those receiving financial support from the Troops-to-Teachers program,
either stipend or bonus. Both participants and recruits receive funding from the program and the words
are used interchangeably. "Eligible school district" is a high-need LEA as defined by program regulations.
"Teachers of record" are those Troops participants hired by an eligible school district, and all Troops
teachers are highly qualified. For FY 2006, this measure will report on Troops participants who began
teaching in the 2003-04 school year, for 2007 those who began teaching in 2004-05; for 2008 those who
began teaching in 2005-06. The FY 2005 data were not collected. The goal is to maintain the same
percentage of retention over the years.
Measure 1.2 of 2: The percentage of students participating at Voluntary Public School Choice
sites who exercise school choice by changing schools. (Desired direction: increase)
Year    Target    Actual (or date expected)    Status
2004 1 Measure not in place
2005 1.9 Measure not in place
2006 2 (November 2007) Pending
2007 2.5 (November 2008) Pending
2008 2.5 (November 2009) Pending
Source. U.S. Department of Education, Voluntary Public School Choice Grantee Performance Report;
National Evaluation of the Voluntary Public School Choice Program.
Frequency of Data Collection. Annual
Explanation. The calculation is the total number of students who changed schools divided by the total
number of eligible students for the VPSC program across all the grantees that reported enrollment data.
Eleven of the 13 grantees reported data in 2005. This approach is consistent with the national evaluation
of this program. This measure replaces a previous similar measure that was based on an average of
averages across sites. Trend data shown in the table reflect a re-calculation under the new definition.
'School' refers to a day or residential school, as well as schools within a school, off-campus learning and
'alternative' programs. 'Exercising choice' refers to students who moved from their assigned school to a
school of their choice. The targets reflect anticipated full implementation but may decrease over time
because of predicted declining enrollments in some grantee sites.
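The pooled calculation described above can be sketched as follows. This is an illustrative example only, not from the report: the site names and counts are hypothetical, and the point is the contrast between the current pooled ratio and the superseded average-of-averages approach.

```python
# Hypothetical grantee sites: students who changed schools vs. students
# eligible to exercise choice. Figures are made up for illustration.
sites = {
    "site_a": {"changed": 150, "eligible": 10_000},
    "site_b": {"changed": 40, "eligible": 1_000},
}

def pooled_choice_rate(sites):
    """Current measure: total movers / total eligible, across all sites, as a percent."""
    changed = sum(s["changed"] for s in sites.values())
    eligible = sum(s["eligible"] for s in sites.values())
    return 100 * changed / eligible

def average_of_averages(sites):
    """Superseded measure: mean of the per-site percentages."""
    rates = [100 * s["changed"] / s["eligible"] for s in sites.values()]
    return sum(rates) / len(rates)
```

Note that the two approaches diverge when sites differ in size: the pooled ratio weights each student equally, while the average of averages overweights small sites.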
Measure 1.2 of 2: The percentage of female students served by the Women's Educational
Equity program who indicate increased knowledge of and intent to pursue career options in
mathematics and the sciences (including computer science). (Desired direction: increase)
Year    Target    Actual (or date expected)    Status
2005 BL+10% (February 2007) Pending
2006 BL+17% (February 2008) Pending
2007 BL+20% (February 2009) Pending
2008 BL+25% (February 2010) Pending
Source. U.S. Department of Education, Women's Educational Equity Grantee Performance Report.
Frequency of Data Collection. Annual
Data Quality. Data are self-reported by grantees.
Explanation. Data from several WEEA projects have been delayed because much of the information is
being collected as part of the project evaluation. Some projects had only summer participants, which
accounts for the delay in providing information.
Independent panel reviews will be conducted as part of a national evaluation of the program through an
ED contract with an outside firm. The evaluation contract was awarded in August 2006, later than
originally anticipated.
Measure 1.2 of 2: The percentage of Comprehensive Centers' products and services deemed
to be of high relevance to educational practice by an independent review panel of qualified
practitioners. (Desired direction: increase)
Year    Target    Actual (or date expected)    Status
2006 Set a Baseline Not Collected Not Collected
2007 Set a Baseline (February 2008) Pending
2008 999 (February 2009) Pending
2009 999 (February 2010) Pending
2010 999 (February 2011) Pending
Source. U.S. Department of Education, Office of Elementary and Secondary Education, Comprehensive
Centers, independent review panel.
Frequency of Data Collection. Annual
Objective 2 of 2: Technical assistance products and services will be used to improve results
for children in the target areas.
Measure 2.1 of 1: The percentage of all comprehensive centers' products and services that are
deemed to be of high usefulness to educational policy or practice by target audiences. (Desired
direction: increase)
Year    Target    Actual (or date expected)    Status
2006 Set a Baseline Not Collected Not Collected
2007 Set a Baseline (February 2008) Pending
2008 999 (February 2009) Pending
2009 999 (February 2010) Pending
2010 999 (February 2011) Pending
Source. U.S. Department of Education, Office of Elementary and Secondary Education, Comprehensive
Centers, survey of targeted audiences.
Frequency of Data Collection. Annual
Explanation. Surveys of target audiences will be conducted as part of a national evaluation of the
program through an ED contract with an outside firm. The evaluation contract was awarded in August
2006, later than originally anticipated.
All grantees are funded during a single 12-month budget period, beginning on the award date. Data
showing progress against the performance target are available from all grantees 18-24 months
following the close of the budget period. The data collection and reporting period varies based on each
grantee's unique program schedule: some enroll a cohort with fixed start and end dates, while others use
open enrollment, in which students enter at any time and complete the program on different timeframes.
As a consequence, the period in which all activities funded by a budget cycle are completed will vary,
and full data collection may extend into subsequent budget periods.
Explanation. This is a long-term target. This measure differs from a similar FY 2005 performance
measure in focusing on the percentage of participants who receive the GED, rather than complete the
program and receive the GED, to more accurately reflect data collected from grantees. The calculation
for this revised measure is the number of participants who receive the GED certificate divided by the
number of HEP participants funded to be enrolled in GED Instruction.
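The revised HEP calculation described above is a simple ratio; the sketch below uses hypothetical counts (not report figures) to show the division of GED recipients by the funded GED-instruction enrollment.

```python
# Hypothetical HEP counts for illustration only.
ged_recipients = 425      # participants who received the GED certificate
funded_enrollment = 500   # participants funded to be enrolled in GED instruction

# Revised measure: recipients / funded enrollment, as a percent.
ged_rate = 100 * ged_recipients / funded_enrollment
```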
Source. U.S. Department of Education, Office of Safe and Drug Free Schools, State Grants for
Incarcerated Youth Offenders Program, periodic reports from grantees.
Frequency of Data Collection. Annual
Data Quality. Data are based on continuous enrollment. Therefore, current enrollment is being
compared to the outcome of graduates, including individuals served in the prior year and those still
enrolled at year end. This distorts the numbers when the program is either growing or contracting.
Programs differ in objectives and in the degrees/certificates offered, so very different outcomes are being
combined. Reporting is inconsistent from state to state, and some data being combined may not be reliable.
Explanation. In FY 2005, for the 43 states submitting aggregate data, 20,080 inmates participated in the
program. Of these, 4,633 completed a degree or certificate. The FY 2005 data are more accurate than prior
data, so targets for FY 2006 and FY 2008 were revised.
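The FY 2005 figures above imply an overall completion rate, which can be checked directly (subject to the data quality caveats noted, since enrollment and completion cover different populations):

```python
# FY 2005 aggregate figures from the report, 43 states.
participants = 20_080
completers = 4_633

# Implied completion rate, as a percent (roughly 23 percent).
completion_rate = 100 * completers / participants
```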
Objective 2 of 2: To reform teacher preparation programs in partnership with high need school
districts and schools of arts and sciences to produce highly qualified
teachers.
Measure 2.1 of 2: Cost per successful outcome: the federal cost per Teacher Quality
Enhancement program completer. (Desired direction: decrease)
Year    Target    Actual (or date expected)    Status
2004 2,932 Measure not in place
2005 4,728 Measure not in place
2007 999 (December 2007) Pending
2008 999 (December 2008) Pending
Source. U.S. Department of Education, Office of Postsecondary Education, Teacher Quality
Enhancement Grants Program Annual Performance Reports.
Data Quality. Data are verified by testing entities and certified by state licensing authorities. The data
collection meets the Title II, Higher Education Act requirement for a national reporting system on the
quality of teacher preparation.
Explanation. The efficiency measure is calculated as the allocation for partnership grants divided by the
number of highly qualified teacher candidates graduating from grantee postsecondary institutions. FY
2004 data were calculated by dividing the appropriation to institutions reporting highly qualified teachers
during the school year 2003-2004 ($4,078,018) by the number of program completers who were certified
as highly qualified teachers (2,125). $4,078,018/2,125 = $2,932. Note: Previously reported data for 2004
have been adjusted to be more accurate. Data for FY 2006 will be available in December 2006. For FY
2006-2008, this is an efficiency measure without targets.
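The efficiency calculation described above can be sketched generically. The figures below are hypothetical, not the report's; the structure is simply the partnership-grant allocation divided by the number of completers certified as highly qualified.

```python
# Hypothetical figures for illustration only.
allocation = 6_000_000   # partnership-grant allocation, in dollars
completers = 2_000       # program completers certified as highly qualified

# Efficiency measure: federal cost per completer.
cost_per_completer = allocation / completers
```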
Measure 2.2 of 2: The percentage of program completers who are highly qualified teachers.
(Desired direction: increase)
Year    Target    Actual (or date expected)    Status
2004 84 Measure not in place
2005 80 95 Target Exceeded
2006 95 (December 2006) Pending
2007 95 (December 2007) Pending
Measure 1.2 of 3: The number of states that serve at least 2 percent of infants and toddlers in
the general population, birth through age 2, through Part C. (Desired direction: increase)
Year    Target    Actual (or date expected)    Status
2002 25 Measure not in place
2003 20 27 Target Exceeded
2004 40 28 Made Progress From Prior Year
2005 31 30 Made Progress From Prior Year
2006 31 30 Did Not Meet Target
2007 31 (August 2007) Pending
2008 31 (August 2008) Pending
Source. U.S. Department of Education, Office of Special Education Programs, IDEA, section 618, state-
reported data. U.S. Census Bureau, census data.
Frequency of Data Collection. Annual
Explanation. Actual performance data previously reported for FY 2001-2003 reflected performance in FY
2002-2004 and have been corrected here.
Measure 1.3 of 3: The percentage of children receiving early intervention services in home or in
programs designed for typically developing children. (Desired direction: increase)
Year    Target    Actual (or date expected)    Status
1997 58 Measure not in place
1998 63 Measure not in place
1999 67 Measure not in place
2000 67 73 Target Exceeded
2001 69 76 Target Exceeded
2002 71 82 Target Exceeded
2003 78 83 Target Exceeded
2004 79 85 Target Exceeded
2005 83 87 Target Exceeded
2006 85 (August 2007) Pending
2007 89 (August 2008) Pending
2008 90 (August 2009) Pending
2009 91 (August 2010) Pending
Program Goal: Ensure all children with disabilities have available to them a free
appropriate public education to help them meet challenging
standards and prepare them for postsecondary education and/or
competitive employment and independent living by assisting
state and local educational agencies and families.
Objective 1 of 4: All children with disabilities will meet challenging standards as determined by
national and state assessments with accommodations as appropriate.
Measure 1.1 of 6: The number of states reporting an increase in the percentage of fourth-grade
students with disabilities scoring at or above proficient on state assessments in reading.
(Desired direction: increase)
Year    Target    Actual (or date expected)    Status
2004 24 Measure not in place
2005 25 13 Did Not Meet Target
2006 25 Pending
2007 26 Pending
Source. U.S. Department of Education, Consolidated State Performance Report, grantee submissions.
Frequency of Data Collection. Annual
Explanation. This measure parallels a measure for the Title I Grants to Local Education Agencies
program under the Elementary and Secondary Education Act.
Measure 1.2 of 6: The number of states reporting an increase in the percentage of eighth-grade
students with disabilities scoring at or above proficient on state assessments in mathematics.
(Desired direction: increase)
Year    Target    Actual (or date expected)    Status
2004 26 Measure not in place
FY 2006 Program Performance Report
121 11/14/2006
U.S. Department of Education
2005 25 32 Target Exceeded
2006 25 (August 2007) Pending
2007 26 (August 2008) Pending
Source. U.S. Department of Education, Consolidated State Performance Report, grantee submissions.
Frequency of Data Collection. Annual
Explanation. This measure parallels a measure for the Title I Grants to Local Education Agencies
program under the Elementary and Secondary Education Act.
Measure 1.3 of 6: The percentage of fourth-grade students with disabilities scoring at or above
Basic on the National Assessment of Educational Progress (NAEP) in reading. (Desired
direction: increase)
Year    Target    Actual (or date expected)    Status
2000 22 Measure not in place
2002 24 29 Target Exceeded
2003 25 29 Target Exceeded
2005 35 33 Made Progress From Prior Year
2007 35 (November 2007) Pending
2009 37 Pending
2011 39 Pending
Source. U.S. Department of Education, National Center for Education Statistics, National Assessment of
Educational Progress (NAEP).
Frequency of Data Collection. Biennial
Explanation. Targets for FY 2002 and 2003 were adjusted to be consistent with the Department's
Strategic Plan (2002-2007).
Measure 1.4 of 6: The percentage of fourth-grade students with disabilities who were included
in the National Assessment of Educational Progress (NAEP) reading sample, but excluded from
the testing due to their disabilities. (Desired direction: decrease)
Year    Target    Actual (or date expected)    Status
1998 41 Measure not in place
2002 39 Measure not in place
2003 33 Measure not in place
2005 35 Measure not in place
2007 33 (November 2007) Pending
2009 31 Pending
2011 29 Pending
Source. U.S. Department of Education, National Center for Education Statistics, National Assessment of
Educational Progress (NAEP).
Frequency of Data Collection. Biennial
Data Quality. The NAEP sample does not include schools specifically for students with disabilities.
Measure 1.5 of 6: The percentage of eighth-grade students with disabilities scoring at or above
Basic on the National Assessment of Educational Progress (NAEP) in mathematics. (Desired
direction: increase)
Year    Target    Actual (or date expected)    Status
2000 20 Measure not in place
2003 23 29 Target Exceeded
2005 32 31 Made Progress From Prior Year
2007 33 (November 2007) Pending
2009 35 Pending
2011 37 Pending
Source. U.S. Department of Education, National Center for Education Statistics, National Assessment of
Educational Progress.
Frequency of Data Collection. Biennial
Data Quality. Results of the NAEP scores for students with disabilities from this sample cannot be
generalized to the total population of such students.
Explanation. Targets for FY 2002 and 2003 were adjusted to be consistent with the Department's
Strategic Plan (2002-2007).
Measure 1.6 of 6: The percentage of eighth-grade students with disabilities who were included
in the National Assessment of Educational Progress (NAEP) mathematics sample, but excluded
from testing due to their disabilities. (Desired direction: decrease)
Year    Target    Actual (or date expected)    Status
2000 32 Measure not in place
2003 22 Measure not in place
2005 24 Measure not in place
2007 23 (November 2007) Pending
2009 21 Pending
2011 19 Pending
Source. U.S. Department of Education, National Center for Education Statistics, National Assessment of
Educational Progress.
Frequency of Data Collection. Biennial
Explanation. This measure was changed in 2006 to better focus on the percentage of children with
disabilities who are excluded from NAEP testing. Previous year data were recalculated accordingly.
Objective 2 of 4: Secondary school students will complete high school prepared for
postsecondary education and/or competitive employment.
Measure 2.1 of 3: The percentage of students with disabilities with IEPs who graduate from
high school with a regular high school diploma. (Desired direction: increase)
Source. U.S. Department of Education, Office of Special Education Programs, Individuals with
Disabilities Education Act, state-reported data.
Frequency of Data Collection. Annual
Explanation. The graduation rate is calculated by dividing the number of students with disabilities aged
14 and older who graduate with a regular diploma by the total number of students with disabilities in the
same age group who graduate with a regular diploma, receive a certificate of completion, reach the
maximum age for services, die, drop out, or move (not known to have continued in education). This
includes calculations for 57 entities (50 states, DC, Puerto Rico, Guam, American Samoa, Virgin Islands,
N. Marianas and BIA).
Measure 2.2 of 3: The percentage of students with disabilities who drop out of school.
(Desired direction: decrease)
Year    Target    Actual (or date expected)    Status
1996 47 Measure not in place
1997 46 Measure not in place
1998 44 Measure not in place
1999 42 Measure not in place
2000 42 Measure not in place
2001 41 Measure not in place
2002 38 Measure not in place
2003 34 Measure not in place
2004 31 Measure not in place
2005 34 28 Did Better Than Target
2006 29 (August 2007) Pending
2007 28 Pending
2008 27 Pending
Source. U.S. Department of Education, Office of Special Education Programs, Individuals with
Disabilities Education Act, state-reported data.
Frequency of Data Collection. Annual
Explanation. The dropout rate is calculated by dividing the number of students with disabilities aged 14
and older who drop out or move (not known to have continued in education) by the total number of
students with disabilities in the same age group who graduate with a regular diploma, receive a certificate
of completion, reach the maximum age for services, die, drop out, or move (not known to have continued
in education). This includes calculations for 57 entities (50 states, DC, Puerto Rico, Guam, American
Samoa, Virgin Islands, N. Marianas and BIA).
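The graduation- and dropout-rate calculations described above share the same denominator: all students with disabilities aged 14 and older who exited school for any reason. A minimal sketch, with hypothetical counts:

```python
# Hypothetical exit counts for students with disabilities aged 14 and older.
# The categories mirror the denominator described in the report.
exiters = {
    "regular_diploma": 560,
    "certificate_of_completion": 110,
    "reached_max_age": 30,
    "died": 5,
    "dropped_out": 240,
    "moved_not_known_to_continue": 55,
}

total_exiters = sum(exiters.values())  # common denominator for both rates

# Graduation rate: regular-diploma graduates over all exiters, as a percent.
graduation_rate = 100 * exiters["regular_diploma"] / total_exiters

# Dropout rate: dropouts plus movers not known to have continued, over all exiters.
dropout_rate = 100 * (
    exiters["dropped_out"] + exiters["moved_not_known_to_continue"]
) / total_exiters
```

Because the denominators match, the two rates (plus the remaining exit categories) account for all exiters in a given year.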
Measure 2.3 of 3: The percentage of youth with disabilities who are no longer in secondary
school and who are either competitively employed, enrolled in some type of postsecondary
school, or both, within two years of leaving high school. (Desired direction: increase)
Year    Target    Actual (or date expected)    Status
2004 59 Measure not in place
2005 59.5 75 Target Exceeded
2006 60 Pending
2007 Set a Baseline (March 2007) Pending
Source. U.S. Department of Education, Office of Special Education Programs, Postsecondary Outcomes
Center, Annual Evaluation Report.
Frequency of Data Collection. Annual
Explanation. Data for 2004 were gathered by the National Longitudinal Transition Study-2 (NLTS2) from
school year 2003-2004.
In February 2007, states will submit baseline post-school outcome data based on indicator 14 of the
State Performance Plan.
Objective 3 of 4: All children with disabilities will receive a free appropriate public education.
Measure 3.1 of 2: The number of states with at least 90 percent of special education teachers
fully certified in the areas in which they are teaching. (Desired direction: increase)
Year    Target    Actual (or date expected)    Status
1997 36 Measure not in place
1998 37 Measure not in place
1999 41 36 Did Not Meet Target
2000 42 36 Did Not Meet Target
2001 42 37 Made Progress From Prior Year
2002 42 33 Did Not Meet Target
2003 37 30 Did Not Meet Target
2004 37 36 Made Progress From Prior Year
2005 39 35 Did Not Meet Target
2006 40 Pending
Source. U.S. Department of Education, Office of Special Education Programs, Individuals with
Disabilities Education Act, state-reported data.
Frequency of Data Collection. Annual
Data Quality. Data reflect grades 1-12, not teachers teaching children aged 6-21. States maintain data by
grades taught, not ages of students. State requirements for teacher certification vary widely (i.e., teachers
fully certified in one state might not be considered eligible for full certification in another state).
Explanation. There is a clustering of states around the 90 percent threshold in this indicator, which may
result in unpredictable changes from year to year.
Measure 3.2 of 2: The percentage of children with disabilities served outside of the regular
classroom 60 percent or more of the day due to their disability (as a percentage of the school
population). (Desired direction: decrease)
Year    Target    Actual (or date expected)    Status
2001 2.85 Measure not in place
2002 2.81 Measure not in place
2003 2.77 Measure not in place
2004 2.67 Measure not in place
2005 2.69 (June 2007) Pending
2006 2.65 Pending
2007 2.64 Pending
Source. U.S. Department of Education, Office of Special Education Programs, Individuals with
Disabilities Education Act, state-reported data.
Frequency of Data Collection. Annual
Explanation. This measure cannot be calculated until the Common Core of Data is available to determine
the denominator.
Source. U.S. Department of Education, Office of Special Education Programs, program records.
Frequency of Data Collection. Annual
Measure 1.2 of 2: The federal cost per unit of technical assistance provided by the Special
Education Parent Training and Information Centers, by category, weighted by the expert panel
quality rating. (Desired direction: decrease)
Year    Target    Actual (or date expected)    Status
2006 Set a Baseline (August 2007) Pending
Source. U.S. Department of Education, Office of Special Education Programs, Special Education Parent
Information Centers, expert panel review.
Frequency of Data Collection. Annual
Explanation. This is a new efficiency measure.
Objective 2 of 2: Parents served by PTI investments will be knowledgeable about their IDEA
rights and responsibilities. (Long-term objective. Target areas: assessment;
literacy; behavior; instructional strategies; early intervention; and inclusive
practices)
Measure 2.1 of 1: The percentage of parents receiving Special Education Parent Information
Objective 2 of 2: Increase the supply of teachers and service providers who are highly
qualified for and serve in positions for which they are trained.
Measure 2.1 of 3: The percentage of Special Education Personnel Preparation funded scholars
who exit training programs prior to completion due to poor academic performance. (Desired
direction: decrease)
Year    Target    Actual (or date expected)    Status
2005 Set a Baseline 3 Target Met
2006 0.99 (October 2007) Pending
2007 0.99 Pending
Source. U.S. Department of Education, Office of Special Education Programs, Special Education
Personnel Preparation, annual data report.
Frequency of Data Collection. Annual
Explanation. OSEP initially anticipated that the performance target for this measure would be to achieve
decreases in the rate over time. However, because baseline data were better than anticipated (e.g., less
than 1 percent), instead of expecting even further decreases in outyears, OSEP believes that maintaining
a rate of less than 1 percent is desirable.
Measure 2.2 of 3: The percentage of Special Education Personnel Preparation funded
degree/certification program recipients employed upon program completion who are working in
the area(s) in which they were trained. (Desired direction: increase)
Year    Target    Actual (or date expected)    Status
2003 79 Measure not in place
2004 68 Measure not in place
2005 82 63 Did Not Meet Target
2006 71 (October 2007) Pending
2007 85 Pending
2008 86 Pending
2009 88 Pending
2010 89 Pending
Source. U.S. Department of Education, Office of Special Education Programs, Special Education
Personnel Preparation, annual data report.
Frequency of Data Collection. Annual
Explanation. The significant decrease from the 2003 actual figure of 79 percent to the 2004 actual figure
of 68 percent is due to refinements to the data collection system that permit a more accurate link between
area of employment and area of training. The FY 2006 target was adjusted based on past performance.
Measure 2.3 of 3: The percentage of Special Education Personnel Preparation funded
degree/certification recipients employed upon program completion who are working in the
area(s) for which they were trained and who are fully qualified under IDEA. (Desired direction:
increase)
Year    Target    Actual (or date expected)    Status
2005 Set a Baseline Not Collected Not Collected
2006 Set a Baseline (October 2007) Pending
Source. U.S. Department of Education, Office of Special Education Programs, Special Education
Personnel Preparation, annual data report.
Program Goal: To help preschool children with disabilities enter school ready to
succeed by assisting states in providing special education and
related services.
Objective 1 of 1: Preschool children with disabilities will receive special education and related
services that result in increased skills that enable them to succeed in school.
Measure 1.1 of 3: The percentage of children with disabilities (aged three through five) who
receive special education and related services in settings with typically developing peers (e.g.,
early childhood settings, home and part-time early childhood/part-time early childhood special
education settings). (Desired direction: increase)
Year    Target    Actual (or date expected)    Status
1999 41 Measure not in place
2000 40 Measure not in place
2001 39 Measure not in place
2002 39 40 Target Exceeded
2003 40 38 Did Not Meet Target
2004 40 37 Did Not Meet Target
2005 41 36 Did Not Meet Target
2006 40 37 Made Progress From Prior Year
2007 40 (August 2007) Pending
Source. U.S. Department of Education, Office of Special Education Programs, Individuals with
Disabilities Education Act (IDEA), state-reported data under section 618.
Frequency of Data Collection. Annual
Data Quality.
Measure 1.2 of 3: The number of states with at least 90 percent of special education teachers
of children aged three to five who are fully certified in the area in which they are teaching.
(Desired direction: increase)
Year    Target    Actual (or date expected)    Status
1997 35 Measure not in place
1998 37 Measure not in place
1999 40 34 Did Not Meet Target
2000 41 36 Made Progress From Prior Year
2001 40 35 Did Not Meet Target
2002 40 34 Did Not Meet Target
2003 36 32 Did Not Meet Target
2004 36 34 Made Progress From Prior Year
2005 37 33 Did Not Meet Target
2006 37 (August 2007) Pending
2007 38 (August 2008) Pending
2008 38 (August 2009) Pending
Source. U.S. Department of Education, Office of Special Education Programs, Individuals with
Disabilities Education Act (IDEA), state-reported data under section 618.
Frequency of Data Collection. Annual
Data Quality. States maintain data for students with disabilities by grades taught, not by ages of the
students taught. Therefore, these data are for teachers teaching prekindergarten and kindergarten.
Certification of related services personnel is not included because those requirements vary even more
widely than requirements for teachers (e.g., some states certify sign language interpreters, but other
states do not). The Office of Special Education Programs (OSEP) will implement follow-up actions to
increase emphasis on related services personnel, possibly following up on the SPeNSE study.
Explanation. There is a clustering of states around the 90 percent threshold for this measure, which may
result in unpredictable changes from year to year.
Measure 1.3 of 3: The percentage of children with disabilities (aged three through five)
participating in the Special Education Preschool Grants program who demonstrate positive
social-emotional skills (including social relationships); acquire and use knowledge and skills
(including early language/communication and early literacy); and use appropriate behaviors to
meet their needs. (Desired direction: increase)
Year    Target    Actual (or date expected)    Status
2006 Set a Baseline Not Collected Not Collected
2007 Set a Baseline (August 2007) Pending
2008 0 (August 2008) Pending
Source. U.S. Department of Education, Office of Special Education Programs, Pre-elementary Education
Longitudinal Study (PEELS).
Frequency of Data Collection. Annual
Explanation. This indicator focuses on early language/communication, early literacy, and social-
emotional skills because these skills are the best indicators of success in later years.
States will collect entry data in calendar year 2006, and that entry data will be reported to OSEP in 2007.
States will collect exit data on those children in calendar year 2007 and report baseline data to OSEP in
2008. That baseline data will also serve as the 2008 baseline for the Program Performance Plan.
In FY 2006, only 8 of the 49 grants funded under this program were conducting projects under the new
State Personnel Development Grant (SPDG) authority. The other grantees were conducting their projects
under the State Improvement Grant authority; those grants will expire at the end of FY 2007. FY 2006
performance data are based on the 8 SPDG projects.
Source. U.S. Department of Education, Office of Special Education Programs, Individuals with Disabilities
Education Act (IDEA), Special Education Technical Assistance and Dissemination, panel of experts.
Frequency of Data Collection. Annual
Measure 1.2 of 2: Federal cost per output defined as cost per unit of technical assistance
provided by Special Education Technical Assistance and Dissemination program, by category,
weighted by the expert panel quality rating. (Desired direction: decrease)
Year    Target    Actual (or date expected)    Status
2006 Set a Baseline (August 2007) Pending
2007 BL+999 (August 2008) Pending
2008 BL+999 (August 2009) Pending
Source. U.S. Department of Education, Office of Special Education Programs, Individuals with Disabilities
Education Act (IDEA), Special Education Technical Assistance and Dissemination, panel of experts.
Objective 2 of 2: The Technical Assistance and Dissemination program will identify, implement
and evaluate evidence-based models to improve outcomes for infants,
toddlers, children and youth with disabilities. (Long-term objective. Target
areas: assessment; literacy; behavior; instructional strategies; early
intervention; and inclusive practices)
Measure 2.1 of 1: The percentage of Technical Assistance and Dissemination projects
responsible for developing models that identify, implement and evaluate effective models.
(Desired direction: increase)
Year    Target    Actual (or date expected)    Status
2006 Set a Baseline (August 2007) Pending
2007 BL+5% (August 2008) Pending
2008 BL+10% (August 2009) Pending
Source. U.S. Department of Education, Office of Special Education Programs, Individuals with
Disabilities Education Act (IDEA), Special Education Technical Assistance and
Dissemination, panel of experts.
Frequency of Data Collection. Annual
Data Quality. Data will be collected every 2-3 years.
Objective 3 of 3: Investments in the Technology and Media Services Program will make
validated, evidence-based technologies to improve results for infants,
toddlers, children and youth with disabilities available for widespread use.
(Long-term objective. Focus areas: assessment, literacy, behavior,
instructional strategies, early intervention, and inclusive practices)
Measure 3.1 of 1: The percentage of Special Education Technology and Media Services
projects that make technologies that incorporate evidence-based practices available for
widespread use. (Desired direction: increase)
Year    Target    Actual (or date expected)    Status
2006 Set a Baseline (August 2007) Pending
Source. U.S. Department of Education, Office of Special Education Programs, Individuals with
Disabilities Education Act (IDEA), Special Education Technology and Media Services, expert panel
review.
Source. U.S. Department of Education, McKinney-Vento Homeless Assistance Act (MVHAA) Annual
Report.
Frequency of Data Collection. Annual
Data Quality. Data collected by state assessments are validated by the individual state's data quality
standards procedures. Data reflect information principally from LEAs with McKinney-Vento subgrants. For
2004, several states (fewer than 5) were unable to have their data systems extract assessment information
on homeless students, or were impacted by the 2005 hurricanes and unable to meet the data reporting
deadline.
Explanation. The data are collected from LEAs that have subgrantees and are capable of reporting such
data. Approximately 10 percent of all school districts receive subgrant funds.
Measure 1.2 of 4: The percentage of homeless children and youth, grades three through eight,
included in statewide assessments in mathematics, as reported by LEA subgrantees. (Desired
direction: increase)
Year    Target    Actual (or date expected)    Status
2004 15 Measure not in place
2005 16 49 Target Exceeded
2006 52 (December 2006) Pending
2007 60 Pending
Source. U.S. Department of Education, McKinney-Vento Homeless Assistance Act (MVHAA) Annual
Report.
FY 2006 Program Performance Report
137 11/14/2006
U.S. Department of Education
Frequency of Data Collection. Annual
Data Quality. Data collected by state assessments are validated by each state's data quality standards
and procedures. Data reflect information principally from LEAs with McKinney-Vento subgrants. For 2004,
several states (fewer than five) were unable to extract assessment information on homeless students from
their data systems, or were affected by the 2005 hurricanes and could not meet the data reporting
deadline.
Explanation. The data are collected from LEAs that have subgrants and are capable of reporting such
data. Approximately 10 percent of all school districts receive subgrant funds.
Measure 1.3 of 4: The percentage of homeless students, grades three through eight, who meet
or exceed proficiency on state assessments in reading/language arts. (Desired direction:
increase)
Year Target Actual (or date expected) Status
2002 30 Measure not in place
2004 36 Measure not in place
2005 34 42 Target Exceeded
2006 43 (December 2006) Pending
2007 50 Pending
Source. U.S. Department of Education, McKinney-Vento Homeless Assistance Act (MVHAA) Annual
Report.
Frequency of Data Collection. Annual
Data Quality. Data collected by state assessments are validated by each state's data quality standards
and procedures. Data reflect information principally from LEAs with McKinney-Vento subgrants.
Explanation. Data were reported by 46 states in 2005. Several states affected by the summer 2005
hurricanes were not able to produce data in time for this report.
Measure 1.4 of 4: The percentage of homeless students, grades three through eight, who meet
or exceed proficiency on state assessments in mathematics. (Desired direction: increase)
Year Target Actual (or date expected) Status
2002 24 Measure not in place
2004 36 Measure not in place
2005 26 41 Target Exceeded
2006 43 (December 2006) Pending
2007 50 Pending
Source. U.S. Department of Education, McKinney-Vento Homeless Assistance Act (MVHAA) Annual
Report.
Frequency of Data Collection. Annual
Data Quality. Data collected by state assessments are validated by each state's data quality standards
and procedures. Data reflect information principally from LEAs with McKinney-Vento subgrants.
Explanation. Several states impacted by the summer 2005 hurricanes were not able to produce data in
time for this report.
Measure 1.3 of 3: The percentage of Tech-Prep students who meet state established academic
standards. (Desired direction: increase)
Year Target Actual (or date expected) Status
2001 79 Measure not in place
2002 71 Measure not in place
2003 79 Measure not in place
2004 76 75 Did Not Meet Target
2005 77 77 Target Met
2006 78 (May 2007) Pending
2007 79 (May 2008) Pending
2008 80 (May 2009) Pending
Source. U.S. Department of Education, Office of Vocational and Adult Education, Vocational Technical
Education Annual Performance and Financial Reports, grantee submissions.
Frequency of Data Collection. Annual
Data Quality. Under Perkins III, states are allowed maximum flexibility in their data collection
methodologies and procedures. This flexibility limits data comparability at the national level. The State
Administration and Accountability Group (SAAG) will conduct national and regional training institutes to
improve data collection efforts especially in the areas of special populations and minority students. SAAG
will conduct targeted individual state technical assistance to improve performance for special populations
and minority students. SAAG will collaborate with other divisions and agencies to improve the
performance of CTE students, particularly special population and minority students.
Explanation. FY 2005 target was met. OVAE will continue providing states with technical assistance.
Program Goal: Increase access to and improve programs at the high school, and
community and technical college levels that raise academic
achievement, strengthen workforce preparation, and promote
economic development and lifelong learning.
Objective 1 of 2: The use of rigorous research findings to inform program direction and
improve state and local practices will increase, through the identification of
research-based education practices and the communication of what works to
practitioners, parents, and policy-makers.
Measure 1.1 of 4: The percentage of research studies conducted by the National Center for
Research in Career and Technical Education with rigorous designs, as defined by the
Department's definition of evidence-based research. (Desired direction: increase)
Year Target Actual (or date expected) Status
2002 71 Measure not in place
2003 83 Measure not in place
2004 100 100 Target Met
2005 100 100 Target Met
2006 100 (February 2007) Pending
2007 100 (December 2007) Pending
2008 Set a Baseline (February 2009) Pending
Source. U.S. Department of Education, Office of Vocational and Adult Education, independent review panel.
Frequency of Data Collection. Annual
Explanation. During 2006, the Perkins programs were undergoing reauthorization.
Measure 1.2 of 4: The number of customers receiving electronic materials or information from
the National Centers for Research and Dissemination in Career and Technical Education.
(Desired direction: increase)
Year Target Actual (or date expected) Status
2000 273,546 Measure not in place
2001 1,569,999 Measure not in place
2002 3,004,898 Measure not in place
2003 6,054,535 Measure not in place
2004 2,300,000 19,904,845 Target Exceeded
2005 2,300,000 32,393,646 Target Exceeded
Measure 1.3 of 4: The number of customers receiving print materials or information from the
National Centers for Research and Dissemination in Career and Technical Education. (Desired
direction: decrease)
Year Target Actual (or date expected) Status
2001 131,254 Measure not in place
2002 219,729 Measure not in place
2003 13,567 Measure not in place
2004 100,000 326,757 Did Not Meet Target
2005 50,000 319,876 Made Progress From Prior Year
2006 25,000 (February 2007) Pending
2007 25,000 (December 2007) Pending
2008 0 Not Collected Not Collected
Source. U.S. Department of Education, National Centers for Research and Dissemination in Career and
Technical Education Performance Report, grantee submissions.
Frequency of Data Collection. Annual
Data Quality. The number of customers is a duplicated count of individuals receiving information through
the Centers.
Explanation.
The program did not meet its FY 2005 target. The Web site provides viewers with access to
research-based information in many forms, including webcasts. Viewers frequently download documents or
webcasts for use in a variety of professional development and leadership activities. The actual data
reported are a composite of downloaded documents and printed materials.
Measure 1.4 of 4: The number of customers receiving materials or information (total, electronic
or print) from the National Centers for Research and Dissemination in Career and Technical
Education. (Desired direction: increase)
Year Target Actual (or date expected) Status
2000 273,546 Measure not in place
2001 300,000 1,701,253 Target Exceeded
Objective 2 of 2: Improve and expand the use of accountability systems and effective program
strategies at the high school and postsecondary levels that promote student
achievement, performance and successful transition.
Measure 2.1 of 1: The percentage of states that have data systems with the capacity to include
information on all indicators and subindicators for secondary and postsecondary programs.
(Desired direction: increase)
Year Target Actual (or date expected) Status
2001 92 Measure not in place
2002 97 Measure not in place
2003 98 Measure not in place
2004 100 98 Did Not Meet Target
2005 100 98 Did Not Meet Target
2006 100 (May 2007) Pending
2007 100 (May 2008) Pending
2008 100 (May 2009) Pending
Source.
U.S. Department of Education, Office of Vocational and Adult Education, Vocational Technical Education
Annual Performance and Financial Reports, grantee submissions.
Measure 1.4 of 7: The percentage of vocational concentrators who have completed high school.
(Desired direction: increase)
Year Target Actual (or date expected) Status
2000 80 Measure not in place
2001 84 Measure not in place
2002 85 84 Did Not Meet Target
2003 86 84 Did Not Meet Target
2004 88 84 Did Not Meet Target
2005 87 84 Did Not Meet Target
2006 88 (May 2007) Pending
2007 89 (May 2008) Pending
2008 90 (May 2009) Pending
Source. U.S. Department of Education, Office of Vocational and Adult Education, Vocational Technical
Education Annual Performance and Financial Reports, grantee submissions.
Frequency of Data Collection. Annual
Data Quality. Under Perkins III, states are allowed maximum flexibility in their data collection
methodologies and procedures. This flexibility limits data comparability at the national level. The State
Administration and Accountability Group (SAAG) will conduct national and regional training institutes to
improve data collection efforts especially in the areas of special populations and minority students. SAAG
will conduct targeted individual state technical assistance to improve performance for special populations
and minority students. SAAG will collaborate with other divisions and agencies to improve the
performance of CTE students, particularly special population and minority students.
Explanation. The FY 2005 performance target was not met because a number of states faced local data
collection challenges. OVAE is providing technical assistance to states that did not meet their
negotiated performance targets, and states are obtaining assistance from states that have met theirs.
In addition, two regional Data Quality Institutes (DQI) were held to give states guidance and technical
assistance and to allow them to collaborate. States were also given resource materials, websites, and
contact persons to assist with their data collection challenges.
Explanation.
The FY 2005 target was exceeded. The original due date for the 2006 data was July 2006; because of
summer graduations, complete data will not be available until September 2006. The target for the
percentage of students attaining high school diplomas is set high, and the grantee performs well in
developing quality programs and services that help students reach graduation goals.
The FY 2005 target was exceeded. The grantee uses a technology system that tracks students' progress
and captures data daily, so its technical capability to report on this measure is advanced. The increase
in the number of students who receive high school diplomas also directly affects the percentage of
students entering postsecondary and advanced programs.
Measure 2.3 of 3: The percentage of Native Hawaiian vocational students who obtained
employment. (Desired direction: increase)
Year Target Actual (or date expected) Status
2003 41 Measure not in place
2004 42 38 Did Not Meet Target
2005 43 33 Did Not Meet Target
2006 30 61 Target Exceeded
2007 40 (September 2007) Pending
2008 43 (August 2008) Pending
In January 2006, ED began conducting monthly monitoring and technical assistance calls to discuss data
collection processes, report requirements, and the project's newly updated strategic recruitment and
retention plan. The grantee provides progress reports on activities, including steps that will be taken to
meet goals.
Explanation. The program did not meet its FY 2006 target. Since 2004, however, the grantee has continued
to make progress toward meeting targets. The progress can be attributed to a well-designed strategic plan
that focuses on recruiting and retaining students; the plan is updated yearly to account for internal and
external influences.
Measure 3.2 of 2: The percentage of vocational education teachers in Pacific outlying areas
who received professional development. (Desired direction: increase)
Year Target Actual (or date expected) Status
2003 56 Measure not in place
2004 5 75 Target Exceeded
2005 35 66 Target Exceeded
2006 70 71 Target Exceeded
2007 75 (August 2007) Pending
Quarterly monitoring and technical assistance calls are conducted to discuss training opportunities
available to teachers, and the types of training that are pertinent to the success of the program.
Target Context. The FY 2006 target was exceeded. Progress is made each year in meeting this goal.
Explanation. The FY 2006 target was exceeded. The number of teachers available to attend the
training/development opportunities increased. Additionally, there was an increase in the number of
relevant and pertinent development opportunities available in the geographical area.
Measure 4.2 of 3: The number of NAVTEP students attaining a certificate or degree. (Desired
direction: increase)
Year Target Actual (or date expected) Status
2002 664 Measure not in place
2003 690 728 Target Exceeded
2004 725 1,598 Target Exceeded
2005 1,478 Measure not in place
2006 1,598 1,609 Target Exceeded
2007 1,620 (January 2008) Pending
2008 1,630 (January 2009) Pending
Source. U.S. Department of Education, Office of Vocational and Adult Education, Native American
Vocational and Technical Education Program, performance report, grantee submissions.
Frequency of Data Collection. Annual
Data Quality.
Data are self-reported by grantees through performance, statistical, and evaluation reports. When feasible,
data will be checked by staff during on-site monitoring of projects. ED will continue to request
increased emphasis on meeting graduation requirements during clarification conferences with grantees.
Quarterly monitoring and technical assistance calls are conducted to discuss data collection processes,
and report requirements. During calls and/or site visits, grantees provide progress reports, and activities
and steps that will be taken to meet goals.
Explanation.
The 2006 target was exceeded as a result of students meeting two-year degree requirements. The data
represent students who have been enrolled and re-enrolled since 2002 and who met the requirements to
receive certificates or degrees in 2006.
Measure 4.3 of 3: The number of NAVTEP students placed in employment or military services.
(Desired direction: increase)
Year Target Actual (or date expected) Status
2002 1,606 Measure not in place
2003 1,690 Measure not in place
2004 1,715 1,430 Did Not Meet Target
2005 1,387 Measure not in place
2006 1,430 1,443 Target Exceeded
2007 1,450 (January 2008) Pending
2008 1,460 (January 2009) Pending
Source. U.S. Department of Education, Office of Vocational and Adult Education, Native American
Vocational and Technical Education Program, performance report, grantee submissions.
Frequency of Data Collection. Annual
Data Quality.
Data are self-reported by grantees through performance, statistical, and evaluation reports. When feasible,
data will be checked by staff during on-site monitoring of projects. ED will continue to request increased
enrollment numbers during clarification conferences with grantees for new and continuation awards.
Quarterly monitoring and technical assistance calls are conducted to discuss data collection processes
and report requirements.
Explanation.
The FY 2006 target was exceeded. In 2006, there was a significant increase in the number of students
who obtained degrees, causing an increase in the number of students who gained employment.
Additionally, many of the enrolled students are adults who may have practical experience that is credited
toward a particular certificate or degree, and who are employed while in the program. One challenge in
meeting this target is that some students are forced to leave school in order to work, or continue to work
while in school.
Program Goal: To help reduce alcohol abuse among secondary school students.
Objective 1 of 1: Support the implementation of research-based alcohol abuse prevention
programs in secondary schools.
Measure 1.1 of 6: The percentage of Alcohol Abuse Reduction program grantees whose target
students show a measurable decrease in binge drinking: 2005 cohort. (Desired direction:
increase)
Year Target Actual (or date expected) Status
2006 Set a Baseline (December 2006) Pending
2007 BL+25% Pending
Source. U.S. Department of Education, Office of Safe and Drug Free Schools, Alcohol Abuse Reduction
Program Annual Grantee Performance Reports.
Frequency of Data Collection. Annual
Explanation. The 2006 target was to set a baseline.
Grantees will collect data concerning the binge drinking behavior of students served by the grant. The FY
2006 target is to gather initial data for the FY 2005 cohort; because FY 2006 was the first year of the
grant period for this cohort, FY 2006 data cannot be used to measure progress.
Measure 1.2 of 6: The percentage of Alcohol Abuse Reduction program grantees that show a
measurable increase in the percentage of target students who believe that binge drinking is
harmful to their health: 2005 cohort. (Desired direction: increase)
Year Target Actual (or date expected) Status
2006 Set a Baseline (December 2006) Pending
2007 BL+25% (December 2007) Pending
Source. U.S. Department of Education, Office of Safe and Drug Free Schools, Alcohol Abuse Reduction
Program Annual Grantee Performance Reports.
Frequency of Data Collection. Annual
Explanation. The 2006 target was to set a baseline.
Grantees will collect information about the attitudes of students served under the program relative to
perception of health risk and social disapproval of alcohol abuse. The FY 2006 target is to gather initial
data for the FY 2005 cohort; because FY 2006 was the first year of the grant period for this cohort, FY
2006 data cannot be used to measure progress.
Measure 1.3 of 6: The percentage of Alcohol Abuse Reduction program grantees that show a
measurable increase in the percentage of target students who disapprove of alcohol abuse:
2005 cohort. (Desired direction: increase)
Year Target Actual (or date expected) Status
2006 Set a Baseline (December 2006) Pending
2007 BL+25% (December 2007) Pending
Source. U.S. Department of Education, Office of Safe and Drug Free Schools, Alcohol Abuse Reduction
Program Annual Grantee Performance Reports.
Frequency of Data Collection. Annual
Explanation. The 2006 target was to set a baseline.
Grantees will collect information about the attitudes of students served under the program relative to
perception of health risk and social disapproval of alcohol abuse. The FY 2006 target is to gather initial
data for the FY 2005 cohort; because FY 2006 was the first year of the grant period for this cohort, FY
2006 data cannot be used to measure progress.
Measure 1.4 of 6: The percentage of Alcohol Abuse Reduction program grantees that show a
measurable increase in the percentage of target students who believe that binge drinking is
harmful to their health: 2004 cohort. (Desired direction: increase)
Year Target Actual (or date expected) Status
2006 55.56 Measure not in place
Actual performance data are based on the number of grantees who experienced an increase in the
percentage of target students who believe that binge drinking is harmful to their health. Due to the
missing response by one grantee, actual performance for this measure may be as low as 50 percent, and
as high as 60 percent.
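The 50-60 percent range quoted above follows from simple arithmetic on the grantee counts; a minimal sketch (counts taken from the text, with the non-responding grantee assumed to fall either way):

```python
# 5 of 9 responding grantees showed an increase; 1 grantee did not respond,
# so the cohort has 10 grantees and the true rate is bracketed by counting
# the non-respondent as a "no" (lower bound) or a "yes" (upper bound).
increases, responded, missing = 5, 9, 1
cohort = responded + missing

reported = round(increases / responded * 100, 2)        # rate among respondents
lower = round(increases / cohort * 100, 2)              # non-respondent counted as "no"
upper = round((increases + missing) / cohort * 100, 2)  # non-respondent counted as "yes"
print(reported, lower, upper)  # 55.56 50.0 60.0
```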
OSDFS program and policy staff have developed guidance in the past year to assist grantees in better
understanding their GPRA obligations. The guidance also provides a consistent and uniform set of
definitions, methods, and reporting standards that aim to increase the quality of GPRA data for the
Alcohol Abuse program. Training on this guidance has been provided at grantee meetings.
Explanation.
55.56 percent, or five of nine grantees, experienced an increase in the percentage of their target students
who believe that binge drinking is harmful to their health.
Data were collected from grantees via annual performance reports that they submitted to program staff in
2005 and 2006. Performance data were calculated by comparing baseline and performance data.
Baseline data vary somewhat, as some grantees were able to collect baseline data in the first year,
while other grantees collected the data at two points in time in the second year of implementation.
Therefore, directional changes in performance may not fully capture changes from pre-implementation to
the time that the 2006 performance reports were submitted. Another area of variation is in the data
elements collected by the grantees: the wording of the survey questions grantees used to collect the
required data varied. However, only data that fully met the construct measured by each performance
measure were included in the calculation of the program performance data.
Measure 1.5 of 6: The percentage of Alcohol Abuse Reduction grantees whose target students
show a measurable decrease in binge drinking: 2004 cohort. (Desired direction: decrease)
Year Target Actual (or date expected) Status
2006 50 Measure not in place
2007 70 (December 2007) Pending
Source. U.S. Department of Education, Office of Safe and Drug Free Schools, Alcohol Abuse Reduction
Program Annual Grantee Performance Reports.
Frequency of Data Collection. Annual
Data Quality. OSDFS program and policy staff have developed guidance in the past year to assist
grantees in better understanding their GPRA obligations. The guidance also provides a consistent and
uniform set of definitions, methods, and reporting standards that aim to increase the quality of GPRA data
for the Alcohol Abuse program. Training on this guidance has been provided at grantee meetings.
Explanation.
Data were collected from grantees via annual performance reports that they submitted to program staff in
2005 and 2006. Performance data were calculated by comparing baseline and performance data.
Baseline data vary somewhat, as some grantees were able to collect baseline data in the first year,
while other grantees collected the data at two points in time in the second year of implementation.
Therefore, directional changes in performance may not fully capture changes from pre-implementation to
the time that the 2006 performance reports were submitted. Another area of variation is in the data
elements collected by the grantees: the wording of the survey questions grantees used to collect the
required data varied. However, only data that fully met the construct measured by each performance
measure were included in the calculation of the program performance data.
Measure 1.6 of 6: The percentage of Alcohol Abuse Reduction program grantees that show a
measurable increase in the percentage of target students who disapprove of alcohol abuse:
2004 cohort. (Desired direction: increase)
Year Target Actual (or date expected) Status
2006 66.67 Measure not in place
2007 87 (December 2007) Pending
Source. U.S. Department of Education, Office of Safe and Drug Free Schools, Alcohol Abuse Reduction
Program Annual Grantee Performance Reports.
Frequency of Data Collection. Annual
Data Quality. OSDFS program and policy staff have developed guidance in the past year to assist
grantees in better understanding their GPRA obligations. The guidance also provides a consistent and
uniform set of definitions, methods, and reporting standards that aim to increase the quality of GPRA data
for the Alcohol Abuse program. Training on this guidance has been provided at grantee meetings.
Explanation.
66.67 percent, or six of nine grantees, showed a measurable increase in the percentage of target students
who disapprove of alcohol abuse. Due to one missing valid response, actual performance for this measure
could have been as low as 60 percent and as high as 70 percent.
Data were collected from grantees via annual performance reports that they submitted to program staff in
2005 and 2006. Performance data were calculated by comparing baseline and performance data.
Baseline data vary somewhat, as some grantees were able to collect baseline data in the first year,
while other grantees collected the data at two points in time in the second year of implementation.
Therefore, directional changes in performance may not fully capture changes from pre-implementation to
the time that the 2006 performance reports were submitted. Another area of variation is in the data
elements collected by the grantees: the wording of the survey questions grantees used to collect the
required data varied. However, only data that fully met the construct measured by each performance
measure were included in the calculation of the program performance data.
Program Goal: To help promote the development of strong character among the
nation's students.
Objective 1 of 1: Support the development and implementation of high-quality character
education programs.
Measure 1.1 of 1: The proportion of Partnerships in Character Education projects
demonstrating improved student outcomes through valid, rigorous evaluations: 2004 cohort.
(Desired direction: increase)
Year Target Actual (or date expected) Status
2006 Set a Baseline (December 2006) Pending
Source. U.S. Department of Education, Office of Safe and Drug Free Schools, Partnerships in Character
Education Program biennial evaluation reports.
Frequency of Data Collection. Biennial
Data Quality. While all grantees are required to conduct evaluations, only those responding to the
competitive preference for rigorous evaluations are actually conducting valid, rigorous evaluations. Thus,
only a subset of Character Education grantees are actually reflected in the data collected under this
measure. Evaluation results will be available at the completion of each project. Data related to each
cohort are collected biennially.
Explanation. A subset of grantees evaluate their projects using either experimental or quasi-experimental
designs. Evaluation reports will not be available annually. Future year targets will be established as
baseline data become available.
Measure 1.2 of 3: The number of referrals for disciplinary reasons in schools participating in the
Elementary and Secondary School Counseling program: 2005 cohort. (Desired direction:
decrease)
Year Target Actual (or date expected) Status
2006 Set a Baseline (December 2006) Pending
Source. U.S. Department of Education, Office of Safe and Drug Free Schools, Elementary and
Secondary School Counseling Program Annual Grantee Performance Reports.
Frequency of Data Collection. Annual
Measure 1.3 of 3: The number of suspensions for disciplinary reasons in schools participating in
the Elementary and Secondary School Counseling program: 2005 cohort. (Desired direction: decrease)
Measure 1.2 of 5: The number of new partner capabilities among partner museums in the
Exchanges with Historic Whaling and Trading Partners Program. (Desired direction: increase)
Year Target Actual (or date expected) Status
2004 Set a Baseline (July 2006) Pending
Measure 1.4 of 5: The number of schools, community groups, and family programs involved in
educational and cultural enrichment activities of the Exchanges with Historic Whaling and
Trading Partners Program. (Desired direction: increase)
Year Target Actual (or date expected) Status
2004 Set a Baseline (July 2006) Pending
2005 BL+10% Pending
2006 BL+15% Pending
2007 1,343 (March 2007) Pending
2008 1,343 (March 2008) Pending
Source. U.S. Department of Education, Historic Whaling Partnerships Grantee Performance Report.
Frequency of Data Collection. Annual
Data Quality. Data are self reported by grantees.
Measure 1.5 of 5: The number of participants in a culturally based youth internship program
under the Exchanges with Historic Whaling and Trading Partners Program involving career
awareness, leadership, and job skills development. (Desired direction: increase)
Year Target Actual (or date expected) Status
2004 Set a Baseline 120 Target Met
2005 132 (July 2006) Pending
2006 139 (March 2007) Pending
2007 146 (March 2008) Pending
2008 146 (March 2009) Pending
Source. U.S. Department of Education, Historic Whaling Partnerships Grantee Performance Report.
Frequency of Data Collection. Annual
Data Quality. Data are self reported by grantees.
Program Goal: To support mentoring programs and activities for children who
are at risk of educational failure, dropping out of school, or
involvement in criminal or delinquent activities, or who lack
strong positive role models.
Objective 1 of 1: Provide grants to community-based organizations and local school districts
to support mentoring programs for high-risk youth.
Measure 1.1 of 4: The percentage of student-mentor matches that are sustained by the
grantees for a period of 12 months: 2004 cohort. (Desired direction: increase)
Year Target Actual (or date expected) Status
2006 Set a Baseline 44.86 Target Met
2007 BL+50% (December 2007) Pending
Source. U.S. Department of Education, Office of Safe and Drug Free Schools, Mentoring Program,
Annual Grantee Performance Reports.
Frequency of Data Collection. Annual
Data Quality. Of 100 reports sampled (from a total of 165 in the 2004 cohort), 65% (n=65) provided valid,
aggregable data for this measure. In order to be valid and aggregable, grantees needed to report on
matches that were sustained for at least 12 months (for example, some grantees provided the number of
matches that were sustained for nine months, and we were not able to include this in the calculation).
Additionally, grantees needed to provide the total number of matches made, regardless of duration, as a
denominator on which to calculate the percentage.
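The validity rule described above (a report counts only if it supplies both a 12-month numerator and a total-matches denominator) can be sketched as follows; the report fields and values are hypothetical, chosen only to illustrate which reports are excluded:

```python
# Hypothetical grantee reports; a report is usable only if it gives both the
# number of matches sustained for at least 12 months and the total matches made.
reports = [
    {"sustained_12mo": 25, "total_matches": 60},
    {"sustained_12mo": None, "total_matches": 40},  # missing numerator: excluded
    {"sustained_9mo": 10, "total_matches": 30},     # 9-month figure only: excluded
]

valid = [r for r in reports
         if r.get("sustained_12mo") is not None and r.get("total_matches")]
sustained = sum(r["sustained_12mo"] for r in valid)
total = sum(r["total_matches"] for r in valid)
print(len(valid), round(sustained / total * 100, 2))  # 1 41.67
```

Only the valid subset contributes to the aggregate percentage, which is why the effective response rate is lower than the number of reports sampled.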
Target Context.
The grant application for the 2004 Mentoring Program laid out the following targets for this measure:
The percentage of student/mentor matches that are sustained for a period of twelve months will increase
by:
• 0% by 2005;
Explanation. 44.86 percent (2,092 of 4,663 total matches) of the student-mentor matches reported by
grantees providing valid, aggregable data were sustained for at least 12 months.
Measure 1.2 of 4: The percentage of mentored students who demonstrate improvement in core
academic subjects as measured by grade point average after 12 months: 2004 cohort.
(Desired direction: increase)
Year Target Actual (or date expected) Status
2006 Set a Baseline 49.62 Target Met
2007 BL+30% (December 2007) Pending
Source. U.S. Department of Education, Office of Safe and Drug Free Schools, Mentoring Program,
Annual Grantee Performance Reports.
Frequency of Data Collection. Annual
Data Quality. Of 100 reports sampled (from a total of 165 in the 2004 cohort), 58% (n=58) provided valid,
aggregable data for this measure. In order to be valid and aggregable, grantees needed to report on the
number of students whose GPA in core academic subjects improved during the 12 months preceding the
performance report. Some inconsistencies which may affect data quality include:
-Some grantees reported on improvement during the school year, rather than comparing grades over a
12-month period. For practical reasons, these data were included in the aggregate.
-Some reports only provided data that were disaggregated by academic subject (e.g., reading, math,
language arts). In this case, for consistency's sake, we used the percentage of students whose GPA in
reading had improved.
Target Context.
The application for the 2004 Mentoring Program stated the following targets for this measure:
The percentage of mentored students who demonstrate improvement in core academic subjects as
measured by grade point average after 12 months will increase:
-5% by 2005;
-15% by 2006;
-30% by 2007.
Measure 1.3 of 4: The percentage of mentored students who have unexcused absences from
school: 2004 cohort. (Desired direction: decrease)
Year  Target          Actual (or date expected)  Status
2005  Set a Baseline  39.4                       Target Met
2006                  47.81                      Measure not in place
2007  24              (December 2007)            Pending
Source. U.S. Department of Education, Office of Safe and Drug Free Schools, Mentoring Program,
Annual Grantee Performance Reports.
Frequency of Data Collection. Annual
Data Quality. Of 100 reports sampled (from a total of 165 in the 2004 cohort), 43% (n=43) provided valid,
aggregable data for this measure. To be valid and aggregable, a report needed to state the number of
students with unexcused absences during the reporting period. Some grantees provided data for all
absences, for the number of suspensions, or for the number of students who improved their attendance;
we were not able to include these in the aggregate, resulting in the low response rate.
Target Context. The application for the 2004 Mentoring program included the following target:
The percentage of mentored students who have unexcused absences from school will decrease:
-10% by 2005
-30% by 2006
-40% by 2007
Decreases will be measured from baseline data. Based on these targets, the target for 2006 was 28
percent. The program did not meet this target. The target for 2007 is 24 percent.
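The target arithmetic above can be checked directly: each year's target is the 2005 baseline rate reduced by the stated percentage, rounded to a whole percent. A quick sketch (the helper function is illustrative, not part of the program's stated methodology):

```python
def targets_from_baseline(baseline, reductions_by_year):
    """Each target is the baseline rate reduced by the stated percentage, rounded to a whole percent."""
    return {year: round(baseline * (1 - pct / 100)) for year, pct in reductions_by_year.items()}

# 2005 baseline: 39.4 percent of mentored students had unexcused absences.
print(targets_from_baseline(39.4, {2006: 30, 2007: 40}))  # → {2006: 28, 2007: 24}
```

This reproduces the targets stated in the report: 39.4 reduced by 30 percent is 27.58, which rounds to the 2006 target of 28 percent, and reduced by 40 percent is 23.64, the 2007 target of 24 percent.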
Explanation. 47.81 percent (1,246 of 2,606) of students served by Mentoring grants submitting valid,
aggregable data had at least one unexcused absence during the reporting period.
These data should be interpreted with caution due to the low response rate and the small amount of
valid, aggregable data provided by grantees. Of 80 reports examined from the 2005 cohort, 8 (10%)
provided valid, aggregable data.
Explanation. 78.01 percent (330 of 425) of students served by mentoring grants reporting valid,
aggregable data for this measure had at least one unexcused absence during the reporting period.
Source. U.S. Department of Education, Office of Safe and Drug Free Schools, Physical Education
Program, Annual Grantee Performance Reports.
Frequency of Data Collection. Annual
Data Quality. We sampled grants (n=106) from a list of all PEP grantees that received an initial award in
2004 (n=237). The list was rank-ordered by the first year award amount, and grants were systematically
sampled from that list to ensure that it was representative of the range of award amounts.
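The systematic sampling described above can be sketched as follows. The function below is a hypothetical illustration of drawing units at a fixed interval from a rank-ordered list, not the office's actual sampling procedure:

```python
def systematic_sample(ordered_units, sample_size):
    """Draw sample_size units at a fixed interval from a rank-ordered list."""
    step = len(ordered_units) / sample_size
    return [ordered_units[int(i * step)] for i in range(sample_size)]

# 237 PEP grantees rank-ordered by first-year award amount; 106 sampled.
grantees = [f"grant_{i}" for i in range(237)]
sample = systematic_sample(grantees, 106)
print(len(sample))  # → 106
```

Because the list is ordered by award amount before the interval is applied, the sample spans the full range of award sizes, which is the representativeness property the report describes.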
The rate of sampled grants submitting valid, aggregable data for this measure in the 2006 GPRA
collection is 38 percent.
Although it is difficult to explain the decrease in students making progress towards physical education
standards, OSDFS' emphasis on data quality in GPRA for the 2006 collection may have played a role. For
example, the number of grants providing valid, aggregable data was much greater in 2006 (n=40) than it
was in 2005 (n=25). This was due to increased follow-up and monitoring to gather GPRA data from PEP
grantees. Additionally, OSDFS program staff and PEP grantees were provided with improved technical
assistance around acceptable definitions and collection methods. These standards were followed and
applied more consistently in the 2006 PEP data aggregation.
Target Context. No target was established for 2006.
Explanation. 65.06 percent (n=58,389) of students served by the sampled PEP grantees made progress
toward meeting state standards for physical education.
Measure 1.3 of 4: The percentage of students served by the Physical Education Program
grants that are actively participating in physical education activities: 2004 cohort. (Desired
direction: increase)
The rate of sampled grants submitting valid, aggregable data for this measure in the 2006 GPRA
collection is 34 percent.
There are some factors to consider when comparing 2006 and past-year data. OSDFS' emphasis on data
quality in GPRA for the 2006 collection may have caused variance. For example, the number of grants
providing valid, aggregable data was much greater in 2006 (n=36) than it was in 2005 (n=24). This was
due to increased follow-up and monitoring to gather GPRA data from PEP grantees. Additionally, OSDFS
program staff and PEP grantees were provided with improved technical assistance around acceptable
definitions and collection methods. These standards were followed and applied more consistently in
the 2006 PEP data aggregation.
Explanation. 71.12 percent (n=55,579) of students served by the sampled PEP grantees actively
participated in physical education activities.
Measure 1.2 of 6: The percentage of Safe Schools/Healthy Students grant sites that experience
a decrease in the number of violent incidents at schools during the three-year grant period: 2005
cohort. (Desired direction: increase)
Year  Target          Actual (or date expected)  Status
2006  Set a Baseline  (December 2007)            Pending
Source. U.S. Department of Education, Office of Safe and Drug Free Schools, Safe and Drug-Free
Schools and Communities Other National Programs, Annual Grantee Performance Report.
Frequency of Data Collection. Annual
Data Quality. There are 40 grantees in the 2005 cohort of Safe Schools/Healthy Students. All three
measures established for this program require three years of data, as the performance measures look at
grantee performance over the three-year grant period. Grantees will submit their annual reports,
containing baseline data for this measure, in November 2006. Grantees are anticipated to submit their
annual performance reports in November 2008. At this time, year three data will be compared to year one
data to determine if a decrease in violent incidents was experienced over the three-year grant period.
Measure 1.3 of 6: The percentage of Safe Schools/Healthy Students grant sites that experience
a decrease in substance abuse during the three-year grant period: 2004 cohort. (Desired
direction: increase)
Year  Target               Actual (or date expected)  Status
2005  Set a Baseline       0                          Target Met
2006  Maintain a Baseline  (December 2006)            Pending
2007  90                   (December 2007)            Pending
Source. U.S. Department of Education, Office of Safe and Drug Free Schools, Safe and Drug-Free
Schools and Communities Other National Programs Annual Grantee Performance Report.
Frequency of Data Collection. Annual
Data Quality. There are 24 grantees in the 2004 cohort of Safe Schools/Healthy Students. All three
measures established for this program require three years of data, as the performance measures look at
grantee performance over the three-year grant period. Grantees submitted their first annual reports in
2005. At this time, baseline data for all measures were collected. Nineteen grantees provided the baseline
data requested, resulting in a 79 percent response rate. These data are reported via school incident
reports and self-report behavioral surveys conducted by evaluators at each site. The next annual reports
for this program are due in November 2006. At that time, year two performance data will be compared to
year one data to determine if sites experienced a decrease in the rate of substance abuse. Year two data
will be considered interim data and will be used to provide feedback to grantees on their performance.
Grantees are anticipated to submit their annual performance reports in 2007. At this time, year three data
will be compared to year one data to determine if a decrease in substance abuse was experienced over the
three-year grant period.
Measure 1.4 of 6: The percentage of Safe Schools/Healthy Students grant sites that experience
a decrease in substance abuse during the three-year grant period: 2005 cohort. (Desired
direction: increase)
Year  Target          Actual (or date expected)  Status
2006  Set a Baseline  (December 2007)            Pending
Source. U.S. Department of Education, Office of Safe and Drug Free Schools, Safe and Drug-Free
Schools and Communities Other National Programs Annual Grantee Performance Report.
Frequency of Data Collection. Annual
Data Quality. There are 40 grantees in the 2005 cohort of Safe Schools/Healthy Students. All three
measures established for this program require three years of data, as the performance measures look at
grantee performance over the three-year grant period. Grantees will submit their first annual reports,
containing baseline data for this measure, in December 2006. Grantees are anticipated to submit their
annual performance reports in November 2008. At this time, year three data will be compared to year one
data to determine if a decrease in the rate of substance abuse was experienced over the three-year grant
period.
Measure 1.5 of 6: The percentage of Safe Schools/Healthy Students grant sites that improve
school attendance during the three-year grant period: 2004 cohort. (Desired direction:
increase)
Measure 1.6 of 6: The percentage of Safe Schools/Healthy Students grant sites that improve
school attendance during the three-year grant period: 2005 cohort. (Desired direction:
increase)
Year  Target          Actual (or date expected)  Status
2006  Set a Baseline  (December 2007)            Pending
Source. U.S. Department of Education, Office of Safe and Drug Free Schools, Safe and Drug-Free
Schools and Communities Other National Programs, Annual Grantee Performance Report.
Frequency of Data Collection. Annual
Data Quality. There are 40 grantees in the 2005 cohort of Safe Schools/Healthy Students. All three
measures established for this program require three years of data, as the performance measures look at
grantee performance over the three-year grant period. Grantees will submit their first annual reports,
containing baseline data for this measure, in December 2006. Grantees are anticipated to submit their
annual performance reports in November 2008. At this time, year three data will be compared to year one
data to determine if an improvement in attendance was experienced over the three-year grant period.
Objective 2 of 2: Student drug testing grantees will make substantial progress in reducing
substance abuse incidence among target students.
Measure 2.1 of 4: The percentage of Student Drug Testing grantees that experience a five
percent annual reduction in the incidence of past-month drug use by students in the target
population: 2003 cohort. (Desired direction: increase)
Year  Target          Actual (or date expected)  Status
2005  Set a Baseline  Not Collected              Not Collected
2006  Set a Baseline  33                         Target Met
2007  50              (December 2007)            Pending
There are some major data quality issues to keep in mind when interpreting the data provided for the
measure “Percentage of Student Drug Testing grantees that experience a five percent annual reduction in
the incidence of past-month drug use by students in the target population.” Grantees needed to have
provided two years of data, using the identical measure in both years, on self-reported student past-
month drug use. The decrease between years had to be at least 5 percent to meet the threshold for
this measure. Of the 8 grantees, 3 provided two years of valid data (38% response rate). Of those, one
experienced a decrease in past-month drug use of 5 percent or more. Due to the very low response rate,
caution is urged in drawing any conclusions about this program’s performance.
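The threshold test described above reduces to a simple comparison. The report does not specify whether "at least 5 percent" means a relative decrease or a percentage-point decrease, so the sketch below assumes a relative decrease, and the rate pairs shown are hypothetical:

```python
def reduction_met(year1_rate, year2_rate, threshold_pct=5.0):
    """Whether the rate fell by at least threshold_pct percent of the year-one rate."""
    if year1_rate <= 0:
        return False
    return (year1_rate - year2_rate) / year1_rate * 100 >= threshold_pct

# Hypothetical two-year past-month use rates for three grantees.
pairs = [(20.0, 18.0), (15.0, 14.5), (12.0, 12.5)]
met = sum(reduction_met(a, b) for a, b in pairs)
print(f"{met} of {len(pairs)} grantees met the threshold")
# → 1 of 3 grantees met the threshold
```

The program measure then reports the percentage of grantees for which this test is true, which is why it requires two years of data on an identical measure from each grantee.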
Measure 2.2 of 4: The percentage of Student Drug Testing grantees that experience a five
percent annual reduction in the incidence of past-year drug use by students in the target
population: 2003 cohort. (Desired direction: increase)
Year  Target          Actual (or date expected)  Status
2005  Set a Baseline  0                          Target Met
2006  Set a Baseline  25                         Target Met
2007  50              (December 2007)            Pending
Source. U.S. Department of Education, Office of Safe and Drug Free Schools, Safe and Drug-Free
Schools and Communities Other National Programs, Annual Grantee Performance Report.
Frequency of Data Collection. Annual
Data Quality. There were 8 grantees in the 2003 cohort of Drug Testing grantees. Implementation of all
grants was behind by about a year due to a delay in IRB approval. Because the unit of analysis for this
measure is the grant site experiencing a decrease in student drug use, the measure requires comparing
two years of data. Taking into account the nature of the measure and the IRB-related delay, no data were
available for this measure until 2006.
There are some major data quality issues to keep in mind when interpreting the data provided for the
measure “Percentage of Student Drug Testing grantees that experience a five percent annual reduction in
the incidence of past-year drug use by students in the target population.” Grantees needed to have
provided two years of data, using the identical measure in both years, on self-reported student past-year
drug use. The decrease between years had to be at least 5 percent to meet the threshold for
this measure. Of the 8 grantees, 4 provided two years of valid data (50% response rate). Of those, one
experienced a decrease in past-year drug use of 5 percent or more. Due to the very low response rate,
caution is urged in drawing any conclusions about this program’s performance.
Measure 2.3 of 4: The percentage of Student Drug Testing grantees that experience a five
percent annual reduction in the incidence of past-month drug use by students in the target
population: 2005 cohort. (Desired direction: increase)
FY 2006 Program Performance Report
11/14/2006
U.S. Department of Education

Year  Target  Actual (or date expected)  Status
2007  33      (August 2007)              Pending
2008  50      (August 2008)              Pending
Source. U.S. Department of Education, Office of Safe and Drug Free Schools, Safe and Drug-Free
Schools and Communities Other National Programs, Annual Grantee Performance Report.
Frequency of Data Collection. Annual
Measure 2.4 of 4: The percentage of Student Drug Testing grantees that experience a five
percent annual reduction in the incidence of past-year drug use by students in the target
population: 2005 cohort. (Desired direction: increase)
Year  Target  Actual (or date expected)  Status
2007  25      (August 2007)              Pending
2008  50      (August 2008)              Pending
Source. U.S. Department of Education, Office of Safe and Drug Free Schools, Safe and Drug-Free
Schools and Communities Other National Programs, Annual Grantee Performance Report.
Frequency of Data Collection. Annual
Data Quality. The following is excerpted from the Centers for Disease Control and Prevention's
"Methodology of the Youth Risk Behavior Surveillance System" (MMWR 2004;53(No. RR-12):[10].)
Data Quality
From the inception of YRBSS, CDC has been committed to ensuring that the data are the highest quality.
High quality data begins with high quality questionnaire items. As described previously, the original
questionnaire was subjected to laboratory and field testing. CDC also has conducted reliability and
validity testing of the 1991 and 1999 versions of the questionnaires. In addition, in 1999, when CDC
changed the YRBS question that assesses race/ethnicity to comply with new standards established by
the Office of Management and Budget (74), CDC conducted a study to assess the effect of the new
race/ethnicity question on reported race/ethnicity. The study indicated that the revised wording had only a
minimal effect on reported race/ethnicity and that trend analyses that included white, black, and Hispanic
subgroups were not affected (22). Another aspect of data quality is the level of nonresponse to questions.
For the 2003 national YRBS, nonresponse attributed to blank responses, invalid responses, out-of-range
responses, and responses that did not meet edit criteria ranged from 0.4% for the question that assesses
respondent age to 15.5% for the question that assesses injurious suicide attempt. For two thirds of all
questions, the nonresponse rate was <1%.
To further ensure data quality, surveys are administered by using standardized procedures. To determine
how using different procedures can affect survey results, CDC conducted two methodologic studies. In
the first study, conducted in 2002, CDC examined how varying honesty appeals,§ the wording of
questions, and data-editing protocols — while holding population, setting, questionnaire context, and
mode of administration constant — affected prevalence estimates (75). The study indicated that different
honesty appeals and data-editing protocols do not have a statistically significant effect on prevalence
estimates. In addition, the study indicated that, although differences in the wording of questions can
create statistically significant differences in certain prevalence estimates, no particular type of wording
consistently produced higher or lower estimates.
In the second study, conducted in 2004, CDC examined how varying the mode and setting of survey
administration might affect prevalence estimates. In this study, the standard paper-and pencil method of
survey administration was compared with computer-assisted self-interviewing (CASI). Researchers
determined from previous studies that, in household settings, adolescents are more likely to report
sensitive behaviors when using CASI than when using paper-and-pencil questionnaires (76,77), but this
effect has not been demonstrated in school settings (78,79). In the 2004 study, CDC also compared
whether prevalence estimates varied by where the questionnaire was administered, in schools or in
students’ homes. Researchers who have compared the results of surveys administered in school versus
in household settings typically have determined that students are more likely to report sensitive behaviors
in school-based settings (21,80,81). However, in these studies, students were not randomly assigned to
setting. In one study in which random assignment to setting was used, these effects were not observed
(82). The CDC study is the first in which both mode and setting were systematically varied, while holding
constant population, questionnaire context, wording of questions, and data-editing protocols. This study is
also the first one in which random assignment to condition was used. Results from this study should be
available in 2005.
Limitations
YRBSS has multiple limitations. First, all YRBS data are self-reported, and the extent of underreporting or
overreporting of behaviors cannot be determined, although measures described in this report
demonstrate that the data are of acceptable quality. Second, the national, state, and local school-based
survey data apply only to youth who attend school and, therefore, are not representative of all persons in
this age group. Nationwide, of persons aged 16–17 years, approximately 6% were not enrolled in a high
school program and had not completed high school (83). The NHIS and Youth Risk Behavior Supplement
conducted in 1992 demonstrated that out-of-school youth are more likely than youth attending school to
engage in the majority of health-risk behaviors (84). Third, because local parental permission procedures
are observed in the school-based surveys, procedures are not consistent across sites. However, in a 2004
study, CDC demonstrated that the type of parental permission typically does not affect prevalence
estimates as long as student response rates remain high (85). Fourth, state-level data are not available
for all 50 states. Fifth, when response rates are insufficient to permit weighting, state and local data
represent only those students who participated in the survey and are not generalizable to the entire
jurisdiction. Sixth, whereas YRBSS is designed to produce information to help assess the effect of broad
national, state, and local policies and programs, it was not designed to evaluate the effectiveness of
specific interventions (e.g., a professional development program, school curriculum, or media campaign).
Finally, YRBSS only addresses behaviors that contribute to the leading causes of morbidity and mortality
among youth and adults. However, despite this limited scope, school and community interventions should
focus not only on behaviors but also on the determinants of those behaviors.
Explanation. This is a long-term measure. Data are collected on a calendar-year, not a school-year,
basis from a nationally representative sample of students.
Measure 1.2 of 7: The percentage of students in grades 9-12 who used marijuana one or more
times during the past 30 days. (Desired direction: decrease)
Year  Target  Actual (or date expected)  Status
2001          24                         Measure not in place
2003          22                         Measure not in place
2005  21      20                         Did Better Than Target
2007  19      (September 2008)           Pending
2009  18      (September 2010)           Pending
2011  17      (September 2012)           Pending
Source. U.S. Department of Health and Human Services, Centers for Disease Control, Youth Risk
Behavior Surveillance System (YRBSS).
Frequency of Data Collection. Biennial
Data Quality. This measure draws on the same source; see the excerpt from the Centers for Disease
Control and Prevention's "Methodology of the Youth Risk Behavior Surveillance System" (MMWR
2004;53(No. RR-12):[10]), reproduced in full, including its Limitations discussion, above.
Explanation. Data are collected on a calendar-year, not a school-year, basis from a nationally
representative sample of students.
Measure 1.3 of 7: The percentage of students in grades 9-12 who had five or more drinks of
alcohol in a row (that is, within a couple of hours) one or more times during the past 30 days.
(Desired direction: decrease)
Year  Target  Actual (or date expected)  Status
2001          30                         Measure not in place
2003          28                         Measure not in place
2005  27      26                         Did Better Than Target
2007  26      (September 2008)           Pending
2009  25      (September 2010)           Pending
2011  24      (September 2012)           Pending
Data Quality. This measure draws on the same source; see the excerpt from the Centers for Disease
Control and Prevention's "Methodology of the Youth Risk Behavior Surveillance System" (MMWR
2004;53(No. RR-12):[10]), reproduced in full, including its Limitations discussion, above.
Explanation. Data are collected on a calendar-year, not a school-year, basis from a nationally
representative sample of students.
Measure 1.4 of 7: The percentage of students in grades 9-12 who were in a physical fight on
school property one or more times during the past 12 months. (Desired direction: decrease)
Actual
Year Target Status
(or date expected)
1999 14 Pending
2001 12 13 Did Not Meet Target
2003 12 13 Did Not Meet Target
2005 12 14 Did Not Meet Target
2007 12 (September 2008) Pending
2009 11 (September 2010) Pending
2011 11 (September 2012) Pending
Source. U.S. Department of Health and Human Services, Centers for Disease Control, Youth Risk
Behavior Surveillance System (YRBSS).
Frequency of Data Collection. Other
Data Quality. The following is excerpted from the Centers for Disease Control and Prevention's
"Methodology of the Youth Risk Behavior Surveillance System" (MMWR 2004;53(No. RR-12):[10].)
Data Quality
From the inception of YRBSS, CDC has been committed to ensuring that the data are of the highest quality.
High-quality data begin with high-quality questionnaire items. As described previously, the original
questionnaire was subjected to laboratory and field testing. CDC also has conducted reliability and
validity testing of the 1991 and 1999 versions of the questionnaires. In addition, in 1999, when CDC
changed the YRBS question that assesses race/ethnicity to comply with new standards established by
the Office of Management and Budget (74), CDC conducted a study to assess the effect of the new
race/ethnicity question on reported race/ethnicity. The study indicated that the revised wording had only a
minimal effect on reported race/ethnicity and that trend analyses that included white, black, and Hispanic
subgroups were not affected (22). Another aspect of data quality is the level of nonresponse to questions.
For the 2003 national YRBS, nonresponse attributed to blank responses, invalid responses, out-of-range
responses, and responses that did not meet edit criteria ranged from 0.4% for the question that assesses
respondent age to 15.5% for the question that assesses injurious suicide attempt. For two thirds of all
questions, the nonresponse rate was <5%, and for 11% of all questions, the nonresponse rate was <1%.
To further ensure data quality, surveys are administered by using standardized procedures. To determine
how using different procedures can affect survey results, CDC conducted two methodologic studies. In
the first study, conducted in 2002, CDC examined how varying honesty appeals, the wording of
questions, and data-editing protocols — while holding population, setting, questionnaire context, and
mode of administration constant — affected prevalence estimates (75). The study indicated that different
honesty appeals and data-editing protocols do not have a statistically significant effect on prevalence
estimates. In addition, the study indicated that, although differences in the wording of questions can
create statistically significant differences in certain prevalence estimates, no particular type of wording
consistently produced higher or lower estimates.
In the second study, conducted in 2004, CDC examined how varying the mode and setting of survey
administration might affect prevalence estimates. In this study, the standard paper-and-pencil method of
survey administration was compared with computer-assisted self-interviewing (CASI). Researchers
determined from previous studies that, in household settings, adolescents are more likely to report
sensitive behaviors when using CASI than when using paper-and-pencil questionnaires (76,77), but this
effect has not been demonstrated in school settings (78,79). In the 2004 study, CDC also compared
whether prevalence estimates varied by where the questionnaire was administered, in schools or in
students’ homes. Researchers who have compared the results of surveys administered in school versus
in household settings typically have determined that students are more likely to report sensitive behaviors
in school-based settings (21,80,81). However, in these studies, students were not randomly assigned to
setting. In one study in which random assignment to setting was used, these effects were not observed
(82). The CDC study is the first in which both mode and setting were systematically varied, while holding
constant population, questionnaire context, wording of questions, and data-editing protocols. This study is
also the first one in which random assignment to condition was used. Results from this study should be
available in 2005.
Limitations
YRBSS has multiple limitations. First, all YRBS data are self-reported, and the extent of underreporting or
overreporting of behaviors cannot be determined, although measures described in this report
demonstrate that the data are of acceptable quality. Second, the national, state, and local school-based
survey data apply only to youth who attend school and, therefore, are not representative of all persons in
this age group. Nationwide, of persons aged 16–17 years, approximately 6% were not enrolled in a high
school program and had not completed high school (83). The NHIS and Youth Risk Behavior Supplement
conducted in 1992 demonstrated that out-of-school youth are more likely than youth attending school to
engage in the majority of health-risk behaviors (84). Third, because local parental permission procedures
are observed in the school-based surveys, procedures are not consistent across sites. However, in a
2004 study, CDC demonstrated that the type of parental permission typically does not affect prevalence
estimates as long as student response rates remain high (85). Fourth, state-level data are not available
for all 50 states. Fifth, when response rates are insufficient to permit weighting, state and local data
represent only those students who participated in the survey and are not generalizable to the entire
jurisdiction. Sixth, whereas YRBSS is designed to produce information to help assess the effect of broad
national, state, and local policies and programs, it was not designed to evaluate the effectiveness of
specific interventions (e.g., a professional development program, school curriculum, or media campaign).
Finally, YRBSS only addresses behaviors that contribute to the leading causes of morbidity and mortality
among youth and adults. However, despite this limited scope, school and community interventions should
focus not only on behaviors but also on the determinants of those behaviors.
Explanation. Data are collected on a calendar-year, not a school-year, basis from a nationally
representative sample of students.
Measure 1.5 of 7: The percentage of students in grades 9-12 who carried a weapon such as a
gun, knife, or club on school property one or more times during the past 30 days. (Desired
direction: decrease)
Actual
Year Target Status
(or date expected)
2001 6 Measure not in place
Data Quality
From the inception of YRBSS, CDC has been committed to ensuring that the data are of the highest quality.
High-quality data begin with high-quality questionnaire items. As described previously, the original
questionnaire was subjected to laboratory and field testing. CDC also has conducted reliability and
validity testing of the 1991 and 1999 versions of the questionnaires. In addition, in 1999, when CDC
changed the YRBS question that assesses race/ethnicity to comply with new standards established by
the Office of Management and Budget (74), CDC conducted a study to assess the effect of the new
race/ethnicity question on reported race/ethnicity. The study indicated that the revised wording had only a
minimal effect on reported race/ethnicity and that trend analyses that included white, black, and Hispanic
subgroups were not affected (22). Another aspect of data quality is the level of nonresponse to questions.
For the 2003 national YRBS, nonresponse attributed to blank responses, invalid responses, out-of-range
responses, and responses that did not meet edit criteria ranged from 0.4% for the question that assesses
respondent age to 15.5% for the question that assesses injurious suicide attempt. For two thirds of all
questions, the nonresponse rate was <5%, and for 11% of all questions, the nonresponse rate was <1%.
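The nonresponse tally described above reduces to simple arithmetic. The following is a minimal sketch of that calculation; the function name and counts are invented for illustration, not actual YRBS tallies:

```python
# Hypothetical sketch of the nonresponse-rate arithmetic described above.
# The counts below are invented for illustration, not actual YRBS data.

def nonresponse_rate(blank, invalid, out_of_range, failed_edit, total):
    """Percentage of responses to a question that could not be used."""
    unusable = blank + invalid + out_of_range + failed_edit
    return 100.0 * unusable / total

# A question answered by 15,000 students, 60 of whom left it blank or
# gave invalid, out-of-range, or edit-failing responses:
print(nonresponse_rate(40, 10, 5, 5, 15000))  # 0.4
```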
To further ensure data quality, surveys are administered by using standardized procedures. To determine
how using different procedures can affect survey results, CDC conducted two methodologic studies. In
the first study, conducted in 2002, CDC examined how varying honesty appeals,§ the wording of
questions, and data-editing protocols — while holding population, setting, questionnaire context, and
mode of administration constant — affected prevalence estimates (75). The study indicated that different
honesty appeals and data-editing protocols do not have a statistically significant effect on prevalence
estimates. In addition, the study indicated that, although differences in the wording of questions can
create statistically significant differences in certain prevalence estimates, no particular type of wording
consistently produced higher or lower estimates.
In the second study, conducted in 2004, CDC examined how varying the mode and setting of survey
administration might affect prevalence estimates. In this study, the standard paper-and-pencil method of
survey administration was compared with computer-assisted self-interviewing (CASI). Researchers
determined from previous studies that, in household settings, adolescents are more likely to report
sensitive behaviors when using CASI than when using paper-and-pencil questionnaires (76,77), but this
effect has not been demonstrated in school settings (78,79). In the 2004 study, CDC also compared
whether prevalence estimates varied by where the questionnaire was administered, in schools or in
students’ homes. Researchers who have compared the results of surveys administered in school versus
in household settings typically have determined that students are more likely to report sensitive behaviors
in school-based settings (21,80,81). However, in these studies, students were not randomly assigned to
setting. In one study in which random assignment to setting was used, these effects were not observed
(82). The CDC study is the first in which both mode and setting were systematically varied, while holding
constant population, questionnaire context, wording of questions, and data-editing protocols. This study is
also the first one in which random assignment to condition was used. Results from this study should be
available in 2005.
Limitations
Source. U.S. Department of Education, Office of Safe and Drug Free Schools, Safe and Drug-Free
Schools and Communities State Grant Program, forthcoming evaluation study.
Frequency of Data Collection. Annual
Measure 1.7 of 7: The percentage of Safe and Drug-Free Schools and Communities State
Grant funded research-based drug and violence prevention programs/practices that are
implemented with fidelity. (Desired direction: increase)
Actual
Year Target Status
(or date expected)
2005 Set a Baseline (September 2007) Pending
2006 Maintain a Baseline Pending
2007 Maintain a Baseline Pending
Source. U.S. Department of Education, Office of Safe and Drug Free Schools, Safe and Drug-Free
Schools and Communities State Grant Program, forthcoming evaluation study.
Frequency of Data Collection. Annual
Program Goal: To prepare and train Indians to serve as teachers and school
administrators.
Objective 1 of 1: Indian Education National Activities focus on research, evaluation, collection,
dissemination and analyses of the educational status, needs and effective
approaches for the education of American Indian and Alaska Native children
and adults.
Measure 1.1 of 2: The number of annual hits on the NCES Web based data tool and the OIE
Web site. (Desired direction: increase)
Actual
Year Target Status
(or date expected)
2006 Set a Baseline (December 2006) Pending
2007 BL+1% (December 2007) Pending
Source. U.S. Department of Education, National Center for Education Statistics, Web site.
Measure 1.2 of 2: The percentage of high quality national educational studies that oversample
and report statistically reliable data on American Indian and Alaska Natives. (Desired direction:
increase)
Actual
Year Target Status
(or date expected)
2006 Set a Baseline (December 2006) Pending
2007 BL+1% (December 2007) Pending
Source. U.S. Department of Education, National Center for Education Statistics, Web site.
Frequency of Data Collection. Annual
IES researchers evaluate all research and evaluation proposals newly funded by IES to identify projects
that address causal questions and, of those projects, those that use randomized experimental designs to
answer those questions.
Explanation. The target for 2007 (the lower of the baseline plus 10 percent or 75 percent) and for 2008
(the lower of the baseline plus 20 percent or 75 percent) recognizes that some high-quality research addressing
causal questions will not be able to employ randomized experimental designs. This may particularly be
the case for special education research involving children with low-incidence disabilities. Presence of a
causal question is defined as instances in which the investigation is designed to examine the effects of
one variable on a second variable. A causal relation might be expressed as one variable influencing,
affecting, or changing another variable. A randomized experimental design is defined as instances in
which there are (a) an experimental (treatment) group and one or more comparison groups, and (b)
random assignment of participants to treatment and comparison groups, or random assignment of groups
(e.g., classrooms or schools) to treatment and comparison conditions. If a proposal includes a design in
which two or more groups of participants are compared, but the PI does not explicitly indicate that random
assignment procedures will be used, the proposal is recorded as not using a randomized experimental
design.
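The recording rule defined in this explanation is a simple conjunction of conditions. A minimal sketch follows; the helper is hypothetical and is not IES's actual proposal-review tooling:

```python
# Hypothetical sketch of the recording rule described above; an
# illustration only, not IES's actual proposal-review tooling.

def uses_randomized_design(has_treatment_group: bool,
                           num_comparison_groups: int,
                           random_assignment_stated: bool) -> bool:
    """A proposal is recorded as using a randomized experimental design
    only if it has a treatment group, at least one comparison group, and
    the PI explicitly states that random assignment will be used."""
    return (has_treatment_group
            and num_comparison_groups >= 1
            and random_assignment_stated)

# A two-group comparison whose PI never states random assignment is
# recorded as NOT using a randomized experimental design:
print(uses_randomized_design(True, 1, False))  # False
print(uses_randomized_design(True, 2, True))   # True
```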
Measure 1.2 of 2: The percentage of new research proposals funded by the Department's
National Center for Special Education Research that receive an average score of excellent or
higher from an independent review panel of qualified scientists. (Desired direction: increase)
Actual
Year Target Status
(or date expected)
2006 Set a Baseline 89 Target Met
2007 90 (September 2007) Pending
2008 90 (September 2008) Pending
Source. U.S. Department of Education, Institute of Education Sciences, expert panel review.
Frequency of Data Collection. Annual
Data Quality. Evaluations are only as good as the qualifications of the peer review panel. Inclusion of
senior scientists who are leading researchers in their fields ensures the quality of the data.
Objective 2 of 2: Increase the relevance of our research in order to meet the needs of our
customers.
Measure 2.1 of 1: The percentage of new research projects funded by the Department's
National Center for Special Education Research that are deemed to be of high relevance by an
independent review panel of qualified practitioners. (Desired direction: increase)
Actual
Year Target Status
(or date expected)
2006 Set a Baseline (December 2006) Pending
2007 BL+10% (November 2007) Pending
2008 BL+20% (November 2008) Pending
Source. U.S. Department of Education, Institute of Education Sciences, independent review panel.
Frequency of Data Collection. Annual
Data Quality. Evaluations are only as good as the qualifications of the peer review panel. Inclusion of
experienced practitioners and administrators in education and special education ensures the quality of the
data.
Explanation. The target for 2007 (the lower of the baseline plus 10 percent or 75 percent) and for 2008
(the lower of the baseline plus 20 percent or 75 percent) recognizes that some important research may not seem
immediately relevant but will make important contributions over the long term.
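The capped target rule stated here can be expressed directly. The sketch below is a hypothetical illustration; it assumes "baseline plus X percent" means a relative increase over the baseline value, and the rounding step is likewise an assumption:

```python
# Hypothetical sketch of the capped-target rule described above. Assumes
# "baseline plus X percent" is a relative increase over the baseline;
# the one-decimal rounding is also an assumption.

def capped_target(baseline, increase_pct, cap=75.0):
    """Lower of (baseline grown by increase_pct percent) or the cap."""
    return min(round(baseline * (1 + increase_pct / 100.0), 1), cap)

print(capped_target(60.0, 10))  # 66.0 (below the 75 percent cap)
print(capped_target(72.0, 10))  # 75.0 (capped)
```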
Measure 1.2 of 2: Of new research and evaluation projects funded by the Department's
National Center for Education Research and National Center for Education Evaluation and
Regional Assistance that address causal questions, the percentage of projects that employ
randomized experimental designs. (Desired direction: increase)
Actual
Year Target Status
(or date expected)
2001 32 Measure not in place
2002 75 100 Target Exceeded
2003 75 97 Target Exceeded
2004 75 90 Target Exceeded
2005 75 94 Target Exceeded
2006 75 (December 2006) Pending
2007 75 (October 2007) Pending
2008 75 (October 2008) Pending
Source. U.S. Department of Education, Institute of Education Sciences, program report.
Frequency of Data Collection. Annual
Data Quality.
Explanation. The 75 percent target for 2002-06 recognizes that some high-quality research addressing
causal questions will not be able to employ randomized experimental designs. Presence of a causal
question is defined as instances in which the investigation is designed to examine the effects of one
variable on a second variable. A causal relation might be expressed as one variable influencing, affecting,
or changing another variable. A randomized experimental design is defined as instances in which there
are (a) an experimental (treatment) group and one or more comparison groups, and (b) random
assignment of participants to treatment and comparison groups, or random assignment of groups (e.g.,
classrooms or schools) to treatment and comparison conditions. If a proposal includes a design in which
two or more groups of participants are compared, but the PI does not explicitly indicate that random
assignment procedures will be used, the proposal is recorded as not using a randomized experimental
design.
Objective 2 of 2: Increase the relevance of our research in order to meet the needs of our
customers.
Measure 2.1 of 3: The percentage of new research projects funded by the Department's
National Center for Education Research and National Center for Education Evaluation and
Regional Assistance that are deemed to be of high relevance to education practices as
determined by an independent review panel of qualified practitioners. (Desired direction:
increase)
Actual
Year Target Status
(or date expected)
2001 21 Measure not in place
2002 25 25 Target Met
2003 37 60 Target Exceeded
2004 50 50 Target Met
2005 65 (December 2006) Pending
2006 75 (March 2007) Pending
2007 75 (March 2008) Pending
2008 75 (March 2009) Pending
Source. U.S. Department of Education, Institute of Education Sciences, expert panel of qualified
practitioners.
Frequency of Data Collection. Annual
Data Quality. Evaluations are only as good as the qualifications of the external review panel. Inclusion of
experienced practitioners and administrators in education and special education ensures the quality of the
data.
Explanation. The target of 75 percent for 2006 recognizes that some important research may not seem
immediately relevant but will make important contributions over the long term.
Measure 2.2 of 3: The number of annual hits on the What Works Clearinghouse Web site.
(Desired direction: increase)
Actual
Year Target Status
(or date expected)
2003 1,000,000 1,522,922 Target Exceeded
2004 2,000,000 4,249,668 Target Exceeded
2005 4,500,000 5,706,257 Target Exceeded
2006 5,000,000 6,794,141 Target Exceeded
2007 5,500,000 (October 2007) Pending
2008 5,800,000 (October 2008) Pending
Source. U.S. Department of Education, Institute of Education Sciences, What Works Clearinghouse Web
site.
Frequency of Data Collection. Annual
Data Quality. A Web based program automatically counts the hits on this Web site.
ESRA: Statistics
FY 2006 Program Performance Report
Strategic Goal 4
Other
ESRA, Part C
Document Year 2006 Appropriation: $90,022
When changing to the on-line survey, NCES also modified the questions asked of respondents. The
Department no longer collects information specifically on the comprehensiveness of NCES data files.
Prior-year data were validated by using NCES review procedures and by applying NCES statistical
standards.
Measure 1.2 of 7: The percentage of customer respondents satisfied or very satisfied with the
timeliness of NCES data files. (Desired direction: increase)
Actual
Year Target Status
(or date expected)
1997 52 Measure not in place
1999 85 67 Made Progress From Prior Year
2001 90 66 Made Progress From Prior Year
2004 90 78 Made Progress From Prior Year
2006 90 86 Made Progress From Prior Year
2007 90 (July 2007) Pending
2008 90 (July 2008) Pending
Source. U. S. Department of Education, National Center for Education Statistics, customer satisfaction
survey.
Frequency of Data Collection. Annual
Data Quality. In 2006, NCES changed its customer service survey data collection to an on-line random
sample survey of NCES customers who visited the NCES Web site. From 1997 through 2004, NCES
administered a biennial pen-and-paper survey with a telephone follow-up to over 3,900 academic
researchers, education associations, education journalists, users of the NCES's National Education Data
Resource Center, and Federal, State, and local policymakers. The pen-and-paper format with telephone
follow-up was becoming increasingly expensive compared with the limited cost of conducting on-line
surveys.
When changing to the on-line survey, NCES also modified the questions asked of respondents. Given
these changes, data collected prior to 2006 are not comparable to those collected in 2006 and future
years.
Data will be validated by using NCES review procedures and by applying NCES statistical standards. The
NCES Monitoring System will yield annual updates on the use and applications of NCES data.
Explanation. NCES expects that each year, all user manuals for NCES public-use data files will be available on the
Web, at least 50 percent of its public-use data files will be available on the Web, and 75 percent of
nonassessment surveys will be administered either through the use of computerized interviews or directly
over the Web. NCES views Web release of its reports as a source of increased efficiency and is
committed to releasing at least 90 percent of its reports on the Web. These efficiency steps will facilitate
easier, quicker, and wider access to NCES products.
When changing to the on-line survey, NCES also modified the questions asked of respondents.
Measure 1.4 of 7: The percentage of customer respondents satisfied or very satisfied with the
timeliness of NCES publications. (Desired direction: increase)
Actual
Year Target Status
(or date expected)
1997 72 Measure not in place
1999 85 77 Made Progress From Prior Year
2001 90 74 Made Progress From Prior Year
2004 90 78 Made Progress From Prior Year
2006 90 85 Made Progress From Prior Year
2007 90 (July 2007) Pending
2008 90 (July 2008) Pending
Source. U.S. Department of Education, National Center for Education Statistics, customer satisfaction
survey.
Frequency of Data Collection. Annual
Data Quality. In 2006, NCES changed its customer service survey data collection to an on-line random
sample survey of NCES customers who visited the NCES Web site. From 1997 through 2004, NCES
administered a biennial pen-and-paper survey with a telephone follow-up to over 3,900 academic
researchers, education associations, education journalists, users of the NCES's National Education Data
Resource Center, and Federal, State, and local policymakers. The pen-and-paper format with telephone
follow-up was becoming increasingly expensive compared with the limited cost of conducting on-line
surveys.
When changing to the on-line survey, NCES also modified the questions asked of respondents. Given
these changes, data collected prior to 2006 are not comparable to those collected in 2006 and future
years.
Data will be validated by using NCES review procedures and by applying NCES statistical standards. The
NCES Monitoring System will yield annual updates on the use and applications of NCES data.
Explanation. NCES expects that each year, all user manuals for NCES public-use data files will be
available on the Web, at least 50 percent of its public-use data files will be available on the Web, and 75
percent of nonassessment surveys will be administered either through the use of computerized interviews
or directly over the Web. NCES views Web release of its reports as a source of increased efficiency and
is committed to releasing at least 90 percent of its reports on the Web. The efficiency steps will facilitate
easier, quicker, and wider access to NCES products.
Measure 1.5 of 7: The percentage of customer respondents satisfied or very satisfied with the
utility of NCES publications. (Desired direction: increase)
Actual
Year Target Status
(or date expected)
1997 86 Measure not in place
1999 85 89 Target Exceeded
2001 90 90 Target Met
2004 90 90 Target Met
2006 90 Not Collected Not Collected
Source. U.S. Department of Education, National Center for Education Statistics, customer satisfaction
survey.
Data Quality. In 2006, NCES changed its customer service survey data collection to an on-line random
sample survey of NCES customers who visited the NCES Web site. From 1997 through 2004, NCES
administered a biennial pen-and-paper survey with a telephone follow-up to over 3,900 academic
researchers, education associations, education journalists, users of the NCES's National Education Data
Resource Center, and Federal, State, and local policymakers. The pen-and-paper format with telephone
follow-up was becoming increasingly expensive compared with the limited cost of conducting on-line
surveys.
When changing to the on-line survey, NCES also modified the questions asked of respondents.
The Department no longer collects information specifically on the utility of NCES publications. Prior-year
data were validated by using NCES review procedures and by applying NCES statistical standards.
Measure 1.6 of 7: The percentage of customer respondents satisfied or very satisfied with the
comprehensiveness of NCES services. (Desired direction: increase)
Actual
Year Target Status
(or date expected)
1999 85 93 Target Exceeded
2001 90 83 Made Progress From Prior Year
2004 90 92 Target Exceeded
2006 90 Not Collected Not Collected
Source. U.S. Department of Education, National Center for Education Statistics, customer satisfaction
survey.
Data Quality. In 2006, NCES changed its customer service survey data collection to an on-line random
sample survey of NCES customers who visited the NCES Web site. From 1997 through 2004, NCES
administered a biennial pen-and-paper survey with a telephone follow-up to over 3,900 academic
researchers, education associations, education journalists, users of the NCES's National Education Data
Resource Center, and Federal, State, and local policymakers. The pen-and-paper format with telephone
follow-up was becoming increasingly expensive compared with the limited cost of conducting on-line
surveys.
When changing to the on-line survey, NCES also modified the questions asked of respondents.
The Department no longer collects information specifically on the comprehensiveness of NCES services.
Prior-year data were validated by using NCES review procedures and by applying NCES statistical
standards.
Measure 1.7 of 7: The percentage of customer respondents satisfied or very satisfied with the
timeliness of NCES services. (Desired direction: increase)
Actual
Year Target Status
(or date expected)
1997 89 Measure not in place
1999 85 93 Target Exceeded
2001 90 88 Made Progress From Prior Year
2004 90 84 Made Progress From Prior Year
2006 90 92.1 Target Exceeded
2007 90 (July 2007) Pending
2008 90 (July 2008) Pending
Source. U.S. Department of Education, National Center for Education Statistics, customer satisfaction
survey.
Frequency of Data Collection. Annual
Data Quality. In 2006, NCES changed its customer service survey data collection to an on-line random
sample survey of NCES customers who visited the NCES Web site. From 1997 through 2004, NCES
administered a biennial pen-and-paper survey with a telephone follow-up to over 3,900 academic
researchers, education associations, education journalists, users of the NCES's National Education Data
Resource Center, and Federal, State, and local policymakers. The pen-and-paper format with telephone
follow-up was becoming increasingly expensive compared with the limited cost of conducting on-line
surveys.
When changing to the on-line survey, NCES also modified the questions asked of respondents. Given
these changes, data collected prior to 2006 are not comparable to those collected in 2006 and future
years.
Data will be validated by using NCES review procedures and by applying NCES statistical standards. The
NCES Monitoring System will yield annual updates on the use and applications of NCES data.
Explanation. NCES expects that each year, all user manuals for NCES public-use data files will be
available on the Web, at least 50 percent of its public-use data files will be available on the Web, and 75
percent of nonassessment surveys will be administered either through the use of computerized interviews
or directly over the Web. NCES views Web release of its reports as a source of increased efficiency and
is committed to releasing at least 90 percent of its reports on the Web. The efficiency steps will facilitate
easier, quicker, and wider access to NCES products.
Program Goal: To conduct high-quality research and related activities that lead
to high-quality products.
Objective 1 of 4: Advance knowledge through capacity building: Increase capacity to conduct
and use high-quality and relevant disability and rehabilitation research and
related activities designed to guide decisionmaking, change practice, and
improve the lives of individuals with disabilities.
Measure 1.1 of 4: The percentage of National Institute on Disability and Rehabilitation
Research (NIDRR) grantees that are conducting at least one multisite, collaborative controlled
trial. (Desired direction: increase)
Actual
Year Target Status
(or date expected)
2005 Set a Baseline Not Collected Not Collected
2006 Set a Baseline (December 2006) Pending
2007 BL+2PP (December 2007) Pending
2015 10 Pending
Source. U.S. Department of Education, National Institute on Disability and Rehabilitation Research,
annual grantee performance report.
Frequency of Data Collection. Other
Data Quality. This measure applies only to RERCs, RRTCs, Model Systems grants, and DRRPs.
Explanation. This is an output-oriented capacity-building measure.
Source. U.S. Department of Education, National Institute on Disability and Rehabilitation Research,
expert panel review.
Frequency of Data Collection. Annual
Explanation. This FY 2006 measure is a revision of a similar prior measure, and its prior-year data have
been preserved. The FY 2006 target is to establish a baseline at the completion of the first three-year
cycle of assessments, in which a judgmentally selected sample of grantee-nominated "discoveries" will be
reviewed. Approximately one-third of NIDRR's grants will be reviewed annually as part of the new portfolio
assessment process. This is an outcome-oriented research and development measure. The FY 2015
target is the baseline plus at least 20 percent.
Measure 2.3 of 4: The average number of publications per award based on National Institute on
Disability and Rehabilitation Research (NIDRR)-funded research and development activities in
refereed journals. (Desired direction: increase)
Actual
Year Target Status
(or date expected)
2002 2.91 Measure not in place
2003 8 3.38 Made Progress From Prior Year
Measure 2.4 of 4: The percentage of new National Institute on Disability and Rehabilitation
Research grants that assess the effectiveness of interventions, programs, and devices using
rigorous and appropriate methods. (Desired direction: increase)
Actual
Year Target Status
(or date expected)
2002 65 Measure not in place
2003 59 Measure not in place
2004 59 Measure not in place
2005 Set a Baseline 49 Target Met
2006 65 (April 2007) Pending
2007 65 (April 2008) Pending
Source. U.S. Department of Education, Grants and Payment System (GAPS); grant applications.
Frequency of Data Collection. Annual
Explanation. Beginning in FY 2006, preliminary data reported for 2002-2005, based on staff reviews of
grant abstracts, will be updated with external expert assessments to ensure that "effectiveness studies" use
"rigorous and appropriate methods." For FY 2005, 77 newly funded grants contained at least 1
"effectiveness study." This is an output-oriented research and development measure.
Program Goal: To support adult education systems that result in increased adult
learner achievement in order to prepare adults for family, work,
citizenship, and future learning.
Objective 1 of 1: Provide adult learners with opportunities to acquire basic foundation skills
(including English language acquisition), complete secondary education, and
transition to further education and training and to work.
Measure 1.1 of 5: The percentage of adults enrolled in English literacy programs who acquire
the level of English language skills needed to complete the levels of instruction in which they
enrolled. (Desired direction: increase)
Actual
Year Target Status
(or date expected)
1997 28 Measure not in place
1998 28 Measure not in place
Program monitoring and data review and analysis are conducted by Department staff through the Data
Quality Certification Process. Data are verified by electronic checks and expert staff analysis, and by
requiring confirmation and attestation of data by state directors. State data are also checked
independently by staff from the Department's Office of Vocational and Adult Education during onsite
monitoring and state audit reviews. Total data quality and full systems development are dependent on
investments of staff and resources by states to adopt and adapt the models developed and promoted by
the Department's Office of Vocational and Adult Education. The Department supports states' data quality
efforts by providing technical assistance and training.
Explanation. As of 2000, data reflect the percentage of English literacy learners (adults with minimal
English language skills) who demonstrated a level of English language proficiency needed to advance to
the next educational functioning level. Educational functioning levels range from beginning-level English
literacy through advanced-level English literacy. The target is difficult to meet because of the large number
of participants who are not literate in their native language and the large number of participants who stay
in the program only long enough to acquire the language skills needed to enter the workforce.
Measure 1.2 of 5: The percentage of adults in adult basic education programs who acquire the
level of basic skills needed to complete the level of instruction in which they enrolled. (Desired
direction: increase)
Actual
Year Target Status
(or date expected)
1997 40 Measure not in place
1998 31 Measure not in place
1999 44 Measure not in place
2000 40 26 Did Not Meet Target
2001 45 36 Made Progress From Prior Year
2002 40 37 Made Progress From Prior Year
2003 41 38 Made Progress From Prior Year
2004 42 38 Did Not Meet Target
2005 42 40 Made Progress From Prior Year
2006 39 (December 2006) Pending
Program monitoring and data review and analysis are conducted by Department staff through the Data
Quality Certification Process. Data are verified by electronic checks and expert staff analysis, and by
requiring confirmation and attestation of data by state directors. State data are also checked
independently by staff from the Department's Office of Vocational and Adult Education during onsite
monitoring and state audit reviews. Total data quality and full systems development are dependent on
investments of staff and resources by states to adopt and adapt the models developed and promoted by
the Department's Office of Vocational and Adult Education. The Department supports states' data quality
efforts by providing technical assistance and training.
Explanation. The program did not meet its FY 2005 target. Although the targets were not met,
performance continues to improve as states and local programs increase the hours of instruction students
receive, which, in turn, increases the number of students being assessed for educational gain and the
number of students demonstrating significant learning gains.
As of 2000, data reflect the percentage of adult education learners (adults with limited basic skills) who
demonstrated a level of basic skill proficiency needed to advance to the next educational functioning
level. Educational functioning levels range from beginning literacy through high school.
Measure 1.3 of 5: The percentage of adults with a high school completion goal who earn a high
school diploma or recognized equivalent. (Desired direction: increase)
Actual
Year Target Status
(or date expected)
1997 37 Measure not in place
1998 33 Measure not in place
1999 34 Measure not in place
2000 40 34 Did Not Meet Target
2001 45 33 Did Not Meet Target
2002 40 42 Target Exceeded
2003 41 44 Target Exceeded
2004 42 45 Target Exceeded
2005 46 51 Target Exceeded
2006 46 (December 2006) Pending
2007 52 (December 2007) Pending
2008 53 (December 2008) Pending
2009 54 (December 2009) Pending
Source. U.S. Department of Education, Office of Vocational and Adult Education, Adult Education Annual
Program Performance Report, grantee submissions.
Frequency of Data Collection. Annual
Data Quality.
FY 2006 Program Performance Report
3 11/14/2006
U.S. Department of Education
Program monitoring and data review and analysis are conducted by Department staff through the Data
Quality Certification Process. Data are verified by electronic checks and expert staff analysis, and by
requiring confirmation and attestation of data by state directors. State data are checked independently by
staff from the Department's Office of Vocational and Adult Education during onsite monitoring and state
audit reviews. Total data quality and full systems development are dependent on investments of staff and
resources by states to adopt and adapt the models developed and promoted by the Department's Office
of Vocational and Adult Education. The Department supports states' data quality efforts by providing
technical assistance and training.
Explanation. The FY 2005 target was exceeded. Part of the increase is likely attributable to improved
methods used by the states to collect and report data on this measure.
As of 2000, the performance data reflect the percentage of adult learners with a goal to complete high
school in secondary level programs of instruction who, upon exit, had earned their high school diploma or
GED credential within the reporting period.
Measure 1.4 of 5: The percentage of adults with a goal to enter postsecondary education or
training who enroll in a postsecondary education or training program. (Desired direction:
increase)
Actual
Year Target Status
(or date expected)
2001 25 Measure not in place
2002 25 30 Target Exceeded
2003 26 30 Target Exceeded
2004 27 30 Target Exceeded
2005 30 34 Target Exceeded
2006 33 (December 2006) Pending
2007 37 (December 2007) Pending
2008 39 (December 2008) Pending
2009 41 (December 2009) Pending
Source. U.S. Department of Education, Office of Vocational and Adult Education, Adult Education Annual
Program Performance Report, grantee submissions.
Frequency of Data Collection. Annual
Data Quality.
Program monitoring and data review and analysis are conducted by Department staff through the Data
Quality Certification Process. Data are verified by electronic checks and expert staff analysis, and by
requiring confirmation and attestation of data by state directors. State data are also checked
independently by staff from the Department's Office of Vocational and Adult Education during onsite
monitoring and state audit reviews. Total data quality and full systems development are dependent on
investments of staff and resources by states to adopt and adapt the models developed and promoted by
the Department's Office of Vocational and Adult Education. The Department supports states' data quality
efforts by providing technical assistance and training.
Explanation. The FY 2005 target was exceeded. Exceeding the performance target for 2005 resulted
from improved follow-up methodologies implemented by the states and from training and technical
assistance provided by the Office of Vocational and Adult Education on transitioning adult students into
postsecondary education and training opportunities.
Measure 1.5 of 5: The percentage of adults with an employment goal who obtain a job by the
end of the first quarter after their program exit quarter. (Desired direction: increase)
Actual
Year Target Status
(or date expected)
2001 36 Measure not in place
2002 36 39 Target Exceeded
2003 37 37 Target Met
2004 38 36 Did Not Meet Target
2005 40 37 Made Progress From Prior Year
2006 40 (December 2006) Pending
2007 41 (December 2007) Pending
2008 41 (December 2008) Pending
2009 42 (December 2009) Pending
Source. U.S. Department of Education, Office of Vocational and Adult Education, Adult Education Annual
Program Performance Report, grantee submissions.
Frequency of Data Collection. Annual
Data Quality.
Program monitoring and data review and analysis are conducted by Department staff through the Data
Quality Certification Process. Data are verified by electronic checks and expert staff analysis, and by
requiring confirmation and attestation of data by state directors. State data are also checked
independently by staff from the Department's Office of Vocational and Adult Education during onsite
monitoring and state audit reviews. Total data quality and full systems development are dependent on
investments of staff and resources by states to adopt and adapt the models developed and promoted by
the Office of Vocational and Adult Education. The Department supports states' data quality efforts by
providing technical assistance and training.
Explanation. The program did not meet its FY 2005 target, although progress was made from the
previous year. States and local programs continue to work to identify follow-up methodologies that are
both reliable and valid. Approximately one-half of the states collect employment status through a
follow-up survey, which continues to yield sporadic response rates that affect both the quantity and
quality of the data collected.
As of 2001, performance data reflect the percentage of adult learners with an employment goal who, upon
exit from an adult education program, obtain a job.
Source. U.S. Department of Education, National Institute for Literacy, technical assistance participant
evaluations.
Frequency of Data Collection. Other
Data Quality. Not everyone who receives technical assistance will complete an evaluation.
Explanation. LINCS and Bridges training/technical assistance activities will be assessed by participants.
Measure 1.2 of 2: The percentage of individuals who receive National Institute for Literacy
technical assistance who can demonstrate that they implemented instructional practices
grounded in scientifically based research within six months of receiving the technical assistance.
(Desired direction: increase)
Actual
Year Target Status
(or date expected)
2006 Set a Baseline Pending
2007 BL+1% Pending
Source. U.S. Department of Education, National Institute for Literacy, technical assistance participant
evaluations.
Frequency of Data Collection. Other
Data Quality. Not everyone who receives technical assistance will complete an evaluation.
Explanation. LINCS and Bridges training/technical assistance activities will be assessed by participants.
Source. U.S. Department of Education, National Institute for Literacy, Panel of experts.
Frequency of Data Collection. Annual
Explanation. The 5 most-requested products available on the National Institute for Literacy Web site will
be evaluated.
Program Goal: To increase availability of, funding for, access to, and provision
of assistive technology (AT) devices and assistive technology
services.
Objective 1 of 1: Reduce barriers associated with the cost of assistive technology devices and
services for individuals with disabilities.
Measure 1.1 of 1: The cumulative amount loaned per $1 million cumulative federal Alternative
Funding Program investment. (Desired direction: increase)
Actual
Year Target Status
(or date expected)
2001 0.61 Measure not in place
2002 0.45 Measure not in place
2003 0.89 Measure not in place
2004 0.5 Measure not in place
2005 0.676 Measure not in place
2006 0.75 (December 2006) Pending
2007 0.75 (December 2007) Pending
2008 0.8 (December 2008) Pending
Source.
(1) Database that provides case records on individual AFP borrowers and (2) Annual survey of grantees.
Note: AFP grants are provided for 1 year only but data collection responsibilities continue in perpetuity.
Therefore, grantees do not file "annual reports" but do provide cumulative data into a database on an
ongoing basis.
Several AFPs were still not fully operational during FY 2005, meaning that they either were not providing
loans or began providing loans late in the reporting period. The loan volume will likely increase as all
programs become operational.
A data collection system into which states can provide their annual reports was not approved by OMB
during FY 2006. Therefore, FY 2006 data could not be collected and could not be used to establish a
baseline.
RSA anticipates receiving approval on the data collection system by the end of FY 2006. States will enter
data for October 1, 2006 through September 30, 2007 into the system. This data will be used to establish
the baseline in FY 2007 rather than FY 2006.
Measure 1.2 of 3: The percentage of targeted individuals and entities who obtained an assistive
technology device or service for employment purposes through state financing activities or
reutilization programs, who would not have obtained the device or service. (Desired direction:
increase)
Actual
Year Target Status
(or date expected)
2006 Set a Baseline Not Collected Not Collected
2007 Set a Baseline (December 2007) Pending
2008 Maintain a Baseline (December 2008) Pending
Source.
A data collection system into which states can provide their annual reports was not approved by OMB
during FY 2006. Therefore, FY 2006 data could not be collected and could not be used to establish a
baseline.
RSA anticipates receiving approval on the data collection system by the end of FY 2006. States will enter
data for October 1, 2006 through September 30, 2007 into the system. This data will be used to establish
the baseline in FY 2007 rather than FY 2006.
Measure 1.3 of 3: The percentage of targeted individuals and entities who obtained assistive
technology device or service for community living through state financing activities or
reutilization programs who would not have obtained the device or service. (Desired direction:
increase)
Actual
Year Target Status
(or date expected)
2006 Set a Baseline Not Collected Not Collected
2007 Set a Baseline (December 2007) Pending
2008 Maintain a Baseline (December 2008) Pending
Source.
A data collection system into which states can provide their annual reports was not approved by OMB
during FY 2006. Therefore, FY 2006 data could not be collected and could not be used to establish a
baseline.
RSA anticipates receiving approval on the data collection system by the end of FY 2006. States will enter
data for October 1, 2006 through September 30, 2007 into the system. This data will be used to establish
the baseline in FY 2007 rather than FY 2006.
Measure 2.2 of 4: The percentage of targeted individuals and entities who accessed assistive
technology device demonstrations and/or device loan programs, and made a decision about the
assistive technology device or services for community living purposes, as a result of the
assistance they received from the Assistive Technology Program. (Desired direction: increase)
Actual
Year Target Status
(or date expected)
2006 Set a Baseline Not Collected Not Collected
2007 Set a Baseline (December 2007) Pending
2008 Maintain a Baseline (December 2008) Pending
Source.
A data collection system into which states can provide their annual reports was not approved by OMB
during FY 2006. Therefore, FY 2006 data could not be collected and could not be used to establish a
baseline.
RSA anticipates receiving approval on the data collection system by the end of FY 2006. States will enter
data for October 1, 2006 through September 30, 2007 into the system. This data will be used to establish
the baseline in FY 2007 rather than FY 2006.
Measure 2.3 of 4: The percentage of targeted individuals and entities who accessed assistive
technology device demonstrations and/or loan programs, and made a decision about an
assistive technology device or service for educational purposes as a result of the assistance
they received from the Assistive Technology Program. (Desired direction: increase)
Actual
Year Target Status
(or date expected)
A data collection system into which states can provide their annual reports was not approved by OMB
during FY 2006. Therefore, FY 2006 data could not be collected and could not be used to establish a
baseline.
RSA anticipates receiving approval on the data collection system by the end of FY 2006. States will enter
data for October 1, 2006 through September 30, 2007 into the system. This data will be used to establish
the baseline in FY 2007 rather than FY 2006.
Measure 2.4 of 4: The percentage of targeted individuals and entities who accessed assistive
technology device demonstrations and/or device loan programs, and made a decision about the
assistive technology device or services for telecommunications purposes as a result of the
assistance they received from the Assistive Technology Program. (Desired direction: increase)
Actual
Year Target Status
(or date expected)
2006 Set a Baseline Not Collected Not Collected
2007 Set a Baseline (December 2007) Pending
2008 Maintain a Baseline (December 2008) Pending
Source.
A data collection system into which states can provide their annual reports was not approved by OMB
during FY 2006. Therefore, FY 2006 data could not be collected and could not be used to establish a
baseline.
RSA anticipates receiving approval on the data collection system by the end of FY 2006. States will enter
data for October 1, 2006 through September 30, 2007 into the system. This data will be used to establish
the baseline in FY 2007 rather than FY 2006.
Program Goal: To challenge students who are deaf, graduate students who are
deaf, and graduate students who are hearing to achieve their
academic goals and obtain productive employment, provide
leadership in setting the national standard for best practices in
education of the deaf and hard of hearing, and establish a
sustainable resource base.
Objective 1 of 3: The University Programs and the Model Secondary School for the Deaf and
the Kendall Demonstration Elementary School will optimize the number of
students completing programs of study.
Measure 1.1 of 10: The enrollment in Gallaudet University's undergraduate programs.
(Desired direction: increase)
Actual
Year Target Status
(or date expected)
1998 1,339 Measure not in place
1999 1,250 1,300 Target Exceeded
2000 1,250 1,318 Target Exceeded
2001 1,250 1,321 Target Exceeded
2002 1,250 1,243 Did Not Meet Target
2003 1,250 1,243 Did Not Meet Target
2004 1,250 1,236 Did Not Meet Target
2005 1,250 1,207 Did Not Meet Target
2006 1,250 1,274 Target Exceeded
2007 1,250 1,206 Did Not Meet Target
2008 1,250 (October 2007) Pending
Source. Gallaudet University, Collegiate Office of Enrollment Services; Annual Report.
Frequency of Data Collection. Annual
Explanation. Gallaudet has established minimum enrollment targets based on long-standing enrollment
targets and historical trends, recognizing that actual figures vary from year to year.
Total undergraduate enrollment for Fall 2006 (FY 2007) decreased compared to the prior year.
Measure 1.2 of 10: The enrollment in Gallaudet University's graduate programs. (Desired
direction: increase)
Actual
Year Target Status
(or date expected)
For Fall 2006 (FY 2007), graduate enrollment decreased compared to last year.
In FY 2003, Gallaudet University changed how it calculates students enrolled in graduate courses or
enrolled in professional studies (PST) courses. Previously, Gallaudet had counted enrollment of graduate
and professional studies students based on the delivery system for courses. Because the same student
could be enrolled in courses with different delivery systems during any given semester, at times the same
student would be double-counted.
Under the new counting system for graduate and professional studies enrollment, figures are now based
on three types of students: 1) graduate degree-seeking, 2) graduate special (takes credit bearing course,
but not degree-seeking), and 3) professional studies (earns professional studies credit but cannot apply
that to a degree). If a degree-seeking student or a graduate special student is also enrolled in a
professional studies course, that student will be counted only once -- as degree-seeking or graduate
special respectively.
The new counting method caused a significant drop in both graduate and professional studies enrollment
numbers from FY 2003 to FY 2004.
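The counting rule described above can be sketched as a small function. This is an illustrative sketch only; the field names and sample roster are hypothetical, not Gallaudet's actual data model. The key point is that each student is counted exactly once, under the highest-precedence category that applies: degree-seeking, then graduate special, then professional studies.

```python
# Illustrative sketch of the single-count enrollment rule (hypothetical fields).
from collections import Counter

def count_enrollment(students):
    """Count each student once, by category precedence:
    degree-seeking > graduate special > professional studies."""
    counts = Counter()
    for s in students:
        if s.get("degree_seeking"):
            counts["graduate degree-seeking"] += 1
        elif s.get("graduate_special"):
            counts["graduate special"] += 1
        elif s.get("professional_studies"):
            counts["professional studies"] += 1
    return counts

# A degree-seeking student also taking a PST course is counted once,
# as degree-seeking -- never double-counted.
roster = [
    {"degree_seeking": True, "professional_studies": True},
    {"graduate_special": True, "professional_studies": True},
    {"professional_studies": True},
]
print(count_enrollment(roster))  # three students, three single counts
```

Under the old delivery-system-based counting, the first two students would each have been counted twice, which is why the switch produced the drop in reported enrollment.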
Measure 1.3 of 10: The enrollment in Gallaudet University's professional studies programs.
(Desired direction: increase)
Actual
Year Target Status
(or date expected)
1998 92 Measure not in place
1999 70 70 Target Met
2000 70 86 Target Exceeded
2001 70 93 Target Exceeded
2002 70 92 Target Exceeded
2003 70 154 Target Exceeded
2004 70 70 Target Met
2005 70 176 Target Exceeded
2006 175 173 Did Not Meet Target
2007 175 187 Target Exceeded
2008 175 (October 2007) Pending
Source. Gallaudet University, Collegiate Office of Enrollment Services; Annual Report.
Frequency of Data Collection. Annual
Explanation. Gallaudet has established minimum enrollment targets based on long-standing enrollment
targets and historical trends, recognizing that actual figures vary from year to year. A degree-seeking
student who is dually enrolled in a Professional Studies course is counted under only one of the categories.
The number of professional studies (PST) reported (187) reflects the Fall 2006 enrollment (FY 2007) only
as of September 12, 2006. In Spring 2006, 264 PST students were enrolled and in Summer 2006, 409
PST students were enrolled.
Measure 1.4 of 10: The enrollment in the Model Secondary School for the Deaf established by
Gallaudet University. (Desired direction: increase)
Actual
Year Target Status
(or date expected)
1998 224 Measure not in place
1999 225 209 Did Not Meet Target
2000 225 219 Made Progress From Prior Year
2001 225 205 Did Not Meet Target
2002 225 188 Did Not Meet Target
2003 225 190 Made Progress From Prior Year
2004 225 186 Did Not Meet Target
2005 225 182 Did Not Meet Target
2006 225 226 Target Exceeded
2007 225 221 Did Not Meet Target
2008 225 (October 2007) Pending
Source. Gallaudet University, Clerc Center student database; Annual Report.
Frequency of Data Collection. Annual
Explanation.
For FY 2007, KDES and MSSD enrollment numbers are below their targets.
To address enrollment goals and to maintain enrollment figures for MSSD and KDES, for FY 2007 and
beyond, the Clerc Center expanded and enhanced the work of the enrollment committee by:
1) Offering and promoting the Honors Program at both Demonstration Schools;
2) Redesigning the Clerc Center Web site;
3) Redesigning the KDES administrative structure to better serve students and families;
4) Developing a DVD, with a description of the programs offered at MSSD, to be sent with the application
packet for prospective students; and
5) Holding open houses at MSSD for prospective students to see the program first hand.
The Clerc Center will also review the Literacy and Emotional Intelligence Action Plans, developed for the
Accreditation for Growth, to ensure appropriate monitoring of student and institutional progress.
Measure 1.5 of 10: The enrollment in the Kendall Demonstration Elementary School
Measure 1.6 of 10: The Gallaudet University undergraduate persistence rate. (Desired
direction: increase)
Actual
Year Target Status
(or date expected)
1998 72 Measure not in place
1999 75 73 Made Progress From Prior Year
2000 76 72 Did Not Meet Target
2001 76 71 Did Not Meet Target
2002 76 73 Made Progress From Prior Year
2003 79 71 Did Not Meet Target
2004 79 73 Made Progress From Prior Year
2005 79 76 Made Progress From Prior Year
2006 79 72 Did Not Meet Target
2007 79 (October 2007) Pending
2008 80 (October 2008) Pending
2009 80 (October 2009) Pending
2010 80 (October 2010) Pending
2011 80 (October 2011) Pending
2012 80 (October 2012) Pending
Source. Gallaudet University, Collegiate Office of the Register, records.
Frequency of Data Collection. Annual
Explanation. For FY 2006 this measure changed from retention rates to persistence rates. This measure
was designated as a long-term measure.
To date, the calculation for undergraduate student persistence rate has been defined as year-to-year
persistence of students, with the exception of graduating seniors.
Measure 1.7 of 10: The Gallaudet University graduate student persistence rate. (Desired
direction: increase)
Actual
Year Target Status
(or date expected)
2000 78 Measure not in place
2001 82 Measure not in place
2002 98 Measure not in place
2003 86 Measure not in place
2004 86 89 Target Exceeded
2005 86 93 Target Exceeded
2006 86 82 Did Not Meet Target
2007 86 (October 2007) Pending
2008 88 (October 2008) Pending
2009 88 (October 2009) Pending
2010 88 (October 2010) Pending
2011 89 (October 2011) Pending
2012 89 (October 2012) Pending
Source. Gallaudet University, Collegiate Office of the Register, records.
Frequency of Data Collection. Annual
Explanation. For FY 2006 this measure changed from retention rates to persistence rates. This measure
was designated as a long-term measure.
Graduate student persistence rates are calculated as the ratio of the number of returning graduate
students in a particular fall to the number of graduate students "available to return." (The former number
is calculated as the difference between the total Fall degree-seeking enrollment and the number of new
degree-seeking matriculations. The latter number is calculated by subtracting the number of
graduate degrees granted the previous year from total degree-seeking enrollment from the previous year.)
The overall persistence rate for graduate students decreased by 11 percentage points. Gallaudet
University is committed to examining the reasons for this decrease and to taking steps to increase the
persistence rate next year.
Measure 1.8 of 10: The graduation rate of Gallaudet University undergraduates. (Desired
direction: increase)
Actual
Year Target Status
(or date expected)
1998 41 Measure not in place
1999 41 42 Target Exceeded
2000 42 41 Did Not Meet Target
2001 43 41 Did Not Meet Target
2002 44 42 Made Progress From Prior Year
Gallaudet University will submit a proposal in October 2006 on how this indicator may be revised with
corresponding new targets.
Measure 1.9 of 10: The graduation rate of Gallaudet University graduate students. (Desired
direction: increase)
Actual
Year Target Status
(or date expected)
2000 82 Measure not in place
2001 82 Measure not in place
2002 82 Measure not in place
2003 82 Measure not in place
2004 82 84 Target Exceeded
2005 83 86 Target Exceeded
2006 83 91 Target Exceeded
2007 83 (October 2007) Pending
2008 83 (October 2008) Pending
2009 83 (October 2009) Pending
2010 83 (October 2010) Pending
2011 84 (October 2011) Pending
2012 84 (October 2012) Pending
Source. Gallaudet University, Collegiate Office of the Register, records.
Frequency of Data Collection. Annual
Explanation. Gallaudet University reported that it is not able to accurately calculate graduation rates
for graduate students for FY 2000 - 2007 for the following reasons:
1) Data accessible through PeopleSoft are relatively recent, and many students graduating during this
period matriculated before this database was implemented;
2) Time-to-degree for graduate students varies widely, especially for Ph.D. students; and
3) Graduate programs enroll a mix of full- and part-time students.
Gallaudet estimates a graduation rate of .91 by dividing the total number of degrees awarded during the
2000 - 2007 period (956) by the number of new students who matriculated during this period (1,045).
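The estimate is a simple ratio of the two figures given in the text, and the arithmetic checks out:

```python
# Reproducing the estimate stated above: degrees awarded during 2000-2007
# divided by new matriculations over the same period.
degrees_awarded = 956
new_matriculations = 1045
rate = degrees_awarded / new_matriculations
print(round(rate, 2))  # 0.91
```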
Percentage of Seniors Who Graduate in 4 Years and Those who Graduate in 5 Years
Year 4 Year Seniors 5 Year Seniors
2002 76 14
2003 68 21
2004 58 29
2005 71
2006 70
The senior class for FY 2006 included 71 students. Fifty students graduated by August 31, 2006.
Therefore for FY 2006, MSSD had a 70% graduation rate. Eleven seniors (15%) elected to return in FY
2007 for a second senior year in order to complete graduation requirements and IEP goals.
The Department is working with Gallaudet on developing an alternative to this measure that would assess
the impact of scientifically based research projects, other scholarly activities, and demonstration and
program development activities on improving educational outcomes for students who are deaf and hard
of hearing.
Objective 3 of 3: Curriculum and extracurricular activities prepare students to meet the skill
requirements of the workplace or to continue their studies.
Measure 3.1 of 4: The percentage of Gallaudet University Bachelor graduates who are
employed during their first year after graduation. (Desired direction: increase)
Actual
Year Target Status
(or date expected)
2001 90 Measure not in place
2002 89 Measure not in place
2003 79 Measure not in place
2004 80 73 Did Not Meet Target
2005 81 69 Did Not Meet Target
2006 82 84 Target Exceeded
2007 82 (October 2007) Pending
2008 82 (October 2008) Pending
Source. Gallaudet University, study on the status of graduates' employment and advanced studies.
Frequency of Data Collection. Annual
Data Quality. These data come from an annual survey sent to students who graduated from
Gallaudet University within the previous year, inquiring about their employment and advanced education
or training status. Each year, about 30 to 35% of the graduates respond to the survey (N = approximately
50 students).
Explanation. This indicator was changed in FY 2003 to report separately the percentage of students
employed and the percentage of students who received advanced education or training, during their first
year after graduation. The percentages total more than 100%, as some respondents were employed while
enrolled in a program of advanced education or training within the same year.
Measure 3.2 of 4: The percentage of Gallaudet University Bachelor graduates who are in
advanced education or training during their first year after graduation. (Desired direction:
increase)
Actual
Year Target Status
(or date expected)
2001 50 38 Made Progress From Prior Year
2002 49 Measure not in place
2003 40 Measure not in place
2004 40 38 Did Not Meet Target
2005 41 36 Did Not Meet Target
2006 41 36 Did Not Meet Target
2007 42 (October 2007) Pending
2008 42 (October 2008) Pending
Source. Gallaudet University, study on the status of graduates' employment and advanced studies.
Frequency of Data Collection. Annual
Data Quality. These data come from an annual survey sent to students who graduated from
Gallaudet University within the previous year, inquiring about their employment and advanced education
or training status. Each year, about 30 to 35% of the graduates respond to the survey (N = approximately
50 students).
Explanation. This indicator was changed in FY 2003 to report separately the percentage of students
employed and the percentage of students who received advanced education or training during their first
year after graduation. The percentages total more than 100%, as some respondents were employed while
enrolled in a program of advanced education or training within the same year.
Advanced education or training includes students enrolled in a master's or Ph.D. program, a vocational or
technical program, or another type of program (e.g., law school or medical school).
Students receiving baccalaureate degrees from Gallaudet University are more likely than students from
other public universities and colleges throughout the country to enter advanced education or training
programs. More than 36% of deaf or hard of hearing individuals with baccalaureate degrees from
Gallaudet University, each year, enter higher education or training, as compared to the national average
of 25% of individuals without hearing loss entering higher education or training. Gallaudet University will
propose revised targets for FY 2007 and beyond for this indicator.
Measure 3.3 of 4: The percentage of Gallaudet University Bachelor graduates who are not
employed nor in advanced education or training during their first year after graduation. (Desired
direction: decrease)
Year Target Actual (or date expected) Status
2006 Set a Baseline 11 Target Met
2007 Maintain a Baseline (October 2007) Pending
2008 Maintain a Baseline (October 2008) Pending
Source. Gallaudet University, study on the status of graduates' employment and advanced studies.
Measure 3.4 of 4: The percentage of Model Secondary School graduates who are in jobs or
postsecondary programs four months after graduation. (Desired direction: increase)
Year Target Actual (or date expected) Status
2000 74 Measure not in place
2001 72 Measure not in place
2002 80 90 Target Exceeded
2003 80 82 Target Exceeded
2004 80 83 Target Exceeded
2005 81 83 Target Exceeded
2006 83 77 Did Not Meet Target
Source. Gallaudet University, Clerc Center Office of Exemplary Programs and Research, survey of
graduates' status.
Frequency of Data Collection. Annual
Data Quality. The FY 2006 data are based on the 43 of 50 MSSD seniors who graduated in Spring 2006
and who responded to the four-month follow-up survey.
Target Context. Gallaudet University states that the target for FY 2006 is 81%, not the 83% listed in the
table above. On that basis, with 77% of MSSD graduates enrolled in postsecondary programs, working, or
engaged in vocational training, Gallaudet University missed the target by 4 percentage points.
Explanation. In FY 2007 this indicator will be disaggregated into three categories of students: 1) those who
are employed, 2) those who are in postsecondary education or training, and 3) those who are not
engaged in either activity.
In addition, in FY 2008, the three indicators will be changed to the percentages of MSSD graduates who
are employed, who are in postsecondary education or training, and who are not engaged in either activity
during their first year after graduation.
The NTID sub-baccalaureate programs are experiencing increased competition from the growth of
services for deaf and hard of hearing students at community colleges throughout the country. At the
same time, the number of deaf and hard of hearing students in other baccalaureate programs at RIT
continues to grow.
Measure 1.2 of 3: The number of students enrolled at National Technical Institute for the Deaf's
educational interpreters program. (Desired direction: increase)
Year Target Actual (or date expected) Status
1997 72 Measure not in place
1998 84 Measure not in place
1999 100 93 Made Progress From Prior Year
2000 100 77 Did Not Meet Target
2001 100 75 Did Not Meet Target
2002 100 53 Did Not Meet Target
2003 100 65 Made Progress From Prior Year
2004 100 92 Made Progress From Prior Year
2005 100 100 Target Met
2006 100 116 Target Exceeded
2007 100 130 Target Exceeded
FY 2006 Program Performance Report
24 11/14/2006
U.S. Department of Education
2008 100 (October 2008) Pending
Source. National Technical Institute for the Deaf, registrar office records.
Frequency of Data Collection. Annual
Explanation. While NTID exceeded the target for the Educational Interpreter program, the Department of
Education is working with NTID to re-define its enrollment targets for FY 2007 and beyond.
Measure 1.3 of 3: The number of students enrolled in National Technical Institute for the Deaf's
graduate/Master's in Special Education program. (Desired direction: increase)
Year Target Actual (or date expected) Status
1997 32 Measure not in place
1998 36 Measure not in place
1999 50 50 Target Met
2000 50 59 Target Exceeded
2001 50 55 Target Exceeded
2002 75 60 Made Progress From Prior Year
2003 75 73 Made Progress From Prior Year
2004 75 114 Target Exceeded
2005 90 126 Target Exceeded
2006 120 127 Target Exceeded
2007 120 101 Did Not Meet Target
2008 120 (October 2008) Pending
Source. National Technical Institute for the Deaf, registrar office records.
Frequency of Data Collection. Annual
Explanation. The Department of Education is working with NTID to re-define its enrollment targets for FY
2007 and beyond.
The graduation rate calculated by NTID is based on an average of three years of cohorts. The FY 2006
graduation rate for sub-baccalaureate students is based on students entering NTID during the years 2001,
2002, and 2003.
The Department of Education has agreed with NTID to re-define its graduation targets for FY 2007 and
beyond, for both sub-baccalaureate and baccalaureate students.
The FY 2007 graduation rate target for students in sub-baccalaureate programs has been revised from
53% to 50%. The target for FY 2008 and subsequent years is revised from 54% to 51%.
The graduation rate for students in sub-baccalaureate programs at NTID is significantly higher than at
other comparable two-year institutions; that is, all two-year institutions have an average graduation rate of
33%, two-year public colleges have a graduation rate of 24.1%, and two-year private colleges have a
graduation rate of 55.9%.
Measure 2.2 of 4: The National Technical Institute for the Deaf baccalaureate graduation rate.
(Desired direction: increase)
Year Target Actual (or date expected) Status
1997 51 Measure not in place
1998 57 Measure not in place
1999 61 Measure not in place
2000 61 63 Target Exceeded
2001 61 64 Target Exceeded
2002 61 66 Target Exceeded
2003 61 68 Target Exceeded
2004 69 68 Did Not Meet Target
2005 69 69 Target Met
2006 70 70 Target Met
2007 70 (October 2007) Pending
2008 71 (October 2008) Pending
2009 71 (October 2009) Pending
2010 71 (October 2010) Pending
2011 71 (October 2011) Pending
2012 71 (October 2012) Pending
Source. National Technical Institute for the Deaf, Registrar Office records.
The graduation rate calculated by NTID is based on an average of three years of cohorts. The FY 2006
graduation rate for baccalaureate students is based on students entering NTID during the years 1997,
1998, and 1999.
The Department of Education has agreed with NTID to re-define its graduation targets for FY 2007 and
beyond, for both sub-baccalaureate and baccalaureate students.
The FY 2007 graduation rate target for students in baccalaureate programs has been revised from 71% to
70%. The target for FY 2008 and subsequent years is revised from 72% to 71%.
The graduation rate for students in baccalaureate programs at NTID is significantly higher than at other
comparable four-year institutions; that is, all four-year institutions have an average graduation rate of
55%, four-year public colleges have a graduation rate of 51.9%, and four-year private colleges have a
graduation rate of 63.3%.
Measure 2.3 of 4: The retention percentage of first-year National Technical Institute for the Deaf
sub-baccalaureates. (Desired direction: increase)
Year Target Actual (or date expected) Status
1997 85 Measure not in place
1998 73 Measure not in place
1999 69 Measure not in place
2000 73 69 Did Not Meet Target
2001 74 68 Did Not Meet Target
2002 74 72 Made Progress From Prior Year
2003 74 70 Did Not Meet Target
2004 74 70 Did Not Meet Target
2005 74 70 Did Not Meet Target
2006 74 70 Did Not Meet Target
2007 72 (October 2007) Pending
2008 72 (October 2008) Pending
2009 72 (October 2009) Pending
2010 72 (October 2010) Pending
2011 72 (October 2011) Pending
2012 72 (October 2012) Pending
Source. National Technical Institute for the Deaf, Registrar Office records.
Frequency of Data Collection. Annual
Explanation. This is a long-term measure.
The retention rate is an average of three years of cohorts moving from their first year into their second
year. The FY 2006 report includes entering students from 2003, 2004, and 2005.
The Department of Education has agreed with NTID to re-define its retention targets for FY 2007 and
beyond, for both sub-baccalaureate and baccalaureate students.
Recent comparisons with two-year public and private colleges indicate that NTID's retention rate is
significantly higher; that is, two-year public colleges have an average retention rate of 52.5%, and two-
year private colleges have a retention rate of 60.1%.
Measure 2.4 of 4: The retention percentage of first-year National Technical Institute for the Deaf
baccalaureates. (Desired direction: increase)
Year Target Actual (or date expected) Status
1997 84 Measure not in place
1998 81 Measure not in place
1999 84 Measure not in place
2000 84 85 Target Exceeded
2001 84 86 Target Exceeded
2002 84 87 Target Exceeded
2003 84 86 Target Exceeded
2004 84 86 Target Exceeded
2005 86 85 Did Not Meet Target
2006 86 86 Target Met
2007 86 (October 2007) Pending
2008 86 (October 2008) Pending
2009 87 (October 2009) Pending
2010 87 (October 2010) Pending
2011 87 (October 2011) Pending
2012 87 (October 2012) Pending
Source. National Technical Institute for the Deaf, Registrar Office records.
Frequency of Data Collection. Annual
Explanation. This is a long-term measure.
The retention rate is an average of three years of cohorts moving from their first year into their second
year. The FY 2006 report includes entering students from 2003, 2004, and 2005.
The Department of Education has agreed with NTID to re-define its retention targets for FY 2007 and
beyond, for both sub-baccalaureate and baccalaureate students.
For FY 2008, the target for the student retention rate in the baccalaureate programs is being maintained
at 86%. The target is being increased by one percentage point, to 87%, in FY 2009 rather than in FY 2008.
Recent comparisons with four-year public and private colleges indicate that NTID's retention rate is
significantly higher; that is, four-year public colleges have an average retention rate of 69.9%, and four-
year private colleges have a retention rate of 70.6%.
For FY 2006 and for previous years, NTID reported the following:
In FY 2005, 198 students graduated from NTID. Valid data, collected for the FY 2006 report, exist for 188
graduates. Of the 188 graduates, 117 (62%) are in the workforce, 111 of whom are actually employed; 62
(33%) are in higher education or training; and 9 (5%) are neither employed nor in higher education or
training.
NTID has been calculating placement rates as the percentage of graduates who are employed among
those who choose to pursue employment. Individuals who continue their education or are not seeking
employment were not previously included.
NTID calculates the FY 2006 placement rate as 95% (111 students actually employed divided by the 117
students in the workforce). NTID calculates the FY 2006 higher education or training rate of those not in
the workforce as 87% (62 students in higher education divided by 188 students minus 117). NTID also
calculates the FY 2006 percentage of those not in the workforce or higher education/training as 5% (nine
students divided by 188 students).
The FY 2006 report is missing an indicator on the placement rate of NTID's graduates. (See measure 3.1
for complete information on NTID's FY 2006 report on their placement rate.)
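The three NTID rates described above are simple ratios; a minimal Python sketch reproduces the arithmetic. The counts come from the report text; the variable names are illustrative only.

```python
# Reproduces NTID's FY 2006 rate arithmetic as described in the report.
# All counts come from the report text; variable names are illustrative.
graduates_with_data = 188  # graduates with valid data (of 198 total)
in_workforce = 117         # graduates who chose to pursue employment
employed = 111             # of those, actually employed
in_education = 62          # graduates in higher education or training
not_engaged = 9            # neither employed nor in education/training

# Placement rate: employed among those pursuing employment.
placement_rate = employed / in_workforce
# Education/training rate among those NOT in the workforce.
education_rate = in_education / (graduates_with_data - in_workforce)
# Share engaged in neither activity, out of all graduates with data.
not_engaged_rate = not_engaged / graduates_with_data

print(f"{placement_rate:.0%}")    # 95%
print(f"{education_rate:.0%}")    # 87%
print(f"{not_engaged_rate:.0%}")  # 5%
```

Note that the denominators differ by rate, which is why the three percentages are not computed against a single base.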
This program has a long-term target of 813,326 for FY 2009. We will continue to report data annually, but
there are no intermediate annual targets. The target is derived by applying the difference between
regression-based predicted values from Title IV institutions and actual grantee values for school year
2002-03, which was 5.1 percent. Therefore, the HSI program actual enrollment of 773,859 in FY 2003
was multiplied by 1.051 to generate the long-term target of 813,326. Data for FY 2006 will be available in
December 2006.
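The target derivation above (a 5.1 percent regression gap applied to FY 2003 enrollment) is a one-step calculation; this is a sketch only, with variable names of our choosing.

```python
# Long-term target derivation as described above; names are illustrative.
regression_gap = 0.051       # predicted-minus-actual gap, school year 2002-03
fy2003_enrollment = 773_859  # HSI program actual enrollment, FY 2003

long_term_target = round(fy2003_enrollment * (1 + regression_gap))
print(long_term_target)  # 813326
```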
Data for FY 2005 will be available in December 2006 and will be used to set the baseline. There will be no
data for FY 2006, because enrollment data by field of study is provided only biennially in IPEDS. The
target for FY 2007 will be to maintain the baseline. There will, again, be no data for FY 2008.
Objective 2 of 3: To increase the graduation rate for students in the fields of engineering, or
physical or biological sciences, at minority-serving institutions.
Measure 2.1 of 2: The percentage of minority students enrolled at four-year minority-serving
institutions in the fields of engineering or physical or biological sciences who graduate within six
years of enrollment. (Desired direction: increase)
Year Target Actual (or date expected) Status
2006 Maintain a Baseline (December 2007) Pending
2007 Maintain a Baseline (December 2008) Pending
2008 999 (December 2009) Pending
Source. U. S. Department of Education, National Center for Education Statistics, Integrated
Postsecondary Education Data System (IPEDS). Web Site: http://nces.ed.gov/ipedspas.
Frequency of Data Collection. Annual
Data Quality. Data are supplied by institutions, which certify the accuracy of the data.
Explanation.
Data for FY 2005 will be available in December 2006 and will be used as the baseline. The target for FY
2007 and FY 2008 will be to maintain the baseline.
This program has a long-term target of 13,700 for FY 2009. We will continue to report data annually, but
there are no intermediate annual targets. Data for FY 2006 will be available in December 2006.
Objective 2 of 3: Maintain or increase the persistence rate for students at Alaska Native and
Native Hawaiian Serving Institutions.
Measure 2.1 of 1: The percentage of full-time undergraduate students who were in their first
year of postsecondary enrollment in the previous year and are enrolled in the current year at the
same Alaska Native/Native Hawaiian institution. (Desired direction: increase)
Year Target Actual (or date expected) Status
2003 46 Measure not in place
2004 45.5 Measure not in place
2005 61.5 Measure not in place
2006 46 (December 2006) Pending
2007 46 (December 2007) Pending
2008 46 (December 2008) Pending
2009 46 (December 2009) Pending
2010 46 (December 2010) Pending
2011 46 (December 2011) Pending
Source. U. S. Department of Education, National Center for Education Statistics, Integrated
Postsecondary Education Data System (IPEDS). Web Site: http://nces.ed.gov/ipedspas.
Frequency of Data Collection. Annual
Data Quality. Data are supplied by institutions, which certify the accuracy of the data.
Explanation. Persistence data first became available from IPEDS in 2003-04. Institutions report a
persistence rate, not the numerator and denominator. As a result, the persistence rate for the AN/NH
institutions is calculated as a median. The measure for 2003-04 is based on data for only two institutions.
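Because institutions report only a rate, not its numerator and denominator, the program-level figure is the median of the reported rates. A minimal sketch, using illustrative input values:

```python
# Median of institution-reported persistence rates; the input values here
# are illustrative only (the 2003-04 measure rests on just two institutions).
from statistics import median

reported_rates = [46.0, 61.5]
print(median(reported_rates))  # 53.75
```

With only two reporting institutions, the median equals the mean of the two rates, so a single outlying institution can move the program-level figure substantially.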
Objective 3 of 3: Maintain or increase the graduation rate at Alaska Native and Native
Hawaiian Serving Institutions.
Measure 3.1 of 3: Cost per successful outcome: the federal cost per undergraduate degree at
Alaska Native and Native Hawaiian Serving Institutions. (Desired direction: decrease)
Year Target Actual (or date expected) Status
2003 1,940 Measure not in place
2004 2,532 Measure not in place
2007 999 (December 2008) Pending
2008 999 (December 2009) Pending
Source. U. S. Department of Education, National Center for Education Statistics, Integrated
Postsecondary Education Data System (IPEDS). Web Site: http://nces.ed.gov/ipedspas.
Frequency of Data Collection. Annual
Data Quality. Data are supplied by institutions, which certify the accuracy of the data.
Explanation. This measure is calculated as the appropriation for the Strengthening AN/NH Institutions
program divided by the number of undergraduate and graduate degrees awarded. The $1,940 value for
2003 reflects an appropriation of $8,180,479 divided by 4,216 graduates. The $2,532 value for 2004
reflects an appropriation of $10,935,100 divided by 4,318 graduates. For FY 2006-2008, this is an
efficiency measure without targets. Data for FY 2005 will be available in December 2006.
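The efficiency measure above is appropriation dollars divided by degrees awarded; a small sketch checks the two reported values. The dollar and graduate counts come from the report; the helper function name is ours.

```python
# Efficiency measure: program appropriation divided by degrees awarded.
# Figures come from the report; the helper name is illustrative.
def cost_per_outcome(appropriation: float, degrees: int) -> float:
    return appropriation / degrees

print(round(cost_per_outcome(8_180_479, 4_216)))   # 1940 (FY 2003)
print(round(cost_per_outcome(10_935_100, 4_318)))  # 2532 (FY 2004)
```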
Measure 3.2 of 3: The percentage of undergraduate students at four-year Alaska Native and
Native Hawaiian Serving Institutions who graduate within six years of enrollment. (Desired
direction: increase)
Year Target Actual (or date expected) Status
2003 27 Measure not in place
2004 28 Measure not in place
2006 27 (December 2007) Pending
2007 27 (December 2008) Pending
2008 27 (December 2009) Pending
2009 27 (December 2010) Pending
2010 27 (December 2011) Pending
2011 27 (December 2012) Pending
Source. U. S. Department of Education, National Center for Education Statistics, Integrated
Postsecondary Education Data System (IPEDS). Web Site: http://nces.ed.gov/ipedspas.
Frequency of Data Collection. Annual
Data Quality. Data are supplied by institutions, which certify the accuracy of the data.
Explanation. Data for FY 2005 will be available in December 2006.
Measure 3.3 of 3: The percentage of students enrolled at two-year Alaska Native and Native
Hawaiian Serving Institutions who graduate within three years of enrollment. (Desired direction:
increase)
Year Target Actual (or date expected) Status
2003 16 Measure not in place
2004 14 Measure not in place
2006 16 (December 2007) Pending
2007 16 (December 2008) Pending
2008 16 (December 2009) Pending
2009 16 (December 2010) Pending
2010 16 (December 2011) Pending
2011 16 (December 2012) Pending
Source. U. S. Department of Education, National Center for Education Statistics, Integrated
Postsecondary Education Data System (IPEDS). Web Site: http://nces.ed.gov/ipedspas.
Frequency of Data Collection. Annual
Data Quality. Data are supplied by institutions, which certify the accuracy of the data.
Explanation. Data for FY 2005 will be available in December 2006.
This program has a long-term target of 231,443 for FY 2009. We will report data annually but there are no
intermediate annual targets. The target is derived by applying the difference between regression-based
predicted values from Title IV institutions and actual HBCU grantee values for school year 2002-03, which
was 12.1 percent. Therefore, the HBCU program actual enrollment of 206,332 in FY 2003 was multiplied
by 1.121 to generate the long-term target of 231,443. Data for FY 2006 will be available in December
2006.
Actual data and targets were calculated using IPEDS fall enrollment data for all graduate students. These
values replace data previously reported from other sources. This program has a long-term target of
14,148 for FY 2009. We will continue to report data annually, but there are no intermediate annual targets.
The target was derived by applying an estimated annual rate of increase, based on program experience,
to the period between FY 2003 and FY 2009. Annual increases are estimated to be 1.0 percent through
2009 and 0.5 percent beginning in 2010.
Objective 2 of 3: Increase the persistence rate for students enrolled at SIP Institutions.
Measure 2.1 of 1: The percentage of full-time undergraduate students who were in their first
year of postsecondary enrollment in the previous year and are enrolled in the current year at the
same SIP institution. (Desired direction: increase)
Year Target Actual (or date expected) Status
2004 66 Measure not in place
2005 63 Measure not in place
2006 68 (December 2006) Pending
2007 68 (December 2007) Pending
2008 68 (December 2008) Pending
2009 68 (December 2009) Pending
2010 68 (December 2010) Pending
2011 68 (December 2011) Pending
Source. U. S. Department of Education, National Center for Education Statistics, Integrated
Postsecondary Education Data System (IPEDS). Web Site: http://nces.ed.gov/ipedspas/.
Frequency of Data Collection. Annual
Data Quality. Data are supplied by institutions, which certify the accuracy of the data.
Explanation. Institutions report a persistence rate, not the numerator and denominator. As a result, the
persistence rate for the SIP program is calculated as a median. The target is derived by applying the
difference between regression-based predicted values from Title IV institutions and actual values for
school year 2002-03, which was 1.67 percent. Therefore, the SIP program actual persistence rate of 67
percent in FY 2003 was multiplied by 1.0167 to generate the long-term target (for 2009) of 68 percent.
Annual increases are estimated to be 0.3 percent each year through 2009 and 0.2 percent beginning in
2010.
Objective 3 of 3: Increase the graduation rate for students enrolled at SIP Institutions.
Measure 3.1 of 3: Cost per successful program outcome: the average federal cost for
undergraduate and graduate degrees at SIP institutions. (Desired direction: decrease)
Year Target Actual (or date expected) Status
2003 3,975 Measure not in place
2004 3,678 Measure not in place
2006 999 (December 2007) Pending
2007 999 (December 2008) Pending
2008 999 (December 2009) Pending
Source. U. S. Department of Education, National Center for Education Statistics, Integrated
Postsecondary Education Data System (IPEDS). U.S. Department of Education, Office of Chief Financial
Officer, Grant Administration and Payment System (GAPS).
Frequency of Data Collection. Annual
Data Quality. Data are supplied by institutions, which certify the accuracy of the data.
Explanation. This measure is calculated as the appropriation for the Strengthening Institutions Program
divided by the number of undergraduate and graduate degrees awarded. The 2003 actual value reflects
an appropriation of $81.467 million divided by 20,495 graduates. Data for FY 2005 are estimated to be
available in December 2006. For FY 2006-2008, this is an efficiency measure without targets.
Measure 3.2 of 3: The percentage of students enrolled at four-year SIPs graduating within six
years of enrollment. (Desired direction: increase)
Year Target Actual (or date expected) Status
2003 45 Measure not in place
2004 47 Measure not in place
2006 47 (December 2007) Pending
2007 47 (December 2008) Pending
2008 48 (December 2009) Pending
2009 48 (December 2010) Pending
2010 48 (December 2011) Pending
2011 48 (December 2012) Pending
Source. U. S. Department of Education, National Center for Education Statistics, Integrated
Postsecondary Education Data System (IPEDS). Web Site: http://nces.ed.gov/ipedspas/.
Frequency of Data Collection. Annual
Data Quality. Data are supplied by institutions, which certify the accuracy of the data.
Explanation. The target for the four-year graduation rate is derived by applying the difference between
regression-based predicted values from Title IV institutions and actual values for school year 2002-03,
which was 6.33 percent. Therefore, the SIP program actual four-year graduation rate of 45 percent in FY
2003 was multiplied by 1.0633 to generate the long-term target (for 2009) of 48 percent.
Measure 3.3 of 3: The percentage of students enrolled at two-year SIPs who graduate within
three years of enrollment. (Desired direction: increase)
Year Target Actual (or date expected) Status
2003 25 Measure not in place
2004 26 Measure not in place
2006 25 (December 2007) Pending
2007 26 (December 2008) Pending
2008 26 (December 2009) Pending
2009 26 (December 2010) Pending
2010 26 (December 2011) Pending
2011 26 (December 2012) Pending
Source. U. S. Department of Education, National Center for Education Statistics, Integrated
Postsecondary Education Data System (IPEDS). Web Site: http://nces.ed.gov/ipedspas/.
Frequency of Data Collection. Annual
Data Quality. Data are supplied by institutions, which certify the accuracy of the data.
Explanation. Program experience was used to estimate targets. An increase of 0.5 percent was used to
generate annual targets each year through 2009 and an increase of 0.3 percent was used beginning in
2010. Data for FY 2005 will be available in December 2006.
This program has a long-term target of 10,000 for FY 2009. We will continue to report data annually, but
there are no intermediate annual targets. Data for FY 2006 will be available in April 2007.
Objective 2 of 3: Maintain or increase the persistence rate for students enrolled at TCCUs.
Measure 2.1 of 1: The percentage of full-time undergraduate students who were in their first
year of postsecondary enrollment in the previous year and are enrolled in the current year at the
same Tribally Controlled Colleges and Universities institution. (Desired direction: increase)
Year Target Actual (or date expected) Status
2004 41 Measure not in place
2005 48 Measure not in place
2006 41 44 Target Exceeded
2007 41 (December 2007) Pending
2008 41 (December 2008) Pending
2009 42 (December 2009) Pending
2010 42 (December 2010) Pending
2011 43 (December 2011) Pending
2012 43 (December 2012) Pending
Source. U. S. Department of Education, National Center for Education Statistics, Integrated
Postsecondary Education Data System (IPEDS). Web Site: http://nces.ed.gov/ipedspas.
Frequency of Data Collection. Annual
Data Quality. Data are supplied by institutions, which certify the accuracy of the data.
Explanation. Institutions report a persistence rate, not the numerator and denominator. As a result, the
persistence rate for the TCCUs is calculated as a median.
Objective 3 of 3: Maintain or increase the graduation rate for students enrolled at TCCUs.
Measure 3.1 of 2: The percentage of students enrolled at four-year Tribally Controlled Colleges
and Universities graduating within six years of enrollment. (Desired direction: increase)
Year Target Actual (or date expected) Status
2003 23 Measure not in place
2004 32 Measure not in place
2006 32 (December 2007) Pending
Measure 3.2 of 2: The percentage of students enrolled at two-year Tribally Controlled Colleges
and Universities who graduate within three years of enrollment. (Desired direction: increase)
Year Target Actual (or date expected) Status
2003 40 Measure not in place
2004 34 Measure not in place
2006 29 (December 2007) Pending
2007 29 (December 2008) Pending
2008 29 (December 2009) Pending
2009 29 (December 2010) Pending
2010 29 (December 2011) Pending
2011 29 (December 2012) Pending
Source. U. S. Department of Education, National Center for Education Statistics, Integrated
Postsecondary Education Data System (IPEDS). Web Site: http://nces.ed.gov/ipedspas.
Frequency of Data Collection. Annual
Data Quality. Data are supplied by institutions, which certify the accuracy of the data.
Explanation.
Graduation rate data first became available from the Integrated Postsecondary Education Data System
(IPEDS) for FY 2003. Data for FY 2005 will be available in December 2006.
Measure 1.2 of 2: The percentage of Stupak scholarship recipients in their senior year of study
who graduate. (Desired direction: increase)
Year Target Actual (or date expected) Status
2006 Set a Baseline (March 2008) Pending
2007 Maintain a Baseline (March 2009) Pending
2008 Maintain a Baseline (March 2010) Pending
Source. U.S. Department of Education, Office of Postsecondary Education, B.J. Stupak Olympic
Scholarships Program Annual Performance Report.
Frequency of Data Collection. Annual
Explanation. The PY 2005-2006 performance report has been revised to start collecting data on
students' academic standing (i.e., first year, sophomore, junior, and senior) and graduation. Data for FY
2005 will establish the baseline and will be available in March 2007.
Objective 3 of 3: The Byrd Honors Scholarships Program will increase its efficiency.
Measure 3.1 of 1: The cost per successful outcome: the federal cost per Byrd recipient student
who successfully persists or graduates. (Desired direction: decrease)
Year Target Actual (or date expected) Status
2004 1,866 Measure not in place
2006 999 (September 2007) Pending
2007 999 (September 2008) Pending
2008 999 (September 2009) Pending
Source. U. S. Department of Education, Office of Postsecondary Education, Robert C. Byrd Honors
Scholarship Program Annual Performance Report. U.S. Department of Education, Office of Chief
Financial Officer, Grant Administration and Payment System (GAPS).
Frequency of Data Collection. Annual
Data Quality. Data are based on state reports of varying quality and accuracy. The program office is
revising the grantee report forms to improve the quality and comprehensiveness of data.
Explanation. The efficiency measure for Byrd Honors Scholarships for FY 2004 was calculated by
dividing the appropriation for FY 2003 by the number of students persisting and completing during the
2003-04 school year: $40,734,493 / 21,830 = $1,866.
Objective 2 of 2: A majority of CAMP students who successfully complete their first year of
college will continue in postsecondary education.
Measure 2.1 of 1: The percentage of College Assistance Migrant Program (CAMP) participants
who, after completing their first year of college, continue their postsecondary education. (Desired
direction: increase)
Year Target Actual (or date expected) Status
2001 78 Measure not in place
Explanation. Although no target was established for FY 2003, data will be collected.
Program Goal: To improve the quality of higher education for students with
disabilities.
Objective 1 of 1: Ensure that faculty and administrators in institutions of higher education
increase their capacity to provide a high-quality education to students with
disabilities.
Measure 1.1 of 2: The percentage of faculty trained through project activities who incorporate
elements of their training into their classroom teaching. (Desired direction: increase)
Year  Target  Actual (or date expected)  Status
2006 Set a Baseline (December 2006) Pending
2007 Maintain a Baseline (December 2007) Pending
2008 Maintain a Baseline (December 2008) Pending
Source. U.S. Department of Education, Office of Postsecondary Education, Demonstration Projects to
Ensure Quality Higher Education for Students with Disabilities Program Grantee Performance Reports.
Measure 1.2 of 2: The difference between the rate at which students with documented
disabilities complete courses taught by faculty trained through project activities and the rate at
which other students complete the same courses. (Desired direction: decrease)
Year  Target  Actual (or date expected)  Status
2006 Set a Baseline (December 2006) Pending
2007 Maintain a Baseline (December 2007) Pending
2008 Maintain a Baseline (December 2008) Pending
Source. U.S. Department of Education, Office of Postsecondary Education, Demonstration Projects to
Ensure Quality Higher Education for Students with Disabilities Program Grantee Performance Reports.
Frequency of Data Collection. Annual
Explanation. Program staff changed reporting requirements to include these data beginning with the
2005-06 project year.
Measure 1.2 of 2: The percentage of GEAR UP students who passed Algebra 1 by the end of
the 9th grade. (Desired direction: increase)
Year  Target  Actual (or date expected)  Status
2003 19 30 Target Exceeded
Data for 2003 reflect the percentage of GEAR UP students who were successfully enrolled in Algebra 1
by the end of 9th grade. Data beginning in 2004 are collected on successful completion of core academic
subjects and other college-preparatory courses. Standards to enter and complete above-grade-level math
courses are becoming more rigorous, which may limit the percentage of students in many schools
served by GEAR UP who enter and complete such courses.
Measure 2.2 of 3: The percentage of GEAR UP students who have knowledge of necessary
academic preparation for college. (Desired direction: increase)
Year  Target  Actual (or date expected)  Status
FY 2006 Program Performance Report
56 11/14/2006
U.S. Department of Education
2001 50 Measure not in place
2002 53 Measure not in place
2003 54 57 Target Exceeded
2004 56 62 Target Exceeded
2005 61 63.05 Target Exceeded
2006 64 64 Target Met
2007 75 (August 2007) Pending
2008 999 (August 2008) Pending
Source. U.S. Department of Education, Office of Postsecondary Education, Gaining Early Awareness
and Readiness for Undergraduate Programs (GEAR-UP) annual program performance reports.
Frequency of Data Collection. Annual
Data Quality. Program staff review performance report data for quality, clarity, and consistency and to
assess the extent to which project objectives are being accomplished.
Explanation. Data reflect the percentages of GEAR UP students who have talked to school counselors,
advisors, or someone else about academic preparation for college and college entrance requirements.
Measure 2.3 of 3: The percentage of parents of GEAR UP students who have knowledge of
necessary academic preparation for college. (Desired direction: increase)
Year  Target  Actual (or date expected)  Status
2001 31 Measure not in place
2002 39 Measure not in place
2003 40 43 Target Exceeded
2004 42 42 Target Met
2005 46 49.02 Target Exceeded
2006 47 38.4 Did Not Meet Target
2007 48 (August 2007) Pending
2008 999 (August 2008) Pending
Source. U.S. Department of Education, Office of Postsecondary Education, Gaining Early Awareness
and Readiness for Undergraduate Programs (GEAR-UP) annual program performance reports.
Frequency of Data Collection. Annual
Data Quality. Program staff review performance report data for quality, clarity, and consistency and to
assess the extent to which project objectives are being accomplished.
Explanation. Data reflect the percentages of GEAR UP students' parents who have talked to school
counselors, advisors, or someone else about academic preparation for college and college entrance
requirements.
Measure 1.2 of 8: Percentage of GAANN fellows who are American Indian or Alaska Native by
grantee cohort enrolled in a terminal degree program in the designated areas of national need.
(Desired direction: increase)
Year Target Actual Status
Measure 1.3 of 8: Percentage of GAANN fellows who are Asian/Pacific Islander by grantee
cohort enrolled in a terminal degree program in the designated areas of national need. (Desired
direction: increase)
Year  Target  Actual (or date expected)  Status
1999 10 Measure not in place
2001 7 Measure not in place
2002 11 Measure not in place
2003 Set a Baseline 6 Target Met
2004 6 9 Target Exceeded
2005 8 9 Target Exceeded
2006 11 (June 2007) Pending
2007 11 (June 2008) Pending
2008 11 (June 2009) Pending
2009 11 (June 2010) Pending
2010 11 (June 2011) Pending
2011 11 (June 2012) Pending
Source. U.S. Department of Education, Office of Postsecondary Education, GAANN Final Performance
Report.
Frequency of Data Collection. Annual
Measure 1.4 of 8: Percentage of GAANN fellows who are Black or African American by grantee
cohort enrolled in a terminal degree program in the designated areas of national need. (Desired
direction: increase)
Year  Target  Actual (or date expected)  Status
1999 7 Measure not in place
2001 7 Measure not in place
2002 10 Measure not in place
2003 Set a Baseline 7 Target Met
2004 7 7 Target Met
2005 7 17.6 Target Exceeded
2006 10 (June 2007) Pending
2007 10 (June 2008) Pending
2008 10 (June 2009) Pending
2009 10 (June 2010) Pending
2010 10 (June 2011) Pending
2011 10 (June 2012) Pending
Source. U.S. Department of Education, Office of Postsecondary Education, GAANN Final Performance
Report.
Frequency of Data Collection. Annual
Data Quality. Program data are supplied by institutions, which certify the accuracy of the data.
Explanation. The authorizing legislation of the GAANN program recommends, but does not mandate,
that institutional grantees seek individuals from traditionally underrepresented groups when awarding
fellowships. In responding to the grant application selection criteria, however, grantees must address
plans to include students from underrepresented groups. The 2002 data reflect the 1997 cohort only.
Succeeding years represent two cohorts: 2003 data reflect the fellows from the 1998 and 2000 cohorts.
No grants were awarded in 1999. The 2004 data reflect fellows from the 2000 and 2001 cohorts.
Measure 1.5 of 8: Percentage of GAANN fellows who are Hispanic or Latino by grantee cohort
enrolled in a terminal degree program in the designated areas of national need. (Desired
direction: increase)
Year  Target  Actual (or date expected)  Status
1999 4 Measure not in place
2001 7 Measure not in place
2002 5 Measure not in place
2003 Set a Baseline 2 Target Met
2004 2 9 Target Exceeded
2005 6 6.8 Target Exceeded
2006 5 (June 2007) Pending
2007 5 (June 2008) Pending
2008 5 (June 2009) Pending
2009 5 (June 2010) Pending
2010 5 (June 2011) Pending
2011 5 (June 2012) Pending
Source. U.S. Department of Education, Office of Postsecondary Education, GAANN Final Performance
Report.
Frequency of Data Collection. Annual
Data Quality. Program data are supplied by institutions, which certify the accuracy of the data.
Explanation. The authorizing legislation of the GAANN program recommends, but does not mandate,
that institutional grantees seek individuals from traditionally underrepresented groups when awarding
fellowships. In responding to the grant application selection criteria, however, grantees must address
plans to include students from underrepresented groups. The 2002 data reflect the 1997 cohort only.
Succeeding years represent two cohorts: 2003 data reflect the fellows from the 1998 and 2000 cohorts.
No grants were awarded in 1999. The 2004 data reflect fellows from the 2000 and 2001 cohorts.
Measure 1.6 of 8: The percentage of GAANN fellows who are women. (Desired direction:
increase)
Year  Target  Actual (or date expected)  Status
1999 37 Measure not in place
2001 39 Measure not in place
2002 38 Measure not in place
2003 Set a Baseline 35 Target Met
2004 35 41 Target Exceeded
2005 39 41 Target Exceeded
2006 39 (June 2007) Pending
2007 40 (June 2008) Pending
2008 40 (June 2009) Pending
2009 41 (June 2010) Pending
2010 41 (June 2011) Pending
2011 42 (June 2012) Pending
Source. U.S. Department of Education, Office of Postsecondary Education, GAANN Final Performance
Report.
Frequency of Data Collection. Annual
Data Quality. Program data are supplied by institutions, which certify the accuracy of the data.
Explanation. The authorizing legislation of the GAANN program recommends, but does not mandate,
that institutional grantees seek individuals from traditionally underrepresented groups when awarding
fellowships. In responding to the grant application selection criteria, however, grantees must address
plans to include students from underrepresented groups. The 2002 data reflect the 1997 cohort only.
Succeeding years represent two cohorts: 2003 data reflect the fellows from the 1998 and 2000 cohorts.
No grants were awarded in 1999. The 2004 data reflect fellows from the 2000 and 2001 cohorts.
Measure 1.8 of 8: Federal cost of GAANN Ph.D.s and those who pass preliminary exams over
the life of the grant. (Desired direction: decrease)
Year  Target  Actual (or date expected)  Status
2002 92,557 Measure not in place
2003 127,514 Measure not in place
2005 70,359 Measure not in place
2006 127,500 (September 2007) Pending
2008 92,000 (September 2009) Pending
2009 91,000 (September 2010) Pending
2011 89,000 (September 2012) Pending
2012 88,000 (September 2013) Pending
Source. U.S. Department of Education, Office of Postsecondary Education, GAANN Final Performance
Report.
Frequency of Data Collection. Annual
Data Quality. The number of successful GAANN fellows is supplied by institutions, which certify the
accuracy of the data.
Explanation. The program office has developed a database to collect this information. The measure is
derived by dividing the total funding for grant years one, two, and three by the number of GAANN Ph.D.s
and fellows who pass preliminary exams during that period. The 2002 information is based on the 1997
cohort. The 2003 information was based on the 1998 and 2000 cohorts; information for 2004 was
based on the 2000 and 2001 cohorts. New grants are not awarded every third year, so there were no
cohorts of new fellows in 1999 or 2002. Targets have been established based on an estimated maximum
stipend, an estimated institutional payment, and the completion rate measure targets.
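The derivation described in the explanation above can be sketched as a short calculation. All figures below are hypothetical, for illustration only; only the formula comes from the report:

```python
def gaann_cost_per_outcome(funding_by_year, successful_fellows):
    # Total funding for grant years one through three divided by the number of
    # GAANN Ph.D.s plus fellows passing preliminary exams in that period.
    return sum(funding_by_year[:3]) / successful_fellows

# Hypothetical cohort figures for illustration only (not from the report).
print(gaann_cost_per_outcome([30_000_000, 30_000_000, 30_000_000], 1_000))  # 90000.0
```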
Program Goal: To meet the nation's security and economic needs through the
development of a national capacity in foreign languages, and
area and international studies.
Objective 1 of 9: The National Resource Centers (NRC) Program provides grants to
institutions of higher education or consortia of institutions of higher education
to establish, strengthen, and operate comprehensive and undergraduate
language and area/international studies centers.
Measure 1.1 of 2: The percentage of National Resource Center Ph.D. graduates who find
employment in higher education, government, and national security. (Desired direction:
increase)
Year  Target  Actual (or date expected)  Status
2001 48.5 Measure not in place
2002 53.7 Measure not in place
2003 55 Measure not in place
2004 47 71.8 Target Exceeded
2005 47.5 (December 2006) Pending
2006 48 (December 2007) Pending
2007 48.5 (December 2008) Pending
2008 49 (December 2009) Pending
2009 49.5 (December 2010) Pending
2010 50 (December 2011) Pending
2011 50.5 (December 2012) Pending
Source. U.S. Department of Education, Office of Postsecondary Education, Evaluation of Exchange,
Language, International, and Area Studies (EELIAS) Reporting System, National Resource Centers
(NRC) Program, annual and final reports (Web Site: http://www.eelias.org).
Frequency of Data Collection. Annual
Data Quality. Data are supplied by institutions, which certify the accuracy of the data.
Explanation. Government employment reflects employment in federal government. Employment in
national security is represented by military employment. This measure will be phased out in future years.
Objective 2 of 9: The Foreign Language and Area Studies (FLAS) Fellowship Program
provides academic year and summer fellowships to institutions of higher
education to assist graduate students in foreign language and either area or
international studies.
Measure 2.1 of 3: The average competency score of Foreign Language and Area Studies
(FLAS) Fellowship Program recipients at the end of one full year of instruction minus the
average score at the beginning of the year. (Desired direction: increase)
Year  Target  Actual (or date expected)  Status
2003 1.3 Measure not in place
2004 1.2 1.22 Target Exceeded
2005 1.2 1.2 Target Met
2006 1.2 (December 2006) Pending
2007 1.2 (December 2007) Pending
2008 1.2 (December 2008) Pending
2009 1.2 (December 2009) Pending
2010 1.2 (December 2010) Pending
2011 1.2 (December 2011) Pending
Source. U.S. Department of Education, Office of Postsecondary Education, Evaluation of Exchange,
Language, International, and Area Studies (EELIAS) Reporting System, Foreign Language and Area
Studies (FLAS) Fellowship Program, annual and final reports (Web Site: http://www.eelias.org).
Frequency of Data Collection. Annual
Data Quality. Data are supplied by institutions, which certify the accuracy of the data.
Explanation. Overall change in the language competency self-assessment reflects a mix of different
levels of improvement at all stages (beginner, intermediate, advanced) of the three modalities of language
acquisition that the assessment measures (reading, writing, speaking). Beginning language students may
be expected to make larger advances over a given time period (and therefore have larger change scores).
Measure 2.2 of 3: Cost per successful outcome: the federal cost per Foreign Language and Area
Studies (FLAS) Fellowship Program recipient who increases their average competency score by at least
one point. (Desired direction: decrease)
Year  Target  Actual (or date expected)  Status
2004 17,439 Measure not in place
2006 999 (December 2007) Pending
2007 999 (December 2008) Pending
2008 999 (December 2009) Pending
Source. U.S. Department of Education, Office of Postsecondary Education, Evaluation of Exchange,
Language, International, and Area Studies (EELIAS) Reporting System, Foreign Language and Area
Studies (FLAS) Fellowship Program, annual and final reports (Web Site: http://www.eelias.org).
Frequency of Data Collection. Annual
Data Quality. Data are supplied by institutions, which certify the accuracy of the data.
Explanation. The calculation is the annual appropriation for FLAS divided by the number of FLAS
fellowship recipients increasing their average competency score by at least one point from pre- to post-
test. Data for FY 2005 will be available in December 2006. For FY 2006-2008, this will be an efficiency
measure without targets.
Measure 2.3 of 3: The percentage of Foreign Language and Area Studies (FLAS) Fellowship
Program Ph.D. graduates who find employment in higher education, government, and
national security. (Desired direction: increase)
Year  Target  Actual (or date expected)  Status
2004 16 Measure not in place
2006 17 (December 2007) Pending
2007 18 (December 2008) Pending
2008 999 (December 2009) Pending
Source. U.S. Department of Education, Office of Postsecondary Education, Evaluation of Exchange,
Language, International, and Area Studies (EELIAS) Reporting System, Foreign Language and Area
Studies (FLAS) Fellowship Program, annual and final reports (Web Site: http://www.eelias.org).
Frequency of Data Collection. Annual
Data Quality. Data are supplied by institutions, which certify the accuracy of the data.
Explanation. Government employment reflects employment in federal government. Employment in
national security is represented by military employment.
Measure 3.2 of 2: The percentage of Centers for International Business Education Ph.D.
graduates who find employment in higher education and government. (Desired direction:
increase)
Year  Target  Actual (or date expected)  Status
2004 77.9 Measure not in place
2006 77.9 (December 2007) Pending
2007 77.9 (December 2008) Pending
2008 999 (December 2009) Pending
Source. U.S. Department of Education, Office of Postsecondary Education, Evaluation of Exchange,
Language, International, and Area Studies (EELIAS) Reporting System, Centers for International
Business Education Program, annual and final reports (Web Site: http://www.eelias.org).
Frequency of Data Collection. Annual
Data Quality. Data are supplied by institutions, which certify the accuracy of the data.
Explanation. Government employment reflects employment in federal government. Data for FY 2005 will
be available in December 2006.
Objective 4 of 9: The International Research and Studies (IRS) Program supports surveys,
studies, and instructional materials development to improve and strengthen
instruction in modern foreign languages, area studies, and other international
fields to provide a full understanding of the places in which the foreign languages are commonly used.
Measure 4.1 of 3: Percentage of International Research and Studies Program projects judged
to be successful by the program officer, based on a review of information provided in annual
performance reports. (Desired direction: increase)
Year  Target  Actual (or date expected)  Status
2006 Maintain a Baseline (January 2007) Pending
2007 Maintain a Baseline (January 2008) Pending
2008 999 (January 2009) Pending
Measure 4.2 of 3: Number of outreach activities that result in adoption or further dissemination
within a year, divided by the total number of IRS outreach activities conducted in the current
reporting period. (Desired direction: increase)
Year  Target  Actual (or date expected)  Status
2006 Set a Baseline (January 2007) Pending
2007 Maintain a Baseline (January 2008) Pending
2008 999 (January 2009) Pending
Source. U.S. Department of Education, Office of Postsecondary Education, Evaluation of Exchange,
Language, International, and Area Studies (EELIAS) Reporting System, International Research and
Studies (IRS) Program, annual and final reports (Web Site: http://www.eelias.org).
Frequency of Data Collection. Annual
Data Quality. Data are supplied by institutions, which certify the accuracy of the data.
Explanation. Data for FY 2005 will be available in March 2007.
Measure 4.3 of 3:
Measure 5.2 of 3: Cost of Language Resource Centers project that results in adoption or further
dissemination within a year. (Desired direction: decrease)
Year  Target  Actual (or date expected)  Status
2006 Maintain a Baseline (March 2008) Pending
2007 Maintain a Baseline (March 2009) Pending
Source. U.S. Department of Education, Office of Postsecondary Education, Evaluation of Exchange,
Language, International, and Area Studies (EELIAS) Reporting System, Language Resource Centers
Programs, annual and final reports (Web Site: http://www.eelias.org).
Frequency of Data Collection. Annual
Data Quality. Data are supplied by institutions, which certify the accuracy of the data.
Explanation. The calculation is the number of project activities that result in adoption or further
dissemination within a year, divided by the total number of LRC projects funded in the same year. FY
2005 data will be used as the baseline.
Measure 5.3 of 3:
Objective 7 of 9: The Business and International Education (BIE) Program provides funds to
institutions of higher education that enter into an agreement with a trade
association and/or business for two purposes: to improve the academic
teaching of the business curriculum and to conduct outreach activities that
expand the capacity of the business community to engage in international
economic activities.
Measure 7.1 of 3: Percentage of Business and International Education Program projects judged
to be successful by the program officer, based on a review of information provided in annual
performance reports. (Desired direction: increase)
Year  Target  Actual (or date expected)  Status
2006 Maintain a Baseline (December 2007) Pending
2007 Maintain a Baseline (December 2008) Pending
2008 999 (December 2009) Pending
Source. U.S. Department of Education, Office of Postsecondary Education, Evaluation of Exchange,
Language, International, and Area Studies (EELIAS) Reporting System, Business and International
Education Program, annual and final reports (Web Site: http://www.eelias.org).
Frequency of Data Collection. Annual
Data Quality. Data are supplied by institutions, which certify the accuracy of the data.
Explanation. FY 2005 data will be used as the baseline and will be available in December 2006.
Measure 7.2 of 3: Number of outreach activities that result in adoption or further dissemination
within a year, divided by the total number of BIE outreach activities conducted in the current
reporting period. (Desired direction: increase)
Measure 9.2 of 3: Percentage of scholars who indicated they were "highly satisfied" with the
services the Center provided. (Desired direction: increase)
Year  Target  Actual (or date expected)  Status
2006 Maintain a Baseline (December 2007) Pending
2007 Maintain a Baseline (December 2008) Pending
2008 999 (December 2009) Pending
Source. U.S. Department of Education, Office of Postsecondary Education, Evaluation of Exchange,
Language, International, and Area Studies (EELIAS) Reporting System, American Overseas Research
Centers, annual and final reports (Web Site: http://www.eelias.org).
Frequency of Data Collection. Annual
Data Quality. Data are supplied by institutions, which certify the accuracy of the data.
Explanation. FY 2005 data will be used as the baseline and will be available in March 2007.
Program Goal: To meet the nation's security and economic needs through the
development of a national capacity in foreign languages, and
area and international studies.
Objective 1 of 2: Maintain a U.S. higher education system able to produce experts in less
commonly taught languages and area studies who are capable of
contributing to the needs of the U.S. government and national security.
Measure 1.1 of 1: Percentage of employed Institute for International Public
Policy graduates in government or international service. (Desired direction: increase)
Year  Target  Actual (or date expected)  Status
2006 Maintain a Baseline (December 2007) Pending
2007 Maintain a Baseline (December 2008) Pending
2008 Maintain a Baseline (December 2009) Pending
Source. U.S. Department of Education, Office of Postsecondary Education, Evaluation of Exchange,
Language, International, and Area Studies (EELIAS) Reporting System, International Education and
Foreign Language Studies Institute for International Public Policy Program, annual and final reports (Web
Site: http://www.eelias.org).
Frequency of Data Collection. Annual
Data Quality. Data are supplied by institutions, which certify the accuracy of the data.
Explanation. FY 2005 data will be used to establish a baseline and will be available in December 2006.
For FY 2006 and FY 2007, the target is to maintain the baseline.
Measure 2.2 of 2: Efficiency measure: cost per International Education and Foreign Language
Studies Institute for International Public Policy graduate employed in government or international
service. (Desired direction: decrease)
Year  Target  Actual (or date expected)  Status
2006 Maintain a Baseline (December 2007) Pending
2007 Maintain a Baseline (December 2008) Pending
2008 999 (December 2009) Pending
Source. U.S. Department of Education, Office of Postsecondary Education, Evaluation of Exchange,
Language, International, and Area Studies (EELIAS) Reporting System, International Education and
Foreign Language Studies Institute for International Public Policy Program, annual and final reports (Web
Site: http://www.eelias.org).
Frequency of Data Collection. Annual
Data Quality. Data are supplied by institutions, which certify the accuracy of the data.
Explanation. The calculation is the grant allocation amount divided by the number of master's degrees
completed by program participants. FY 2005 data will be used to establish a baseline and will be available
in December 2006.
Measure 1.2 of 3: The average time in years to degree completion for Javits fellows. (Desired
direction: decrease)
Year  Target  Actual (or date expected)  Status
2003 6.3 Measure not in place
2004 6.3 Measure not in place
2005 6.3 (February 2007) Pending
2006 6.3 (February 2008) Pending
2007 6.2 (February 2009) Pending
2008 6.2 (February 2010) Pending
2009 6.1 (February 2011) Pending
2010 6.1 (February 2012) Pending
2011 6 (February 2013) Pending
Source. U.S. Department of Education, Office of Postsecondary Education, Jacob K. Javits Fellowship
Program Annual Performance Report.
Frequency of Data Collection. Annual
Data Quality. Program data are supplied by institutions, which certify the accuracy of the data.
Measure 1.3 of 3: The Federal cost per terminal degree (PhD/MFA) awarded for the Javits
Fellowship Program. (Desired direction: decrease)
Year  Target  Actual (or date expected)  Status
2003 109,873 Measure not in place
2004 110,000 Measure not in place
2006 999 (December 2007) Pending
2007 999 (December 2008) Pending
2008 999 (December 2009) Pending
Source. U.S. Department of Education, Office of Postsecondary Education, Jacob K. Javits Fellowship
Program Annual Performance Report; U.S. Department of Education, Office of Chief Financial Officer,
Grant Administration and Payment System (GAPS).
Frequency of Data Collection. Annual
Data Quality. Program data are supplied by institutions, which certify the accuracy of the data.
Explanation. Efficiency data are determined by dividing the total dollars allocated to the cohorts
by the total number of Javits fellows receiving a terminal degree during the same time frame.
The baseline was calculated using appropriation amounts for fiscal years 1998 through 2001 and school-
year data for 1998-99 through 2001-02. Over time, uses for this efficiency measure may include
comparing the cost per successful outcome for the Javits Program with that of comparable
programs.
Measure 1.2 of 5: The percent reduction of origination and disbursement unit costs, compared
to FY 2005. (Desired direction: increase)
Year  Target  Actual (or date expected)  Status
2006 Set a Baseline (January 2007) Pending
2008 10 Pending
Measure 1.3 of 5: The percent reduction of Direct Loan Servicing unit costs, compared to FY
2005. (Desired direction: increase)
Year  Target  Actual (or date expected)  Status
2006 Set a Baseline (January 2007) Pending
2008 12 Pending
2010 12 Pending
Measure 1.4 of 5: The percent reduction of Collections unit costs, compared to FY 2005.
(Desired direction: increase)
Year  Target  Actual (or date expected)  Status
2006 Set a Baseline (January 2007) Pending
2008 14 Pending
2010 14 Pending
Measure 1.5 of 5: The President's Management Agenda Scorecard rating for the Improper
Payments Initiative. (Desired direction: increase)
Year  Target  Actual (or date expected)  Status
2005 1 Measure not in place
2006 2 Pending
2010 3 Pending
Source. Executive Office of the President, Office of Management and Budget, President's Management
Agenda Scorecard.
Frequency of Data Collection. Annual
Explanation. In the first quarter of FY 2005, OMB introduced a new President's Management Agenda
(PMA) initiative, Eliminating Improper Payments, to support agency efforts to meet the Improper
Payments Information Act of 2002 (IPIA) reporting requirements. This initiative makes it easier for
agencies to track the progress of activities aimed at identifying, reporting on and reducing improper
payments. At the same time, it provides for more comprehensive agency accountability to OMB through
quarterly PMA scorecards. Federal Student Aid is working closely with OMB to develop an action plan
designed to (a) reduce the amount of improper payments in our programs, (b) lower the risk of improper
payments in our programs, and (c) improve the accuracy of our improper payment estimates. In FY 2005,
FSA received a red rating; the FY 2006 target is yellow, and the FY 2010 target is green.
Measure 1.2 of 2: The percentage of TRIO McNair participants persisting in graduate school.
(Desired direction: increase)
Year  Target  Actual (or date expected)  Status
1999 48 Measure not in place
2000 48 75 Target Exceeded
2001 66 Measure not in place
2002 48 65 Target Exceeded
2003 75 78 Target Exceeded
2004 75 77.7 Target Exceeded
2005 70 80 Target Exceeded
2006 79 (December 2007) Pending
2007 79 (December 2008) Pending
2008 79.5 (December 2009) Pending
2009 79.5 (December 2010) Pending
2010 80 (December 2011) Pending
2011 80 (December 2012) Pending
Source. U.S. Department of Education, Office of Postsecondary Education, Ronald E. McNair
Postbaccalaureate Achievement Program Annual Performance Report.
Frequency of Data Collection. Annual
Data Quality. The primary data source is the annual performance report, which consists of self-reported
data.
Measure 1.3 of 4: The percentage of Student Support Services first-year students completing a
bachelor's degree at the original institution within six years. (Desired direction: increase)
Year  Target  Actual (or date expected)  Status
1999 29 Measure not in place
2001 29 Pending
2002 29 Pending
2003 29.5 Pending
2004 30 28.1 Made Progress From Prior Year
2005 30.5 29.4 Made Progress From Prior Year
2006 28 (December 2007) Pending
2007 29 (December 2008) Pending
2008 29 (December 2009) Pending
2009 29.5 (December 2010) Pending
2010 29.5 (December 2011) Pending
2011 30 (December 2012) Pending
2012 30 (December 2013) Pending
Source. U.S. Department of Education, Office of Postsecondary Education, Student Support Services
Program Annual Performance Report.
Frequency of Data Collection. Annual
Data Quality. The annual performance reports consist of self-reported data; a variety of data
quality checks are used to assess the completeness and reasonableness of the data submitted.
Explanation. 2004 is the first year for which graduation data for four-year schools were available from the
annual performance reports.
Measure 1.4 of 4: The gap between the cost per successful outcome and the cost per program
Measure 1.2 of 3: The percentage of TRIO Talent Search participants enrolling in college.
(Desired direction: increase)
Year  Target  Actual (or date expected)  Status
2000 73 Measure not in place
2001 77 Measure not in place
2002 78 Measure not in place
2003 79 Measure not in place
2004 73.5 77.6 Target Exceeded
2005 74 77.8 Target Exceeded
2006 78.5 (December 2007) Pending
2007 79 (December 2008) Pending
2008 79 (December 2009) Pending
2009 79.5 (December 2010) Pending
2010 79.5 (December 2011) Pending
2011 80 (December 2012) Pending
Source. U.S. Department of Education, Office of Postsecondary Education, Talent Search Program
Annual Performance Reports.
Frequency of Data Collection. Annual
Data Quality. The annual performance reports consist of self-reported data; a variety of data
quality checks are used to assess the completeness and reasonableness of the data submitted.
Explanation. Future targets were recalculated in FY 2006 because the enrollment rate has increased
significantly since 2000, the year on which the targets were initially based.
Measure 1.3 of 3: The percentage of TRIO Talent Search participants applying for financial aid.
(Desired direction: increase)
Year  Target  Actual (or date expected)  Status
2000 82 Measure not in place
2001 86 Measure not in place
2002 86 Measure not in place
2003 85.6 Measure not in place
2004 85.1 Measure not in place
2005 85.4 Measure not in place
2006 86 (December 2007) Pending
2007 86.5 (December 2008) Pending
2008 86.5 (December 2009) Pending
2009 87 (December 2010) Pending
The gap is the difference between the cost per output, which is the annual allocated appropriation divided
by the number of students receiving services, and cost per successful outcome, which is the annual
allocated appropriation divided by the number of students who persist in high school and enroll in college.
The cost per successful outcome has been calculated for 2003, 2004 and 2005 and is $6,340, $6,579 and
$6,138, respectively. For FY 2007-2008, this is an efficiency measure without targets.
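The gap calculation described above can be sketched in Python. The appropriation and student counts below are hypothetical, for illustration only; they are not the program's actual figures.

```python
def cost_gap(appropriation: float, students_served: int, successful_outcomes: int) -> float:
    """Gap between cost per successful outcome and cost per output.

    Cost per output = appropriation / students receiving services.
    Cost per successful outcome = appropriation / students who persist
    in high school and enroll in college.
    """
    cost_per_output = appropriation / students_served
    cost_per_successful_outcome = appropriation / successful_outcomes
    return cost_per_successful_outcome - cost_per_output

# Hypothetical figures (not actual program data):
# $10,000,000 appropriation, 2,500 students served, 1,600 successful outcomes.
gap = cost_gap(10_000_000, 2_500, 1_600)
print(gap)  # 2250.0
```

A larger gap indicates that successful outcomes cost substantially more than outputs, which is why the desired direction for such a measure is a decrease.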
Measure 1.3 of 3: The percentage of higher-risk Upward Bound participants enrolling in college.
(Desired direction: increase)
Year    Target    Actual (or date expected)    Status
2000 34 Measure not in place
2003 35 Not Collected Not Collected
2004 35.5 (November 2007) Pending
2005 36 (November 2008) Pending
2006 36.5 (November 2009) Pending
2007 37 (November 2010) Pending
2008 999 (November 2011) Pending
Source. U.S. Department of Education, Office of Postsecondary Education, Upward Bound Program
Annual Performance Report.
Data Quality. The annual performance report consists of self-reported data; a variety of data quality
checks are used to assess the completeness and reasonableness of the data submitted. The definition of
higher-risk student used in the 2001 national evaluation is somewhat different from the criteria used by
Upward Bound projects funded under the Upward Bound Initiative.
Program Goal: Individuals who are deaf-blind will become independent and
function as full and productive members of their local
community.
Objective 1 of 2: Individuals who are deaf-blind receive the specialized services and training
they need to become as independent and self-sufficient as possible.
Measure 1.1 of 4: The number of Helen Keller National Center adult consumers served at
headquarters. (Desired direction: increase)
Year    Target    Actual (or date expected)    Status
1999 85 75 Made Progress From Prior Year
2000 90 82 Made Progress From Prior Year
2001 90 87 Made Progress From Prior Year
Measure 1.2 of 4: The percentage of adult consumers served by Helen Keller National Center
headquarters who successfully achieve identified training goals. (Desired direction: increase)
Year    Target    Actual (or date expected)    Status
1999 84 Pending
2000 85 Pending
2001 86 92 Target Exceeded
2002 90 Measure not in place
2003 88 Measure not in place
2004 88 90 Target Exceeded
2005 88 89 Target Exceeded
2006 88 93 Target Exceeded
2007 90 (October 2007) Pending
2008 90 (October 2008) Pending
2009 90 (October 2009) Pending
2010 90 (October 2010) Pending
2011 90 (October 2012) Pending
Source. Helen Keller National Center for Deaf-Blind Youths and Adults, internal client caseload reports.
Frequency of Data Collection. Annual
Data Quality. Final transition plans for each client will include the employment and living situations each
client will be entering upon completion of training. Data are self-reported.
Measure 1.3 of 4: The percentage of adult consumers served by Helen Keller National Center
headquarters who are successfully placed in less restrictive settings. (Desired direction:
increase)
Year    Target    Actual (or date expected)    Status
1999 25 Pending
2000 49 Pending
2001 59 71 Target Exceeded
Measure 1.4 of 4: The percentage of adult consumers served by Helen Keller National Center
headquarters who are successfully placed in employment. (Desired direction: increase)
Year    Target    Actual (or date expected)    Status
1999 38 45 Target Exceeded
2000 45 52 Target Exceeded
2001 45 38 Did Not Meet Target
2002 45 27 Did Not Meet Target
2003 42.5 Measure not in place
2004 45 46 Target Exceeded
2005 45 41 Did Not Meet Target
2006 45 46 Target Exceeded
2007 45 (October 2007) Pending
2008 45 (October 2008) Pending
2009 45 (October 2009) Pending
2010 45 (October 2010) Pending
2011 45 (October 2011) Pending
Source. Helen Keller National Center for Deaf-Blind Youths and Adults, internal client caseload reports.
Frequency of Data Collection. Annual
Data Quality. Data are self-reported.
Explanation. Final transition plans for each client will include the employment and living situations each
client will be entering upon completion of training.
Measure 2.2 of 4: The percentage of consumers served by Helen Keller National Center
regional programs who successfully secure employment. (Desired direction: increase)
Year    Target    Actual (or date expected)    Status
2006 Set a Baseline Pending
2007 Set a Baseline (October 2007) Pending
2008 BL+1% (October 2008) Pending
2009 BL+2% (October 2009) Pending
Source. U.S. Department of Education, Helen Keller National Center for Deaf-Blind Youths and Adults,
annual report.
Frequency of Data Collection. Annual
Measure 2.3 of 4: The percentage of consumers served by Helen Keller National Center
regional programs who successfully retain employment. (Desired direction: increase)
Year    Target    Actual (or date expected)    Status
2006 Set a Baseline Pending
2007 Set a Baseline (October 2007) Pending
2008 Maintain a Baseline (October 2008) Pending
2009 Maintain a Baseline (October 2009) Pending
Source. U.S. Department of Education, Helen Keller National Center for Deaf-Blind Youths and Adults,
annual report.
Frequency of Data Collection. Annual
Measure 2.4 of 4: The percentage of consumers served by Helen Keller National Center
regional programs who successfully achieve/maintain independent living outcomes. (Desired
direction: increase)
Year    Target    Actual (or date expected)    Status
2006 Set a Baseline Pending
2007 Set a Baseline (October 2007) Pending
2008 Maintain a Baseline (October 2008) Pending
2009 Maintain a Baseline (October 2009) Pending
Program Goal: To meet the nation's security and economic needs through the
development of a national capacity in foreign languages, and
area and international studies.
Objective 1 of 4: The Fulbright-Hays Doctoral Dissertation Research Abroad (DDRA)
program provides grants to colleges and universities to fund individual
doctoral students to conduct research in other countries in modern foreign
languages and area studies for periods of 6 to 12 months.
Measure 1.1 of 3: The average language competency score of Fulbright-Hays Doctoral
Dissertation Research Abroad (DDRA) fellowship recipients at the end of their period of
instruction minus their average score at the beginning of the period. (Desired direction:
increase)
Year    Target    Actual (or date expected)    Status
2006 Maintain a Baseline (December 2007) Pending
2007 0.75 (December 2008) Pending
2008 0.75 (December 2009) Pending
Source. U.S. Department of Education, Office of Postsecondary Education, Evaluation of Exchange,
Language, International, and Area Studies (EELIAS) Reporting System, Fulbright-Hays Doctoral
Dissertation Research Abroad, annual and final reports (Web Site: http://www.eelias.org).
Frequency of Data Collection. Annual
Data Quality. Data are supplied by institutions, which certify the accuracy of the data.
Explanation. FY 2005 data will be available in December 2006 and will be used to establish the baseline
and validate the FY 2007 and FY 2008 targets.
Data Quality. Data are supplied by institutions, which certify the accuracy of the data. The target for
FY 2006 and 2007 is to maintain the baseline.
Explanation. FY 2005 data will be available by April 2007 and will be used to establish the baseline. For
FY 2006 and 2007, the target is to maintain the baseline.
Measure 1.3 of 3: Efficiency measure: cost per grantee increasing language competency by at
least one level in one (or all three) areas. (Desired direction: decrease)
Year    Target    Actual (or date expected)    Status
2006 Maintain a Baseline (December 2007) Pending
2007 Maintain a Baseline (December 2008) Pending
2008 Maintain a Baseline (December 2009) Pending
Source. U.S. Department of Education, Office of Postsecondary Education, Evaluation of Exchange,
Language, International, and Area Studies (EELIAS) Reporting System, Fulbright-Hays Doctoral
Dissertation Research Abroad, annual and final reports (Web Site: http://www.eelias.org).
Frequency of Data Collection. Annual
Data Quality. Data are supplied by institutions, which certify the accuracy of the data.
Explanation. The calculation is the annual appropriation for DDRA divided by the number of DDRA
recipients who increase their language competency appropriately. FY 2005 data will be available in
December 2006 and will establish the baseline. For FY 2007, the target is to maintain the baseline.
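The efficiency calculation just described can be sketched as a simple division; the appropriation and recipient count below are hypothetical, for illustration only.

```python
def cost_per_gaining_recipient(appropriation: float, recipients_gaining: int) -> float:
    # Annual DDRA appropriation divided by the number of fellowship recipients
    # who increased language competency by at least one level.
    return appropriation / recipients_gaining

# Hypothetical figures (not actual program data):
# $5,000,000 appropriation, 125 recipients gaining at least one level.
print(cost_per_gaining_recipient(5_000_000, 125))  # 40000.0
```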
Measure 2.1 of 3: The average language competency score of Fulbright-Hays Faculty Research
Abroad program recipients at the end of their period of instruction minus their average language
competency at the beginning of the period.
Data Quality. Data are supplied by institutions, which certify the accuracy of the data.
Explanation. Data for FY 2005 will be used as the baseline and will be available in December 2006.
Measure 2.2 of 3: Percentage of all Fulbright-Hays Faculty Research Abroad program projects
judged to be successful by the program officer, based on a review of information provided in
annual performance reports. (Desired direction: increase)
Year    Target    Actual (or date expected)    Status
2006 Maintain a Baseline (April 2008) Pending
2007 Maintain a Baseline (April 2009) Pending
2008 Maintain a Baseline (April 2010) Pending
Source. U.S. Department of Education, Office of Postsecondary Education, Evaluation of Exchange,
Language, International, and Area Studies (EELIAS) Reporting System, Fulbright-Hays Faculty Research
Abroad, annual and final reports (Web Site: http://www.eelias.org).
Frequency of Data Collection. Annual
Data Quality. Data are supplied by institutions, which certify the accuracy of the data.
Explanation. FY 2005 data will be available by April 2007 and will be used to establish the baseline. For
FY 2007, the target is to maintain the baseline.
Data Quality. Data are supplied by institutions, which certify the accuracy of the data.
Explanation. The calculation is the annual appropriation for FRA divided by the number of FRA recipients
who increase their language competency by at least one level in any of the three components of the
language proficiency assessment at the end of their period of instruction. 2004-05 data will establish a
baseline and will be available in December 2006.
Objective 3 of 4: The Fulbright-Hays Group Projects Abroad (GPA) program provides grants
to support overseas projects in training, research, and curriculum
development in modern foreign languages and area studies by teachers,
students, and faculty engaged in a common endeavor.
Measure 3.1 of 3: The difference between the average language competency of Fulbright-Hays
Group Projects Abroad program recipients at the end of their period of instruction and their
average competency at the beginning of the period.
Data Quality. Data are supplied by institutions, which certify the accuracy of the data.
Explanation. FY 2005 data will be used as the baseline and will be available in December 2006.
Measure 3.2 of 3:
Data Quality. Data are supplied by institutions, which certify the accuracy of the data.
Explanation. FY 2005 data will be available by April 2007 and will establish the baseline. For FY 2007,
the target is to maintain the baseline.
Measure 3.3 of 3: Efficiency measure: cost per grantee increasing language competency by at
least one level in one (or all three) areas.
Data Quality. Data are supplied by institutions, which certify the accuracy of the data.
Explanation. FY 2005 data will be available by April 2007 and will establish the baseline. For FY 2007,
the target is to maintain the baseline.
Data Quality. Data are supplied by institutions, which certify the accuracy of the data.
Explanation. The calculation is the annual appropriation for SA divided by the number of successfully
completed projects. FY 2005 data will be used as the baseline and will be available in December 2006.
Objective 2 of 2: Accurately identify problem areas requiring systemic change and engage in
systemic activity to improve services under the Rehabilitation Act.
Measure 2.1 of 1: The percentage of Client Assistance Programs (CAPs) that reported that their
systemic advocacy resulted in a change in policy or practice. (Desired direction: increase)
Year    Target    Actual (or date expected)    Status
1999 43 Measure not in place
2000 44 44 Target Met
2001 45 45 Target Met
2002 46 54 Target Exceeded
2003 47 48 Target Exceeded
2004 49 57 Target Exceeded
FY 2006 Program Performance Report
107 11/14/2006
U.S. Department of Education
2005 50 (December 2006) Pending
2006 54 (December 2007) Pending
2007 60 (December 2008) Pending
2008 60 (December 2009) Pending
2009 60 (December 2010) Pending
2010 60 (December 2011) Pending
Source. U.S. Department of Education, Client Assistance Program, Annual Performance Report, RSA-
227, narrative section.
Frequency of Data Collection. Annual
Data Quality. Department of Education program specialists conduct appropriate reviews of annual data.
On-site, files are randomly sampled and cross-checked with reported data to verify data quality.
Measure 1.2 of 4: The percentage of Independent Living Centers consumers who report having
access to previously unavailable appropriate accommodations to receive health care services,
as a result of direct services provided by an Independent Living Center (including referral to
another service provider). (Desired direction: increase)
Year    Target    Actual (or date expected)    Status
2006 Set a Baseline (May 2007) Pending
2007 Maintain a Baseline (May 2008) Pending
2008 BL+2% (May 2009) Pending
2009 BL+4% (May 2010) Pending
2010 BL+5% (May 2011) Pending
2011 BL+6% (August 2012) Pending
Source. U.S. Department of Education, Independent Living Centers, Annual 704 report.
Frequency of Data Collection. Annual
Data Quality. Data are self-reported.
Measure 1.3 of 4: The percentage of Independent Living Centers consumers who report having
access to previously unavailable assistive technology which results in increased independence
in at least one significant life area, as a result of direct services provided by an Independent
Living Center (including referral to another service provider). (Desired direction: increase)
Year    Target    Actual (or date expected)    Status
2006 Set a Baseline (May 2007) Pending
2007 Maintain a Baseline (May 2008) Pending
2008 BL+2% (May 2009) Pending
2009 BL+4% (May 2010) Pending
2010 BL+6% (May 2011) Pending
2011 BL+8% (May 2012) Pending
Source. U.S. Department of Education, Independent Living Centers, Annual 704 report.
Frequency of Data Collection. Annual
Data Quality. Data are self-reported.
Measure 1.4 of 4: The percentage of Independent Living Centers consumers who report having
access to previously unavailable transportation, as a result of direct services provided by an
Independent Living Center (including referral to another service provider). (Desired direction:
increase)
Year    Target    Actual (or date expected)    Status
2006 Set a Baseline (May 2007) Pending
2007 Maintain a Baseline (May 2008) Pending
2008 BL+2% (May 2009) Pending
2009 BL+4% (May 2010) Pending
2010 BL+6% (May 2011) Pending
2011 BL+8% (May 2012) Pending
Source. U.S. Department of Education, Independent Living Centers, Annual 704 report.
Frequency of Data Collection. Annual
Data Quality. Data are self-reported.
Measure 2.2 of 4: The percentage of Independent Living Centers' staff, board members and/or
consumers creating/participating on community committees, advocacy initiatives, public
information campaigns, and/or other community events designed to develop relationships with
health care providers within the community. (Desired direction: increase)
Year    Target    Actual (or date expected)    Status
2006 Set a Baseline (May 2007) Pending
2007 Maintain a Baseline (May 2008) Pending
2008 BL+2% (May 2009) Pending
2009 BL+3% (May 2010) Pending
2010 BL+4% (May 2011) Pending
2011 BL+5% (May 2012) Pending
Source. U.S. Department of Education, Independent Living Centers, Annual 704 report.
Frequency of Data Collection. Annual
Data Quality. Data are self-reported.
Measure 2.3 of 4: The percentage of Independent Living Centers' staff, board members and/or
consumers creating/participating on community committees, advocacy initiatives, public
information campaigns, and/or other community events designed to increase the availability
/access to assistive technology within the community. (Desired direction: increase)
Year    Target    Actual (or date expected)    Status
2006 Set a Baseline (May 2007) Pending
2007 Maintain a Baseline (May 2008) Pending
2008 BL+2% (May 2009) Pending
2009 BL+3% (May 2010) Pending
2010 BL+4% (May 2011) Pending
2011 BL+5% (May 2012) Pending
Source. U.S. Department of Education, Independent Living Centers, Annual 704 report.
Frequency of Data Collection. Annual
Data Quality. Data are self-reported.
Objective 3 of 3: Improve the efficiency and transparency of the Centers for Independent
Living Program.
Measure 3.1 of 1: The number of months between the due date for Independent Living Centers
data and the release of the data to the public. (Desired direction: decrease)
Year    Target    Actual (or date expected)    Status
2004 7 Measure not in place
2005 5 6 Made Progress From Prior Year
2006 5 (May 2007) Pending
2007 5 (May 2008) Pending
2008 5 (May 2009) Pending
2009 5 (May 2010) Pending
2010 5 (May 2011) Pending
2011 5 (May 2012) Pending
Source. U.S. Department of Education, Rehabilitation Services Administration, program files.
Frequency of Data Collection. Annual
Explanation. FY 2004 data established the baseline.
Measure 1.2 of 2: The percentage of Independent Living Title VII, Chapter 2, consumers who
report an improvement in activities of daily living skills. (Desired direction: increase)
Year    Target    Actual (or date expected)    Status
2005 Set a Baseline 53 Target Met
2006 BL+1% (May 2007) Pending
2007 55 (May 2008) Pending
2008 56 (May 2009) Pending
2009 57 (May 2010) Pending
2010 58 (May 2011) Pending
2011 59 (May 2012) Pending
Source. U.S. Department of Education, Independent Living Services for Older Blind Individuals, Annual
7-OB report.
Frequency of Data Collection. Annual
Objective 2 of 2: Improve the efficiency and transparency of the IL Title VII, Chapter 2 Older
Blind Program.
Measure 2.1 of 1: The number of months between when the Title VII, Chapter 2, data are due
and the release of the data to the public. (Desired direction: decrease)
Year    Target    Actual (or date expected)    Status
Measure 1.2 of 4: The percentage of Independent Living Part B consumers who report having
access to previously unavailable appropriate accommodations to receive health care services.
(Desired direction: increase)
Year    Target    Actual (or date expected)    Status
2006 Set a Baseline (May 2007) Pending
2007 Maintain a Baseline (May 2008) Pending
2008 BL+2% (May 2009) Pending
2009 BL+3% (May 2010) Pending
2010 BL+4% (May 2011) Pending
2011 BL+5% (May 2012) Pending
Source. U.S. Department of Education, Independent Living State Grants, Annual 704 report.
Frequency of Data Collection. Annual
Data Quality. Data are self-reported.
Measure 1.3 of 4: The percentage of Independent Living Part B consumers who report having
access to previously unavailable assistive technology which resulted in increased independence
in at least one significant life area. (Desired direction: increase)
Year    Target    Actual (or date expected)    Status
2006 Set a Baseline (May 2007) Pending
2007 Maintain a Baseline (May 2008) Pending
2008 BL+2% (May 2009) Pending
2009 BL+3% (May 2010) Pending
2010 BL+4% (May 2011) Pending
2011 BL+5% (May 2012) Pending
Source. U.S. Department of Education, Independent Living State Grants, Annual 704 report.
Frequency of Data Collection. Annual
Data Quality. Data are self-reported.
Measure 1.4 of 4: The percentage of consumers who are receiving or have received
Independent Living services who report satisfaction with the Independent Living services they
received. (Desired direction: increase)
Year    Target    Actual (or date expected)    Status
2006 Set a Baseline (May 2007) Pending
2007 Maintain a Baseline (May 2008) Pending
2008 BL+2% (May 2009) Pending
2009 BL+3% (May 2010) Pending
2010 BL+4% (May 2011) Pending
2011 BL+5% (May 2012) Pending
Source. U.S. Department of Education, Independent Living State Grants, Annual 704 report.
Frequency of Data Collection. Annual
Data Quality. Data are self-reported.
Objective 2 of 2: Improve the efficiency and transparency of the IL Title VII, Part B
Independent Living Program.
Measure 2.1 of 1: The number of months between the due date for Independent Living Centers-
Part B data and the release of the data to the public. (Desired direction: decrease)
Year    Target    Actual (or date expected)    Status
2004 7 Measure not in place
Measure 1.2 of 2: The percentage of migrant or seasonal farmworkers with disabilities served
by VR Migrant and Seasonal Farmworkers projects who were placed in employment. (Desired
direction: increase)
Year    Target    Actual (or date expected)    Status
2002 53 Measure not in place
2003 59 Measure not in place
2004 53 65 Target Exceeded
2005 53 60.3 Target Exceeded
2006 53 (August 2007) Pending
2007 53 (August 2008) Pending
2008 53 (August 2009) Pending
2009 53 (August 2010) Pending
Program Goal: To create and expand job and career opportunities for individuals
with disabilities in the competitive labor market by engaging the
participation of business and industry in the rehabilitation
process.
Objective 1 of 2: Ensure that Projects with Industry (PWI) services (through partnerships with
business and industry) result in competitive employment, increased wages,
and job retention for individuals with disabilities.
Measure 1.1 of 3: The percentage of individuals served by Projects with Industry who were
placed in competitive employment. (Desired direction: increase)
Year    Target    Actual (or date expected)    Status
1997 59 Measure not in place
1998 49 Measure not in place
1999 61 59 Made Progress From Prior Year
2000 61 61.9 Target Exceeded
2001 62 62.4 Target Exceeded
2002 62.2 63.2 Target Exceeded
2003 62.4 54.2 Did Not Meet Target
2004 62.7 61.5 Made Progress From Prior Year
2005 63 51.9 Did Not Meet Target
2006 63 (March 2007) Pending
2007 55 (March 2008) Pending
2008 56 (March 2009) Pending
Source. U.S. Department of Education, Projects With Industry Compliance Indicators and Annual
Evaluation Plan Report.
Frequency of Data Collection. Annual
Data Quality. The Web-based system makes a limited number of automatic edit checks. Staff also check
data for "reasonableness." The primary limitation of the data is that they are self-reported.
The PWI data collection instrument was revised for FY 2005 reporting. The reporting change resulted in a
significant reduction in the reporting of the number of “new” individuals served in the reporting period and
inconsistencies in the grantee reported data on number served as compared to previous years. To correct
for this problem, the FY 2005 “placement rate” has been calculated as the “percentage of individuals
served who were placed into competitive employment” of the total number of individuals served by the
projects during the reporting period. In prior fiscal years, the placement rate was calculated based on
grantee reported data on the total “new” individuals served in the reporting period. This change in
calculation resulted in a significantly lower placement rate as compared to previous years.
Explanation. FY 2006 is the beginning of a new 3-year grant cycle.
Measure 1.2 of 3: The percentage of exiting Projects with Industry participants who are placed
in competitive employment. (Desired direction: increase)
Year    Target    Actual (or date expected)    Status
2006 Set a Baseline (March 2007) Pending
2007 Maintain a Baseline (March 2008) Pending
2008 Maintain a Baseline (March 2009) Pending
Source. U.S. Department of Education, Projects With Industry Compliance Indicators and Annual
Evaluation Plan Report.
Frequency of Data Collection. Annual
Data Quality. FY 2006 is the first year for which data on the number of individuals who exited the
program were collected. Based on previous experience, we expect that inconsistencies in the reporting of
these data may continue in 2007 and 2008.
Explanation. The measure will be calculated by dividing the number of participants placed in competitive
employment by the total number of participants who exited the program.
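The placement-rate calculation just described can be sketched as follows; the participant counts below are hypothetical, for illustration only.

```python
def placement_rate(placed: int, exited: int) -> float:
    # Percentage of exiting participants placed in competitive employment:
    # participants placed, divided by all participants who exited the program.
    return 100 * placed / exited

# Hypothetical counts (not actual program data):
# 310 placed out of 500 who exited.
print(placement_rate(310, 500))  # 62.0
```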
Measure 1.3 of 3: The average increase in weekly earnings for Projects with Industry
participants who were placed in competitive employment. (Desired direction: increase)
Year    Target    Actual (or date expected)    Status
1997 207 Measure not in place
1998 209 Measure not in place
1999 209 226 Target Exceeded
2000 218 252 Target Exceeded
2001 218 236 Target Exceeded
2002 226 234 Target Exceeded
2003 231 242 Target Exceeded
2004 233 247 Target Exceeded
2005 238 253 Target Exceeded
2006 245 (March 2007) Pending
2007 248 (March 2008) Pending
2008 250 (March 2009) Pending
2009 250 (March 2010) Pending
2010 250 (March 2011) Pending
Source. U.S. Department of Education, Projects With Industry Compliance Indicators and Annual
Evaluation Plan Report.
Frequency of Data Collection. Annual
Data Quality. A web-based reporting system conducts automatic edit checks.
Explanation. FY 2006 is the beginning of a new 3-year grant cycle.
Objective 2 of 2: Ensure that all PWI projects demonstrate effective fiscal management.
Program Goal: To support the protection and advocacy system in each state to
protect the legal and human rights of individuals with disabilities.
Objective 1 of 1: Identify problem areas requiring systemic change and engage in systemic
activities to address those problems.
Measure 1.1 of 1: The percentage of PAIRs that reported that their systemic advocacy resulted
in a change in policy or practice. (Desired direction: increase)
Year    Target    Actual (or date expected)    Status
2000 54 Measure not in place
2001 68 Measure not in place
2002 56 81 Target Exceeded
2003 75 Measure not in place
2004 77 86 Target Exceeded
2005 79 89 Target Exceeded
2006 80 (April 2007) Pending
2007 83 (April 2008) Pending
2008 83 (April 2009) Pending
2009 83 (April 2010) Pending
2010 83 (April 2011) Pending
Source. U.S. Department of Education, Annual Protection and Advocacy of Individual Rights (PAIR)
Program, Annual Performance Report, RSA Form 509.
Frequency of Data Collection. Annual
Data Quality. Appropriate reviews of annual data are conducted by Department program specialists.
Onsite, a random sample of files is cross-checked with reported data for verification.
Measure 1.3 of 3: The percentage of individuals referred from state VR agencies to the VR
Demonstration and Training projects. (Desired direction: increase)
Year    Target    Actual (or date expected)    Status
2001 58 35.64 Made Progress From Prior Year
2002 58 37.34 Made Progress From Prior Year
2003 60 27.55 Did Not Meet Target
2004 62 31.44 Made Progress From Prior Year
2005 33 40.71 Target Exceeded
2006 33 (March 2007) Pending
2007 33 (March 2008) Pending
2008 33 (March 2009) Pending
Source. U.S. Department of Education, Rehabilitation Services Administration, Web-based Annual
Performance Report.
Frequency of Data Collection. Annual
Data Quality. The Web-based system provides raw data but does not aggregate all the numbers needed,
which has resulted in hand counting to obtain the information required.
Explanation. Actual performance data were re-calculated for FY 2001 through 2004 to include only
projects with employment outcomes.
The Program Assessment Rating Tool assessment noted that these outcomes may be inflated due to the
variation in practices with respect to the closure of an individual's service record. Continued technical
assistance will help to ensure that grantees are providing uniform data.
Objective 2 of 2: Ensure that all AIVRS projects demonstrate effective fiscal management.
Measure 2.1 of 1: The percentage of Vocational Rehabilitation Grants for Indians projects that
Measure 1.2 of 6: The percentage of state vocational rehabilitation agencies for the blind that
assist at least 68.9 percent of individuals receiving services to achieve employment. (Desired
direction: increase)
Year    Target    Actual (or date expected)    Status
2001 75 Measure not in place
2002 75 Measure not in place
2003 58 Measure not in place
Measure 1.3 of 6: The percentage of general and combined state vocational rehabilitation
agencies for which at least 80 percent of the individuals achieving competitive employment have
significant disabilities. (Desired direction: increase)
Year    Target    Actual (or date expected)    Status
2002 75 Measure not in place
2003 82 Measure not in place
2004 86 Measure not in place
2005 88 Measure not in place
2006 88 (April 2007) Pending
2007 89 (April 2008) Pending
2008 90 (April 2009) Pending
Source. U.S. Department of Education, Rehabilitation Services Administration, RSA-911 report.
Frequency of Data Collection. Annual
Data Quality. Verified by the Department's attestation process and the Department's Standards for
Evaluating Program Performance Data. Accuracy/consistency of reporting is contingent upon counselors'
interpretations of definitions.
Explanation. This indicator is derived from state VR agency performance on indicator 1.4, in Section 106
of the Rehabilitation Act. For each VR agency, RSA examines the percentage of individuals achieving
competitive employment who have significant disabilities. To pass the Section 106 indicator, a
general/combined agency must achieve a rate of 62.4 percent. For purposes of this measure, beginning
with FY 2006, RSA decided that the criterion was too low and therefore increased the rate to 80
percent for general and combined agencies.
Measure 1.4 of 6: The percentage of state vocational rehabilitation agencies for the blind for
which at least 90 percent of the individuals achieving competitive employment have significant
disabilities. (Desired direction: increase)
Year    Target    Actual (or date expected)    Status
2002 88 Measure not in place
2003 88 Measure not in place
2004 100 Measure not in place
2005 100 Measure not in place
2006 96 (April 2007) Pending
2007 100 (April 2008) Pending
2008 100 (April 2009) Pending
Source. U.S. Department of Education, Rehabilitation Services Administration, RSA-911 report.
Frequency of Data Collection. Annual
Data Quality. Verified by the Department's attestation process and the Department's Standards for
Evaluating Program Performance Data. Accuracy/consistency of reporting is contingent upon counselors'
interpretations of definitions.
Explanation. This indicator is derived from state VR agency performance on indicator 1.4, in Section 106
of the Rehabilitation Act. For each VR agency, RSA examines the percentage of individuals achieving
competitive employment who have significant disabilities. To pass the Section 106 indicator an agency for
the blind must achieve a rate of 89 percent. For purposes of this measure, beginning with FY 2006,
RSA decided that the criterion was too low and therefore increased the rate to 90 percent for
agencies for the blind.
Measure 1.5 of 6: The percentage of general and combined state vocational rehabilitation
agencies assisting at least 85 percent of individuals with employment outcomes to achieve
competitive employment. (Desired direction: increase)
Year    Target    Actual (or date expected)    Status
2001 62.5 Measure not in place
2002 88 Measure not in place
2003 93 Measure not in place
2004 67 95 Target Exceeded
2005 89 95 Target Exceeded
2006 96 (April 2007) Pending
2007 98 (April 2008) Pending
2008 98 (April 2009) Pending
2009 98 (April 2010) Pending
2010 98 (April 2011) Pending
Source. U.S. Department of Education, Rehabilitation Services Administration, RSA-911 report.
Frequency of Data Collection. Annual
Data Quality. Verified by the Department's attestation process and the Department's Standards for
Evaluating Program Performance Data. Accuracy/consistency of reporting is contingent upon counselors'
interpretations of definitions.
Explanation. This long-term indicator is derived from state VR agency performance on indicator 1.3 in
Section 106 of the Rehabilitation Act. For each VR agency, RSA examines the percentage of individuals
who achieve competitive employment among all individuals who achieve employment. To pass the
Section 106 indicator, a general/combined agency must achieve a rate of 72.6 percent. For purposes of
this measure, beginning with the FY 2004 plan, RSA decided that the criterion was too low and therefore
increased the rate to 85 percent for general and combined VR agencies.
Measure 1.6 of 6: The percentage of state vocational rehabilitation agencies for the blind that
assist at least 65 percent of individuals with employment outcomes to achieve competitive
employment. (Desired direction: increase)
Year    Target    Actual (or date expected)    Status
2001 41.7 Measure not in place
2002 50 Measure not in place
2003 54 Measure not in place
2004 48 71 Target Exceeded
2005 54 75 Target Exceeded
2006 71 (April 2007) Pending
2007 75 (April 2008) Pending
2008 75 (April 2009) Pending
2009 79 (April 2010) Pending
2010 79 (April 2011) Pending
Source. U.S. Department of Education, Rehabilitation Services Administration, RSA-911 report.
Frequency of Data Collection. Annual
Data Quality. Verified by the Department's attestation process and the Department's Standards for
Evaluating Program Performance Data. Accuracy/consistency of reporting is contingent upon counselors'
interpretations of definitions.
Explanation. This long-term indicator is derived from state VR agency performance on indicator 1.3 in
Section 106 of the Rehabilitation Act. For each VR agency, RSA examines the percentage of individuals
who achieve competitive employment among all individuals who achieve employment. To pass the
Section 106 indicator, an agency for the blind must achieve a rate of 35.4 percent. For purposes of this
measure, beginning with the FY 2004 plan, RSA decided that the criterion was too low and therefore
increased the rate to 65 percent for agencies for the blind.
Measure 2.3 of 4: Percentage of general and combined State VR agencies that demonstrate an
average annual consumer expenditure rate of at least 83 percent. (Desired direction: increase)
Year    Target    Actual (or date expected)    Status
2005 75 Measure not in place
2006 Set a Baseline (May 2007) Pending
2007 77 (May 2008) Pending
2008 78 (May 2009) Pending
Source. U.S. Department of Education, Rehabilitation Services Administration, RSA-2 Cost Report.
Frequency of Data Collection. Annual
Explanation. This is a new efficiency measure. The consumer service expenditure rate is calculated by
dividing the state VR agency's consumer service expenditures by its total program expenditures.
Establishing the baseline includes both specifying the range and determining the percentage of agencies.
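Because the measure sets a threshold of at least 83 percent, the rate must be consumer service expenditures as a share of total program expenditures. A minimal sketch of the check, with hypothetical dollar figures:

```python
# Consumer service expenditure rate: consumer service expenditures as a
# percentage of total program expenditures. Dollar figures are hypothetical.

def expenditure_rate(consumer_expenditures: float, total_expenditures: float) -> float:
    return 100.0 * consumer_expenditures / total_expenditures

rate = expenditure_rate(consumer_expenditures=8_300_000, total_expenditures=10_000_000)
meets_criterion = rate >= 83.0  # the measure's 83 percent threshold
print(rate, meets_criterion)  # 83.0 True
```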
Measure 2.4 of 4: Percentage of State VR agencies for the Blind that demonstrate an average annual
consumer expenditure rate of at least 70 percent.
Program Goal: To provide the public vocational rehabilitation (VR) sector with
well-trained staff and to maintain and upgrade the skills of
current staff through continuing education.
Objective 1 of 2: To provide graduates who work within the vocational rehabilitation (VR)
system to help individuals with disabilities achieve their goals.
Measure 1.1 of 4: The number of Rehabilitation Services Administration (RSA)-supported
scholarships. (Desired direction: increase)
Year    Target    Actual (or date expected)    Status
1997 1,600 Measure not in place
1998 1,550 Measure not in place
1999 1,473 1,665 Target Exceeded
2000 1,391 2,390 Target Exceeded
2001 2,540 Measure not in place
2002 2,000 2,232 Target Exceeded
2003 2,050 2,378 Target Exceeded
2004 2,050 2,051 Target Exceeded
2005 2,100 2,332 Target Exceeded
2006 2,000 (March 2007) Pending
2007 2,254 (March 2008) Pending
2008 2,254 (March 2009) Pending
2009 2,254 (March 2010) Pending
2010 2,254 (March 2011) Pending
2011 2,254 (March 2012) Pending
2012 2,254 (March 2013) Pending
Measure 1.2 of 4: The number of RSA-supported scholars who graduate. (Desired direction:
increase)
Year    Target    Actual (or date expected)    Status
1997 800 Measure not in place
1998 817 Measure not in place
1999 729 832 Target Exceeded
2000 688 764 Target Exceeded
2001 841 Measure not in place
2002 700 817 Target Exceeded
2003 725 802 Target Exceeded
2004 725 796 Target Exceeded
2005 725 901 Target Exceeded
2006 725 (March 2007) Pending
2007 833 (March 2008) Pending
2008 833 (March 2009) Pending
2009 833 (March 2010) Pending
2010 833 (March 2011) Pending
2011 833 (March 2012) Pending
2012 833 (March 2013) Pending
Source. U.S. Department of Education, Vocational Rehabilitation Training, Annual performance report,
grantee submissions.
Frequency of Data Collection. Annual
Measure 1.4 of 4: The percentage of public vocational rehabilitation training participants who
report an improvement in their knowledge and skills acquisition. (Desired direction: increase)
Year    Target    Actual (or date expected)    Status
2006 Set a Baseline (October 2007) Pending
2007 Maintain a Baseline (October 2008) Pending
2008 Maintain a Baseline (October 2009) Pending
Source. U.S. Department of Education, Vocational Rehabilitation Training, Annual performance report,
grantee submissions.
Frequency of Data Collection. Annual
Data Quality. Evaluation instruments vary across projects.
Explanation. Out-year targets will be set after baseline data have been collected.
Objective 2 of 2: Maintain and upgrade the knowledge and skills of personnel currently
employed in the public VR system.
Measure 2.1 of 3: The percentage of currently employed state Vocational Rehabilitation agency
counselors who meet their state's Comprehensive System of Personnel Development (CSPD)
standards. (Desired direction: increase)
Year    Target    Actual (or date expected)    Status
2000 69 Measure not in place
2001 70 71 Target Exceeded
2002 75 63 Did Not Meet Target
2003 77 67 Made Progress From Prior Year
2004 79 67 Did Not Meet Target
2005 70 73 Target Exceeded
2006 70 (March 2007) Pending
2007 72 (March 2008) Pending
2008 73 (March 2009) Pending
2009 74 (March 2010) Pending
2010 75 (March 2011) Pending
2011 76 (March 2012) Pending
2012 77 (March 2013) Pending
Source. U.S. Department of Education, Vocational Rehabilitation Training, Annual grantee in-service
progress report, grantee submissions.
Frequency of Data Collection. Annual
Explanation. We anticipate a leveling off in performance: staff turnover is at an all-time high due to
retirements, and there is an insufficient pool of qualified candidates to fill vacated positions.
Source. U.S. Department of Education, Vocational Rehabilitation Training, Annual performance report,
grantee submissions.
Frequency of Data Collection. Other
Data Quality. Evaluation instruments vary across projects.
Data Quality. The degree of assessment comprehensiveness varies across projects.
The program has a long-term target of 7,334 for FY 2009. We will continue to report data annually, but
there are no intermediate annual targets. The target is derived from project experience and applies an
estimated 1.0 percent annual rate of increase over the period FY 2003 to FY 2009. Data for FY 2006 will
be available in April 2007.
Note: 6,419 represents a corrected number of students enrolled for 2003. The previously published
number incorrectly included part-time students.
Institutions report only a persistence rate, not the numerator and denominator. As a result, the
persistence rate for Howard University is calculated as a median. The persistence rate for Howard is high
compared to other institutions, so maintaining the present rate is viewed as an ambitious goal. Data for
FY 2006 will be available in April 2007. Note: The 90 percent persistence rate for FY 2005 had previously
been incorrectly reported for FY 2003.
Data are supplied by the institution, which certifies the accuracy of the data.
Explanation. The graduation rate for Howard is high compared to other institutions, so maintaining (or
slightly increasing) the present rate is viewed as an ambitious goal. Graduation data will be monitored and
reported annually. Previously published graduation rates for 1997-2002 have been deleted because the
graduation rate prior to 2003 was calculated differently. Graduation rate data for 2005 will be available in
April 2006.
Program Goal: To increase access to and improve vocational education that will
strengthen workforce preparation, employment opportunities,
and lifelong learning in the Indian community.
Objective 1 of 1: Ensure that vocational students served in tribally controlled postsecondary
vocational and technical institutions make successful transitions to work or
continuing education.
Measure 1.1 of 1: The percentage of vocational students in the Tribally Controlled
Postsecondary Vocational and Technical Institutions Programs who earn an A.A. degree or
certificate. (Desired direction: increase)
Year    Target    Actual (or date expected)    Status
Technical assistance continues to be provided to the grantees to assist in improving reporting procedures
and documentation.
Explanation. The program did not meet its FY 2005 target because one grantee has not submitted
performance data. To improve this situation, the grantee will provide status updates on completing the
performance reporting requirements. Data will be available by September 1, 2006.
Source. U.S. Department of Education, Office for Civil Rights, Case Management System.
Frequency of Data Collection. Annual
Target Context. Data represent the first three quarters of the fiscal year.
Explanation. In FY 2006, OCR changed this objective to measure new complaints that resolve during the
fiscal year and added an additional measure for complaints pending over 180 days. Together these
measures cover OCR's entire complaint workload.
Measure 2.2 of 2: Percentage of pending civil rights complaints that are over 180 days old.
(Desired direction: decrease)
Year    Target    Actual (or date expected)    Status
2006 25 21 Did Better Than Target
2007 25 Pending
Source. U.S. Department of Education, Office for Civil Rights, Case Management System.
Measure 1.2 of 2: The percentage of written reports that meet OIG timeliness standards.
(Desired direction: increase)
Year    Target    Actual (or date expected)    Status
2005 75 67 Made Progress From Prior Year
2006 75 84 Target Exceeded
2007 75 (October 2007) Pending
2008 75 (October 2008) Pending
Source. U.S. Department of Education, Office of Inspector General, audit, investigation and inspection
reports.
Frequency of Data Collection. Annual
Data Quality. Validation done internally by each OIG component.
Program Goal: To help American Indian and Alaska Native children achieve to
the same challenging standards expected of all students by
supporting access to programs that meet their unique
educational and culturally related academic needs.
Objective 1 of 1: American Indian and Alaska Native students served by LEAs receiving
Indian Education Formula Grants will progress at rates similar to those for all
students in achievement to standards, promotion, and graduation.
Measure 1.1 of 4: The percentage of American Indian and Alaska Native students in grade four
who scored at or above basic level in reading on NAEP. (Desired direction: increase)
Year    Target    Actual (or date expected)    Status
2000 63 Measure not in place
2002 60 51 Made Progress From Prior Year
2003 62 47 Did Not Meet Target
2005 53 48 Made Progress From Prior Year
2007 50 (August 2008) Pending
Source. U.S. Department of Education, National Center for Education Statistics, National Assessment of
Educational Progress, National Indian Education Study 2005.
Frequency of Data Collection. Biennial
Data Quality. Data validated by National Center for Education Statistics review procedures and statistical
standards.
Explanation. NAEP assessments for reading and mathematics are not administered annually. The next
assessment in reading and math is scheduled for 2007. American Indian and Alaska Native students
were oversampled for the 2005 NAEP assessments in reading and mathematics to increase the reliability
of the data.
Measure 1.2 of 4: The percentage of American Indian and Alaska Native students in grade eight
who scored at or above basic level in reading on NAEP. (Desired direction: increase)
Year    Target    Actual (or date expected)    Status
2002 61 Measure not in place
2003 66 57 Did Not Meet Target
2005 63 59 Made Progress From Prior Year
2007 61 (June 2008) Pending
Source. U.S. Department of Education, National Center for Education Statistics, National Assessment of
Educational Progress.
Explanation. NAEP assessments for reading and mathematics are not administered annually. The next
assessment in reading and math is scheduled for 2007. American Indian and Alaska Native students were
oversampled for the 2005 NAEP assessments in reading and mathematics to increase the reliability of the
data.
Measure 1.3 of 4: The percentage of American Indian and Alaska Native students in grade four
who scored at or above basic level in math on NAEP. (Desired direction: increase)
Year    Target    Actual (or date expected)    Status
2000 40 Measure not in place
2002 64 Not Collected Not Collected
2003 66 64 Made Progress From Prior Year
2005 66 68 Target Exceeded
2007 69 (June 2008) Pending
Source. U.S. Department of Education, National Center for Education Statistics, National Assessment of
Educational Progress.
Frequency of Data Collection. Biennial
Data Quality. Data validated by National Center for Education Statistics review procedures and statistical
standards. The small sample (for the subpopulation of American Indian and Alaska Native students)
means there is a high degree of standard error surrounding the estimates and limits data collection and
possibilities for comparison to other populations. These estimates will vary greatly until a larger population
is surveyed.
Explanation. NAEP assessments for reading and mathematics are not administered annually. The next
assessment in reading and math is scheduled for 2007. American Indian and Alaska Native students were
oversampled for the 2005 NAEP assessments in reading and mathematics to increase the reliability of the
data.
Measure 1.4 of 4: The percentage of American Indian and Alaska Native students in grade eight
who scored at or above basic level in math on NAEP. (Desired direction: increase)
Year    Target    Actual (or date expected)    Status
2000 47 Measure not in place
2002 62 Not Collected Not Collected
2003 64 52 Made Progress From Prior Year
2005 54 53 Made Progress From Prior Year
2007 55 (June 2008) Pending
Measure 1.2 of 4: The percentage of CCAMPIS program participants receiving child care
services who remain in postsecondary education at the end of the academic year as reported in
the 36-month performance report. (Desired direction: increase)
Year    Target    Actual (or date expected)    Status
2002 79 Measure not in place
2004 79.5 74 Made Progress From Prior Year
2005 80 67 Did Not Meet Target
2008 81 (July 2009) Pending
2009 81.5 (July 2010) Pending
2011 82 (July 2012) Pending
Source. U.S. Department of Education, Office of Postsecondary Education, Child Care Access Means
Parents in School Program 36-Month Performance Report.
Frequency of Data Collection. Other
Data Quality. Data are supplied by grantee institutions with no formal verification procedure provided.
Grantees attest to the accuracy of data.
Explanation. This measure has been reformatted to display performance by year without regard to
cohort. Data are collected, per program statute, from 18-month and 36-month performance reports. The
calendar for 36-month data collection means that no data were collected in FY 2006, as there were no
new competitions in 2003 or 2004. Data for FY 2007 will be available in July 2008.
Measure 1.3 of 4: The graduation rate of Child Care Access Means Parents in School program
participants in postsecondary education in other than four-year schools as reported in the 18-