21st Century Community Learning Centers
Overview of the
21st CCLC Annual Performance Data:
2014–2015
U.S. Department of Education
Office of Elementary and Secondary Education
21st Century Community Learning Centers
Sylvia Lyles, PhD
Program Director, Office of Academic Improvement
This report was prepared for the U.S. Department of Education under contract number ED-ESE-14-C-0120. The contracting officer representative is Daryn Hedlund of the Office of Academic Improvement.
This report is in the public domain. Authorization to reproduce it in whole or in part is granted. While permission to reprint this publication is not necessary, the suggested citation is as follows:
U.S. Department of Education. (2016). 21st Century Community Learning Centers (21st CCLC) overview of the 21st CCLC performance data: 2014–2015 (11th report). Washington, DC.
Contents
Tables
INTRODUCTION
SECTION 1: GPRA RESULTS
A. GPRA Measures #1-3: Percentage of Improvement in Mathematics Grades
B. GPRA Measures #4-6: Percentage of Improvement in English Grades
C. GPRA Measures #7-8: Percentage of Improvement on Reading and Mathematics State Assessments
D. GPRA Measures #9-11: Percentage of Improvement on Homework Completion and Class Participation
E. GPRA Measures #12-14: Percentage of Improvement in Student Behavior
SECTION 2: GRANTEE AND CENTER CHARACTERISTICS
A. Center Type
B. People Served
C. Activity Participation
D. Staffing Type
E. Attendees Served per Demographic
F. Estimated Per-Student Expenditures
CONCLUSION
Tables
Table 1. The GPRA Outcomes for all 54 States/Territories
Table 2. Regular Attendees % Improved in Mathematics Grades
Table 3. Regular Attendees % Improved in English Grades
Table 4. Regular Attendees % Improved on Reading/Mathematics State Assessments
Table 5. Regular Attendees % Improved Homework Completion/Class Participation
Table 6. Regular Attendees % Improved Student Behavior
Table 7. Grantees’ Centers Broken Down by Organization Type
Table 8. Attendees Served Based on Type
Table 9. Total Attendees by Center Type
Table 10. Regular Attendees by Center Type
Table 11. Times per Week of Each Activity Offered
Table 12. Frequency of Each Activity Offered
Table 13. Times per Week of Each Academic Activity Offered
Table 14. Frequency of Each Academic Activity Offered
Table 15. Staffing Type per Paid and Volunteer Staff
Table 16. Participant Demographics
Table 17. Number of Participants per Grade Level
Table 18. Estimated Expenditure per Regular Attendee and All Attendees
EXECUTIVE SUMMARY
Originally created in 1994 through the Elementary and Secondary Education Act and expanded in 2001 through No Child Left Behind (NCLB), the 21st Century Community Learning Centers (21st CCLC) program provides students in high-need, high-poverty communities the opportunity to participate in afterschool programming. Present in all 50 States, the District of Columbia, and three Territories (Bureau of Indian Education, Virgin Islands, and Puerto Rico), the program offers academic enrichment and youth development activities designed to enhance participants’ well-being and academic success. For the 2014–2015 school year, the United States (US) Department of Education funded 11,512 centers under the 21st CCLC program.
In this Annual Performance Report (APR), data from the 21APR Data Collection System were analyzed to report on the Government Performance and Results Act (GPRA) performance indicators associated with the 21st CCLC program. These metrics assist the federal government in determining the progress of the 21st CCLC program against its statutory requirements. The APR has historically been completed by grantees once a year to summarize the operational elements of their programs, the student population served, and the extent to which students improved in specific areas. Thirty States reported data on student improvement in mathematics and English grades across all grade levels, while an additional six and seven States, respectively, reported data for only some grade levels. Eighteen States/Territories did not report data on mathematics grades, and 17 States/Territories did not report data on English grades.
Based on the available data, the key findings from this year’s APR are:
- During SY 2014–15, more than 1.8 million people were served by this program:
- academic year total student attendees (n = 1,405,722), including regular[1] student attendees (n = 752,008)
- summer attendees (n = 279,314), and
- adults/family members (n = 183,461).
- Overall, there was a nearly even split between male (49.2%, n = 687,464) and female (48.2%, n = 673,800) attendees.
- In terms of race/ethnicity, most attendees (84.5%) were identified as Hispanic or Latino (35.9%, n = 504,661), White (27.8%, n = 391,422), or Black or African American (20.8%, n = 292,260).
- 48.0% of regular attendees improved their mathematics grades.
- 48.5% of regular attendees improved their English grades.
- 28.4% of elementary regular attendees improved from not proficient to proficient or above on state reading assessments, and 22.6% of middle/high school regular attendees did so on state mathematics assessments.
- Teachers reported improvement in homework completion and class participation for 65.2% of regular attendees.
- Teachers reported improvement in behavior for 56.8% of participants.
The data and performance results indicate that this broad-reaching program touches students’ lives in ways that will have a far-reaching impact.
INTRODUCTION
Originally created in 1994 through the Elementary and Secondary Education Act and expanded in 2001 through No Child Left Behind (NCLB), the 21st Century Community Learning Centers (21st CCLC) program provides students in high-need, high-poverty communities the opportunity to participate in afterschool programming. Present in all 50 States, the District of Columbia, and three Territories, the program offers academic enrichment and youth development activities designed to enhance participants’ well-being and academic success. For the 2014–2015 school year, the United States (US) Department of Education funded 11,512 centers under the 21st CCLC program.
In this Annual Performance Report (APR), data from the 21APR Data Collection System were analyzed to report on the Government Performance and Results Act (GPRA) performance indicators associated with the 21st CCLC program. These metrics, which are further described in Section 1, are the primary way the federal government determines the success and progress of the 21st CCLC program against its statutory requirements. The APR has historically been completed by grantees once a year to summarize the operational elements of their programs, the student population served, and the extent to which students improved in academic-related behaviors and achievement.
This year, the data show that the majority of funded centers were operated by school districts, followed by community-based organizations. In the past year, the 21st CCLC program served a total of more than 1.8 million people and was staffed by 115,000 paid staff members and 31,319 volunteers. The majority of the paid staff were school-day teachers, and most of the volunteers were reported to be community members and college students.
The following report highlights the methodological approach to data analysis before turning to the results of the GPRA analysis. The report concludes with a demographic analysis of students and staff that provides context for the GPRA results and presents a holistic picture of the 21st CCLC program.
Methodology:
There are several key changes in this data collection system designed specifically to increase the validity of the overall data. Most significantly, the vast majority of questions asked relate directly to participation demographics or the GPRA indicators, which reduces the data entry burden. Likewise, data are collected for each term of the program, with a cumulative academic-year total also collected in the spring. The cumulative total collected in the spring term serves as a proxy for the academic year.
Another significant change involves the calculation of the GPRA measures. In previous reports, the total number of participants was used as the denominator when determining the percentage of improvement on state assessments and state grades. The new system asks States to report the total number of participants as well as the total number of students who needed to improve (e.g., were failing); the system uses the number of students who needed to improve to calculate the percentage of improvement. This provides a more accurate representation of performance against the GPRA measures. All GPRA calculations were made using the data entered into the 21APR system by the States, which ties attendance and outcomes together, reducing duplicative data entry and improving accuracy.
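To make the denominator change concrete, the brief sketch below works through a hypothetical State's figures; the counts are illustrative only and do not come from the 21APR data.

```python
# Hypothetical illustration of the revised GPRA calculation described above.
# None of these figures come from the 21APR data; they exist only to show
# the change in denominator.

total_regular_attendees = 200       # all regular attendees a State reported
students_needing_improvement = 80   # e.g., attendees who were failing in the fall
students_who_improved = 40          # attendees whose grades improved by spring

# Previous approach: percentage based on all participants.
old_rate = students_who_improved / total_regular_attendees * 100        # 20.0%

# Current approach: percentage based on students who needed to improve.
new_rate = students_who_improved / students_needing_improvement * 100   # 50.0%

print(f"Old denominator: {old_rate:.1f}%   New denominator: {new_rate:.1f}%")
```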
Data for the participating 54 States/Territories were entered by each State and certified by the State Education Agency (SEA) for the 21st CCLC program. The MySQL database was queried and exported to SPSS (via Excel) and then analyzed using descriptive statistics (frequencies, percentages, and averages) and reported in tabular format. As validity checks, the data were run independently by two statisticians. A third researcher, who had not previously worked with the data, conducted a final internal consistency check.
To provide a whole-program understanding of the data, an aggregate statistic for each of the items analyzed is provided. Descriptive statistics throughout the report are calculated only for the States/Territories that provided data on the given measure. For example, if only 46 of the 54 States/Territories provided data on staffing, then the percentages are based only on the data obtained from those 46. Incorporating missing data from the other eight into the statistical analysis would skew the findings and make them inaccurate. Using only reported data preserves the statistical integrity of the reported results. This change from previous reporting also provides a more accurate representation of performance against the GPRA measures at the national level.
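As a minimal sketch of this approach, the example below uses hypothetical staffing counts and bases percentages only on the States that reported a value, excluding missing entries from the denominator.

```python
# Minimal sketch: aggregate only over reporting States/Territories.
# The counts are hypothetical; None marks a State that did not report.

staffing_counts = {
    "State A": 1200,
    "State B": None,   # did not report staffing data
    "State C": 800,
}

# Keep only States that actually reported the measure.
reported = {state: n for state, n in staffing_counts.items() if n is not None}

total = sum(reported.values())                        # 2000, based on 2 of 3 States
shares = {state: n / total * 100 for state, n in reported.items()}
print(shares)                                         # {'State A': 60.0, 'State C': 40.0}
```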
SECTION 1: GPRA RESULTS
In addition to collecting information on the operational characteristics of 21st CCLC programs, a primary purpose of the system is to collect data that inform the GPRA indicators established for the program. It is important to note that not all States report data for each GPRA measure. States may choose to report performance drawn from grades, state assessments, and/or teacher-reported student behavior, and individual GPRA measures draw on these different sources. The GPRA indicators are the primary means by which the US Department of Education measures the effectiveness and efficiency of the program against the following two overall goals:
- Participants in 21st Century Community Learning Center programs will demonstrate educational and social benefits and exhibit positive behavioral changes.
- 21st Century Community Learning Centers will develop afterschool activities and educational opportunities that consider the best practices identified through research findings and other data that lead to high-quality enrichment opportunities that positively affect student outcomes.
Data for each GPRA measure are provided at the end of the academic school year and presented in tabular and summary form below (Sections A-E). Any methodological considerations are noted following each GPRA table. A summary of the findings for each GPRA measure is presented in Table 1.
Table 1. The GPRA Outcomes for all 54 States/Territories[2]
Program GPRA Measures / 2014-2015
1. The percentage of elementary 21st Century regular program participants whose mathematics grades improved from fall to spring. / 49.7%
2. The percentage of middle/high school 21st Century regular program participants whose mathematics grades improved from fall to spring. / 45.4%
3. The percentage of all 21st Century regular program participants whose mathematics grades improved from fall to spring. / 48.0%
4. The percentage of elementary 21st Century regular program participants whose English grades improved from fall to spring. / 49.6%
5. The percentage of middle/high school 21st Century regular program participants whose English grades improved from fall to spring. / 46.9%
6. The percentage of all 21st Century regular program participants whose English grades improved from fall to spring. / 48.5%
7. The percentage of elementary 21st Century regular program participants who improve from not proficient to proficient or above in reading on state assessments. / 28.4%
8. The percentage of middle/high school 21st Century regular program participants who improve from not proficient to proficient or above in mathematics on state assessments. / 22.6%
9. The percentage of elementary 21st Century regular program participants with teacher-reported improvement in homework completion and class participation. / 66.2%
10. The percentage of middle/high school 21st Century program participants with teacher-reported improvement in homework completion and class participation. / 63.1%
11. The percentage of all 21st Century regular program participants with teacher-reported improvement in homework completion and class participation. / 65.2%
12. The percentage of elementary 21st Century participants with teacher-reported improvements in student behavior. / 57.5%
13. The percentage of middle/high school 21st Century participants with teacher-reported improvements in student behavior. / 55.3%
14. The percentage of all 21st Century participants with teacher-reported improvements in student behavior. / 56.8%
A. GPRA Measures #1-3: Percentage of Improvement in Mathematics Grades
- 36 out of 54 States (66.7%) reported a percentage of improvement in mathematics grades (23 more States reported data than in the previous year, a 42.6% increase).
- Overall, States reported the following % improvement: 49.7% Elementary, 45.4% Middle/High School, and 48.0% for all students (13.0%, 9.4%, and 11.4% improvement from the previous year respectively).
Table 2. Regular Attendees[3] % Improved in Mathematics Grades
State/Territory / Mathematics Elementary % Improved / Mathematics Middle/High School % Improved / Mathematics All Students % Improved
Overall / 49.7% / 45.4% / 48.0%
1. Alabama / 0.0 / 0.0 / 0.0
2. Alaska / 0.0 / 0.0 / 0.0
3. Arizona / 60.5 / 56.9 / 59.3
4. Arkansas / 0.0 / 77.3 / 77.3
5. Bureau of Indian Affairs / 0.0 / 0.0 / 0.0
6. California / 44.4 / 49.1 / 47.6
7. Colorado / 0.0 / 0.0 / 0.0
8. Connecticut / 0.0 / 0.0 / 0.0
9. Delaware / 72.1 / 86.3 / 79.3
10. District of Columbia / 74.9 / 61.7 / 69.6
11. Florida / 64.8 / 70.1 / 66.4
12. Georgia / 0.0 / 0.0 / 0.0
13. Hawaii / 61.9 / 41.4 / 51.0
14. Idaho / 0.0 / 0.0 / 0.0
15. Illinois / 60.8 / 60.6 / 60.7
16. Indiana / 0.0 / 0.0 / 0.0
17. Iowa / 75.0 / 42.6 / 48.4
18. Kansas / 87.7 / 0.0 / 86.7
19. Kentucky / 53.9 / 53.1 / 53.6
20. Louisiana / 73.5 / 67.2 / 71.6
21. Maine / 0.0 / 0.0 / 0.0
22. Maryland / 58.5 / 63.0 / 60.2
23. Massachusetts / 0.0 / 0.0 / 0.0
24. Michigan / 55.8 / 44.5 / 50.5
25. Minnesota / 0.0 / 18.3 / 18.3
26. Mississippi / 60.0 / 49.2 / 55.7
27. Missouri / 32.1 / 33.7 / 32.6
28. Montana / 0.0 / 0.0 / 0.0
29. Nebraska / 0.0 / 0.0 / 0.0
30. Nevada / 35.2 / 35.9 / 35.3
31. New Hampshire / 0.0 / 0.0 / 0.0
32. New Jersey / 76.3 / 72.6 / 74.9
33. New Mexico / 0.0 / 0.0 / 0.0
34. New York / 54.8 / 44.6 / 48.2
35. North Carolina / 13.9 / 6.0 / 9.6
36. North Dakota / 0.0 / 100.0 / 100.0
37. Ohio / 56.4 / 61.0 / 58.4
38. Oklahoma / 0.0 / 0.0 / 0.0
39. Oregon / 70.1 / 0.0 / 70.1
40. Pennsylvania / 45.6 / 41.7 / 43.2
41. Puerto Rico / 58.9 / 60.6 / 59.5
42. Rhode Island / 0.0 / 0.0 / 0.0
43. South Carolina / 42.2 / 77.8 / 44.4
44. South Dakota / 76.1 / 0.0 / 76.1
45. Tennessee / 69.7 / 67.9 / 69.0
46. Texas / 26.6 / 25.7 / 26.2
47. Utah / 71.4 / 73.2 / 71.8
48. Vermont / 0.0 / 0.0 / 0.0
49. Virgin Islands / 61.3 / 75.9 / 64.7
50. Virginia / 68.2 / 66.9 / 67.6
51. Washington / 61.1 / 26.7 / 55.3
52. West Virginia / 79.0 / 72.0 / 75.6
53. Wisconsin / 59.6 / 66.7 / 59.7
54. Wyoming / 0.0 / 0.0 / 0.0
Note: Raw scores were used to calculate the overall % improvement. When calculating the “Overall” % improvement, the total numbers of regular attendees with reported APR results were used in the calculations across all States/Territories that reported on this measure. Zeroes do not necessarily reflect delinquency in reporting or a lack of improvement; States elect to report on grades, state assessments, and/or teacher-reported student behavior. Therefore, zeroes in this table may reflect that a State is not reporting on the outcome represented.
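As the note indicates, the “Overall” row pools raw counts of regular attendees across reporting States rather than averaging State-level percentages. The sketch below, using hypothetical counts, shows how the two approaches can differ.

```python
# Hypothetical illustration of the pooled "Overall" calculation.
# Each tuple is (attendees who improved, attendees with reported APR results).

state_results = [
    (450, 900),   # State A: 50.0% improved
    (30, 40),     # State B: 75.0% improved
]

improved = sum(i for i, _ in state_results)          # 480
reported = sum(r for _, r in state_results)          # 940

# Pooling raw counts, per the note above.
pooled_rate = improved / reported * 100              # 51.1%

# Unweighted mean of the State rates, for contrast.
mean_of_rates = sum(i / r for i, r in state_results) / len(state_results) * 100  # 62.5%

print(f"Pooled: {pooled_rate:.1f}%   Unweighted mean of State rates: {mean_of_rates:.1f}%")
```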
B. GPRA Measures #4-6: Percentage of Improvement in English Grades
- 37 out of 54 States (68.5%) reported a percentage of improvement in English grades (24 more States reported data than in the previous year, a 44.4% increase).
- Overall, States reported the following % improvement: 49.6% Elementary, 46.9% Middle/High School, and 48.5% for all students (12.9%, 9.6%, and 11.7% improvement from the previous year respectively).
Table 3. Regular Attendees % Improved in English Grades
State/Territory / English Elementary % Improved / English Middle/High School % Improved / English All Students % Improved
Overall / 49.6% / 46.9% / 48.5%
1. Alabama / 0.0 / 0.0 / 0.0
2. Alaska / 0.0 / 0.0 / 0.0
3. Arizona / 67.8 / 58.3 / 56.6
4. Arkansas / 0.0 / 50.0 / 50.0
5. Bureau of Indian Affairs / 0.0 / 0.0 / 0.0
6. California / 69.0 / 52.7 / 60.2
7. Colorado / 0.0 / 0.0 / 0.0
8. Connecticut / 0.0 / 0.0 / 0.0
9. Delaware / 70.5 / 81.0 / 75.3
10. District of Columbia / 76.1 / 69.6 / 73.6
11. Florida / 66.0 / 71.8 / 67.8
12. Georgia / 0.0 / 0.0 / 0.0
13. Hawaii / 56.4 / 36.4 / 46.2
14. Idaho / 0.0 / 0.0 / 0.0
15. Illinois / 56.1 / 83.3 / 63.0
16. Indiana / 0.0 / 0.0 / 0.0
17. Iowa / 66.1 / 28.7 / 32.1
18. Kansas / 77.7 / 0.0 / 77.6
19. Kentucky / 55.7 / 54.6 / 55.3
20. Louisiana / 74.0 / 67.1 / 72.0
21. Maine / 0.0 / 0.0 / 0.0
22. Maryland / 57.6 / 67.1 / 61.3
23. Massachusetts / 0.0 / 0.0 / 0.0
24. Michigan / 54.0 / 46.5 / 50.9
25. Minnesota / 0.0 / 17.3 / 17.3
26. Mississippi / 58.8 / 49.6 / 54.9
27. Missouri / 36.4 / 36.2 / 36.3
28. Montana / 0.0 / 0.0 / 0.0
29. Nebraska / 0.0 / 0.0 / 0.0
30. Nevada / 30.2 / 33.5 / 30.9
31. New Hampshire / 0.0 / 0.0 / 0.0
32. New Jersey / 75.0 / 75.1 / 75.0
33. New Mexico / 0.0 / 0.0 / 50.2
34. New York / 56.1 / 46.9 / 50.2
35. North Carolina / 10.2 / 9.9 / 10.1
36. North Dakota / 0.0 / 100.0 / 100.0
37. Ohio / 55.9 / 58.3 / 56.9
38. Oklahoma / 0.0 / 0.0 / 0.0
39. Oregon / 96.8 / 0.0 / 96.8
40. Pennsylvania / 46.4 / 42.4 / 44.0
41. Puerto Rico / 59.7 / 65.8 / 61.6
42. Rhode Island / 0.0 / 0.0 / 0.0
43. South Carolina / 33.0 / 0.0 / 33.3
44. South Dakota / 63.3 / 60.0 / 62.8
45. Tennessee / 71.5 / 68.1 / 70.3
46. Texas / 21.6 / 33.8 / 26.6
47. Utah / 77.3 / 73.7 / 76.6
48. Vermont / 0.0 / 0.0 / 0.0
49. Virgin Islands / 55.4 / 74.6 / 59.8
50. Virginia / 66.7 / 68.9 / 67.7
51. Washington / 49.7 / 38.2 / 47.5
52. West Virginia / 78.2 / 67.6 / 73.2
53. Wisconsin / 65.1 / 40.0 / 64.5
54. Wyoming / 0.0 / 0.0 / 0.0
Note: Raw scores were used to calculate the overall % improvement. When calculating the “Overall” % improvement, the total numbers of regular attendees with reported APR results were used in the calculations across all States/Territories. Zeroes do not necessarily reflect delinquency in reporting or a lack of improvement; States elect to report on grades, state assessments, and/or teacher-reported student behavior. Therefore, zeroes in this table may reflect that a State is not reporting on the outcome represented.
C. GPRA Measures #7-8: Percentage of Improvement on Reading and Mathematics State Assessments
- 36 out of 54 States/Territories (66.7%) reported a percentage of improvement from not proficient to proficient or above on the Elementary reading state assessment (16 more States/Territories reported data than in the previous year, a 48.2% increase).
- 34 out of 54 States/Territories (63.0%) reported a percentage of improvement from not proficient to proficient or above on the Middle/High School mathematics state assessment (14 more States reported data than in the previous year, a 44.5% increase).
- Overall, the States/Territories reported the following % improvement: 28.4% Elementary Reading and 22.6% Middle/High School Mathematics Assessment (23.0% and 10.0% improvement from the previous year respectively).
Table 4. Regular Attendees % Improved on Reading/Mathematics State Assessments
State/Territory / Reading Elementary % Improved / Mathematics Middle/High School % Improved
Overall / 28.4% / 22.6%
1. Alabama / 0.0 / 0.0
2. Alaska / 0.0 / 0.0
3. Arizona / 14.7 / 33.7
4. Arkansas / 34.6 / 41.5
5. Bureau of Indian Affairs / 0.0 / 0.0
6. California / 0.0 / 0.0
7. Colorado / 0.0 / 0.0
8. Connecticut / 0.0 / 0.0
9. Delaware / 38.1 / 49.0
10. District of Columbia / 38.0 / 52.4
11. Florida / 87.2 / 57.8
12. Georgia / 6.5 / 19.6
13. Hawaii / 0.0 / 0.0
14. Idaho / 6.7 / 0.9
15. Illinois / 0.6 / 3.4
16. Indiana / 0.0 / 0.0
17. Iowa / 27.7 / 28.0
18. Kansas / 66.7 / 50.0
19. Kentucky / 0.0 / 0.0
20. Louisiana / 69.2 / 60.6
21. Maine / 0.0 / 0.0
22. Maryland / 9.7 / 42.9
23. Massachusetts / 21.2 / 12.4
24. Michigan / 0.0 / 0.0
25. Minnesota / 0.0 / 0.0
26. Mississippi / 34.9 / 33.1
27. Missouri / 0.0 / 0.0
28. Montana / 0.0 / 0.0
29. Nebraska / 0.0 / 0.0
30. Nevada / 0.0 / 0.0
31. New Hampshire / 0.0 / 0.0
32. New Jersey / 48.6 / 62.4
33. New Mexico / 0.0 / 0.0
34. New York / 16.2 / 13.4
35. North Carolina / 0.0 / 0.0
36. North Dakota / 2.0 / 0.0
37. Ohio / 54.7 / 28.0
38. Oklahoma / 30.8 / 29.0
39. Oregon / 0.7 / 0.0
40. Pennsylvania / 26.1 / 29.0
41. Puerto Rico / 49.4 / 27.3
42. Rhode Island / 0.0 / 0.0
43. South Carolina / 16.0 / 1.3
44. South Dakota / 10.6 / 1.6
45. Tennessee / 37.2 / 40.6
46. Texas / 41.3 / 31.3
47. Utah / 21.4 / 25.0
48. Vermont / 25.4 / 27.3
49. Virgin Islands / 0.0 / 0.0
50. Virginia / 49.2 / 55.1
51. Washington / 6.7 / 1.6
52. West Virginia / 68.0 / 77.8
53. Wisconsin / 56.9 / 0.0
54. Wyoming / 69.3 / 50.5
Note: Raw scores were used to calculate the overall % improvement. When calculating the “Overall” % improvement, the total numbers of regular attendees with reported APR results were used in the calculations across all States/Territories. Zeroes do not necessarily reflect delinquency in reporting or a lack of improvement; States elect to report on grades, state assessments, and/or teacher-reported student behavior. Therefore, zeroes in this table may reflect that a State is not reporting on the outcome represented.