Program Evaluation Training
The ESEA Program Office has partnered with the Regional Educational Laboratory (REL Central) to develop and deliver a series of Program Evaluation Trainings (PET) to support schools and districts in evaluating the impact of programs supported with ESEA (or ESSA) funds. The PET modules are sequential and build upon each other, so following the trainings from Modules 1 through 6 is recommended. However, each module is also designed as a freestanding training for anyone who would like to dive into a specific component of program evaluation.
PE Module Descriptions
Logic Models and Evaluation Questions
On December 20th, 2019, the Office of ESEA recorded Modules 1 and 2 of the Program Evaluation Training.
Module 1 will introduce users to the purpose and elements of logic models and guide users through the development of a logic model useful to their own context. Having a sound logic model is the foundation for completing a program evaluation.
Module 2 will guide users in leveraging their own logic models to inform the creation of evaluation questions. Participants will be introduced to the different types of evaluation questions and the elements of high-quality evaluation questions.
- Recorded Webinar of Modules 1 and 2
- For materials used in Modules 1 and 2, please email Emily Owen
- PowerPoint Presentation of Modules 1 and 2
Assessing the Availability and Quality of Existing Data
On January 10th, 2020, the Office of ESEA recorded Module 3 of the Program Evaluation Training. In Module 3, users will identify appropriate available data and the need for additional data collection to answer their evaluation questions. Participants will then assess the quality of available data and learn about options for collecting additional high-quality data.
- Recorded Webinar of Module 3
- For materials used in Module 3, please email Emily Owen
- PowerPoint Presentation of Module 3
Identifying and Developing Data Collection Instruments
On January 17th, 2020, the Office of ESEA recorded Module 4 of the Program Evaluation Training. Module 4 will guide users through the identification of available quality data collection tools and best practices in the development of additional tools to gather data to answer evaluation questions. In this module, participants will begin development of a data collection instrument.
- Recorded Webinar of Module 4
- For materials used in Module 4, please email Emily Owen
- PowerPoint Presentation of Module 4
Coding, Cleaning, and Analyzing Data
On January 23rd, 2020, the Office of ESEA recorded Module 5 of the Program Evaluation Training. Module 5 will guide users through considerations and strategies for coding, cleaning, and analyzing quantitative and qualitative data.
- Recorded Webinar of Module 5
- For materials used in Module 5, please email Emily Owen
- PowerPoint Presentation of Module 5
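As a simple illustration of the topics Module 5 covers, the sketch below shows one way coding, cleaning, and a basic analysis might look for a small evaluation dataset; the file name and column names are hypothetical and are not drawn from the training materials.

```python
import pandas as pd

# Load raw survey or assessment data collected for the evaluation
# (hypothetical file and columns, for illustration only).
df = pd.read_csv("program_survey_responses.csv")

# Clean: remove duplicate records and rows missing the outcomes of interest.
df = df.drop_duplicates(subset="student_id")
df = df.dropna(subset=["pre_score", "post_score"])

# Code: convert a text response to a numeric scale so it can be analyzed.
agreement_scale = {"Strongly disagree": 1, "Disagree": 2, "Agree": 3, "Strongly agree": 4}
df["engagement_coded"] = df["engagement_rating"].map(agreement_scale)

# Analyze: summarize pre- to post-program change by participation group.
df["score_change"] = df["post_score"] - df["pre_score"]
print(df.groupby("participated")["score_change"].agg(["count", "mean", "std"]))
```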
Interpreting and Presenting Findings
On January 31st, 2020, the Office of ESEA recorded Module 6 of the Program Evaluation Training. Module 6 will guide users through considerations for interpreting and presenting evaluation results. Participants will develop strategies for connecting results to evaluation questions, presenting findings to different audiences, and designing graphics for evaluation reports.
- Recorded Webinar of Module 6
- For materials used in Module 6, please email Emily Owen
- PowerPoint Presentation of Module 6
Program Evaluations and Analyses
The DARE team is responsible for evaluating the effectiveness of ESEA programs, informing the development of applications for funding and standards for program quality, and identifying best practices in program implementation.
Migrant Education Program Evaluation
The Colorado Migrant Education Program (MEP) is a supplemental educational program, authorized by Title I, Part C of the Elementary and Secondary Education Act (ESEA), designed to ensure high-quality comprehensive education for migratory children and minimize potential negative impacts of multiple moves during their educational careers. The program goals include designing and implementing strategies that ensure access to state academic and achievement standards, support successful transition to postsecondary education or the workforce, and help migratory children overcome cultural and language barriers, social isolation, health-related concerns and other factors that can make it difficult for them to continue in school or gain employment. CDE evaluates the impact of the supports and services provided to Colorado migrant children to ensure that program goals and objectives are being met.
Summer Migrant Youth Leadership Institute (SMYLI) Evaluation
The Colorado Migrant Education Program (MEP) has organized and implemented the annual Summer Migrant Youth Leadership Institute (SMYLI) since 2001. The purpose of SMYLI is to motivate and empower 9th-12th grade migrant students to reach their educational goals and increase their potential as school and community leaders. SMYLI provides ten days of workshops and activities that advance social, academic, and leadership skills; increase self-efficacy, confidence, and self-esteem; provide an early college environment experience and impart knowledge and skills that foster college success; and help students navigate the college and scholarship application processes. CDE evaluated the impact of the 2016 SMYLI on participating Colorado migrant youth to ensure that program goals and objectives were met. Evaluation results indicate that students who attended the 2016 SMYLI experienced statistically significant improvements in communication, civic duties and responsibilities, college and career readiness, leadership skills, and life skills.
Tiered Intervention Grants (known nationally as School Improvement Grants)
The purpose of the Tiered Intervention Grant (TIG)/School Improvement Grant (SIG) is to provide supports to districts with schools in the lowest five percent of Title I schools and high schools with graduation rates below 60% (called “priority schools” under federal accountability). Two cohorts of TIG/SIG awardees have completed implementation of the grant. A summative evaluation was conducted to determine the changes in overall performance based on School Performance Frameworks (SPF) and achievement based on percent proficient and advanced in reading and math. Both SPF rating and achievement are part of the criteria used to identify priority schools as eligible for this grant. Therefore, improvement on these performance measures is a goal of the grant, and these measures were analyzed across the implementation years. Additional analyses were conducted to determine changes in reading and math median growth percentiles of the participating schools across the years of grant implementation. Of the schools that implemented the grant for the full three years, 55% met exit criteria [earned one of the state’s two highest SPF ratings (Performance or Improvement) for two consecutive years] and are no longer in priority status. Increases in reading and math achievement and growth were also noted for large percentages of the schools. For example, 89% and 84% of the schools had a higher percent proficient and advanced on reading and math, respectively, at the end of the grant than they did prior to implementation. Nonetheless, some schools continue to struggle. Further qualitative analyses are being conducted to determine what strategies and practices are common across the higher performing schools, or those with the greatest increases, in comparison to the schools that are continuing to struggle after grant implementation.
High Achieving Schools (HAS) Study
Five schools were identified for a comprehensive study of how they are attaining higher academic achievement than other schools in the state for English learners, students with disabilities, students experiencing poverty, and minority students. The school leadership, personnel, and families participated in surveys, focus groups, and interviews to help identify the factors contributing to the schools' success with the identified groups. The following reports summarize the study purpose and methods, overall findings across schools, and findings within each school.
- Executive Summary (PDF)
- Introduction Report - Purpose and Methods of the Study (PDF)
- Overall Findings: Common Practices and Procedures across Schools (PDF)
- Canyon Creek Elementary School (PDF)
- Burlington Elementary School (PDF)
- Soaring Eagles Elementary School (PDF)
- South Lakewood Elementary School (PDF)
- Tavelli Elementary School (PDF)
Connect for Success Grant
The High Achieving Schools (HAS) study findings were used to develop the Connect for Success (CFS) grant, which enables grantees to connect with and learn from the HAS schools and to implement practices and strategies aligned with those identified in the HAS study. The following report summarizes the design of the CFS grant, provides information on the reach of the program, and presents the results of CDE's study of the impact of CFS.
Title I, Part A ~ Schoolwide versus Targeted Assistance Programs
Hierarchical Linear Modeling was used to determine if the growth trajectories of student reading and math performance varied for students who were served in Title I schoolwide schools or targeted assistance schools, compared to students with similar backgrounds who did not receive any Title I services. A follow-up study was done to test the relationship between schools’ Per Pupil Allocation and the Median Growth Percentiles of the schools.
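A minimal sketch of the kind of growth model described above is shown below, using a linear mixed model as a stand-in for the full hierarchical analysis; the file, variable names, and model structure are simplified assumptions and do not reproduce CDE's actual models.

```python
import pandas as pd
import statsmodels.formula.api as smf

# Hypothetical long-format file: one row per student per year, with a scale
# score and the student's Title I service model (schoolwide, targeted
# assistance, or none).
df = pd.read_csv("student_reading_scores.csv")

# Random intercept and slope for year, grouped by student, so each student has
# a growth trajectory; the interaction term tests whether trajectories differ
# by service model. A full analysis would also nest students within schools
# and adjust for student background characteristics.
model = smf.mixedlm(
    "scale_score ~ year * service_model",
    data=df,
    groups=df["student_id"],
    re_formula="~year",
)
result = model.fit()
print(result.summary())
```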
Title I, Part A High Growth Schools (HGS)
Quantitative data was used to identify Title I schools that had attained high growth with their lowest performing students (Catch-Up Students on the Colorado Growth Model). A comprehensive study of each school was conducted to determine areas of strength and strategies that can be shared with other Title I schools. Reports on each school are available on our HGS Study webpage.
- Summary Report of the findings from the study (PDF)
- For more information about the project and findings, visit the HGS Study Webpage.
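A minimal sketch of the kind of screen used in the identification step described above is shown below; the threshold, file name, and column names are illustrative assumptions, not the criteria CDE applied.

```python
import pandas as pd

# Hypothetical school-level file with median growth percentiles for catch-up
# students (the lowest performing students on the Colorado Growth Model).
growth = pd.read_csv("school_growth_percentiles.csv")

# Keep Title I schools whose catch-up students show high median growth;
# the cut point of 65 is illustrative only.
high_growth = growth[
    (growth["title1_school"]) &
    (growth["catch_up_median_growth_percentile"] >= 65)
]
print(high_growth[["school_name", "catch_up_median_growth_percentile"]])
```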
Title I, Part A Effective Summer Schools
Title I funds were awarded on a competitive basis to support summer school programs in the summer of 2010. Because this was a pilot study, one of the grant requirements was submission of program implementation data. Data submitted by grantees and the state assessment data were used to conduct a statewide evaluation of the effectiveness of summer school programs in increasing student performance from the year before to the year after the summer school program implementation. The impact of the program varied by grantee: some experienced greater success in increasing student performance the year following the grant. English Language Learners who participated in summer school programs experienced a greater increase in performance in comparison to their academic and linguistic peers. The more effective math programs implemented, on average, between 19 and 57 hours of math programming. The more effective reading programs had more variability in the average number of hours provided.
While conducting this evaluation, it was noted that many of the more successful grantees had also implemented other federally funded programs (such as Title I, Part A Supplemental Educational Services or Title II, Part B Math and Science Partnership Programs). Therefore, follow-up analyses were conducted to ascertain the combined effects of multiple programs. Schools that had implemented multiple federally funded programs had significantly greater increases in the percentage of points earned on the State’s Performance Framework than schools that did not implement any of the studied programs. Further investigation of these trends has been planned.
Title I, Part A School Improvement Grants
The goal of the School Improvement Grant (SIG) was to target low-performing Title I schools and provide an intensive two-year intervention aimed at improving students’ academic achievement. The OMNI Institute was contracted to conduct an evaluation of the SIGs implemented across several years (2004-2005 through 2007-2008).
- Evaluation Report on School Improvement Grant Process, June 2010, from OMNI Institute (PDF)
- For more information about the project and findings, visit the SIG Webpage.
Title I, Part A Supplemental Educational Services Provider and Statewide Evaluations
Supplemental Educational Services (SES) is a program wherein district Title I funds are set aside to provide tutoring to students at no expense to the student. The tutoring is offered outside of the school day. CDE is required to evaluate the effectiveness of the SES program implemented in Colorado, including evaluating the effectiveness of each provider. In prior years, CDE contracted with an outside evaluation firm to conduct the evaluation.
Each year’s evaluation report, as well as a table that summarizes the longitudinal findings, can be viewed below:
- Longitudinal Summary Report by CDE – 2008-2009 through 2013-2014 (PDF)
- 2013-2014 Evaluation Report by CDE (Content) (PDF)
- 2013-2014 Evaluation Report by CDE (Hours) (PDF)
- 2013-2014 Evaluation Report by CDE (ELD) (PDF)
- 2013-2014 Evaluation Report by CDE (Districts) (PDF)
- 2010-2011 Evaluation Report by Center for Research Strategies (PDF)
- 2009-2010 Evaluation Report by OMNI Institute (PDF)
- 2008-2009 Evaluation Report by OMNI Institute (PDF)
- 2007-2008 Evaluation Report by OMNI Institute (PDF)
Title I, Part D Annual Analyses of Facility Data
This program provides funding to support the education of youth in state-operated institutions and provides assistance to school districts that work with local facilities that serve adjudicated youth. Colorado receives formula funds based on the number of students in state institutions and local facilities. Facility report cards and statewide reports are generated annually to present the academic and vocational performance of students within the facilities.
Title I, Part D Resources
- PowerPoint on Statewide Performance (PDF)
- For more information about the project and findings, visit the Title I, Part D Webpage.
Title I, Part D Forms and Templates
Title II, Part A Effective Professional Development
Title II, Part A is intended to increase student academic achievement by improving teacher and principal quality. This includes increasing the number of Highly Qualified teachers in classrooms, improving the skills of principals and assistant principals in schools, and increasing the effectiveness of teachers and principals. A study was conducted to determine how districts were allocating their IIA funds and the changes in IIA district allocations across years (2003-2004 compared to 2008-2009).
- Title II, Part A Dissemination Report (PDF)
- For more information about the project and findings, visit the Title II, Part A Webpage.
Title II, Part B Statewide Evaluation of Program Impact
Title II-B, Mathematics and Science Partnerships (MSP), is intended to increase the academic achievement of students in math and/or science by enhancing the content knowledge and teaching skills of classroom teachers. This grant provides districts and schools with the opportunity to partner with faculty from the science, technology, engineering, and/or mathematics (STEM) departments in institutions of higher education. CDE evaluates the statewide impact of the partnerships in the year subsequent to funding (evaluation reports are lagged by one year due to the data collection periods and reliance on state assessments, which are not ready for analyses until the year after implementation).
- 2010-2011 MSP Evaluation Report (PDF)
- 2009-2010 MSP Evaluation Report (PDF)
- Program evaluation findings varied by grantee. Several grantees successfully increased teachers’ math and science knowledge, based on pre- to post-test results on measures of teacher content knowledge. Trends of an implementation lag were detected and will continue to be tested in future evaluations.
- For more information about the project and findings, visit the Title II, Part B Webpage.
Title II, Part D Statewide Evaluation (Final Year of Analyses)
The primary goal of this program is to improve student achievement through the use of technology in elementary and secondary schools. CDE evaluated the impact of the program in the final year of funding.
Title III, Part A Analyses of Use of Funds
Title III is designed to improve the education of Limited English Proficient (LEP) students by helping them learn English and meet challenging state academic content and student academic achievement standards.
- Summary of findings from analyses of High Growth Districts’ Use of Funds ~ coming soon
- Summary of findings from the qualitative analyses of the program ~ coming soon
- For more information about the project and findings, visit the Title III, Part A Webpage.
State of the State: English Language Learners
Colorado has had an increasing number of culturally and linguistically diverse learners across the years. The State of the State is updated each year to present data on Colorado English learners, including changes in demographics, linguistic and academic performance, and postsecondary and workforce indicators.
- 2014 State of the State: Colorado English Learners (PDF)
- 2015 State of the State: Colorado English Learners (PDF)
- 2016 State of the State: Colorado English Learners (PDF)
ELPA Excellence Award
The English Language Proficiency Act (ELPA) Excellence Award program is designed to award grants to local education providers and charter schools with evidence-based English language development (ELD) programs that achieve the highest English language and academic growth among English learners and the highest academic achievement for English learners who transition out of the English language proficiency program. ELPA requires the Colorado Department of Education (CDE) to identify and disseminate the practices that have contributed to the success of the Excellence Awardees.
At the conclusion of each school year for which it receives a grant, each district and charter school that receives an ELPA Excellence Award must submit a data analysis and summary of the evidence-based ELD program and an annual financial report of the use of funds received.
For the 2016-17 ELPA Excellence Award, CDE synthesized the practices and procedures common to the awardees and highlighted their exemplary practices in two ELPA Excellence Award Evaluation Reports.
To view reports from additional ELPA Excellence Awardees, please visit the ELPA Excellence Award Webpage.
For Additional Information Contact:
Nazanin Mohajeri-Nelson, Ph.D.
Executive Director
720-626-3895
send an email
Tina Negley, M.A.
720-766-2793
send an email
Anna Rowan
ESEA Program Evaluator and Lead Research Analyst
720-215-7456
send an email