General Teacher Preparation Policy
The state should collect and publicly report key data on the quality of teacher preparation programs. This goal was reorganized in 2017.
Student Growth Data: Oklahoma does not collect or publicly report data that connect student growth to teacher preparation programs.
Additional Program Data: Oklahoma collects other objective, meaningful data to measure the performance of teacher preparation programs, including licensure exam pass rates, completer and principal surveys of program satisfaction, and candidates' results on assessments of reading instruction.
Oklahoma's data-sharing policies are covered in the Oklahoma Department of Education's Data Governance Manual.
Oklahoma Administrative Code 712:10-5-1
OEQA Administrative Rules: 218:10-5
Data Governance Manual: http://sde.ok.gov/sde/sites/ok.gov.sde/files/Data%20Governance%20Program%20Manual%2003112016.pdf
Annual Reports: http://www.ok.gov/oeqa/About_OEQA/Annual_and_State_Reports/index.html
http://www.ok.gov/octp/Educator_Preparation/Accreditation_Accountability/index.html
Collect and report data that connect student growth to teacher preparation programs, when those programs are large enough for the data to be meaningful and reliable.
Oklahoma should use the data it plans to collect on the academic achievement gains of students taught by programs' graduates, averaged over the first three years of teaching, to hold educator preparation programs accountable for the performance of their completers. The state should also report these data publicly. Data aggregated at the institution level (e.g., combining elementary and secondary programs), rather than disaggregated by specific preparation program, have limited utility for accountability and continuous improvement because aggregation can mask significant differences in performance among programs.
Oklahoma indicated that starting in August 2017, educator preparation programs are collecting teacher evaluation data on graduates' first three years of employment in the state. These data on graduates' effectiveness in the classroom will be analyzed and used for continuous program improvement. A memorandum of understanding between the State Department of Education and the Office of Educational Quality and Accountability allows for the transfer of data, including the evaluation data on educator preparation programs' graduates. These data are collected and distributed to educator preparation programs for purposes of continuous improvement only. Any request to use the data for any other purpose must be approved by the Oklahoma Teacher Preparation Data Governance Council.
NCTQ appreciates the response from Oklahoma. However, we were unable to include it in the state analysis because we were unable to find any information to verify its accuracy.
1C: Program Performance Measures
The state should examine a number of factors when measuring the performance of and approving teacher preparation programs. Although the quality of both the subject-matter preparation and professional sequence is crucial, there are also additional measures that can provide the state and the public with meaningful, readily understandable indicators of how well programs are doing when it comes to preparing teachers to be successful in the classroom.
States have made great strides in building data systems with the capacity to provide evidence of teacher performance. These same data systems can be used to link teachers' effectiveness to the preparation programs from which they came. States should make such data, as well as other objective measures that go beyond licensure test pass rates, central components of their teacher preparation program approval processes, and they should establish precise performance standards that are more useful for accountability purposes.
National accrediting bodies, such as CAEP, are raising the bar, but they are no substitute for states' own policy. A number of states now have somewhat more rigorous academic standards for admission by virtue of requiring that programs meet CAEP's accreditation standards. However, whether CAEP will uniformly uphold its standards (especially as it has already backtracked on the GPA requirement) and deny accreditation to programs that fall short of these admission requirements remains to be seen. Clear state policy would eliminate this uncertainty and send an unequivocal message to programs about the state's expectations.