Teacher Preparation Program Accountability:
South Carolina

Delivering Well Prepared Teachers Policy


The state's approval process for teacher preparation programs should hold programs accountable for the quality of the teachers they produce.

Meets goal in part
Suggested Citation:
National Council on Teacher Quality. (2011). Teacher Preparation Program Accountability: South Carolina results. State Teacher Policy Database. [Data set].
Retrieved from: https://www.nctq.org/yearbook/state/SC-Teacher-Preparation-Program-Accountability--6

Analysis of South Carolina's policies

South Carolina's approval process for its traditional and alternate route teacher preparation programs does not hold programs accountable for the quality of the teachers they produce.

Most importantly, South Carolina does not collect value-added data that connect student achievement gains to teacher preparation programs.

The state does rely on some other objective, meaningful data to measure the performance of its traditional teacher preparation programs. The state collects results from new teacher performance evaluations (ADEPT), consolidates them by institution, and then uses the information to inform decisions on the creation, continuation and elimination of programs. South Carolina requires at least a 95 percent pass rate for its ADEPT evaluation results. However, these data are not collected for the state's alternate route program.

South Carolina also collects programs' annual summary licensure test pass rates (80 percent of program completers must pass their licensure exams). Regrettably, the 80 percent pass-rate standard, while common among many states, sets the bar quite low and is not a meaningful measure of program performance. 

Finally, the state posts "Fact Sheets" on its website that include Praxis II and ADEPT pass rates for each institution.


Recommendations for South Carolina

Collect data that connect student achievement gains to teacher preparation programs.
To ensure that programs are producing effective classroom teachers, South Carolina should consider academic achievement gains of students taught by the programs' graduates, averaged over the first three years of teaching.

Gather other meaningful data that reflect program performance.
In addition to knowing whether programs are producing effective teachers, other objective, meaningful data can also indicate whether programs are appropriately screening applicants and if they are delivering essential academic and professional knowledge. Building on the data the state currently collects for its traditional teacher preparation programs, South Carolina should gather data for all teacher preparation programs, such as the following: average raw scores of graduates on licensing tests, including basic skills, subject matter and professional knowledge tests; satisfaction ratings by school principals and teacher supervisors of programs' student teachers, using a standardized form to permit program comparison; evaluation results from the first and/or second year of teaching; and five-year retention rates of graduates in the teaching profession.

Establish the minimum standard of performance for each category of data.
Programs should be held accountable for meeting these standards, with articulated consequences for failing to do so, including loss of program approval after appropriate due process. 

Publish an annual report card on the state's website for all teacher preparation programs.
South Carolina is commended for including "Fact Sheets" on its website with data for each institution. The state should also include information on its alternate route in order to provide the public with meaningful, readily understandable indicators of how well all of its programs are doing.

State response to our analysis

South Carolina recognized the factual accuracy of this analysis. The state added that it is piloting the Higher Education Assessment of Teaching (Project HEAT), which provides value-added assessment (VAA) data for Clemson University, University of South Carolina and Winthrop University using results from participating TAP schools.

South Carolina pointed out that it is transforming its ADEPT evaluation system to include multiple measures of teacher effectiveness, including VAA, whenever possible. Its goal is for all educator preparation programs to receive VAA data for all graduates.

The state also noted that ADEPT data are provided to alternate route preparation programs.

Research rationale

For discussion of teacher preparation program approval, see Andrew Rotherham's chapter "Back to the Future: The History and Politics of State Teacher Licensure and Certification," in A Qualified Teacher in Every Classroom (Harvard Education Press, 2004).

For evidence of the weakness of state efforts to hold teacher preparation programs accountable, see data on programs identified as low-performing in the U.S. Department of Education's Secretary's Seventh Annual Report on Teacher Quality (2010).

For additional discussion and research on how teacher education programs can add value to their teachers, see NCTQ, Tomorrow's Teachers: Evaluating Education Schools, available at http://www.nctq.org/p/edschools.

For a discussion of the lack of evidence that national accreditation status enhances teacher preparation programs' effectiveness, see D. Ballou and M. Podgursky, "Teacher Training and Licensure: A Layman's Guide," in Better Teachers, Better Schools, ed. Marci Kanstoroom and Chester E. Finn, Jr. (Washington, D.C.: Thomas B. Fordham Foundation, 1999), 45-47. See also No Common Denominator: The Preparation of Elementary Teachers in Mathematics by America's Education Schools (NCTQ, 2008) and What Education Schools Aren't Teaching About Reading and What Elementary Teachers Aren't Learning (NCTQ, 2006).

See NCTQ, Alternative Certification Isn't Alternative (2007) regarding the dearth of accountability data that states require of alternate route programs.