CAEP

This document is written to include the original self-study (text in black) and, for each standard, the relevant CAEP offsite Formative Feedback Report (FFR) narrative (text in blue). COE's response to the FFR is incorporated throughout and at the end of each standard (text in red).

The following link takes the reader to the FFR comments along with the EPP's response.

Click here to see the EPP's response to the FFR

Standard 1: Content and Pedagogical Knowledge

The provider ensures that candidates develop a deep understanding of the critical concepts and principles of their discipline and, by completion, are able to use discipline-specific practices flexibly to advance the learning of all students toward attainment of college- and career-readiness standards.

SPECIALIZED PROFESSIONAL ASSOCIATION (SPA) REPORTS

Content and pedagogical knowledge are demonstrated through a variety of means; the first specific artifact chosen to demonstrate candidate success in Standard 1 is the College's SPA reports. (Providers ensure that completers apply content and pedagogical knowledge as reflected in outcome assessments in response to standards of Specialized Professional Associations.) The Associations determine their specific standards and quality of evidence, and those standards fall within the larger accreditation framework. Like other providers, the College is striving for National Recognition in all areas.

The two sources of evidence used to demonstrate that the unit is meeting Standard 1 are results from the 2013-2014 Specialized Professional Association (SPA) reports (1A) and Portfolio Summary Data (1B).

SPA REPORTS

Table 1A summarizes SPA results as of February 2014. All of the SPA submittals, save one, have been nationally recognized, although at varying levels of endorsement. Because the College has experienced considerable flux, one of the programs simply did not have the history, or a sufficient number of years' worth of data, to provide a good overview of its progress. The UCCSTeach program, which served as the basis for the NCTM report, is only three and a half years old and had only one program completer in 2013: a candidate who began training prior to the implementation of UCCSTeach. The first cohort of students completing the entire UCCSTeach sequence will finish in May 2014. While the report was well written and collaboratively produced, it simply lacked the longevity necessary to establish a track record, so it did not earn recognition. Any program that was not recognized, or that was recognized with probation, will resubmit during the 2014-2015 academic year. In consultation with the Council for Exceptional Children (CEC), a Special Education SPA report was not submitted because the old program had been completely revamped and had no program completers at the time of submission.

In reviewing the SPA reports, it is evident that the quality of the assessments and collection processes is acceptable, but there is work to be done in aligning assessments with specific SPA standards, creating robust rubrics that measure what they are designed to assess, and reporting data at a more granular, better-defined level. While the areas needing improvement and the degree of compliance vary across programs, the broad themes of alignment, measurement, and data relevance remain. More detailed information, including timelines, is presented in the Continuous Improvement Plan. SPA results are symptomatic of the College's past response to assessment needs: programs have been tightly siloed, and data were difficult to access because reports had to be requested through a limited number of gatekeepers. Initiatives currently being implemented include the migration of student data to the College's new Office of Assessment and Accreditation; the newly implemented ability to share standards, portfolios, and rubrics on Blackboard; and a two-day retreat in June 2014 for all educator preparation faculty to discuss common terminology, practices, assessments, candidate expectations, and related issues. The College runs four good but disparate educator preparation programs. The SPA reports remind us that we need to create a more cohesive, unified, transparent approach by reaching across programs and departments.

Table 1A: 2013-2014 SPA Report

PORTFOLIO SUMMARY DATA

Portfolios were chosen as the second artifact of evidence for Standard 1 because, other than the CACREP-accredited counseling programs, all initial preparation and advanced programs use the portfolio as a mechanism to collect and assess student achievement. Portfolios have been a standard instrument for this purpose in educator preparation programs across the nation for at least twenty years and have provided important feedback to programs on the quality of their candidates and the effectiveness of their programs. While program requirements vary, constants include clearly defined candidate expectations, processes, and standards alignments for portfolio submissions. As at other universities, COE faculty determine what is collected, how and by whom it is scored, and how the results are used. Programs have cross-walked course content, assignments, and assessments to state and professional standards and have built rubrics to ensure candidates are meeting expectations.

The quality of the portfolios is determined by a number of means. First, portfolios contain multiple elements, providing candidates the opportunity to demonstrate their effectiveness across several measures, some of which are highly prescriptive and others of which allow the flexibility and freedom to interpret assignments according to candidate preference. Second, the portfolios are assessed by multiple scorers, providing better reliability. Third, candidates are given multiple opportunities to achieve mastery on several of the elements. The tables reflect high composite scores, which are interpreted as evidence that candidates have demonstrated the knowledge, skills, or dispositions needed to be successful.

Programs establish their own thresholds for candidate effectiveness, based on standards and multiple measures. Summary data (1B) are included for three programs: Teacher Education Licensure, Special Education, and Principal Licensure. In looking at the TELP data, the reader will notice that for two years (2012 and 2013) data are not included for Standards 7 and 8; this is because the standards in use changed in 2012. Note also that Standard 6 includes a score reported as the pass rate on the Teacher Work Sample for both Elementary and Secondary majors for 2012 and 2013. Elementary majors generally score between 4.13 and 4.72 on a five-point scale; Secondary majors score between 4.47 and 4.86.

Special Education reported candidate scores as a mean across all of the standards. Scores range from 3.74 to 3.91 on a four-point scale over the 2011-2013 timeframe.

The Principal Licensure program uses a three-point scale, and scores reflect a range of 2.70 to 3.00. One of the revelations gained from the SPA reports, CAEP preparation, and the Assessment and Accreditation Committee's work is that there are variations in scales, standards, processes, artifacts, and the like. While some of that variation is necessary because of unique programmatic requirements, the College could be more consistent and coordinated in portfolio design, required artifacts, submission processes, scoring, and data feedback. Conversations among teacher education faculty in June will address many of those issues, and the College's movement to edTPA, CLASS, and the Student Perception Survey will help build uniformity and allow better cross-program comparisons of data.

Departments use portfolio results to analyze individual and aggregate candidate success in order to inform program effectiveness and drive change. Lessons learned from portfolios that are driving change include UCCSTeach's realization that candidates struggle with CTQS 5a: Teachers demonstrate leadership in their schools. In response to this concern, the program has modified its Apprentice Teaching seminar by creating a Professional Learning Community-type environment and assigning candidates responsibility for leading the discussion. The PhD program, in analyzing portfolio results, determined that candidates were completing only a minimal qualitative component of their portfolios due to the timing of LEAD 7150: Applications of Qualitative Research. In response, the faculty reformulated the course to be split across the spring semester, allowing candidates the time and opportunity to complete the IRB approval process and collect data. The TELP program noted that its candidates needed to do better in demonstrating knowledge and understanding of their specific content-area standards (Social Studies, English, and the Elementary Education areas of Art, P.E., etc.), so that expectation was added to lesson requirements and the Teacher Work Sample. Based on portfolio analysis, the Department of Special Education redesigned its math requirements, added more clearly defined literacy lesson requirements, and added a new component to student teaching requiring candidates to incorporate assistive technology and augmentative and alternative communication into their lesson plans.

Standard 1: Summary

Three years ago the College began a focused, strategic plan for creating a pervasive culture of assessment. The Assessment and Accreditation Committee was enlarged and strengthened through the addition of the department chairs and other strategic members. The Committee assumed a more holistic approach to assessment and has created a series of documents noting what information is collected across the College and what reports are required by various accreditation and regulatory agencies, as well as a college-wide assessment process.

Quite honestly, the SPA results were a wake-up call to the College. We learned that the existing data system was not the most convenient means of housing and reporting data; that some programs could have done a better job of aligning to standards; that assignments and assessments did not always line up with the curriculum, or the perceived curriculum; and that the faculty chosen to write the SPA reports needed more support in preparing their submissions.

Through portfolios involving multiple measures, candidates demonstrate content and pedagogical knowledge aligned to standards. This internally validated measure addresses both the art and the science of the profession. The College has a significant foundation in using the feedback loop with candidates, and programs are systematically strengthening their use of aggregate data to improve their effectiveness. One of the problems uncovered through the SPA results applies to portfolios as well: the existing data systems have not been user friendly in reporting the data they house. However, faculty are putting new systems in place to correct that problem. An analysis of the data input process, along with the addition of new, externally validated measures (edTPA, SPS, CLASS), will substantially strengthen the portfolio process and the validity of portfolios as evidence of candidate achievement and as predictors of candidates' success in their careers.

The leadership of the department chairs, the work of the Assessment and Accreditation Committee, faculty initiatives related to SPA and CACREP reports, and the support of the Assessment and Operations Specialist have combined to provide the momentum necessary to ensure the robust, iterative, and relevant assessment system currently being designed and implemented. Scheduled conversations with educator preparation faculty within COE and with faculty from the College of Letters, Arts, and Sciences will create a deeper understanding of, and agreement on, what candidates know and are able to do to be effective educators.

Standard 1

SPS Grades 3-5 Instrument

SPS Grades 6-12 Instrument

SPS Technical Report

Administrator Licensure Portfolio Rubric

Principal Licensure Portfolio Rubric
