CAEP

This document is written to include the original self-study (text in black), and for each standard the relevant CAEP offsite Formative Feedback Review (FFR) narrative (text in blue). The COE's response to the FFR is incorporated throughout and at the end of each standard (text in red).


The following link takes the reader to the FFR comments along with the EPP's response.

Click here to see the EPP's response to the FFR

Click here to see the EPP's new CIP Matrix

PAST PRACTICES

Assessment in the College was relatively dynamic during the period between accreditation visits. In the past, accreditation was largely a function of an external consultant hired prior to the site visit. This practice, coupled with a number of changes in leadership, not surprisingly yielded a 2007 NCATE visit in which the College was cited for deficiencies in assessment. A number of faculty worked to improve assessments for the focused visit that occurred the following year, but the changes were confined to the programmatic level. The ongoing assessments that were conducted were largely compliance driven: Title II, the Professional Education Data System (PEDS), the Colorado Department of Education (CDE), and the Specialty Professional Associations (SPAs). The identification of assessment as an area needing improvement, and the College's understanding that better assessments and a unit-wide assessment plan are needed to move forward, are the primary factors that led to Assessment being the area of focus for the Continuous Improvement Plan.

Currently the College is undergoing a number of important transitions related to assessment. The most important is a movement from compliance-driven assessment with little long-term vision to an internally directed, program-improvement framework. The second shift is a movement from NCATE to CAEP, which brings not only new uses of assessment data but also new assessments and audiences. The third change is a shift from the College's use of internally validated assessments to more nationally and externally validated measures. Understandably, the College is working to adjust to these differences and integrate substantial cultural and practical shifts.

Departments often functioned as autonomous units with limited integration of assessment, data sharing, or concrete steps toward achieving shared goals. This culture has begun to visibly shift under a newly revitalized Assessment and Accreditation Committee supported by improved business practices, dedicated personnel and program support, and increased availability of data and new tools. Among these important steps are the creation of an integrated database (Cherwell) capable of tracking students from inquiry to the workplace, and the addition of an Assessment and Operations Specialist supporting data flow and assessment needs. Finally, the University has begun to realize some of the benefits of the multi-million-dollar, system-wide student record system (ISIS), which is capable of supporting advanced business analytics.

As a result of this evolution, the Assessment and Accreditation Committee and College as a whole have begun to identify key needs in order to support this change. Among these needs are regular reports and new assessment tools in order to answer vital questions in a more timely and transparent manner. More importantly, they have begun to address broader questions of candidate achievement and program improvement.

These needs were highlighted by the convergence of two key events over the past 14 months. The first was the SPA reporting process. As the first step toward national program recognition and CAEP accreditation, the majority of programs submitted SPA reports. Because of the multitude of loosely integrated data streams and a limited assessment framework, the work proved exceptionally taxing. The difficulty in producing strong SPA reports highlighted the need for better tools, planning, and reporting to departments and program leaders. Nearly simultaneously, the University of Colorado Board of Regents initiated a program prioritization process based largely on the work of Dickeson (2010). As with the SPA process, the campus struggled to provide the required data and was not positioned to provide the program-specific metrics needed to benefit fully from the prioritization process. Although the College was able to provide program-specific data, doing so was a cumbersome and time-consuming process.

CURRENT STATE

Concurrent with this shift in how data are collected and utilized are a number of key drivers of change. The most important is Colorado's entry into the use of value-added measures (VAM) tied to teacher effectiveness. While the legitimacy of tying student growth data to individual teachers continues to be researched, the State's data collection will provide critical data to Educator Preparation Programs (EPPs). An outgrowth of Senate Bill 191, this system not only requires districts to use measures of student growth for teacher retention, it also links these data back to EPPs for the first time. The data, when fully available, will allow EPPs to track the workforce participation and results of program completers longitudinally in a systematic way never previously possible. With this increase in data quality and availability will surely come increased scrutiny of EPPs to account for differences in workforce outcomes. As in other states, implementation with the Colorado Department of Education is expected to take multiple iterations before it is complete.

The wealth of information provided through this framework is hard to overstate. These data will help inform the College about teachers' impact on student growth. Unlike other measures such as employer surveys, portfolios, or alumni surveys, the data returned to the College will provide a view of teachers as their districts see them. They will shift conversations between the College and school district superintendents and human resource directors toward a more outcome-centric view. Given the diversity of districts served by the College, from high-risk urban to exceptionally small rural, the College can better address the needs of regional employers. The data will serve as a lever toward measuring and improving program efficacy.

The second key driver is the rapid growth of both UCCS and the College. A number of new programs are entering the assessment system, often bringing new perspectives on the role and nature of assessment. These include: UCCSTeach, a UTeach replication site, with its first program completer in fall 2013; a new, cross-disciplinary Bachelor of Innovation in Inclusive Early Childhood Education (BI:IECE) started in fall 2013; a proposed Bachelor of Arts in Inclusive Elementary Education to begin in 2015; and school-district-level cohorts in principal preparation and educational leadership begun in 2013. All of these have leaders eager for feedback on how to improve and grow their programs. As a result, there is a renewed and pressing need for candidate- and course-level data to inform questions about programs, as well as an infrastructure to support unprecedented levels of growth.

Another driver of change was the series of college-level conversations that emerged over the past three years. Their impact is seen in departmental meetings, cross-department discussions, and the creation of new programs. The lessons from the SPA and program prioritization processes also contributed to this change. The combination of experiences highlighted the demand for quality, timely data and the need to build the requirements of accreditation into the regular operations of programs and departments. The first step in this process was identifying regular reports to inform practice. In conjunction with this need, the College has taken tangible steps to automate processes. The combination of the revitalized Assessment and Accreditation Committee and new data streams has laid the foundation for departments to work collaboratively in strategic investigations for regular program improvement.

A final critical component in the move toward an integrated assessment framework is the reconstitution of the College Advisory Board (CAB). The CAB has members from a number of stakeholder groups beyond the College, including district and community representatives.

As a result, CAB provides an authentic audience for the work of the assessment committee in addition to serving as a source of feedback. Fully integrating the CAB into the assessment framework represents a key component in the shift from predominantly internally-generated assessments with largely internal audiences to a variety of internal and external assessments reported to a larger audience.

FUTURE STATE

In order to accomplish the critical changes needed to move assessment and accreditation initiatives forward and institutionalize practices, the College has already taken a number of concrete steps. The first is the regular reporting of data on key internal measures. These data will provide new and more complete final measures of program impact and will serve as a springboard for more detailed conversations and investigations into candidate quality and content knowledge. Because of the large number of programs that utilize a cohort model, the College is able to link course-taking patterns, year of completion, and instructor to cohorts as well as individual students. Thus, if a program makes a change within a given year, it is reasonably easy to trace the effects through cohorts.

The biggest changes involve incorporating new, externally validated measures into programs. Educator preparation will be implementing a number of new instruments in the near future. These include:

  • Student Perception Survey (SPS) - jointly developed by the Gates Foundation, Tripod Project, and Colorado Legacy Foundation. The instrument is directly aligned to the Colorado Teacher Quality Standards and measures multiple domains of student perceptions including learning environment and classroom management.
  • edTPA - a performance-based assessment of teaching developed at Stanford to predict candidates' classroom readiness. COE educators will also score candidates' work so that their scores can be compared with those of nationally certified examiners. UCCSTeach, TELP, and ALP are piloting the use of edTPA in 2014.
  • The Classroom Assessment Scoring System (CLASS) - developed by the University of Virginia's Curry School of Education. The domains for the instrument target the craft of teaching and will be administered multiple times to measure candidate growth. Both Special Education and UCCSTeach are currently piloting CLASS.
  • The Novice Teacher Project - a collaborative statewide effort led by the University of Denver and piloted by a number of institutions under the study title The Novice Teacher Core Competencies: Impacts on Student Achievement and Effectiveness of Teacher Preparation. The participating EPPs have received data from the 2013 administration of the survey and anticipate second-year data in summer 2014.

Internally developed measures will remain an important part of the College's assessment framework. The College already utilizes an exit survey for program completers. Drawing on workforce data from SB 191, the College is building a number of new instruments, including a post-employment survey to be administered annually for the first five years after a completer exits a program and an employer survey to be administered annually. The post-employment survey is being jointly developed with the Office of Alumni Relations to provide feedback for the University's strategic plan without creating redundant surveys.

Paramount among the internally developed measures are the key assignments linked to SPA standards in each course throughout the College. Departments and programs are understandably at different places culturally and technically with this process. Leadership and Special Education are front-runners, having already implemented initial rounds of key assignments. These assignments are aligned to the ELCC and CEC standards as well as the appropriate Colorado standards. To facilitate this process, the key assignments and rubrics are incorporated into the Blackboard platform faculty already use. This also allows for automated reporting and makes it easy for faculty members teaching the same course to share rubrics. In the case of TELP and ALP, assignments are linked to the Colorado standards in both Blackboard and Task Stream.

Reflecting the collaborative nature of educator preparation, the College has initiated an annual dialogue with faculty in the College of Letters, Arts and Sciences (LAS) who teach in the licensure content areas. This dialogue is designed to serve a number of functions, including disseminating SPA requirements regarding content knowledge; establishing a framework for sharing internally and externally collected data about candidates and completers; and identifying areas for improved articulation between the two Colleges.

The College has utilized a modified version of the 'Plan/Do/Study/Act' cycle outlined in the CAEP Evidence Guide for introducing its major College-wide initiatives. These include a multitude of departmental projects such as key assignment reporting, new courses, online offerings, and the implementation and adoption of external measures such as CLASS.

CONCEPTUAL DIAGRAM OF THE ASSESSMENT FRAMEWORK

Figure 1 is a visual representation of the assessment and accreditation framework for the College. It shows the flow of aggregated data from the College's support personnel to departments. Departments, and programs within departments, use the data internally, to generate an annual report to the Assessment and Accreditation Committee, and externally for reporting purposes such as SPA reports. The annual report is framed around the CAEP, CACREP, and SPA standards and is presented to the Assessment and Accreditation Committee at its February meeting. The Assessment and Accreditation Committee provides feedback to departments, and the Associate Dean aggregates the reports into an annual report for the College. This report is presented to the CAB and Dean at their April meeting.

The Dean and Associate Dean are integral to this process at a number of junctures, providing feedback to the departments on the Colorado, CAEP, CACREP, and SPA standards as well as the College's own initiatives such as the mission/vision and candidate learning outcomes. The Associate Dean is also central in providing feedback on data quality and collection to support personnel throughout the process. The values and mission of the College are infused throughout all of this work, from program design to courses and assessments, and ensure that innovation, equity, and inquiry are woven throughout programs. They are also considered when determining that program completers are competent representatives of the College of Education as well as their chosen professions. Among these internal measures are specific visions regarding equity, diversity, and social justice.
CIP Figure 1
