Std 2 Printable PDF File


The unit has an assessment system that collects and analyzes data on the applicant qualifications, the candidate and graduate performance, and unit operations to evaluate and improve the unit and its programs.

How does the unit use its assessment system to improve the performance of candidates and the unit and its programs?

The unit maintains an assessment system that provides regular and comprehensive information that is useful to the whole unit. The assessment system was established based on the unit’s conceptual framework as well as state and professional standards. The unit fully developed and implemented a comprehensive gated system for candidates in both the initial and advanced levels. This Gated Assessment System consists of 11 transitional points (See Unit Continuous Assessment System).

Using primarily LiveText, Excel, and SPSS, the office of the associate dean coordinates the preparation of summary data reports. It summarizes and analyzes aggregated and disaggregated data in tabular and graphical formats that include summary statistics (i.e., means, standard deviations, percentages, quartiles, etc.) and that specify within- and between-group performance for analysis by program faculty, the Assessment Committee, and other unit committees and individuals. Selected examples of summarized and analyzed data are documented as a part of the exhibits: Follow-up Surveys; Student Teaching Observations; Student Evaluation of Cooperating Teachers; Student Evaluation of University Supervisors; College of Education Enrollment Data; and Exit Surveys, among others.

Assessment data are shared with candidates individually as faculty meet with them to provide feedback on their performance and progress through the program. Rubrics and other assessment forms provide both immediate and written feedback to candidates during course activities and field-based practice. Assessment data are also used regularly to improve faculty practice through the regular course evaluation and faculty evaluation processes.

Candidates receive feedback concerning assessment of performance mainly through graded course assignments. Depending on the assessment, if satisfactory performance is not demonstrated, candidates may have the opportunity to repeat the performance. Data from candidates’ field experiences and clinical practice performances are shared several times throughout the student teaching semester through three-way conferences among the candidate, the university supervisor, and the mentor teacher. These data are used to guide candidates in developing the skills and dispositions expected of teacher candidates. Where weaknesses are identified, the evaluator makes a recommendation and encourages the candidate to include the area in his or her Professional Growth Plan (PGP). This information allows us not only to document the quality of our programs but also to consider program changes and to use data to inform our curricular decisions.

In addition to student-level data, the unit collects data about faculty qualifications, academic advising, and surveys from program completers. This information is summarized and analyzed. When appropriate, summarized data are shared with faculty, staff, students, and our P-12 partners through various committees and advisory councils. Faculty and staff evaluation data are used for merit pay, tenure, and promotion.

2.   Please respond to 2a if this is the standard on which the unit is moving to the Target Level

2a. Standard on which the unit is moving to the Target Level

The unit has not selected Standard 2: Assessment System and Unit Evaluation as the standard to move to the Target Level in this cycle.

2b. Continuous Improvement.

Since the last NCATE visit in October 2003, the unit has made consistent progress in refining and expanding the various components of the Gated Assessment System for candidates. Gates 1 through 4 form the assessment benchmarks that candidates in the initial programs must meet. These include, but are not limited to, interviews, writing prompts, GPA, Praxis I and II, course-embedded assessments, observations and evaluations made by cooperating teachers and university supervisors, and e-Portfolios. At the advanced level, the unit has designated Gates 5 through 11 as the pathway to matriculate beyond the initial programs. Gates 5 through 9 are for candidates in the teacher leadership program; Gates 5 through 11 are for candidates in the principal preparation program. The unit has now fully implemented all the components of Gates 7 through 11, as these were only recently established during the redesign of the principal preparation and Teacher Leader M.Ed. programs.

These transition points have been carefully designed to determine admission to initial programs, entry to and exit from clinical experiences, and program completion. The unit regularly evaluates all candidates, initial and advanced, from application for admission through program completion. There are four gates (transition points) for the initial programs at which data are collected and decisions are made relative to candidates and the program.

Each gate has one or more key assessments. Candidates are not able to progress in programs unless they have successfully met assessment standards. Sources of data are multiple, internal and external to the unit, and are collected and analyzed in a systematic manner. At both the initial and the advanced levels, data are collected at various decision points as teacher candidates progress through their respective programs. These data relate to the candidates’ qualifications for admission to the program as well as their performance during the program and following program completion. The unit uses these key assessments to evaluate candidates’ knowledge, skills, and professional behaviors. Data from these assessments are summarized and analyzed by each program area as appropriate. Candidates who are not meeting expectations are generally allowed a second opportunity to complete an assignment or field placement before being removed from programs.

Another significant improvement to the program since the 2003 visit is the refinement and implementation of the gated transition through the advanced programs. Beginning in 2008, the unit, with its school partners, collaboratively redeveloped Gates 5 through 11 to address the redesign efforts in the School Guidance Counselor, Instructional Leader (principal preparation), and Teacher Leader M.Ed. programs and their multiple concentration/endorsement areas. These redesign efforts resulted in an ongoing examination of how candidates in advanced programs are monitored and assessed. The unit is in the process of completing its first year of data collection in the educational administration program using the new gated assessment system. The unit is closely examining the elements of the newly developed transition points, and faculty are engaged in examining critical pieces of information to inform their instruction and provide ongoing assessment for programmatic review. In all cases, the Unit Continuous Assessment System (UCAS) is a comprehensive system for monitoring all candidates’ progress regardless of redesign efforts.

In the last two years, our faculty engaged in the redesign and development of various programs, especially those at the advanced level. During the process, our faculty participated in a series of conversations with teachers and administrators (in Jefferson County Public Schools, Bullitt County Public Schools, Shelby County Public Schools, the Louisville Archdiocese, and Oldham County Public Schools) who are committed to creating a unique partnership in the development, supervision, and assessment of our candidates. Out of those conversations, the unit was able to refine the gated pathway through programs and the various components of the assessment system.

Unit faculty have also worked arduously through the Assessment Committee (a standing committee of the unit) to evaluate unit capacity and the effectiveness and efficacy of the assessment instruments the unit has implemented. On a regular basis, faculty address issues of fairness, accuracy, consistency, and the avoidance and elimination of bias as integral components of the development and implementation of a high-quality assessment system. The unit ensures that assessments are fair, accurate, consistent, and unbiased by communicating procedures to candidates, as well as through systematic review of content validity, the timing and nature of assessments, and independent and collaborative assessments of candidate performance. In addition, many of the unit’s assessments are accompanied by rubrics intended to outline the possible levels of candidate proficiency. The unit continues to engage in systematic reviews of assessment rubrics to ensure that they are optimally calibrated to provide accuracy in performance measures. Further, the use of standardized tests, especially during the admission process, plays an important role in ensuring candidates are assessed in as unbiased a manner as possible. With support from the professional community, the unit refines the assessment system so that it reflects both the conceptual framework and the candidate proficiencies outlined in professional and state standards.

Since the 2003 visit, faculty have continued conversations and reviews of assessment tools to assure fairness in assessment. This ensures that candidates have been exposed to the knowledge, skill, and disposition proficiencies that are identified in course syllabi and evaluated in key assessments. Candidates are also apprised of the timing of assessments, instructions for completing assessments, performance expectations on assessments, how assessments are scored, and how they count toward program completion. In addition, all course syllabi in the unit are required to include information for candidates with disabilities to ensure fairness in assessments for those with documented disabilities. All candidates, irrespective of program, are provided with sufficient opportunity to demonstrate knowledge, skills, and dispositions in their clinical practice, practica, and internships, and in e-Portfolio development and presentations. Candidates in both initial and advanced programs are provided feedback through multiple sources. During student teaching and internships (practica), candidates are observed at least four times per semester. Analysis of field evaluations suggests a consistent pattern of candidate ratings across supervisor and mentor teacher evaluations. Multiple evaluators also assess portfolio presentations and come to consensus on the final portfolio score at both the initial and advanced program levels.

Unit faculty’s awareness of the need to ensure accuracy in assessment has increased, and they continue to collaborate in assuring that key assessments are of the appropriate type and content. To ensure such accuracy, unit faculty work collaboratively across programs to see that assessments are aligned with the knowledge, skills, and dispositions drawn from the conceptual framework. These competencies are specified in course syllabi.

Faculty are engaged in an ongoing effort to ensure consistency in assessment by assuring that key assessments produce dependable results that remain constant over repeated trials. Consistency in assessment is assured (1) as faculty and the Assessment Committee review data longitudinally to ensure consistency in scoring; (2) by providing training for new raters of e-Portfolios using scoring rubrics; and (3) by conducting studies of inter-rater reliability for instruments where rubrics are used for scoring. To ensure the reliability and validity of assessment instruments, faculty design assessment instruments and rubrics and score rubrics collaboratively. Adjunct faculty are oriented to and trained to use all gate and program assessments. Additionally, cooperating teachers have been oriented to student teaching assessments and trained to assess candidate content knowledge, pedagogical knowledge and skills, and assessment knowledge and skills.

The unit ensures that assessments are free of bias by addressing contextual distractions and problems with key assessment instruments that introduce sources of bias that can adversely influence candidate performance. All University classrooms are well maintained to ensure adequate lighting, a quiet learning and assessment environment, comfortable seating, temperature control, and functional technologies used for assessment purposes. Assessment instruments, including critical tasks and scoring rubrics in LiveText, are carefully designed, worded, monitored, and reproduced to ensure clarity of instructions for completing assessments and ease of reading assessment questions and items.

The evaluation process of the assessment system continues to be faculty driven, with formal evaluation a standing agenda item at the faculty annual retreats. Where changes are mandated by regulating agencies, including EPSB, with regard to Praxis codes and scores, the assessment system is reviewed and refined as warranted. Other reviews and evaluations may be triggered by the assessment process itself, based on data. For example, following an analysis of data collected by cooperating teachers using the Student Teacher Observation form, which was previously designed with a three-point scale, unit faculty determined that a four-point scale would provide better calibration. Revisions were made to develop and implement a four-point scale. The office of the associate dean serves as a clearinghouse for collecting, organizing, aggregating, and disaggregating assessment data for Assessment Committee review.

The use of multiple assessments has supported the unit in documenting candidates’ levels of proficiency in demonstrating the knowledge, skills, and professional dispositions for helping all students learn. Each of our programs, initial and advanced, emphasizes the importance of multiple measures of assessment and of using them to inform instructional decisions focused on differentiated instruction, assuring that all teacher candidates leave Spalding University well prepared to use these strategies to positively impact student achievement. To support our teacher candidates in improving their teaching skills and becoming reflective practitioners, we require them to maintain a daily journal, particularly during student teaching and internships, in which they reflect on their teaching, insights, feelings, perceptions, and impressions of their teaching delivery and lesson content.

A review of the Student Teacher Observation Instrument data completed by university supervisors during clinical practice site visits includes the following selected anecdotal comments:

The students respond so positively to Mrs.________. They participate…by asking questions and making comments on the topic… she gets students back on track! They listen intently to her and want to do well!

Another comment from the collected data stated, “Becky collaborated with her social studies teacher to develop the lesson.” An excerpt from another university supervisor affirmed: “Guides students to the correct answer to help them understand what they are doing.” Another supervisor stated:

…appropriate instructional strategies used. Provides opportunities for students to understand lesson content from different perspectives. Redirected students a few times to address misconceptions/clarified directions. …Variety of formative and summative assessments have been used to determine students’ progress and measure students’ achievement.

Additional comments such as these are available as exhibits.

Unit Use of Technologies

During the 2003 visit, the Board of Examiners team noted an area for improvement in the “use appropriate information technologies to maintain its assessment system” (BOE Report, 2003, p. 44). The unit has since adopted and fully implemented various information technologies. These include LiveText, DataTel, SPSS, and the use of a secure network drive.

a.  LiveText: Spalding University’s current use of LiveText emphasizes candidates’ field experiences and e-Portfolios. In its use of the LiveText system, the unit has the utility and flexibility to assess initial and advanced candidate performance and unit operations through contemporary information technologies. The system accommodates change and revision, programmatically and systemically, based upon planned and purposeful feedback from multiple constituents.

The unit plans to increase the use of LiveText in the next two semesters to include the collection of data in follow-up studies. The LiveText coordinator, the director of field experiences and clinical practice, and the university’s office of institutional effectiveness have been instrumental in obtaining candidate data for LiveText and other data analysis procedures since the last visit. In the spring 2009 semester, the unit migrated from “flat” portfolios (three-ring-binder portfolios) and introduced e-Portfolios as the culminating project for the initial and teacher leader programs. Required entries vary by class and may include artifacts such as reflections, personal educational aims and goals, summaries and critiques of resources and activities, lesson plans, classroom management plans, and course examinations that address pedagogical content knowledge. In implementing LiveText, the unit has budgeted to provide free LiveText accounts to all candidates in all programs. The unit is steadily implementing online collection of data. The use of online collection and assessment instruments allows the unit to ensure the accuracy of data by eliminating the possibility of scanning and data-entry errors. Beginning in spring 2009, the unit began trial online submission of other assessment instrument data.

b. DataTel: This is an integrated student management system and one of the two independent data management systems that the unit utilizes. Access to the administrative database is available to program administrators and to all faculty and staff within and outside the unit (those outside the unit input data in the offices of admissions and registration). In addition, candidates have limited access to the system and may use it for course registration and for viewing academic records, including course grades and GPA, as well as their business records. Each user entity has varying levels of access and usage.

c. SPSS: The unit utilizes the Statistical Package for the Social Sciences (SPSS) software extensively in aggregating and analyzing candidate and unit data. In fall 2010, the unit upgraded the SPSS software from version 17 to version 19. The University has made SPSS software licenses available to all faculty, and the software is installed on several faculty desktops, including a dedicated workstation used by graduate assistants.

d. Unit-wide Secure Network Drive: Each member of the unit’s faculty has access to a unit-wide secure network drive (referred to as “the vault”). From their individual computers on campus, faculty can connect over the network to the College of Education’s folder on the vault, a network drive established for the unit where information, documents, and other collaborative projects are shared and saved. All faculty members in the unit have access to aggregated and disaggregated data stored on the drive; these data are meaningfully organized for ease of storage and retrieval. The shared drive has encouraged transparency, timely sharing of aggregated and disaggregated data, and collegiality among unit faculty.

3.  Exhibit Links