NCATE Standard 2 Report


STANDARD 2. ASSESSMENT SYSTEM AND UNIT EVALUATION

2a. Assessment System

2a.1. How does the Unit ensure that the assessment system collects information on candidate proficiencies outlined in the Unit's conceptual framework, state standards, and professional standards?

The Unit’s conceptual framework (Exhibit B1.8.1 Conceptual Framework Tree Graphic and Strands) is aligned with state and professional standards (Exhibit B1.9.1 Alignment of State, Professional and Institutional Standards) and serves as the foundation for the Unit Assessment System (UAS). The signature assessments are aligned with conceptual framework outcomes (Exhibit B1.9.2 Conceptual Framework KSD Alignment with Signature Assessments). This alignment ensures that the assessment system collects information on candidate proficiencies as articulated in the conceptual framework, state standards, and professional standards for both initial and advanced programs. It also keeps data collection efficient and focused, which strengthens the Unit’s culture of data-driven decision making. The UAS is a blueprint for fostering a continuous cycle of planning, implementation, and evaluation that ultimately documents that the Unit produces knowledgeable, skilled, compassionate educators and other school professionals.

The UAS is a centralized system that comprehensively assesses the Unit’s operations, the quality of its initial and advanced programs, the performance of its candidates, and the professional competencies of its graduates. A new dean was appointed in fall 2009. As part of the orientation to the Unit and GSU, the newly appointed dean conducted an informal, internal review of the Unit’s operations, including the assessment system. As a result, the UAS was refined to include 1) a more clearly defined approval process for implementing data-driven decisions, and 2) Unit-wide adoption of TaskStream, with implementation in spring 2010.

TaskStream is not new to the Unit. When the State terminated its use of PASSPORT, the Unit adopted and implemented TaskStream as the electronic portfolio system for two degree programs: the Doctor of Education in Curriculum and Instruction and the Doctor of Education in Educational Leadership. These programs are attached to the Louisiana Education Consortium (LEC); GSU is one of three institutions in this consortium. Through its portfolio system, TaskStream enables the Unit to collect candidate data, provide faster feedback to candidates, and communicate with candidates after graduation. Systematic and periodic checks are performed every semester to ensure that candidates submit the required assessments (e.g., signature assessments, surveys, portal reviews) and that faculty and supervisors evaluate them on time. A calendar of assessment events is widely distributed to ensure the meaningful participation of those who can best help ensure that data are collected (and used) to determine candidates’ proficiencies as articulated in the conceptual framework, state standards, and respective professional standards.

2a.2. What are the key assessments used by the Unit and its programs to monitor and make decisions about candidate performance at transition points such as those listed in Table 6? Please complete Table 6.

Exhibit 2a.2.1 Table 6 Unit Assessment System Transition Points  

2a.3. How is the Unit assessment system evaluated? Who is involved and how?

The UAS is a comprehensive, systematic, standards-based assessment system, and Unit evaluation is carried out through collaboration. The initial assessment system was designed to be implemented over a series of years, from fall 2002 to spring 2005. Stakeholders included members of the PK-16+ Council, the Professional Education Council, the College of Education Administrative Council, and departmental faculty from the College of Education and the College of Arts and Sciences. The initial Unit assessments have remained relatively unchanged except for changes in Louisiana PRAXIS scores or changes due to State redesign efforts. The initial program assessments, however, have changed and continue to change as members of the stakeholder groups review and revise assessments (see Exhibit 2a.3.1 Table of Current Program Assessments; Exhibit 2a.3.2 Table of Program Stakeholder Committee Members). Stakeholders have met in individual work teams to review the data derived from the system, to determine its applicability for measuring candidate performance, and to determine its usefulness in informing decisions about candidates and Unit functions. In addition, workshops and retreats were held to enable work teams to collaborate on assessments and to allow groups to collaborate across programs (Exhibit 2c.2.10 Assessment Retreat 2009; Exhibit 2c.2.11 COE Newsletter 2008; Exhibit 2c.2.12 Retreats October 2008 Agenda and Minutes; Exhibit 2c.2.13 Assessment Work Session April 25, 2008). SPA work teams collaborated with the Unit Assessment Committee (UAC) through the UAC chairs.

The UAC oversees assessment within the Unit. The assessment coordinator coordinates assessment practices across the Unit’s programs: the coordinator 1) analyzes data from the various areas of the UAS, 2) disseminates data to program chairs and department heads, and 3) performs systematic, routine checks to ensure that candidates submit required assessments and that faculty and supervisors evaluate them on time. Proposed changes may be initiated at the program, department, or committee level and flow through the UAC. The UAC reviews proposed changes and notifies the dean. The dean forwards changes to the COE Administrative Council (AC), which forwards them to the PK-16+ Council for review and recommendations. Those recommendations are considered in the final action taken by the COE AC. The UAS transition points were expanded in AY 2006-2007 to include portal transition points for the Louisiana Education Consortium programs, but upon recent review by the UAC the transition points were collapsed in an effort to streamline and more efficiently depict the UAS. This change was recommended by the UAC (Exhibit 2a.3.4 Minutes of Assessment Committee, see November 10, 2009), reviewed by the PK-16+ Council (Exhibit 2a.3.10 Minutes of PK-16+ Council and PEC Meetings and Agenda, see February 23, 2010), and approved by the COE Administrative Council (Exhibit 2a.3.6 Minutes of COE Administrative Council).

2a.4. How does the Unit ensure that its assessment procedures are fair, accurate, consistent, and free of bias?

The Unit and its programs take multiple steps to ensure procedures are fair, accurate, consistent and free from bias:

  • Candidates at both initial and advanced levels are informed of program requirements at the time of program admission and during advisement each semester (Exhibit B1.2.14 GSU Academic Advisement Manual), and these requirements are detailed in Exhibit B1.2.1 GSU General Catalog, available online, and in program handbooks (Exhibit 2a.4.1 Initial and Advanced Program Handbooks).
  • Course syllabi have been standardized, and course performance objectives and rubrics are aligned with the conceptual framework and with state and professional standards. Program faculty provide candidates with course syllabi and rubrics at the beginning of each semester. Candidates also have due process procedures at the GSU, Unit, and program levels; an appeals process for candidates is stipulated in the GSU General Catalog. Faculty also provide assessment accommodations for candidates registered with the Student Intervention Resource Center.
  • The Unit uses multiple measures at each transition point (see Exhibit 2a.2.1 Table 6 Unit Assessment System Transition Points). Program faculty review assessments to ensure they are free of racial and ethnic stereotypes, poorly conceived language and task situations, and other forms of cultural bias that could unintentionally favor one candidate over another or affect candidate performance. Discussions between supervising faculty and cooperating teachers at the start of each semester of clinical practice address issues of fairness, accuracy, consistency, and avoidance of bias. The diversity of the Unit’s faculty also helps guard against bias.
  • Standardized test scores on the ACT, SAT, GRE, PRAXIS I and II, and LaTAAP provide the Unit with consistent, reliable, nationally validated data on candidate performance for use in comparative analyses and in verifying candidates’ mastery of content. The signature assessment for student teachers and interns (ED 455 Student Teacher/Intern Evaluation) is the previously validated Louisiana Components of Effective Teaching instrument used by state evaluators.
  • When possible, multiple raters are used and data are triangulated to ensure validity and reliability. For example, several assessments at the initial level are panel-reviewed by members internal and external to GSU (ED 455 Electronic Portfolio, MUS 411 Juried Panel Recital, ART 422 Senior Exhibition), and advanced candidates’ comprehensive exams are scored by multiple raters.
  • Content validity has been a major focus of the redesign process over the past five years. Prior to approval, each course was examined by a team representing different disciplines that scrutinized both objectives and assessments for alignment with professional standards. Evaluators were brought in from outside Louisiana to provide a broader perspective in the state review process.

2a.5. What assessments and evaluation measures are used to manage and improve the operations and programs of the Unit?

The Unit maintains a plan for data collection, analysis, and review (see Exhibit 2a.5.1 Data Collection, Analysis, Review Plan) that describes how data are used. Assessment data, both internal and external, are collected at multiple points using multiple assessments, and are regularly compiled, summarized, analyzed, and used. For example, candidate data are used by programs to make decisions regarding candidate admission, matriculation, and program completion. Program assessments are used internally to measure program quality and to manage and improve Unit operations and programs. SPA program reports are external evaluations used to strengthen the overall performance of the Unit and to ensure that graduates have the knowledge, skills, and dispositions to meet program standards; SPA program approval itself reflects Unit and operations quality. Employer surveys are used to ascertain candidate proficiencies in the workplace as well as Unit and operations quality, and follow-up surveys provide additional data for improving Unit operations (Exhibit 2a.5.2 Follow-up Survey; Exhibit 2a.5.3 Employer Survey).

Candidates complete course and instructor evaluations (Exhibit B1.2.16 GSU Course and Instructor Evaluations). Results are shared with faculty members to improve the teaching and learning environment, are used by department chairs during annual faculty evaluations, and serve as an indicator of Unit and program operations quality.

Faculty submit the Annual Faculty Report (Exhibit B1.2.17 Annual Faculty Report). Faculty evaluations by department chairs are conducted annually (Exhibit B1.2.18 Faculty Performance Evaluation Form), and feedback is used to improve faculty productivity and to assist faculty in meeting tenure and promotion goals. These data also provide evidence of Unit and program operations quality. Tenure-track faculty are evaluated for tenure and promotion on criteria and procedures established in Exhibit B1.2.11 GSU Faculty Handbook. Faculty are also evaluated by peers using Exhibit B1.2.19 Faculty Peer Evaluation.

GSU supervisors and cooperating teachers are evaluated, and the data are used to make future assignments and as an indicator of Unit and program operations quality (Exhibit B1.2.4 OPLE Student Teaching Handbook). Three evaluations are completed at the end of each semester: 1) the student teaching candidate’s evaluation of the GSU supervisor and cooperating teacher, 2) the cooperating teacher’s evaluation of the GSU supervisor, and 3) the GSU supervisor’s evaluation of the cooperating teacher.

The annual Departmental Goals and Objectives Form (Exhibit B1.2.20 Administrative and Academic Support Units Forms) guides the planning and operations of each department and serves as an indicator of Unit and program operations quality. Each fall, departmental faculty set goals, objectives, strategies, and performance measures for the upcoming fiscal year and evaluate performance measures from the previous year.

2a.6. (Optional Upload for Online IR) Tables, figures, and a list of links to key exhibits related to the Unit's assessment system may be attached here. [Because BOE members should be able to access many exhibits electronically, a limited number of attachments (0-5) should be uploaded.]

2b. Data Collection, Analysis, and Evaluation

2b.1. What are the processes and timelines used by the Unit to collect, compile, aggregate, summarize, and analyze data on candidate performance, Unit operations, and program quality?

Historically, assessment data have been collected in the Unit, but the UAS implemented in the 2002-2003 academic year provided structure and improved the process. The UAS included an Action Matrix delineating the data collection, analysis, summarization, and dissemination processes, as well as timelines and detailed information on the who, what, and when of each process (see Exhibit 2b.1.1 COE Assessment Action Matrix). A Unit Assessment Committee (UAC) self-study recently revealed that while data were systematically collected and analyzed, the dissemination and review processes were not as systematic. This prompted revisions to two parts of the UAS: 1) the UAS transition points (Exhibit 2a.2.1 Table 6 Unit Assessment System Transition Points) and 2) the UAS data collection, analysis, and review plan (Exhibit 2a.5.1 Data Collection, Analysis, Review Plan).

At the time of the 2003 NCATE visit, the Unit used PASSPORT, an electronic portfolio system, as a tool to support data collection, aggregation, and disaggregation. When PASSPORT changed ownership, the Unit had to decide whether to remain with PASSPORT or switch systems. In 2006, the Louisiana Education Consortium (LEC) Board decided to adopt TaskStream as its new electronic portfolio system (see Exhibit 2a.3.11 Minutes of LEC Board Meetings). Since GSU, Louisiana Tech University, and the University of Louisiana at Monroe comprise the three-university consortium offering two doctoral degrees (the Doctor of Education in Curriculum and Instruction and the Doctor of Education in Educational Leadership), Grambling switched to TaskStream along with the other two universities. ULM took the lead in establishing a TaskStream LEC assessment structure and populated the system with assessments specific to the LEC and its courses, which are taught at each of the three universities. Faculty at all three institutions were trained; in 2007, two sessions were conducted to train the Unit’s faculty on how to use TaskStream.

In summer 2007, however, the Unit’s assessment coordinator resigned, and GSU began a search for a new coordinator. As time passed, the doctoral programs remained the only programs using TaskStream. Although the college experienced challenges in finding an assessment coordinator, faculty and staff across all programs and the Unit continued to follow the UAS schedule in the Action Matrix and continued to individually collect, analyze, and summarize data from applicants, candidates, graduates, employers, cooperating teachers, GSU supervisors, faculty, and national testing services. Data were compiled and shared using MS Excel, SPSS, and MS Word and presented in tables or graphs, depending on how the data were best represented. However, data collection became decentralized across multiple locations and multiple formats, which grew cumbersome for Unit faculty and administration.
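For illustration only, the following minimal sketch shows the kind of consolidation this decentralization required. The folder path, file layout, and column names (program, semester, score) are hypothetical assumptions, not the Unit’s actual records, which are described in the cited exhibits.

```python
# Hypothetical sketch: consolidating per-program Excel workbooks into one
# table. Paths and column names are illustrative assumptions only.
import glob

import pandas as pd

# Read every program's workbook from a (hypothetical) shared folder.
frames = [pd.read_excel(path) for path in glob.glob("program_data/*.xlsx")]

# One combined table replaces many scattered files.
combined = pd.concat(frames, ignore_index=True)

# Mean signature-assessment score by program and semester -- the kind of
# summary previously assembled by hand in Excel, SPSS, and Word.
summary = combined.groupby(["program", "semester"])["score"].mean()
print(summary.to_string())
```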

With the change in the College of Education dean and GSU administration in fall 2009, a self-study was conducted by the UAC. The committee streamlined the UAS for consistency and a clearer picture across programs, collapsing the 13 portal transition points into six transition points that apply to all programs (see Exhibit 2a.2.1 Table 6 Unit Assessment System Transition Points). To monitor the processes more efficiently and effectively, the number of assessment events was collapsed into four: assessment administration, data collection, data analysis and summary, and data usage. For each event, the frequency and responsibility were clearly delineated (Exhibit 2a.5.1 Data Collection, Analysis, Review Plan). This recommendation from the UAC, with input from Unit faculty and staff, was reviewed by the PK-16+ Council and adopted by the COE Administrative Council (see Exhibit 2a.3.4 Minutes of Assessment Committee; Exhibit 2a.3.10 Minutes of PK-16+ and PEC Meetings and Agenda; Exhibit 2a.3.6 Minutes of COE Administrative Council). Even with the streamlined UAS, however, full integration remained a challenge because data were housed in multiple formats and in multiple places.

In fall 2009, because of the continued absence of an assessment coordinator, the Associate Vice President for Planning and Institutional Research, along with a faculty member appointed as the Unit’s data analyst, was given responsibility for centrally analyzing, summarizing, and disseminating Unit and program data for review on the UAS schedule (Exhibit 2a.5.1 Data Collection, Analysis, Review Plan). The data analyst is a Unit faculty member who teaches research and statistics classes and who assumed the assessment coordinator’s responsibilities in fall 2009. That same semester, all programs began to transition to TaskStream, and data from surveys and signature assessments for all programs are scheduled for collection via TaskStream in the spring 2010 semester. Using the TaskStream electronic portfolio system will expedite data review for program improvement. It also increases access to information: reports are posted on the TaskStream website, making data accessible to faculty and administrators anywhere with Internet access.

2b.2. How does the Unit disaggregate candidate assessment data for candidates on the main campus, at off-campus sites, in distance learning programs, and in alternate route programs?

The Unit offers programs only on the main campus. The Unit has no off-campus sites or distance learning programs, although some courses are offered through distance learning.

Teach GSU (Practitioner Teacher Program) is an alternate certification program with concentrations in elementary and special education. Teach GSU shares common assessments with the undergraduate initial programs (see Exhibit 2a.3.1 Table of Current Program Assessments), and the program is aligned with the Louisiana Components of Effective Teaching standards. The program assessments include dispositions inventories, grades in core courses, a written lesson plan, the practicum internship evaluation, impact on student learning, the electronic portfolio, the employer survey, and the follow-up survey. The follow-up survey is to be mailed to graduates of the Teach GSU program in spring 2010.

The Unit disaggregates data to review candidate performance on specific expected outcomes, to review the performance of candidates enrolled in specific Unit programs, and to review candidate performance in job placements by the programs they completed. These disaggregated data inform the Unit about the efficacy of its assessment tools, by program and by specified program outcome, and about the effectiveness of the Unit’s programs in producing professional educators who possess the knowledge, skills, and dispositions explicated in the Conceptual Framework.
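As a purely illustrative sketch of these disaggregation views, the records, program names, and column labels below are invented; the actual data reside in TaskStream and institutional files.

```python
import pandas as pd

# Invented candidate-assessment records for illustration only.
records = pd.DataFrame({
    "candidate": ["A", "B", "C", "D"],
    "program":   ["Elementary Ed", "Elementary Ed", "HPE", "HPE"],
    "outcome":   ["Knowledge", "Skills", "Knowledge", "Dispositions"],
    "score":     [3.2, 2.8, 3.5, 3.0],
})

# View 1: performance on specific expected outcomes, across programs.
by_outcome = records.groupby("outcome")["score"].mean()

# View 2: performance of candidates enrolled in specific Unit programs.
by_program = records.groupby("program")["score"].mean()

# View 3: both dimensions at once, as a program-by-outcome table.
by_both = records.pivot_table(values="score", index="program",
                              columns="outcome", aggfunc="mean")

print(by_outcome, by_program, by_both, sep="\n\n")
```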

2b.3. How does the Unit maintain records of formal candidate complaints and their resolutions?

The Unit follows the procedures established by GSU as described in two documents: Exhibit B1.2.21 Code of Student Conduct Handbook and Exhibit B1.2.1 GSU General Catalog, 2009-2011 Undergraduate and Graduate. Complaints must be submitted to the GSU Judicial Officers, and procedures are followed as delineated in the Code of Student Conduct Handbook.

The GSU Catalog has procedures for student appeals of grades and of academic suspension for undergraduates (pp. 32 and 33, respectively) and for graduate students (p. 175). In all instances the appeals follow a chain of command. For grade appeals, the procedure begins with the course instructor and continues to the dean and, if necessary, to the vice presidential level. If an appeal relates to programmatic rules and regulations, procedures begin with the academic department and continue through the college level and, if necessary, to the vice presidential level. The Unit follows GSU’s policy, which is based on University of Louisiana System Board policy, on formal candidate complaints and their resolutions.

Records of formal candidate complaints and their resolutions are maintained in a decentralized manner, by department. Complaints that cannot be resolved at the departmental level are forwarded to the Dean’s office, where the corresponding records are maintained.

2b.4. (Optional Upload for Online IR) Tables, figures, and a list of links to key exhibits related to the Unit's data collection, analysis, and evaluation may be attached here. [Because BOE members should be able to access many exhibits electronically, a limited number of attachments (0-5) should be uploaded.]

2c. Use of Data for Program Improvement

2c.1. In what ways does the Unit regularly and systematically use data to evaluate the efficacy of and initiate changes to its courses, programs, and clinical experiences?

Data collected at each transition point are depicted in Exhibit 2a.2.1 Table 6 Unit Assessment System Transition Points. At each transition point, the Unit regularly and systematically examines admission data to determine candidate readiness and matriculation through its programs. For example, PRAXIS I data were used to structure a recommendation that candidates who have not passed PRAXIS I must, beginning in their sophomore year, enroll in the Professional Accountability courses each semester until PRAXIS I is passed (Exhibit 2c.1.1 Title III Grant PRAXIS Research Report; Exhibit B1.2.1 GSU General Catalog, see p. 134 for the catalog description of professional accountability courses ED 208 and ED 209).

The Unit maintains a data collection, analysis, and review plan (Exhibit 2a.5.1 Data Collection, Analysis, Review Plan) that details when assessments are administered, the frequency of and responsibility for data collection, the frequency of and responsibility for data analysis and summary, who evaluates and monitors the use of data, and how data are used. During the past three years, the Unit has regularly and systematically examined data relative to its programs and has made changes to course assessments, clinical experiences, and programs. These are summarized in Exhibit 2c.1.2 Table of Program Improvements by Program.

Program faculty review data on signature assessments each academic year. Changes occur for a variety of reasons: changes in state or national program standards, best practices, or identified needs. For example, if faculty perceive that current signature assessments do not sufficiently measure program standards in a course, and the data support this perception, then changes are made in the course and its signature assessments. Many of the changes in signature assessments in the last three years resulted from this type of review (see Exhibit 2a.3.3 Table of Program Assessment Changes). As an example of a change resulting from national program standards and curriculum review, the Health and Physical Education program faculty created department-level examinations to provide further insight into the effectiveness of the program: the Level I-Sophomore Examination, Level II-Junior Examination, and Level III-Senior Comprehensive Examination. The faculty aligned each exam question with required courses and with specific NASPE standard indicators, so analysis of the data shows how candidates performed on each level exam by standard. This analysis has been used to determine where course materials and their delivery need adjusting. Strategies to assist students include study sessions, assigned readings, peer review sessions, and computer-based and other materials targeted to areas of weakness (Exhibit 2c.1.2 Table of Program Improvements; Exhibit 2a.3.8 Minutes of Department of Kinesiology, Sport & Leisure Studies Faculty Meetings, see October 30, 2008 and February 10, 2009). Candidates must score at least 70% on the exams to graduate.
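A minimal sketch of this question-to-standard analysis follows. The item map, standard labels, and responses are invented for illustration; the actual exams and alignments are documented in the exhibits cited above.

```python
# Hypothetical item-to-standard map for one level exam; the real alignment
# is maintained by the HPE program faculty.
ITEM_STANDARD = {1: "NASPE 1", 2: "NASPE 1", 3: "NASPE 2", 4: "NASPE 3"}

def score_by_standard(responses):
    """Fraction correct per NASPE standard; responses map item -> 0 or 1."""
    totals, correct = {}, {}
    for item, result in responses.items():
        std = ITEM_STANDARD[item]
        totals[std] = totals.get(std, 0) + 1
        correct[std] = correct.get(std, 0) + result
    return {std: correct[std] / totals[std] for std in totals}

def passes(responses, cutoff=0.70):
    """Apply the 70% overall pass rule described in the report."""
    return sum(responses.values()) / len(responses) >= cutoff

candidate = {1: 1, 2: 0, 3: 1, 4: 1}   # invented responses
print(score_by_standard(candidate))     # flags NASPE 1 as a weakness (0.5)
print(passes(candidate))                # True: 3/4 = 75% >= 70%
```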

Data from three sources caused the Office of Professional Laboratory Experiences (OPLE) to rethink field assignment procedures: 1) field experience data indicated a low completion rate for freshman field experiences, 2) GSU demographics indicated an increased number of students on financial aid, and 3) OPLE staff noted an increase in requests to change assignments due to lack of transportation. This led OPLE to move freshman field experiences to Alma J. Brown and Grambling Middle and High Schools because of their close proximity to campus. Additionally, in the last three years, to promote shared transportation and expenses, OPLE has assigned classes to a specific school when the school’s grade levels and subjects match a candidate’s area of certification (Exhibit 2a.3.7 Minutes of Curriculum and Instruction Departmental Faculty Meeting, see August 15, 2008).

Data on candidate knowledge, skills, and dispositions, standardized test scores (PRAXIS), and exit surveys are used to implement program changes. For example, PRAXIS I data were used to support the reopening of the PRAXIS lab and the continued support of personnel to oversee its operation.

Additionally, PRAXIS II data were reviewed and new courses were proposed to support candidate passage (ED 375 PRAXIS II Preparation – Elementary Content Knowledge and KNES 349 Accountability in Kinesiology). Due to budget constraints, these proposed courses have not yet been implemented.

2c.2. What data-driven changes have occurred over the past three years?

Over the past three years, Unit consideration of assessment data has resulted in substantial data-driven changes in courses, programs, and the Unit itself.

Standards alignment initiated changes in course signature assessments for ED 216, ED 330, ED 450, ED 452 and ED 453, ED 455, and ED 402 (Exhibit 2a.3.3 Table of Program Assessment Changes). Additionally, State redesign initiatives have prompted curriculum and programmatic changes (Exhibit B1.3.1 Special Education Redesign). The State approved the redesigns for reading and literacy and for numeracy, which impacted ED 217, ED 303, ED 304, ED 325, ED 322, and ED 431 (Exhibit B1.3.2 Reading and Literacy Redesign Grades 1-3; Exhibit B1.3.3 Reading and Literacy Redesign Grades 1-6; Reading and Literacy Redesign Grades 6-12).

Data-driven changes have occurred at the Unit level:

  • Data from candidate course enrollments supported a change to establish a scheduling cycle of courses (Exhibit 2a.3.7 C & I Departmental Faculty Meetings, see November 20, 2006, March 20, 2007, August 15, 2008).
  • Comparison of program-admitted majors to non-admitted declared majors, along with data on PRAXIS I passage rates, supported the initial opening and the two-year continuation of the PRAXIS lab with a full-time coordinator. These same data have also supported the regular scheduling of the PRAXIS professional accountability courses (ED 208 and ED 209) over the last three years (Exhibit B1.2.1 GSU General Catalog, p. 134).
  • PRAXIS I data were used to support a funded research grant that led to a policy change in the initial teacher preparation programs. Effective spring 2010, the Unit requires all sophomores who have not passed the PRAXIS to enroll in the PRAXIS professional accountability courses (ED 208 and ED 209) and to continue in these courses until all parts of the PRAXIS are passed (see Exhibit 2c.1.1 Title III Grant PRAXIS Research Report; Exhibit 2a.3.7 C & I Departmental Faculty Meetings, see February 2, 2010 approval).
  • PRAXIS I data supported the Unit’s sponsorship of an interdisciplinary workshop with Arts and Sciences faculty to align courses with PRAXIS I and to design strategies that promote knowledge acquisition for elementary education majors (see Exhibit 2c.2.1 Interdisciplinary Workshop).

One improvement impacted three programs: PRAXIS II data and SPA recommendations were used to make the case for faculty positions, and faculty with certification in English, social studies, and mathematics were transferred to the College of Education in support of programs in those disciplines.

In the past three years the Unit has held two retreats (2008, 2009) and one assessment work session (2008). The work session, held in April 2008, focused on program assessment revisions and standards alignment and included faculty from the College of Education and the College of Arts and Sciences (see Exhibit 2c.2.11 COE Newsletter 2008). The retreat in October 2008 focused on providing stakeholders with an overview of the assessment system; participants included K-12 school partners, community leaders, and faculty from the College of Education and the College of Arts and Sciences (Exhibit 2c.2.12 Retreats October 2008 Agenda and Minutes; Exhibit 2c.2.11 COE Newsletter 2008). The latest retreat, held December 3, 2009, focused on examining data summaries and making recommendations for program improvements (Exhibit 2c.2.10 GSU Assessment Retreat); these recommendations were reviewed and, after discussion, tabled for further study by the PK-16+ Council (Exhibit 2a.3.10 Minutes of PK-16+ and PEC Meetings, see February 22, 2010).

2c.3. What access do faculty members have to candidate assessment data and/or data systems?

Signature assessment data are used to determine candidate proficiencies, which affect candidate matriculation, and to examine Unit and operations quality (Exhibit 2a.5.1 Data Collection, Analysis, Review Plan). Each semester, faculty collect and review the data compiled from signature assessments. In the absence of an assessment coordinator, faculty created Excel data tables and data summaries to support signature assessment changes related to state, professional, and Unit standards or to change course strategies or delivery to promote learning. In fall 2009, the data analyst began assisting faculty with data summaries. In TaskStream, faculty grade candidate work as well as analyze and review candidate data. TaskStream increases faculty access to information: reports are posted on the TaskStream website, making data accessible to any faculty member with Internet access.

Assessment data are also discussed as the need arises during college and departmental meetings and PK-16+ Council meetings (Exhibit 2a.3.5 Meeting Agenda of College of Education, e.g., April 4, 2006; Exhibit 2a.3.10 Minutes of PK-16+ and PEC Meetings and Agenda). Faculty from the College of Education and the College of Arts and Sciences also serve on the PK-16+ Council and the Professional Education Council. In addition, departmental goals are discussed with faculty and are set based on data and targeted program goals. Faculty are given updates on candidate graduation data, PRAXIS data, PRAXIS lab operations, field and placement experiences, and recruitment efforts (Exhibit 2a.3.7 Minutes of C & I Departmental Faculty Meetings; Exhibit 2a.3.8 Minutes of the Department of Kinesiology, Sport & Leisure Studies). Recommendations from departmental faculty have prompted the COE Administrative Council and the LEC Board to review policies and procedures (Exhibit 2a.3.6 Minutes of COE Administrative Council; Exhibit 2a.3.11 Minutes of LEC Board Meetings, see May 22, 2006 and August 12, 2009).

2c.4. How are assessment data shared with candidates, faculty, and other stakeholders to help them reflect on and improve their performance and programs?

Data on individual candidate performance are shared with candidates by course instructors and by advisors. Candidates must enroll in ED 201 Advisee Report during the first three transition points (Exhibit 2a.2.1 Table 6 Unit Assessment System Transition Points). At each point, the candidate must meet requirements and has an opportunity to reflect on his/her performance on signature assessments to improve future performance. During clinical practice, candidates meet regularly with supervisors to discuss performance, and upon exit, each candidate completes an exit interview requiring the candidate to reflect on his/her performance as well as examine aspects of his/her experience. Advanced candidates meet with major professors to complete required research reports, and faculty give candidates feedback on the reports.

Faculty receive feedback from students (Exhibit B1.2.16 GSU Course and Instructor Evaluations) and from peers (Exhibit B1.2.19 Faculty Peer Evaluation). Faculty reflect on their performance while preparing the annual report (Exhibit B1.2.17 Annual Faculty Report) and when receiving feedback from the department chair (Exhibit B1.2.18 Faculty Performance Evaluation Form).

Data are shared among stakeholders (COE and Arts and Sciences faculty; the PK-16+ Council) in retreats and work sessions that focus on 1) program assessment revisions and standards alignment (Exhibit 2c.2.11 COE Newsletter 2008), 2) providing stakeholders with an overview of the UAS (Exhibit 2c.2.12 Retreats October 2008 Agenda and Minutes; Exhibit 2c.2.11 COE Newsletter 2008), and 3) review and use of data for program improvements (Exhibit 2c.2.10 GSU Assessment Retreat), with recommendations for improvement forwarded to the PK-16+ Council for advisement (Exhibit 2a.3.10 Minutes of PK-16+ and PEC, see February 2, 2010). Faculty are members of the PK-16+ Council as well as members of SPA teams, which periodically review data and assessments related to candidate and program quality (Exhibit 2a.3.2 Table of Program Stakeholders).

2c.5. (Optional Upload for Online IR) Tables, figures, and a list of links to key exhibits related to the use of data for program improvement may be attached here. [Because BOE members should be able to access many exhibits electronically, a limited number of attachments (0-5) should be uploaded.]

Optional

1. What does your Unit do particularly well related to Standard 2?

 


2. What research related to Standard 2 is being conducted by the Unit or its faculty?