The program review process was redesigned to foster accountability and reinforce the integration of assessment and strategic planning.
Previously, some departments did not participate, and key assessment outcomes were not incorporated into EMCC's strategic planning processes.
To effect change, participation was strongly encouraged, non-essential questions were removed, and a more collaborative model between leadership and writers was forged. Scripted use of linked supporting data on Strategic Commitments, Gen Ed Abilities, CATS, and Student Learning Outcomes clarified what each question was asking. Three Leadership Team working sessions were scheduled as combined training and writing sessions.
Although program reviews are not due until the spring, initial participation was still low. After the first writing session, writers were aware of the resources and templates but had done little actual writing: a week later, 74% of program reviews had not been started, 19% showed substantial progress, and 7% showed some progress. Writers and reviewers were coached and encouraged to promote writing, and after the second working session, 53% of reviews were in progress.
VPs and Deans issued follow-up messages personally inviting and strongly encouraging their teams to begin or continue writing and to submit OPIE requests for additional data. The primary factors in the success of the program review redesign are ongoing coaching and encouragement from reviewers and making data review a regular part of department meetings. Rubrics for qualitative review are being developed to evaluate the quality of the work.
Program review is a mission-critical strategic planning process and one critical element of accreditation. In past years, some departments chose not to complete a program review, while others did only the bare minimum; some did stellar work. Part of the completion problem was the review's length, question redundancy, and a lack of accountability for its completion. This year the program review process was redesigned to foster collaboration and accountability between writers and reviewers (Deans/VPs) and to reinforce the application of strategic data for the program. Questions were added to relate a program to MCCCD strategic directions, general education abilities, student learning outcomes, persistence and completion, the job market, use of internships and service learning, and data (enrollment, completions, success, transfers, student engagement, student satisfaction, and more). The first milestone is to get people writing; with formative assessment between writing sessions, timely participation improved dramatically. Once writing is done, scoring rubrics will be used to evaluate the work.