CATS Nip: Increasing Gen Ed Abilities Assessment Participation

Submitted by James Waugh on
Duration
-
What is the Purpose of the Assessment?

Simplify data entry, automate tabulation, and streamline reporting by pre-populating term, instructor, course, section, and student ID in the Gen Ed Abilities templates.

Describe the necessity for this assessment

Gen Ed Abilities assessment participation is low. This practice removes excuses for not participating, including not knowing how to tabulate the data.

Describe how the practice will be implemented

Data entry was simplified, tabulation automated, and reporting streamlined by pre-populating term, instructor, course, section, and student ID in the Excel-based Gen Ed Abilities templates.
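The actual templates are Excel workbooks built for each section. As a rough Python sketch of how that pre-population step could be scripted, assuming a hypothetical roster export (roster.csv) with Term, Instructor, Course, Section, and StudentID columns (the file name, column names, and criterion labels are illustrative, not EMCC's actual export):

```python
import pandas as pd

roster = pd.read_csv("roster.csv")  # one row per enrolled student

# Blank scoring columns the instructor fills in; criterion names are placeholders.
score_columns = ["Criterion1", "Criterion2", "Criterion3", "Criterion4"]

# One pre-populated workbook per term/course/section combination.
for (term, course, section), students in roster.groupby(["Term", "Course", "Section"]):
    template = students[["Term", "Instructor", "Course", "Section", "StudentID"]].copy()
    for col in score_columns:
        template[col] = ""  # left blank for instructor scoring
    template.to_excel(f"{term}_{course}_{section}.xlsx", index=False, sheet_name="Scoring")
```

Because every file carries the same identifying columns, the completed workbooks can later be stacked for tabulation without re-keying anything.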

Interpret, compare, and describe the results

Outcomes:

  • Communications assessment participation grew from 391 participants in 2010/11 to 512 in 2013 and 1,303 in 2016.
  • Critical Inquiry assessment participation grew from 227 participants in 2011 to 427 in 2014 and 1,495 in 2017.
  • Other contributing factors include more active solicitation by the SAAC Co-Chairs and senior leadership support.
  • Similar results are expected for other assessments.
After analyzing and reflecting on the outcome, what are the next steps?

I will use the customized data entry templates for all Gen Ed Abilities assessments. This may translate into a redesign of the Gen Ed data entry templates to allow for longitudinal comparisons, comparisons across prefixes and courses, and more. This serves not only accreditation purposes but also refocuses our efforts on being a learning college.
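As a hedged illustration of what those longitudinal and cross-prefix comparisons could look like once completed section workbooks are collected in one folder (column names follow the hypothetical template sketched above, not an existing EMCC layout):

```python
import glob

import pandas as pd

# Stack every completed section workbook into one table.
frames = [pd.read_excel(path, sheet_name="Scoring") for path in glob.glob("completed/*.xlsx")]
scores = pd.concat(frames, ignore_index=True)

# Derive the course prefix (e.g. "ENG" from "ENG101") for cross-discipline comparisons.
scores["Prefix"] = scores["Course"].str.extract(r"^([A-Za-z]+)", expand=False)

# Mean rubric score by term and prefix: one example of a longitudinal cut.
summary = (scores
           .groupby(["Term", "Prefix"])[["Criterion1", "Criterion2", "Criterion3", "Criterion4"]]
           .mean())
print(summary)
```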

Abstract

Gen Ed Abilities assessment participation is low, so I developed customized scoring templates for each section to simplify data entry, automate tabulation, and streamline reporting by pre-populating term, instructor, course, section, and student ID.  Outcomes:

  • Communications assessment participation grew from 391 participants in 2010/11 to 512 in 2013 and 1,303 in 2016.
  • Critical Inquiry assessment participation grew from 227 participants in 2011 to 427 in 2014 and 1,495 in 2017.
  • Other contributing factors include more active solicitation by the SAAC Co-Chairs and senior leadership support.
  • Similar results are expected for other Gen Ed assessments.

I will use the customized data entry templates for all Gen Ed Abilities assessments. This may translate into a redesign of the Gen Ed data entry templates to allow for longitudinal comparisons, comparisons across prefixes and courses, and more. This serves not only accreditation purposes but also refocuses our efforts on being a learning college.

Division/Department
Completed Full Cycle
Yes
Files
Attachment Size
communication-2016-sample-template.xlsx 33.78 KB
Assessment of the Month

Comments

Jake Ormond Mon, 04/09/2018 - 9:29am

Great work, Jim. This seems like a lot of work on your end to integrate the data into the customized templates. Do you have any concerns about the sustainability of this process as we continue to recruit and increase Gen Ed participation, hopefully reaching college-wide participation in the future? Are there other technologies or software that could help streamline the process?

James Waugh Mon, 04/09/2018 - 9:59am

Supporting SAAC is a critical role for OPIE, so the time expended is justified.  The standardized template format takes much more time to compile, but analysis time is slashed and transcription accuracy is no longer a liability.  This data can provide strategic insight for our college once we have a representative sample of the college (something not available in the past).   Furthermore, data entered in the new template can be easily and quickly compiled for longitudinal analysis.   Thus, for the first time, we will truly have comparable data from term to term.  We can then see whether we made a difference from the time a student first started at EMCC (via a Gen Ed Ability assessment in CPD150 or CIS105) against future terms (any class/section down the road that participates in the same Gen Ed Ability).  We can then have some way to see whether a student improved while at EMCC.  This is the holy grail of college-wide assessment, and only the beginning of this adventure.
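A minimal sketch of that term-to-term comparison, assuming the tabulated results have been combined into one file with StudentID, Term, Ability, and Score columns and that Term values sort chronologically (all of these names are assumptions for illustration):

```python
import pandas as pd

scores = pd.read_excel("all_terms_combined.xlsx")  # StudentID, Term, Ability, Score

# Earliest and latest score per student for each Gen Ed Ability.
scores = scores.sort_values("Term")
first = scores.groupby(["StudentID", "Ability"], as_index=False).first()
latest = scores.groupby(["StudentID", "Ability"], as_index=False).last()

growth = first.merge(latest, on=["StudentID", "Ability"], suffixes=("_first", "_latest"))
growth["Change"] = growth["Score_latest"] - growth["Score_first"]

# Only students assessed in more than one term say anything about improvement.
growth = growth[growth["Term_first"] != growth["Term_latest"]]
print(growth[["StudentID", "Ability", "Term_first", "Term_latest", "Change"]])
```

The same merge could be restricted to students whose first record came from CPD150 or CIS105 to match the scenario described above.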

James Waugh Mon, 04/09/2018 - 10:22am

One of the potential reasons for non-participation in the past is confusion about how an instructor should capture their Gen Ed Ability scoring.  It just seemed easier not to participate.  By providing pre-populated forms, this concern is mitigated and the process is standardized.  In the future, we may move to a pre-populated scannable bubble form with student ID, student name, term, course, and section already filled in.  The distribution process would change, but the outcome would be the same.  We first had to prove the concept before potentially making this change.  There is no magic software solution; it will still require a fair amount of manual manipulation.



This, in conjunction with active recruitment and administrator support, has made participation rates skyrocket.  There is no data to indicate the weight of each intervention's contribution, but together the improvement is substantial.

Becky Baranowski Mon, 04/16/2018 - 2:41pm

Hi Jim - thanks very much for taking time to submit this CATS.  Like Jake, I am also wondering about our ability to do this on a much larger scale.  I know you are able to work the magic of Excel, and you are correct in your comment to Jake that one major purpose of OPIE is to help SAAC.  So, thank you.  I really hope we continue to increase participation, have comparable data, and impress HLC in 3 years.  Thank you for your time with all of this.

Catherine Cochran Tue, 04/17/2018 - 11:40am

Excellent work, Jim!  It is very much appreciated that you are willing to customize the templates to assist the instructors who are participating in assessing the Gen Ed Abilities.  This has created a more user-friendly approach to the templates and data entry.

Catherine 

James Waugh Tue, 04/17/2018 - 1:51pm

This is much more than an accreditation issue; it is the right thing to do.

Assessment is the process of determining whether students have acquired the knowledge and skills described in the course, program, and college-wide learning outcomes.  SAAC's general goal is to know whether students have actually learned what we assert they will learn.

Prior to three years ago, EMCC's Gen Ed Abilities assessments did not address program-level assessment, so we began to collect more robust data at various phases of the semester.  The goal now is to capture a student's progress throughout their time at EMCC.  In an ideal world, the new model with more robust data will allow us to do more with the integration of learning outcomes so we may measure improvement at the course, program, and college-wide level.

  • This may involve pre- and post-testing within a single course to capture course-level results. 
  • For courses that are stackable, the post-test could serve as baseline (pre-test) data for a subsequent class.  This would address our need for program-level assessment.
  • We could then aggregate the results for sophomores, comparing the same students' results from when they were new freshmen.  This would get us to college-wide assessment, with evidence of whether we had moved the needle (a short sketch of these cuts follows this list).
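As an illustrative sketch (not an existing EMCC process) of the course-level and student-level cuts listed above, assuming a hypothetical long-format table with StudentID, Course, Phase ("pre"/"post"), and Score columns:

```python
import pandas as pd

# Hypothetical long-format table: StudentID, Course, Phase ("pre"/"post"), Score.
data = pd.read_excel("pre_post_scores.xlsx")

# Course level: mean pre vs. post score within each course, and the resulting gain.
course_level = data.pivot_table(index="Course", columns="Phase", values="Score", aggfunc="mean")
course_level["Gain"] = course_level["post"] - course_level["pre"]
print(course_level)

# The same pivot keyed on StudentID instead of Course pairs each student's own
# pre and post scores -- the building block for the freshman-to-sophomore comparison.
per_student = data.pivot_table(index="StudentID", columns="Phase", values="Score", aggfunc="mean")
per_student["Gain"] = per_student["post"] - per_student["pre"]
print(per_student["Gain"].mean())
```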

Yes, the work required is exhaustive, but the results may provide us with a methodology for critical organizational learning-outcome assessment.  The question of sustainability is valid, and we will continue to look at better data collection methods that encourage faculty participation while becoming less resource-intensive.  We have investigated alternative data collection tools such as Canvas but have not received much support for easier options.  I will keep looking...