Noncredit Pilot Project Goals

  • Establish clear communication between institutional MIS reporting and noncredit programs
  • Collect a pilot set of accountability data based on noncredit indicators
  • Evaluate the ability of noncredit programs to work with these indicators
  • Evaluate the effectiveness of these indicators for use as accountability requirements

Noncredit Pilot Project Guidelines

  • Implement noncredit progress indicators starting fall 2010
  • Gather data from participating programs and submit it through MIS
  • Analyze the data and make a recommendation on the feasibility of a standard progress indicator system for noncredit across the state

Potential Noncredit Pilot Progress Indicators

  • Grades A – F
  • “P”: Passing, i.e., at least satisfactory completion of the course
  • “NP”: Not Passing, i.e., less than satisfactory completion of the course
  • “SP”: Satisfactory Progress, i.e., satisfactory progress toward completion of the course

Noncredit Pilot Progress Metrics

  • Metrics for success count P and SP as success, because research found that SPs convert to P or NP within two to three reporting periods
  • Noncredit success is asynchronous with the data reporting periods; therefore the denominator is broad (all students with 8 or more hours of positive attendance) and the numerator includes P and SP (a brief calculation sketch follows this list)
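The sketch below illustrates how this metric could be computed. It is a minimal example, not a Chancellor's Office specification: the record fields, the treatment of the 8-hour threshold as a hard cutoff, and the exclusion of letter grades from the numerator are assumptions made for illustration only.

# Minimal sketch of the pilot success metric. The record format is hypothetical;
# only P and SP are counted as success, per the pilot indicator definitions.
# Letter grades (A-F) are not handled here because the pilot text does not
# specify how they enter the metric.

MIN_POSITIVE_HOURS = 8            # assumed inclusion threshold for the denominator
SUCCESS_INDICATORS = {"P", "SP"}  # SP counts as success; most SPs resolve to P or NP later

def noncredit_success_rate(records):
    """Return the pilot success rate, or None if no students qualify."""
    included = [r for r in records if r["positive_hours"] >= MIN_POSITIVE_HOURS]
    if not included:
        return None
    successes = [r for r in included if r["indicator"] in SUCCESS_INDICATORS]
    return len(successes) / len(included)

# Example: three of four students meet the hours threshold; two of those earned P or SP.
sample = [
    {"positive_hours": 12, "indicator": "P"},
    {"positive_hours": 20, "indicator": "SP"},
    {"positive_hours": 9,  "indicator": "NP"},
    {"positive_hours": 3,  "indicator": "P"},   # excluded: under 8 positive-attendance hours
]
print(noncredit_success_rate(sample))  # 2/3, about 0.67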

Materials

Rostrum Noncredit Progress Indicator Pilot
Noncredit Progress Indicator Pilot Training Materials
SCE – Material samples for determining P, SP and NP
Santa Rosa Progress indicator material samples
Mt. SAC examples of progress indicators for ABE, GED, and HS referral. All GED progress indicators are converted to grades at the end of the learning period.

Noncredit Progress Indicator Pilot Training PowerPoint
Noncredit Accountability PowerPoint
A Guide to Noncredit Accountability

FALL 2010 UPDATE

The fall 2010 pilot process yielded only two college reports submitted to the Chancellor's Office. This first reporting attempt identified a variety of gaps and issues in submitting data:

1. The Chancellor's Office changed the percentage of errors allowed upon submission, reducing it to a 1% error margin. This change resulted in a variety of data rejections for all institutions. (The taskforce supports the effort to obtain quality data, but the submissions were delayed until after the taskforce meeting while errors were corrected at local institutions.)

2. One institution’s noncredit data was not submitted due to credit data errors.

3. One institution's researcher changed all of the submitted progress indicators to UG before sending the data to the Chancellor's Office. When the institution was asked to resubmit the data, large errors were made in changing the data back to the original values.

4. Some colleges were erroneously told that their local MIS system would not support progress indicators. We know this is incorrect because we have participating colleges using Banner, PeopleSoft, and Datatel, all of which facilitated submission.

5. Many institutions have relied on shadow systems for collecting and reporting noncredit data, primarily because, without progress indicators/grades, noncredit data reports as zero success; that zero is then electronically combined with the credit data, reducing the apparent overall success (see the sketch after this list).

6. Noncredit faculty assess students in an individual and independent manner, so the use of progress indicators was new and a shift for many. Defining the indicators and training faculty to first use them and then record and report the data emerged as a key need after the initial semester of reporting.

7. Some institutions had no method for noncredit faculty to actually submit progress indicators. In these cases, a Scantron reporting method was developed, and faculty and administrators worked around the electronic collection and reporting methods by collecting the data by hand.
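To illustrate the dilution described in item 5, here is a brief sketch; the counts are invented purely for illustration and do not come from any pilot or college data.

# Illustrative only: invented counts showing how noncredit records that report
# zero success pull down a combined success rate once merged with credit data.
credit_attempts, credit_successes = 1000, 700      # 70% credit success
noncredit_attempts, noncredit_successes = 400, 0   # reports as 0% without progress indicators

combined_rate = (credit_successes + noncredit_successes) / (credit_attempts + noncredit_attempts)
print(f"credit alone: {credit_successes / credit_attempts:.0%}")  # 70%
print(f"combined:     {combined_rate:.0%}")                       # 50%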

The fall 2010 data included information from four colleges, although six had reported an intent to pilot the process.

SPRING 2011 UPDATE

As a follow-up to the fall submission process, an online survey was sent to participating faculty for input. The results indicated that the process improved faculty work and student engagement and provided valuable information. The survey will be administered again at the end of the Fall 2011 submission to gather final faculty input.

Seven colleges were able to collect and report some level of data for the Spring 2011 process, but the Chancellor's Office did not provide data. Colleges that reported in the fall were able to collect better data and provide far better information for analysis and decision making.

The data is being kept confidential and anonymous because many issues must still be resolved before the data can be considered valid and reliable.

During the Spring 2011 reporting period, additional gaps and issues surfaced. As a result, a sub-committee examining data submission problems was formed to delineate the issues, gaps, and areas needing attention (Diane Mendoza, School of Continuing Education; Sergio Oklander, Santa Ana College SCE; Mark Wade Lieu, Chancellor's Office).

In Spring 2011, colleges were encouraged to experiment with a variety of indicators in order to gather data on the usefulness of the data elements. This may have produced problems in data submission, but it did inform the taskforce of the importance of a meaningful data element for students who do not meet the threshold of hours and participation required to receive progress indicators.
