
February 2005, Vol. 4 Issue 2
 
THE QUALITY MATTERS RUBRIC: A TOOL FOR PROMOTING QUALITY IMPROVEMENT IN ONLINE COURSES

by John Sener

Quality Matters (QM) is a project developed by MarylandOnline (MOL) and funded by the U.S. Department of Education's Fund for the Improvement of Post-Secondary Education (FIPSE) to create an inter-institutional model for assuring the quality of online courses. The original primary purpose of the project was to create a faculty-centered process for demonstrating quality in course design so that MOL member institutions could better serve students by successfully facilitating course sharing.

The QM model has two key components: the QM process and the QM tool set. The QM rubric is the part of the tool set which faculty peer review teams use to review courses; the tool set also includes an Instructor Worksheet, Matrix of Review Standards, Exit Interview Form, and other documents which support various elements of the course review process. The QM process is the key to the success of the project, and the centerpiece of the QM process is the QM rubric.

The primary purpose of the rubric is to enable faculty peer review teams to apply their expertise as practitioners to evaluating the quality of online courses based on a set of criteria which are derived from the research literature and recognized quality standards. The current QM rubric is the outgrowth of an instrument used in a pilot course review project conducted by several MOL member institutions prior to the start of the QM project in September 2003. The rubric consists of 40 Specific Review Standards which are grouped into eight General Review Standard areas corresponding to critical aspects of online course design:

  1. Course Overview and Introduction
  2. Learning Objectives (Competencies)
  3. Assessment and Measurement
  4. Resources and Materials
  5. Learner Interaction
  6. Course Technology
  7. Learner Support
  8. ADA Compliance

Since the project's inception, the QM rubric has gone through several iterations, with refinements made based on feedback from peer reviewers' experience using it. The current version of the QM rubric also includes annotations and examples to guide peer reviewers on how to apply or interpret each of the rubric's standards. The QM project has also created online demo and production versions of the rubric, which offer great advantages over print documents: prospective users can test-drive the demo rubric (see link at end of article) to see how they like it, while the production version greatly eases the peer review process by enabling peer reviewers to record their evaluations online and by automatically tabulating the results.

To date, 20 courses have been reviewed as part of QM project activities, with 20 additional courses slated for review during the Spring 2005 semester. The QM project has also trained 178 faculty to use the rubric, including 129 from MOL's 19 member institutions and 49 from 26 outside institutions or organizations. Thanks to training sessions and conference presentations, the QM rubric and process have generated a groundswell of interest among educators involved with online learning.

Unexpectedly Versatile Tool

Although the QM rubric was designed as a tool for reviewing courses, one of the project's unanticipated outcomes has been the wide variety of other ways in which MOL members and other interested institutions have used or plan to use the rubric. Some of these additional uses were not particularly surprising; for example, it quickly became apparent during the pilot of the peer review process in Spring 2004 that the QM rubric was also suitable as a guide for preparing for a course review. From there, it was a small step to using the rubric for other online courses which were not undergoing the formal QM review process. As the QM Project Management Team learned about some of these additional applications, it decided to ask its team of MOL member Administrative Representatives (ARs) how they were using or planned to use the rubric. The range of their answers was a bit of a revelation:

Support for course review - For example, Chesapeake College is using the rubric to support the design review process for all current courses, while the College of Southern Maryland (CSM) has established a course peer review process for all new web-based courses using the QM rubric.

Support for course design - Allegany College is using the rubric as a tool to make it easier to design new courses from scratch, while Frederick CC is using a Blackboard course template incorporating the QM standards to streamline the course creation process. The University of Maryland University College (UMUC) is incorporating the QM rubric into a checklist to be used in new online course development.

Support for course revision and improvement - Several MOL members are using the rubric for this purpose as well; for instance, Baltimore City CC is using the rubric as part of its on-campus training session offerings for all online instructors on how to revise their courses.

Review of non-project courses - Prince George's CC and Frederick CC are using the QM rubric to conduct internal peer reviews of online courses, apart from the QM project.

Faculty training and professional development - Several MOL members have reported incorporating the rubric into their faculty training process for course review, revision, development, and other professional development such as "best practices" workshops. For instance, Anne Arundel CC has adopted the QM rubric into its Online Academy (for faculty developing courses) and Online Academy "Lite" (for faculty taking over existing online courses) training programs. Frederick CC reports incorporating the QM standards and rubric into their ongoing Blackboard LMS training for faculty.

Quality benchmarking for online and hybrid courses - Allegany College has used the rubric as the basis for its "Online Teaching Strategies and Best Practices" official document, while Frederick CC is also using the QM standards and rubric to define "best practices" for quality online and hybrid courses.

Raising awareness, interest, and support for online learning - For example, Villa Julie College's AR anticipates that the rubric's validation will encourage VJC to be more attentive to online learning, while Wor-Wic Community College's AR believes that the QM rubric "gives an edge" in discussions with faculty and administrators about online learning.

Strategic planning tool - The QM project and rubric have "energized" Carroll CC's distance learning efforts, enabling the college to step back and re-evaluate its needs as well as providing an impetus for CCC to launch a Distance Education committee.

Planned and Anticipated Uses for the QM Rubric

After seeing the variety of ways in which other institutions were adopting the QM rubric for their own uses, the QM Project Management Team (PMT) decided to start tracking how practitioners were planning to use the rubric. Participants in three QM peer reviewer training sessions conducted during December 2004 and January 2005 were asked how they planned to use the QM rubric. Two-thirds of respondents (n=73) said they planned to share the rubric with others and to use it to review others' online courses at their institution or organization. The same proportion said they planned to use it to help them with their own course revisions, while an even higher number (71%) said they planned to use it to help them with their own course development. Nearly half also reported planning to use the QM rubric to train others how to develop (44%), revise (41%), or review (46%) an online course. Other respondents planned to use the rubric as part of their instructional design process, to help develop a course development methodology, and even in their Ph.D. studies in education.

MOL members also have more specific plans to expand their use of the QM rubric. For instance, the QM rubric will be a featured item in a Best Practices Retreat for UMUC's full-time Sciences & Math department chairs and full-time faculty during the Spring 2005 semester, and the rubric and its use will be a featured activity in UMUC's annual summer retreat for its entire writing faculty. CSM has also incorporated the QM rubric into its faculty training initiatives; faculty with existing web-based courses will have an opportunity to use the rubric to revise their courses during a summer institute and in fall 2005. Prince George's CC administrators are hoping that the results of their internal peer reviews will lead to incorporating the QM rubric standards into the normal course review process for all online courses at the college.

Based on interest in QM training sessions, requests for permission to use the rubric, and feedback from dissemination activities, the QM rubric appears to have created a splash in the online learning community beyond the PMT's wildest expectations. There are indications that the QM rubric is being adopted in K-12 and governmental organizations as well. Although the QM project is making efforts to measure the impact of this splash, it is clear that there is also a "ripple effect" which is difficult, if not impossible, to measure. For instance, Montgomery College's AR Buddy Muse suspects that some MC faculty are using the QM rubric on their own without telling anyone, while Howard CC's AR Virginia Kirk anticipates that some faculty who have been trained as QM peer reviewers may undertake peer course improvement projects.

In addition to maintaining its original focus on continual application and refinement of the QM rubric, the QM project is also undertaking the new task of exploring the application of the QM rubric to hybrid and even classroom courses. The QM PMT welcomes your participation in its new and ongoing activities.

To discuss the project or to inquire about participating in it, contact project co-directors Chris Sax (University of Maryland University College, csax@umuc.edu) and Mary Wells (Prince George's Community College, mwells@pgcc.edu).

Maryland Online -  www.marylandonline.org

Quality Matters -  www.qualitymatters.org

Online Demo of Quality Matters Rubric - www.esac.org/fdi/rubric/finalsurvey/demorubric.asp

John Sener, contributing editor for Educational Pathways, is the founder of Sener Learning Services (www.senerlearning.com) and the QM project evaluator and Project Management Team member.



Copyright. All rights reserved. Lorenzo Associates, Inc., P.O. Box 74, Clarence Center, NY 14032.