MSC DY Benefits and Challenges for Faculty Participants

This document provides a summary of potential benefits, opportunities, and challenges for faculty who participate in the MSC Demonstration Year. The items listed below are intended to provide general guidance. Project leaders recognize that campus cultures and climates are highly diverse; what appears to be an opportunity on one campus may emerge as a challenge on another. Responses to this document are welcome and may be addressed to Gloria Auer at SHEEO (gauer@sheeo.org).

The overall goal of the Demonstration Year: Faculty from an array of institutions in 12 states will gather student work and submit it to be scored by a project-organized group of faculty, all using selected VALUE rubrics. The MSC intends to assess student work in a way that is valid, reliable, and consistent across institutions and states, without using standardized tests. Another important objective of the Demonstration Year is to scale up the Pilot Study year's work by increasing the number of states, institutions, faculty participants, and artifacts, moving closer to the long-term vision of collecting a representative sample of student work from each state.

Faculty Benefits:

Participation in the Demonstration Year will provide faculty the opportunity to:

• work with other faculty to develop an assessment process based on authentic student work, deepening their understanding of the connection between institutional learning outcomes and program-level assessment in their department and generating new ideas to improve teaching and learning in their own classrooms

• learn from and share information with colleagues across departments, institutions, and states on assignment design, teaching, and different approaches for increasing student success

• gain expertise in designing assignments that align with course, program, and institutional or general education learning outcomes 

• distinguish between grading and “scoring” while recognizing the importance of both for enhancing and measuring learning

• use student work collected for the Demonstration Year as part of degree and/or program assessment 

• learn more about direct assessment of student work using rubrics, and gain experience in using rubrics, specifically the VALUE rubrics, to assess particular learning outcomes

• be involved in scoring student work, analyzing the assessment data and results, sharing a faculty perspective on what the data suggest, and bringing these insights back to their institutions to engage colleagues in curricular improvement

Participation in scoring student work will:

• facilitate discussion of common expectations for student learning

• facilitate cross-institutional faculty dialogue

• provide faculty the opportunity to view student work from other institutions and states

• provide faculty the opportunity to study and describe student work as a progression of demonstrated learning at levels keyed to degrees

• allow faculty the opportunity to score student work from other institutions and states

• facilitate sharing of ideas to improve pedagogy

• facilitate sharing of ideas for intentional assignment design as an essential part of building valid assessment of student learning

• provide faculty with training and experience in scoring student work using the VALUE rubrics for written communication, quantitative literacy, and critical thinking

• increase faculty scoring experience, building on-campus assessment capacity


Consideration for faculty: How exactly will participation affect my workload?

You will be asked, in your capacity as a faculty member:

• to become familiar with the selected VALUE rubrics

• to re-examine the design of assignments already being used or to consider development of new assignments geared specifically to meet particular learning outcomes

Discussion:
• Is your assignment suitable for assessment with the relevant VALUE rubric?
• Does your assignment align with broad assignment parameters offered by the MSC?
• Do your assignment instructions prompt students to demonstrate the intended skills?

• to submit an assignment (either already being used or newly developed) and the corresponding student work during Fall 2015 and/or Spring 2016

The person designated “Institution Lead” will alert faculty to professional development activities and resources that support assignment design, and will answer other questions related to the MSC Demonstration Year.


Institution Benefits

Institution participation will:

• help institutions publicly affirm their commitment to improving the quality of student learning
• help institutions respond to accountability expectations with assessments based on authentic student work
• help elevate the importance of the quality of student learning as documented by assessment of student work
• provide external validation for internal institutional assessment results 
• provide the opportunity for institutions to learn from benchmarking their assessment results to the aggregated results of similar institutions in their state and among states in the MSC
• result in benefits to students as faculty gain access to state and multistate assessment information useful for curricular planning, program improvement, and enrichment of pedagogy

Institutions will gain access to data and analyses of student learning, helping them to:

• identify trends in institutional data that vary from or are reflected in the state and multistate data
• benefit from the MSC findings regarding inter-rater reliability and VALUE rubric validity (a common reliability measure is sketched after this list)
• gain insight into their capacity to collect assignments and corresponding student work suitable for assessing written communication, quantitative literacy, and critical thinking from students nearing graduation, which is valuable for creating ongoing professional development activities to build institutional capacity
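
For reference, inter-rater reliability in scoring studies of this kind is often summarized with an agreement statistic such as Cohen's kappa, which corrects raw rater agreement for the agreement expected by chance. The MSC's own reporting choices are not specified here; this is a general illustration:

$$\kappa = \frac{p_o - p_e}{1 - p_e}$$

where $p_o$ is the observed proportion of artifacts on which two raters assign the same rubric score and $p_e$ is the proportion of agreement expected by chance. For example, if raters agree on 80% of artifacts and chance agreement is 50%, then $\kappa = (0.80 - 0.50)/(1 - 0.50) = 0.60$.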

Overview of Information on Sampling and Scoring

• Sample Size: The project offers guidance on the sample size needed at the institution level to generalize to the total institutional population of students, and the sample size needed to generalize to subgroups based on race/ethnicity, gender, discipline or program, age, etc. (an illustrative calculation follows this list).
• Sampling Guidelines: Established sampling guidelines will be available for institutions to use or modify for their own assessment initiatives.
• Sampling Consultation: Institutions may work with the MSC Sampling Committee to establish sampling methods and protocols and determine needed sample sizes for meeting the assessment goals of individual institutions. 
• Scoring:

  • A selected group of faculty from the 12 states will score collected student work.
  • Institutions will have access to the scoring protocols, which may be used for centralized and online scoring.
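
The MSC's specific sample-size guidance is not reproduced here, but as a general illustration, the standard formula for estimating a population proportion (with a finite-population correction) shows how the required sample grows with the desired confidence and precision:

$$n_0 = \frac{z^2 \, p(1-p)}{e^2}, \qquad n = \frac{n_0}{1 + (n_0 - 1)/N}$$

where $z$ is the critical value for the chosen confidence level, $p$ the anticipated proportion (0.5 is the conservative choice), $e$ the margin of error, and $N$ the population size. For 95% confidence ($z = 1.96$), $p = 0.5$, and $e = 0.05$, $n_0 \approx 385$; for a hypothetical institution with $N = 4{,}000$ students nearing graduation, $n \approx 351$. Generalizing to subgroups requires a comparably sized sample within each subgroup, which is why subgroup reporting raises the total number of artifacts to collect.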

Information on Resource Commitment:

Institutional participation will require some local and state resources for:

• professional development activities
• collecting and de-identifying student work and the corresponding assignment instructions and cover sheet
• collecting and submitting information on student work to the national database
• engaging institutional constituencies in analyzing and providing feedback on their own institution-level data

Institution leads and state leads will have more information about resources.

Prompts for Further Discussion:

Well-designed assignments are essential for high-quality direct assessment. Assignments are thus an important element of an effective assessment plan. Intentional assignment design requires practice and experience. The MSC Demonstration Year will offer opportunities for practice.

While an individual faculty member’s assignment will have distinctive features, it can nevertheless help students achieve learning outcomes that are shared across an array of different courses. Assessment of assignments that are intentionally designed to help students reach shared learning outcomes can be done in a valid and reliable way. The MSC offers opportunities to practice assessment that is aligned across very different courses.

Course grades are often composite measures of a student’s performance in a course, based primarily on the student’s demonstration of content mastery rather than of particular learning outcomes. Direct assessment allows faculty to focus on student demonstration of a specific learning outcome.
