
Retreat on Student Learning and Assessment, Level I

September 24-25, 2009 The Westin Long Beach

Long Beach, CA

Retreat Handouts


Retreat on Student Learning and Assessment, Level I Table of Contents

Agenda / Schedule ………………………………………………………………………………………………. 2

Westin Floor Plan ………………………………………………………………………………………………… 4

Mentor Biographies ……………………………………………………………………………………………… 5

Attendee Directory …………………………………………………………………………... 10

Plenary: Assessment in Learning-centered Institutions (M. Allen) …………………………………………... 16

Lecture/Discussion 1: An Outcomes-based Assessment Model for General Education (A. Driscoll) …….. 22

Lecture/Discussion 2: Assessment for Student Affairs Staff and Other Campus Professionals (M. Allen) ... 33

Lecture/Discussion 3: Direct and Indirect Assessment Methods (B. Wright) …….………………………... 59

Direct Assessment Methods – A Close-Up Look ………………………………………………... 66

Indirect Assessment Methods – A Close-Up Look ……………………………………………… 77

Lecture/Discussion 4: Unique Issues in Assessment for Community Colleges (F. Trapp) ………………... 83

Handout for Unique Issues in Assessment for Community Colleges ………………………….. 100

Lecture/Discussion 5: Developing and Applying Rubrics (M. Allen) ……………………………………….. 119

Lecture/Discussion 6: Analyzing Evidence of Student Learning to Improve Our Practice (A. Driscoll) …. 136

Lecture/Discussion 7: The Administrators’ Role in Assessment of Student Learning (B. Wright) ………... 140

Assessment Tips for the Top ……………………………………………………………………… 141

Top Ten Ways to Kill Your Assessment Program ………………………………………………... 144

The Administrators’ Role in Assessment of Student Learning: Setting the Context for Success 145

L/D 8: Assessment for Community College Career and Technical Educational Programs (F. Trapp) ……. 148

Handout for Assessment for Community College Career and Technical Education ………….. 163

Plenary: The Learning-centered Institution: Curriculum, Pedagogy and Assessment for Student Success (A. Driscoll) … 185

Resources ………………………………………………………………………………………………………… 205

Bibliography ……………………………………………………………………………………….. 206

Rubrics - ACSCU…………………………………………………………………………………… 211

Program Learning Outcomes Rubric …………………………………………………… 212

General Education Assessment Rubric …………………………………………………. 214

Portfolio Rubric …………………………………………………………………………… 218

Capstone Rubric ………………………………………………………………………….. 220

Program Review Rubric ………………………………………………………………….. 222

Educational Effectiveness Framework ………………………………………………….. 224

Rubrics – ACCJC …………………………………………………………………………………... 226

Rubric Table September 2007 …………………………………………………………….. 227

Online Resources; SLOs, Rubrics, & Assessment ……………………………………………… 230

Assessment Quickies: Student Learning Outcomes Assessment in Ten Easy Steps ………….. 233

Notes ………………………………………………………………………………………………… 235

2010 WASC Academic Resource Conference – Save the Date……………………………………. Back Cover


Retreat on Student Learning and Assessment, Level I

SCHEDULE/PROGRAM

Thursday, September 24, 2009

9:00 – 10:00 am  Arrival, check-in, registration

10:00 – 11:30 am  Welcoming remarks, Salon B
  Plenary: Assessment in Learning-Centered Institutions (M. Allen), Salon B
  Introduction of mentors

11:30 am – 12:15 pm  Meet in mentor groups and schedule appointments

12:15 – 1:15 pm  Lunch in teams, Salon A

1:15 – 2:45 pm  Lecture/Discussions:
  1. An Outcomes-based Assessment Model for General Education (A. Driscoll), Salon C
  2. Assessment for Student Affairs Staff & Other Campus Professionals (M. Allen), Salon D
  3. Direct and Indirect Approaches to Assessment (B. Wright), Barcelona/Casablanca
  4. Unique Issues in Assessment for Community Colleges (F. Trapp), Salon B

2:45 – 3:00 pm  Snack Break, Salon B

3:00 – 4:30 pm  Lecture/Discussions:
  5. Developing and Applying Rubrics (M. Allen), Salon D
  6. Analyzing Student Learning to Improve Our Practice (A. Driscoll), Salon C
  7. The Administrator's Role in Assessment of Student Learning (B. Wright), Barcelona/Casablanca
  8. Assessment for Community College Career and Technical Educational Programs (F. Trapp), Salon B

4:30 – 6:00 pm  Work Session: Team planning / Appointments with mentors

6:00 pm  Dinner on your own

2


Friday, September 25, 2009

7:00 – 8:30 am  Breakfast and appointments with mentors, Salon A

8:30 – 10:00 am  Plenary: The Learning-centered Institution: Curriculum, Pedagogy and Assessment for Student Success (A. Driscoll), Salon B

10:00 am – Noon  Work Session: Team planning / Appointments with mentors

Noon – 1:00 pm  Lunch; submit questions for Q&A, Salon A

1:00 – 1:45 pm  Q & A (Mentor Panel), Salon B

1:45 – 2:00 pm  Snack Break, Salon B

2:00 – 4:00 pm  Mentor Group Final Session: Teams present progress on their projects

4:00 pm  Retreat ends

Lecture/Discussions

1. An Outcomes-Based Assessment Model for General Education: This workshop will provide a practical, guided opportunity for participants to design outcomes-based assessment components of a model for general education, tailored to their institutional purposes and mission. (A. Driscoll) Salon C

2. Assessment for Student Affairs Staff and Other Campus Professionals: This session is designed for campus professionals who want to learn more about assessment. (M. Allen) Salon D

3. Direct and Indirect Approaches to Assessment: Participants will review criteria for selecting among assessment methods and the strengths and limitations associated with a variety of commonly used techniques. (B. Wright) Barcelona/Casablanca

4. Unique Issues in Assessment for Community Colleges: Participants will discuss several aspects of the community college in higher education, the nature of the students served by those institutions, and the implications for assessment of learning outcomes that flow from those considerations. Practical suggestions and illustrations for assessment work will be offered. (F. Trapp) Salon B

5. Developing and Applying Rubrics: Participants will review rubric examples, consider strategies for developing rubrics, and learn how to use rubrics for teaching, grading, and assessment. (M. Allen) Salon D

6. Analyzing Student Learning to Improve Our Practice: This workshop will engage participants in an analysis process for reviewing student work as evidence of achieving learning outcomes. Participants will discuss implications of the process for improving teaching and learning. (A. Driscoll) Salon C

7. Administrators and Assessment of Student Learning: It's true that faculty need to take primary responsibility for assessment, but administrators also have a critical role to play in creating an environment where good assessment is possible and leads to real improvement. (B. Wright) Barcelona/Casablanca

8. Assessment for Community College Career and Technical Educational Programs: Participants will consider ways to articulate program-level learning outcomes, analyze the curricular design of the program, and explore several common ways to assess behavioral and performance learning outcomes for competency-based education situations in community colleges. (F. Trapp) Salon B

3

Westin Floor Plan

4


MENTOR

BIOGRAPHIES

5


WASC Retreat on Student Learning and Assessment Mentor Biographies

Mary Allen

Dr. Mary Allen is a consultant in higher education, specializing in assessment and accreditation. She is the former Director of the California State University Institute for Teaching and Learning and a Professor Emerita of Psychology from California State University, Bakersfield. Mary has published books on the assessment of academic programs and general education, and she has offered assessment presentations and workshops at AAHE, AAC&U, and WASC conferences. She is a sought-after speaker, consultant, and workshop presenter and has worked with over 120 colleges, universities, and college systems. Contact info:

Email: [email protected]

Jan Connal

Dr. Jan Connal has engaged in a variety of educational research activities over a long career at Cerritos College. In her current faculty capacity, she co-coordinates the campus SLO assessment activities, facilitates faculty classroom research and inquiry projects, serves as the internal evaluator for the campus Title V grant, and chairs the Developmental Education Committee (which is strongly promoting assessment, faculty inquiry, and evidence-based effective practices). Jan is also currently active statewide as a Faculty Inquiry Network coach, with special emphasis on assessment and evaluation for 18 campus-based faculty inquiry projects, and previously served as a member of the Steering Committee for the Basic Skills Initiative (BSI) in California's community colleges. Other professional positions held include Dean for Institutional Advancement and Planning, Dean of Educational Support Services, and Director of Research, Development and Planning. Jan holds a PhD in Educational Psychology (Research Methods and Evaluation) from UCLA, an MA in Experimental Psychology from CSU Fullerton, and a BA in Psychology from Chapman University (formerly Chapman College). Contact info: Email: [email protected]

Amy Driscoll

Dr. Amy Driscoll retired as Director of Teaching, Learning, and Assessment at California State University, Monterey Bay, and was most recently a Consulting Scholar with the Carnegie Foundation for the Advancement of Teaching. Previously, Amy was the Director of Community/University Partnerships at Portland State University, where she initiated the community-based learning and capstone aspects of the university's innovative curriculum. Dr. Driscoll has presented at AAHE, AAC&U, WASC, and National Assessment Institute conferences. She has also mentored more than 60 institutions in their development and implementation of institutional assessment and/or community engagement. Her most recent book is Developing Outcomes-based Assessment for Learner-centered Education: A Faculty Introduction, co-authored with Swarup Wood, a chemistry professor, and published by Stylus (2007).

Contact info: Email: [email protected]

6


WASC Retreat on Student Learning and Assessment Mentor Biographies

Lynda Harding

Ethelynda Harding is Professor Emerita of Biology at California State University, Fresno. Formerly director of Teaching, Learning, and Technology, she had primary responsibility for faculty professional development and academic technology. She helped implement Fresno State's outcomes assessment program, coordinated program review, and, as Accreditation Liaison Officer, led the most recent accreditation process. She has presented regionally and nationally on topics including outcomes assessment, faculty development, faculty personnel policies, instructional multimedia, and microbial ecology. Contact info: Email: [email protected]

John A. Hughes

Dr. John Hughes began his teaching career at The Master's College (TMC) in the fall of 1981 as the director of the College's Teacher Education department. Since 1995, he has served as the Vice President for Academic Affairs. Throughout his tenure at TMC he has been involved in various aspects of the institution's self-study and accreditation processes. He led the faculty through the development and implementation of the College's institutional assessment plan. He worked with TMC's information technology (IT) staff in the design and development of the Assessment Information Management System (AIMS), which is used to collect, store, summarize, and analyze assessment data relating to program-level student learning outcomes. For three years he was a member of the WASC Proposal Review Committee, and he has also had the opportunity to serve on a number of CPR and EER visit teams. Contact Info:

The Master’s College (661) 259-3540

Email: [email protected]

Robert Pacheco

Robert Pacheco is the Director of Research, Development and Planning and the Accreditation Liaison Officer at Barstow College. He is a member of the Executive Board of the RP Group of California. Previously, Bob was a tenured faculty member and a member of the California State Academic Senate Committee on SLOs and Accreditation. He was an At-Large Representative in the college's Academic Senate and is a member of Barstow College's Strategic Planning Committee, SLO Assessment Committee, and the Matriculation Committee. Contact info:

Director of Research, Development and Planning Barstow College Email: [email protected]

7


WASC Retreat on Student Learning and Assessment Mentor Biographies

Frederick P. Trapp

Fred Trapp is well known in California, having served on the steering committee for the California Assessment Initiative and provided numerous workshops on learning outcomes and assessment on behalf of the Research and Planning Group. He has also presented at numerous regional and national conferences, including the Assessment Forum sponsored by the American Association for Higher Education. Until his retirement in December 2007, Fred was employed for 26 years as the Administrative Dean for Institutional Research and Academic Services at the Long Beach Community College District. In that role he provided campus leadership for the assessment of student learning outcomes from the curriculum design, instructional delivery, and research perspectives. He currently serves as an evaluator for the Accrediting Commission for Community and Junior Colleges (ACCJC) within the Western Association of Schools and Colleges and is the Senior Associate with the Cambridge West Partnership, LLC.

Contact info: Fred Trapp 5739 E. Hanbury Street, Long Beach, CA 90808-2049

562 429-6996 (home) Email: [email protected]

Gary Williams

Dr. Gary J. Williams is the Instructional Assessment Specialist at Crafton Hills College in Yucaipa, California. He has 20 years of experience in higher education at a variety of institutions, large and small, public and private. His range of experience spans student services, residence life, international student advising, first-year experience, college success, student engagement, student learning outcomes, and instructional assessment. He currently serves on the RP Group & Academic Senate for California Community Colleges (ASCCC) SLO Assessment Cooperative Committee, and he has served on the ASCCC Accreditation and Student Learning Outcomes Committee, as well as the Basic Skills Initiative Program Committee for the California Community Colleges. Dr. Williams earned his Doctorate in Education (Ed.D.) from UCLA with a focus on assessment, organizational change, and organizational culture. He also holds a Master's degree in Educational Psychology from Marist College in New York, and a Bachelor of Arts in English Literature from Bates College in Maine. He lives in Riverside, California, with his wife Kyoko and two children, Harrison and Mako. His hobbies include Taekwondo and playing the Okinawan shamisen. Contact info:

Gary J. Williams, Ed.D. Instructional Assessment Specialist Crafton Hills College 11711 Sand Canyon Road Yucaipa, CA 92399 Tel: (909) 389-3567

Email: [email protected]

8


WASC Retreat on Student Learning and Assessment Mentor Biographies

Swarup Wood

Swarup Wood is an Associate Professor of Chemistry for an economics- and policy-infused environmental science degree program at California State University at Monterey Bay. He is chair of CSUMB's general education faculty learning communities, and for the past eight years Swarup has served as CSUMB Assessment Fellow, working closely with CSUMB's Center for Teaching, Learning, and Assessment. He has given many workshops and oral presentations nationally (AAHE and AAC&U) on how to develop learning outcomes, criteria, and standards, and on his research into what faculty learn from their collaborative review of student work. His recent book with Amy Driscoll, Developing Outcomes-based Assessment for Learner-centered Education, shares much of CSUMB's experience as the campus developed and refined learning outcomes, developed criteria and standards for learning outcomes, and conducted collaborative faculty assessment of student work. Swarup is currently very involved in CSUMB's work towards reaccreditation. Contact info:

Swarup Wood, Ph.D., Professor of Chemistry California State University at Monterey Bay Office Phone 831-582-3926 Email: [email protected]

Barbara Wright

Barbara Wright is an Associate Director at the Senior Commission of the Western Association of Schools and Colleges, where she leads educational programming efforts in addition to working with individual campuses. She served for over 25 years as a faculty member in German at the University of Connecticut, before retiring in 2001. Although her graduate training was in German language and literature, her interests expanded over the years to include language acquisition, women's studies, curricular reform, general education, and assessment. From 1988 to 1990 she directed a FIPSE-funded project to assess a new general education curriculum at UConn, and from 1990 to 1992 she served as director of the American Association for Higher Education's Assessment Forum. From 1995 to 2001 she was a member of the New England Association of Schools and Colleges' Commission on Institutions of Higher Education, and she has participated in team visits for several regional accreditors. Barbara would be pleased to consult with campus teams on all aspects of assessment, from basic principles and key questions through choice of methods, evidence gathering, interpretation, and use of results for program improvement. She is especially interested in qualitative approaches to the assessment of general education's more challenging, seemingly ineffable goals. She believes that faculty are more willing to engage in assessment when assessment practices correspond to their highest ambitions for students’ intellectual and personal development. She received her PhD from Berkeley. Contact Info:

Barbara D. Wright, Associate Director Western Association of Schools and Colleges Email: [email protected]

9


ATTENDEE

DIRECTORY

10


Nicola Acutt Associate Dean of programs Presidio Graduate School [email protected]

Eddie Afana Acting Dean, Research and Planning Los Angeles Trade Technical College [email protected]

Mohammad Amin Professor National University [email protected]

Kim Anderson SLO Coordinator Long Beach City College [email protected]

Eileen Apperson SLO Coordinator/Program Review Chair Reedley College [email protected]

Scott Ashmon Assistant Professor of Biblical Languages Concordia University [email protected]

Elizabeth Atkinson Associate Dean of Faculty Linfield College [email protected]

Michael Barber Prof. of Theology John Paul the Great Catholic University [email protected]

William Barrick Professor The Master's Seminary [email protected]

Debrea Bean Associate Provost National University [email protected]

Ron Benton Asst. VP, Administrative Services Colorado Christian University [email protected]

Janna Bersi Associate VP, Academic Resource Mngt & Planning CSU Dominguez Hills [email protected]

Camilla Betwell Bookstore Manager College of the Marshall Islands [email protected]

Roberta A. Bigelow Associate Dean, CLA Willamette University [email protected]

Andreea Boboc Assistant Professor University of the Pacific [email protected]

Joan Bouillon Dean of Academic Affairs The Art Institute of California - Sacramento [email protected]

Marie Boyd Curriculum/SLO Coordinator Chaffey College [email protected]

La Shawn Brinson Professor LA Southwest College [email protected]

Devon Brooks Associate Dean for Faculty Affairs USC School of Social Work [email protected]

Ronald W. Brown Interim Assoc. Vice Chancellor Lone Star College System [email protected]

Robert Burns Department Chair Diablo Valley College [email protected]

Irv Busenitz Academic Vice President The Master's Seminary [email protected]

Ana Luisa Bustamante Dept. Chair Phillips Graduate Institute [email protected]

Deborah Buttitta Dept. Chair Phillips Graduate Institute [email protected]

Angela Caballero de Cordero Noncredit Matriculation Coordinator/Counselor Allan Hancock College [email protected]

Anthony Cadavid SLO Facilitator East Los Angeles College [email protected]

Stacey Caillier Director, Teacher Leadership Program High Tech High Graduate School of Education [email protected]

Thomas Camacho Administrative Coordinator USC School of Social Work [email protected]

Jomi Monica Capelle Student Services Support College of the Marshall Islands [email protected]

Daniel Cardenas President The Art Institute of California Sunnyvale [email protected]

Moya Carter Associate Dean, Student Affairs Pitzer College [email protected]

Fred Chapel Core Faculty Antioch University Los Angeles [email protected]

Byron En-pei Chung President The Art Institute of California-San Francisco [email protected]

Catherine Collins Professor of Rhetoric & Media Studies Willamette University [email protected]

Maria Dolores Costa Director, Faculty Development California State University, Los Angeles [email protected]

Debi Gerger West Coast University [email protected]

David Debrum Safety & Security Director College of the Marshall Islands [email protected]

Sarah M. Dennison AVP Assessment/Institutional Effectiveness Education Management Corporation [email protected]

Nancy Deutsch Staff Development Coordinator Cypress College [email protected]

Jose Dial Dean of Academic Affairs College of the Marshall Islands [email protected]

Dwight Doering Professor of Education Concordia University [email protected]

Qingwen Dong Professor of Communication University of the Pacific [email protected]

Charles Dunn Associate Professor of Mathematics Linfield College [email protected]

Workshop Attendee Directory (RegOnline report, 16-Sep-2009)
Event #757797 (24-Sep-09), Status: Active: WASC Workshop: Retreat on Student Learning and Assessment, Level I
Record Count: 176
Entries list full name (first middle last), job title, company/organization, and email.

11


Helen Easterling Williams Dean Azusa Pacific University [email protected]

Paul Eastup Media & Arts Div Chair Marymount College [email protected]

Antonia Ecung Dean, Academic Affairs Porterville College [email protected]

Siobhan Fleming Associate Vice Chancellor-Research & Institutional Effectiveness Lone Star College System [email protected]

Patricia Flood SLO Coordinator Los Angeles Mission College [email protected]

Judy Foster Institutional Effectiveness Coordinator Diablo Valley College [email protected]

Michelle Fowles Dean, Research and Planning Los Angeles Valley College [email protected]

William Franklin AVP Student Success CSU Dominguez Hills [email protected]

Barbara Fuller Dept. Chair Phillips Graduate Institute [email protected]

Jose Alfredo Gallegos Assistant Research Analyst East Los Angeles College [email protected]

Leslie Gargiulo Academic Dean Allied American University [email protected]

Joanne Gartner Director of Curriculum LA College International [email protected]

Irene Girton Associate Dean for Arts & Humanities Occidental College [email protected]

Shelley Glickstein Dean of Academic Affairs The Art Institute of California--LA [email protected]

Roger Gomez President The Art Institute of California - Sacramento [email protected]

George F Gonzalez Director of Research & Institutional Effectiveness San Jacinto College [email protected]

Randy Grant Professor of Economics Linfield College [email protected]

Allison Guerin Academic Operations Manager Presidio Graduate School [email protected]

Bradley Hale Associate Professor Azusa Pacific University [email protected]

La Vonne Hamilton Institutional Research LA Southwest College [email protected]

Brian Timothy Harlan Senior Director, Institutional Research & Assessment Group Occidental College [email protected]

Martin Joseph Harold Director of Admissions John Paul the Great Catholic University [email protected]

Michelle Hawley Faculty Director of Service Learning and Learning Communities California State University, Los Angeles [email protected]

Errin Heyman West Coast University [email protected]

Cherron R Hoppes Dean, Undergraduate Programs Golden Gate University [email protected]

Heather Hubbert Assistant Dean of Students California Baptist University [email protected]

Behzad Izadi Professor Cypress College [email protected]

Isabel Izquierdo Instructor Diablo Valley College [email protected]

Herschel Jackson Director, Student Life and Leadership Estrella Mountain Community College [email protected]

Jeremiah Jackson Prof. of Business John Paul the Great Catholic University [email protected]

Karen Jackson VP Administration Phillips Graduate Institute [email protected]

Veronica Jaramillo SLO Coordinator East Los Angeles College [email protected]

Patrick Jefferson Dean LA Southwest College [email protected]

Young-Jun Jeon Dean of Business Affairs Shepherd University [email protected]

Michelle R. Johnson Institutional Research Coordinator Reedley College [email protected]

Angela Jones Associate Dean of Academic Affairs The Art Institute of California-San Francisco [email protected]

Phillip Jones-Thomas Professor LA Southwest College [email protected]

K. Jimmy Juge Assistant Professor University of the Pacific [email protected]

Miriam Kahan West Coast University [email protected]

Susan Keller Professor Western State University [email protected]

Shalom Kim Dean of Planning and Assessment Shepherd University [email protected]

Erin King-West Faculty Phillips Graduate Institute [email protected]

Lisa Anne Kramer Director of Assessment and Evaluation Golden Gate University [email protected]


12


Lucinda Kramer Associate Professor National University [email protected]

Deonne Kunkel Instructor, Eng. Dept. Chabot College [email protected]

Susan Lamb Vice President of Office of Instruction Diablo Valley College [email protected]

Bryan Lamkin Professor Azusa Pacific University [email protected]

Lora Lane SLO Assessment Coordinator Los Angeles Harbor College [email protected]

Melinda R Lester Dean of Academic Affairs The Art Institute of California - Orange County [email protected]

Muriel Lopez Wagner Director of Institutional Research Pitzer College [email protected]

Marilyn K. Maine Department Chair Los Angeles Trade Technical College [email protected]

Gary B. Martin SLO Coordinator Cosumnes River College [email protected]

Randy Martinez Professor Cypress College [email protected]

Leslyn J. McCallum Professor/SLO Coordinator San Jose City College [email protected]

Cathie McClellan Associate Professor University of the Pacific [email protected]

Scott McClintock Assistant Professor National University [email protected]

J. Cynthia McDermott Chair Antioch University Los Angeles [email protected]

Ryan McIlhenny Professor Providence Christian College [email protected]

Caren Meghreblian Dean of Academic Affairs The Art Institute of California-San Francisco [email protected]

Holly Menzies Professor California State University, Los Angeles [email protected]

Patty Meyer Faculty American Film Institute [email protected]

Dianne Moore VP of Nursing Operations West Coast University [email protected]

Jason Moore Executive Vice President/Provost Pioneer University [email protected]

Kim P Moore President Pioneer University [email protected]

Susan Mun Institutional Researcher San Diego Mesa College [email protected]

Janice Novak Business Instructor Chabot College [email protected]

Dawn Nowacki Professor of Political Science Linfield College [email protected]

Karen Nowak Dean of Academic Affairs The Art Institute of California - Hollywood [email protected]

Allison Ohle Chief of Staff High Tech High Graduate School of Education [email protected]

Charles E. Osiris Dean, Student Services/Counseling & Matriculation Allan Hancock College [email protected]

Mona Panchal SLO Facilitator East Los Angeles College [email protected]

Stephen Parmelee Assistant Professor of English Pepperdine University [email protected]

Mark David Parsons Assistant Professor Claremont School of Theology [email protected]

Parviz Partow-Navid Professor and Director of Student Services California State University, Los Angeles [email protected]

Deborah Rena Paulsen Chair, Arts/Media/Humanities Los Angeles Mission College [email protected]

Linda Perez Information Management Liaison Business Analyst San Jacinto College [email protected]

Joe Petricca Executive Vice Dean American Film Institute [email protected]

Yohan Pyeon Dean of Academic Affairs Shepherd University [email protected]

Susan Regier Div. Chair Language Arts Porterville College [email protected]

Elena Reigadas Psychology Professor Los Angeles Harbor College [email protected]

Gregory J. Riley Professor Claremont School of Theology [email protected]

Jelena N. Ristic Assistant Dean, Undergraduate Programs Golden Gate University [email protected]

Lori Roberts Professor Western State University [email protected]

Troy Roland President National Polytechnic College of Science [email protected]

Constance Rothmund Dean of Academics National Polytechnic College of Science [email protected]

Carlos Royal Social Sciences Dept Marymount College [email protected]


13


Amanda Ryan-Romo SLO Facilitator East Los Angeles College [email protected]

Joe Safdie SLO Coordinator San Diego Mesa College [email protected]

Ramona Santiesteban Director, Disability Resources Estrella Mountain Community College [email protected]

Vasemaca Savu Instructor College of the Marshall Islands [email protected]

Jon F. Schamber Director of Educational Effectiveness & Assessment University of the Pacific [email protected]

Alan Scher Faculty Phillips Graduate Institute [email protected]

Anita Schutz Instructor College of the Marshall Islands [email protected]

Fred Scott Solutions Consultant LiveText [email protected]

Michael Semenoff Institutional Research, Director Marymount College [email protected]

Austin Shepard Director, Academic Enrichment Programs Estrella Mountain Community College [email protected]

Brian Simpson Associate Professor National University [email protected]

Katrina Sitar Assistant to the Dean of Faculty Pitzer College [email protected]

Joe Slowensky Director of Assessment and Strategic Curricular Initiatives Chapman University [email protected]

Keith Smith Assistant Professor University of the Pacific [email protected]

Laura Soloff President The Art Institute of California--LA [email protected]

Carole Splendore Learning Assessment Coordinator Chabot College [email protected]

Rebecca Stein SLO Coordinator Los Angeles Valley College [email protected]

Rosalinda Sumaoang Instructor-Developmental Ed College of the Marshall Islands [email protected]

Duncan Sutton Dir. Music/Worship Ministries The Salvation Army, CFOT [email protected]

Wayne Tikkanen Faculty Director of General Education California State University, Los Angeles [email protected]

Pogban Toure Professor LA Southwest College [email protected]

Elizabeth Trebow Dept. Chair and ALO Phillips Graduate Institute [email protected]

Bill Tsatsoulis GVP - Western Region Education Management Corporation [email protected]

Paulina Van Assessment Director Samuel Merritt University [email protected]

Dan Van Voorhis Assistant Professor of History Concordia University [email protected]

Obed Vazquez Instructor Diablo Valley College [email protected]

Reuben Veliz Business/Economics Dept Marymount College [email protected]

Tom Vessella SLO Coordinator Los Angeles Trade-Technical College [email protected]

Tom Vitzelio SLO Coordinator/Instructional Specialist Chaffey College [email protected]

Michael Vlach Professor The Master's Seminary [email protected]

Nancy Wada-McKee Asst. Vice President, Student Affairs California State University, Los Angeles [email protected]

Dan Walden Dean LA Southwest College [email protected]

Kerry Walk Associate Dean of Faculty for Academic Administration Pitzer College [email protected]

Tracy Ward Dean of Academic Services California Baptist University [email protected]

Rachel Westlake Division Dean of Math & Computer Science Diablo Valley College [email protected]

Lisa Wheland Director of Institutional Effectiveness The Art Institute of California - Orange County [email protected]

Ted Wieden Senior Dean of Curriculum and Instruction Diablo Valley College [email protected]

June Wiley Vice President of Academic Affairs LA College International [email protected]

David Willoughby Faculty Phillips Graduate Institute [email protected]

Joan Woosley University Registrar California State University, Los Angeles [email protected]

Kristine Wright Professor LA Southwest College [email protected]

Sally Wu Behavioral Sciences Dept Marymount College [email protected]

Mina Yavari Professor Allan Hancock College [email protected]


14


Glenn Yoshida SLO Coordinator LA Southwest College [email protected]

Randy Zarn AVP Student Life CSU Dominguez Hills [email protected]

George Zottos Outcomes Assessment Specialist Riverside Community College/Moreno Valley Campus [email protected]

Heather Zuber Professor Western State University [email protected]


15


Plenary

Assessment in Learning-

centered Institutions

Mary Allen

16


Assessment in Learning-Centered Institutions

Mary J. Allen, [email protected], September 2009

Why so much emphasis on assessment?

• Accreditation Expectations
• A Learning-Centered Focus

Academic Program Goals

Students learn:
• The concepts, theories, research findings, techniques, and values of the discipline
• How to integrate what they learn to solve complex, real-world problems
• An array of core learning outcomes, such as collaboration, communication, critical thinking, information literacy, and leadership skills

Curriculum
• Cohesive program with systematically-created opportunities to synthesize, practice, and develop increasingly complex ideas, skills, and values—to develop deep and lasting learning

How Students Learn
• Students construct knowledge by integrating new learning into what they already know.
• Feedback guides student improvement.
• Students can learn, clarify ideas, and develop alternative perspectives through reflection and interpersonal interactions.

Course Structure
• Students engage in learning experiences to master course learning outcomes.
• Grades indicate mastery of course learning outcomes.

Pedagogy
• Based on engagement of students
• Help students be "intentional learners" (AAC&U; greaterexpectations.org)

Course Delivery
Faculty use a repertoire of teaching techniques to meet the needs of diverse students and to promote different types of learning outcomes, such as
• Active learning
• Collaborative and cooperative learning
• Community-service learning
• Homework and laboratory assignments
• Lectures and discussion
• Online learning
• Problem-based learning

Faculty Instructional Role
• Design learning environments to meet student and program needs
• Share interests and enthusiasm with students
• Provide students formative feedback on their progress; grade student work
• Mentor student development in and out of the classroom
• Assess class sessions, courses, and programs to improve their effectiveness

Assessment
• Faculty use classroom assessment to improve day-to-day learning in courses (Angelo & Cross, Classroom Assessment, Jossey-Bass, 1993).
• Faculty use program assessment to improve learning throughout the curriculum.
• Faculty and others assess their impact to improve institutional effectiveness.

Campus
• Co-curriculum and support services are aligned to support learning.
• Program reviews and campus decision-making are conducted within a "culture of evidence."
• Recognition and reward systems value contributions to learning and encourage flexibility to uncover new ways to encourage/support learning.
• Routine campus conversations on learning

17


The Cohesive Curriculum

• Coherence
• Synthesizing Experiences
• Ongoing Practice of Learned Skills
• Systematically Created Opportunities to Develop Increasing Sophistication and Apply What Is Learned

Curriculum Map

Course (entries for Outcomes 1–5):
100  I, D  I
101  I  D
102  D  D  D
103  D
200  D  D
229  D
230  D, M  M
280
290  M  D, M  M

I = Introduced, D = Developed & Practiced with Feedback, M = Demonstrated at the Mastery Level Appropriate for Graduation
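A curriculum map like the one above is also easy to keep in machine-readable form so that gaps can be spotted automatically. The sketch below is illustrative only and is not part of the handout; the course numbers and the particular I/D/M entries are invented placeholders rather than the mapping shown above.

```python
# Illustrative sketch (not part of the handout): a curriculum map kept as a
# simple data structure, plus a check that every program outcome is
# Introduced (I), Developed (D), and demonstrated at the Mastery level (M)
# somewhere in the curriculum. Course numbers and entries are invented.

curriculum_map = {
    "100": {"Outcome 1": ["I", "D"], "Outcome 2": ["I"]},
    "200": {"Outcome 1": ["D"], "Outcome 3": ["I", "D"]},
    "300": {"Outcome 2": ["D"], "Outcome 3": ["D"]},
    "490": {"Outcome 1": ["M"], "Outcome 2": ["M"], "Outcome 3": ["M"]},  # capstone
}

outcomes = ["Outcome 1", "Outcome 2", "Outcome 3"]

def coverage(curriculum, outcome):
    """Collect the I/D/M codes a given outcome receives across all courses."""
    codes = set()
    for course_codes in curriculum.values():
        codes.update(course_codes.get(outcome, []))
    return codes

for outcome in outcomes:
    missing = {"I", "D", "M"} - coverage(curriculum_map, outcome)
    if missing:
        print(f"{outcome}: no course provides level(s) {sorted(missing)}")
    else:
        print(f"{outcome}: introduced, developed, and mastered somewhere in the curriculum")
```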

Academic Program Assessment

Program assessment is an on-going process designed to monitor and improve student learning. Faculty:
• develop explicit statements of what students should learn.
• verify that the program is designed to foster this learning.
• develop a meaningful, manageable, sustainable assessment plan.
• collect empirical data that indicate student attainment.
• assess the data and reach a conclusion (faculty are satisfied or disappointed with the extent of student learning).
• use these data to improve student learning.

Elements of an Assessment Plan

• What assessment evidence will be collected?
• When and how often will it be done?
• Who will assess and reflect on the results?
• How will results, implications, and related changes be documented?
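A unit that tracks its plans electronically can capture these four elements in a small structured record. The sketch below is illustrative only and is not part of the handout: the values echo the Sample Assessment Plan that appears later in these materials, and the documentation field is an assumed addition.

```python
# Illustrative sketch (not part of the handout): one assessment-plan entry that
# answers the four questions above. Values echo the Sample Assessment Plan
# later in these materials; the "documentation" field is an assumed addition.
assessment_plan_entry = {
    "outcome": "Outcome 3: understanding of research methods in the discipline",
    "evidence": ["capstone research projects scored with a rubric",
                 "questions embedded in capstone final exams",
                 "focus groups on student perceptions"],
    "schedule": "every fourth year, starting in 2010/11",
    "reviewers": "ad hoc faculty committee, with Assessment Center staff",
    "documentation": "summary report and recommended changes filed with program review",
}

def describe(entry):
    """Print a one-paragraph summary of a plan entry, e.g., for a program handbook."""
    print(f"{entry['outcome']} will be assessed {entry['schedule']} using "
          f"{'; '.join(entry['evidence'])}. Results are reviewed by the "
          f"{entry['reviewers']} and documented as follows: {entry['documentation']}.")

describe(assessment_plan_entry)
```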

18


Quotations from the Wise and Experienced

1. “Assessment is an on-going process. We don’t ‘get it done’; we ‘get on with it.’” Outcomes Assessment, Miami of Ohio

2. “Three cardinal rules for evaluation or assessment: ‘Nobody wants to be evaluated, nobody wants to be evaluated, and finally, nobody wants to be evaluated.’”

Frank Newman

3. “Much of the literature on assessment suggests, and the Task Force agrees, that an institution will benefit from assessment only if faculty and cocurricular professionals see a use for the results and if they take the lead in formulating questions which assessment can help answer.”

Willamette Task Force on Outcomes Assessment

4. “Self-assessment is not the goal. Self-adjustment is the goal. That’s what makes Tiger Woods and Michael Jordan great. That’s what makes Socrates so impressive. That’s what our best students and teachers do. They self-adjust, with minimal effort and optimal effect.” Grant Wiggins

5. “Assessment per se guarantees nothing by way of improvement, no more than a thermometer cures a fever.” T. J. Marchese

6. “While in the process of developing new outcomes/objectives, the department or administrative unit can easily identify assessment procedures that will be so time- and resource-consuming that they will become an end in themselves and not a means of determining whether a specific outcome/objective has been achieved. If this occurs, the long-term result is likely to be abandonment of the process.”

James O. Nichols

7. “. . . institutional evaluation should use objective data where available and purposeful but make no apologies for using subjective data. Or, it is better to be generally right than precisely wrong.” R. L. Miller

8. “The most important thing about assessment is that it promotes dialogue among faculty.” Mary Senter

Most of us don’t have to assess every outcome in every student every year!

19


Some Basic Vocabulary

• Direct vs. Indirect Assessment

• Value-Added vs. Absolute Learning Outcomes

• Embedded Assessment

• Formative vs. Summative Assessment

• Authentic Assessment

• Triangulation

If you have absolute outcomes, your assessment plan should emphasize direct, authentic, summative assessment, with triangulation.

20



Sample Assessment Plan

Outcome 3, dealing with the understanding of research methods in the discipline, will be assessed every fourth year starting in 2010/11 by assessing the quality of research skills demonstrated in capstone (taken right before graduation) research projects and by embedding relevant questions in the capstone course's final exams. An ad hoc faculty committee will develop and score the test items, and they will develop and apply a rubric to analyze the capstone projects. Focus groups on student perceptions concerning their understanding of research methods will be conducted by Assessment Center staff, who will work in consultation with the ad hoc committee.

Does this plan involve:
• Direct assessment?
• Indirect assessment?
• Authentic assessment?
• Formative assessment?
• Summative assessment?
• Triangulation?

Closing the Loop

Sometimes results support the status quo. Celebrate! If results suggest the need for change, you might consider these four types of change:
• Pedagogy—e.g., changing course assignments; providing better formative feedback to students; use of more active learning strategies to motivate and engage students
• Curriculum—e.g., adding a second required speech course, designating writing-intensive courses, changing prerequisites
• Student support—e.g., improving tutoring services; adding on-line, self-study materials; developing specialized support by library or writing center staff; improving advising to ensure the courses are better sequenced
• Faculty support—e.g., providing a writing-across-the-curriculum workshop; campus support for TAs or specialized tutors

21


Lecture / Discussion 1:

An Outcomes-based

Assessment Model for General Education

Amy Driscoll

22


An Outcomes-based Assessment Model for General Education

Amy Driscoll

WASC EDUCATIONAL SEMINAR Level I

September 24, 2009

Definitions

Assessment is the process of gathering information/data on student learning. General education generally describes basic or foundational knowledge/skills and attitudes that all undergraduates are required to have for graduation.

Possibilities: Purpose/Definition

"The purpose of assessment is to improve learning" (Angelo, 2000)

“Assessment is a dynamic pedagogy that extends, expands, enhances, and strengthens learning” (Driscoll, 2001)

23


Thinking about Assessment

Does assessment flow from the institution's mission and reflect the educational values?
Does assessment address questions that people really care about?
Does assessment help faculty fulfill their responsibilities to students, to the public?
Does assessment of general education describe students' readiness for other curriculum?

Aligning Mission with Goals for General Education:

Our central mission is to develop life-long learning skills, impart society’s cultural heritage, and educate and prepare for both the professions and advanced study.

Goals: life-long learning skills; cultural heritage

Aligning Institutional Values With Goals for General Education:

ESU has a commitment to academic and personal integrity.

Goals: Academic Integrity; Personal Integrity

24


Aligning Institutional Vision with Goals of General Education

"to enhance recognition of the value of higher education"
"…to enhance the intellectual, social, cultural, and economic qualities of urban life"

Goals: Valuing Higher Education; Urban Citizenship

Assessment Protocols for Learning-centered Assessment

GOAL OUTCOMES

Evidence

Criteria

Standards:
a) Exemplary Achievement
b) Satisfactory Achievement
c) Unsatisfactory Achievement

Goals

Broad descriptions

Categories of learning outcomes

End toward which efforts are directed

25


Examples of Goals

Critical Thinking

Citizenship in a Democracy (Grad. School of Education)

Teamwork and Collaboration (School of Community Health)

Ethics

Student Learning Outcomes

Refer to Results in Terms of Specific Student Learning, Development, and Performance (Braskamp and Braskamp, 1997)

Answer the Question – “What Do We Expect of Our Students?” (CSU Report 1989)

Describe Actual Skills, Understandings, Behaviors, Attitudes, Values Expected of Students

Examples of Outcomes

Math: Use arithmetical, algebraic, geometric, and statistical methods to solve problems.

Ethics: Identify and analyze real-world ethical problems or dilemmas and identify those affected by the dilemma.

Culture and Equity: Analyze and describe the concepts of power relations, equity, and social justice and find examples of each concept in U.S. society and other societies.

Teamwork: Listens to, acknowledges, and builds on the ideas of others.

26


Evidence

Student Work that Demonstrates Achievement of Outcomes (Assignments, Projects, Presentations, Papers, Responses to Questions, Etc.)
Designed for appropriate level of learning expectations (outcomes)
Opportunity for Different Ways of Demonstrating Learning

Examples of Evidence

Teamwork: Role play or case study; project or problem-solving assignment

Math: Mathematical and statistical projects and papers

Ethics: A written account; a multi-media presentation or display board; an audio tape

Criteria

Qualities Desired in Student Work (Evidence)

Represent Powerful Professional Judgment of Faculty

Guide Student Learning Efforts

Promote Lifelong Learning

Support Faculty in Making Objective Evaluations

27


Examples of Criteria

Math: Accuracy; Complexity; Clarity and Coherence

Ethics: Complexity (broad, multifaceted, interconnected); Conscious Awareness

Culture and Equity: Range of Cultures; Reflectivity and Integration

Teamwork: Respect; Flexibility

Standards

Describe Different Levels of Criteria

Describe Specific Indications of Criteria

Promote Understanding of Criteria

Support Faculty in Making Objective Evaluations

Examples of Standards

Math (Accuracy)
Satisfactory: Contains few errors, and those errors do not significantly undermine the quality of the work. Considers and uses data, models, tools or processes that reasonably and effectively address issues or problems.
Unsatisfactory: One or more errors significantly undermine the quality of the work. Uses data, models, tools or processes in inappropriate or ineffective ways.

Ethics (Complexity)
Standard for Excellent: Consistently views sophisticated and significant dilemmas and issues with a broad focus and from multiple perspectives.
Standard for Satisfactory: Usually views sophisticated and significant dilemmas and issues with a broad focus, but may sometimes use a more narrow focus and may use fewer perspectives.
Standard for Unsatisfactory: Mainly views issues and dilemmas in simple terms and usually does so with a limited focus and minimal perspectives.
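Standards written at this level of detail translate directly into rubric rows that can be recorded and tallied. The sketch below is an illustration and is not part of the handout: it encodes the Math (Accuracy) standard above as data and tallies a set of invented scores that a review team might assign to a sample of student papers.

```python
from collections import Counter

# Illustrative sketch (not part of the handout): the Math "Accuracy" standard
# above recorded as rubric levels, plus a tally of invented scores a review
# team might assign to a sample of student papers.
accuracy_levels = {
    "Satisfactory": ("Contains few errors, and those errors do not significantly "
                     "undermine the quality of the work; uses data, models, tools, "
                     "or processes that reasonably and effectively address the problem."),
    "Unsatisfactory": ("One or more errors significantly undermine the quality of the "
                       "work; uses data, models, tools, or processes in inappropriate "
                       "or ineffective ways."),
}

# Invented scores for nine sample papers.
scores = ["Satisfactory", "Satisfactory", "Unsatisfactory",
          "Satisfactory", "Unsatisfactory", "Satisfactory",
          "Satisfactory", "Satisfactory", "Unsatisfactory"]

tally = Counter(scores)
total = len(scores)
for level in accuracy_levels:
    print(f"{level}: {tally[level]}/{total} papers ({100 * tally[level] / total:.0f}%)")
```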

28


Assessment Protocols

GOAL OUTCOMES

Evidence

Criteria

Standards:
a) Exemplary Achievement
b) Satisfactory Achievement
c) Unsatisfactory Achievement

Assessment Sample

Educational Goal: Personal integrity

Outcomes:
- Students articulate an individual code of ethics and apply it to personal decisions of integrity
- Students describe and assume personal responsibility in collaborative endeavors, and respect and support the responsibilities of others

Personal Integrity

Evidence:
- Written code with discussion of two different life decisions based on the code
- Multimedia presentation
- Letter of application for professional position
- Dramatization of ethical issues

Criteria:
- Reflection
- Multiple perspectives

29


Personal Integrity

Standards:
- Excellence in Reflection: Consistently raises questions, checks assumptions, connects with previous experiences, acknowledges biases and values, and engages in self-assessment
- Excellence in Multiple Perspectives: Examines thinking and experiences of others, considers those affected by decisions, and considers diverse courses of action
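The Personal Integrity sample above follows the full protocol chain (goal, outcomes, evidence, criteria, standards), which can also be recorded as a single nested record for documentation or review. The sketch below is one possible encoding and is not part of the handout; only the Excellence level of each standard is filled in, as in the sample.

```python
# Illustrative sketch (not part of the handout): the Personal Integrity sample
# encoded as one nested record (Goal -> Outcomes -> Evidence -> Criteria -> Standards).
personal_integrity_protocol = {
    "goal": "Personal integrity",
    "outcomes": [
        "Articulate an individual code of ethics and apply it to personal decisions of integrity",
        "Describe and assume personal responsibility in collaborative endeavors, and respect "
        "and support the responsibilities of others",
    ],
    "evidence": [
        "Written code with discussion of two different life decisions based on the code",
        "Multimedia presentation",
        "Letter of application for professional position",
        "Dramatization of ethical issues",
    ],
    "criteria": ["Reflection", "Multiple perspectives"],
    "standards": {
        "Reflection": {
            "Excellence": "Consistently raises questions, checks assumptions, connects with "
                          "previous experiences, acknowledges biases and values, and engages "
                          "in self-assessment",
        },
        "Multiple perspectives": {
            "Excellence": "Examines thinking and experiences of others, considers those "
                          "affected by decisions, and considers diverse courses of action",
        },
    },
}

# Quick consistency check: every criterion should have at least one standard defined.
for criterion in personal_integrity_protocol["criteria"]:
    levels = personal_integrity_protocol["standards"].get(criterion, {})
    print(f"{criterion}: {len(levels)} standard level(s) defined")
```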

Assessing Student Learning: Course, Program and Institutional Levels

1. Preparation: Determine purpose(s) and definition of assessment; examine mission and values

2. Design assessment: Articulate goals; develop clear outcomes, evidence, criteria, and standards

3. Alignment of curriculum and pedagogy with learning outcomes

4. Make outcomes, evidence, criteria, and standards "public and visible" (syllabi, programs, brochures)

5. Collect evidence of student achievement

6. Review and analyze student evidence

7. Revise outcomes and criteria; improve pedagogy and curriculum for learner success

Step 3: Aligning Curriculum and Pedagogy with Learning Outcomes

Outcomes and Criteria as Planning Focus
Faculty Alignment Grids
Learner Grids

30


Step 4: Making Learning Outcomes ---

Public and Visible

Relevant and Meaningful

Motivating and Supportive of Learning

Step 5: Collect Evidence of Student Achievement

Collect representative samples (3 Exemplary, 3 Satisfactory, 3 Unsatisfactory)

Step 6: Review and Analyze Evidence

Read holistically to determine whether outcomes are achieved (reliability).
Several readings to identify examples of criteria (validity).
Final reading for insights about pedagogy, class structure and environment, and learning supports.

31


Step 7: Process Results: Improving Learning

Documentation of student achievement of outcomes
Identification of curricular gaps/foci and pedagogical weaknesses/strengths
Clarification of outcomes, criteria & standards
Redesign of evidence

Expanding Assessment of General Education

Graduate exit surveys

Alumnae surveys

Employer surveys

SUMMARY

Outcomes-based assessment for general education can provide a foundation for integrating institutional goals in the major programs of study. The assessment protocols provide a foundation for students to become successful learners. Faculty who collaborate to develop general education assessment protocols become more intentional with their teaching and learning plans.


Lecture / Discussion 2:

Assessment for Student Affairs Staff and Other Campus Professionals

Mary Allen


Assessment for Student Affairs Staff and Other Campus Professionals

Mary J. Allen, [email protected], September 2009

Assessment

The assessment of student affairs and administrative units is an on-going process designed to monitor and improve the effectiveness of the unit being assessed. Professionals in each unit:
• Develop explicit statements of the unit's mission and objectives.
• Verify that the unit's operations are organized to foster the objectives.
• Collect empirical data that indicate how well objectives are being met.
• Use these data to improve the unit's effectiveness ("close the loop").

Why so much emphasis on assessment?

• Accreditation expectations for institutional effectiveness
• Assessment establishes a culture of evidence for supporting effective, reflective, self-monitoring, self-correcting institutions.

Articulating Objectives (Nichols & Nichols)
Processes (e.g., travel claims or applications are processed efficiently and equitably)
Learning Outcomes (e.g., students who receive training can write an effective resume or can use the campus email system; staff who receive training can effectively use campus accounting procedures; students who are served by the Counseling Center report fewer plans to withdraw from campus; students who participate in this event can describe the effects of alcohol on drivers; employees are aware of campus health and safety procedures)
Satisfaction Indicators (people supported by the unit report satisfaction with the service, e.g., students report satisfaction with Health Center services)

Mission and Objectives
Mission: a holistic vision of the values and philosophy of the unit.
Objectives: desired processes, learning outcomes, and satisfaction ratings. Each should be tied to an assessment technique (e.g., a survey) with an associated standard (e.g., an average rating of at least 3.5 on a 5-point rating scale).

The unit's mission should (Nichols & Nichols, p. 35):
• Describe the purpose of the unit. What services are provided? To whom?
• Be brief (less than one page).
• Be aligned with the campus mission.


• Be known by the staff.
• Be used by the staff to make decisions and set priorities.

Examples of Mission/Vision/Goals Statements

Example 1: Student Affairs Division (Oregon State University. Retrieved 9/21/07 from http://oregonstate.edu/studentaffairs/missionvision.html)
Our Mission. The Division of Student Affairs contributes to and facilitates the success of students and Oregon State University.
Our Vision. Faculty and staff provide leadership for the positive development of community at Oregon State University. We collaborate with others to enhance the educational environment and support the teaching and learning process. We value and respect the individual and believe that sharing knowledge changes lives.

Example 2: Library Mission Statement (Nichols & Nichols, p. 36)
"The university educates students to assume leadership roles in the state, nation, and world through its nationally recognized programs of undergraduate, graduate, and professional study. Its fundamental purpose is the creation and dissemination of knowledge. The university libraries support this mission. Specifically, the university libraries strive to meet the information needs of the academy, its students, faculty and staff, by employing contemporary knowledge management techniques to develop collections, provide access to information sources, and instruct individuals in contemporary bibliographic methodologies."

Example 3: Accounting Office Mission Statement (Nichols & Nichols, p. 36)
"The Accounting Office seeks (1) to provide administrators with accurate and timely financial data to assist them in the management of the institution's resources, and (2) to ensure that financial records are maintained in accordance with generally accepted accounting principles and guidelines as established by State and Federal Agencies."

Example 4. Student Affairs Goals: Ferris State University (2003; Retrieved 9/21/07 from http://www.ferris.edu/htmls/administration/studentaffairs/assessment/03SAReport.pdf)
"The primary goal of Student Affairs is to provide activities, programs, and facilities that support the personal development, educational progress and career goals of all students.
• Create and foster an environment in which diverse talents and backgrounds are recognized while providing unifying common experiences.
• Encourage understanding and appreciation for others.
• Establish an environment that is safe, secure, and helps students to maximize their mental and physical health.
• Support and advance institutional values by developing and enforcing behavioral standards for students.
• Foster a sense of responsibility for personal and community safety through education which reinforces personal accountability for one's actions.


• Help students become informed decision-makers in order to reduce alcohol and other drug abuse.
• Build respect for the value of community and positive group affiliation.
• Serve as educational resource personnel to others in the University community.
• Continue communication and collaboration with faculty, staff, and administrators campus-wide to meet the educational goals of the University.
• Engage in assessment activities that evaluate the effectiveness of all our programs, departments, and the Division as a whole on an ongoing basis.
• Provide quality service, which includes personal, timely attention to our customers.
• Effectively recruit and retain students.
• Assist students in securing financial resources to help pay for their educational costs.
• Provide accurate and timely institutional, State, and Federal reports as required."

Effective outcomes/objectives should be:
• Consistent with the unit and campus mission.
• Realistic.
• Few in number.
• Assessable.
• Used by staff to set priorities and make decisions.

Examples:
1. Accurate, real-time class enrollment data are continuously available to faculty and administrators.
2. Students who attend a Career Orientation Workshop can prepare a resume and use our on-line bulletin board to monitor potential employment opportunities.
3. All students attending orientation will receive email accounts and will know how to use the email system to communicate with students, faculty, and staff.
4. Interlibrary loan materials will be delivered within eight working days.
5. Students report satisfaction with Health Center Services; ratings will average at least 3.80 on a 5-point rating scale.
6. On average, at least 100 students will attend each cultural event sponsored by the ASI.
7. Faculty who attend Blackboard workshops will be able to create and update online course materials.
8. Student government meetings follow procedures defined in the Handbook.
9. Staff who are certified to use the enrollment management system can independently add and delete courses, place enrollment restrictions on courses, and monitor course enrollments.
10. Students who use the Writing Center at least three times in a semester improve writing skills.
11. Students who participate in the diversity retreat will report increased understanding of people of racial and ethnic backgrounds different from their own.
12. Students who attend New Student Orientation can describe General Education requirements and an array of services available to them on campus.


Student Affairs Student Learning Outcomes at Southern Illinois University Edwardsville (Retrieved 9/21/07 from http://www.siue.edu/AQIP/goal1/AssessmentStudentAffairs.ppt#256)

1. Integrate classroom and out-of-classroom learning experiences.
2. Integrate values learned from prior experience with values learned at the University.
3. Attend activities, programs, and events not previously experienced prior to attending the University.
4. Demonstrate the value of diversity and community.
5. Contribute to at least one group for the purpose of developing projects, programs, relationships or performing volunteer service.
6. Seek the advice and counsel of peers, faculty, staff, and others.
7. Demonstrate the value of their own health and wellness and that of others.
8. Make decisions based on values and ethical principles.
9. Articulate personal and career goals.
10. Demonstrate communication skills and behaviors necessary for the work place.
11. Demonstrate a sense of curiosity and appreciation for lifelong learning.

First-Year Experience Courses Frequently Staffed by Faculty, Student Affairs Professionals, or Both

The First-Year Initiative Survey

The First-Year Initiative (FYI; http://www.webebi.com/University/FYI) benchmarking survey was piloted in 2001 and is designed to assess ten types of learning outcomes typically fostered in first-year experience seminars:
● Study strategies
● Academic/cognitive skills
● Critical thinking
● Connections with faculty
● Connections with peers
● Out-of-class engagement
● Knowledge of campus policies
● Knowledge of wellness/spirituality
● Management of time/priorities
● Knowledge of wellness (Swing, 2004, p. 119)

In addition, it collects demographic information (e.g., gender, age, living arrangements, alcohol use) and assesses campus satisfaction and some aspects of course delivery (e.g., effective readings, engaging pedagogy).

Examples of First-Year Experience Course Outcomes, Objectives, and Goals*

Bryant University. Bryant University's required first-year seminar, Foundations for Learning, is designed to "help students take responsibility for their education" by:
● Understanding the importance of being actively involved in the educational process
● Developing cognitive and metacognitive abilities
● Developing a fuller understanding of a range of learning and study strategies
● Learning how planning and prioritizing impact academic success
● Developing self-concept including an awareness of health and wellness issues
● Developing communication skills including those related to collaboration and leadership
● Engaging in scholarly activities such as group discussion, conducting research, and synthesizing materials
● Understanding the importance of respecting diversity as a member of the Bryant community and a citizen of the world (Hazard, 2005, p. 24)

Mount Mary College. The first-year seminar at Mount Mary College, Leadership for Social Justice, is strongly recommended to all new, traditional-aged students, and it has six primary objectives:
● To introduce students to Mount Mary's mission and the Mount Mary Women's Leadership Model
● To increase self-knowledge leading to an understanding of personal leadership styles
● To develop and increase skills and strategies for dealing with difficult issues and conflict
● To expand knowledge of local and global social justice issues
● To experience service-learning as a means of growing in leadership, self-understanding, and knowledge of social justice issues
● To develop reading, writing, and oral communication skills (End, 2005, pp. 97-98)

Northern Illinois University. Northern Illinois University's University 101, University Experience, course is an elective for first-semester freshmen, and it is designed to help students:
● Understand the challenges and expectations of college
● Develop strategies for academic success
● Adjust to the university community and become involved
● Communicate with faculty
● Learn to manage time and money
● Learn how to use technology and NIU's resources
● Live in a diverse community
● Prepare for a career (House, 2005, p. 104)

Olympic College. Olympic College offers General Studies 100, Strategies for Academic Success, a requirement for students requiring developmental English courses and an elective for other students. Students in this course learn:
● To demonstrate knowledge of the purposes, values, and expectations of higher education
● To demonstrate basic self-awareness and self-management
● To demonstrate academic skills of learning how to learn
● To write an educational/career plan
● To demonstrate knowledge of physical, social, and emotional wellness (Huston, 2005, p. 123)

Temple University. Faculty at Temple University teach a one-credit, elective course, Learning for the New Century, with four major goals:
● Enhance students' intellectual development and improve their study behaviors and skills
● Enhance students' social development and engagement in the campus community
● Promote collaborative learning and group work
● Allow students to practice technology applications and retrieval of information. (Laufgraben, 2005, p. 152)

*All FYE outcomes are taken verbatim from B. F. Tobolowsky, B. E. Cox, & M. T. Wagner (Eds.). (2005). Exploring the Evidence: Reporting Research on First-Year Seminars, Volume III (Monograph No. 42). Columbia, SC: University of South Carolina, National Resource Center for The First-Year Experience and Students in Transition.

Methods to Assess Objectives

Properties of Good Assessment Techniques

• Valid—directly reflects the objective being assessed
• Reliable—especially inter-rater reliability when subjective judgments are made (a simple agreement check is sketched after this list)
• Actionable—results help reviewers identify what's working well and what needs more attention
• Efficient and cost-effective in time and money
• Interesting—staff care about results and are willing to act on them
• Triangulation—multiple lines of evidence point to the same conclusion
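A quick, concrete way to look at the inter-rater reliability noted above is to compare how often two readers assign the same rubric score to the same piece of work. The sketch below is a minimal, hypothetical example (the scores and the two-rater setup are invented for illustration); it reports exact agreement and adjacent agreement (scores within one level), two simple indicators often examined before deeper reliability analysis.

```python
# Minimal sketch of checking inter-rater agreement on rubric scores.
# The scores below are hypothetical, for illustration only.
rater_a = [4, 3, 2, 4, 1, 3]
rater_b = [4, 3, 3, 4, 1, 2]

exact_matches = sum(a == b for a, b in zip(rater_a, rater_b))
within_one = sum(abs(a - b) <= 1 for a, b in zip(rater_a, rater_b))

n = len(rater_a)
print(f"Exact agreement: {exact_matches / n:.0%}")     # identical scores
print(f"Adjacent agreement: {within_one / n:.0%}")     # scores within one level
```

Low agreement usually signals that the rubric's criteria or standards need clarification, or that raters need a norming session before scoring continues.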

Frequently-Used Strategies (Nichols & Nichols)

1. Counts (e.g., number of students who eat in the cafeteria or the number of days to process an invoice)
2. Client satisfaction measures (e.g., ratings from surveys, interviews, and focus groups; broad-based and point-of-contact data may be collected)
3. External evaluation reports (e.g., Health Department review of the food service unit)
4. Learning Outcomes (e.g., quality of student resumes after a workshop at the Career Center)

Try to concentrate on direct, authentic assessment—to verify that learners can demonstrate what you want them to learn. Rubrics are useful for making subjective judgments about students’ learning. If you do indirect assessment of the achievement of learning outcomes (based on perceptions of learning), consider gap analysis (comparing importance and achievement ratings).
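Gap analysis of this kind is straightforward to automate once the ratings are tabulated. The sketch below is a minimal, hypothetical example (the outcome names, ratings, and the 3.5 standard are invented for illustration, not drawn from any campus data set): for each outcome it computes the mean importance rating, the mean achievement rating, and the gap between them, and flags outcomes whose achievement falls below a chosen standard. Outcomes rated highly important but with comparatively low achievement are natural priorities when closing the loop.

```python
# Minimal sketch of a gap analysis on indirect (self-report) assessment data.
# All outcome names, ratings, and the standard below are hypothetical.

def mean(values):
    return sum(values) / len(values)

# Paired ratings on a 5-point scale: how important each outcome is to respondents,
# and how well they feel they achieved it.
survey = {
    "Can prepare a resume": {"importance": [5, 5, 4, 5], "achievement": [4, 3, 4, 3]},
    "Can use campus email system": {"importance": [4, 3, 4, 4], "achievement": [5, 4, 4, 5]},
}

STANDARD = 3.5  # example standard: average achievement rating of at least 3.5

for outcome, ratings in survey.items():
    importance = mean(ratings["importance"])
    achievement = mean(ratings["achievement"])
    gap = importance - achievement  # large positive gap = high priority for improvement
    status = "meets" if achievement >= STANDARD else "below"
    print(f"{outcome}: importance {importance:.2f}, achievement {achievement:.2f}, "
          f"gap {gap:+.2f} ({status} standard)")
```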

Some Ways to Collect Assessment Data
• Co-curricular portfolio
• Essays in required composition courses
• Information literacy assignments (to assess library services)
• Assignments and exams in first-year seminars (homework, reflective papers, exam questions)
• Quick checks during programs (clickers or hand signals)
• Quizzes at the end of sessions (perhaps linked to discussion or used as pre/post quizzes)


• Community service learning reflective papers
• Systematic observations (e.g., observe Student Senate meetings to see if procedures are followed)
• Focus groups, interviews, surveys

Sometimes data are analyzed separately for subgroups of respondents, such as international students, athletes, evening students, or recently-hired employees, to verify that all campus segments have benefited from the unit's services.
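Disaggregating results by subgroup is usually just a grouped summary. A minimal sketch, assuming responses are stored as simple records with a subgroup label and a rating (the field names and data below are hypothetical):

```python
# Minimal sketch of disaggregating satisfaction ratings by respondent subgroup.
# Field names and data are hypothetical, for illustration only.
from collections import defaultdict
from statistics import mean

responses = [
    {"subgroup": "international students", "rating": 4},
    {"subgroup": "international students", "rating": 3},
    {"subgroup": "evening students", "rating": 5},
    {"subgroup": "evening students", "rating": 4},
    {"subgroup": "athletes", "rating": 2},
]

by_group = defaultdict(list)
for r in responses:
    by_group[r["subgroup"]].append(r["rating"])

# Report each subgroup's average and sample size so small groups are interpreted cautiously.
for group, ratings in sorted(by_group.items()):
    print(f"{group}: mean rating {mean(ratings):.2f} (n={len(ratings)})")
```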

Resources
ACPA. You might find the work of the ACPA Commission on Assessment for Student Development (http://www.myacpa.org/) useful. Only members can access details.
Learning Reconsidered (http://www.learningreconsidered.org/tools/project.cfm?sid=15). See the SLO Identification Rubric for many ideas concerning possible learning outcomes.

Some Resources/Examples That Might Be Useful For Assessment of the Library
ACRL Standards: http://www.ala.org/ala/acrl/acrlstandards/standardslibraries.htm
Boston College: http://www.bc.edu/libraries/about/assessment/
Kapoun: http://www.webpages.uidaho.edu/~mbolin/kapoun2.htm
Yale: http://www.library.yale.edu/assessment/toolkit/

References

Allen, M. J. (2004). Assessing Academic Programs in Higher Education. Bolton, MA: Anker.

Nichols, K. W., & Nichols, J. O. (2000). The Department Head’s Guide to Assessment Implementation in Administrative and Educational Support Units. New York: Agathon Press. [http://www.agathonpress.com]

Swing, R. L. (Ed.). (2004). Proving and Improving, Volume II: Tools and Techniques for Assessing the First College Year (Monograph No. 37). Columbia, SC: University of South Carolina, National Resource Center for The First-Year Experience and Students in Transition.

Upcraft, M. L., & Schuh, J. H. (2000). Assessment Practice in Student Affairs: An Applications Manual. San Francisco: Jossey-Bass.

Upcraft, M. L., & Schuh, J. H. (1996). Assessment in Student Affairs: A Guide for Practitioners. San Francisco: Jossey-Bass.

Schuh, J. H., & Associates. (2008). Assessment Methods for Student Affairs. San Francisco: Jossey-Bass.


San Jose State University Counseling Services Feedback Form*

The Counseling Services’ staff would appreciate your feedback on our services so we can better understand what is working well and what could be improved. We would very much appreciate your honest feedback. This survey is voluntary and non-identifying, so please do not put your name on this form.

Please indicate your degree of agreement with each of the following statements, using this scale:

N/A = This question does not apply to me.
1 = Strongly Disagree
2 = Disagree
3 = Neutral
4 = Agree
5 = Strongly Agree

_____ 1. My counselor helped me learn about myself.
_____ 2. After receiving counseling services, I am more confident that I can succeed in my studies.
_____ 3. My counselor helped me learn about support services and resources at SJSU.
_____ 4. My experience with Counseling Services helped me learn skills that I can develop to maximize my potential for academic success.
_____ 5. My experience with Counseling Services helped me learn skills that I can develop to maximize my potential for personal success.
_____ 6. I would recommend this service to a friend.
_____ 7. Overall, Counseling Service staff are caring professionals.
_____ 8. Overall, Counseling Service staff are effective professionals.

9. Please describe one or two things Counseling Services staff did that you found particularly helpful.

10. Please describe one or two things that Counseling Services staff could do to provide you better support.

11. Please provide some background information about yourself.

a. Age: ____
b. Major: ______________
c. Gender: ______________
d. Class Level: ___________
e. Ethnicity: ________________
f. Reason for Visit: ___ Personal Concerns ___ Educational Concerns
g. Number of times you have visited the Counseling Center: ________

Thanks for your feedback!

*Please send comments or suggestions to Wiggsy Siversten, San Jose State University. 2/06


Yale Library Focus Group Study Retrieved September 21, 2007 from

http://www.library.yale.edu/assessment/toolkit/DIGquestions.doc

Delivery Improvement Group Focus Group Report

January 2002

Background: As part of its charge, the Shipping and Delivery Service Improvement Group (known as the Delivery Improvement Group or DIG) was asked to "clarify the vision for quality shipping and delivery services in the Yale University Library," and to "understand the current 'as is' processes." The charge also asked the group to "solicit and review internal assessment data, including feedback from customers and providers of the services." Focus groups seemed a logical way to address the charge. Yale University Library delivery service managers (Manager, Shipping and Receiving and Manager, Library Shelving Facility) were asked to identify their key customer groups. Email invitations were sent to the managers of these departments asking them to identify participants for the focus groups. In the end, thirteen customers were invited to provide their perspective on the current state of delivery of materials within the library system, trends they anticipate in the future, and their ideals of delivery services. Two focus groups were held in August 2001 during work hours. Nine participants attended the first session; four the second.

Preparation: A number of books in the Yale University Library collections were useful in preparing to lead the focus group. Especially helpful was: Focus groups for libraries and librarians / Beryl Glitz. New York, NY: Forbes, c1998. SML, Stacks: Z678.88 G65 1998 (LC)

Key to a successful focus group is a targeted statement of purpose. For this focus group, the purpose was:
A) To assess the "as is" state of current delivery services—Eli Express, US mail, UPS, LSF, etc.
B) To learn more about the shipping and delivery needs of various units of the library.
C) To learn more about trends in the use of shipping and delivery.

We used a small handheld tape recorder with a tape that lasted 45 minutes and was flipped once. Two members of DIG participated in each session: Holly Grossetta Nardini as facilitator at both sessions, John Gallagher as observer at the first session, Carol Jones as observer at the second session.


We arrived early to set the chairs up in a comfortable arrangement and to test the equipment.

Moderator's Introduction: I'm Holly Grossetta Nardini, Service Quality Support Director, and chair of the new Delivery Improvement Group. I'd like to also introduce John Gallagher, Head of Circulation at the Medical Library, who's a member of the task force and will be helping today. We have asked you here today because the Library is exploring possible improvements to its internal and external delivery services, both through Shipping & Receiving and LSF. You'll see a yulib-l message later this week outlining our charge. You all have some knowledge about delivery. We asked LSF and S&R to identify their key customer groups and your units were identified. Your individual input is important and will be used to improve service. Thank you for taking the time to come to this discussion.

This is a focus group. I am going to be asking you some questions about your experiences and the information we learn from you will be used to help decide what direction we pursue for improvements. We'll start with general questions about the current state of delivery services—Eli Express, US mail, UPS, LSF, etc. Then we'll talk a bit about trends in your use of shipping and delivery services and finally we'll try to learn more about the shipping and delivery needs of various units of the library. Note that when we talk about LSF, we will only be discussing its delivery operations, not the selection process for material transferred to LSF. The goal is not to suggest solutions but to identify issues and problems.

Overview: We'll be here about an hour. I'll ask you a series of open-ended questions. Feel free to express your ideas. Feel free to ask for clarification. This is an interesting discussion, not a test, a debate, or a lecture. Please feel free to say what's on your mind. There are no right or wrong answers. I know that you won't all agree with each other and, in fact, the more views we hear the better, since only a small number of people can be reached in a focus group. Data collected at this session will be aggregated. Your names will not be linked to your comments. We are tape recording the session to be sure we capture your comments, but not to identify individuals. For that reason, I will ask you to speak clearly, loudly, and one at a time. Interaction is encouraged. I am here to ask questions and facilitate, but the focus is on your opinions and experiences. Each person's input is important. I ask you to jump in if you want to affirm or disagree with any opinion.


Questions
Name, department, position? One or two sentences about what you do.
What use do you currently make of shipping and delivery services? Prompts: UPS packages, computers & furniture, LSF deliveries and returns, Eli Express
How have your demands for delivery changed over time and what other changes do you foresee?
Can you describe one experience with sending or receiving an item on- or off-campus?
Imagine you are shipping something for the first time; what would you expect from the service?
Imagine you are waiting for delivery of material for a third party; what would you expect from the delivery service?
What other shipping services would you like to see the Library provide?

Conclusion
At the end of each session, we briefly reviewed the purpose of the discussion. We asked each member of the group to sum up their feelings about the topic and add anything they may have wanted to say earlier. Finally, I asked a general question: Did we miss anything? All participants were thanked and reassured about the anonymity of the session.


Muhlenberg College Dining Services Catering Survey

In an effort to continue to provide the best service possible, we would appreciate a few minutes of your time to provide us with your thoughts and input on how we can best serve Muhlenberg College. Please complete the brief survey below.

1. How well do you feel Sodexho Catering is meeting your present needs? (Please rate your satisfaction by circling one number on the scale below.)

VERY WELL 10 9 8 7 6 5 4 3 2 1 NOT WELL

Please comment:

2. Listed below are key elements of a quality catering program. For each element, please rate Sodexho's performance.

FOOD (EXCELLENT = 10, POOR = 1)

a. Taste of food 10 9 8 7 6 5 4 3 2 1

b. Nutritional value of food 10 9 8 7 6 5 4 3 2 1

c. Appearance of food 10 9 8 7 6 5 4 3 2 1

d. Variety of food 10 9 8 7 6 5 4 3 2 1

e. Temperature of food 10 9 8 7 6 5 4 3 2 1

f. Vegetarian offerings 10 9 8 7 6 5 4 3 2 1

SERVICE

g. Timing of service 10 9 8 7 6 5 4 3 2 1

h. Courteous service 10 9 8 7 6 5 4 3 2 1

i. Attentive service 10 9 8 7 6 5 4 3 2 1

j. Appearance of personnel 10 9 8 7 6 5 4 3 2 1

k. Professionalism and etiquette 10 9 8 7 6 5 4 3 2 1

ATMOSPHERE

l. Appearance of dining area 10 9 8 7 6 5 4 3 2 1

m. Cleanliness of china, flatware, glass 10 9 8 7 6 5 4 3 2 1

n. Table settings 10 9 8 7 6 5 4 3 2 1

o. Cleanliness of equipment 10 9 8 7 6 5 4 3 2 1


MEETING YOUR NEEDS

p. Understanding your food service requirements 10 9 8 7 6 5 4 3 2 1

q. Receptiveness to new ideas and suggestions 10 9 8 7 6 5 4 3 2 1

r. Efficiency in planning of your event 10 9 8 7 6 5 4 3 2 1

s. Creativity and imagination in presenting new menu ideas 10 9 8 7 6 5 4 3 2 1

t. Consistency of services provided 10 9 8 7 6 5 4 3 2 1

3. When working with the sales associate, are we following up and communicating effectively with you?

4. Do you feel that when you give Sodexho your pricing needs, we are able to develop a menu to fit?

5. Have you experienced any significant problems with Sodexho Catering during the past year? YES NO

If YES, please explain:

6. Does the event sheet and/or confirmation letter provide you with enough information for your event? YES NO

If NO, please explain:

7. In your association with our Catering Sales Coordinator, how would you rate his/her performance? (Circle one number for each.)

(EXCELLENT = 10, POOR = 1)

a. Effectiveness 10 9 8 7 6 5 4 3 2 1
b. Responsiveness to challenges 10 9 8 7 6 5 4 3 2 1
c. Creativity in providing imaginative menu ideas 10 9 8 7 6 5 4 3 2 1
d. Follow up once event is booked to ensure correct information 10 9 8 7 6 5 4 3 2 1

8. On average, how often each month do you use Sodexho Catering for events you are hosting?


1-2 times a month 3-5 times a month More than 5 times a month

9. On average, how often each month do you participate in a campus catering event that is organized or arranged by another campus organization or department?

Annually Once every 4 months 1-2 times a month More than 3 times a month

Comments:

Thank you for your valuable input and cooperation.

Name Date

Department

Please send your completed survey to: John Pasquarello, General Manager, Muhlenberg College Dining, Seegers Union Building


Posted Feedback from Muhlenberg College Napkin Survey
General's Quarters Napkin Board Activity

Date: March 6, 2007

Comment: Why is everything so expensive this year? The food hasn't gotten ANY better! Why should I have to pay more for stupid checkered paper & fancy names.
Response: Some prices did go up this year, but so did Meal Equivalency. The value of a swipe is now $4.50. There is no charge and no change in prices because we now use checkered paper.

Comment: Hello! G.Q.. peeps! I wanted to know if you could get Fiji water? It is my native land & I miss it! Thanks! Oh yeah, can you get balsamic dressing for the salads? Thanks!
Response: Due to the exclusive agreement with Pepsi, Fiji Water is not an available product. We do have balsamic dressing at the deli, just bring your salad there and we'll be happy to put some on for you.

Comment: Egg Whites would be great!
Response: Just ask at the grill, we do have them in both a pasteurized product as well as being able to use the whites only from a whole egg.

Comment: I want purple grapes in the little cups!
Response: The grapes were part of our summer sizzlers promotion, we're glad you liked them. We do offer grapes and cheese at our grab and go cooler, just not in cups.

Comment: Please get plain Yogurt.
Response: We will look into that right away!

Comment: Dear GO, It is ridiculous that a wrap, ONE wrap, that's it, without a soda or chips, goes over 1 swipe. Please make your increased prices correspond with the equivalency of a swipe.
Response: Our regular Turkey, Ham, and Tuna wraps with a soda and chips still are one swipe. We have added some Premium sandwiches to our line-up, without eliminating last years options, to enhance your dining, including wheat wraps too.

Comment: Who can I actually talk to about my problems with GO instead of this piece of paper?
Response: You can talk to me, Joe McCloud. I am the new GO manager and I am here 5 (sometimes more) days a week. My office number is: x-3476.

Comment: Dear GO, Why do you enjoy scamming Muhlenberg Students? I don't approve.
Response: It sounds like you may have had an unpleasant dining experience. Please stop by and see me so we can discuss it. Joe McCloud, GQ manager

(list was truncated to save paper)


Writing Rubric
Johnson County Community College, downloaded 12/22/04 from http://www.jccc.net/home/depts/6111/site/assmnt/cogout/comwrite

6 = Essay demonstrates excellent composition skills including a clear and thought-provoking thesis, appropriate and effective organization, lively and convincing supporting materials, effective diction and sentence skills, and perfect or near perfect mechanics including spelling and punctuation. The writing perfectly accomplishes the objectives of the assignment.

5 = Essay contains strong composition skills including a clear and thought-provoking thesis, although development, diction, and sentence style may suffer minor flaws. Shows careful and acceptable use of mechanics. The writing effectively accomplishes the goals of the assignment.

4 = Essay contains above average composition skills, including a clear, insightful thesis, although development may be insufficient in one area and diction and style may not be consistently clear and effective. Shows competence in the use of mechanics. Accomplishes the goals of the assignment with an overall effective approach.

3 = Essay demonstrates competent composition skills including adequate development and organization, although the development of ideas may be trite, assumptions may be unsupported in more than one area, the thesis may not be original, and the diction and syntax may not be clear and effective. Minimally accomplishes the goals of the assignment.

2 = Composition skills may be flawed in either the clarity of the thesis, the development, or organization. Diction, syntax, and mechanics may seriously affect clarity. Minimally accomplishes the majority of the goals of the assignment.

1 = Composition skills may be flawed in two or more areas. Diction, syntax, and mechanics are excessively flawed. Fails to accomplish the goals of the assignment.

Revised October 2003


California State University, Fresno Scoring Guide for Writing
Criteria: Knowledge of Conventions; Clarity and Coherence; Rhetorical Choices

4 - Accomplished
Knowledge of Conventions: In addition to meeting the requirements for a "3," the writing is essentially error-free in terms of mechanics. Models the style and format appropriate to the assignment.
Clarity and Coherence: In addition to meeting the requirements for a "3," writing flows smoothly from one idea to another. The writer has taken pains to assist the reader in following the logic of the ideas expressed.
Rhetorical Choices: In addition to meeting the requirements for a "3," the writer's decisions about focus, organization, style/tone, and content made reading a pleasurable experience. Writing could be used as a model of how to fulfill the assignment.

3 - Competent
Knowledge of Conventions: While there may be minor errors, the paper follows normal conventions of spelling and grammar throughout and has been carefully proofread. Appropriate conventions for style and format are used consistently throughout the writing sample. Demonstrates thoroughness and competence in documenting sources; the reader would have little difficulty referring back to cited sources.
Clarity and Coherence: Sentences are structured and words are chosen to communicate ideas clearly. Sequencing of ideas within paragraphs and transitions between paragraphs make the writer's points easy to follow.
Rhetorical Choices: The writer has made good decisions about focus, organization, style/tone, and content to communicate clearly and effectively. The purpose and focus of the writing are clear to the reader and the organization and content achieve the purpose well. Writing follows all requirements for the assignment.

2 - Developing
Knowledge of Conventions: Frequent errors in spelling, grammar (such as subject/verb agreements and tense), sentence structure and/or other writing conventions distract the reader. Writing does not consistently follow appropriate style and/or format. Source documentation is incomplete. It may be unclear which references are direct quotes and which are paraphrased.
Clarity and Coherence: Sentence structure and/or word choice sometimes interfere with clarity. Needs to improve sequencing of ideas within paragraphs and transitions between paragraphs to make the writing easy to follow.
Rhetorical Choices: The writer's decisions about focus, organization, style/tone, and/or content sometimes interfere with clear, effective communication. The purpose of the writing is not fully achieved. All requirements of the assignment may not be fulfilled.

1 - Beginning
Knowledge of Conventions: Writing contains numerous errors in spelling, grammar, and/or sentence structure which interfere with comprehension. Style and/or format are inappropriate for the assignment. Fails to demonstrate thoroughness and competence in documentation.
Clarity and Coherence: Sentence structure, word choice, lack of transitions and/or sequencing of ideas make reading and understanding difficult.
Rhetorical Choices: The writer's decisions about focus, organization, style/tone, and/or content interfere with communication. The purpose of the writing is not achieved. Requirements of the assignment have not been fulfilled.

Retrieved June 6, 2002 from http://www.csufresno.edu/cetl/assessment/ (click on WritingScoring.doc)


Scoring Rubric for Reflection Papers (Compiled by California Polytechnic State University Service-Learning Program)

Retrieved March 14, 2007 from http://www.ccccd.edu/servicelearning/faculty-ref-paper-rubric.html

- Excellent Paper -

Civic Awareness and Responsibility
The paper demonstrates that the student:
• understands the complex nature of social problems and has identified several of the causes leading to the social problem addressed by the agency;
• understands that there are forces in action which may cause misfortune over which individuals have no control (i.e., realizes that individuals are not always solely to blame when they are faced with misfortunes; that it's not just a matter of "pulling yourself up by the bootstraps");
• sees a relationship between the work of grass roots service agencies and local, state and national government;
• can explain in great detail the programs and services provided by the agency;
• is committed to continued involvement in the community and/or in political processes while in school or after graduation (OR makes a thoughtful argument against or questioning such involvement);
• has identified ways in which he/she can contribute to the community, including both skills and knowledge;
• grasps the concept of social justice;
• made commitments to the agency that exceeded those required by the class and fulfilled all of them.

Critical Thinking
The paper shows that the author:
• views situations from multiple perspectives; is able to observe multiple aspects of the situation and place them in context;
• perceives conflicting goals within and among the individuals involved in a situation and recognizes that the differences can be evaluated;
• recognizes that actions must be situationally dependent and understands many of the factors which affect their choice;
• makes appropriate judgements based on reasoning and evidence;
• has a reasonable assessment of the importance of the decisions facing clients and his or her responsibility as a part of the clients' lives;
• began to think in new ways about the clients served, society and social problems in general, and him/herself as a person;
• not only understands the purpose(s) and programs of the agency selected but uses critical thinking skills to evaluate its effectiveness and to develop recommendations for improvement;
• realizes that he/she can learn outside the classroom because he/she has accessed information from a variety of sources in the field (i.e., observation, interview, reading materials, etc.), thereby demonstrating capacity for self-guided, life-long learning activities;
• is able to use many sources of information within a social environment;
• sees how and where skills and information gained through service involvement can be applied to other situations;
• reflects on and can articulate the meaning of a "real life" experience.

Personal Development
The paper indicates that the student:
• realizes how much he or she can learn from others, including those considered to be "underprivileged";
• appreciates people whose values, lifestyles or cultures are different from his or her own;
• has examined his own beliefs in light of the experience;
• sees evidence that the author continues in the process of developing a philosophy of life;
• sees how service involvement could impact his personal career development;
• understands some of the factors that make the people who are served and/or agency staff different from him/herself.

- Proficient Paper -

Civic Awareness and Responsibility
The paper demonstrates that the student:
• is likely to continue his interest in his issue area;
• appreciates the complex nature of the social issue addressed by the agency and names at least two causes;
• understands that there are forces in action which may cause misfortune over which individuals have no control (i.e., realizes that individuals are not always solely to blame when they are faced with misfortunes; that it's not just a matter of "pulling yourself up by the bootstraps");
• has fulfilled all commitments made to the agency including eight hours of service;
• has a sense of the contributions that he/she can make in terms of his/her skills and knowledge;
• is committed to working with the same or a similar agency at some point in his or her future (OR provides a well thought out argument against or questioning such involvement).

Critical Thinking
The paper shows that the author:
• not only understands the purpose(s) and programs of the agency selected but uses critical thinking skills to evaluate its effectiveness and to develop at least two recommendations for improvement;
• sees how and where skills and information gained through service involvement can be applied to other situations;
• has accessed information from a variety of sources in the field (e.g., observation, interview, reading related materials, discussion groups), thereby demonstrating a capacity for applying "learn by doing" in the community as a method for life-long learning;
• makes observations that are fairly thorough and nuanced although they tend not to be placed in a broader context;
• provides a cogent critique from one perspective, but fails to see the broader system in which the aspect is embedded and other factors which may change;
• uses both unsupported personal belief and evidence but is beginning to be able to differentiate between them;
• perceives legitimate differences of viewpoint;
• demonstrates a beginning ability to interpret.

Personal Development
The paper indicates that the student:
• realizes that he/she can learn from people whose values, lifestyles or cultures are different from his/her own;
• understands some of the factors that make the people served and/or agency staff different from him/herself;
• sees how service involvement could impact his/her personal and career development.

- Acceptable Paper -

Civic Awareness and Responsibility
The paper demonstrates that the student:
• is aware at a general level of social problems and their complex nature;
• recognizes a need for people to get involved;
• demonstrates some idea of how and where his/her skills and knowledge can be used for community betterment.

Critical Thinking
The paper shows that the author:
• understands the purpose(s) and programs of the agency selected and provides at least one idea of how its services might be improved;
• has accessed information from a variety of sources in the field (i.e., observation, interview, reading related materials, discussion groups);
• gives examples of observed behaviors or characteristics of the client or setting, but provides no insight into reasons behind the observation;
• makes observations that tend to be one-dimensional and conventional or unassimilated repetitions of what has been heard;
• tends to focus on just one aspect of the situation;
• uses unsupported personal beliefs frequently as "hard" evidence;
• may acknowledge differences of perspective but does not discriminate effectively among them.

Personal Development
The paper indicates that the student:
• realizes that he or she can learn from others, including those considered to be "underprivileged";
• is tolerant of people whose values, lifestyles or cultures are different from his or her own.

- Unacceptable Paper -

Civic Awareness and Responsibility
The paper demonstrates that the student:
• lacks information about social problems and/or interest in addressing them;
• demonstrates no personal commitment to helping find a solution for community problems;
• has not fulfilled his/her commitments to the agency.

Critical Thinking
The paper shows that the author:
• does not see how skills and information gained through service involvement can be applied to other situations.

Personal Development
The paper indicates that the student:
• believes he or she has little to learn from others, including those considered to be "underprivileged";
• is not tolerant of individual differences and continues to rely on traditional stereotypes to describe and deal with people who are different from him/herself;
• has undergone no examination of his/her own beliefs in light of the service experience.


Intentional Learning Scoring Rubric*

Each learning outcome below is scored at one of four levels: Below Basic (BB), Basic (B), Proficient (P), Advanced (A).

Self-Aware and Self-Directed

1. Articulate their reasons for study within the context of a liberal arts education
Below Basic: Does not provide reasons for study or provides irrelevant or inappropriate reasons for study within a context of liberal arts education.
Basic: Provides one or more valid reasons that focus on positive impact on one of the following broad areas: the student's personal, professional, or civic life.
Proficient: Provides valid reasons that focus on positive impact on at least two of the following broad areas: the student's personal, professional, and civic life.
Advanced: Discusses a variety of valid reasons that focus on positive impact on all of the following broad areas: the student's personal, professional, and civic life.

2. Describe, evaluate, and improve their own learning processes
Below Basic: Does not address all three aspects of this outcome (describe, evaluate, and improve) or focuses only on memorization of isolated facts.
Basic: Identifies more than one learning strategy and goes beyond memorization of isolated facts, but concentrates on learning within specific courses and/or provides minimal discussion related to evaluation and improvement.
Proficient: Identifies a variety of learning strategies and when they are most effective. Describes strategies for improving learning. The response goes beyond specific courses, suggesting awareness that learning is a life-long activity and/or that learning involves making connections across contexts.
Advanced: Response has all the characteristics indicating proficiency, plus demonstrates sophisticated development of learning skills that are broadly applicable in and out of the classroom and that involve making connections across contexts, such as connecting academic learning to personal or professional experiences.

3. Develop plans for pursuing learning goals
Below Basic: Does not provide a plan to pursue learning goals or describes a plan that focuses on memorization of isolated facts.
Basic: Provides a plan that goes beyond memorization of isolated facts, but the plan lacks sufficient detail to make effective learning likely.
Proficient: Provides a plan that is likely to result in effective learning. The plan addresses at least one major issue, such as: time management; use of learning skills refined through personal experience; need to monitor learning and possibly adapt the plan; need to make connections across contexts.
Advanced: Provides a plan that is likely to result in effective learning, as well as sophisticated discussion of at least two major issues, such as: time management; use of learning skills refined through personal experience; need to monitor learning and possibly adapt the plan; need to make connections across contexts.

4. Set, pursue, and reflect upon their learning goals
Below Basic: Does not address all three aspects of this outcome: setting, pursuing, and reflecting on learning goals.
Basic: Addresses setting, pursuing, and reflecting on learning goals, but the response suggests need for external support from family members, friends, teachers, or others to initiate and/or complete at least one of these processes.
Proficient: Addresses setting, pursuing, and reflecting on learning goals in sufficient detail to suggest self-reliant learning.
Advanced: Addresses setting, pursuing, and reflecting on important learning goals and indicates routine, on-going reflection and flexibility in revising short- and long-term goals and/or learning strategies.

Multiple Perspectives

5. Identify diverse or conflicting concepts, viewpoints, and/or priorities (revised May 2008)
Below Basic: Does not identify diverse or conflicting concepts, viewpoints, or priorities or identifies conflicts that are irrelevant to the situation being addressed.
Basic: Identifies at least two diverse or conflicting concepts, viewpoints, or priorities in the situation being addressed, but does not elaborate in sufficient detail to demonstrate clear understanding and/or does not identify obvious conflicts.
Proficient: Identifies major diverse or conflicting concepts, viewpoints, or priorities present in the situation being addressed.
Advanced: Identifies major diverse or conflicting concepts, viewpoints, or priorities present in the situation being addressed, as well as subtle nuances and complexities.

6. Articulate the value of considering multiple perspectives
Below Basic: Does not articulate the value of considering multiple perspectives.
Basic: Recognizes that others' opinions and viewpoints have value, but shows lack of discrimination or analysis, as if all perspectives are always equally valid or as if one's own perspective is always superior.
Proficient: Demonstrates the value of multiple perspectives and recognizes that one's own perspective is not always superior and that all perspectives may not be equally valid.
Advanced: Response has all the characteristics indicating proficiency, plus explores the processes of evaluating conflicting perspectives and/or demonstrates a commitment to seek out dissenting viewpoints.

7. Examine phenomena from multiple viewpoints (revised May 2008)
Below Basic: Considers the phenomenon from one perspective or consistently favors a single perspective.
Basic: Examines at least two perspectives.
Proficient: Examines multiple perspectives and identifies some relevant commonalities and conflicts.
Advanced: Examines the phenomenon from multiple viewpoints and explores subtle nuances and complexities among the viewpoints and/or provides sophisticated discussion evaluating their relative merit.

Make Connections

8. See connections in seemingly disparate information
Below Basic: Does not identify connections or focuses on invalid connections.
Basic: Identifies valid connections, but tends to focus on the obvious, such as connecting related disciplines.
Proficient: Identifies valid connections that go beyond the obvious.
Advanced: Identifies valid connections that are subtle, sophisticated, and/or creative and discusses insights or implications based on these observations.

9. Recognize links among topics and concepts presented in different courses
Below Basic: Does not identify links or identifies invalid links among topics and concepts presented in different courses.
Basic: Identifies valid links among topics and concepts in different courses, but tends to focus on the obvious or does not fully explain the nature of the links.
Proficient: Identifies valid links among topics and concepts presented in different courses, goes beyond the obvious, and explains the nature of the links.
Advanced: Identifies valid links that are subtle, sophisticated, and/or creative and discusses insights or implications associated with the links.

10. Synthesizes disparate facts, theories, and concepts
Below Basic: Does not synthesize disparate facts, theories, and concepts or provides an invalid synthesis.
Basic: Provides a valid synthesis, but does not explicitly address major relevant aspects of the disparate information.
Proficient: Provides a valid synthesis that explicitly addresses major aspects of the disparate information.
Advanced: Provides a valid synthesis that explicitly identifies sophisticated or creative connections involving subtle nuances and complexities in the disparate information.

11. Work within a context of diverse and conflicting concepts, viewpoints, and/or priorities (revised May 2008)
Below Basic: Does not propose a strategy, or proposes irrelevant or unreasonable strategy(ies) for this situation.
Basic: Proposes simplistic or undeveloped strategy(ies) for working within this situation.
Proficient: Describes reasonable strategy(ies) for working within this situation.
Advanced: Describes creative, sophisticated strategy(ies) for working within this situation.

Apply Skills and Knowledge to Different Contexts

12. Adapt what is learned in one situation to problems encountered in another
Below Basic: Does not adapt what is learned in one situation to problems in another situation or describes an invalid adaptation.
Basic: Describes a valid adaptation, but the solution relies on concrete similarities between the two contexts.
Proficient: Describes a valid adaptation that goes beyond concrete similarity between the two contexts.
Advanced: Describes a creative and/or sophisticated adaptation that has the potential for developing more effective solutions or new insights about the problem being addressed.

13. Connect intellectual study to personal life
Below Basic: Does not connect intellectual study to personal life or describes invalid connections.
Basic: Describes valid connections between intellectual study and personal life, but the connections rely on concrete similarities between the two contexts.
Proficient: Describes valid connections between intellectual study and personal life that go beyond concrete similarity between the two contexts.
Advanced: Describes creative and/or sophisticated connections between intellectual study and personal life that lead to new insights or behaviors.

14. Draw on a wide range of knowledge to make decisions
Below Basic: Does not present a decision, does not provide the rationale for a decision, or relies on one line of information to make a decision.
Basic: Makes a decision based on a narrow range of knowledge, perhaps applying ideas from a single course or discipline or from closely-connected disciplines.
Proficient: Makes a reasonable decision based on more than a narrow range of knowledge.
Advanced: Makes a creative or particularly effective decision based on sophisticated integration of ideas from a wide range of knowledge.

*Developed with support from a Teagle Foundation grant. Retrieved January 4, 2008 from Report on First Year at http://www.roanoke.edu/teagle
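For tabulating results, a rubric like this is just a set of dimensions, each with four ordered performance levels. The sketch below (Python, with hypothetical level labels and ratings, since the rubric's column headings are not reproduced here) shows how reviewer ratings can be tallied into a frequency distribution per dimension:

# Hypothetical labels for the four performance levels of the rubric above.
LEVELS = ["Level 1", "Level 2", "Level 3", "Level 4"]

# Hypothetical reviewer ratings (0-3 = level index) for two rubric dimensions.
ratings = {
    "9. Recognize links among topics and concepts": [1, 2, 2, 3, 1, 2],
    "14. Draw on a wide range of knowledge":        [0, 1, 2, 2, 3, 1],
}

for dimension, scores in ratings.items():
    counts = [scores.count(i) for i in range(len(LEVELS))]
    summary = ", ".join(f"{label}: {n}" for label, n in zip(LEVELS, counts))
    print(f"{dimension} -> {summary}")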


Student Conduct Essay Writing Rubric

Created by Vicki Castillon and Leonard Valdez at Sacramento State University; shared at a session at the 2008 WASC conference by Lori Varlotta and Beth Merritt Miller. This rubric is used to assess required essays for students who violated campus policies. Retrieved July 13, 2008 from http://4255856193304126708-a-wascsenior-org-s-sites.googlegroups.com/a/wascsenior.org/assessing-student-learning-outside-the-classroom/Home/Sample_Instrumentspg1.doc?attredirects=0.

Rating scale (each criterion below receives a SCORE from 1 to 5):

1. Student completely lacks understanding/knowledge. Significant improvement needed.

2. Student understanding/knowledge is limited. Improvement needed.

3. Student understanding/knowledge is adequate. There is room for improvement.

4. Student understanding/knowledge is very good. Little need for improvement.

5. Student understanding/knowledge is exceptional. No improvement needed.

Demonstrates knowledge of policy violation

The student demonstrates no understanding of the policy and how his/her behavior violates policy.

The student demonstrates limited understanding of the policy and how his/her behavior violates policy.

The student adequately demonstrates understanding of the policy and how his/her behavior violates policy.

The student very clearly demonstrates understanding of the policy and how his/her behavior violates policy.

The student demonstrates thorough and complete understanding of the policy and how his/her behavior violates policy.

Accepts responsibility for actions

The student demonstrates no acceptance of responsibility for his/her actions.

The student demonstrates limited acceptance of responsibility for his/her actions.

The student adequately demonstrates acceptance of responsibility for his/her actions.

The student very clearly demonstrates acceptance of responsibility for his/her actions.

The student demonstrates thorough and complete acceptance of responsibility for his/her actions.

Demonstrates increased awareness and maturity.

The student demonstrates no increase in awareness and maturity.

The student demonstrates limited increase in awareness and maturity.

The student adequately demonstrates an increase in awareness and maturity.

The student very clearly demonstrates an increase in awareness and maturity.

The student demonstrates a thorough and complete increase in awareness and maturity.

Demonstrates commitment not to violate University policy in the future.

The student demonstrates no commitment to refrain from future policy violations.

The student demonstrates limited commitment to refrain from future policy violations.

The student adequately demonstrates commitment to refrain from future policy violations.

The student very clearly demonstrates commitment to refrain from future policy violations.

The student demonstrates thorough and complete commitment to refrain from future policy violations.

Demonstrates understanding of the value of academic integrity.

The student demonstrates no understanding of the value of academic integrity.

The student demonstrates limited understanding of the value of academic integrity.

The student adequately demonstrates understanding of the value of academic integrity.

The student very clearly demonstrates understanding of the value of academic integrity.

The student demonstrates thorough and complete understanding of the value of academic integrity.


Lecture / Discussion 3:

Direct and Indirect

Assessment Methods

Barbara Wright


Using Direct and Indirect Assessment Methods

Barbara D. Wright, Associate Director, ACSCU/WASC, [email protected]

September, 2009 Level I Assessment Retreat, Long Beach, CA

The Assessment Loop

1. Outcomes, questions
2. Gathering evidence
3. Interpretation
4. Use


What is an “assessment method”?

It’s how you collect the evidence, direct or indirect, that will tell you about the quality of your students’ learning (step 2 on the loop) and how to improve it.


Direct? Indirect?

Direct evidence demonstrates your students’ learning directly, in an unfiltered way.

Indirect evidence is mediated by the person responding to a questionnaire, interviewer, etc. It is influenced by perceptions, experiences, etc.


What do you use when?

Direct evidence tells you what your students know and can do in relation to your learning outcomes

Indirect evidence can reveal why and how students learned what they learned – or didn’t – if you ask the right questions.


So what should we choose? It depends on your question.

Best practice: multiple methods
Direct evidence is the gold standard
Indirect evidence fills out the picture
Both are useful at step 3: interpretation
Descriptive data are the third major source of evidence and also useful when combined w/ other methods

Choose the method(s) most likely to provide evidence that will answer your question.


Shifts in our understanding of assessment

From . . . to:

Isolated facts, skills → a full range of knowledge, skills, dispositions

Memorization, reproduction → problem solving, investigating, reasoning, applying, communicating

Comparing performance against other students → comparing performance to established criteria


Shifts in assessment, cont.

Scoring right, wrong answers → looking at the whole reasoning process

A single way to demonstrate knowledge (e.g., m/c or short-answer test) → multiple methods & opportunities (e.g., open-ended tasks, projects, observations)

Simplified evidence → complex evidence


Shifts in assessment, cont.

A secret, exclusive & fixed process → open, public & participatory

Reporting only group means, normed scores → disaggregation, analysis, feedback

Scientific → educative

A filter → a pump

An add-on → embedded


Shifts in assessment, cont.

"Teacher-proof" assessment → respect, support for faculty & their judgments

Students as objects of measurement → students as participants, beneficiaries of feedback

Episodic, conclusive → continual, integrative, developmental

Reliability → validity


Choice of assessment method matters.

Students value and learn what we teach and test.
How we teach and test matters as much as what.
What and how we assess also matters.
We get more of what we test or assess, less of what we don't.


Higher-order thinking … (adapted from L. Resnick, 1987)

It's nonalgorithmic, i.e., the path of action is not fully specified in advance.
It's complex, i.e., the total path is not "visible" from any single vantage point.
It often yields multiple solutions, each with costs and benefits.
It requires nuanced judgment and interpretation.
It involves application of multiple criteria, which may conflict with one another.


Higher order thinking, cont …

It often involves uncertainty; not everything about the task is known or can be.
It requires self-regulation; someone else is not giving directions.
It involves making meaning, discerning patterns in apparent disorder.
It is effortful: the elaborations and judgments required entail considerable mental work and are likely to take time.


Other approaches to higher-order learning:

Bloom's taxonomy
Perry Scheme of Intellectual Development
Biggs' and Entwistle's work on surface and deep learning


The hierarchy of specificity

Institutional outcomes

College outcomes

Department & program outcomes

Course-level outcomes


The hierarchy of specificity

Oral and written communication

Professional communication

Ability to write for business

Ability to write a business plan


Where are skills practiced? Where can evidence be gathered? Think horizontally as well as vertically.

Oral & written communication

Professional communication

Ability to write for business

Ability to write a business plan

Internship * Student government * Business courses * Gen Ed


The bottom line . . .

Choose methods that are consistent with shifts, trends in higher education and assessment practice.

Choose methods that support the educational outcomes you value, e.g. higher-order intellectual skills and dispositions


Direct Assessment Methods -- A Close-up Look

by

Barbara D. Wright

Associate Director,

Accrediting Commission for Senior Colleges and Universities Western Association of Schools and Colleges

Alameda, CA 94501

[email protected]

September 10, 2009


Portfolios: collections of student work (and sometimes other material such as transcripts, test scores, or performance reviews) intended to illustrate achievement of learning outcomes. The mantra is “collect, select, reflect, connect.” Advantages:

• Are adaptable to different levels of assessment (i.e. individual student, program, institution), purposes (i.e. cross-sectional snapshot; change/progress over time), and kinds of materials (i.e. written work, tapes of performances, student self-assessments) • Can tell us where students are and how they got there • Emphasize human judgment, meaning-making • Provide information likely to be used • Have become extremely popular, hence an easy sell • Engage students, faculty • Are educational for both students and faculty • Reduce fears of misuse • Can be managed by students – to some extent • Are supported by many different software programs

Disadvantages:

• Can be labor-intensive • Can be cumbersome to store, navigate through • Must relate contents to articulated outcomes • Require carefully defined criteria for review, e.g. rubrics • Require training for reviewers • Require distinguishing between usefulness of the portfolio for students (e.g., to showcase work, impress prospective employers, inform advisors) and for assessment of learning

Solutions/responses:

• Collect samples of work, not everything from everybody • Use electronic storage and retrieval • Give students responsibility for maintaining the portfolio • Invest in outcomes, because they’re the basis for everything anyway • Invest in good criteria for education’s sake • Invest in training for faculty development’s sake


Capstones: a wide variety of culminating projects, assignments, performances, or even experiences, e.g., faculty-supervised community service, internships Advantages:

• Are cumulative • Are integrative • Are adaptable to demonstration of skills, general education, professional field or major, dispositions, institutional outcomes, or combinations • Are motivating for students • Set standards for degree completion, graduation • Provide an occasion for department-level discussion, interpretation • Invite external evaluation • Help students make the transition to self-assessment, professional assessment, and life-long learning

Disadvantages:

• Pose challenge of capturing all students in their final year/semester • Differences within/among majors demand flexibility plus commonality • May mean an additional course requirement • Require coordinating multiple dimensions of learning & assessment • Can be labor-intensive • Must relate to carefully articulated outcomes • Require carefully defined criteria for review, e.g. rubrics • Require distinguishing between purpose of the capstone for students and for program assessment

Solutions/responses:

• Require the capstone for graduation • Introduce as widely as possible across the institution • Include capstone experiences within existing courses • Provide resources, staff support • View resources, labor, as worthwhile investment


Performances: activities, live or recorded, designed to demonstrate specific outcomes, e.g. a poster presentation, conduct of a class, a musical or theatrical performance, client counseling, facilitation of a group discussion, “think aloud” analysis of a text. Advantages:

• Have face validity in terms of preparation for student's real-life goals • Put emphasis on what the student can do (as opposed to knowing about): require application; may require spontaneous adaptation, problem-solving; are integrative; provide a reality check • Give students with practical intelligence, skills, a chance to shine • Can elicit affective outcomes, e.g. poise, grace under pressure • Are motivating, encourage practice, rehearsing • Put the emphasis on active learning • Promote coaching relationship between students and faculty, especially when there are external reviewers • Promote self-assessment, internalization of standards • Are highly adaptable, even to liberal arts

Disadvantages:

• Can be labor-intensive, time-consuming, expensive • Must relate to articulated outcomes • Require careful definition of criteria, e.g. rubrics • Require careful training of reviewers, including external reviewers • Require coordination, scheduling, esp. of external reviewers • May frighten off insecure students

Solutions/responses:

• Review a sample of students • Embed in routine, non-threatening situations (e.g., internship, clinical setting) • Use digital means to make performances accessible to reviewers • Regard outcomes, criteria, and training as an educational investment • Remind students they must demonstrate employability


Common assignments, template assignments, secondary readings, and other embedded assessments: student work produced in response to a course assignment is examined for multiple purposes, e.g., to determine command of course material but also to assess writing skill, information literacy, critical thinking, etc.

• "Common assignments": the same assignment across multiple courses • "Template assignments": the same format but not identical assignment across multiple courses • "Secondary readings": student work is examined "secondarily" for other qualities beyond command of course material

Advantages:

• Use work produced by students as a normal part of their course work • Solve the problem of quality of student effort • Are efficient, low-cost • Have face validity • Provide maximally useful information for improvement with minimum slippage • Encourage discussion, collaboration among faculty & support staff • Can create campus-wide interest

Disadvantages:

• Require considerable coordination • Can be time-consuming to create, implement • Can be time-consuming, labor-intensive to score • Must be designed in relation to specific outcomes • Require careful definition of criteria for review, e.g., rubrics • Require careful training of reviewers

Solutions/responses:

• Focus on what's important • Use "common questions" if an entire common assignment is impractical • Regard outcomes, criteria, and training as an educational investment • Provide support, "teaching circles" to discuss implementation, findings • Remember the efficiencies, benefits • Make the investment


Course management programs: Software that allows faculty to set up chat rooms, threaded discussions, etc., and capture student responses Advantages:

• Are adaptable to wide range of learning goals, disciplines, environments • Use work produced electronically by students as a normal part of course participation • Record threaded discussions, chat, ephemera that are impossible or cumbersome to capture face to face • Give quiet students an opportunity to shine • Can preserve a large volume of material, allow sorting, retrieval, data analysis • Are efficient, low-cost • Are unintrusive • Solve the problem of quality of student effort • Allow prompt feedback • Develop students' metacognition when assessment results are shared • Often include tests, quizzes, tasks as part of package, supporting multiple-method approach, convenience

Disadvantages:

• Rely heavily on student writing skill, comfort with technology • Pose challenges to higher levels of aggregation beyond individual course or student • May discourage collaboration among faculty, staff, programs • Managing large volume of material can be difficult, intimidating • "No significant difference" bias may short circuit improvement • Tests, quizzes may promote recall, surface rather than deep learning • Built-in survey tools encourage collection of indirect rather than direct evidence • Direct observation of student performances is difficult or impossible • Software may drive the assessment effort, instead of assessment goals and values driving choice, use of the software

Solutions/responses:

• Develop good, focused outcomes, criteria, rubrics • Use built-in data management tools • Supplement if necessary, e.g. with “The Rubric Processor” • Invest in training of faculty, external reviewers • Use tests, quizzes with caution, supplement with authentic tasks • Negotiate with the maker, customize the software • Aim for program-level, not just individual or course-level improvement


Classroom Assessment/Research: an approach to assessment pioneered by K. Patricia Cross and Thomas A. Angelo; provides a large collection of techniques individual instructors can use in their classrooms to discover what students are learning – or not – and to make rapid adjustments. Advantages:

• Takes place at ground zero of the learning process, for maximum relevance and usefulness, minimum slippage • Offers maximum privacy, minimum risk, anxiety • Is conducted continuously, has formative benefit • Can provide feedback on both what students know and can do and how they got there, what helps or hinders • Motivates students to become more active, reflective learners • Can also be used by faculty collectively for the bigger picture • Is faculty-friendly, respectful of privacy, autonomy • Offers significant resources (e.g., T. Angelo and K. P. Cross, Classroom Assessment Techniques, 1992) and support networks, especially for community college educators

Disadvantages:

• Is unstructured, highly dependent on individuals' cooperation for administration of CATs (classroom assessment techniques) and reporting of results • Presents challenge of generalizing to program or institution level

Solutions/responses:

• Provide consistent, careful leadership, oversight • Get buy-in from faculty, others • Start with agreement on shared outcomes, goals • Provide training • Make assessment a campus-wide conversation • Emphasize the potential for truly useful information for improvement


Student self-assessment: The student demonstrates the ability to accurately self-assess a piece of work or performance, usually in relation to one or more outcomes and a set of criteria, e.g. rubrics Advantages:

• The ultimate in active learning, engagement, ownership of one’s learning • Highly adaptable • Extremely educational for students • Promotes internalization of intellectual, personal, professional standards • Is an essential component of ongoing professional, personal development • Is an essential component of life-long learning • Faculty can aggregate individual results to identify general findings, trends

Disadvantages:

• Challenging, especially at outset, for both students and faculty • Requires clear outcomes, criteria (e.g., rubrics), expectations for level of proficiency • Requires student to assess with candor, not spin • May cause anxiety, avoidance • Long-standing habits, personality traits may need to be overcome (e.g., self-consciousness, excessive modesty, unrealistically high self-appraisal) • Requires tact and true coaching attitude from instructor, ability to critique the work or performance, not the person • Requires careful management of others who may be present

Solutions/responses:

• Experienced instructors guide, mentor novice instructors • Students receive orientation, training • Outcomes, criteria, expectations are clear, widely distributed and understood • Examples of self-assessment are available • Process is presented as primarily developmental, formative • Examples of progress over extended time provide encouragement • Self-assessment is risk-free


Local tests: tests designed in relation to the specific course, program, or institution’s curriculum and learning outcomes, as opposed to generic, commercially available tests. Can be cumulative (e.g. comprehensives in the major) or less encompassing but still cross-cutting. Format may vary; need not be multiple choice, as in most commercial tests. Advantages:

• Tests are traditional, widely accepted academic practice • Testing across courses or programs requires active faculty participation • Can stimulate discussion about alignment of goals, curriculum, pedagogy, etc. • Can be designed to have content validity • Can adapt readily to institutional changes in curriculum, outcomes • Can be open-ended, integrative, highly creative in format • Can provide good quality of student effort if course-embedded • Provide directly relevant, useful information • Forestall comparison with other institutions

Disadvantages:

• Run risk of focusing more on surface than deep learning • Provide no norms for reference • May contain ambiguous, poorly constructed items • May offer questionable reliability and validity • May be expensive if test construction is contracted out • Will not elicit good quality of student effort if seen as add-on • Will create misunderstanding of assessment if seen as a threat • May become a missed opportunity to use more innovative approaches • May invite finger-pointing

Solutions/responses:

• If norms, benchmarks are important, supplement with purchased test • Use on-campus expertise • Be careful, pilot any test before large-scale administration • Provide a “gripe sheet” • Accept that assessment is ultimately human judgment, not psychometric science • Keep the focus on useful information & improvement, not test scores per se • Depersonalize issues, avoid finger-pointing


Commercially available, standardized tests: Advantages:

• Are a traditional, widely recognized & accepted means of assessment • Require little on-campus time or labor • Prepare students for licensure, other high-stakes testing • Are norm-referenced • Offer longitudinal data, benchmarks • Are technically high-quality • May reflect recent, important trends in the field (e.g., ETS Major Field Tests) • Can be useful as part of a multiple-method approach

Disadvantages:

• May offer poor content validity • Generally do not provide criterion-referenced scores • Test students' ability to recognize "right" answers • Reflect students' test-taking ability • Often elicit poor quality of student effort, particularly as add-on • Reinforce faculty bias toward "empty vessel" theory of education • Reinforce student bias toward education as memorizing, regurgitating "right" answers (i.e. "surface" rather than "deep" learning) • Reinforce everybody's bias toward assessment as testing • Carry risk of misuse of scores, invidious comparisons • Provide little insight into students' problem-solving & thinking skills or ability to discriminate among "good" and "better" answers • Offer no opportunity for test takers to construct their own answers verbally, numerically, graphically, or in other ways • Give students no opportunity to demonstrate important affective traits, e.g., persistence, meticulousness, creativity, open-mindedness • Are less likely than local methods to stimulate productive discussion • Are more likely to elicit finger-pointing, anxiety, resistance • Can be very expensive ($10-$30/student, plus administration costs) • Generally do not provide good value (i.e., useful information for cost)

Solutions/responses:

• Test samples of students, use matrix sampling • Negotiate with test maker • Supplement with other methods • Use with caution


Direct or indirect? Some methods can work both ways . . .

Classroom research: Classroom research is included here as a direct method but it can function as either a direct or an indirect method. Of the dozens of classroom assessment techniques (or CATs) developed by Cross and Angelo, some demonstrate what students know and can do, while others elicit reflection, perceptions, and other forms of indirect evidence.

Course management programs: Course management programs make it possible for faculty to capture discussions and other evidence that would be ephemeral in the classroom; hence they are classified here as a direct method. Such programs often include a survey or questionnaire template, however, that makes it easy to construct and administer surveys online. See discussion of surveys in handout on "Indirect Methods."

Focus groups: Focus groups are generally regarded as an indirect method of assessment because students are encouraged to talk about their personal experiences and perceptions. However, they can also function as a direct method, if the topic of discussion is an issue in the major and students are guided by the protocol to demonstrate their command of disciplinary concepts, theories and methods. In this case, students generally do not receive a grade for their role in the discussion, but the recording is analyzed by faculty to draw more general conclusions about the strengths and weaknesses of the academic program.

Portfolios: Portfolios can function as both a direct and an indirect assessment method. They are direct in the sense that student work is displayed and can be rated, providing direct evidence of knowledge and skills. The reflective essays, in which students look back on various pieces of their work, describe what each represented in terms of challenges or achievements, and evaluate their personal progress as learners, are indirect evidence of a high order.

Student self-assessment: Self-assessment is classified here as a direct method because the performance of self-assessment demonstrates directly how skilled students are at self-assessment. However, the process may be structured to elicit student reflection on how learning occurred, what helped or didn't, etc. In other words, self-assessment can also function as an indirect method.


Indirect Assessment Methods -- A Close-up Look

by

Barbara D. Wright

Associate Director,

Accrediting Commission for Senior Colleges and Universities Western Association of Schools and Colleges

Alameda, CA 94501

[email protected]

September 10, 2009


Surveys: Common method of gathering information from people on a wide variety of topics (personal characteristics, expectations, experience, attitudes, values, behaviors, perceptions, satisfaction), generally in the form of a questionnaire, which may be distributed in hard copy or online or – less often – administered by phone. Advantages:

• Are well-known, broadly accepted • Are adaptable to many different kinds of research questions • Are adaptable to different audiences (students, alums, employers, non-completers) • Items can vary in format, e.g., yes/no, rating scales, lists, open-ended questions • Can reveal the "why" and "how" behind the "what" • Results allow statistical analysis, reporting • Self-reports are generally truthful, accurate • Many surveys are commercially available, usually can be customized, e.g., NSSE, CCSSE, CSEP, CCSEP, CIRP, Noel-Levitz • Purchased surveys provide norms, benchmarks, detailed reports • Software programs are available, e.g., Survey Monkey, Zoomerang • Software and email make surveys swift, cheap to administer • Data are easy to store and analyze

Disadvantages:

• Construction of survey requires expertise, time, clarity about purposes • Hiring consultants and purchasing survey services can be costly • Surveys run danger of being too long, too broad • Response rate may be very low • Low response rate reduces representativeness, usefulness of results • Structured format reduces chance of unanticipated findings • Institutions often over-survey, leading to survey fatigue, wasted resources • Collected data are often not used, shared • Telephone surveys can be slow, expensive; becoming less popular

Solutions/responses:

• Use on-campus talent to construct the survey, analyze results • Reward contributors • Be clear about purpose and educational outcomes to be investigated • Keep the survey as appealing, brief, easy to deal with as possible • Create "buzz" with pre-survey communications • Use reminders, incentives to increase response rate • Use "captive audiences" when appropriate, e.g., class meetings, seniors lined up for commencement • Pool and consolidate campus survey efforts, make maximum use of existing data


Interviews: One-on-one conversations designed to elicit a variety of information; may range from highly structured (much like an orally conducted survey) to open-ended and exploratory. Advantages:

• More personal than the written survey • More appropriate for some audiences, e.g. high-status trustees, wealthy donors, elusive non-completers • Allow for more probing, rephrasing, to elicit targeted information • Can reveal the "why" and "how" behind the "what" • Are useful as follow-up to survey results

Disadvantages:

• Are labor-intensive at every stage: planning, scheduling, conducting, data recording and analysis

• Require skilled interviewers • Do not reach as large an audience as paper or online surveys • May elicit socially acceptable rather than accurate responses

Solutions/responses:

• Be clear about purpose • Use selectively • Use in combination with other methods • Develop a basic protocol


Focus groups: structured, in-depth, group discussions of specific topics, guided by a trained moderator and generally audiotaped, videotaped, or recorded by an assistant moderator. Advantages:

• Allow examination of otherwise elusive perceptions, feelings, attitudes, ideas • Adaptable to wide variety of target groups, topics, issues • Offer insights into strengths, weaknesses of educational experience • Can reveal the "why" and "how" behind the "what" • Are useful in tandem with a survey project: at the front end, as a way to identify productive topics, questions; at the back end, to help interpret, clarify results • May reveal new, entirely unanticipated problems, insights • Can be implemented relatively quickly, cheaply • Rubric or matrix may be used to score multiple focus groups and arrive at findings • Can do double duty as a direct method, too

Disadvantages:

• Moderators must be identified, trained • Development of the topics, questions, and matrix requires care • Sensitive topics may not lend themselves to focus group discussion • Scheduling can be a challenge • Smaller numbers of students are reached than with surveys • Incentives for student participation may be needed • Conduct of individual focus groups will necessarily vary • Results may not lend themselves to statistical analysis, generalization

Solutions/responses:

• Use campus expertise, volunteers, to keep costs down • Train new moderators by having them observe skilled moderators • Present participation in focus group to students as privilege, opportunity • Share interesting, surprising findings broadly, but keep identities confidential • Use as an opportunity to show the institution listens carefully, takes students seriously


Ethnographic research: Selected students serve as participant-observers, gathering information about learning and/or student experience through conversations with fellow students, observation, and reflection on their own experiences. Participant-observers meet regularly with faculty and/or staff conducting the study to refine questions, share findings, analyze them, and plan next steps. Advantages:

• Provides an insider perspective otherwise unavailable • Allows longer-term inquiry, e.g., a semester as opposed to one-time interview • Allows in-depth study, exploration of “why” and “what to do” as well as “what” • Provides access to elusive values, attitudes • Can include non-verbal information such as body language, demeanor • Has potential to produce unanticipated, surprising findings • Has high likelihood of producing useful, actionable information • Is adaptable, e.g., to student life as well as academic issues

Disadvantages:

• Requires careful definition of the topic of study • Is time-consuming • Requires training, continuing attention, regular meetings • Quality, commitment of participant-observers may vary • Attrition of participant-observers may reduce usefulness of results • Few models are available

Solutions/responses:

• Choose participant-observers carefully • Provide training, incentives, support • Focus the inquiry but allow for evolution of project, adaptation to unexpected findings • Provide incentives to participant-observers and faculty/staff coordinating the project • Create a risk-free environment • Avoid identification of individuals when reporting findings


Direct or indirect? Some methods can work both ways . . .

Classroom research: Classroom research is described here as a direct method but it can function as either a direct or an indirect method. Of the dozens of classroom assessment techniques (or CATs) developed by Cross and Angelo, some demonstrate what students know and can do, while others elicit reflection, perceptions, and other forms of indirect evidence.

Course management programs: Course management programs make it possible for faculty to capture student discussions and other performances that would be ephemeral in the classroom; hence they are classified here as a direct method. Such programs often include a survey or questionnaire template, however, that makes it easy to construct and administer surveys online. See discussion of surveys in this handout on "Indirect Methods."

Focus groups: Focus groups are generally regarded as an indirect method of assessment because students are encouraged to talk about their personal experiences and perceptions. However, they can also function as a direct method, if the topic of discussion is an issue in the major and students are guided by the protocol to demonstrate their command of disciplinary concepts, theories and methods, or other learning. In this case, students generally do not receive a grade for their role in the discussion, but the recording is analyzed by faculty to draw more general conclusions about the strengths and weaknesses of the academic program.

Portfolios: Portfolios can function as both a direct and an indirect assessment method. They are direct in the sense that student work is displayed and can be rated, providing direct evidence of knowledge and skills. The reflective essays, in which students look back on various pieces of their work, describe what each represented in terms of challenges or achievements, and evaluate their personal progress as learners, are indirect evidence of a high order.

Student self-assessment: Self-assessment is classified here as a direct method because the performance of self-assessment demonstrates directly how skilled students are at self-assessment. However, the process may be structured to elicit student reflection on how learning occurred, what helped or didn't, etc. In other words, self-assessment can also function as an indirect method.


Lecture / Discussion 4:

Unique Issues in Assessment for

Community Colleges

Fred Trapp


Unique Issues in Assessment for Community Colleges

Presented by Fred Trapp

Cambridge West Partnership, LLC
Administrative Dean, Institutional Research/Academic Services (retired), Long Beach Community College District
[email protected]; [email protected]

What Is Unique?

Transfer function
Developmental education function
State mandates, regulations & mandatory curriculum documentation
What is a program?
Grading vs. assessment
Liberal arts/general education "deli"
Flex days, faculty professional development $
Commission rubrics, deadlines, reports
Career & technical programs (another show)

see also examples at end of Handout Packet

The Learning Improvement Cycle:

1. Define/refine student learning outcomes based on input from stakeholders.
2. Design assessment tools, criteria & standards directly linked to each outcome.
3. Implement assessment tool(s) to gather evidence of student learning.
4. Analyze and evaluate the collected data (make sense of it).
5. Identify gaps between desired & actual results.
6. Document results & outline needed changes in curriculum, instructional materials, teaching strategies, or assessment means.


The Transfer Function

Using Administrative Data

How well do our students perform upon transfer?

• GPA after one year

• GPA in senior year

• Transcript analysis

http://www.asd.calstate.edu/performance/index.shtml 5

GPA Comparison: CC Transfers After One Year at CSU

[Chart comparing GPA after one year at CSU (0.00-3.00 scale) for the CC system overall vs. your CC, across five fall cohorts (Fall 1 through Fall 5).]

http://www.ed.gov/policy/gen/guid/fpco/index.html 6

Family Educational Rights & Privacy Act (FERPA)

Protects student educational records.
Allows disclosure for research to improve instruction (section 99.31 of FERPA regs.).
When research is reported, the personal identity of students must be suppressed.
Data must be destroyed when the study is completed.
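One common way to suppress personal identity before findings circulate is to replace student IDs with one-way pseudonyms in the analysis file. A minimal sketch in Python (illustrative only, not legal guidance; the salt value and field names are hypothetical):

import hashlib

SALT = "keep-this-secret-on-campus"  # hypothetical; held by the research office only

def pseudonymize(records):
    """Replace each student_id with a one-way hash so reported results
    carry no personally identifying information."""
    cleaned = []
    for rec in records:
        rec = dict(rec)                      # copy; leave the source record untouched
        sid = str(rec.pop("student_id"))
        rec["pseudo_id"] = hashlib.sha256((SALT + sid).encode()).hexdigest()[:12]
        cleaned.append(rec)
    return cleaned

sample = [{"student_id": "00123", "transfer_gpa_year1": 2.8}]
print(pseudonymize(sample))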


Senior Year Student Performance Report

Crs Abv. | Crs No. | Course Title           | Cr. Hr | Grade
MGMT     | 491     | Organization Behavior  | 3      | C
MGMT     | 372     | Operations Mgmt I      | 3      | B
FIN      | 451     | Real Estate Law        | 3      | B
FIN      | 353     | Real Estate Valuation  | 3      | C
ECON     | 230     | Economic Statistics I  | 3      | C
ACCY     | 301     | Admin Accounting       | 3      | C
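A report like this is usually reduced to a credit-weighted GPA for comparison. A minimal sketch of the arithmetic (Python; assumes the standard 4-point letter-grade scale):

GRADE_POINTS = {"A": 4.0, "B": 3.0, "C": 2.0, "D": 1.0, "F": 0.0}

# (dept, course no., credit hours, grade) from the sample report above
courses = [
    ("MGMT", "491", 3, "C"), ("MGMT", "372", 3, "B"),
    ("FIN",  "451", 3, "B"), ("FIN",  "353", 3, "C"),
    ("ECON", "230", 3, "C"), ("ACCY", "301", 3, "C"),
]

quality_points = sum(GRADE_POINTS[grade] * hours for _, _, hours, grade in courses)
credit_hours = sum(hours for _, _, hours, _ in courses)
print(f"Senior-year GPA: {quality_points / credit_hours:.2f}")  # 2.33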

http://www.CalPASS.org 8

Transcript Analysis: ECON 490, Money and Banking (CSU upper division)

Community College transfers (99 students): average grade 2.50
Non-Community College students (1,054 students): average grade 2.69

Transferred Students Opinion Data

Survey former students:
– Do you feel as prepared for junior and senior level work as other students?
– What did we do well to help you?
– What do we need to do to improve?

Collaborate with the four-year school.
Exploit holidays if you send the survey yourself.

Perhaps focus groups


http://www.ncde.appstate.edu/ 10

Developmental Ed Function

Programs and services that commonly address:
Academic preparedness
Development of specific learning strategies
Affective barriers to learning

Typical program components:
Writing
Mathematics
Reading
Study skills

* http://www.cccbsi.org/ 11

Global Outcomes & Skills for Developmental Education

Ability to:
Write at the college level
Perform mathematical processes
Read with comprehension
Use appropriate study skills

Basic skills initiative (BSI)*
Events, effective practices, resources, publications

Developmental Education Assessment Plan

Program Intended Ed. Outcomes: Program completers will …

1. Be prepared to perform successfully in college-level composition courses.
2. Perform mathematical processes well enough to complete freshman college mathematics requirements.
3. Be successful in first semester college courses that require significant reading.

Goal/Statement of Purpose: To provide quality developmental education for students who need basic academic skills.


act.org; ets.org 13

Developmental Education Means of Assessment

Rely on diagnostic assessment and placement.

Standardized exams:
ASSET
COMPASS
Accuplacer
MDTP

Locally prepared/managed exams

http://www.cccco.edu/ChancellorsOffice/Divisions/StudentServices/Matriculation/MatriculationArchives/tabid/627/Default.aspx

Developmental Education, Questions & Means of Assessment

How well are they doing compared to a standard? → Post-test overall exam score
How much did they gain? → Pre & post test
Where are large numbers of students still weak? → Analysis of sub-scores
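A minimal sketch of the "gain" and "where are they still weak" questions (Python; the records, sub-score names, and the cut point for "still weak" are hypothetical):

students = [
    {"id": 1, "pre": 52, "post": 71, "subs": {"fractions": 80, "graphing": 55}},
    {"id": 2, "pre": 60, "post": 78, "subs": {"fractions": 90, "graphing": 60}},
    {"id": 3, "pre": 48, "post": 66, "subs": {"fractions": 65, "graphing": 72}},
]

# How much did they gain? Average post-test minus pre-test difference.
gains = [s["post"] - s["pre"] for s in students]
print("Average gain:", sum(gains) / len(gains))

# Where are large numbers of students still weak? Count sub-scores below a cut point.
CUT = 70  # hypothetical local standard
for area in students[0]["subs"]:
    weak = sum(1 for s in students if s["subs"][area] < CUT)
    print(f"{area}: {weak} of {len(students)} students below {CUT}")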

Developmental Education, Questions & Means of Assessment

Over several assignments, how well are they doing compared to our expectations? → Portfolio of work
How well do they do in subsequent courses? → Success in entry-level college courses
Have their attitudes changed and how? → Surveys of attitudes


Developmental Education Assessment Plan

Program Intended Ed. Outcomes: Program completers will …

1. Be prepared to perform successfully in college-level composition courses.
2. Perform mathematical processes well enough to complete freshman college mathematics requirements.
3. Be successful in first semester college courses that require significant reading.

Means of Assessment & Criteria for Success

1a. At least 70% of the students will complete college composition on the first attempt.
1b. A faculty panel will use a rubric & rate at least 90% of the completers as satisfactory (4 or 5) on a 5-point scale for each category of the exit writing sample.
2a. Seventy percent of the completers will score 75% or higher on a locally devised test of pre-college math competencies.
2b. Seventy percent of completers will score 75% or higher on each sub-score area of the locally devised test.
3a. Eighty percent of completers will score 85% or better on a standard reading comprehension exit test.
3b. Eighty percent of completers will score at or above the national average on the post-test sub-score for reading & vocabulary. No completer will score below the 40th percentile.

Developmental Education Assessment Plan & Report

Means of Assessment & Criteria for Success

1a. At least 70% of the students will complete college composition on the first attempt.
1b. A faculty panel will use a rubric & rate at least 90% of the completers as satisfactory (4 or 5) on a 5-point scale for each category of the exit writing sample.
2a. Seventy percent of the completers will score 75% or higher on a locally devised test of pre-college math competencies.
2b. Seventy percent of completers will score 75% or higher on each sub-score area of the locally devised test.
3a. Eighty percent of completers will score 85% or better on a standard reading comprehension exit test.
3b. Eighty percent of completers will score at or above the national average on the post-test sub-score for reading & vocabulary. No completer will score below the 40th percentile.

Summary of Data Collected

1a. Eighty-two percent (338 of 412 students) completed ENGL 1.
1b. Organization - 92%; Grammar - 78%; Rhetoric - 89%
2a. Eighty-one percent of completers scored 75% or better.
2b. Linear equations - 90%; Inequalities - 84%; Quadratic equations - 88%; Graphing - 62%; Powers & roots - 86%; Proportions - 78%
3a. Eighty-nine percent (213 of 239 students) scored 85% or better.
3b. Sixty-two percent scored at or above the national average on reading; 88% scored at or above the national average on vocabulary.
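Keeping each criterion and its result side by side makes the gap-identification step (step 5 of the learning improvement cycle) mechanical. A minimal sketch using the figures reported above (Python):

criteria = [
    # (criterion, target proportion, observed proportion)
    ("1a. Complete college composition on first attempt", 0.70, 338 / 412),
    ("2a. Score 75%+ on local pre-college math test",     0.70, 0.81),
    ("3a. Score 85%+ on reading comprehension exit test", 0.80, 213 / 239),
]

for label, target, observed in criteria:
    status = "criterion met" if observed >= target else "GAP - needs action"
    print(f"{label}: target {target:.0%}, observed {observed:.0%} -> {status}")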

see your campus Research Office 18

Developmental Education Logistics of Assessment*

Obtaining data for students who are recommended for developmental courses but do not take them
Tracking students through basic skills migration and into the first college-level course in the discipline
Obtaining feedback from college-level course faculty
Tracking persistence to completion
Basic skills initiative work on campus will dovetail


State Mandates, Regulations & Required Curriculum Documentation

State Mandates & Regulations- CCAR, Title 5, Chapter 6

Curriculum documentation:
Outline of record (section 55002)
Integrated course outline (Academic Senate)

Learning outcomes
Typical:
Means of classroom instruction
Out-of-class assignments
Means of evaluation and criteria

Equates to a plan for course-level assessment

Objectives vs. SLOs

Both can be expressed in terms of what the student knows, believes and can do.

To some, objectives are more discrete or smaller pieces of learning

To some, SLO’s are the “big ideas”

see the Course-level SLO Assessment Plan Worksheet 21

Integrated Course Outline of Record as an Assessment Plan

Statement of Purpose: Role of the course in curriculum (commonly captured on supporting documents and not necessarily in the outline of record)

Intended Ed. Outcome*: Student learning outcomes

Means & Criteria for Assessment*: Representative assignments, instructional methods, kinds and criteria for evaluation

Results of Assessment**: Actual performance data on assessments. Not part of the outline, but recorded separately (course portfolio, program plans/reviews).

Instructional Strategies*: Not part of the assessment plan, but recorded in the outline of record.

Use of Results**: Action steps taken. Not part of the outline, but recorded separately (course portfolio, program plans/reviews).

*in the Course Outline of Record
**in the assessment report


Curriculum Review & Approval Process

Curriculum review and approval process (Title 5, section 55002)

Integration and alignment questions:
Is the course outline integrated?
• Course outcomes, assignments or activities, methods of instruction, and evaluation strategies
How does the course support program outcomes?
How does the course support GE outcomes?

Course sequences & prerequisites (T5 section 55500-30)

Good Practice in Course Design to Assess Learning Outcomes

Where do we find evidence of student learning?
How do we observe and document that this learning is taking place?
What are some key elements of student work to which we pay attention? (Primary traits)
Development and use of rubrics, exams, etc. to evaluate learning evidence.

Ruth Stiehl The Outcomes Primer 2002 24

Good Practice in Course Design- Shaping Outcomes and Instructional Activities

Key Questions for Faculty

1. Intended outcomes - What do students need to be able to do "out there" in life roles for which this course prepares them?

2. Assessment task- What can students do “in here” to demonstrate the intended outcomes?

3. Concepts & issues- What do students need to understand (knowledge) in order to demonstrate the intended outcomes?

4. Skills- What skills do students need that are essential to the intended outcomes?


Course-level Assessment Example

[Diagram: Student Learning Outcomes, Assessment Tasks, Skills, and Themes/Concepts/Issues, linked as in the key questions above.]

see Bloom's Revised Taxonomy in the Handout Packet 26

Curriculum Alignment

[Diagram: triangle linking Objectives/Outcomes, Assessments, and Instructional Activities/Materials.]

What is an Instructional Program? Regulation & Local Definition

State/Governing Authority Authorized Degree or Certificate Program Award
https://misweb.cccco.edu/webproginv/prod/invmenu.htm

General Education Patterns or Certificate of Achievement

Within a vocational certificate (SCANS)

Developmental Instruction - locally defined programs

Discipline Pathways and Transfer Preparation - locally defined programs

Noncredit Instruction


Ruth Stiehl The Outcomes Primer 28

Good Practice in Program Design-Shaping Outcomes and Instructional Activities

Key Questions for Faculty

1. Intended outcomes - What do students need to be able to do "out there" in life roles for which this program prepares them?

2. Capstone assessment tasks- What can students do in this program to show final evidence of the intended outcomes?

If a capstone is not available…..

3. Courses- What learning experiences (courses) are necessary to prepare the student?

4. Prerequisites- What must students be able to do before engaging in this learning?

Program Assessment Example

[Diagram: Student Learning Outcomes, Capstone Assessment Tasks, Courses, and Prerequisites, linked as in the key questions above.]

Anne Arundel College, Arnold, MD 30

Alignment Illustration: Computer Information Science

College (GE) Competency: Critical Thinking & Problem Solving

Program Outcome: Analyzes & designs a solution when presented with a business, math or science problem according to specifications set by the department

CIS 200 Course Outcomes: Analyzes given problems for specifications in terms of input, output and processing. Designs a solution to a given problem.

Means of Assessment and Std for Success: For each problem the learner correctly:

Determines what is to be read, printed, and converted

Completes a flowchart or algorithm

Designs test cases

Assessment Results & Use of Results


Curriculum Map for Program Assessment (no capstone course)

[Matrix: courses 1-6 in rows, Outcomes 1-4 in columns; each cell marked I, R/D/P, or M.]

I = introduced   R/D/P = re-enforced/developed/practiced   M = mastery demonstrated

Do your program-level assessment work in the courses where mastery is demonstrated.
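A curriculum map is easy to keep as a simple lookup structure, which also makes it easy to list, for each outcome, the courses where mastery is demonstrated and program-level evidence should be gathered. A minimal sketch (Python; the course and outcome entries are placeholders, not the cells of the matrix above):

# Hypothetical map: outcome -> {course: level}, levels "I", "R/D/P", or "M"
curriculum_map = {
    "Outcome 1": {"Crs 1": "I", "Crs 2": "R/D/P", "Crs 3": "M"},
    "Outcome 2": {"Crs 2": "I", "Crs 3": "R/D/P", "Crs 4": "M"},
    "Outcome 3": {"Crs 3": "I", "Crs 4": "R/D/P", "Crs 5": "M"},
    "Outcome 4": {"Crs 4": "I", "Crs 5": "R/D/P", "Crs 6": "M"},
}

for outcome, courses in curriculum_map.items():
    mastery_courses = [c for c, level in courses.items() if level == "M"]
    print(outcome, "-> gather evidence in:", ", ".join(mastery_courses))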

Bellevue College, Bellevue, WA 32

Embedded Assessments Matrix: Linking Course to GE/Program Outcomes

Linking courses to general education/program core competencies.

For each course provide a rating score on this outcome using these behavioral descriptors:
0 - Does not include instruction in this area
1 - Includes some instruction or practice & assessment
2 - Addresses the outcome as the focus in 20% or more of the course
3 - Addresses the outcome as the focus in 33% or more of the course

Embedded Assessments Matrix: Linking Course to GE/Program Outcomes

[Matrix: courses 1-6 in rows, Outcomes 1-4 in columns; each cell holds an emphasis rating.]

1 = very little emphasis   2 = modest emphasis   3 = major emphasis

Do your program-level assessment work in the courses where a major emphasis is placed.
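The same idea works with the emphasis ratings: filter for courses rated 3 to locate where to embed the assessment, and flag outcomes no course treats as a major focus. A minimal sketch (Python; the ratings are placeholders):

# Hypothetical emphasis ratings (0-3) linking courses to outcomes
ratings = {
    "Outcome 1": {"Crs 1": 1, "Crs 3": 3, "Crs 4": 2},
    "Outcome 2": {"Crs 2": 2, "Crs 4": 1, "Crs 5": 3},
    "Outcome 3": {"Crs 1": 0, "Crs 2": 1, "Crs 6": 2},
}

for outcome, courses in ratings.items():
    major = [c for c, r in courses.items() if r == 3]
    if major:
        print(outcome, "-> embed assessment in:", ", ".join(major))
    else:
        print(outcome, "-> no course gives this a major emphasis (possible curricular gap)")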


see Handout- Using the Grading Process for Assessment 34

Means of Assessment- Grades

Evaluation of individual students (grading) = assessment

Focus is individual, not groups of students
A summative, not formative, act
Objectivity of single evaluator vs. group
Generally not accepted as direct evidence
Use the grading process

Agreed-upon course exam or part of exam
Row and column model for assignments

35

Embedded Assessment Strategy (Row and Column Concept)

Criteria        Tim  Jane  Mary  Joe  Dave  Average
Spelling         3    4     1     2    3     2.6
Grammar          2    5     3     2    5     3.4
Punctuation      4    5     2     3    4     3.6
Structure        4    3     4     5    3     3.8
Total           13   17    10    12   15

Student Grade    C    A     D     C    B

Total down the column for individual grading. Analyze across the row for assessment of intended outcomes from the group.

Jim Nichols
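For departments that keep these scores in a spreadsheet or a short script, the same row-and-column arithmetic is easy to automate. The following is a minimal, hypothetical Python sketch (the student names and scores simply mirror the example table above; nothing here is prescribed by the Nichols model itself): column totals feed individual grading, while row averages feed group-level assessment of each intended outcome.

# Illustrative sketch of the row-and-column concept (hypothetical data
# mirroring the example table above).
# Column totals -> individual grading; row averages -> group assessment.

scores = {
    "Spelling":    {"Tim": 3, "Jane": 4, "Mary": 1, "Joe": 2, "Dave": 3},
    "Grammar":     {"Tim": 2, "Jane": 5, "Mary": 3, "Joe": 2, "Dave": 5},
    "Punctuation": {"Tim": 4, "Jane": 5, "Mary": 2, "Joe": 3, "Dave": 4},
    "Structure":   {"Tim": 4, "Jane": 3, "Mary": 4, "Joe": 5, "Dave": 3},
}
students = ["Tim", "Jane", "Mary", "Joe", "Dave"]

# Total down each student's column (used for the individual grade).
column_totals = {s: sum(row[s] for row in scores.values()) for s in students}

# Average across each criterion's row (used for assessment of the group).
row_averages = {c: round(sum(row.values()) / len(row), 1) for c, row in scores.items()}

print(column_totals)  # {'Tim': 13, 'Jane': 17, 'Mary': 10, 'Joe': 12, 'Dave': 15}
print(row_averages)   # {'Spelling': 2.6, 'Grammar': 3.4, 'Punctuation': 3.6, 'Structure': 3.8}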

see Handouts- Rubrics & Johnson Co. CC example 36

The Grading Process: Implications for Assessment

Using the Grading Process & Existing Assignments (stealth assessment)

Build a grading rubric for an assignment
Determine the criteria on which you evaluate student work (primary trait analysis)

Describe shades of performance quality

As a faculty group, sample student work and apply the rubric

95


see Handout Packet- Choosing the Right Assessment Tools 37

Course-level Assessment (a requirement once unique to ACCJC)

Strategies to select a starting point:
Courses with major work (emphasis) aligned to a GE outcome
Student equity & basic skills initiative courses
CTE program capstone course or course with a capstone project
Stand-alone courses via routine review cycle on course outlines or discipline (program) review cycle

see list of URLs in the Handout Packet 38

Liberal Arts- Assessing Outcomes

Disciplinary/Professional Societies:
Carnegie Academy for the Scholarship of Teaching & Learning (CASTL)
American Psychological Association (APA)
Association of College & Research Libraries (ACRL)
National Institute for Science Education (NISE)
National Communications Association (NCA)
American Sociological Association (ASA)
American Historical Association (AHA)
Association for Institutional Research (AIR)

see list of URLs in the Handout Packet 39

Liberal Arts- Assessing Outcomes

Consortia and Associations:
Quality Undergraduate Education Project (QUE)
Association of American Colleges & Universities (AAC&U)
League for Innovation in the Community Colleges (Learning Outcomes for the 21st Century & Getting Results)
National Postsecondary Educational Cooperative - Sourcebooks on Assessment (NPEC)
Research, Planning & Assessment Group (RP Group)
California State University, Institute for Teaching and Learning (CSU, ITL)

96


40

General Ed-Assessment Strategies

Embedded
Map the curriculum to the outcomes (Mira Costa example)
Identify courses that qualify (Bellevue example)
Cascade to course outcomes and activities (Palomar example)
Operationally define the outcomes (build rubrics)
Sample assignments

Exams (standardized or locally developed)
Outcomes related to exam content
Sample classes
CAAP - share the burden

see http://www.miracosta.edu/Governance/Outcomes/index.htm 41

Mira Costa Example
Started with three programs: general ed.; career & technical education; noncredit

Created mission statements for each area of GE
Mapped courses to GE outcomes
Use on-line reporting form - 1 outcome per course

see Handouts- Area B Mission Statement & Matrix; Assessment Reporting Form

Palomar College, San Marcos, CA 42

Palomar College- Communication, GE Outcome- (1 of 6 core skills)

Students will communicate effectively in many different situations, involving diverse people and viewpoints. (core skill)

1. Speaking: Students will speak in an understandable and organized fashion to explain their ideas, express their feelings, or support a conclusion.

2. Listening: Students will listen actively and respectfully to analyze the substance of others’ comments.

3. Reading: Students will read effectively and analytically and will comprehend at the college level.

4. Writing: Students will write in an understandable and organized fashion to explain their ideas, express their feelings, or support a conclusion.

Performance benchmarks (beginning, developing and accomplished) are available for each outcome.

http://www.palomar.edu/alp/ (historic URL, look for core skills hot link)

97


Palomar College, San Marcos, CA 43

Speaking Core Skill Demonstrated Competence Levels

Beginner (1 of 4 descriptors): Clearly state and address assigned topic

Developing (1 of 8 descriptors): Develop a clear thesis

Accomplished (1 of 5 descriptors): Support a clear thesis, with supporting points that move to a conclusion

44

Flex Days and Assessment (a resource to “die for”)

Faculty learning communities:
Reviewing sample student work
Comparing it to criteria and standards for performance
Making sense of the data
Planning/implementing future changes

Poster sessions or sharing of experiences with others
Department meetings

see 5 WASC Rubrics in the Assessment Retreat Materials 45

WASC Senior Commission Rubrics

Quality of academic program learning outcomes
Use of capstone experiences for assessment
General education assessment process
Use of portfolios for assessment
Integration of student learning assessment into program review

98


see Handout Packet- ACCJC SLO Progress Rubric 46

ACCJC SLO Assessment Rubric: Levels of Implementation

Awareness
Development (presently expected)
Proficiency (by Fall 2012)
Sustainable continuous quality improvement

see Handout Packet- ACCJC SLO Progress Rubric 47

Homework (using the ACCJC SLO Progress Rubric)

Compare each bullet point in the development vs. proficiency levels of implementation

1. What is the difference between the levels for each point?
2. How is each point accomplished at your college?
3. What evidence do you have to support your answer to question #2?
4. Where is that evidence kept?

see Handout Packet- examples of assessment cycles 48

The Key Ideas . . .

Being focused on learning evidence: “How do I know that my students are learning?”

Engaging in dialogue with colleagues on what is working with our courses/programs.

This is an ongoing cycle of continuous improvement.

99


Handout Packet for Unique Aspects of Assessment

In Community Colleges Presentation

Fred Trapp, Ph.D. Cambridge West Partnership, LLC

Administrative Dean, Institutional Research/Academic Services (retired)

Long Beach City College

September 2009

100


Web References cited in Unique Issues in Assessment for Community Colleges

California State University performance index: http://www.asd.calstate.edu/performance/index.shtml
Family Educational Rights & Privacy Act (FERPA): http://www.ed.gov/policy/gen/guid/fpco/index.html
CalPASS: http://www.CalPASS.org
National Center for Developmental Education: http://www.ncde.appstate.edu
California Community College Basic Skills Initiative: http://www.cccbsi.org/
American College Testing: http://www.act.org
Educational Testing Service: http://www.ets.org
California Community College Chancellor’s Office, Student Services, Matriculation, Matriculation Archives: http://www.cccco.edu/ChancellorsOffice/Divisions/StudentServices/Matriculation/MatriculationArchives/tabid/627/Default.aspx
California Community College Chancellor’s Office, Academic Affairs, Program Inventory: https://misweb.cccco.edu/webproginv/prod/invmenu.htm
Carnegie Academy for the Scholarship of Teaching and Learning (CASTL): http://www.carnegiefoundation.org/programs/index.asp?key=21
American Psychological Association (APA): http://www.apa.org/ed/eval_strategies.html
Association of College and Research Libraries (ACRL): http://www.ala.org/ala/mgrps/divs/acrl/issues/infolit/index.cfm
Field Tested Learning Assessment Guide for Science, Mathematics, Engineering & Technology (FLAG), a project of the National Institute for Science Education (NISE): http://www.flaguide.org/

101



National Communications Association (NCA): http://www.natcom.org/index.asp?bid=264
American Sociological Association (ASA): http://www.e-noah.net/ASA/ASAShopOnlineService/productslist.aspx?CategoryID=ASACDDM&selection=3
American Historical Association (AHA): http://www.historians.org/perspectives/issues/2009/0903/0903for2.cfm
Association for Institutional Research (AIR): http://www.airweb.org/?page=1217
Quality Undergraduate Education Project (QUE): http://www2.gsu.edu/~wwwque/about/index.html (Chemistry, Biology, History, English)
Association of American Colleges and Universities (AAC&U): http://www.aacu.org/
League for Innovation in the Community Colleges (Project-Learning Outcomes; also Getting Results: On-line Professional Development for Faculty): http://www.league.org/gettingresults/web/module6/assessing/index.html
National Postsecondary Education Cooperative (NPEC): http://nces.ed.gov/NPEC/
Research, Planning & Assessment Group (RP Group) of the California Community Colleges: http://www.rpgroup.org/
California State University, Institute for Teaching and Learning (CSU, ITL): http://www.calstate.edu/ITL/
Rubistar (free tool to help create rubrics): http://rubistar.4teachers.org/index.php
Mira Costa Community College: http://www.miracosta.edu/governance/Outcomes/index.htm
Palomar College (historic site for outcomes, look for core skills): http://www.palomar.edu/alp/

102


Course-Level Student Learning Outcome Assessment Plan & Report

Course: ____________ Department/Program: ____________

Statement of Purpose (role of the course in the curriculum): GE area ________, required in ________ program, elective in ________ program, etc.

Intended Educational Outcomes:

Means and Criteria for Assessment:

Results of Assessment:

Use of Results:

What do students demonstrate that they know or can do in your course? (SLO)

What activities/assignment/instrument/methodology will you use to produce evidence of student mastery of this outcome? Describe the approach you will take to assess the outcome (Who, When & What is Success?)

Describe what actually happened- when, how many & in what way were students assessed. How well did the students perform- how many accomplished your standard of success? What sense do you make of these results?

Comparing your expectations to the results, what changes have you made in pedagogy, assessment means, or standard of success? What are the implications for further assessment work?

1. 1a. 1b. 2. 2a. 2b.

103


Bloom's Revised Taxonomy

Bloom created a learning taxonomy in 1956. During the 1990s, a former student of Bloom's, Lorin Anderson, updated the taxonomy, hoping to add relevance for 21st-century students and teachers. This new expanded taxonomy can help instructional designers and teachers write and revise learning outcomes.

Bloom's six major categories were changed from noun to verb forms.

The new terms are defined as:

Remembering Retrieving, recognizing, and recalling relevant knowledge from long-term memory.

Understanding Constructing meaning from oral, written, and graphic messages through interpreting, exemplifying, classifying, summarizing, inferring, comparing, and explaining.

Applying Carrying out or using a procedure through executing, or implementing.

Analyzing Breaking material into constituent parts, determining how the parts relate to one another and to an overall structure or purpose through differentiating, organizing, and attributing.

104


Evaluating Making judgments based on criteria and standards through checking and critiquing.

Creating Putting elements together to form a coherent or functional whole; reorganizing elements into a new pattern or structure through generating, planning, or producing.

Because the purpose of writing learning outcomes is to define what the instructor wants the student to do with the content, clear learning outcomes help students better understand the purpose of each activity. Verbs such as "know", "appreciate", "internalize", and "value" do not define an explicit performance to be carried out by the learner. (Mager, 1997)

Unclear Outcomes vs. Revised Outcomes

Unclear: Students will know described cases of mental disorders.
Revised: Students will be able to review a set of facts and will be able to classify the appropriate type of mental disorder.

Unclear: Students will understand the relevant and irrelevant numbers in a mathematical word problem.
Revised: Students will distinguish between relevant and irrelevant numbers in a mathematical word problem.

Unclear: Students will know the best way to solve the word problem.
Revised: Students will judge which of the two methods is the best way to solve the word problem.

Examples of unclear and revised outcomes.

Anderson, L. W., & Krathwohl, D. R. (Eds.). (2001). A taxonomy for learning, teaching and assessing: A revision of Bloom's Taxonomy of educational outcomes: Complete edition. New York: Longman.

Cruz, E. (2003). Bloom's revised taxonomy. In B. Hoffman (Ed.), Encyclopedia of Educational Technology. Retrieved August 22, 2007, from http://coe.sdsu.edu/eet/articles/bloomrev/start.htm

Forehand, M. (2005). Bloom's taxonomy: Original and revised. In M. Orey (Ed.), Emerging perspectives on learning, teaching, and technology. Retrieved August 22, 2007, from http://projects.coe.uga.edu/epltt/

105


Using the Grading Process for Assessment

To be helpful to faculty who want to improve student performance as well as to serve the goals of program and general education assessment of student learning, grading must be seen as a process that includes:

1. Identify the most valuable kinds of learning in a course and articulate those outcomes
2. Construct exams and assignments that will match and test that learning outcome
3. Set standards and criteria that are assignment, exam, or performance specific
4. Use primary trait analysis to build a scoring rubric*
5. Guide student learning
6. Implement changes in teaching that are based on information from the grading process

The classroom grading process, with well-constructed rubrics, can be harnessed for program or general education assessment. In doing so, two assumptions are being made:

1. Whatever learning you are trying to promote across the curriculum is being taught and assessed now.
2. Learning skills such as critical thinking or problem solving are context-specific in the disciplines.

A program faculty or general education committee might want to do or know the following:

1. Assure that effective classroom assessment is taking place.
2. Find the common learning expectations among courses.
3. Check the sequence of skills taught in the program.
4. Identify what is required of graduates.
5. Isolate strengths and weaknesses in student performance at the conclusion of the program.
6. Track student performance over time.

*see the handout on rubrics

Source: Walvoord, Barbara, and Anderson, Virginia. Effective Grading: A Tool for Learning and Assessment. Jossey-Bass, San Francisco, 1998. ISBN 0-7879-4030-5

Other good sources:
Milton, Ohmer; Pollio, Howard; and Eison, James. Making Sense of College Grades. Jossey-Bass, San Francisco, 1986. ISBN 0-87589-687-1
Wiggins, Grant. Educative Assessment: Designing Assessments to Inform and Improve Student Performance. Jossey-Bass, San Francisco, 1998. ISBN 0-7879-0848-7

106


Rubrics Handout

A rubric is a scoring tool that divides assignments into component parts or criteria used for evaluation. It provides a detailed description of what is acceptable vs. unacceptable qualities of performance. An analytic rubric makes clear distinctions among the evaluation criteria, while a holistic rubric merges the criteria together to stimulate a general judgment about the quality of student work.

Questions To Ask When Constructing Rubrics

1. What criteria or essential elements must be present in the student’s work to ensure that it is high in quality?

2. How many levels of achievement (mastery) do I wish to illustrate for students?

3. For each criterion or essential element of quality, what is a clear description of performance at each achievement level?

4. What are the consequences of performing at each level of quality?

5. What rating scheme will I use in the rubric?

6. When I use the rubric, what aspects work well and what aspects need improvement?

Additional Questions To Consider

1. What content must students master in order to complete the task well?

2. Are there any important aspects of the task that are specific to the context in which the assessment is set?

3. In the task, is the process of achieving the outcome as important as the outcome itself?

Source: Huba, Mary E., and Freed, Jann E. Learner-Centered Assessment on College Campuses. Allyn & Bacon, Boston, MA, 2000. ISBN 0-205-28738-7.

Additional good references:
Moskal, Barbara M. (2000). Scoring Rubrics: What, When and How? Practical Assessment, Research & Evaluation, 7(3). Available online: http://ericae.net/pare/getvn.asp?v=7&n=3.
Stevens, Dannelle, and Levi, Antonia. Introduction to Rubrics. Stylus Publishing, Herndon, VA, 2004. ISBN 1-57922-114-9. September 2004

The assessment leader at Winona State University (MN) has an excellent set of rubrics at http://www.winona.edu/AIR/ Once there, click on the sample rubrics link in the left frame.
The Center for Learning and Teaching Excellence at Arizona State University has a bank of rubrics at http://clte.asu.edu/resources/instructors/ Select the Assessment Web link in the center of the page.
The CSU System Office has excellent rubrics at http://www.calstate.edu/itl/sloa/index.shtml
Example in action: Raymond Walters College has been making extensive use of rubrics and primary trait assessment for individual course assignments. See the examples link at http://www.rwc.uc.edu/phillips/index_assess.html

107


Johnson County Community College- Writing Outcome

Outcomes Statement: Upon receipt of an associate degree from Johnson County Community College, a student should be able to write a clear, well-organized paper using documentation and quantitative tools when appropriate.

Outcome Rubric:

6 = Essay demonstrates excellent composition skills including a clear and thought-provoking thesis, appropriate and effective organization, lively and convincing supporting materials, effective diction and sentence skills, and perfect or near perfect mechanics including spelling and punctuation. The writing perfectly accomplishes the objectives of the assignment.
5 = Essay contains strong composition skills including a clear and thought-provoking thesis, although development, diction, and sentence style may suffer minor flaws. Shows careful and acceptable use of mechanics. The writing effectively accomplishes the goals of the assignment.
4 = Essay contains above average composition skills, including a clear, insightful thesis, although development may be insufficient in one area and diction and style may not be consistently clear and effective. Shows competence in the use of mechanics. Accomplishes the goals of the assignment with an overall effective approach.
3 = Essay demonstrates competent composition skills including adequate development and organization, although the development of ideas may be trite, assumptions may be unsupported in more than one area, the thesis may not be original, and the diction and syntax may not be clear and effective. Minimally accomplishes the goals of the assignment.
2 = Composition skills may be flawed in either the clarity of the thesis, the development, or organization. Diction, syntax, and mechanics may seriously affect clarity. Minimally accomplishes the majority of the goals of the assignment.
1 = Composition skills may be flawed in two or more areas. Diction, syntax, and mechanics are excessively flawed. Fails to accomplish the goals of the assignment.

Standards: Ten percent of students who have met the requirements for an associate degree at JCCC will earn 6 (excellent) on each of the communication rubrics. Thirty percent of students earning an associate degree will score 5 (very good) or 6 (excellent). Eighty percent will earn scores of 4 (satisfactory) or higher and the top 98 percent will earn scores of 3 (minimal accomplishment of educational goals) or higher. The remaining 2 percent of the associate degree recipients are expected to earn the score of 2 (unsatisfactory) on the communication rubrics. The score of 1 represents a skill level beneath the expectation of all associate degree recipients at JCCC. Hence, no associate degree recipients are expected to score at the level of 1 on the communications rubrics.

Suggested Assignment Guidelines

An appropriate assignment (e.g., paper, homework, project) would allow students to demonstrate composition skills by asking them to:

• develop a clear thesis statement;
• develop main points with appropriate and convincing supporting materials;
• utilize appropriate and effective organization of content;
• demonstrate a clear and coherent writing style that uses effective diction and sentence skills; and
• demonstrate correct mechanical skills including spelling and punctuation.

108


Choosing the Right Assessment Tools, Gary Williams, Crafton Hills College

Column key: Data = Direct (D) or Indirect (I); Domain = Cognitive (C), Psychomotor (P), or Affective (A); Formative (F) or Summative (S); Bloom's Taxonomy = Knowledge (K), Comprehension (C), Application (A), or Analysis/Synthesis/Evaluation (ASE); Pros; Cons.

Oral Speech
Data: D; Domain: C; F, S; Bloom's: variable (K, C, A, ASE)
Pros: easily graded with a rubric; allows other students to see and learn what each student learned; connects general education goals with discipline-specific courses
Cons: difficult for ESL students; stressful for students; takes course time; must fairly grade course content beyond delivery

Debate
Data: D; Domain: C, A; F, S; Bloom's: K, C, A, ASE
Pros: provides immediate feedback to the student; reveals thinking and ability to respond based on background knowledge and critical thinking ability
Cons: requires a good rubric; more than one evaluator is helpful; difficult for ESL students; stressful for students; takes course time

Product Creation & Special Reports
Data: D; Domain: C, P, A; F, S; Bloom's: variable (K, C, A, ASE)
Pros: students can display skills, knowledge, and abilities in a way that is suited to them
Cons: must have clearly defined criteria and evaluative measures; "the look" cannot override the content

Flowchart or Diagram
Data: D; Domain: C; F, S; Bloom's: C, A, ASE
Pros: displays original synthetic thinking on the part of the student; perhaps the best way to display overall high-level thinking and articulation abilities
Cons: more difficult to grade, requiring a checklist or rubric for a variety of different answers; difficult for some students to do on the spot

109


Portfolios
Data: D; Domain: C, P; S; Bloom's: variable
Pros: provides the students with a clear record of their work and growth; best evidence of growth and change over time; students can display skills, knowledge, and abilities in a way that is suited to them; promotes self-assessment
Cons: time consuming to grade; different content in portfolios makes evaluating difficult and may require training; bulky to manage depending on size

Exit Surveys
Data: D, I; Domain: A; S; Bloom's: ASE
Pros: provides good summative data; easy to manage data if Likert-scaled responses are used
Cons: Likert scales limit feedback; open-ended responses are bulky to manage

Performance
Data: D; Domain: C, P; F, S; Bloom's: variable (K, C, A, ASE)
Pros: provides the best display of skills and abilities; provides an excellent opportunity for peer review; students can display skills, knowledge, and abilities in a way that is suited to them
Cons: stressful for students; may take course time; some students may take the evaluation very hard; evaluative statements must be carefully framed

Capstone project or course
Data: D; Domain: C, P, A; F, S; Bloom's: ASE
Pros: best method to measure growth over time with regard to a course or program (cumulative)
Cons: focus and breadth of assessment are important; understanding all the variables that produce assessment results is also important; may result in additional course requirements

110


Team Project
Data: D; Domain: C, A; F, S; Bloom's: variable (K, C, A, ASE)
Pros: connects general education goals with discipline-specific courses
Cons: requires coordination and agreement on standards; must fairly grade individuals as well as the team; grading is slightly more complicated; student interaction may be a challenge

Reflective self-assessment essay
Data: D, I; Domain: C, A; S; Bloom's: ASE
Pros: provides an invaluable ability to evaluate affective growth in students
Cons: must use evidence to support conclusions, not just self-opinionated assessment

Satisfaction and Perception Surveys
Data: I; Domain: C, P, A; S; Bloom's: C, A, ASE
Pros: provides good indirect data; data can be compared longitudinally; can be used to determine outcomes over a long period of time
Cons: respondents may be influenced by factors other than those being considered; validity and reliability must be closely watched

111


Mira Costa College

Area B (Physical Universe and its Life Forms)

Area B Mission Statement (drafted 10/20/06)
Students in Area B will be able to investigate and explain physical phenomena through the application of empirical knowledge using mathematical and scientific processes and concepts.

Anthropology
Students completing courses in anthropology within Area B will understand what it means to be human from a biological perspective. They will garner this understanding through integration of scientific method and evidence, including comparisons with other animal species and development of ecological and evolutionary paradigms.

Life Sciences
Students in the Life Sciences will become scientific thinkers who are curious and knowledgeable about biological systems and who rely on experimentation, logic, evidence, objective reasoning and healthy skepticism to explain natural phenomena.

112


Mira Costa College Student Learning Outcome Status Report

Department: ____________ Discipline: ____________ Course: ____________
SLO Written (semester & year): ____________
Assessment Administered (semester & year): ____________
Evaluation of Assessment Data Completed (semester & year): ____________
GE Program-Level Outcomes: Effective Communication, Critical Thinking, Global Awareness and Responsible Citizenship, Information Literacy, Aesthetic Literacy and Appreciation, Productive Work Habits.
CTE Program-Level Outcomes: Technical Skills, Application of Discipline Skills, Critical Thinking and Problem Solving, Communication, Professional Behavior.
A) Student Learning Outcome

B) General Education or CTE SLO(s) to which course SLO aligns (see above)

C) Assessment Task(s)

D) Expected Level of Achievement/Baseline

E) How Data were Gathered and Evaluated

F) Results of Evaluation

G) Use of Data/Plans

Accrediting Commission for Community and Junior Colleges (ACCJC)

113



Western Association of Schools and Colleges

Rubric for Evaluating Institutional Effectiveness – Part III: Student Learning Outcomes (See attached instructions on how to use this rubric.)

Levels of Implementation

Characteristics of Institutional Effectiveness in Student Learning Outcomes

(Sample institutional behaviors)

Awareness

• There is preliminary, investigative dialogue about student learning outcomes.
• There is recognition of existing practices such as course objectives and how they relate to student learning outcomes.
• There is exploration of models, definitions, and issues taking place by a few people.
• Pilot projects and efforts may be in progress.
• The college has discussed whether to define student learning outcomes at the level of some courses or programs or degrees; where to begin.

Development

• College has established an institutional framework for definition of student learning outcomes (where to start), how to extend, and timeline.
• College has established authentic assessment strategies for assessing student learning outcomes as appropriate to intended course, program, and degree learning outcomes.
• Existing organizational structures (e.g. Senate, Curriculum Committee) are supporting strategies for student learning outcomes definition and assessment.
• Leadership groups (e.g. Academic Senate and administration) have accepted responsibility for student learning outcomes implementation.
• Appropriate resources are being allocated to support student learning outcomes and assessment.
• Faculty and staff are fully engaged in student learning outcomes development.

Proficiency

• Student learning outcomes and authentic assessment are in place for courses, programs and degrees.
• Results of assessment are being used for improvement and further alignment of institution-wide practices.
• There is widespread institutional dialogue about the results.
• Decision-making includes dialogue on the results of assessment and is purposefully directed toward improving student learning.
• Appropriate resources continue to be allocated and fine-tuned.
• Comprehensive assessment reports exist and are completed on a regular basis.
• Course student learning outcomes are aligned with degree student learning outcomes.
• Students demonstrate awareness of goals and purposes of courses and programs in which they are enrolled.

Sustainable Continuous Quality Improvement

• Student learning outcomes and assessment are ongoing, systematic and used for continuous quality improvement.
• Dialogue about student learning is ongoing, pervasive and robust.
• Evaluation and fine-tuning of organizational structures to support student learning is ongoing.
• Student learning improvement is a visible priority in all practices and structures across the college.
• Learning outcomes are specifically linked to program reviews.

114


Pierce College

Speech 101, Public Speaking, Course Assessment Loop

Start
1. Faculty created an assessment rubric with three main criteria: delivery, organization, research.
2. All full-time faculty used the rubric on randomly selected (every 5th student) speeches and rated students 1-4 according to the rubric on each criterion.
3. Student speeches were assessed and the data was discussed by the Speech faculty.
4. The faculty concluded that the weakest area of student performance was research.
5. Changed pedagogy: (1) beefed up research instruction; (2) provided supplemental experiences with research.
End (for now)

115


Cabrillo College

Critical Thinking, General Education Assessment Loop

Start
1. Instructors individually assessed one critical thinking assignment in one class.
2. Faculty scored the student work with a rubric and analyzed student performance and needs.
3. The department met to share results. They concluded that students needed help with reading.
4. Faculty changed pedagogy: revamped classes to include reading techniques.
5. The department received funds for training in integrating reading and writing.
End (for now)

116


Capital College (CT)

Common Writing Assignment, General Education Assessment Loop

Start
1. GE outcomes were articulated & grading rubrics were developed.
2. 12 faculty teaching 15 classes in different disciplines provided a common writing assignment to their students.
3. 100 sample papers were scored twice by two readers, each using a holistic then an analytic rubric.
4. Students were weakest in the development of ideas supported by evidence and in the use of language. Students who reported having written essays in classes other than English demonstrated greater levels of writing skills & were more likely to achieve a proficiency score.
5. (1) The committee on writing standards started a coordinated dialogue & professional development activities to improve writing across the college by supporting early and continuous student practice in writing, with emphasis on development of ideas and use of language. (2) College policy and practice were reviewed to find ways to enforce the ideal of completing developmental English first or, if placed into upper-level composition, completing it within the first 15 units of college work.
End (for now)

117



Bakersfield College

Biology Allied Health Curriculum Pathway (Locally Defined Program) Assessment Loop

Start
1. Wrote the SLOs for courses in the health-care pathway.
2. Looked at success & retention data in individual pathway courses and the program in total; assessed the course SLOs with embedded exam questions.
3. Analyzed & discussed the data.
4. Changed the curriculum: rewrote course linkages & layout; reduced hours in courses.
5. Changed curriculum: added a supplemental instructional lab for those who needed it; added a capstone course for students on the waiting list to get into programs.
End (for now)

118


Lecture / Discussion 5:

Developing and Applying

Rubrics

Mary Allen

119


Developing and Applying Rubrics Mary J. Allen, [email protected], September 2009

Rubrics provide the criteria for classifying products or behaviors into categories that vary along a continuum. They can be used to classify virtually any product or behavior, such as essays, research reports, portfolios, works of art, recitals, oral presentations, performances, and group activities. Judgments can be self-assessments by students; or judgments can be made by others, such as faculty, other students, fieldwork supervisors, and external reviewers. Rubrics can be used to provide formative feedback to students, to grade students, and/or to assess programs. There are two major types of scoring rubrics: • Holistic scoring — one global, holistic score for a product or behavior • Analytic rubrics — separate, holistic scoring of specified characteristics of a product or

behavior

Rubric Examples

• Holistic Critical Thinking Scoring Rubric (Facione & Facione)
• Holistic Critical Thinking Rubric (Portland State University)
• Critical Thinking Rubric (Northeastern Illinois University)
• Scoring Guide for Critical Thinking (California State University, Fresno)
• Information Competence (CA State University)
• Writing Rubric (Roanoke College)

120


Holistic Critical Thinking Scoring Rubric Facione and Facione

4

Consistently does all or almost all of the following: Accurately interprets evidence, statements, graphics, questions, etc. Identifies the salient arguments (reasons and claims) pro and con. Thoughtfully analyzes and evaluates major alternative points of view. Draws warranted, judicious, non-fallacious conclusions. Justifies key results and procedures, explains assumptions and reasons. Fair-mindedly follows where evidence and reasons lead.

3

Does most or many of the following: Accurately interprets evidence, statements, graphics, questions, etc. Identifies relevant arguments (reasons and claims) pro and con. Offers analyses and evaluations of obvious alternative points of view. Draws warranted, non-fallacious conclusions. Justifies some results or procedures, explains reasons. Fair-mindedly follows where evidence and reasons lead.

2

Does most or many of the following: Misinterprets evidence, statements, graphics, questions, etc. Fails to identify strong, relevant counter-arguments. Ignores or superficially evaluates obvious alternative points of view. Draws unwarranted or fallacious conclusions. Justifies few results or procedures, seldom explains reasons. Regardless of the evidence or reasons, maintains or defends views based on self-interest or preconceptions.

1

Consistently does all or almost all of the following: Offers biased interpretations of evidence, statements, graphics, questions, information, or the points of view of others. Fails to identify or hastily dismisses strong, relevant counter-arguments. Ignores or superficially evaluates obvious alternative points of view. Argues using fallacious or irrelevant reasons, and unwarranted claims. Does not justify results or procedures, nor explain reasons. Regardless of the evidence or reasons, maintains or defends views based on self-interest or preconceptions. Exhibits close-mindedness or hostility to reason.

(c) 1994, Peter A. Facione, Noreen C. Facione, and The California Academic Press. 217 La Cruz Ave., Millbrae, CA 94030. Permission is hereby granted to students, faculty, staff, or administrators at public or nonprofit educational institutions for unlimited duplication of the critical thinking scoring rubric, rating form, or instructions herein for local teaching, assessment, research, or other educational and noncommercial uses, provided that no part of the scoring rubric is altered and that "Facione and Facione" are cited as authors. Retrieved September 2, 2005 from http://www.insightassessment.com/pdf_files/rubric.pdf

121


Portland State University Studies Program Holistic Critical Thinking Rubric*

Inquiry and Critical Thinking Rubric
Students will learn various modes of inquiry through interdisciplinary curricula—problem posing, investigating, conceptualizing—in order to become active, self-motivated, and empowered learners.

6 (Highest)—Consistently does all or almost all of the following: • Accurately interprets evidence, statements, graphics, questions, etc. • Identifies the salient arguments (reasons and claims) pro and con. • Thoughtfully analyzes and evaluates major alternative points of view. • Generates alternative explanations of phenomena or event. • Justifies key results and procedures, explains assumptions and reasons. • Fair-mindedly follows where evidence and reasons lead. • Makes ethical judgments.

5—Does most of the following: • Accurately interprets evidence, statements, graphics, questions, etc. • Thinks through issues by identifying relevant arguments (reasons and claims) pro and con. • Offers analysis and evaluation of obvious alternative points of view. • Generates alternative explanations of phenomena or event. • Justifies (by using) some results or procedures, explains reasons. • Fair-mindedly follows where evidence and reasons lead.

4—Does most of the following: • Describes events, people, and places with some supporting details from the source. • Makes connections to sources, either personal or analytic. • Demonstrates a basic ability to analyze, interpret, and formulate inferences. • States or briefly includes more than one perspective in discussing literature, experiences, and points of view of others. • Takes some risks by occasionally questioning sources or by stating interpretations and predictions. • Demonstrates little evidence of rethinking or refinement of one’s own perspective.

3—Does most or many of the following: • Responds by retelling or graphically showing events or facts. • Makes personal connections or identifies connections within or between sources in a limited way. Is beginning to use appropriate evidence to back ideas. • Discusses literature, experiences, and points of view of others in terms of own experience. • Responds to sources at factual or literal level. • Includes little or no evidence of refinement of initial response or shift in dualistic thinking. • Demonstrates difficulty with organization and thinking is uneven.

122

Page 125: Retreat on Student Learning and Assessment, Level I · Retreat on Student Learning and Assessment, Level I ... A Close-Up Look ... the nature of the students served by those ...

2—Does many or most of the following: • Misinterprets evidence, statements, graphics, questions, etc. • Fails to identify strong, relevant counter arguments. • Draws unwarranted or fallacious conclusions. • Justifies few results or procedures, seldom explains reasons. • Regardless of the evidence or reasons, maintains or defends views based on self-interest or preconceptions.

1 (lowest)—Consistently does all or almost all of the following: • Offers biased interpretations of evidence, statements, graphics, questions, information, or the points of view of others. • Fails to identify or hastily dismisses strong, relevant counterarguments. • Ignores or superficially evaluates obvious alternative points of view. • Argues using fallacious or irrelevant reasons and unwarranted claims. • Does not justify results or procedures, nor explain reasons. • Exhibits close-mindedness or hostility to reason.

X—No basis for scoring. (Use only for missing or malfunctioning portfolios.)

*taken verbatim from Stevens, D. D., & Levi, A. J. (2005). Introduction to Rubrics. Sterling, VA: Stylus, pp. 122-123

123

Page 126: Retreat on Student Learning and Assessment, Level I · Retreat on Student Learning and Assessment, Level I ... A Close-Up Look ... the nature of the students served by those ...

Northeastern Illinois University General Education Critical Thinking Rubric Retrieved 3/2/05 from http://www.neiu.edu/~neassess/gened.htm#rubric

Quality levels for each macro criterion: No/Limited Proficiency (D&E), Some Proficiency (C), Proficiency (B), High Proficiency (A)

1. Identifies & Explains Issues Fails to identify, summarize, or explain the main problem or question. Represents the issues inaccurately or inappropriately.

Identifies main issues but does not summarize or explain them clearly or sufficiently

Successfully identifies and summarizes the main issues, but does not explain why/how they are problems or create questions

Clearly identifies and summarizes main issues and successfully explains why/how they are problems or questions; and identifies embedded or implicit issues, addressing their relationships to each other.

2. Distinguishes Types of Claims

Fails to label correctly any of the factual, conceptual and value dimensions of the problems and proposed solutions.

Successfully identifies some, but not all of the factual, conceptual, and value aspects of the questions and answers.

Successfully separates and labels all the factual, conceptual, and value claims

Clearly and accurately labels not only all the factual, conceptual, and value, but also those implicit in the assumptions and the implications of positions and arguments.

3. Recognizes Stakeholders and Contexts

Fails accurately to identify and explain any empirical or theoretical contexts for the issues. Presents problems as having no connections to other conditions or contexts.

Shows some general understanding of the influences of empirical and theoretical contexts on stakeholders, but does not identify many specific ones relevant to situation at hand.

Correctly identifies all the empirical and most of theoretical contexts relevant to all the main stakeholders in the situation.

Not only correctly identifies all the empirical and theoretical contexts relevant to all the main stakeholders, but also finds minor stakeholders and contexts and shows the tension or conflicts of interests among them.

4. Considers Methodology Fails to explain how/why/which specific methods of research are relevant to the kind of issue at hand.

Identifies some but not all methods required for dealing with the issue; does not explain why they are relevant or effective.

Successfully explains how/why/which methods are most relevant to the problem.

In addition to explaining how/why/which methods are typically used, also describes embedded methods and possible alternative methods of working on the problem.

5. Frames Personal Responses and Acknowledges Other Perspectives

Fails to formulate and clearly express own point of view, (or) fails to anticipate objections to his/her point of view, (or) fails to consider other perspectives and position.

Formulates a vague and indecisive point of view, or anticipates minor but not major objections to his/her point of view, or considers weak but not strong alternative positions.

Formulates a clear and precise personal point of view concerning the issue, and seriously discusses its weaknesses as well as its strengths.

Not only formulates a clear and precise personal point of view, but also acknowledges objections and rival positions and provides convincing replies to these.

124

Page 127: Retreat on Student Learning and Assessment, Level I · Retreat on Student Learning and Assessment, Level I ... A Close-Up Look ... the nature of the students served by those ...

California State University, Fresno General Education Scoring Guide for Critical Thinking Retrieved 3/2/05 from http://www.csufresno.edu/cetl/assessment/CTScoring.doc

Scoring Level Interpretation Analysis & Evaluation Presentation

4 - Accomplished: Analyzes insightful questions; Examines conclusions; Argues succinctly; Refutes bias; Uses reasonable judgment; Discusses issues thoroughly; Critiques content; Discriminates rationally; Shows intellectual honesty; Examines inconsistencies; Synthesizes data; Justifies decisions; Values information; Views information critically; Assimilates information

3 - Competent: Asks insightful questions; Formulates conclusions; Argues clearly; Detects bias; Recognizes arguments; Identifies issues; Categorizes content; Notices differences; Attributes sources naturally; Identifies inconsistencies; Evaluates data; Suggests solutions; Recognizes context; Seeks out information; Incorporates information

2 - Developing: Identifies some questions; Identifies some conclusions; Misconstructs arguments; Notes some bias; Sees some arguments; Generalizes issues; Recognizes basic content; Identifies some differences; Cites sources; States some inconsistencies; Paraphrases data; Presents few options; Selects sources adequately; Assumes information valid; Overlooks some information

1 - Beginning: Fails to question data; Fails to draw conclusions; Omits argument; Ignores bias; Sees no arguments; Misrepresents issues; Misses major content areas; Overlooks differences; Excludes data; Detects no inconsistencies; Repeats data; Draws faulty conclusions; Chooses biased sources; Omits research; Shows intellectual dishonesty

125

Page 128: Retreat on Student Learning and Assessment, Level I · Retreat on Student Learning and Assessment, Level I ... A Close-Up Look ... the nature of the students served by those ...

Rubrics for Assessing Information Competence in the California State University ACRL Standard Beginning Proficient Advanced

1. Determine the Extent of the Information Needed

Student is unable to effectively formulate a research question based on an information need.

Student can formulate a question that is focused and clear. Student identifies concepts related to the topic, and can find a sufficient number of information resources to meet the information need.

Question is focused, clear, and complete. Key concepts and terms are identified. Extensive information sources are identified in numerous potential formats.

2. Access the Needed Information Effectively and Efficiently

Student is unfocused and unclear about search strategy. Time is not used effectively and efficiently. Information gathered lacks relevance, quality, and balance.

Student executes an appropriate search strategy within a reasonable amount of time. Student can solve problems by finding a variety of relevant information resources, and can evaluate search effectiveness.

Student is aware and able to analyze search results, and evaluate the appropriateness of the variety of (or) multiple relevant sources of information that directly fulfill an information need for the particular discipline,

3. Evaluate Information and its Sources Critically

Student is unaware of criteria that might be used to judge information quality. Little effort is made to examine the information located

Student examines information using criteria such as authority, credibility, relevance, timeliness, and accuracy, and is able to make judgments about what to keep and what to discard.

Multiple and diverse sources and viewpoints of information are compared and evaluated according to specific criteria appropriate for the discipline. Student is able to match criteria to a specific information need, and can articulate how identified sources relate to the context of the discipline.

4. Use Information Effectively to Accomplish a Specific Purpose

Student is not aware of the information necessary to research a topic, and the types of data that would be useful in formulating a convincing argument. Information is incomplete and does not support the intended purpose.

Student uses appropriate information to solve a problem, answer a question, write a paper, or other purposes

Student is aware of the breadth and depth of research on a topic, and is able to reflect on search strategy, synthesize and integrate information from a variety of sources, draw appropriate conclusions, and is able to clearly communicate ideas to others

5. Understand the Economic, Legal, and Social Issues surrounding the Use of Information, and Access and Use Information Ethically and Legally

Student is unclear regarding proper citation format, and/or copies and paraphrases the information and ideas of others without giving credit to authors. Student does not know how to distinguish between information that is objective and biased, and does not know the role that free access to information plays in a democratic society.

Student gives credit for works used by quoting and listing references. Student is an ethical consumer and producer of information, and understands how free access to information, and free expression, contribute to a democratic society.

Student understands and recognizes the concept of intellectual property, can defend him/herself if challenged, and can properly incorporate the ideas/published works of others into their own work building upon them. Student can articulate the value of information to a free and democratic society, and can use specific criteria to discern objectivity/fact from bias/propaganda.

*Prepared by the CSU Information Competence Initiative, October 2002, based on the 2000 ACRL Information Literacy Competency Standards For Higher Education. For more information, see http://www.calstate.edu/LS/1_rubric.doc.

126

Page 129: Retreat on Student Learning and Assessment, Level I · Retreat on Student Learning and Assessment, Level I ... A Close-Up Look ... the nature of the students served by those ...

Draft of Writing Rubric—Retrieved August 28, 2008 from http://web.roanoke.edu/Documents/Writing%20Rubrics.July%2007.doc

Levels: Below Basic, Basic, Proficient, Advanced

Ideas: Shows minimal engagement with the topic, failing to recognize multiple dimensions/perspectives; lacking even basic observations

Shows some engagement with the topic without elaboration; offers basic observations but rarely original insight

Demonstrates engagement with the topic, recognizing multiple dimensions and/or perspectives; offers some insight

Demonstrates engagement with the topic, recognizing multiple dimensions and/or perspectives with elaboration and depth; offers considerable insight

Focus and Thesis

Paper lacks focus and/or a discernible thesis.

Some intelligible ideas, but thesis is weak, unclear, or too broad.

Identifiable thesis representing adequate understanding of the assigned topic; minimal irrelevant material

Clear, narrow thesis representing full understanding of the assignment; every word counts

Evidence Little to no evidence Some evidence but not enough to develop argument in unified way. Evidence may be inaccurate, irrelevant, or inappropriate for the purpose of the essay

Evidence accurate, well documented, and relevant, but not complete, well integrated, and/or appropriate for the purpose of the essay

Evidence is relevant, accurate, complete, well integrated, well documented, and appropriate for the purpose of the essay.

Organization Organization is missing both overall and within paragraphs. Introduction and conclusion may be lacking or illogical.

Organization, overall and/or within paragraphs, is formulaic or occasionally lacking in coherence; few evident transitions. Introduction and conclusion may lack logic.

Few organizational problems on any of the 3 levels (overall, paragraph, transitions). Introduction and conclusion are effectively related to the whole.

Organization is logical and appropriate to assignment; paragraphs are well-developed and appropriately divided; ideas linked with smooth and effective transitions. Introduction and conclusion are effectively related to the whole.

Style and Mechanics

Multiple and serious errors of sentence structure; frequent errors in spelling and capitalization; intrusive and/or inaccurate punctuation such that communication is hindered. Proofreading not evident.

Sentences show errors of structure and little or no variety; many errors of punctuation, spelling and/or capitalization. Errors interfere with meaning in places. Careful proofreading not evident.

Effective and varied sentences; some errors in sentence construction; only occasional punctuation, spelling and/or capitalization errors.

Each sentence structured effectively, powerfully; rich, well-chosen variety of sentence styles and length; virtually free of punctuation, spelling, capitalization errors.

127


Rubrics have many strengths: • Complex products or behaviors can be examined efficiently. • Developing a rubric helps to precisely define faculty expectations. • Well-trained reviewers apply the same criteria and standards. • Rubrics are criterion-referenced, rather than norm-referenced. Raters ask, “Did the student

meet the criteria for level 5 of the rubric?” rather than “How well did this student do compared to other students?” This is more compatible with cooperative and collaborative learning environments than competitive grading schemes and is essential when using rubrics for program assessment because you want to learn how well students have met your standards.

• Ratings can be done by students to assess their own work, or they can be done by others, e.g., peers, fieldwork supervisors, or faculty.

Rubrics can be useful for grading as well as assessment. Below is a rubric for assessing oral presentation skills, followed by four examples of grading rubrics based on adapting the assessment rubric. With calibration, these grading rubrics can be used to assess the program learning outcome by aggregating the results for Organization, Content, and Delivery across courses; a small worked sketch of that aggregation follows the rubric.

Below Expectation   Satisfactory   Exemplary

Organization   No apparent organization. Evidence is not used to support assertions.

The presentation has a focus and provides some evidence which supports conclusions.

The presentation is carefully organized and provides convincing evidence to support conclusions.

Content The content is inaccurate or overly general. Listeners are unlikely to learn anything or may be misled.

The content is generally accurate, but incomplete. Listeners may learn some isolated facts, but they are unlikely to gain new insights about the topic.

The content is accurate and complete. Listeners are likely to gain new insights about the topic.

Delivery The speaker appears anxious and uncomfortable, and reads notes, rather than speaks. Listeners are largely ignored.

The speaker is generally relaxed and comfortable, but too often relies on notes. Listeners are sometimes ignored or misunderstood.

The speaker is relaxed and comfortable, speaks without undue reliance on notes, and interacts effectively with listeners.
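To make the aggregation idea concrete, here is a minimal sketch (in Python) of how category ratings gathered in several courses might be pooled by rubric dimension and reported as percentages. The course names, counts, and ratings are invented for illustration; a pre-programmed spreadsheet could do the same tally.

```python
from collections import Counter

# Hypothetical ratings from two course sections; each student receives a
# category on each dimension of the oral presentation rubric above.
CATEGORIES = ["Below Expectation", "Satisfactory", "Exemplary"]

course_ratings = {
    "SPCH 101": [
        {"Organization": "Satisfactory", "Content": "Exemplary", "Delivery": "Satisfactory"},
        {"Organization": "Below Expectation", "Content": "Satisfactory", "Delivery": "Satisfactory"},
    ],
    "BUS 210": [
        {"Organization": "Exemplary", "Content": "Satisfactory", "Delivery": "Exemplary"},
    ],
}

def aggregate(course_ratings):
    """Pool ratings across courses and report the percentage at each category."""
    pooled = {}
    for ratings in course_ratings.values():
        for student in ratings:
            for dimension, category in student.items():
                pooled.setdefault(dimension, Counter())[category] += 1
    for dimension, counts in pooled.items():
        total = sum(counts.values())
        summary = ", ".join(f"{c}: {100 * counts[c] / total:.0f}%" for c in CATEGORIES)
        print(f"{dimension}: {summary} (n={total})")

aggregate(course_ratings)
```

The same tally works for the adapted grading rubrics in the four examples that follow, provided the category boundaries are applied consistently (that is, the raters are calibrated).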

128


Example 1. Numbers are used for grading; categories (Below Expectation, Satisfactory, Exemplary) are used for assessment. Individual faculty determine how to assign numbers for their course grading. Faculty may circle or underline material in the cells to emphasize criteria that were particularly important during the assessment/grading.

Analytic Rubric for Grading Oral Presentations Below Expectation Satisfactory Exemplary

Score

Organization   No apparent organization. Evidence is not used to support assertions.

(0-4)

The presentation has a focus and provides some evidence which supports conclusions.

(5-6)

The presentation is carefully organized and provides convincing evidence to support conclusions.

(7-8)

Content The content is inaccurate or overly general. Listeners are unlikely to learn anything or may be misled.

(0-8)

The content is generally accurate, but incomplete. Listeners may learn some isolated facts, but they are unlikely to gain new insights about the topic.

(9-11)

The content is accurate and complete. Listeners are likely to gain new insights about the topic.

(12-13)

Delivery The speaker appears anxious and uncomfortable, and reads notes, rather than speaks. Listeners are largely ignored.

(0-5)

The speaker is generally relaxed and comfortable, but too often relies on notes. Listeners are sometimes ignored or misunderstood.

(6-7)

The speaker is relaxed and comfortable, speaks without undue reliance on notes, and interacts effectively with listeners.

(8-9)

Total Score
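A minimal sketch of how Example 1 might work in practice: the numeric score in each dimension is summed for the course grade, while the assessment category that each score falls in is recorded separately for program assessment. The score ranges are copied from the rubric above; the student's scores are invented.

```python
# Score ranges copied from Example 1; the student's scores are invented.
RANGES = {
    "Organization": [(0, 4, "Below Expectation"), (5, 6, "Satisfactory"), (7, 8, "Exemplary")],
    "Content":      [(0, 8, "Below Expectation"), (9, 11, "Satisfactory"), (12, 13, "Exemplary")],
    "Delivery":     [(0, 5, "Below Expectation"), (6, 7, "Satisfactory"), (8, 9, "Exemplary")],
}

def category_for(dimension, score):
    """Map a numeric score to the assessment category defined for that dimension."""
    for low, high, label in RANGES[dimension]:
        if low <= score <= high:
            return label
    raise ValueError(f"{score} is outside the {dimension} scale")

student_scores = {"Organization": 6, "Content": 12, "Delivery": 5}

total = sum(student_scores.values())  # feeds the course grade (here, 23 of 30 possible points)
categories = {d: category_for(d, s) for d, s in student_scores.items()}  # feeds program assessment

print("Total score:", total)
print("Assessment categories:", categories)
```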

129


Example 2. Weights are used for grading; categories (Below Expectation, Satisfactory, Exemplary) are used for assessment. Individual faculty determine how to assign weights for their course grading. Faculty may circle or underline material in the cells to emphasize criteria that were particularly important during the assessment/grading.

Analytic Rubric for Grading Oral Presentations Below Expectation Satisfactory Exemplary

Weight

Organization No apparent organization. Evidence is not used to support assertions.

The presentation has a focus and provides some evidence which supports conclusions.

The presentation is carefully organized and provides convincing evidence to support conclusions

30%

Content The content is inaccurate or overly general. Listeners are unlikely to learn anything or may be misled.

The content is generally accurate, but incomplete. Listeners may learn some isolated facts, but they are unlikely to gain new insights about the topic.

The content is accurate and complete. Listeners are likely to gain new insights about the topic.

50%

Delivery The speaker appears anxious and uncomfortable, and reads notes, rather than speaks. Listeners are largely ignored.

The speaker is generally relaxed and comfortable, but too often relies on notes. Listeners are sometimes ignored or misunderstood.

The speaker is relaxed and comfortable, speaks without undue reliance on notes, and interacts effectively with listeners.

20%

Comments
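A minimal sketch of how Example 2's weights might be applied: the category assigned in each dimension is kept for program assessment, and the course grade is computed from the weights. How a category converts to points is left to the individual instructor, so the 0-100 point values below are one invented possibility.

```python
# Weights copied from Example 2; the point values per category are invented.
WEIGHTS = {"Organization": 0.30, "Content": 0.50, "Delivery": 0.20}
POINTS = {"Below Expectation": 60, "Satisfactory": 80, "Exemplary": 100}

ratings = {"Organization": "Satisfactory", "Content": "Exemplary", "Delivery": "Satisfactory"}

# Course grade: weighted combination of the point values.
weighted_grade = sum(WEIGHTS[dim] * POINTS[cat] for dim, cat in ratings.items())
print(f"Weighted grade: {weighted_grade:.0f}")  # 0.30*80 + 0.50*100 + 0.20*80 = 90

# Program assessment uses only the category labels in `ratings`;
# the weights and point conversion affect the course grade alone.
```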

130


Example 3. The faculty member checks off characteristics of the speech and determines the grade based on a holistic judgment. The categories (Below Expectation, Satisfactory, Exemplary) are used for assessment. Individual faculty might add scores or score ranges (see Example 1) or a “Weight” column (see Example 2) for grading purposes.

Analytic Rubric for Grading Oral Presentations Below Expectation Satisfactory Exemplary

Organization   No apparent organization. Evidence is not used to support assertions.

The presentation has a focus.

Student provides some evidence which supports conclusions.

The presentation is carefully organized.

Speaker provides convincing evidence to support conclusions

Content The content is inaccurate or overly general.

Listeners are unlikely to learn anything or may be misled.

The content is generally accurate, but incomplete.

Listeners may learn some isolated facts, but they are unlikely to gain new insights about the topic.

The content is accurate and complete.

Listeners are likely to gain new insights about the topic.

Delivery The speaker appears anxious and uncomfortable.

Speaker reads notes, rather than speaks.

Listeners are largely ignored.

The speaker is generally relaxed and comfortable.

Speaker too often relies on notes.

Listeners are sometimes ignored or misunderstood.

The speaker is relaxed and comfortable.

Speaker speaks without undue reliance on notes.

Speaker interacts effectively with listeners.

Comments:

131


Example 4. Combinations of Various Ideas. As long as the nine assessment cells are used in the same way by all faculty, grading and assessment can be done simultaneously.

Analytic Rubric for Grading Oral Presentations Below

Expectation 1

Satisfactory 2

Exemplary 3

Weight

Organization No apparent organization.

Evidence is not used to support assertions.

The presentation has a focus.

Speaker provides some evidence which supports conclusions.

The presentation is carefully organized.

Speaker provides convincing evidence to support conclusions

20%

Content The content is inaccurate or overly general.

Listeners are unlikely to learn anything or may be misled.

The content is generally accurate, but incomplete.

Listeners may learn some isolated facts, but they are unlikely to gain new insights about the topic.

The content is accurate and complete.

Listeners are likely to gain new insights about the topic.

40%

Delivery The speaker appears anxious and uncomfortable.

Speaker reads notes, rather than speaks.

Listeners are largely ignored.

The speaker is generally relaxed and comfortable.

Speaker too often relies on notes.

Listeners are sometimes ignored or misunderstood.

The speaker is relaxed and comfortable.

Speaker speaks without undue reliance on notes.

Speaker interacts effectively with listeners.

20%

References Speaker fails to refer to journal articles.

Speaker refers to 1 or 2 journal articles.

Speaker refers to 3 or more journal articles.

20%

132


Assessment vs. Grading Concerns

• Grading rubrics may include criteria that are not related to the learning outcome being assessed. These criteria are used for grading, but are ignored for assessment.

• Grading requires more precision than assessment.
• Assessment rubrics should focus only on the outcome being assessed.
• If multiple faculty will use the rubric for grading or assessment, consider calibrating them. This is especially important when doing assessment.

Rubrics Can:

• Speed up grading
• Provide routine formative feedback to students
• Clarify expectations to students
• Reduce student grade complaints
• Improve the reliability and validity of assessments and grades
• Make grading and assessment more efficient and effective by focusing the faculty member on important dimensions
• Help faculty create better assignments that ensure that students display what you want them to demonstrate

Suggestions for Using Rubrics in Courses 1. Hand out the grading rubric with the assignment so students will know your expectations and

how they'll be graded. 2. Use a rubric for grading student work and return the rubric with the grading on it. 3. Develop a rubric with your students for an assignment or group project. Students can then

monitor themselves and their peers using agreed-upon criteria that they helped develop. Many faculty find that students will create higher standards for themselves than faculty would impose on them.

4. Have students apply your rubric to some sample products before they create their own. Faculty report that students are quite accurate when doing this, and this process should help them evaluate their own products as they are being developed. The ability to evaluate, edit, and improve draft documents is an important skill.

5. Have students exchange paper drafts and give peer feedback using the rubric, then give students a few days before the final drafts are turned in to you. You might also require that they turn in the draft and scored rubric with their final paper.

6. Have students self-assess their products using the grading rubric and hand in the self-assessment with the product; then faculty and students can compare self- and faculty-generated evaluations.

133


Rubric Category Labels
• Unacceptable, Marginal, Acceptable, Exemplary
• Below Expectations, Developing, Meets Expectations, Exceeds Expectations
• Novice, Apprentice, Proficient, Expert
• Emerging, Developing, Proficient, Insightful
• Below Basic, Basic, Proficient, Advanced (AAC&U Board of Directors, Our Students' Best Work, 2004)

Creating a Rubric

1. Adapt an already-existing rubric. 2. Analytic Method 3. Expert-Systems Method

Managing Group Readings
1. One reader/document.
2. Two independent readers/document, perhaps with a third reader to resolve discrepancies.
3. Paired readers.

Before inviting colleagues to a group reading,
1. Develop and pilot test the rubric.
2. Select exemplars of weak, medium, and strong student work.
3. Develop a system for recording scores.
4. Consider pre-programming a spreadsheet so data can be entered and analyzed during the reading and participants can discuss results immediately.

Inter-Rater Reliability
• Correlation Between Paired Readers
• Discrepancy Index
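Both reliability checks can be computed with a few lines of code or a pre-programmed spreadsheet. The sketch below (Python) uses invented ratings from two paired readers; the discrepancy index is defined here, as one common convention, as the share of papers on which the paired scores differ by more than one rubric level.

```python
from statistics import correlation  # Pearson's r; available in Python 3.10+

# Invented ratings: two paired readers scored the same ten papers on a 1-4 rubric.
reader_a = [3, 4, 2, 3, 1, 4, 3, 2, 4, 3]
reader_b = [3, 3, 2, 4, 1, 4, 2, 2, 4, 3]

r = correlation(reader_a, reader_b)

# Exact agreement and a simple discrepancy index: the share of papers on which
# the two readers differ by more than one rubric level.
exact_agreement = sum(a == b for a, b in zip(reader_a, reader_b)) / len(reader_a)
discrepancy_index = sum(abs(a - b) > 1 for a, b in zip(reader_a, reader_b)) / len(reader_a)

print(f"Correlation between paired readers: {r:.2f}")
print(f"Exact agreement: {exact_agreement:.0%}")
print(f"Discrepancy index (more than one level apart): {discrepancy_index:.0%}")
```

Results like these can be displayed during the orientation session described below so that readers who routinely score high or low are easy to spot.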

Scoring Rubric Group Orientation and Calibration

1. Describe the purpose for the review, stressing how it fits into program assessment plans. Explain that the purpose is to assess the program, not individual students or faculty, and describe ethical guidelines, including respect for confidentiality and privacy.

134


2. Describe the nature of the products that will be reviewed, briefly summarizing how they were obtained.

3. Describe the scoring rubric and its categories. Explain how it was developed. 4. Explain that readers should rate each dimension of an analytic rubric separately, and they

should apply the criteria without concern for how often each category is used. 5. Give each reviewer a copy of several student products that are exemplars of different levels

of performance. Ask each volunteer to independently apply the rubric to each of these products, and show them how to record their ratings.

6. Once everyone is done, collect everyone’s ratings and display them so everyone can see the degree of agreement. This is often done on a blackboard, with each person in turn announcing his/her ratings as they are entered on the board. Alternatively, the facilitator could ask raters to raise their hands when their rating category is announced, making the extent of agreement very clear to everyone and making it very easy to identify raters who routinely give unusually high or low ratings.

7. Guide the group in a discussion of their ratings. There will be differences, and this discussion is important to establish standards. Attempt to reach consensus on the most appropriate rating for each of the products being examined by inviting people who gave different ratings to explain their judgments. Usually consensus is possible, but sometimes a split decision is developed, e.g., the group may agree that a product is a “3-4” split because it has elements of both categories. You might allow the group to revise the rubric to clarify its use, but avoid allowing the group to drift away from the learning outcome being assessed.

8. Once the group is comfortable with the recording form and the rubric, distribute the products and begin the data collection.

9. If you accumulate data as they come in and can easily present a summary to the group at the end of the reading, you might end the meeting with a discussion of five questions: a. Are results sufficiently reliable? b. What do the results mean? Are we satisfied with the extent of student learning? c. Who needs to know the results? d. What are the implications of the results for curriculum, pedagogy, or student or faculty

support services? e. How might the assessment process, itself, be improved?

Assessment Standards: How Good Is Good Enough?
Examples:
1. We would be satisfied if at least 80% of the students are at level 3 or higher.
2. We would be satisfied if no more than 5% of students are at level 1 and at least 80% are at level 3.
3. We would be satisfied if at least 80% of the students are at level 3 and at least 10% are at level 4.
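A standard like these is easy to check once ratings are pooled. The sketch below tests an invented set of ratings against the third example (at least 80% at level 3 and at least 10% at level 4).

```python
# Invented pooled ratings on a 4-level rubric (1 = lowest, 4 = highest).
ratings = [3, 4, 2, 3, 3, 4, 3, 2, 3, 3, 4, 3, 1, 3, 3, 4, 3, 3, 2, 3]
n = len(ratings)

pct_level3_up = sum(r >= 3 for r in ratings) / n
pct_level1 = sum(r == 1 for r in ratings) / n
pct_level4 = sum(r == 4 for r in ratings) / n

# Standard from Example 3: at least 80% at level 3 or higher and at least 10% at level 4.
meets_standard = pct_level3_up >= 0.80 and pct_level4 >= 0.10

print(f"Level 3 or higher: {pct_level3_up:.0%}   Level 1: {pct_level1:.0%}   Level 4: {pct_level4:.0%}")
print("Standard met" if meets_standard else "Standard not met")
```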

135


Lecture / Discussion 6:

Analyzing Evidence of

Student Learning to Improve Our Practice

Amy Driscoll

136


1

Analyzing Evidence of Student Learning to Improve Our Practice

Amy Driscoll
WASC Educational Seminar
Level I
Sept. 24, 2009

2

Deficiencies in current academic currency:

• Inability to communicate outcomes of multiple learning experiences

• Lack of agreed upon achievement criteria

• Inconsistent faculty judgments

3

Definitions

Reliability: Agreement among faculty that outcomes have been achieved

Validity: Criteria describe what is intended as a common referent

Improvements: Usefulness in revising pedagogy and curriculum for increased student learning

137


4

Preparation for Analysis of Student Work

• Outcomes, criteria, standards, evidence
• Student permission
• Systematic collection of representative student work samples (evidence)
• Faculty motivation, interest and commitment
• Time and resources ($, expertise)
• Previous assessment experiences - value for teaching and learning

5

Assessment Protocols

Goal Outcomes

Evidence

Criteria

Standards:

a) Exemplary Achievement

b) Satisfactory Achievement

c) Unsatisfactory Achievement

6

Assessing Student Learning: Course, Program and Institutional Levels

1. Preparation: Determine purpose(s) and definition of assessment; examine mission and values

2. Design assessment: articulate goals; develop clear outcomes, evidence, criteria and standards

3. Alignment of curriculum and pedagogy with learning outcomes

4. Make outcomes, evidence, criteria and standards “public and visible” (syllabi, programs, brochures, etc.)

5. Collect evidence of student achievement

6. Review and analyze student evidence

7. Revise outcomes and criteria, improve pedagogy and curriculum for learner’s success

138


7

Release Form For Use of Student Work Samples

CSUMB is currently collecting samples of student work—work that demonstrates the outcomes and criteria of the University Learning Requirements. Faculty groups will analyze the work as part of a process of studying the ULR’s and related assessment processes. Some of the work in our class will be collected for use in the analysis project. Student names will not appear on the work samples at any time, and analysis will not occur until the course is complete, student work has been evaluated and grades have been assigned.

You are asked to sign the release form below to indicate your permission for use of your work in this class. If you choose not to permit use of your work, you are also asked to sign the form below.

Course Instructor: _______________________________________________________________________

Course Name and Number: ________________________________________________________________

RELEASE FORM

DATE: _______________________

I understand that CSUMB is collecting student work samples for analysis in the process of examining the ULR’s and related assessment processes. My work may be copied and saved for the analysis project.

I understand that my name will not appear on the work samples at any time, and that the analysis of my work will not occur until after the course is complete, my work has been evaluated and my grade has been assigned.

___ I give permission to use my work in the ULR analysis project.

___ I do not give permission to use my work in the ULR analysis project.

Print your name: __________________________ Signature: ________________________________

8

Analysis Process

• Holistic reading: Check on achievement of outcomes (Reliability)

• Verification of criteria (Validity)

• Implications for improvement (Usefulness)

9

The Review/Analysis Process Produced the Following:

• Verification and documentation of student achievement of most outcomes

• Revision of outcomes
• Revision of criteria
• Changes in courses and pedagogy
• Improvement of assessment
• Decision to have ongoing review/analysis

139


Lecture / Discussion 7:

The Administrators’ Role in Assessment of Student

Learning

Barbara Wright

140


Assessment Tips for the Top

Here are some points for top administrators to think about as they work to implement assessment of student learning and defuse resistance on their campus.

1. Be informed
a) Assessment of student learning has evolved greatly in the last 15-20 years. It is not the phenomenon it was when the movement to assess gathered momentum in the mid-80s. As a result, assumptions, methods, purposes, and even the definitions of basic vocabulary have changed. On a campus, this can lead to enormous confusion about what people are being asked to do and why. So first and foremost, know what the options are, and what it is you are trying to accomplish on your campus.
b) Plan to promote what experience has shown is best practice:
• Assess for quality improvement, not just quality assurance
• Focus on student learning, not surrogates (GPA, satisfaction surveys, etc.)
• Pursue outcomes that are educationally important, not just easily measurable
• Use authentic methods (i.e., those that reflect what students will need to do once they're out in the real world), not just traditional academic ones (e.g., multiple-choice tests)
• Gather qualitative as well as quantitative evidence of learning
• Promote good, inclusive process as well as products (i.e., findings, changes)
• Close the loop not just with feedback but with actions leading to improvement
• Support assessment with formal structures, planning, budgeting

2. Communicate
a) Many bodies – state departments of higher education, system offices, regional and professional accrediting associations – require assessment of student learning. However, the real reason to do it is because it's the right thing to do if we care about our students and what they learn. This is a reason that makes sense to faculty. It can't be repeated too often.
b) It's OK to leverage the pressure from external bodies, but don't overdo it; otherwise the message in 2.a is undermined.
c) When there are good things to celebrate, we should do it. When there are less than wonderful findings, we need to acknowledge them candidly, then emphasize that this is a great opportunity for improvement with maximum value added.
d) We need to report on the findings and results of assessment efforts regularly in publications like web pages and the student newspaper or alumni magazine.

e) Assessment expectations should be included in the catalogue and view book as well as mission statements at all levels. Job descriptions, faculty and staff handbooks, employment contracts, and the like should name assessment as a routine responsibility whenever appropriate.
f) Make sure communication is a two-way street.

3. Provide reassurance
a) The campus needs to know you do not plan to use assessment as a witch hunt or a thinly disguised ploy to cut lines and terminate programs. This may be the last thing on your mind, but it's the first thing on a lot of faculty minds. Experience shows that if faculty do harbor these fears, they will not face problems candidly but rather seek to conceal them. That undermines the integrity and usefulness of the whole process.
b) Faculty and programs need to know that if those external entities demanding assessment have any vile plans for the findings, you're on the side of your faculty and programs. You'll protect them if push comes to shove. Again, repetition is key.
c) Give reluctant programs a face-saving way to comply.
d) Assure everyone, especially faculty, that assessment is not an attack on academic freedom, not a required curriculum. There are many legitimate paths to the same outcome.

141


4. Provide support
a) Don't ask your campus to plan for assessment or carry it out without some training.
b) Provide parameters that reflect the conclusions you've come to about the assessment effort on your campus (see 1.a). Allow flexibility – it's OK for programs' plans to reflect their modes of inquiry and the intellectual traditions of their disciplines – but within those parameters. Don't make faculty waste time second guessing you or figuring assessment out entirely for themselves. That just breeds resentment.
c) Use training as an opportunity to get everyone on campus on the same page regarding assumptions, methods, purposes, and even the definitions of basic vocabulary.
d) Use training to clear away misconceptions, reduce fears, attack obstacles.

5. Be efficient, inclusive -- and respectful of what's already going on
a) Piggy-back onto existing processes whenever possible, e.g., connect departmental reporting on assessment to the annual reports programs are used to submitting. Link outcomes and assessment to routine course approval. But be careful when folding assessment into program review. Traditional program review focuses on inputs and processes in order to increase inputs -- usually faculty lines and budget -- or to protect against cuts in inputs. Redefine the review so that the emphasis shifts to learning outcomes and programs are rewarded for revealing a problem and then fixing it, not just for making themselves look good. (See 6. below.)
b) Draw on existing expertise and models. On every campus there are pockets of assessment activity and faculty expertise to draw on, even if the work has not been labeled "assessment."
c) Involve existing structures or offices such as the university senate, institutional research, or the center for teaching and learning.

6. Provide rewards
a) The idea is not to buy compliance by paying for every little bit of faculty work on assessment. In fact, that's a vicious circle you don't want to get caught in. However, judicious rewards for special contributions can help a lot.
b) The idea here is not to reward the programs that keep coming up with proof that "We're excellent"; the idea is to reward programs that say "Here's the problem, and here's how we solved it." This message must be very clear when the choice of program and the reward are announced publicly. In other words, you're rewarding "quality improvement," not "quality assurance." To put it another way, the reward is for maximum value added, not status quo, no matter how good that is.
c) Whenever possible and appropriate, reward programs rather than individuals. One of the biggest challenges of assessment, but also one of its biggest benefits, is that it requires faculty to act collectively, not as atomistic individuals responsible solely for their own courses or areas of specialization.
d) It should become institutional policy to expect contributions to assessment as part of reappointment and tenure dossiers. Across campus, assessment efforts must be acknowledged as a form of scholarship and be clearly seen to help faculty earn promotion, tenure, and merit increases.
e) The flip side of reward is punishment: doing assessment should not be a "punishment" in the form of additional workload without some sort of compensation, either for individuals or the program. In other words, a faculty member who contributes to assessment needs to be relieved of some other ongoing responsibility. (This also sends a message about the value of work in assessment. Add-ons are seldom taken seriously and they never last.)
f) The fact that a junior faculty member has worked on assessment should never be allowed to count against him/her in promotion or tenure proceedings.

7. Provide funding
a) People follow the money and faculty are especially good at this! To be taken seriously, assessment has to have money behind it. Money is both a practical aid to getting things done and a powerful symbolic

142


message. The pot doesn't have to be big – some highly effective assessment strategies are actually very cheap – but it needs to be very visible.
b) Plan on continued budgeting for assessment, not a one-shot infusion. Money is essential for start-up costs – training, consultants, instrument development – as well as for ongoing activities: retreats, conference presentations, reports and publications, etc.
c) Think about program-level performance funding. This phrase has horrible connotations at the state level, but it can work on campus. The key thing here is that "performance" refers to the carrying out of high-quality, high-impact assessment, not achievement of high scores or other traditional indicators of quality.

In one model, when departments submit annual budget requests, they have to back up academic requests with assessment findings. The budgets get reviewed by a faculty/staff budget review committee, which makes recommendations before passing the budget up to higher levels. The committee looks for assessment findings to back up requests and bases its recommendations on the quality of the program's evidence, analysis, and plans for improvement. Administration generally follows the recommendations, thus enhancing the status of both the committee and of assessment. This process is useful for several reasons: 1) it's highly motivating; 2) it underscores the seriousness of the assessment effort – and consequences of failure to engage in it; 3) it exerts powerful negative pressure: the requests of departments not doing assessment really do go unfunded and their programs gradually fall behind; 4) it provides transparency; and 5) it educates the campus, from committee members out, in widening circles, about how to do assessment well.

8. Aim for broad involvement
To change campus culture, you need broad involvement or the change will remain superficial and fail to take hold. That means
a) the whole range of campus experience eventually needs to be assessed: not just the major but general education, first-year experience, student government, internships, community service, dorm life, extracurricular opportunities, etc.
b) not just faculty but professional academic staff, students, external advisors, alums, employers, etc. need to participate as appropriate.
c) the whole chain of command needs to be on board and on the same page about the institution's philosophy and purpose in doing assessment, from president and AAVP through deans and department/program chairs.

9. Institutionalize
Formal structures will legitimize assessment. Eventually the assessment effort needs to move out of the realm of a special grant-funded project or one program's experiment and become standard practice supported by an office, advisory committee, staff, budget, reporting relationships, and oversight. This can happen gradually and it needn't be elaborate, but it needs to happen.

10. Codify
a) Eventually, appropriate documents need to refer explicitly to assessment. (See the list in 2.d, e.)
b) In negotiations with AAUP or other unions, it may be best to emphasize that assessment is not the end; it is merely the means to a worthy end: better education. Just as computers are a powerful tool, one that no campus wants to be without, so too assessment is our key tool for improving student learning and enhancing institutional quality.

Barbara D. Wright Revised September 2009

[email protected]

143


144


The Administrators’ Role in Assessment of Student Learning: Setting the Context for Success

Cyd Jenefsky

Setting the Context
• Systems view – engaging whole institution in enhancing student learning
• Strategic – intentional, aspirational, aligning resources/processes/goals
• Tending to culture as fertile soil for change – setting expectations, priorities
• Intentional change management

Integrated System Promoting Student Success

"Consisting of more than isolated cells of activity"; an "institutional way of behaving" (Maki, 2004)

"integrating processes of inquiry into the culture, governance, learners, and organizational relationships that exist in an institution" (Keeling et al., 2008)

"We are all connected by the mutual intellectual endeavor of organizing ourselves to see that students succeed." Bresciani et al. (2009)

Stages of Institutional Development

• Teaching-Centered – taking teaching seriously
• Student/Learner-Centered – taking students seriously
• Learning-Centered – taking learning seriously throughout an organization

Promoting continuous organizational learning (including via assessment) in order to enhance student learning.

Shared Responsibility for Assessment

Who 'owns' assessment on campus?
• Faculty: "can't do it without administrative leadership"
• Administrators: "it's faculty who have to do it"
What does it look like as a "shared responsibility"? What part do administrators own?

Administrators’ Areas of Responsibility

• Structures, processes, communication channels at program, institutional levels
• Support and resources to initiate, build, sustain core commitment
• Valuing assessment of student learning as "core institutional process" (Maki, 2004)

145


Structures, Processes, Communication Channels
Examples:
• develop practices, policies, processes for using SLO results (& institutional data) for evidence-based decision-making at program, college, institutional levels
• integrate assessment of SL with strategic planning & budgeting (inc. calendar cycles)
• create events, processes for formal & informal discussion of results across disciplines and divisions
• build linkages between existing assessment activities
• integrate assessment of student learning into program and performance review processes
• funnel results to existing committees for decision-making and planning
• establish expectations for standards of performance

Support & Resources to Initiate, Build, Sustain Commitment
Examples:
• cultivate 'champions' (fac/staff/admin)
• build scaffolded professional development for fac/staff/admin (tailored to adult learners)
• allot resources for doing assessment
• assist with streamlining assessment processes
• provide resources for 'closing the loop'
• share assessment results
• guide fac/staff/leadership to develop standards/benchmarks
• offer grant competition for innovation in assessment, improvement in student learning, scholarship on student learning, etc.
• support scholarship of teaching, learning and assessment

Valuing Assessment of Student Learning as Core Institutional Process
Examples:
• collaboratively develop institutional "principles of commitment" (Maki)
• communicate importance of assessment of student learning, organizational learning and evidence-based decision-making at all levels (& within university materials, syllabi, catalogue, web, etc.)
• articulate coherent vision, clear expectations
• frame inquiry into student and organizational learning as collective enterprise (Shulman's "community property")
• practice asking questions about student learning
• reward initiative, innovation, improvement in assessment of student learning and use of results, including in RPT (review, promotion, tenure) processes
• celebrate achievements!

Start with what matters:

What do you most want to learn about student learning? About your organization?

146


References:
Allen, M. (2007). Campus support for assessment and educational effectiveness. Unpublished manuscript.
Bresciani, M.J., Gardner, M.M., & Hickmott, J. (forthcoming, 2009). Demonstrating student success: A practical guide to outcomes-based assessment of student learning and development in student affairs. Sterling, VA: Stylus.
Cooperrider, D. & Whitney, D. (2005). Appreciative inquiry: A positive revolution in change. San Francisco, CA: Berrett-Koehler.
Huba, M.E. & Freed, J.E. (2000). Learner-centered assessment on college campuses: Shifting the focus from teaching to learning. Boston: Allyn & Bacon.
Keeling, R.P., Wall, A.F., Underhile, R., & Dungy, G.J. (2008). Assessment reconsidered: Institutional effectiveness for student success. ICSSIA.
Maki, P.L. (2004). Assessing for learning: Building a sustainable commitment across the institution. Sterling, VA: Stylus.
Shulman, L. (2004). Teaching as community property. San Francisco, CA: Jossey-Bass.

147


Lecture / Discussion 8:

Assessment for

Community College Career and Technical Educational Programs

Fred Trapp

148


Assessment for Career & Technical Education Programs

Presented by Dr. Fred Trapp
Cambridge West Partnership, LLC
Administrative Dean, Institutional Research/Academic Services
Long Beach City College (retired)
[email protected]

2

Overview

• Purposes of career and technical assessment
• Types of programs
• Assessment plans
• The assessment plan
• Categories of assessments
• Using the grading process
• Dale McIver's problem
• ACCJC assessment rubric

3

Purposes of Assessment

• Improve learning & instruction
• Certify individual mastery
• Evaluate program success

149


4

Improve Learning & Instruction

• Provide information on knowledge & skills learned
• Administered often
• Graded quickly
• Technical quality of assessments less critical

5

Certify Individual Mastery(not our focus in this retreat)

• Focus on specific skills & knowledge
• Multiple & single measures
• Used for decisions about selection, promotion, certification
• Quality of measures (reliability, validity, fairness) more important

6

Continuum of Knowledge & Skills-What is your program seeking to accomplish?

Example skills, from most occupation-specific to most general:
• Use a computer program to assign patients to a diagnosis-related grouping.
• Use computer programs to process client information.
• Use health care terminology.
• Evaluate medical records for completeness & accuracy.
• Locate information in medical records.
• Be aware of the history of health care.
• Read, write, perform mathematical operations, listen & speak.

Corresponding levels on the continuum, from most specific to most general: Specific Occupational Skills (Health Information Technology), Occupational Cluster Skills (Health Information Services), Industry Core Skills & Knowledge (Health Services), General Workforce Preparation (All Workers).

150


7

Your Program and Assessment Plan

Statement of purpose

List of required courses

Restricted electives list

Curriculum matrix and analysis

Plan elements (four parts)

8

Your Program and Assessment Plan

Plan elements

1. Intended learning outcomes

Long list vs. short list

2. Means of assessment and criteria for success

3. Report of results

4. Use of results

9

Your Program and Assessment Plan

Example of an assessment plan from LBCC

Electricity Program

151


10

Common Outcomes for CTE
• Technical skills (discipline-specific/technology/technical competency/currency)
• Application of discipline skills (theoretical knowledge/subject matter mastery/observation skills)
• Critical thinking & problem solving (assessment skills)
• Communication
• Professional behavior (professional practices/ethics/teamwork)

See Handout- Determining Purposes

11

Strategies for Assessment Work
• Selected vs. constructed response
• Standard exams, licensing, etc.
• Wage data follow up
• Opinion data
• Use student work in courses to address course and program learning outcomes
• Consider student's development point in the program (just starting vs. about to complete)
• Capstone project or course
• Program portfolio
• Specific assignment from key courses
• Multiple measures are always recommended

12

Broad Categories of Assessments

Category and response type:
• Written assessments – Selected response: multiple choice, true/false, matching
• Written assessments – Constructed response: show work, concept map, journal response; essay, problem based, scenario
• Performance tasks – Constructed response
• Projects (research paper, project, oral presentation) – Constructed response
• Portfolios – Constructed response

See Handout- Selecting Assessment Tasks

152


13

Features of Selected & Constructed Response Assessments

Features compared for selected-response vs. constructed-response assessments (each rated Rarely, Sometimes, or Usually on the original slide):
• Easy to develop
• Easy to administer
• Easy to score
• Authentic
• Efficient (requires limited time)
• Credible to stakeholders
• Embodies desired learning activities
• Sound basis for determining quality of scores
• Effective for factual knowledge
• Effective for complex cognitive skills

14

Standardized Exams: Job Ready Tests

National standardized test for two-year technical programs

Written objective test

Performance test - institution administered

15

Fields in Which Job Ready Tests Are Available

Examples:

Commercial Foods, Refrigeration, Diesel Engine Mechanics, Horticulture, Child Care Services, Forestry Products, Accounting/Bookkeeping, Electronics

This is not an endorsement of NOCTI

National Occupational Competency Testing Institute: 1-800-334-6283, www.nocti.org

153


16

Engineering Technician and Technologist Certification Programs

National Institute for Certification in Engineering Technologies (NICET)

• Construction Materials Testing
• Fire Protection
• Electrical Communications System
• Low Voltage
• Geo-synthetic Materials
• Installation Inspection
• Land Management and Water Control
• Geo-technical
• Underground Utilities Construction
• Transportation
• Building Construction/Water and Waste Water Plants

Sponsored by National Society of Professional Engineers. Telephone: 888-476-4238

http://www.nicet.org

This is not an endorsement of NICET

17

Standardized Exams: Work Keys

Foundational skills: applied math, technology, listening, reading for information, writing, locating information, observation

Interpersonal skills: teamwork

www.ACT.org, 1-800/WORKKEY (967-5539)

This is not an endorsement of ACT Work Keys

18

Licensure Examinations Commonly Found at 2-Year Colleges

Nursing (RN, LVN, CNA)

Aviation Maintenance

Diagnostic Medical Imaging

Dietetics

Computer Applications

Computer Systems Engineering and Network Management

154


19

Employment/Wage Follow Up as Assessment

Employment Development Department (EDD) file match

By degree/certificate/area of concentration (12 units or more)

• Wages in final year at community college
• Wages one year out of college
• Wages three years out of college

• Excludes those who transfer
• Depends upon unemployment insurance, so does not include self-employed or out of state

20

Opinion as Assessment

Student/Graduate/Alumni
• Affirmation of accomplishments
• Indirect measure
• Primarily support evidence

21

Opinion as Assessment

Student/Graduate/Alumni

Completers: Associate Degree or Certificate of Achievement (18 units plus)

Early Leavers With Marketable Skills (ELMS): 12 units in the discipline but not program award

155


22

Opinion as Assessment

Student/Graduate/Alumni
• Are you employed in the field for which you were educated at the college?
• How long? Full-time or part time?
• Income range?
• To what extent are you able to perform _____ as a result of your education? (Provide option for N/A)
• What should we emphasize more/less?
• Interest in continuing education opportunities

23

Opinion as Assessment

Employer/Supervisor Survey (primary, direct evidence)

Legal restrictions

Internship/Co-Op Supervisor: the more specific the better

Use the Chamber of Commerce

Limit the number of questions asked

How well do graduates of our program perform?

Compared to your expectations

Compared to other educational programs

24

Opinion as Assessment

Mechanics of Survey

Identify students from college administrative data

Past three to five years

Test the name & address (post card or software)

Mail the survey with cover letter


Include self-addressed, stamped envelope

Incentive drawing

Follow up reminder card or phone calls

156


25

Intern Evaluations

Rad. Tech - Positioning Skills Ratings
Correct orientation of patient                              1 2 3 4
Align part to center of film                                1 2 3 4
Center CR to film and/or angle CR to part and film          1 2 3 4
Remove unnecessary parts from field                         1 2 3 4
Correct placement of film markers, away from body parts     1 2 3 4

26

Process Checklist

Columns: Procedure | Observed | Comments

Procedures listed:
• Selected approach
• Correct equipment used
• Measurement accurate
• Sought peer help if needed
• Recorded observations
• Cleaned up after work

27

Problem-Based Exam, Scenario, Case Study

Short written answers to longer narratives

Score in terms of content and conventions

creativity

Apply knowledge and skills to new settings

Level of detail and complexity of problem varies

157


28

Program Intended Educational Outcomes:

2. Graduates of the Automotive Technology Program will be technically proficient.

Means of Program Assessment:

2a. At the close of their final term, graduates will be able to identify and correct within a given period of time all of the mechanical problems in five test cars that have been "prepared" for the students by Automotive Technology Program faculty.

2b. The Automotive Technology Program graduates will pass the National Automotive Test.

Automotive Technology Simulation

29

See handout for elaboration of project

Automotive Technology Simulation

30

Means of Assessment- Grades

Evaluation of individual students = assessment

• Focus is individual, not groups of students
• A summative, not formative, act
• Objectivity of single evaluator vs. group
• Generally not accepted as direct evidence

Uses of the grading process:
• Agreed-upon course exam or part of exam
• Row and column model for assignments

158


31

Embedded Assessment StrategyRow and Column Concept

Criteria Tim Jane Mary Joe Dave AverageSpelling 3 4 1 2 3 2.6Grammar 2 5 3 2 5 3.4Punctuation 4 5 2 3 4 3.6Structure 3 2 3 5 3 3.8Total 13 17 10 12 15

Student Grade C A D C B

Total down the column for individual grading. Analyze across the row for assessment of intended outcomes from the group.

Jim Nichols
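A minimal sketch of the row-and-column idea: totals run down each student's column for grading, and averages run across each criterion's row for assessment. The scores below are taken from the example above purely for illustration.

```python
# Criterion-by-student scores from the example above.
scores = {
    "Spelling":    {"Tim": 3, "Jane": 4, "Mary": 1, "Joe": 2, "Dave": 3},
    "Grammar":     {"Tim": 2, "Jane": 5, "Mary": 3, "Joe": 2, "Dave": 5},
    "Punctuation": {"Tim": 4, "Jane": 5, "Mary": 2, "Joe": 3, "Dave": 4},
    "Structure":   {"Tim": 3, "Jane": 2, "Mary": 3, "Joe": 5, "Dave": 3},
}
students = ["Tim", "Jane", "Mary", "Joe", "Dave"]

# Down each column: a total per student, which the instructor converts to a grade.
totals = {s: sum(scores[c][s] for c in scores) for s in students}

# Across each row: an average per criterion, which the program uses for assessment.
criterion_means = {c: sum(row.values()) / len(row) for c, row in scores.items()}

print("Student totals (grading):", totals)
print("Criterion averages (assessment):", criterion_means)
```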

32

The Grading ProcessesImplications for Assessment

Using the Grading Process & Existing Assignments (stealth assessment)

Build a grading rubric for an assignment

Determine the criteria on which you evaluate student work (primary trait analysis)

Describe shades of performance quality

As a faculty group sample student work and apply the rubric

33

Analytic Rubrics - Numerical Scales

Number of points on scale:
• Larger scale - harder to differentiate among points
• Smaller scale - less diagnostic information
• 5 to 6 points*

Use prior experience:
• Off task = lowest, excellent = highest
• Allocate other three in between

Multiple dimensions:
• Use same scale if possible (easier grading)
• Use different scales to reflect relative value

*When first doing a rubric it is better to start with a 3-point scale

159


34

Qualitative Scale- Descriptions

Descriptions:
• No, minimal, partial, complete evidence
• Task not attempted, partial completion, completed, goes beyond
• Off task; attempts to address task; minimal attention to task; addresses task but no elaboration; fully elaborated & attentive to task and audience

35

Qualitative Scale- Evaluation

• Evaluate or judge by standards
• Criteria embed ideas of excellence, competence or acceptable outcomes
• Discipline-based standards (criterion referenced)
• Comparing students' relative status (norm referenced), especially for developmental considerations

36

Portfolio Assessment Criteria and Scoring

• What are the criteria for selecting samples going into the portfolio?
• How, or will, progress be evaluated?
• How will different tasks or products (videos, art work, essays, journal entries, etc.) be compared or weighted?
• What is the role of student reflection?

160


37

Program Portfolio
• For programs without a capstone
• Select assignments at key points in the required courses of the program (beginning, middle, end)
• Require all students to provide the completed assignment
• Develop an assessment rubric
• Sample from the student work at various stages in the program

38

Reliable Scoring- Train the Raters

Scoring guide:
• Fully explicit scoring criteria
• Examples illustrating each scoring point
• Abbreviated version of criteria for reference during scoring
• Sample form for recording scores

Orientation and practice scoring:
• Record scores and discuss

39

Quality of Information of Selected & Constructed Response Measures

Reliability
• Selected response: automatic scoring is error free; many responses per topic increases consistency of score; strong theoretical basis for measuring reliability.
• Constructed response: rating process can increase errors; fewer responses per topic reduces consistency of score; greater between-task variability in student performance.

Validity
• Selected response: large inferences from item to occupational behavior.
• Constructed response: greater match between assessment tasks & real world demands; variation in administration conditions can complicate interpretation of results.

Fairness
• Selected response: quantitative techniques help identify potential unfairness.
• Constructed response: may have greater fairness because tasks are more authentic.

161


40

Dale McIver's Problem

Case study

41

Issues in Assessment Planning

• Single vs. multiple measures
• High vs. low stakes
• Stand-alone vs. embedded tasks
• Standardization vs. adaptability
• Single vs. multiple purposes
• Voluntary vs. mandatory participation

42

ACCJC Assessment Rubric: Levels of Implementation

• Awareness
• Development (now)
• Proficiency (by 2012)
• Sustainable continuous quality improvement

162


Handout Packet Assessment for Career &

Technical Education Programs

Fred Trapp, Ph.D. Cambridge West Partnership, LLC

Administrative Dean, Institutional Research/Academic Services (retired)

Long Beach City College

September 2009

163


Plan for Electricity Program-Level Assessment

Scott Fraser and John Hauck, LBCC

The Program Curriculum

The Electricity program awards both an Associate Degree and a Certificate of Achievement. Students must complete a total of 45 units, of which 40 are core subject offerings. Certificates of Completion are awarded at the conclusion of one or several courses in the program to denote milestones of accomplishment. The requirements for all of these awards are listed in the college catalog.

Core Required Day Courses / Core Required Evening Courses
1st Course: ELECT 200A / ELECT 204, 210A, 240, 202
2nd Course: ELECT 200B / ELECT 209, 210B, 242
3rd Course: ELECT 200C / ELECT 212, 210C, 242
4th Course: ELECT 200D / ELECT 214, 210D, 245, 250
ELECT 253 / ELECT 253
ELECT 225 / ELECT 225
ELECT 435A / ELECT 435A

Day Program Courses
ELECT 200A First Semester Industrial Electricity
ELECT 200B Second Semester Industrial Electricity
ELECT 200C Third Semester Industrial Electricity
ELECT 200D Fourth Semester Industrial Electricity
ELECT 253 OSHA Standards for Construction Safety
ELECT 225 Algebra & Trigonometry for Technicians
ELECT 435A Electrical Motor Control

Evening Program Courses
ELECT 204 First Semester Fundamentals of DC Electricity
ELECT 210A Laboratory Practices
ELECT 209 Second Sem Fund of Motors/Generators
ELECT 210B Laboratory Practices
ELECT 212 Third Semester Fund of AC Electricity
ELECT 210C Laboratory Practices
ELECT 214 Fourth Semester AC Principles & Pract
ELECT 210D Laboratory Practices
ELECT 240 Electrical Code-Residential
ELECT 202 Electrical Mathematics
ELECT 242 Electrical Code-Grounding
ELECT 245 Electrical Code-Commercial
ELECT 250 Electrical Code-Advanced

164


Plan for Electricity Program-Level Assessment

Scott Fraser and John Hauck, LBCC

Recommended Courses

A listing of 19 additional courses is provided for students to select offerings that are recommended as instruction that would complement and extend the required curriculum. In all cases these courses are required elements of other related programs. These courses include:
Networking Cabling Installation
Cisco Networking I, Introduction
Technical Applications of Minicomputers
Electrical Motors and Transformers
Solid State Fundamentals for Electricians 2
Variable Speed Drives
Industrial Drive Systems
Robotics Technology
Electrical Cost Estimating 2
Electrical Pipe Bending
Blueprint Reading for Electricians
Traffic Signals Systems 1
Traffic Systems Communication
Traffic Signal Controllers & Digital Systems
Electrical Motor Control
AutoCAD I, Fundamentals
Basic AutoCAD for Architecture

165


Plan for Electricity Program-Level Assessment

Scott Fraser and John Hauck, LBCC

Assessment Plan Notes

Electricity Core Courses Required

Learning outcomes and codes across the core courses (1st Crs, 2nd Crs, 3rd Crs, 4th Crs, ELECT 253, ELECT 225, ELECT 435A):

1. Be technically competent: Perform basic wiring tasks. Codes: 3 3 3 3 3
2. Be technically competent: Complete circuit box panel schedules (demand factors, load cycles). Codes: 2 2 3 3
3. Be technically competent: Troubleshoot successfully (identify symptoms, diagnose problem, fix problem, test fix to verify problem solution). Codes: 3 3 3 3 3
4. Be technically competent: Install electrical wiring or equipment to national electrical code standards. Codes: 3
5. Recognize safe work practices. Codes: 1 1 1 1 3 1
6. Demonstrate safety practice during lab work. Codes: 1 1 1 1 1 1

A possible rating scheme is to code each cell using this approach:
0 - Course does not include instruction and assessment of this outcome.
1 - Course includes instruction or practice of the outcome, and performance/knowledge of this outcome is assessed.
2 - Course includes instruction or practice of the outcome, performance/knowledge is assessed, and 20% or more of the course focuses on it.
3 - Course includes instruction or practice of the outcome, performance/knowledge is assessed, and 1/3 or more of the course focuses on it.

The purpose of completing a matrix like the one above is to get an overall feel for the extent to which each program learning outcome is being addressed throughout the required courses and to what extent it is addressed where.
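If the matrix is kept in a spreadsheet or a small script, the codes can be tallied to show at a glance how thoroughly each outcome is covered. The sketch below is only illustrative: the outcome names echo the example above, but the placement of codes in particular courses is an invented assignment, since the matrix itself is what records the real placement.

# Illustrative sketch: tallying a curriculum-map matrix of 0-3 codes.
# The outcome names echo the example matrix above; the course-by-course
# placement of codes here is invented for illustration only.

courses = ["1st Crs", "2nd Crs", "3rd Crs", "4th Crs",
           "ELECT 253", "ELECT 225", "ELECT 435A"]

# outcome -> {course: code}; any course not listed is treated as 0
matrix = {
    "Perform basic wiring tasks": {
        "1st Crs": 3, "2nd Crs": 3, "3rd Crs": 3, "4th Crs": 3, "ELECT 435A": 3},
    "Complete circuit box panel schedules": {
        "1st Crs": 2, "2nd Crs": 2, "3rd Crs": 3, "4th Crs": 3},
    "Recognize safe work practices": {
        "1st Crs": 1, "2nd Crs": 1, "3rd Crs": 1, "4th Crs": 1,
        "ELECT 253": 3, "ELECT 435A": 1},
}

for outcome, codes in matrix.items():
    row = [codes.get(course, 0) for course in courses]
    addressed = sum(1 for code in row if code > 0)      # taught and assessed anywhere
    emphasized = sum(1 for code in row if code >= 2)    # 20% or more of a course
    print(f"{outcome}: addressed in {addressed} of {len(courses)} courses, "
          f"emphasized (code 2 or 3) in {emphasized}")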

166


Plan for Electricity Program-Level Assessment

Scott Fraser and John Hauck, LBCC

Expanded Statement of Purpose: Prepare students for entry-level positions in the field of electrical technology.

Intended Educational Outcome 1: Be technically competent
• Perform basic wiring tasks
• Accomplish bends in conduit
• Complete circuit box panel schedules (demand factors, load cycles, wire sizes, circuit breakers)
• Troubleshoot successfully (identify symptoms, diagnose problem, fix problem, test fix to verify problem solution)
• Install electrical wiring or equipment to national electrical code standards

Means and Criteria for Assessment:
1A. Worksheets on parts of the residential site wiring diagram and other lab projects are scored with an instructor rubric for accuracy. 85% of the students will achieve an overall accuracy score of 75% or better (see the accuracy column of the related example ELECT 200A grading rubric).
1B. Within either ELECT 200D or ELECT 214, students will successfully create an industrial building wiring plan, which is evaluated by an instructor grading rubric for accuracy, completeness, and neatness:
• Wiring layout document
• Lighting document
• Title 5 documents
• Panel schedules
• One-line diagram
95% of the students will achieve an overall score of 75% or better.
1C. Within ELECT 435A students will successfully:
• Recognize electrical symbols
• Prepare a wiring design
• Demonstrate component knowledge
• Troubleshoot circuit errors
The project is evaluated using an instructor grading rubric for accuracy, completeness, and neatness (see the related ELECT 200A grading rubric and the lab task process notes below for examples of these criteria).

Results of Assessment:
Use of Results:
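For criteria of the form "85% of students will score 75% or better," a few lines of code can check a list of rubric scores against the stated target. This is a minimal sketch; the scores and variable names are invented and are not part of the plan above.

# Minimal sketch: checking a "share of students at or above a cut score" criterion.
# Scores are invented; in practice they would come from the instructor's rubric.

scores = [82, 91, 74, 88, 67, 95, 79, 85, 90, 72]  # per-student accuracy, in percent

cut_score = 75        # each student should score at least this
target_rate = 0.85    # share of students expected to reach the cut score

rate = sum(1 for s in scores if s >= cut_score) / len(scores)
verdict = "meets" if rate >= target_rate else "does not meet"
print(f"{rate:.0%} of students scored {cut_score}% or better; "
      f"this {verdict} the {target_rate:.0%} criterion.")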

167


Plan for Electricity Program-Level Assessment

Scott Fraser and John Hauck, LBCC

Intended Educational Outcomes:
2. Recognizes safe work practices.
3. Demonstrates work safety in the laboratory.
4. Employers will be satisfied with the competence of program graduates.

Means and Criteria for Assessment:
2. Within ELECT 253 students will complete an exam in each of eight modules with the OSHA minimum test score of 80% correct overall.
3. Instructor observations during labs will result in no cases of unsafe work practices.
4. Of those employers who respond to a mailed survey, ___ % will report satisfaction with the training provided by the program.

Results of Assessment: Use of Results:

168


Plan for Electricity Program-Level Assessment

Scott Fraser and John Hauck, LBCC

ELECT 435A Laboratory Task/Process and Assessment Procedures Description

1. Students simulate the task on the computer.
2. Students wire the lab task, which is evaluated by the instructor on:
   a. neatness
   b. proper connections
   c. proper routing
3. Students test the operation by:
   a. completing a check sheet of ohm meter readings
   b. completing the expected readings for the balance of the circuit diagram
   c. recording the actual results
   d. synthesizing by comparing the actual recorded results to the expected results
4. Students energize the circuit and verify proper operation after the instructor has signed off on the work in item 3 above.
5. The instructor injects faults into the circuit, typical of those found in the industry, without the student present.
6. The students troubleshoot and correct the faults, doing the following to record their thought processes. All of the faults must be located or the circuit will not operate.
   a. List each fault located.
   b. Describe the cause of the fault.
   c. What was malfunctioning that caused the fault?
   d. What did the student do to correct the fault?
7. The student prepares their box for final inspection.

169


Questions to Help Determine Learning Outcomes Write out answers to these questions in sufficient detail that others can agree on what the outcomes mean and whether or not students have obtained them.

1. What important cognitive skills do I want my students to develop?
2. What social and affective skills do I want my students to develop?
3. What metacognitive skills do I want my students to develop?
4. What types of problems do I want them to be able to solve?
5. What concepts and principles do I want my students to be able to apply?

Draw upon the work of national or state professional groups and consult with colleagues.

Generic Skills List (cross disciplines)

1. Communicating clearly
2. Questioning
3. Formulating problems
4. Thinking and reasoning
5. Solving complex, multi-step problems
6. Synthesizing knowledge from a variety of sources
7. Using cooperation and collaboration

Big Ideas, Skills, Concepts, Processes & Techniques (characterize a specific discipline)

1. Developing a hypothesis (hunch)
2. Designing experiments (tests)
3. Drawing inferences from data
4. Using observation and analyzing similarities and differences in phenomena
5. Working with laboratory (test) equipment or tools
6. Re-testing to ensure a repair was correct
7. Completing a write-up of the process, findings, and repair if required

Adapted from Herman, Joan, et al., A Practical Guide to Alternative Assessment.

170


Selecting Assessment Tasks

Fred Trapp, January 2008 [email protected]

Goal: Matching the assessment task to your intended learning outcome (skills, knowledge, attitudes).

1. Does the task match the specific instructional intention?
2. Does the task adequately represent the content and skills you expect students to attain?
3. Does the task enable students to demonstrate their progress and capabilities?
4. Can the task be structured to provide measures of several outcomes?

Generating good ideas for assessment tasks.

1. Brainstorm with colleagues.
2. Draw upon the work of national or state professional groups.
3. Ask local experts in the field.
4. Consider ideas from professional journals, conferences, training sessions, etc.

Describing the assessment task by specifying the following.

1. What student learning outcomes are intended for the assessment?
2. What are the content or topic areas?
3. What is the nature and format of questions to be posed to students?
4. Is it group or individual work? If group work, what roles are to be filled?
5. What options/choices are allowed for the finished product? Who makes the choices?
6. What materials/equipment/resources will be available to the students?
7. What directions will be given to the students?
8. What constraints (time allowed, order of tasks, answering student questions, how much help will be provided) are going to be imposed?
9. What scoring scheme and procedures will be used?

Criteria to critique tasks.

1. Do the tasks match your important instructional goals and student learning outcomes?
2. Do they pose enduring problem types, typical ones students are likely to face repeatedly?
3. Are the tasks fair and free of bias?
4. Are the tasks credible to important stakeholders?
5. Are the tasks meaningful and engaging to students so they will be motivated?
6. Are the tasks instructionally feasible? Do you have the resources and expertise to teach them?
7. Are the tasks feasible for implementation in the classroom or lab (space, equipment, time, costs)?

Adapted from Herman, Joan, et al., A Practical Guide to Alternative Assessment.

171


CR9T070R: Long Beach City College, School Norms vs. National Passing Norms (FS 8080-08-147), Western Pacific WP05, for quarter Oct-Nov-Dec 2007.

Type test: AMP
Current quarter: No. applicants 2, No. applicants passed 2, Pct. applicants passed 100, Avg. grade 80
Two-year accumulative**: Applicants 12, School norm 100, National applicants 4452

Computer Test, School Norm vs. National Norm (chart): national norm 94.

Powerplant Test, School Norm vs. National Norm - 1 Yr (chart data)
Subtest: School Norm 1 Yr / National Norm 1 Yr
A Recip Eng: 1.17 / 1.35
B Turb Eng: 0.67 / 1.61
C Eng Insp: 0.33 / 0.41
H Eng Inst: 0.50 / 0.81
I Fire Prot: 0.17 / 0.44
J Eng Elect: 0.50 / 1.20
K Lub Sys: 0.67 / 1.55
L Ign Sys: 1.00 / 2.07
M Fuel Meter: 1.67 / 1.81
N Fuel Sys: 0.50 / 0.57
O Induct Sys: 0.17 / 0.76
P Cool Sys: 0.00 / 0.37
Q Exhst Sys: 0.33 / 0.64
R Prop: 1.67 / 2.59
T APU: 0.17 / 0.17

** CURRENT DB: EFF DATE 10/1998.

172


CAPSTONES Rubric for Assessing the Use of Capstone Experiences for Assessing Program Learning Outcomes

Criterion Initial Emerging Developed Highly Developed

Relevant Outcomes and Lines of Evidence Identified

It is not clear which program outcomes will be assessed in the capstone course.

The relevant outcomes are identified, e.g., ability to integrate knowledge to solve complex problems; however, concrete plans for collecting evidence for each outcome have not been developed.

Relevant outcomes are identified. Concrete plans for collecting evidence for each outcome are agreed upon and used routinely by faculty who staff the capstone course.

Relevant evidence is collected; faculty have agreed on explicit criteria statements, e.g., rubrics, and have identified examples of student performance at varying levels of mastery for each relevant outcome.

Valid Results

It is not clear that potentially valid evidence for each relevant outcome is collected and/or individual faculty use idiosyncratic criteria to assess student work or performances.

Faculty have reached general agreement on the types of evidence to be collected for each outcome; they have discussed relevant criteria for assessing each outcome but these are not yet fully defined.

Faculty have agreed on concrete plans for collecting relevant evidence for each outcome. Explicit criteria, e.g., rubrics, have been developed to assess the level of student attainment of each outcome.

Assessment criteria, such as rubrics, have been pilot-tested and refined over time; they usually are shared with students. Feedback from external reviewers has led to refinements in the assessment process, and the department uses external benchmarking data.

Reliable Results

Those who review student work are not calibrated to apply assessment criteria in the same way; there are no checks for inter-rater reliability.

Reviewers are calibrated to apply assessment criteria in the same way or faculty routinely check for inter-rater reliability.

Reviewers are calibrated to apply assessment criteria in the same way, and faculty routinely check for inter-rater reliability.

Reviewers are calibrated, and faculty routinely find assessment data have high inter-rater reliability.

Results Are Used

Results for each outcome may or may not be collected. They are not discussed among faculty.

Results for each outcome are collected and may be discussed by the faculty, but results have not been used to improve the program.

Results for each outcome are collected, discussed by faculty, analyzed, and used to improve the program.

Faculty routinely discuss results, plan needed changes, secure necessary resources, and implement changes. They may collaborate with others, such as librarians or Student Affairs professionals, to improve results. Follow-up studies confirm that changes have improved learning.

The Student Experience

Students know little or nothing about the purpose of the capstone or outcomes to be assessed. It is just another course or requirement.

Students have some knowledge of the purpose and outcomes of the capstone. Communication is occasional, informal, left to individual faculty or advisors.

Students have a good grasp of purpose and outcomes of the capstone and embrace it as a learning opportunity. Information is readily available in advising guides, etc.

Students are well-acquainted with purpose and outcomes of the capstone and embrace it. They may participate in refining the experience, outcomes, and rubrics. Information is readily available.

173


How Visiting Team Members Can Use the Capstone Rubric

Conclusions should be based on discussion with relevant department members (e.g., chair, assessment coordinator, faculty). A variety of capstone experiences can be used to collect assessment data, such as:
• courses, such as senior seminars, in which advanced students are required to consider the discipline broadly and integrate what they have learned in the curriculum
• specialized, advanced courses
• advanced-level projects conducted under the guidance of a faculty member or committee, such as research projects, theses, or dissertations
• advanced-level internships or practica, e.g., at the end of an MBA program

Assessment data for a variety of outcomes can be collected in such courses, particularly outcomes related to integrating and applying the discipline, information literacy, critical thinking, and research and communication skills. The rubric has five major dimensions:

1. Relevant Outcomes and Evidence Identified. It is likely that not all program learning outcomes can be assessed within a single capstone course or experience. Questions: Have faculty explicitly determined which program outcomes will be assessed in the capstone? Have they agreed on concrete plans for collecting evidence relevant to each targeted outcome? Have they agreed on explicit criteria, such as rubrics, for assessing the evidence? Have they identified examples of student performance for each outcome at varying performance levels (e.g., below expectations, meeting, exceeding expectations for graduation)?

2. Valid Results. A valid assessment of a particular outcome leads to accurate conclusions concerning students’ achievement of that outcome. Sometimes faculty collect evidence that does not have the potential to provide valid conclusions. For example, a multiple-choice test will not provide evidence of students’ ability to deliver effective oral presentations. Assessment requires the collection of valid evidence and judgments about that evidence that are based on well-established, agreed-upon criteria that specify how to identify low, medium, or high-quality work. Questions: Are faculty collecting valid evidence for each targeted outcome? Are they using well-established, agreed-upon criteria, such as rubrics, for assessing the evidence for each outcome? Have faculty pilot tested and refined their process based on experience and feedback from external reviewers? Are they sharing the criteria with their students? Are they using benchmarking (comparison) data?

3. Reliable Results. Well-qualified judges should reach the same conclusions about individual student’s achievement of a learning outcome, demonstrating inter-rater reliability. If two judges independently assess a set of materials, their ratings can be correlated. Sometimes a discrepancy index is used. How often do the two raters give identical ratings, ratings one point apart, ratings two points apart, etc.? Data are reliable if the correlation is high and/or if the discrepancies are small. Raters generally are calibrated (“normed”) to increase reliability. Calibration usually involves a training session in which raters apply rubrics to pre-selected examples of student work that vary in quality, then reach consensus about the rating each example should receive. The purpose is to ensure that all raters apply the criteria in the same way so that each student’s product receives the same score, regardless of rater. Questions: Are reviewers calibrated? Are checks for inter-rater reliability made? Is there evidence of high inter-rater reliability?
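The two checks described here, a correlation between raters and a discrepancy count, can be run in a few lines once two raters have scored the same set of work. The sketch below uses invented scores and only illustrates the calculations; it is not part of the rubric.

# Minimal sketch of two inter-rater reliability checks: correlation and a
# discrepancy index. Ratings are invented; requires Python 3.10+ for
# statistics.correlation.

from statistics import correlation
from collections import Counter

rater_a = [4, 3, 2, 4, 1, 3, 2, 4]  # rubric scores from rater A
rater_b = [4, 3, 3, 4, 1, 2, 2, 4]  # rater B's scores for the same student work

r = correlation(rater_a, rater_b)   # Pearson correlation between the two raters

# Discrepancy index: how often ratings are identical, one point apart, two apart, ...
gaps = Counter(abs(a - b) for a, b in zip(rater_a, rater_b))

print(f"Inter-rater correlation: {r:.2f}")
for gap in sorted(gaps):
    print(f"Ratings {gap} point(s) apart: {gaps[gap]} of {len(rater_a)}")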

4. Results Are Used. Assessment is a process designed to monitor and improve learning, so assessment findings should have an impact. Faculty should reflect on results for each outcome and decide if they are acceptable or disappointing. If results do not meet faculty standards, faculty should determine which changes should be made, e.g., in pedagogy, curriculum, student support, or faculty support. Questions: Do faculty collect assessment results, discuss them, and reach conclusions about student achievement? Do they develop explicit plans to improve student learning? Do they implement those plans? Do they have a history of securing necessary resources to support this implementation? Do they collaborate with other campus professionals to improve student learning? Do follow-up studies confirm that changes have improved learning?

5. The Student Experience. Students should understand the purposes different educational experiences serve in promoting their learning and development and know how to take advantage of them; ideally they should also participate in shaping those experiences. Thus it is essential to communicate to students consistently and include them meaningfully. Questions: Are purposes and outcomes communicated to students? Do they understand how capstones support learning? Do they participate in reviews of the capstone experience, its outcomes, criteria, or related activities?

174


Automotive Technology Program
Means of Program Assessment and Program Intended Educational Outcomes

Expanded Statement of Institutional Purpose
Mission Statement: Your Community College is an open-admission, community-based, comprehensive college.
Goal Statement: Serve persons of all ages in preparing for job entry and careers in automotive technology.

Program Intended Educational Outcomes:
1. Graduates of the Automotive Technology Program will be successfully employed in the field. (achievement, not Student Learning Outcome)
2. Graduates of the Automotive Technology Program will be technically proficient.
3. Employers of Automotive Technology Program graduates will rate them competent based on the education received in the program.

Summary of Data Collected:
1a. 73% reported employment.
1b. 81% reported employment one year after graduation.
2a. 79% overall success rate. Electrical system malfunction undetected by 34% of students.
2b. Pass rate on the National Automotive Test was 83%; however, on the "hydraulic theory" subscale students missed an average of 34% of questions.
3. 90% reported willingness to employ graduates, but only 50% of body shops.

Use of Results:
1a. Revised criteria for success to 70%.
1b. No action necessary at this time; however, will continue to monitor.
2a. Expanded electrical trouble-shooting component of AT 202 to include automotive electrical systems.
2b. Modified means of teaching hydraulic theory during AT 102 (Basic Auto Systems).
3. Added a body shop representative to the Advisory Committee and are reviewing the curriculum to determine if a separate program is needed.

175


Common elements of a rubric

1. One or more traits or dimensions (criteria) that serve as the bases for judging student responses.
2. Definitions or examples to clarify the meaning of each trait or dimension.
3. A scale of values (a counting system) on which to rate each dimension.
4. Standards of excellence for specified performance levels, accompanied by examples of each level.

Questions for discovering the dimensions or criteria.

1. By what qualities or features will I know whether students have produced an excellent response to my assessment task?
2. What will they do that shows me the extent to which they have mastered the learning outcomes?
3. What do I expect to see if the task is done excellently, acceptably, poorly?
4. Do I have examples of student work, from my class or another source, that exemplify the criteria and that I could use to judge the task?
5. What criteria for this or similar tasks exist in the assessment program for the program, college, etc.?
6. What criteria or dimensions might I adapt from work done at the state or national level or in professional associations?

Evaluating scoring criteria.

1. All important outcomes are addressed by the criteria.
2. The rating strategy matches the decision purpose:
   a. Holistic for a global, evaluative view
   b. Analytic for a diagnostic view
3. The rating scale has usable, easily interpreted scores.
4. Criteria employ concrete references and clear language understandable to students.
5. Criteria reflect current concepts of good work in the field.
6. Criteria are reviewed for developmental, ethnic, and gender bias.
7. Criteria reflect teachable outcomes.
8. Criteria are limited to a feasible number of dimensions.
9. Criteria are generalizable to other similar tasks.

Adapted from Herman, Joan, et al., A Practical Guide to Alternative Assessment.

176


A rubric is a scoring tool that divides assignments into component parts or criteria used for evaluation and provides a detailed description of what is acceptable vs. unacceptable qualities of performance. An analytic rubric makes clear distinctions among the evaluation criteria while a holistic rubric merges the criteria together to stimulate a general judgment about the quality of student work.

Questions To Ask When Constructing Rubrics

1. What criteria or essential elements must be present in the student’s work to ensure that it is high in quality?

2. How many levels of achievement (mastery) do I wish to illustrate for students?

3. For each criterion or essential element of quality, what is a clear description of performance at each achievement level?

4. What are the consequences of performing at each level of quality?

5. What rating scheme will I use in the rubric?

6. When I use the rubric, what aspects work well and what aspects need improvement?

Additional Questions To Consider

1. What content must students master in order to complete the task well?

2. Are there any important aspects of the task that are specific to the context in which the assessment is set?

3. In the task, is the process of achieving the outcome as important as the outcome itself?

Source: Huba, Mary E. and Freed, Jann E. Learner-Centered Assessment on College Campuses. Allyn & Bacon, Boston, MA, 2000. ISBN 0-205-28738-7.

Additional good references:

Moskal, Barbara M. (2000). Scoring Rubrics: What, When and How? Practical Assessment, Research & Evaluation, 7(3). Available online: http://ericae.net/pare/getvn.asp?v=7&n=3

Definitions and construction ideas for scoring rubrics are found at this URL: http://ericae.net/faqs/rubrics/scoring_rubrics.htm

Stevens, Dannelle and Levi, Antonia. Introduction to Rubrics. Stylus Publishing, Herndon, VA, 2004. ISBN 1-57922-114-9. (forthcoming in September 2004)

The assessment leader at Winona State University (MN) has an excellent set of rubrics at this URL: http://www.winona.edu/AIR/ Once there, click on the sample rubrics link in the left frame.

The Center for Learning and Teaching Excellence at Arizona State University has a bank of rubrics at this URL: http://clte.asu.edu/resources/instructors/ Select the Assessment Web link in the center of the page.

The CSU System Office has an excellent set of rubrics at this URL: http://www.calstate.edu/itl/sloa/index.shtml

Examples in action: Johnson County Community College has been making extensive use of rubrics for general education assessment. An overview is provided at http://www.jccc.net/home/depts.php/6111/site/assmnt/cogout

Raymond Walters College has been making extensive use of rubrics and primary trait assessment for individual course assignments. See the examples link at http://www.rwc.uc.edu/phillips/index_assess.html

Prepared by Fred Trapp for the Research and Planning (RP) Group Student Learning Outcomes and Assessment Workshops

177


ECCS Department – Web Site Rubric

Course: _______________________________________ Date: _____________________
Student(s): _________________________________________________________________________
URL: ______________________________________________________________________________

Rating scale: 3 - Expert, 2 - Practitioner, 1 - Apprentice, 0 - Novice

Informational Content (Score: _____)
• Expert (3): Presented information is well-written, concise, accurate and complete. Answers to all expected questions can be found on the site. Contact information of all students is present and will still be valid in the future.
• Practitioner (2): Presented information is accurate and complete. Answers to most expected questions can be found on the site. Contact information of all students is present.
• Apprentice (1): Presented information is incomplete or inaccurate in places. Answers to some expected questions can be found on the site. Contact information of some students is not present.
• Novice (0): Presented information is grossly incomplete and/or inaccurate. Answers to few expected questions can be found on the site. Contact information is not available.

Layout and Design (Score: _____)
• Expert (3): Pages are attractive and consistent in style throughout the site. Site is well organized and is easily navigated from any page. Graphic elements are appropriate, of high quality, and are creatively used to enhance content.
• Practitioner (2): Pages are attractive, but not consistent in style throughout the site. Site is well organized. Graphic elements are appropriate and of acceptable quality to enhance content.
• Apprentice (1): Pages are not attractive, but do not distract. Site is not well organized. Graphic elements are not always appropriate or are of inferior quality.
• Novice (0): Pages are unattractive. Site is not organized or consists of a single page. Graphic elements are not appropriate or not used, or are of such poor quality that they detract from content.

Technical Elements (Score: _____)
• Expert (3): All links work. Graphics used on the page download quickly; thumbnails are used as appropriate. All pages have an appropriate title.
• Practitioner (2): One broken link is present. Graphics used on the page download in a short amount of time. Most pages have an appropriate title.
• Apprentice (1): Two broken links are present. Graphics used on the page cause the user to wait for images to download, but the images contain acceptable content. Some pages have an appropriate title.
• Novice (0): Three or more broken links are present. Graphics used on the page cause the user to wait for images to download, and the images do not contain acceptable content. Few if any pages have an appropriate title.

Color and Typography (Score: _____)
• Expert (3): Font type used is visually appealing and is of appropriate size and style to make text easy to read. Color selection for foreground and background is consistent, visually pleasing, and usable.
• Practitioner (2): Font type used is of appropriate size and style for readable text. Color selection for foreground and background is mostly consistent and usable.
• Apprentice (1): Font type used is not always of appropriate size or style, causing some difficulties in reading. Color selection for foreground and background causes difficulties for those with color-impaired vision.
• Novice (0): Font type used is distracting and/or unattractive; the size and/or style used makes text hard to read. Color selection for foreground and background lacks sufficient contrast, is clashing, or contains too many colors, to the point that the text is hard for anyone to read.

Comments:

Evaluator Signature: __________________________________________________________

178


Accreditation Board for Engineering & Technology, Inc. (ABET)
Outcome 1: An ability to apply math & science in engineering

Level 5 performance characterized by:
* Combines mathematical and/or scientific principles to formulate models of chemical, physical and/or biological processes and systems relevant to civil engineering
* Applies concepts of integral and differential calculus and/or linear algebra to solve civil engineering problems
* Shows appropriate engineering interpretation of mathematical and scientific terms
* Translates academic theory into engineering applications and accepts limitations of mathematical models of physical reality
* Executes calculations correctly, by hand and using mathematical software
* Correctly analyzes data sets using statistical concepts

Level 3 performance characterized by:
* Chooses a mathematical model or scientific principle that applies to an engineering problem, but has trouble in model development
* Shows nearly complete understanding of applications of calculus and/or linear algebra in problem-solving
* Most mathematical terms are interpreted correctly
* Some gaps in understanding the application of theory to the problem and expects theory to predict reality
* Minor errors in calculations, by hand and applying math software
* Minor errors in statistical analysis of data

Level 1 performance characterized by:
* Does not understand the connection between mathematical models and chemical, physical, and/or biological processes and systems in civil engineering
* Does not understand the application of calculus and linear algebra in solving civil engineering problems
* Mathematical terms are interpreted incorrectly or not at all
* Does not appear to grasp the connection between theory and the problem
* Calculations not performed or performed incorrectly by hand; does not know how to use math software
* No application of statistics to analysis of data

Department of Civil and Environmental Engineering, University of Delaware | Newark, DE 19716-3120
phone: 302-831-2442 | e-mail CEE | fax: 302-831-3640

179


PORTFOLIOS Rubric for Assessing the Use of Portfolios for Assessing Program Learning Outcomes

Criterion Initial Emerging Developed Highly Developed

Clarification of Students’ Task

Instructions to students for portfolio development provide insufficient detail for them to know what faculty expect. Instructions may not identify outcomes to be addressed in the portfolio.

Students receive some written instructions for their portfolios, but they still have problems determining what is required of them and/or why they are compiling a portfolio.

Students receive written instructions that describe faculty expectations in detail and include the purpose of the portfolio, types of evidence to include, role of the reflective essay (if required), and format of the finished product.

Students in the program understand the portfolio requirement and the rationale for it, and they view the portfolio as helping them develop self-assessment skills. Faculty may monitor the developing portfolio to provide formative feedback and/or advise individual students.

Valid Results

It is not clear that valid evidence for each relevant outcome is collected and/or individual reviewers use idiosyncratic criteria to assess student work.

Appropriate evidence is collected for each outcome, and faculty have discussed relevant criteria for assessing each outcome.

Appropriate evidence is collected for each outcome; faculty use explicit criteria, such as agreed-upon rubrics, to assess student attainment of each outcome. Rubrics are usually shared with students.

Assessment criteria, e.g., in the form of rubrics, have been pilot-tested and refined over time; they are shared with students, and student may have helped develop them. Feedback from external reviewers has led to refinements in the assessment process. The department also uses external benchmarking data.

Reliable Results

Those who review student work are not calibrated to apply assessment criteria in the same way, and there are no checks for inter-rater reliability.

Reviewers are calibrated to apply assessment criteria in the same way or faculty routinely check for inter-rater reliability.

Reviewers are calibrated to apply assessment criteria in the same way, and faculty routinely check for inter-rater reliability.

Reviewers are calibrated; faculty routinely find that assessment data have high inter-rater reliability.

Results Are Used

Results for each outcome are collected, but they are not discussed among the faculty.

Results for each outcome are collected and discussed by the faculty, but results have not been used to improve the program.

Results for each outcome are collected, discussed by faculty, and used to improve the program.

Faculty routinely discuss results, plan needed changes, secure necessary resources, and implement changes. They may collaborate with others, such as librarians or Student Affairs professionals, to improve student learning. Students may also participate in discussions and/or receive feedback, either individual or in the aggregate. Follow-up studies confirm that changes have improved learning.

If e-Portfolios Are Used

There is no technical support for students or faculty to learn the software or to deal with problems.

There is informal or minimal formal support for students and faculty.

Formal technical support is readily available and proactively assists in learning the software and solving problems.

Support is readily available, proactive, and effective. Tech support personnel may also participate in refining the overall portfolio process.

180


How Visiting Team Members Can Use the Portfolio Rubric

Portfolios can serve many purposes besides assessment; in fact, these other purposes are actually much more common. Portfolios may be compiled so students can share their work with family and friends. They may be designed to build students' confidence by showing development over time or by displaying best work. They may be used for advising and career counseling, or so students can show their work during a job interview. The first thing a team needs to do is determine that the portfolios are used for assessment, and not for another purpose. Conclusions about the quality of the assessment process should be based on discussion with relevant department members (e.g., chair, assessment coordinator, faculty, students) and a review of the program's written portfolio assignment. Two common types of portfolios are:
• Showcase portfolios: collections of each student's best work
• Developmental portfolios: collections of work from early, middle, and late stages in the student's academic career that demonstrate growth

Faculty generally require students to include a reflective essay that describes how the evidence in the portfolio demonstrates their achievement of program learning outcomes. Sometimes faculty monitor developing portfolios to provide formative feedback and/or advising to students, and sometimes they collect portfolios only as students near graduation. Portfolio assignments should clarify the purpose of the portfolio, what kinds of evidence should be included, and the format (e.g., paper vs. e-portfolios); and students should view the portfolio as contributing to their personal development. The rubric has four major dimensions and a fifth dimension limited to e-portfolios:

1. Clarification of Students' Task. Most students have never created a portfolio, and they need explicit guidance. Questions: Does the portfolio assignment provide sufficient detail so students understand the purpose, the types of evidence to include, the learning outcomes to address, the role of the reflective essay (if any), and the required format? Do students view the portfolio as contributing to their ability to self-assess? Do faculty use the developing portfolios to assist individual students?

2. Valid Results. Sometimes portfolios lack valid evidence for assessing particular outcomes. For example, portfolios may not allow faculty to assess how well students can deliver oral presentations. Judgments about that evidence need to be based on well-established, agreed-upon criteria that specify (usually in rubrics) how to identify work that meets or exceeds expectations. Questions: Do the portfolios systematically include valid evidence for each targeted outcome? Are faculty using well-established, agreed-upon criteria, such as rubrics, to assess the evidence for each outcome? Have faculty pilot tested and refined their process? Are criteria shared with students? Are they collaborating with colleagues at other institutions to secure benchmarking (comparison) data?

3. Reliable Results. Well-qualified judges should reach the same conclusions about a student’s achievement of a learning outcome, demonstrating inter-rater reliability. If two judges independently assess a set of materials, their ratings can be correlated. Sometimes a discrepancy index is used. How often do the two raters give identical ratings, ratings one point apart, ratings two points apart, etc.? Data are reliable if the correlation is high and/or if discrepancies are small. Raters generally are calibrated (“normed”) to increase reliability. Calibration usually involves a training session in which raters apply rubrics to pre-selected examples of student work that vary in quality, then reach consensus about the rating each example should receive. The purpose is to ensure that all raters apply the criteria in the same way so that each student’s product would receive the same score, regardless of rater. Questions: Are reviewers calibrated? Are checks for inter-rater reliability made? Is there evidence of high inter-rater reliability?
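Since the same reliability checks apply to portfolio scoring, another common summary is a simple agreement rate: the share of papers on which two raters give identical scores, and the share on which they are within one point. A minimal sketch with invented scores:

# Minimal sketch: exact and adjacent agreement between two calibrated raters.
# Scores are invented; in practice they would come from the portfolio rubric.

rater_a = [3, 2, 4, 1, 3, 4, 2, 3, 1, 4]
rater_b = [3, 2, 3, 1, 3, 4, 2, 4, 1, 4]

pairs = list(zip(rater_a, rater_b))
exact = sum(1 for a, b in pairs if a == b) / len(pairs)              # identical ratings
adjacent = sum(1 for a, b in pairs if abs(a - b) <= 1) / len(pairs)  # within one point

print(f"Exact agreement: {exact:.0%}")
print(f"Agreement within one point: {adjacent:.0%}")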

4. Results Are Used. Assessment is a process designed to monitor and improve learning, so assessment findings should have an impact. Faculty should reflect on results for each outcome and decide if they are acceptable or disappointing. If results do not meet their standards, faculty should determine what changes should be made, e.g., in pedagogy, curriculum, student support, or faculty support. Questions: Do faculty collect assessment results, discuss them, and reach conclusions about student achievement? Do they develop explicit plans to improve student learning? Do they implement those plans? Do they have a history of securing necessary resources to support this implementation? Do they collaborate with other campus professionals to improve student learning? Do follow-up studies confirm that changes have improved learning?

5. If e-Portfolios Are Used. Faculty and students alike require support, especially when a new software program is introduced. Lack of support can lead to frustration and failure of the process. Support personnel may also have useful insights into how the portfolio assessment process can be refined. Questions: What is the quality and extent of technical support? Of inclusion in review and refinement of the portfolio process? What is the overall level of faculty and student satisfaction with the technology and support services?

181


Dale McIver’s Problem

Dale McIver teaches in the office automation program at Watson Technical, an area vocational technical school in Dade County, Florida. She teaches a five-course sequence leading to a certificate in office computer applications software. However, few of her students complete the full sequence. Instead, her classes are primarily composed of students wanting to gain some initial familiarity with some computer applications and very basic information system concepts or wanting to upgrade their skills in particular ways. Dale is frustrated with her current grading system, which is based on unit tests from the class workbooks and textbook. The test scores do not give her or the students enough information about the students’ abilities to respond to realistic office demands that involve several software programs and problem solving. She is looking for an assessment system that will engage her students more, help them understand their own strengths and weaknesses, and provide her with information to improve her instruction. She believes there is too much emphasis on rote learning of commands and functions. She wants her students to be better problem solvers when it comes to using computers in the office environment. What assessment plan would you recommend to Dale?

182


Accrediting Commission for Community and Junior Colleges

Western Association of Schools and Colleges

Rubric for Evaluating Institutional Effectiveness – Part III: Student Learning Outcomes (See attached instructions on how to use this rubric.)

Levels of Implementation

Characteristics of Institutional Effectiveness in Student Learning Outcomes

(Sample institutional behaviors)

Awareness

• There is preliminary, investigative dialogue about student learning outcomes.
• There is recognition of existing practices such as course objectives and how they relate to student learning outcomes.
• There is exploration of models, definitions, and issues taking place by a few people.
• Pilot projects and efforts may be in progress.
• The college has discussed whether to define student learning outcomes at the level of some courses or programs or degrees; where to begin.

Development

• College has established an institutional framework for definition of student learning outcomes (where to start), how to extend, and timeline.
• College has established authentic assessment strategies for assessing student learning outcomes as appropriate to intended course, program, and degree learning outcomes.
• Existing organizational structures (e.g. Senate, Curriculum Committee) are supporting strategies for student learning outcomes definition and assessment.
• Leadership groups (e.g. Academic Senate and administration) have accepted responsibility for student learning outcomes implementation.
• Appropriate resources are being allocated to support student learning outcomes and assessment.
• Faculty and staff are fully engaged in student learning outcomes development.

Proficiency

• Student learning outcomes and authentic assessment are in place for courses, programs and degrees.
• Results of assessment are being used for improvement and further alignment of institution-wide practices.
• There is widespread institutional dialogue about the results.
• Decision-making includes dialogue on the results of assessment and is purposefully directed toward improving student learning.
• Appropriate resources continue to be allocated and fine-tuned.
• Comprehensive assessment reports exist and are completed on a regular basis.
• Course student learning outcomes are aligned with degree student learning outcomes.
• Students demonstrate awareness of goals and purposes of courses and programs in which they are enrolled.

Sustainable Continuous Quality Improvement

• Student learning outcomes and assessment are ongoing, systematic and used for continuous quality improvement.
• Dialogue about student learning is ongoing, pervasive and robust.
• Evaluation and fine-tuning of organizational structures to support student learning is ongoing.
• Student learning improvement is a visible priority in all practices and structures across the college.
• Learning outcomes are specifically linked to program reviews.

JP;DB: cg 8/2007

183


Cambridge West Partnership, LLC

Vocational Information Center: http://www.khake.com/page50.html
Competency Model Clearinghouse; Authentic Assessment Toolbox
Career and Technical Education National Research Center: http://www.nccte.org/repository/ (repository of skills standards)
U.S. Department of Labor: http://www.careeronestop.org/ and http://www.careeronestop.org/CompetencyModel/default.aspx and http://www.careeronestop.org/CompetencyModel/learnCM.aspx

General
• Learn about competency models
• Find competency model resources
• Build a competency model

Exploring industries
• Advanced manufacturing
• Financial services
• Retail industry
• Hospitality/hotel & lodging
• Energy/generation, transmission & distribution

184


Plenary

The Learning-centered

Institution: Curriculum, Pedagogy and Assessment

for Student Success

Amy Driscoll

185


The Learning-centered Institution:Curriculum, Pedagogy and

Assessment for Student Success

Amy Driscoll WASC EDUCATIONAL SEMINAR

September 25 2009

2

Outcomes for Plenary

Articulate and discuss the rationale for and impact of learning-centered curriculum, pedagogy, and assessment on student success with colleagues, administration, students, and others.
Design assessment that promotes learning-centered curriculum and pedagogy and ultimately student success.

3

Questions for your campus team:

What are four indicators (or more) of student success at your institution? Is there an indicator that is distinctive to your institution? That we would not use to describe student success at other institutions?

186


4

National Indicators of Student Success

Academic Achievement
Citizenship
Retention and Graduation
Collaboration
Well-being and Health
(American Council on Education, AACU's LEAP, NSSE)

5

Success Indicators for Graduate/Doctoral Programs

National rankings (U S News, NRC periodic review)
Career trajectories of alumni
Student report/self evaluations (Survey of Doctoral Education, Nat'l Assoc of Graduate and Professional Student survey)
Graduation/attrition rates - "time to degree"
External Review Committees

6

Key Ideas

Defining Student Success
Hallmarks of a Learning-centered Education
A Learning-centered Assessment Process
Designing Learning-centered Courses, Programs, and Pedagogy
Cycle of Assessment for Improving Learning

187


Learning-centered = Student Success

WHY?

8

Reasons for Student Success and Learning-centered Focus:

Institutions that focus on student success (learning-centered) are better positioned to help their students attain their educational objectives (or goals and outcomes)

Assessment and accountability efforts need to be focused on what matters to student success (learning-centered) (Kuh, 2006)

9

Hallmarks of Learning-centered Education

Curriculum

Pedagogy

Assessment

188


10

Learning-centered Curriculum

Synthesizes content
Builds on previous learning
Integrates education and experience
Communicates values for and connection to student lives
Attends to learning needs

= RELEVANCE, LONG-TERM MEMORY, MOTIVATION, RESPONSIVE

How do we get to know our students?

Their previous learning, their experiences, their needs, their lives, their assets, their challenges, and so on?

12

Learning-centered Pedagogy

Students have clear expectations
Students are actively involved
Students apply knowledge to important issues and problems
Students find relevance and value
Students experience support and feedback for learning
Students are able to practice and take risks

= CONFIDENCE, ENGAGEMENT, PERSONAL LEARNING, SECURE ENVIRONMENT

189


13

Learning-centered Assessment

Assessment is ongoing, not episodic.
Students understand and value the criteria, standards, and methods by which they are assessed.
The purpose of assessment is to improve student learning.
Faculty consider student perspectives.
The assessment activity makes sense to students.

= OWNERSHIP, SECURITY, CLARITY

14

Processes for Developing Learning-centered Education

Reflect on purpose of assessment and define
Align pedagogy/assessment with mission and values of institution, department, faculty
Articulate goals, outcomes, evidence, criteria, and standards
Design curriculum and pedagogy to achieve learning outcomes
Conduct collaborative review of student evidence
Use review to improve learning

15

Developing Learning-centered Assessment

Describe educational culture - fit?
Engage in inquiry process - focus on student success
Study list of possible purpose(s)
Consider how assessment can support your intentions for student success
Define assessment with the inclusion of student success

190


16

An Inquiry Process for Assessment:

What questions are you trying to answer with your assessment?

What questions do you care about?

What answers do you already have?

17

Possible Purposes for Assessment: which purposes support student success?

Provide feedback to students
Classify or grade achievement
Enable students to correct errors and improve learning
Motivate students focusing on their sense of achievement
Consolidate and summarize student learning
Estimate students' potential to progress to other courses
Help students apply learning to practical contexts
Give us feedback on how effective we are at promoting learning
Provide data for internal and external accountability

18

Possibilities: Purpose/Definition

"The purpose of assessment is to improve learning" (Angelo, 2000)

“Assessment is a dynamic pedagogy that extends, expands, enhances, and strengthens learning” (Driscoll, 2001)

191


19

Thinking about Assessment

Does assessment flow from the institution's mission and reflect the educational values? From the department's mission?
Does assessment address questions that people really care about?
Does assessment help faculty fulfill their responsibilities to students, to the public?
Does assessment promote student success?

20

Fundamental questions for graduate/doctoral education:

What is the purpose of the doctoral program?

What is the rationale or educational purpose of each element of the doctoral program?

How do you know what is working? What should be changed or eliminated? What should be affirmed or retained?

(Carnegie Institute on the Doctorate, 2007)

Starting with Mission…

Missions contribute meaning to our definition of student success with the uniqueness of the institution.

192


22

Aligning Mission with Educational Goals for Assessment

Our central mission is to develop life-long learning skills, impart society’s cultural heritage, and educate and prepare for both the professions and advanced study.

23

Aligning Values With Educational Goals

ESU has a commitment to academic and personal integrity.

GOALS:
Academic Integrity
Personal Integrity

24

School/Departmental Missions

“respond effectively to issues of diversity, ambiguity, and conflict as natural parts of American politics” (Division of Political Science)
“work toward influencing health behaviors through modification of lifestyles and changes to the environment” (School of Community Health)

193


25

Important Foundational Questions:

What goals or indicators of student success emerge or pop out of your mission, values, vision, etc.?

What promises do your brochures or websites make about student success?

26

Assessment Protocols for Learning-centered Assessment

GOAL

OUTCOMES

Evidence

Criteria

Standards:
a) Exemplary Achievement
b) Satisfactory Achievement
c) Unsatisfactory Achievement

27

Goals

Broad descriptions

Categories of learning outcomes

End toward which efforts are directed

194


28

Examples of Goals

Critical Thinking

Citizenship in a Democracy (Grad. School of Education)

Teamwork and Collaboration (School of Community Health)

Ethics

29

Goals for Graduate Programs

“State of the Art” Disciplinary Knowledge
Leadership
Scholarly Communication
Assessment and Evaluation

30

Impact of Goals on Student Learning & Success

Focuses student learning efforts for increased success

Translates mission and values to help make sense of learning

Provides rationale for and makes meaning of curriculum and pedagogy to motivate for success

195


31

Student Learning Outcomes

Refer to Results in Terms of Specific Student Learning, Development, and Performance (Braskamp and Braskamp, 1997)

Answer the Question – “What Do We Expect of Our Students?” (CSU Report 1989)

Describe Actual Skills, Understandings, Behaviors, Attitudes, Values Expected of Students

32

Examples of Outcomes

Math: Use arithmetical, algebraic, geometric, and statistical methods to solve problems.

Ethics: Identify and analyze real world ethical problems or dilemmas and identify those affected by the dilemma.

Culture and Equity: Analyze and describe the concepts of power relations, equity, and social justice and find examples of each concept in the U.S. society and other societies.

Team work: Listens to, acknowledges, and builds on the ideas of others.

33

Examples of Outcomes for Graduate Programs

Generate new knowledge through research/scholarship and transmit that knowledge to others
Describe and apply techniques, technologies, and strategies that promote required or desired change
Work effectively with individuals from diverse cultural backgrounds
Articulate and follow ethical standards consistent with professional commitment

196


34

Impact of Outcomes on Student Learning & Success

Directs student learning efforts for greater success

Motivates student learning efforts for greater success

Promotes deep learning due to understanding of expectations which leads to greater success

35

Evidence

Student Work that Demonstrates Achievement of Outcomes (Assignments, Projects, Presentations, Papers, Responses to Questions, Etc.)
Designed for appropriate level of learning expectations (outcomes)
Opportunity for Different Ways of Demonstrating Learning

36

Examples of Evidence

Teamwork: Role play or case study; project or problem-solving assignment

Math: Mathematical and statistical projects and papers

Ethics: A written account; a multi-media presentation or display board; an audio tape

197


37

Impact of Evidence on Student Learning & Success

Limit or expand the ways students demonstrate learning - success for all students?

Enrich and enhance learning and success

Provide opportunity to integrate experience with learning for enhanced success

38

Criteria

Qualities Desired in Student Work (Evidence)

Represent Powerful Professional Judgment of Faculty

Guide Student Learning Efforts

Promote Lifelong Learning

Support Faculty in Making Objective Evaluations

39

Examples of Criteria

Math: Accuracy; Complexity; Clarity and Coherence

Ethics: Complexity (broad, multifaceted, interconnected); Conscious Awareness

Culture and Equity: Range of Cultures; Reflectivity and Integration

Teamwork: Respect; Flexibility

198


40

Criteria for Leadership (Graduate program goal)

Balance
Change
Self-appraisal
Strategic engagement (planning, evaluation, implementation, and assessment)

(Maki & Borkowski, 2006)

41

Impact of Criteria on Student Learning & Success

Promotes confidence in their learning efforts = success
Promotes qualities of life-long learning = life success
Promotes habits of self-assessment = success
Promotes students’ sense of fairness of evaluation = increased effort = success

42

Important Question

What criteria would be distinctive of your institution? Would they cut across most student work? Or be distinctive for your graduates?

IDEAS: scholarship, multiple perspectives, reflection, commitment

199


43

Standards/Rubrics

Describe Different Levels of Criteria

Describe Specific Indications of Criteria

Promote Understanding of Criteria

Support Faculty in Making Objective Evaluations

44

Examples of Standards/Rubrics

Math (Accuracy)
Satisfactory: Contains few errors, and those errors do not significantly undermine the quality of the work. Considers and uses data, models, tools, or processes that reasonably and effectively address issues or problems.
Unsatisfactory: Contains one or more errors that significantly undermine the quality of the work. Uses data, models, tools, or processes in inappropriate or ineffective ways.

Ethics (Complexity)
Standard for Excellent: Consistently views sophisticated and significant dilemmas and issues with a broad focus and from multiple perspectives.
Standard for Satisfactory: Usually views sophisticated and significant dilemmas and issues with a broad focus, but may sometimes use a more narrow focus and may use fewer perspectives.
Standard for Unsatisfactory: Mainly views issues and dilemmas in simple terms and usually does so with a limited focus and minimal perspectives.

45

Rubric for Leadership: Balance in Facilitating Group Processes

Exemplary: Leads and empowers group members towards consensual solutions that maximize members’ commitment to and satisfaction with agreed-upon responses.
Proficient: Is hesitant but able to lead and empower group members in…
Marginal: Requires significant assistance in leading and empowering group members…after an extended time period.
Unacceptable: Is not able to lead or empower a group…
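The goal–outcome–criteria–standards protocol behind these examples can also be written down in a form that makes tallying reviewers' judgments straightforward. The short Python sketch below is illustrative only and is not part of the retreat materials: the rubric fragment is loosely adapted from the Math (Accuracy) example above, the level descriptors are condensed paraphrases, and the ratings are hypothetical.

# Minimal sketch (hypothetical data): recording one criterion's standards and
# tallying a review team's ratings of student evidence against them.
from collections import Counter

rubric = {
    "goal": "Quantitative Reasoning",
    "outcome": "Use arithmetical, algebraic, geometric, and statistical methods to solve problems.",
    "criterion": "Accuracy",
    "standards": {
        "Exemplary": "Error-free work; data, models, and tools used effectively.",
        "Satisfactory": "Few errors, none of which undermine the quality of the work.",
        "Unsatisfactory": "Errors that significantly undermine the quality of the work.",
    },
}

def summarize_ratings(ratings):
    """Count how many pieces of student evidence were placed at each standard."""
    allowed = set(rubric["standards"])
    unknown = [r for r in ratings if r not in allowed]
    if unknown:
        raise ValueError(f"Ratings not defined in the rubric: {unknown}")
    return Counter(ratings)

# Example: ratings a faculty team might assign to ten work samples.
sample_ratings = ["Satisfactory", "Exemplary", "Satisfactory", "Unsatisfactory",
                  "Satisfactory", "Exemplary", "Satisfactory", "Satisfactory",
                  "Unsatisfactory", "Satisfactory"]
print(summarize_ratings(sample_ratings))
# e.g. Counter({'Satisfactory': 6, 'Exemplary': 2, 'Unsatisfactory': 2})

A tally like this is only the starting point; the collaborative review described later in the handout is where faculty interpret the pattern and decide what to change.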

200


46

Impact of Standards on Student Learning & Success

Guides level of student investment = focused success
Provides insight into the assessment process – understanding leads to success
Promotes confidence in how their work will be evaluated – security in learning

47

Assessing Student Learning: Course, Program and Institutional Levels

1. Preparation: Determine purpose(s) and definition of assessment; examine mission and values
2. Design assessment: Articulate goals; develop clear outcomes, evidence, criteria, and standards
3. Align curriculum and pedagogy with learning outcomes
4. Make outcomes, evidence, criteria, and standards “public and visible” (syllabi, programs, brochures)
5. Collect evidence of student achievement
6. Review and analyze student evidence
7. Revise outcomes and criteria; improve pedagogy and curriculum for learner success

48

Step 3: Aligning Curriculum and Pedagogy with Learning Outcomes

Outcomes and Criteria as Planning Focus
Faculty Alignment Grids
Learner Grids
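The alignment grids named above are described later in this handout's rubric guides as a matrix showing the relationship between required courses and the program's learning outcomes. The Python sketch below is a minimal illustration under that assumption; the course numbers, outcome names, and I/D/M levels are hypothetical.

# Minimal sketch (hypothetical courses and outcomes): a faculty alignment grid
# recorded as a matrix of courses x outcomes, where I/D/M mark the level at
# which each outcome is treated (Introduced, Developed, Mastered).
alignment_grid = {
    "POL 101": {"Critical Thinking": "I", "Ethics": "I"},
    "POL 250": {"Critical Thinking": "D", "Teamwork": "I"},
    "POL 380": {"Ethics": "D", "Teamwork": "D"},
    "POL 490 (capstone)": {"Critical Thinking": "M", "Ethics": "M", "Teamwork": "M"},
}

OUTCOMES = ["Critical Thinking", "Ethics", "Teamwork"]
LEVELS = {"I": 1, "D": 2, "M": 3}  # increasing sophistication

def check_coverage(grid, outcomes):
    """Report the highest level each outcome reaches anywhere in the required
    curriculum; 0 flags an outcome no course addresses."""
    report = {}
    for outcome in outcomes:
        levels = [LEVELS[cells[outcome]] for cells in grid.values() if outcome in cells]
        report[outcome] = max(levels) if levels else 0
    return report

print(check_coverage(alignment_grid, OUTCOMES))
# {'Critical Thinking': 3, 'Ethics': 3, 'Teamwork': 3}

A grid like this makes gaps visible (an outcome that is introduced but never developed, or a course that addresses no program outcome) before any student evidence is collected.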

201


49

Student Learning Outcome Example: Implications?

Students describe and assume personal responsibility in collaborative endeavors, and respect and support the contributions of others.

Students analyze ethical issues from a variety of cultural perspectives.

50

Step 4: Communicating Learning Outcomes – Leads to Success

Public and Visible

Relevant and Meaningful

Motivating and Supportive of Learning

51

Step 5: Collect Evidence of Student Achievement

Collect representative samples from each course

Organize collaborative faculty teams for review

Use outcomes, criteria, and directions for assignments or assessments

202


52

Step 6: Review and Analyze Evidence

Read holistically to determine whether outcomes are achieved (reliability).
Several readings to identify examples of criteria (validity).
Final reading for insights about pedagogy, class structure and environment, and learning supports.

53

Changes in Teaching, Assessment and Reflection on Pedagogy to Promote Student Success

Scaffolding

Iterative assessment

Assessment used as a teaching tool

54

Processes for Developing Learning-centered Education and Student Success

Develop purpose and definition
Review/analyze mission and values
Articulate goals, outcomes, evidence, criteria, and standards
Design curriculum and pedagogy
Make assessment public and visible
Systematically collect student evidence
Conduct collaborative review of student evidence
Use review to improve learning

203


55

SUMMARY

Outcomes-based assessment will intentionally focus your teaching, curriculum, and assessment on student learning in ways that are authentic, that are helpful to you and your students, that provide accountability for you and others, and that actually increase student success.

56

INSPIRATIONS:

Let’s begin to think of students as scholars and teach accordingly.

Let’s model how to learn from mistakes.

Let’s work to elevate learning to the level of identity rather than level of accomplishment.

204


RESOURCES

205


BIBLIOGRAPHY

206


Literature on Learning and Assessment

Allen, M. J. (2004). Assessing Academic Programs in Higher Education. Bolton, MA: Anker.

Allen, M. J. (2006). Assessing General Education Programs. Bolton, MA: Anker.

American Association for Higher Education. (1998). Powerful Partnerships: A Shared Responsibility for Learning (www.aahe.org/assessment/joint.htm). Washington, DC: AAHE, ACPA, & NASPA.

Anderson, L. W. & Associates. (2000). A Taxonomy for Learning, Teaching, and Assessing: A Revision of Bloom’s Taxonomy of Educational Objectives. Boston: Allyn & Bacon.

Angelo, T. A., & Cross, K. P. (1993). Classroom Assessment Techniques: A Handbook for College Teachers. San Francisco: Jossey-Bass.

Astin, A. W. (1991). Assessment for Excellence: The Philosophy and Practice of Assessment and Evaluation in Higher Education. New York: Macmillan.

Astin, A. W. (1993). What Matters in College? Four Critical Years Revisited. San Francisco: Jossey-Bass.

Banta, T. W. & Associates. (2002). Building a Scholarship of Assessment. San Francisco: Jossey-Bass.

Bourne, J. & Moore, J. (Eds.). (2004). Elements of Quality Online Education: Into the Mainstream. Needham: Sloan Consortium.

Brown, J. S. & Duguid, P. (2000). The Social Life of Information. Boston: Harvard Business School Press.

Burke, J. C. & Associates. (2005). Achieving Accountability in Higher Education: Balancing Public, Academic, and Market Demands. San Francisco: Jossey-Bass.

Cohen, A. M. (1998). The Shaping of American Higher Education: Emergence and Growth of the Contemporary System. San Francisco: Jossey-Bass.

Cross, K. P., & Steadman, M. H. (1996). Classroom Research: Implementing the Scholarship of Teaching. San Francisco: Jossey-Bass.

DeZure, D. (Ed.). (2000). Learning from Change: Landmarks in Teaching and Learning in Higher Education from Change Magazine 1969-1999. Sterling, VA: Stylus.

DiStefano, A., Rudestam, K. E., & Silverman, R. J. (Eds.). (2004). Encyclopedia of Distributed Learning. Thousand Oaks, CA: Sage.


Doherty, A., Riordan, T., & Roth, J. (Eds.). (2002). Student Learning: A Central Focus for Institutions of Higher Education. Milwaukee, WI: Alverno College Institute.

207



Driscoll, A. & Cordero de Noriega, D. (2006). Taking Ownership of Accreditation: Assessment Processes that Promote Institutional Improvement and Faculty Engagement. Sterling, VA: Stylus.

Driscoll, A. & Wood, S. (2007). Developing Outcomes-Based Assessment for Learner-Centered Education: A Faculty Introduction. Sterling, VA: Stylus.

Erwin, T. D. (1991). Assessing Student Learning and Development. San Francisco: Jossey-Bass.

Fried, J. & Evans, N. (2004). Learning Reconsidered: A Campus-Wide Focus on the Student Experience. Washington, DC: National Association of Student Personnel Administrators and The American College Personnel Association.

Gaff, J. G., & Ratcliff, J. L. (1997). Handbook of the Undergraduate Curriculum: A Comprehensive Guide to Purposes, Structures, Practices, and Change. San Francisco: Jossey-Bass.

Gardiner, L. (1994). Redesigning Higher Education: Producing Dramatic Gains in Student Learning. East Lansing, MI: ASHE Higher Education Report Series.

Hakel, M. & Halpern, D. F. (Eds.). (2002). Applying the Science of Learning to University Teaching and Beyond: New Directions for Teaching and Learning. San Francisco: Jossey-Bass.

Halpern, D. F. (2002). Thought and Knowledge: An Introduction to Critical Thinking. Mahwah, NJ: Lawrence Erlbaum.

Huber, M. T., & Morreale, S. P. (Eds.). (2002). Disciplinary Styles in the Scholarship of Teaching and Learning: Exploring Common Ground. Washington, DC: American Association for Higher Education.

Huba, M. E. & Freed, J. E. (2000). Learner-Centered Assessment on College Campuses: Shifting the Focus from Teaching to Learning. Boston: Allyn & Bacon.

Jarvis, P. (Ed.). (2001). The Age of Learning: Education and the Knowledge Society. London: Kogan Page.

Leskes, A. & Wright, B. (2005). The Art & Science of Assessing General Education Outcomes. Washington, DC: Association of American Colleges and Universities.

Lewis, R. G. & Smith, D. H. (1994). Total Quality in Higher Education. Delray Beach, FL: St. Lucie Press.

Maki, P. (2004). Assessing for Learning: Building a Sustainable Commitment Across the Institution. Sterling, VA: Stylus.

Maki, P. (2006). The Assessment of Doctoral Education. Sterling, VA: Stylus.

Massy, W. F. (2003). Honoring the Trust: Quality and Cost Containment in Higher Education. Bolton, MA: Anker.

208



Mentkowski, M. & Associates. (2000). Learning That Lasts: Integrating Learning, Development, and Performance in College and Beyond. San Francisco: Jossey-Bass.

Mestre, J. (2005). Transfer of Learning: Research and Perspectives.

Miller, R. (2007). Assessment in Cycles of Improvement: Faculty Designs for Essential Learning Outcomes. Washington, DC: Association of American Colleges and Universities.

Miller, R. & Leskes, A. (2005). Levels of Assessment: From the Student to the Institution. Washington, DC: Association of American Colleges and Universities.

Musil, C. M. (2006). Assessing Global Learning: Matching Good Intentions with Good Practice. Washington, DC: Association of American Colleges and Universities.

National Research Council. (2000). How People Learn: Brain, Mind, Experience, and School. Washington, DC: National Academy Press.

National Research Council. (2001). Knowing What Students Know: The Science and Design of Educational Assessment. Washington, DC: National Academy Press.

National Research Council. (2003). Evaluating and Improving Undergraduate Teaching in Science, Technology, Engineering, and Mathematics. Washington, DC: National Academy Press.

Nichols, J. O. (2005). A Road Map for Improvement of Student Learning and Support Services Through Assessment. New York: Agathon Press.

O’Banion, T. (1997). A Learning College for the 21st Century. Phoenix, AZ: Oryx & American Council on Education.

Palmer, P. J. (1993). To Know As We Are Known: Education As A Spiritual Journey. San Francisco: Harper.

Palomba, C. A. & Banta, T. W. (1999). Assessment Essentials: Planning, Implementing and Improving Assessment in Higher Education. San Francisco: Jossey-Bass.

Riordan, T. & Roth, J. (2005). Disciplines as Frameworks for Student Learning. Sterling, VA: Stylus.

Schuh, J. H. & Associates. (2009). Assessment Methods for Student Affairs (Foreword by M. Lee Upcraft). San Francisco: Jossey-Bass.

Shavelson, R. J. (2007). A Brief History of Student Learning Assessment: How We Got Where We Are and a Proposal for Where to Go Next. Washington, DC: Association of American Colleges and Universities.

Silverman, S. L. & Casazza, M. E. (2000). Learning and Development: Making Connections to Enhance Teaching. San Francisco: Jossey-Bass.

Stevens, D. & Levi, A. J. (2005). Introduction to Rubrics: An Assessment Tool to Save Grading Time, Convey Effective Feedback and Promote Student Learning. Sterling, VA: Stylus.

209



Stiehl, R. & Lewchuk, L. (2005). The Mapping Primer: Tools for Reconstructing the College Curriculum. Corvallis, OR: The Learning Organization Press. ISBN 978-0-9637457-3-6. Available for purchase only via http://www.outcomesnet.com

Stiehl, R. & Lewchuk, L. (2008). The Assessment Primer: Creating a Flow of Learning Evidence. Corvallis, OR: The Learning Organization Press. ISBN 978-0-9637457-5-0. Available for purchase only via http://www.outcomesnet.com

Stiehl, R. & Lewchuk, L. (2008). The Outcomes Primer: Reconsidering the College Curriculum (3rd ed.). Corvallis, OR: The Learning Organization Press. ISBN 978-0-9637457. Available for purchase only via http://www.outcomesnet.com

Suskie, L. (2004). Assessing Student Learning: A Common Sense Guide. Bolton, MA: Anker.

Urban Universities Portfolio Project. (2002). Metropolitan Universities: An International Forum. Special Issue, September, 13:3.

Vaill, P. B. (1996). Learning As A Way Of Being: Strategies for Survival in a World of Permanent White Water. San Francisco: Jossey-Bass.

Walvoord, B. E. (2004). Assessment Clear and Simple: A Practical Guide for Institutions, Departments and General Education. San Francisco: Jossey-Bass.

Walvoord, B. A. & Anderson, J. A. (1998). Effective Grading: A Tool for Learning and Assessment. San Francisco: Jossey-Bass.

Wellman, J. V. & Ehrlich, T. (Eds.). (2003). How the Student Credit Hour Shapes Higher Education: The Tie That Binds. San Francisco: Jossey-Bass.

Wiggins, G. (1998). Educative Assessment: Designing Assessment to Inform and Improve Student Performance. San Francisco: Jossey-Bass.

210


RUBRICS

ACSCU

211


PROGRAM LEARNING OUTCOMES
Rubric for Assessing the Quality of Academic Program Learning Outcomes

Criterion: Initial / Emerging / Developed / Highly Developed (each criterion below is described at these four levels, in that order)

Comprehensive List

The list of outcomes is problematic: e.g., very incomplete, overly detailed, inappropriate, disorganized. It may include only discipline-specific learning, ignoring relevant institution-wide learning. The list may confuse learning processes (e.g., doing an internship) with learning outcomes (e.g., application of theory to real-world problems).

The list includes reasonable outcomes but does not specify expectations for the program as a whole. Relevant institution-wide learning outcomes and/or national disciplinary standards may be ignored. Distinctions between expectations for undergraduate and graduate programs may be unclear.

The list is a well-organized set of reasonable outcomes that focus on the key knowledge, skills, and values students learn in the program. It includes relevant institution-wide outcomes (e.g., communication or critical thinking skills). Outcomes are appropriate for the level (undergraduate vs. graduate); national disciplinary standards have been considered.

The list is reasonable, appropriate, and comprehensive, with clear distinctions between undergraduate and graduate expectations, if applicable. National disciplinary standards have been considered. Faculty have agreed on explicit criteria for assessing students’ level of mastery of each outcome.

Assessable Outcomes

Outcome statements do not identify what students can do to demonstrate learning. Statements such as “Students understand scientific method” do not specify how understanding can be demonstrated and assessed.

Most of the outcomes indicate how students can demonstrate their learning.

Each outcome describes how students can demonstrate learning, e.g., “Graduates can write reports in APA style” or “Graduates can make original contributions to biological knowledge.”

Outcomes describe how students can demonstrate their learning. Faculty have agreed on explicit criteria statements, such as rubrics, and have identified examples of student performance at varying levels for each outcome.

Alignment

There is no clear relationship between the outcomes and the curriculum that students experience.

Students appear to be given reasonable opportunities to develop the outcomes in the required curriculum.

The curriculum is designed to provide opportunities for students to learn and to develop increasing sophistication with respect to each outcome. This design may be summarized in a curriculum map.

Pedagogy, grading, the curriculum, relevant student support services, and co-curriculum are explicitly and intentionally aligned with each outcome. Curriculum map indicates increasing levels of proficiency.

Assessment Planning

There is no formal plan for assessing each outcome.

The program relies on short-term planning, such as selecting which outcome(s) to assess in the current year.

The program has a reasonable, multi-year assessment plan that identifies when each outcome will be assessed. The plan may explicitly include analysis and implementation of improvements.

The program has a fully-articulated, sustainable, multi-year assessment plan that describes when and how each outcome will be assessed and how improvements based on findings will be implemented. The plan is routinely examined and revised, as needed.

The Student Experience

Students know little or nothing about the overall outcomes of the program. Communication of outcomes to students, e.g. in syllabi or catalog, is spotty or nonexistent.

Students have some knowledge of program outcomes. Communication is occasional and informal, left to individual faculty or advisors.

Students have a good grasp of program outcomes. They may use them to guide their own learning. Outcomes are included in most syllabi and are readily available in the catalog, on the web page, and elsewhere.

Students are well-acquainted with program outcomes and may participate in creation and use of rubrics. They are skilled at self-assessing in relation to the outcomes and levels of performance. Program policy calls for inclusion of outcomes in all course syllabi, and they are readily available in other program documents.

212


How Visiting Team Members Can Use the Learning Outcomes Rubric

Conclusions should be based on a review of learning outcomes and assessment plans. Although you can make some preliminary judgments about alignment based on examining the curriculum or a curriculum map, you will have to interview key departmental representatives, such as department chairs, faculty, and students, to fully evaluate the alignment of the learning environment with the outcomes.

The rubric has five major dimensions:

1. Comprehensive List. The set of program learning outcomes should be a short but comprehensive list of the most important knowledge, skills, and values students learn in the program, including relevant institution-wide outcomes such as those dealing with communication skills, critical thinking, or information literacy. Faculty generally should expect higher levels of sophistication for graduate programs than for undergraduate programs, and they should consider national disciplinary standards, if available, when developing and refining their outcomes. There is no strict rule concerning the optimum number of outcomes, but quality is more important than quantity. Faculty should not confuse learning processes (e.g., completing an internship) with learning outcomes (what is learned in the internship, such as application of theory to real-world practice). Questions. Is the list reasonable, appropriate and well-organized? Are relevant institution-wide outcomes, such as information literacy, included? Are distinctions between undergraduate and graduate outcomes clear? Have national disciplinary standards been considered when developing and refining the outcomes? Are explicit criteria – as defined in a rubric, for example – available for each outcome?

2. Assessable Outcomes. Outcome statements should specify what students can do to demonstrate their learning. For example, an outcome might state that “Graduates of our program can collaborate effectively to reach a common goal” or that “Graduates of our program can design research studies to test theories and examine issues relevant to our discipline.” These outcomes are assessable because faculty can observe the quality of collaboration in teams, and they can review the quality of student-created research designs. Criteria for assessing student products or behaviors usually are specified in rubrics, and the department should develop examples of varying levels of student performance (i.e., work that does not meet expectations, meets expectations, and exceeds expectations) to illustrate levels. Questions. Do the outcomes clarify how students can demonstrate learning? Have the faculty agreed on explicit criteria, such as rubrics, for assessing each outcome? Do they have examples of work representing different levels of mastery for each outcome?

3. Alignment. Students cannot be held responsible for mastering learning outcomes unless they have participated in a program that systematically supports their development. The curriculum should be explicitly designed to provide opportunities for students to develop increasing sophistication with respect to each outcome. This design often is summarized in a curriculum map—a matrix that shows the relationship between courses in the required curriculum and the program’s learning outcomes. Pedagogy and grading should be aligned with outcomes to foster and encourage student growth and to provide students helpful feedback on their development. Since learning occurs within and outside the classroom, relevant student services (e.g., advising and tutoring centers) and co-curriculum (e.g., student clubs and campus events) should be designed to support the outcomes. Questions. Is the curriculum explicitly aligned with the program outcomes? Do faculty select effective pedagogy and use grading to promote learning? Are student support services and the co-curriculum explicitly aligned to promote student development of the learning outcomes?

4. Assessment Planning. Faculty should develop explicit plans for assessing each outcome. Programs need not assess every outcome every year, but faculty should have a plan to cycle through the outcomes over a reasonable period of time, such as the period for program review cycles (a simple way to lay out such a cycle is sketched after this list). Questions. Does the plan clarify when, how, and how often each outcome will be assessed? Will all outcomes be assessed over a reasonable period of time? Is the plan sustainable, in terms of human, fiscal, and other resources? Are assessment plans revised, as needed?

5. The Student Experience. At a minimum, students should be aware of the learning outcomes of the program(s) in which they are enrolled; ideally, they should be included as partners in defining and applying the outcomes and the criteria for levels of sophistication. Thus it is essential to communicate learning outcomes to students consistently and meaningfully. Questions: Are the outcomes communicated to students? Do students understand what the outcomes mean and how they can further their own learning? Do students use the outcomes and criteria to self-assess? Do they participate in reviews of outcomes, criteria, curriculum design, or related activities?
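As a small illustration of the multi-year planning described in item 4, the Python sketch below rotates a set of outcomes across a review cycle so that each one is assessed at least once. The outcome names, years, and cycle length are hypothetical, and an actual plan would also record how each outcome is assessed and how findings feed improvements.

# Minimal sketch (hypothetical outcomes): spreading program outcomes across a
# multi-year cycle so every outcome is assessed within one review period.
from itertools import cycle

def build_assessment_plan(outcomes, start_year, years_in_cycle):
    """Assign each outcome to a year, wrapping around when there are more
    outcomes than years in the review cycle."""
    plan = {start_year + i: [] for i in range(years_in_cycle)}
    year_iter = cycle(sorted(plan))
    for outcome in outcomes:
        plan[next(year_iter)].append(outcome)
    return plan

program_outcomes = ["Disciplinary knowledge", "Written communication",
                    "Critical thinking", "Ethical reasoning", "Teamwork",
                    "Information literacy"]

for year, assessed in build_assessment_plan(program_outcomes, 2010, 3).items():
    print(year, assessed)
# 2010 ['Disciplinary knowledge', 'Ethical reasoning']
# 2011 ['Written communication', 'Teamwork']
# 2012 ['Critical thinking', 'Information literacy']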

213


Rubric for Evaluating General Education Assessment Process (Draft for Use as Pilot, June 7, 2008)

Criterion: Initial / Emerging / Developed / Highly Developed

GE Outcomes

GE learning outcomes have not yet been developed for the entire GE program; there may be one or two common ones, e.g., writing, critical thinking.

Learning outcomes have been developed for the entire GE program, but the list is too long, too short, or inappro-priate. Outcomes do not lend themselves to demonstrations of student learning.

The list is a well-organized set of reasonable outcomes that focus on the most important knowledge, skills, and values students learn in the GE program. Outcomes express how students can demonstrate their learning. Work to define levels of performance is beginning.

The list of outcomes is reasonable and appropriate. Outcomes describe how students can demonstrate their learning. Faculty have agreed on explicit criteria, such as rubrics, for assessing students’ level of mastery and have identified exemplars of student performance at varying levels for each outcome.

Curriculum Alignment with Outcomes

There is no clear relationship between the outcomes and the GE curriculum. Students may not have the opportunity to develop each outcome.

Students appear to be given reasonable opportunities to develop each of the GE outcomes. Curriculum map may indicate opportunities to acquire outcomes.

The curriculum is explicitly designed to provide opportunities for students to learn and to develop increasing sophistication with respect to each outcome. Design may be summarized in a curriculum map that shows “beginning,” “intermediate” and “advanced” treatment of outcomes.

Pedagogy, grading, the curriculum, and relevant student support services and the co-curriculum are explicitly aligned with GE outcomes.

Assessment Planning

There is no formal plan for assessing each GE outcome.

GE assessment relies on short-term planning, such as selecting which outcome(s) to assess in the current year. Interpretation and use of findings for improvement are implicit rather than planned or funded.

The campus has a reasonable, multi-year assessment plan that identifies when each GE outcome will be assessed. The plan includes specific mechanisms for interpretation and use of findings for improvement.

The campus has a fully-articulated, sustainable, multi-year assessment plan that describes when and how each outcome will be assessed. The plan is routinely examined and revised, as needed, based on experience and feedback from external reviewers. The campus uses some form of comparative data (e.g., own past record, aspirational goals, external benchmarking).

Assessment Implementation

It is not clear that potentially valid evidence for each GE outcome is collected and/or individual reviewers use idiosyncratic criteria to assess student work.

Appropriate evidence is collected and faculty have discussed relevant criteria for assessing each outcome. Those who assess student work are calibrated to apply assessment criteria in the same way or faculty routinely check for inter-rater reliability.

Appropriate evidence is collected and faculty use explicit criteria, such as rubrics, to assess student attainment of each outcome. Those who assess student work are calibrated to apply assessment criteria in the same way, and faculty routinely check for inter-rater reliability.

Assessment criteria, such as rubrics, have been pilot-tested and refined over time; and they usually are shared with students. Those who assess student work are calibrated, and faculty routinely find high inter-rater reliability. Faculty take comparative data into account when interpreting results and deciding on changes to improve learning.

214


Use of Results

Results for GE outcomes are collected, but they are not discussed among relevant faculty. There is little or no collective use of findings.

Results for each GE outcome are collected and discussed by relevant faculty; results have been used occasionally to improve the GE program.

Results for each outcome are collected, discussed by relevant faculty and others, and regularly used to improve the GE program.

Relevant faculty routinely discuss results, plan needed changes, secure necessary resources, and implement changes. They may collaborate with others, such as librarians or Student Affairs professionals, to improve the program. Follow-up studies confirm that changes have improved learning.

215


How Visiting Team Members Can Use the GE Assessment Rubric

Conclusions should be based on review of the GE program’s written assessment record and discussion with relevant campus representatives (e.g., GE chair, GE Assessment Coordinator, faculty who teach GE courses). Discussion should validate that the reality matches the written record.

The rubric has five major dimensions:

1. GE Outcomes. The set of GE learning outcomes should be a comprehensive list of the most important knowledge, skills, and values students learn in the GE program. There is no strict rule concerning the optimum number of outcomes, but quality is more important than quantity. Faculty should not confuse learning processes (e.g., completing a science lab) with learning outcomes (what is learned in the science lab, such as ability to apply the scientific method). Outcome statements should specify what students do to demonstrate their learning. For example, an outcome might state that “Students who complete the GE program can explain major concepts and theories in at least two social science disciplines.” This outcome is assessable because faculty can rate the quality of students’ explanations. Criteria for assessing student work usually are specified in rubrics, and faculty should identify examples of varying levels of student performance, such as work that does not meet expectations, that meets expectations, and that exceeds expectations. Questions. Is the list of outcomes reasonable and appropriate? Do the outcomes express how students can demonstrate learning? Have faculty agreed on explicit criteria, such as rubrics, for assessing each outcome? Do they have exemplars of work representing different levels of mastery for each outcome?

2. Curriculum Alignment. Students cannot be held responsible for mastering learning outcomes unless the GE program systematically

supports their development. The GE curriculum should be explicitly designed to provide opportunities for students to develop increasing sophistication with respect to each outcome. This design often is summarized in a curriculum map—a matrix that shows the relationship between GE courses and GE learning outcomes. Pedagogy and grading should align with outcomes to foster growth and provide students helpful feedback on their development. Relevant student services (e.g., advising and tutoring centers) and the co-curriculum (e.g., student clubs and campus events) should also be designed to support development of the learning outcomes, since learning occurs outside the classroom as well as within it. Questions. Is the GE curriculum explicitly aligned with program outcomes? Do faculty select effective pedagogies and use grading to promote learning? Are student support services and the co-curriculum explicitly aligned to promote student development of GE learning outcomes?

3. Assessment Planning. Faculty should develop explicit, sustainable plans for assessing each GE outcome. They need not assess every

outcome every year, but they should have a plan to cycle through the outcomes over a reasonable period of time, such as the period for program review cycles. Experience and feedback from external reviewers should guide plan revision. Questions. Does the campus have a GE assessment plan? Does the plan clarify when, how, and how often each outcome will be assessed? Will all outcomes be assessed over a reasonable period of time? Is the plan sustainable? Supported by appropriate resources? Are plans revised, as needed, based on experience and feedback from external reviewers? Does the plan include collection of comparative data?

4. Assessment Implementation. GE assessment data should be valid and reliable. A valid assessment of a particular outcome leads to

accurate conclusions concerning students’ achievement of that outcome. Sometimes campuses collect assessment data that do not have the


potential to be valid. For example, a multiple-choice test may not collect information that allows faculty to make judgments about students’ ability to explain phenomena. Assessment requires the collection of valid evidence and judgments about that evidence that are based on agreed-upon criteria that specify how to identify work that meets or exceeds expectations. These criteria usually are specified in rubrics. Well-qualified judges should reach the same conclusions about individual student’s achievement of a learning outcome, demonstrating inter-rater reliability. If two judges independently assess a set of materials, their ratings can be correlated. Sometimes a discrepancy index is used. How often do the two raters give identical ratings, ratings one point apart, ratings two points apart, etc.? Data are reliable if the correlation is high and/or if the discrepancies are small. Raters generally are calibrated (“normed”) to increase reliability. Calibration usually involves a training session in which raters apply rubrics to pre-selected examples of student work that vary in quality; then they reach consensus about the rating each example should receive. The purpose is to ensure that all raters apply the criteria in the same way so that each student’s product would receive the same score, regardless of rater. Faculty may take external benchmarking data or other comparative data into account when interpreting results. Questions: Do GE assessment studies systematically collect valid evidence for each targeted outcome? Do faculty use agreed-upon criteria such as rubrics for assessing the evidence for each outcome? Do they share the criteria with their students? Are those who assess student work calibrated in the use of assessment criteria? Does the campus routinely document high inter-rater reliability? Do faculty pilot test and refine their assessment processes? Do they take external benchmarking (comparison) data into account when interpreting results?

5. Use of Results. Assessment is a process designed to monitor and improve learning, so assessment findings should have an impact. Faculty

should reflect on results for each outcome and decide if they are acceptable or disappointing. If results do not meet faculty standards, faculty (and others, such as student affairs personnel, librarians, tutors) should determine which changes should be made, e.g., in pedagogy, curriculum, student support, or faculty support. Questions: Do faculty collect assessment results, discuss them, and reach conclusions about student achievement? Do they develop explicit plans to improve student learning? Do they implement those plans? Do they have a history of securing necessary resources to support this implementation? Do they collaborate with other campus professionals to improve student learning? Do follow-up studies confirm that changes have improved learning?

217


PORTFOLIOS
Rubric for Assessing the Use of Portfolios for Assessing Program Learning Outcomes

Criterion: Initial / Emerging / Developed / Highly Developed

Clarification of Students’ Task

Instructions to students for portfolio development provide insufficient detail for them to know what faculty expect. Instructions may not identify outcomes to be addressed in the portfolio.

Students receive some written instructions for their portfolios, but they still have problems determining what is required of them and/or why they are compiling a portfolio.

Students receive written instructions that describe faculty expectations in detail and include the purpose of the portfolio, types of evidence to include, role of the reflective essay (if required), and format of the finished product.

Students in the program understand the portfolio requirement and the rationale for it, and they view the portfolio as helping them develop self-assessment skills. Faculty may monitor the developing portfolio to provide formative feedback and/or advise individual students.

Valid Results

It is not clear that valid evidence for each relevant outcome is collected and/or individual reviewers use idiosyncratic criteria to assess student work.

Appropriate evidence is collected for each outcome, and faculty have discussed relevant criteria for assessing each outcome.

Appropriate evidence is collected for each outcome; faculty use explicit criteria, such as agreed-upon rubrics, to assess student attainment of each outcome. Rubrics are usually shared with students.

Assessment criteria, e.g., in the form of rubrics, have been pilot-tested and refined over time; they are shared with students, and students may have helped develop them. Feedback from external reviewers has led to refinements in the assessment process. The department also uses external benchmarking data.

Reliable Results

Those who review student work are not calibrated to apply assessment criteria in the same way, and there are no checks for inter-rater reliability.

Reviewers are calibrated to apply assessment criteria in the same way or faculty routinely check for inter-rater reliability.

Reviewers are calibrated to apply assessment criteria in the same way, and faculty routinely check for inter-rater reliability.

Reviewers are calibrated; faculty routinely find that assessment data have high inter-rater reliability.

Results Are Used

Results for each outcome are collected, but they are not discussed among the faculty.

Results for each outcome are collected and discussed by the faculty, but results have not been used to improve the program.

Results for each outcome are collected, discussed by faculty, and used to improve the program.

Faculty routinely discuss results, plan needed changes, secure necessary resources, and implement changes. They may collaborate with others, such as librarians or Student Affairs professionals, to improve student learning. Students may also participate in discussions and/or receive feedback, either individual or in the aggregate. Follow-up studies confirm that changes have improved learning.

If e-Portfolios Are Used

There is no technical support for students or faculty to learn the software or to deal with problems.

There is informal or minimal formal support for students and faculty.

Formal technical support is readily available and proactively assists in learning the software and solving problems.

Support is readily available, proactive, and effective. Tech support personnel may also participate in refining the overall portfolio process.

218


How Visiting Team Members Can Use the Portfolio Rubric

Portfolios can serve many purposes besides assessment; in fact, these other purposes are actually much more common. Portfolios may be compiled so students can share their work with family and friends. They may be designed to build students’ confidence by showing development over time or by displaying best work. They may be used for advising and career counseling, or so students can show their work during a job interview. The first thing a team needs to do is determine that the portfolios are used for assessment, and not for another purpose. Conclusions about the quality of the assessment process should be based on discussion with relevant department members (e.g., chair, assessment coordinator, faculty, students) and a review of the program’s written portfolio assignment. Two common types of portfolios are:

• Showcase portfolios—collections of each student’s best work
• Developmental portfolios—collections of work from early, middle, and late stages in the student’s academic career that demonstrate growth

Faculty generally require students to include a reflective essay that describes how the evidence in the portfolio demonstrates their achievement of program learning outcomes. Sometimes faculty monitor developing portfolios to provide formative feedback and/or advising to students, and sometimes they collect portfolios only as students near graduation. Portfolio assignments should clarify the purpose of the portfolio, what kinds of evidence should be included, and the format (e.g., paper vs. e-portfolios); and students should view the portfolio as contributing to their personal development.

The rubric has five dimensions, the fifth of which applies only when e-portfolios are used:

1. Clarification of Students’ Task. Most students have never created a portfolio, and they need explicit guidance. Questions. Does the portfolio assignment provide sufficient detail so students understand the purpose, the types of evidence to include, the learning outcomes to address, the role of the reflective essay (if any), and the required format? Do students view the portfolio as contributing to their ability to self-assess? Do faculty use the developing portfolios to assist individual students?

2. Valid Results. Sometimes portfolios lack valid evidence for assessing particular outcomes. For example, portfolios may not allow faculty to assess how well students can deliver oral presentations. Judgments about that evidence need to be based on well-established, agreed-upon criteria that specify (usually in rubrics) how to identify work that meets or exceeds expectations. Questions: Do the portfolios systematically include valid evidence for each targeted outcome? Are faculty using well-established, agreed-upon criteria, such as rubrics, to assess the evidence for each outcome? Have faculty pilot tested and refined their process? Are criteria shared with students? Are they collaborating with colleagues at other institutions to secure benchmarking (comparison) data?

3. Reliable Results. Well-qualified judges should reach the same conclusions about a student’s achievement of a learning outcome, demonstrating inter-rater reliability. If two judges independently assess a set of materials, their ratings can be correlated. Sometimes a discrepancy index is used. How often do the two raters give identical ratings, ratings one point apart, ratings two points apart, etc.? Data are reliable if the correlation is high and/or if discrepancies are small. (A brief computational sketch of these checks follows this list.) Raters generally are calibrated (“normed”) to increase reliability. Calibration usually involves a training session in which raters apply rubrics to pre-selected examples of student work that vary in quality, then reach consensus about the rating each example should receive. The purpose is to ensure that all raters apply the criteria in the same way so that each student’s product would receive the same score, regardless of rater. Questions: Are reviewers calibrated? Are checks for inter-rater reliability made? Is there evidence of high inter-rater reliability?

4. Results Are Used. Assessment is a process designed to monitor and improve learning, so assessment findings should have an impact. Faculty should reflect on results for each outcome and decide if they are acceptable or disappointing. If results do not meet their standards, faculty should determine what changes should be made, e.g., in pedagogy, curriculum, student support, or faculty support. Questions: Do faculty collect assessment results, discuss them, and reach conclusions about student achievement? Do they develop explicit plans to improve student learning? Do they implement those plans? Do they have a history of securing necessary resources to support this implementation? Do they collaborate with other campus professionals to improve student learning? Do follow-up studies confirm that changes have improved learning?

5. If e-Portfolios Are Used. Faculty and students alike require support, especially when a new software program is introduced. Lack of support can lead to frustration and failure of the process. Support personnel may also have useful insights into how the portfolio assessment process can be refined. Questions: What is the quality and extent of technical support? Of inclusion in review and refinement of the portfolio process? What is the overall level of faculty and student satisfaction with the technology and support services?
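The reliability checks described in item 3 can be computed directly from two raters' scores. The Python sketch below is illustrative only (standard library; statistics.correlation requires Python 3.10 or later), and the portfolio scores are hypothetical.

# Minimal sketch (hypothetical scores): the checks described in item 3 -- a
# correlation between two raters' independent rubric scores and a simple
# discrepancy index (how often ratings agree exactly, differ by 1, etc.).
from statistics import correlation  # Python 3.10+
from collections import Counter

# Two calibrated raters scoring the same ten portfolios on a 1-4 rubric.
rater_a = [4, 3, 2, 4, 3, 1, 2, 3, 4, 2]
rater_b = [4, 3, 3, 4, 2, 1, 2, 3, 4, 2]

r = correlation(rater_a, rater_b)  # Pearson r; high values suggest reliable ratings
discrepancies = Counter(abs(a - b) for a, b in zip(rater_a, rater_b))

print(f"Pearson r = {r:.2f}")
print("Discrepancy index:", dict(sorted(discrepancies.items())))
# e.g. Pearson r = 0.90 and {0: 8, 1: 2}: the raters agreed exactly on eight
# portfolios and differed by one point on two.

Whether such numbers count as "high" reliability is a faculty judgment; the point of the sketch is only that the checks named in the rubric are easy to run once two sets of independent scores exist.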

219


CAPSTONES
Rubric for Assessing the Use of Capstone Experiences for Assessing Program Learning Outcomes

Criterion: Initial / Emerging / Developed / Highly Developed

Relevant Outcomes and Lines of Evidence Identified

It is not clear which program outcomes will be assessed in the capstone course.

The relevant outcomes are identified, e.g., ability to integrate knowledge to solve complex problems; however, concrete plans for collecting evidence for each outcome have not been developed.

Relevant outcomes are identified. Concrete plans for collecting evidence for each outcome are agreed upon and used routinely by faculty who staff the capstone course.

Relevant evidence is collected; faculty have agreed on explicit criteria statements, e.g., rubrics, and have identified examples of student performance at varying levels of mastery for each relevant outcome.

Valid Results

It is not clear that potentially valid evidence for each relevant outcome is collected and/or individual faculty use idiosyncratic criteria to assess student work or performances.

Faculty have reached general agreement on the types of evidence to be collected for each outcome; they have discussed relevant criteria for assessing each outcome but these are not yet fully defined.

Faculty have agreed on concrete plans for collecting relevant evidence for each outcome. Explicit criteria, e.g., rubrics, have been developed to assess the level of student attainment of each outcome.

Assessment criteria, such as rubrics, have been pilot-tested and refined over time; they usually are shared with students. Feedback from external reviewers has led to refinements in the assessment process, and the department uses external benchmarking data.

Reliable Results

Those who review student work are not calibrated to apply assessment criteria in the same way; there are no checks for inter-rater reliability.

Reviewers are calibrated to apply assessment criteria in the same way or faculty routinely check for inter-rater reliability.

Reviewers are calibrated to apply assessment criteria in the same way, and faculty routinely check for inter-rater reliability.

Reviewers are calibrated, and faculty routinely find assessment data have high inter-rater reliability.

Results Are Used

Results for each outcome may or may not be collected. They are not discussed among faculty.

Results for each outcome are collected and may be discussed by the faculty, but results have not been used to improve the program.

Results for each outcome are collected, discussed by faculty, analyzed, and used to improve the program.

Faculty routinely discuss results, plan needed changes, secure necessary resources, and implement changes. They may collaborate with others, such as librarians or Student Affairs professionals, to improve results. Follow-up studies confirm that changes have improved learning.

The Student Experience

Students know little or nothing about the purpose of the capstone or outcomes to be assessed. It is just another course or requirement.

Students have some knowledge of the purpose and outcomes of the capstone. Communication is occasional, informal, left to individual faculty or advisors.

Students have a good grasp of purpose and outcomes of the capstone and embrace it as a learning opportunity. Information is readily available in advising guides, etc.

Students are well-acquainted with purpose and outcomes of the capstone and embrace it. They may participate in refining the experience, outcomes, and rubrics. Information is readily available.

220


How Visiting Team Members Can Use the Capstone Rubric

Conclusions should be based on discussion with relevant department members (e.g., chair, assessment coordinator, faculty). A variety of capstone experiences can be used to collect assessment data, such as:

• courses, such as senior seminars, in which advanced students are required to consider the discipline broadly and integrate what they have learned in the curriculum
• specialized, advanced courses
• advanced-level projects conducted under the guidance of a faculty member or committee, such as research projects, theses, or dissertations
• advanced-level internships or practica, e.g., at the end of an MBA program

Assessment data for a variety of outcomes can be collected in such courses, particularly outcomes related to integrating and applying the discipline, information literacy, critical thinking, and research and communication skills.

The rubric has five major dimensions:

1. Relevant Outcomes and Evidence Identified. It is likely that not all program learning outcomes can be assessed within a single capstone course or experience. Questions: Have faculty explicitly determined which program outcomes will be assessed in the capstone? Have they agreed on concrete plans for collecting evidence relevant to each targeted outcome? Have they agreed on explicit criteria, such as rubrics, for assessing the evidence? Have they identified examples of student performance for each outcome at varying performance levels (e.g., below expectations, meeting, exceeding expectations for graduation)?

2. Valid Results. A valid assessment of a particular outcome leads to accurate conclusions concerning students’ achievement of that outcome. Sometimes faculty collect evidence that does not have the potential to provide valid conclusions. For example, a multiple-choice test will not provide evidence of students’ ability to deliver effective oral presentations. Assessment requires the collection of valid evidence and judgments about that evidence that are based on well-established, agreed-upon criteria that specify how to identify low, medium, or high-quality work. Questions: Are faculty collecting valid evidence for each targeted outcome? Are they using well-established, agreed-upon criteria, such as rubrics, for assessing the evidence for each outcome? Have faculty pilot tested and refined their process based on experience and feedback from external reviewers? Are they sharing the criteria with their students? Are they using benchmarking (comparison) data?

3. Reliable Results. Well-qualified judges should reach the same conclusions about an individual student's achievement of a learning outcome, demonstrating inter-rater reliability. If two judges independently assess a set of materials, their ratings can be correlated. Sometimes a discrepancy index is used: how often do the two raters give identical ratings, ratings one point apart, ratings two points apart, etc.? Data are reliable if the correlation is high and/or if the discrepancies are small (see the brief computational sketch after this list). Raters generally are calibrated (“normed”) to increase reliability. Calibration usually involves a training session in which raters apply rubrics to pre-selected examples of student work that vary in quality, then reach consensus about the rating each example should receive. The purpose is to ensure that all raters apply the criteria in the same way so that each student’s product receives the same score, regardless of rater. Questions: Are reviewers calibrated? Are checks for inter-rater reliability made? Is there evidence of high inter-rater reliability?

4. Results Are Used. Assessment is a process designed to monitor and improve learning, so assessment findings should have an impact. Faculty should reflect on results for each outcome and decide if they are acceptable or disappointing. If results do not meet faculty standards, faculty should determine which changes should be made, e.g., in pedagogy, curriculum, student support, or faculty support. Questions: Do faculty collect assessment results, discuss them, and reach conclusions about student achievement? Do they develop explicit plans to improve student learning? Do they implement those plans? Do they have a history of securing necessary resources to support this implementation? Do they collaborate with other campus professionals to improve student learning? Do follow-up studies confirm that changes have improved learning?

5. The Student Experience. Students should understand the purposes different educational experiences serve in promoting their learning and development and know how to take advantage of them; ideally they should also participate in shaping those experiences. Thus it is essential to communicate to students consistently and include them meaningfully. Questions: Are purposes and outcomes communicated to students? Do they understand how capstones support learning? Do they participate in reviews of the capstone experience, its outcomes, criteria, or related activities?
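To make the inter-rater reliability checks described in dimension 3 concrete, the short Python sketch below correlates two raters' scores on the same set of capstone papers and tallies a simple discrepancy index. It is a minimal illustration rather than a prescribed procedure: the scores and rater labels are hypothetical, and it assumes Python 3.10 or later for statistics.correlation; a department could run the same check in a spreadsheet or any statistics package.

# Illustrative sketch only (not part of the rubric): checking inter-rater
# reliability for two raters who scored the same ten capstone papers on a
# four-point rubric. All scores below are hypothetical.

from statistics import correlation  # Pearson's r; available in Python 3.10+
from collections import Counter

rater_a = [4, 3, 3, 2, 4, 1, 3, 2, 4, 3]  # hypothetical scores from rater A
rater_b = [4, 3, 2, 2, 4, 2, 3, 2, 3, 3]  # hypothetical scores from rater B

# Correlation: a high value suggests the two raters rank the papers similarly.
r = correlation(rater_a, rater_b)

# Discrepancy index: how many papers were rated 0, 1, 2, ... points apart?
gaps = Counter(abs(a - b) for a, b in zip(rater_a, rater_b))
n = len(rater_a)

print(f"Pearson r = {r:.2f}")
for gap in sorted(gaps):
    print(f"Ratings {gap} point(s) apart: {gaps[gap]} of {n} papers ({gaps[gap] / n:.0%})")

A high correlation together with mostly zero- or one-point gaps would support a claim of reliable results; frequent two- or three-point gaps would suggest the raters need recalibration.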

221


PROGRAM REVIEW Rubric for Assessing the Integration of Student Learning Assessment into Program Reviews

Criterion Initial Emerging Developed Highly Developed

Required Elements of the Self-Study

Program faculty may be required to provide a list of program-level student learning outcomes.

Faculty are required to provide the program’s student learning outcomes and summarize annual assessment findings.

Faculty are required to provide the program’s student learning outcomes, annual assessment studies, findings, and resulting changes. They may be required to submit a plan for the next cycle of assessment studies.

Faculty are required to evaluate the program’s student learning outcomes, annual assessment findings, benchmarking results, subsequent changes, and evidence concerning the impact of these changes. They present a plan for the next cycle of assessment studies.

Process of Review

Internal and external reviewers do not address evidence concerning the quality of student learning in the program other than grades.

Internal and external reviewers address indirect and possibly direct evidence of student learning in the program; they do so at the descriptive level, rather than providing an evaluation.

Internal and external reviewers analyze direct and indirect evidence of student learning in the program and offer evaluative feedback and suggestions for improvement. They have sufficient expertise to evaluate program efforts; departments use the feedback to improve their work.

Well-qualified internal and external reviewers evaluate the program’s learning outcomes, assessment plan, evidence, benchmarking results, and assessment impact. They give evaluative feedback and suggestions for improvement. The department uses the feedback to improve student learning.

Planning and Budgeting

The campus has not integrated program reviews into planning and budgeting processes.

The campus has attempted to integrate program reviews into planning and budgeting processes, but with limited success.

The campus generally integrates program reviews into planning and budgeting processes, but not through a formal process.

The campus systematically integrates program reviews into planning and budgeting processes, e.g., through negotiating formal action plans with mutually agreed-upon commitments.

Annual Feedback on Assessment Efforts

No individual or committee on campus provides feedback to departments on the quality of their outcomes, assessment plans, assessment studies, impact, etc.

An individual or committee occasionally provides feedback on the quality of outcomes, assessment plans, assessment studies, etc.

A well-qualified individual or committee provides annual feedback on the quality of outcomes, assessment plans, assessment studies, etc. Departments use the feedback to improve their work.

A well-qualified individual or committee provides annual feedback on the quality of outcomes, assessment plans, assessment studies, benchmarking results, and assessment impact. Departments effectively use the feedback to improve student learning. Follow-up activities enjoy institutional support.

The Student Experience

Students are unaware of and uninvolved in program review.

Program review may include focus groups or conversations with students to follow up on results of surveys.

The internal and external reviewers examine samples of student work, e.g., sample papers, portfolios and capstone projects. Students may be invited to discuss what they learned and how they learned it.

Students are respected partners in the program review process. They may offer poster sessions on their work, demonstrate how they apply rubrics to self-assess, and/or provide their own evaluative feedback.

222


How Visiting Team Members Can Use the Program Review Rubric

Conclusions should be based on a review of program-review documents and discussion with relevant campus representatives, such as department chairs, deans, and program review committees. The rubric has five major dimensions:

1. Self-Study Requirements. The campus should have explicit requirements for the program’s self-study, including an analysis of the program’s learning outcomes and a review of the annual assessment studies conducted since the last program review. Faculty preparing the self-study should reflect on the accumulating results and their impact; and they should plan for the next cycle of assessment studies. As much as possible, programs should benchmark findings against similar programs on other campuses. Questions: Does the campus require self-studies that include an analysis of the program’s learning outcomes, assessment studies, assessment results, benchmarking results, and assessment impact, including the impact of changes made in response to earlier studies? Does the campus require an updated assessment plan for the subsequent years before the next program review?

2. Self-Study Review. Internal reviewers (on-campus individuals, such as deans and program review committee members) and external reviewers (off-campus individuals, usually disciplinary experts) should evaluate the program’s learning outcomes, assessment plan, assessment evidence, benchmarking results, and assessment impact; and they should provide evaluative feedback and suggestions for improvement. Questions: Who reviews the self-studies? Do they have the training or expertise to provide effective feedback? Do they routinely evaluate the program’s learning outcomes, assessment plan, assessment evidence, benchmarking results, and assessment impact? Do they provide suggestions for improvement? Do departments effectively use this feedback to improve student learning?

3. Planning and Budgeting. Program reviews should not be pro forma exercises; they should be tied to planning and budgeting processes, with expectations that increased support will lead to increased effectiveness, such as improving student learning and retention rates. Questions: Does the campus systematically integrate program reviews into planning and budgeting processes? Are expectations established for the impact of planned changes?

4. Annual Feedback on Assessment Efforts. Campuses moving into the culture of evidence often find considerable variation in the quality of assessment efforts across programs, and waiting for years to provide feedback to improve the assessment process is unlikely to lead to effective campus practices. While program reviews encourage departments to reflect on multi-year assessment results, some programs are likely to require more immediate feedback, usually based on a required, annual assessment report. This feedback might be provided by an Assessment Director or Committee, relevant Dean or Associate Dean, or others; and whoever has this responsibility should have the expertise to provide quality feedback. Questions: Does someone have the responsibility for providing annual feedback on the assessment process? Does this person or team have the expertise to provide effective feedback? Does this person or team routinely provide feedback on the quality of outcomes, assessment plans, assessment studies, benchmarking results, and assessment impact? Do departments effectively use this feedback to improve student learning?

5. The Student Experience. Students have a unique perspective on a given program of study: they know better than anyone what it means to go through it as a student. Program review should take advantage of that perspective and build it into the review. Questions: Are students aware of the purpose and value of program review? Are they involved in preparations and the self-study? Do they have an opportunity to interact with internal or external reviewers, demonstrate and interpret their learning, and provide evaluative feedback?

223


The Educational Effectiveness Framework: Capacity and Effectiveness as They Relate to Student and Institutional Learning

Key Descriptive Terms

ELEMENT & DEFINITION INITIAL EMERGING DEVELOPED HIGHLY DEVELOPED

Learning

A. Student learning outcomes established; communicated in syllabi and publications; cited and used by faculty, student affairs, advisors, others (CFRs 2.2, 2.4):

For only a few programs and units; only vaguely (if at all) for GE; not communicated in syllabi, or publications such as catalogues, view books, guides to the major; only a few faculty know and use for designing curriculum, assignments, or assessment

For many programs and units, most aspects of GE; beginning to be communicated in basic documents; beginning to be used by some faculty for design of curriculum, assignments, assessments

For all units (academic & co-curricular), and for all aspects of GE; cited often but not in all appropriate places; most faculty cite; used in most programs for design of curriculum, assignments, and assessment

For all units (academic and co-curricular), and for all aspects of GE; cited widely by faculty and advisors; used routinely by faculty, student affairs, other staff in design of curricula, assignments, co-curriculum, and assessment

B. Expectations are established for how well (i.e., proficiency or level) students achieve outcomes (CFRs 2.1, 2.4, 2.5):

Expectations for student learning have not been set beyond course completion and GPA; level of learning expected relative to outcomes unclear

Expectations for level of learning explicit in a few programs; heavy reliance on course completion and GPA

Expectations for student learning explicit in most programs

Expectations for student learning are explicit in all programs, widely known and embraced by faculty, staff, and students

C. Assessment plans are in place; curricular and co-curricular outcomes are systematically assessed, improvements documented (CFRs 2.4, 2.7):

No comprehensive assessment plans. Outcomes assessed occasionally using surveys and self reports, seldom using direct assessment; rarely lead to revision of curriculum, pedagogy, co-curriculum, or other aspects of educational experience

Some planning in place. Outcomes assessed occasionally, principally using surveys; beginning to move toward some direct assessment; occasionally leads to improvements in educational experience; improvements sporadically documented, e.g., in units’ annual reports.

Plans mostly in place. Assessment occurs periodically, using direct methods supplemented by indirect methods and descriptive data; educational experience is frequently improved based on evidence and findings; improvements are routinely documented, e.g. in units’ annual reports

Assessment plans throughout institution. Assessment occurs on regular schedule using multiple methods; strong reliance on direct methods, performance-based; educational experience systematically reviewed and improved based on evidence and findings; documentation widespread and easy to locate.

D. Desired kind and level of learning is achieved (CFR 2.6):

Possible that learning is not up to expectations, and/or expectations set by institution are too low for degree(s) offered by the institution

Most students appear to achieve at levels set by the institution; faculty and other educators beginning to discuss expectations and assessment findings

Nearly all students achieve at or above levels set by institution; assessment findings discussed periodically by most faculty and other campus educators

All students achieve at or above levels set by institution; findings are discussed regularly and acted upon by all or nearly all faculty and other campus educators

Teaching/Learning Environment

A. Curricula, pedagogy, co-curriculum, other aspects of educational experience are aligned with outcomes (2.1, 2.2, 2.3, 2.4, 2.5, 4.6):

Conceived exclusively or largely in terms of inputs (e.g. library holdings, lab space), curricular requirements (e.g., for majors, GE) and availability of co-curricular programs; not visibly aligned with outcomes or expectations for level of student achievement; evidence of alignment processes lacking

Educational experience beginning to be aligned with learning outcomes and expectations for student achievement; evidence of alignment efforts available in some academic and co-curricular programs

Educational experience generally aligned with learning outcomes, expectations for student achievement; alignment becoming intentional, systematic, supported by tools (e.g. curriculum maps) and processes. Evidence of alignment efforts generally available

Educational experience fully aligned with learning outcomes, expectations; alignment is systematic, supported by tools and processes as well as broader institutional infrastructure. Evidence of alignment efforts readily available

B. Curricular and co-curricular processes (CFRs 2.1, 2.2, 2.3, 2.11, 2.13) are:

Rarely informed by good learning practices as defined by the wider higher education community; few curricular or co-curricular activities reviewed, mostly without reference to outcomes or evidence of student learning

Informed in some instances by good learning practices; curricula and co-curricular activities occasionally reviewed and improved but with little reference to outcomes or assessment findings

Informed in many cases by good learning practices; reviewed and improved by relevant faculty and other campus educators; often based on outcomes and assessment findings

Regularly informed by good learning practices; improvements consistently result from scholarly reflection on outcomes and assessment findings by relevant faculty and other campus educators

224



C. Professional development, rewards (CFRs 2.8, 2.9):

Little or no support for faculty, other campus educators to develop expertise in assessment of student learning, related practices; work to assess, improve student learning plays no positive role in reward system, may be viewed as a negative

Some support for faculty, other educators on campus to develop expertise in assessment of student learning, related practices; modest, implicit positive role in reward system

Some support for faculty, other campus educators to develop expertise in assessment of student learning, related practices; explicit, positive role in reward structure

Significant support for faculty, other campus educators to develop expertise in assessment of student learning, related practices; explicit, prominent role in reward structure

Organizational Learning

A. Indicators of educational effectiveness are (CFRs 1.2, 4.3, 4.4):

Notable by their absence or considered only sporadically in decision-making

Found in some areas; dissemination of performance results just beginning; no reference to comparative data

Multiple, with data collected regularly, disseminated, collectively analyzed; some comparative data used. Some indicators used to inform planning, budgeting, other decision making on occasional basis

Multiple, with data collected regularly, disseminated widely, collectively analyzed; comparative data used, as appropriate, in all programs. Indicators consistently used to inform planning, budgeting, other decision making at all levels of the institution

B. Formal program review (CFRs 2.7, 4.4) is:

Rare, if it occurs at all, with little or no useful data generated. Assessment findings on student learning not available and/or not used

Occasional, in some departments or units; heavy reliance on traditional inputs as indicators of quality; findings occasionally used to suggest improvements in educational effectiveness; weak linkage to institution-level planning, budgeting

Frequent, affecting most academic and co-curricular units, with growing inclusion of findings about student learning; unit uses findings to collectively reflect on, improve effectiveness; some linkage to institution-level planning, budgeting

Systematic and institution-wide, with learning assessment findings a major component; units use findings to improve student learning, program effectiveness, and supporting processes; close linkage to institution-level planning, budgeting

C. Performance data, evidence, and analyses (CFRs 4.3, 4.5, 4.6) are:

Not collected, disseminated, disaggregated, or accessible for wide use. Not evident in decision-making processes; do not appear to be used for improvement in any programs

Limited collection, dissemination, disaggregation, or access. Campus at beginning stages of use for decisions to improve educational effectiveness at program, unit, and/or institutional level

Systematic collection and dissemination, wide access; sometimes disaggregated; usually considered by decision-making bodies at all levels, but documentation and/or linkage to educational effectiveness may be weak

Systematic collection and dissemination, and access, purposeful disaggregation; consistently used by decision-making bodies for program improvement at all levels, with processes fully documented

D. Culture of inquiry and evidence (CFRs 4.5, 4.6, 4.7):

Faculty, other educators, staff, institutional leaders, governing board not visibly committed to a culture of inquiry and evidence except in isolated cases; not knowledgeable about learner-centeredness, assessment, etc.

Campus knowledge is minimal; support – at top levels and/or grass roots – for development of a culture of inquiry and evidence is sporadic and uneven

Campus knowledge and support for a culture of inquiry and evidence fairly consistent across administration, faculty, professional staff but may not be uniformly deep

Consistent, knowledgeable, deep commitment to creating and sustaining a culture of inquiry and evidence in all appropriate functions at all levels

E. Communication and transparency (CFR 1.2, 1.7):

Little or no data, findings, analyses from assessment of student learning available within the institution or to external audiences

Some data, findings, analyses from assessment of student learning available but may be incomplete, difficult to access or understand for internal or external audiences

Data, findings, analyses from assessment of student learning generally available, easily accessible; chosen for relevance to multiple audiences

Data, findings, analyses from learning assessment are widely available and skillfully framed to be understandable, useful to multiple audiences

Overall: The institution can best be described as:

Committed to isolated aspects of educational effectiveness; if other areas are not addressed, continuing reaffirmation of accreditation is threatened

Committed to educational effectiveness in some areas; significant number of areas require attention, improvement

Mostly well-established commitment to educational effectiveness; a few areas require attention, improvement

Fully committed to and going beyond WASC recommendations; operates at an exemplary level in addressing its Core Commitments to capacity as it relates to learning and to educational effectiveness

225


RUBRICS

ACCJC

226



Accrediting Commission for Community and Junior Colleges
Western Association of Schools and Colleges

Rubric for Evaluating Institutional Effectiveness – Part I: Program Review

(See attached instructions on how to use this rubric.)

Levels of Implementation

Characteristics of Institutional Effectiveness in Program Review (Sample institutional behaviors)

Awareness

• There is preliminary investigative dialogue at the institution or within some departments about what data or process should be used for program review.
• There is recognition of existing practices and models in program review that make use of institutional research.
• There is exploration of program review models by various departments or individuals.
• The college is implementing pilot program review models in a few programs/operational units.

Development

• Program review is embedded in practice across the institution using qualitative and quantitative data to improve program effectiveness.
• Dialogue about the results of program review is evident within the program as part of discussion of program effectiveness.
• Leadership groups throughout the institution accept responsibility for program review framework development (Senate, Admin., etc.).
• Appropriate resources are allocated to conducting program review of meaningful quality.
• Development of a framework for linking results of program review to planning for improvement.
• Development of a framework to align results of program review to resource allocation.

Proficiency

• Program review processes are in place and implemented regularly.
• Results of all program review are integrated into institution-wide planning for improvement and informed decision-making.
• The program review framework is established and implemented.
• Dialogue about the results of all program reviews is evident throughout the institution as part of discussion of institutional effectiveness.
• Results of program review are clearly and consistently linked to institutional planning processes and resource allocation processes; college can demonstrate or provide specific examples.
• The institution evaluates the effectiveness of its program review processes in supporting and improving student achievement and student learning outcomes.

Sustainable Continuous Quality Improvement

• Program review processes are ongoing, systematic and used to assess and improve student learning and achievement.
• The institution reviews and refines its program review processes to improve institutional effectiveness.
• The results of program review are used to continually refine and improve program practices resulting in appropriate improvements in student achievement and learning.

227



Accrediting Commission for Community and Junior Colleges

Western Association of Schools and Colleges

Rubric for Evaluating Institutional Effectiveness – Part II: Planning

(See attached instructions on how to use this rubric.)

Levels of Implementation

Characteristics of Institutional Effectiveness in Planning

(Sample institutional behaviors)

Awareness

• The college has preliminary investigative dialogue about planning processes.
• There is recognition of case need for quantitative and qualitative data and analysis in planning.
• The college has initiated pilot projects and efforts in developing systematic cycle of evaluation, integrated planning and implementation (e.g. in human or physical resources).
• Planning found in only some areas of college operations.
• There is exploration of models and definitions and issues related to planning.
• There is minimal linkage between plans and a resource allocation process, perhaps planning for use of "new money"
• The college may have a consultant-supported plan for facilities, or a strategic plan.

Development

• The Institution has defined a planning process and assigned responsibility for implementing it.
• The Institution has identified quantitative and qualitative data and is using it.
• Planning efforts are specifically linked to institutional mission and goals.
• The Institution uses applicable quantitative data to improve institutional effectiveness in some areas of operation.
• Governance and decision-making processes incorporate review of institutional effectiveness in mission and plans for improvement.
• Planning processes reflect the participation of a broad constituent base.

Proficiency

• The college has a well documented, ongoing process for evaluating itself in all areas of operation, analyzing and publishing the results and planning and implementing improvements.
• The institution's component plans are integrated into a comprehensive plan to achieve broad educational purposes, and improve institutional effectiveness.
• The institution effectively uses its human, physical, technology and financial resources to achieve its broad educational purposes, including stated student learning outcomes.
• The college has documented assessment results and communicated matters of quality assurance to appropriate constituencies (documents data and analysis of achievement of its educational mission).
• The institution assesses progress toward achieving its education goals over time (uses longitudinal data and analyses).
• The institution plans and effectively incorporates results of program review in all areas of educational services: instruction, support services, library and learning resources.
• Program review processes are ongoing, systematic and used to assess and improve student learning and achievement.

Sustainable Continuous Quality Improvement

• The institution uses ongoing and systematic evaluation and planning to refine its key processes and improve student learning.
• There is dialogue about institutional effectiveness that is ongoing, robust and pervasive; data and analyses are widely distributed and used throughout the institution.
• There is ongoing review and adaptation of evaluation and planning processes.
• There is consistent and continuous commitment to improving student learning; and educational effectiveness is a demonstrable priority in all planning structures and processes.

228



Accrediting Commission for Community and Junior Colleges
Western Association of Schools and Colleges

Rubric for Evaluating Institutional Effectiveness – Part III: Student Learning Outcomes

(See attached instructions on how to use this rubric.)

Levels of Implementation

Characteristics of Institutional Effectiveness in Student Learning Outcomes

(Sample institutional behaviors)

Awareness

• There is preliminary, investigative dialogue about student learning outcomes.
• There is recognition of existing practices such as course objectives and how they relate to student learning outcomes.
• There is exploration of models, definitions, and issues taking place by a few people.
• Pilot projects and efforts may be in progress.
• The college has discussed whether to define student learning outcomes at the level of some courses or programs or degrees; where to begin.

Development

• College has established an institutional framework for definition of student learning outcomes (where to start), how to extend, and timeline.
• College has established authentic assessment strategies for assessing student learning outcomes as appropriate to intended course, program, and degree learning outcomes.
• Existing organizational structures (e.g. Senate, Curriculum Committee) are supporting strategies for student learning outcomes definition and assessment.
• Leadership groups (e.g. Academic Senate and administration) have accepted responsibility for student learning outcomes implementation.
• Appropriate resources are being allocated to support student learning outcomes and assessment.
• Faculty and staff are fully engaged in student learning outcomes development.

Proficiency

• Student learning outcomes and authentic assessment are in place for courses, programs and degrees.
• Results of assessment are being used for improvement and further alignment of institution-wide practices.
• There is widespread institutional dialogue about the results.
• Decision-making includes dialogue on the results of assessment and is purposefully directed toward improving student learning.
• Appropriate resources continue to be allocated and fine-tuned.
• Comprehensive assessment reports exist and are completed on a regular basis.
• Course student learning outcomes are aligned with degree student learning outcomes.
• Students demonstrate awareness of goals and purposes of courses and programs in which they are enrolled.

Sustainable Continuous Quality Improvement

• Student learning outcomes and assessment are ongoing, systematic and used for continuous quality improvement.
• Dialogue about student learning is ongoing, pervasive and robust.
• Evaluation and fine-tuning of organizational structures to support student learning is ongoing.
• Student learning improvement is a visible priority in all practices and structures across the college.
• Learning outcomes are specifically linked to program reviews.


229


SLO, Rubrics, & Assessment

http://www.apus.edu/Learning-Outcomes-Assessment/Initiatives/Rubrics-Program/Rubrics-Program-Overview.htm
http://www4.nau.edu/assessment/assessment/liberal/documents/Oral_Comm_discussion_rubric.pdf
http://www.greenriver.edu/learningoutcomes/StudentLearningOutcomes.htm
http://pandora.cii.wwu.edu/cii/resources/writing/writing_rubric.asp
http://www.uncwil.edu/cas/documents/Elaboratedcompetencies3.pdf
http://www.mscd.edu/~ssac/SLO-IdentificationMatrix.pdf
http://www.rcampus.com/indexrubric.cfm
http://www.uwlax.edu/gened/GEAC%20Overview%20of%20Course%20Embedded%20Assessment%20Tasks%20&%20Rubrics.pdf
http://www4.nau.edu/assessment/assessment/liberal/documents/Effective_writing_rubrics.pdf
http://online.fresnocitycollege.edu/senate/curriculum/slo.html
http://www.cabrillo.edu/~tsmalley/learneroutcomes.html
http://pro.cabrillo.edu/slos/
http://www.montgomerycollege.edu/outcomes/documents/sloa_handbook.pdf
http://www.elcamino.edu/academics/slo/
http://columbia.yosemite.cc.ca.us/slo/slohome.htm
http://www.oxnardcollege.edu/faculty_staff/student_learning_outcomes.shtml
http://www.league.org/league/projects/lcp/lcp3/Learning_Outcomes.htm
http://www.contracosta.edu/Library/sites/slos.htm
http://www.aacu.org/documents/New_Leadership_Statement.pdf
http://www.laccd.edu/inst_effectiveness/Student_Learning/
http://www.cos.edu/view_page.asp?nodeid=3138&parentid=933&moduleid=5

230


http://www.laney.peralta.edu/apps/comm.asp?Q=31028
http://www.pima.edu/acad_services/slo/
http://www.uri.edu/assessment/media/public/page_files/uri/outcomes/student/outcomes/outcomes_tools/SLO%20201%202.pdf
http://aslo.lbcc.edu/
http://pro.cabrillo.edu/slos/docs/SLO_InstructionalPlanningAssessmentProcess.pdf
http://css.rpgroup.org/ka.php?ka_id=7
http://research.crc.losrios.edu/Instructions%20for%20Writing%20Student%20Learning%20Outcomes.htm
http://www.grossmont.edu/student_learning_outcomes/writing_slos.asp
http://www.lcc.hawaii.edu/userfiles/file/accreditation/presentations/retreat_06_3_dashboards.pdf
http://www.dvc.edu/org/departments/research/slo/
http://www.pasadena.edu/slo/
http://www.foothill.fhda.edu/schedule/learning_outcomes.php
http://www.merritt.edu/apps/comm.asp?$1=40770
http://www.mtsac.edu/instruction/outcomes/
http://www.goldenwestcollege.edu/slo/
http://www.compton.edu/studentservices/slo.aspx
http://www.deltacollege.edu/dept/dsps/StudentLearningOutcomes.html
http://4faculty.org/docs/slo_resources.htm
https://plan.elac.edu/public/Shared%20Documents/Student%20Learning%20Outcomes.aspx
http://www.uwlax.edu/gened/Outcomes.htm
http://cai.cc.ca.us/workshops/SLOFocusOnResults.doc
http://www.sloassessment.com/

231


Rubric Resources

http://www.rubrician.com/writing.htm
http://www.teach-nology.com/web_tools/rubrics/
http://www.shambles.net/pages/staff/rubrics/
http://www.scribd.com/doc/11541381/Persuasive-Writing-Rubric
http://www.teach-nology.com/web_tools/rubrics/persuade/
http://www.pz.harvard.edu/research/RubricsSelfPE.htm
http://www.google.com/#hl=en&q=problem+solving+rubric&revid=1640595612&ei=Yf8_SsLYN4HKsQOwjM2VDw&sa=X&oi=revisions_inline&resnum=0&ct=broad-revision&cd=5&fp=cbonjxemLj4
http://www.schreyerinstitute.psu.edu/pdf/ProblemSolvingRubric1.pdf
http://www.nden.k12.wi.us/tlcf/prob3.htm
http://www.uen.org/Rubric/rubric.cgi?rubric_id=13
http://www.google.com/#hl=en&q=sentence+writing+rubric&revid=1640595612&ei=Yf8_SsLYN4HKsQOwjM2VDw&sa=X&oi=revisions_inline&resnum=0&ct=broad-revision&cd=4&fp=cbonjxemLj4
http://www.education.vermont.gov/new/html/pgm_curriculum/literacy/writing/benchmarks.html
http://www.google.com/#hl=en&q=writing+conventions+rubric&revid=1640595612&ei=Yf8_SsLYN4HKsQOwjM2VDw&sa=X&oi=revisions_inline&resnum=0&ct=broad-revision&cd=3&fp=cbonjxemLj4
http://www.neiu.edu/~neassess/gened.htm
http://www.englishcompanion.com/pdfDocs/foundationskills.pdf

232


Assessment Quickies: Student Learning Outcome Assessment in Ten Easy Steps

233


ASSESSMENT QUICKIES: STUDENT LEARNING OUTCOME ASSESSMENT IN TEN EASY STEPS

By Michelle Saint-Germain, Director, Program Review and Assessment, California State University, Long Beach*

What is it? A set of ten podcasts on how to do assessment
Where is it? Available (free!) on iTunesU (http://itunesu.csulb.edu/)
What do I need? Download iTunes player on your computer or iPod
How do I get it? Find the podcasts and subscribe to all
What do I get? Power-point slides and voice narration
Can I share it? Yes, it is open to anyone who wants to use it
Feedback? Send questions/comments to [email protected]

File# Title File Type

001 What Are Student Learning Outcomes? MPEG-4 video file

001T What Are Student Learning Outcomes? Transcript (Word)

002 Writing Student Learning Outcome Statements MPEG-4 video file

002T Writing Student Learning Outcome Statements Transcript (Word)

003 Levels of Student Learning MPEG-4 video file

003T Levels of Student Learning Transcript (Word)

004 Mapping Student Learning Outcomes to the Curriculum MPEG-4 video file

004T Mapping Student Learning Outcomes to the Curriculum Transcript (Word)

005 Choosing Assessment Measures MPEG-4 video file

005T Choosing Assessment Measures Transcript (Word)

006 Matching Assessment to Teaching and Learning MPEG-4 video file

006T Matching Assessment to Teaching and Learning Transcript (Word)

007 Collecting Assessment Evidence of Student Learning MPEG-4 video file

007T Collecting Assessment Evidence of Student Learning Transcript (Word)

008 Analyzing Evidence of Student Learning MPEG-4 video file

008T Analyzing Evidence of Student Learning Transcript (Word)

009 Using Evidence of Student Learning for Program Improvement MPEG-4 video file

009T Using Evidence of Student Learning for Program Improvement Transcript (Word)

010 Why Assess Student Learning? MPEG-4 video file

010T Why Assess Student Learning? Transcript (Word)

*These podcasts reflect the opinions of the author only and do not constitute any official policy or process of the California State University, Long Beach.

234


Notes

235


Notes

236


April 21-23, 2010, The Westin Long Beach, CA

SUSTAINABILITY: A VISION FOR HIGHER EDUCATION

2010 Academic Resource Conference (ARC)
Sponsored by ACSCU in collaboration with ACCJC