
YU Learning Assessment

Office of the Provost

Our Commitment

Welcome to the Yeshiva University Learning Assessment website. At Yeshiva University we are committed to student learning assessment to ensure that our colleges and schools, programs and majors, and courses are successfully fulfilling their educational missions, goals, and objectives. This website provides faculty and staff with assessment-related tools and resources to guide the development and implementation of effective learning assessment plans.

The purpose of the AAC is to promote and support YU’s learning assessment efforts by:

  • Fostering a positive assessment culture throughout the University
  • Supporting and facilitating University-wide assessment activities such as (1) disseminating assessment information across YU colleges/schools, including identifying best models and practices, and (2) collecting, documenting, and sharing assessment information for program/major improvement

The Committee meets at least once each semester. Its members are as follows:

  • Dr. Rachel Ebner (Chair), Director of Student Learning Assessment; Clinical Assistant Professor of Psychology
  • Dr. Selma Botman, Provost and Senior Vice President for Academic Affairs
  • Dr. Timothy Stevens, Special Assistant to the Provost of Academic Affairs and Faculty Services
  • Dr. Yuxiang Liu, Director of Institutional Research
  • Dr. Karon Bacon, Dean of Undergraduate Faculty of Arts and Sciences
  • Michael Strauss, Associate Dean of Sy Syms School of Business
  • Dr. Avi Giloni, Associate Dean of Sy Syms School of Business
  • Dr. Shalom Holtz, Associate Dean of Academic Affairs
  • Rachael Dylenski, Executive Director of Academic Programs for the Katz School of Science and Health

FAQ

The Assessment Cycle

Assessment is "the systematic and ongoing process of gathering, analyzing, and using information from multiple sources to draw inferences about the characteristics of students, programs, or an institution for the purpose of making informed decisions to improve the learning process" (Linn & Miller, 2005). The principle that assessment is a systematic and continuous process, not an end product, is central to this definition.

Assessment is...

  • a cyclical process, not an end goal
  • planned and systematic, not random and variable
  • ongoing and cumulative, not a single point in time
  • multifaceted, not singular
  • informative, not a judgment
  • objective, not subjective
  • transparent, not unclear or hidden
  • pragmatic, not useless
  • faculty designed and implemented, not imposed from the top down

Assessment occurs at three levels:

  • Classroom assessment: assessing an individual student’s learning experience in a course
  • Program assessment: assessing a group of students’ learning experiences in relation to a program, departmental major, or unit of study
  • Institutional assessment: assessing campus-wide factors

“If you don’t know where you are going, the best-made maps won’t help you get there” (Mager, 1997, p. vi).

  1. Assessment promotes self-reflection, which is essential for effective teaching and learning (Assessment: FAQ, Stanford University). It helps you to reflect on:
    • What goals you are trying to accomplish
    • How well you are meeting those goals
    • How you can improve
  2. Accreditation: Middle States Commission on Higher Education Standard 14: Assessment of Student Learning—“Assessment of student learning demonstrates that, at graduation, or other appropriate points, the institution’s students have knowledge, skills, and competencies consistent with institutional and appropriate higher education goals” (MSCHE, 2009; MSCHE.org).

Assessment involves collecting evidence of student learning and attainment of intended learning outcomes. To develop a more complete understanding of the extent of student learning, multiple pieces of evidence are needed. Evidence of student learning can be direct or indirect. To obtain the best indication of student learning, a combination of direct and indirect measures should be used.

  • Direct assessment: evidence based on directly examining and measuring students’ performance (e.g., exams, projects, papers, portfolio assignments, oral presentations, fieldwork observations)
  • Indirect assessment: evidence based on reports of perceived student learning (e.g., surveys and interviews with students, employers, faculty)

Using assessment results means taking action: applying what the results show to make program-level improvements or decisions. This might include:

  1. Revising your program-level outcomes
  2. Changing curricula by adding or removing courses or program experiences, requiring prerequisite courses, changing instructional methods or assignments within courses
  3. Creating or modifying assessments
  4. Creating or modifying rubrics
  5. Using assessment results to support current program practices or to make other program policies or decisions

Transparency showcases evidence of student learning from program experiences and enables you to reflect on how effectively program practices are meeting student outcomes.

Please contact us if there is any aspect of this website or student learning assessment that you would like to discuss.

Rachel J. Ebner, PhD
Director of Student Learning Assessment
Belfer Hall 1300A; 215 Lex. Room 606
212.960.5400, ext. 6138
rachel.ebner@yu.edu

Assessment Toolkit Resources

  1. Clearly define program’s/major’s mission
  2. Identify student outcome learning goals that directly align with program’s/major’s mission
  3. Define learning goals by stating objectives
  4. Map out which program/major courses and learning experiences will enable students to achieve program/major goals (curriculum mapping)
  5. Devise a program/major assessment plan and timeline
  6. Identify which goals you are going to assess and when
  7. Develop comprehensive methods for both directly and indirectly assessing students’ attainment of those goals (NOTE: no one assessment can evidence learning—multiple pieces of evidence are needed)
  8. Develop corresponding scoring rubrics to ensure consistency and accuracy in scoring of assessments (NOTE: rubrics are not the assessment, but a tool for scoring assessments)
  9. Implement the assessment plan and continuously monitor its effectiveness, making changes or improvements when necessary
  10. Analyze assessment results and communicate/report findings
  11. Use assessment results to inform and improve program’s/major’s effectiveness in meeting learning goals and objectives
  12. Document steps 1-11

YU Program/Major Assessment Guidelines

How to Write Missions from the University of Connecticut’s assessment website

How to Write Learning Goals from the University of Connecticut’s assessment website
