
Assessing

How are your students doing? Are they learning what you want them to learn? Is your teaching working the way you want it to work? How engaged are your students? Are your assignments and exams measuring the kind of learning and content you care about most?

What is it?

Assessing group work involves evaluating a group's achievement of learning outcomes through collaborative assignments or projects (product), that group's ability to work together (process), and/or the individual group members' contributions. Approaches to group assessment may combine student self-assessment, peer ratings, and instructor evaluation strategies.

Why should you do it?

Scoring individual effort and the group work process in addition to the team's final product makes the overall assessment more equitable and holds all students accountable.

How do you do it?

Process

Let students know in advance what will be assessed—i.e., which aspects of the final group product, the group work process, and individual contributions will be assessed; when they will be assessed—i.e., how often during the group work period; and who will be assessing them. Use rubrics, Likert-scale surveys, the plus-delta exercise, and/or other instruments to guide the group work assessment. Recommendations from multiple sources (Carnegie Mellon, n.d.; Duke University, n.d.; Oakley et al., 2004; Stanford University, 1999) are organized below by who does the assessing (self, peer, or instructor) and what is assessed (individual contributions, the group's end product, or the group work process). Select the pieces that work for group work in your class.

 

Self-assessment

  • Individual contributions: Use quizzes to evaluate what each student has learned, as well as reflective self-reports of their own contributions.
  • Group's end product: Plus-delta—ask students the relative value of their contributions toward the final product, and to report what they might do to improve the final product.
  • Group work process: Ask students to document and assess their own participation in the group work process.

Peer assessment

  • Individual contributions: Use evaluation forms or quizzes for each student to rate the contributions of the other team members.
  • Group's end product: Use evaluation forms or survey questions for students to rate the value of each team member's contributions to the end product.
  • Group work process: Ask students to document and rate each other's participation in the group work process.

Instructor assessment

  • Individual contributions: Ask students to note which contributions they made (e.g., with initials or highlighting), or use the history tab in a wiki to review individual work on a project.
  • Group's end product: Use a rubric to evaluate how well the group's end product demonstrates achievement of the learning outcomes.
  • Group work process: Include peer input as a percentage of students' grades.

Applications/Examples
  • Plus-delta exercise (in-class, online, or both): Use the plus-delta exercise to ask students to evaluate their team's progress periodically throughout the period when they work in groups. Provide worksheets with two columns or use a discussion forum that requires students to answer two separate questions: "What is working?" and "What can be improved?"
  • Peer assessment (in-class, online, or both): Consider having students assign to each team member a specific percentage of the total group effort, including themselves. The percentages must add up to 100. To make peer assessment fair for weaker students, ask them instead to assess the "team citizenship" of each member—i.e., give each student a fixed number of points to assign to the different team members for completing tasks, helping others, etc. For example, give each person on a 4-person team 40 points to give to themselves and their peers. If they all did equal work, each student will receive an average peer rating of 10. Use each team member's average peer rating to weight their grade for the team project overall (Oakley et al., 2004, p. 17).
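To make the arithmetic concrete, here is a minimal sketch in Python of the "team citizenship" weighting described above. The team names, point values, and the choice to use each member's average rating divided by 10 as the grade weight are illustrative assumptions, not a prescribed formula.

```python
# A minimal sketch of the "team citizenship" weighting described above.
# Assumptions: names and point values are made up; the weight is each
# member's average rating divided by 10 (the "equal work" baseline on a
# 4-person team where every rater distributes 40 points).

points_awarded = {
    # rater -> points that rater gives to each member (including themselves);
    # each rater's points must sum to 40 on a 4-person team
    "Ana":   {"Ana": 10, "Ben": 12, "Chris": 10, "Dee": 8},
    "Ben":   {"Ana": 11, "Ben": 10, "Chris": 11, "Dee": 8},
    "Chris": {"Ana": 10, "Ben": 11, "Chris": 10, "Dee": 9},
    "Dee":   {"Ana": 10, "Ben": 12, "Chris": 10, "Dee": 8},
}

team_grade = 90  # grade earned by the group's final product
members = list(points_awarded)

for member in members:
    ratings = [points_awarded[rater][member] for rater in members]
    avg_rating = sum(ratings) / len(ratings)
    weight = avg_rating / 10  # 10 = average rating when everyone did equal work
    # Some instructors cap the weight (e.g., at 1.0) so no individual grade
    # exceeds the team grade; that choice is left to you.
    print(f"{member}: average peer rating {avg_rating:.2f}, "
          f"weighted project grade {team_grade * weight:.1f}")
```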

Want more information?

Peers @ SCU: If you're doing this, let us know!

Elizabeth Day uses the following team evaluation sheet in her course.

Kathryn Bruchmann, Psychology
Richard Trevisan, Management (Academic)
Theresa Conefrey, English
Steve Levy, Management (Academic)
Sandy Piderit, Management (Academic)
Resources for Tips & Tricks

Carnegie Mellon. (n.d.). How can I assess group work?

  • Provides strategies for assessing individual and group learning, and several samples of group project assessment tools.

CATME software from Purdue University is a teaming tool that helps you form teams based on a wide variety of criteria, including schedule, leadership style, gender, GPA, etc. The tool also includes a well-developed self- and team-assessment survey, and the survey results provide the instructor with a grading factor. It is recommended that teams complete the assessment activity mid-quarter so that conflicts can be addressed, with final grade factors based on a second assessment at the end of the term.

University of Wisconsin-Madison. (n.d.). Evaluating and Assessing Collaborative Group Work. Technology Enhanced Collaborative Group Work (TECGW) Resources for Faculty.

  • Shares numerous forms, rubrics and evaluation tools for assessing group work.
Resources for Deeper Learning

Oakley, B.; Felder, R.M.; Brent, R. & Elhajj, I. (2004). Turning Student Groups into Effective Teams. Journal of Student Centered Learning, 2(1), 9-34.

  • Contains self-assessment and peer assessment forms as appendices to the article.
References

Carnegie Mellon. (n.d.). How can I assess group work?

Duke University. (n.d.). Groups & Team-Based Learning. Teaching Strategies.

Oakley, B.; Felder, R.M.; Brent, R. & Elhajj, I. (2004). Turning Student Groups into Effective Teams. Journal of Student Centered Learning, 2(1), 9-34.

Stanford University. (1999, Winter). Cooperative Learning: Students Working in Small Groups. Speaking of Teaching, 10(2).

Interested in getting your hands DRTy with us? Add to this page, create a new page or suggest new topics here! We thank Kevin Kelly at San Francisco State for this page.

What is it?

You are used to assessing students' knowledge and skills, but have you ever assessed students' attitudes or engagement? Both types of assessment are often done through surveys. Assessing students' attitudes is part of a larger domain called affective learning, which also includes emotions, feelings and values. Instruments to assess attitudes may ask students to rate how relevant they feel the course content is to their academic goals, career goals, or personal lives (Hall, 2012). In other cases, they may ask students to state their beliefs about their own capacity to learn (see metacognition).

The National Survey of Student Engagement (NSSE) provides a useful definition of the latter:

Student engagement represents two critical features of collegiate quality. The first is the amount of time and effort students put into their studies and other educationally purposeful activities. The second is how the institution deploys its resources and organizes the curriculum and other learning opportunities to get students to participate in activities that decades of research studies show are linked to student learning.

The second NSSE factor addresses students' engagement with their peers, faculty, and the campus environment, but also their active role in their own learning—e.g., how challenged they feel, if they have adopted learning strategies, and the extent to which they reflect on and integrate what they have learned.

The two aspects of student engagement listed above—effort and participation in the learning process—are directly influenced by students' attitudes and motivation. "Motivation and desire represent the very foundation of learning. If students don't want to learn, there will be no learning. If they feel unable to learn, there will be no learning. Desire and motivation are not academic achievement characteristics. They are affective characteristics" (Stiggins, as cited in Hall, 2012).

Why should you do it?

W. James Popham (as cited in Hall, 2012) tells us why we should assess attitudes: "The reason such affective variables as students' attitudes, interests and values are important to us is that those variables typically influence future behavior. The reason we want to promote positive attitudes toward learning is because students who have positive attitudes toward learning today will be inclined to pursue learning in the future."  

In practical terms, assessing students' attitudes about a subject can help predict their performance and guide your efforts to increase their motivation, participation, and, ultimately, achievement. Assessing attitudes about an instructional strategy or activity can provide insight for redesigning an activity or part of your course; if the attitudes are positive, you may be encouraged to use that strategy more often. Keep in mind that students who are the first in their family to go to college may not have the same attitudes as other students. Assessing engagement can tell you things like how much time on task students spend on your course and how they are participating in the learning activities you create. The results may inspire you to make recommendations to students while they can still change their behaviors, or, on the flip side, to encourage them to keep up high levels of engagement.

How do you do it?

Process

1. Prepare

Consider an affective learning outcome for your class, such as "Students will reflect on the influence their attitudes and engagement have on their performance throughout the quarter."

2. Engage your students

Conduct a survey that asks students questions about their attitudes about the course material in general (e.g., relevance), specific materials you have created (e.g., infographic, videos in a lecture), specific activities you use (e.g., polling with clickers, small group discussions, peer review), and/or learning in general (e.g., mindset, ability, motivation). Include questions about levels of engagement (e.g., time committed each week, time committed to a specific project, level of participation in small groups, related activity outside the classroom).

Applications / Examples
  • Student attitude and engagement survey (in-class, online, or both): Use clickers or an online survey tool to gather information from your students about attitudes and engagement.

  • Attitudes: Ask how they feel about the course content, the class activities, or learning.

  • Engagement: First define low-level thinking (e.g., memorization), mid-level thinking (e.g., application) and high-level thinking (e.g., evaluation, synthesis, creation). Then, ask students to rate the level of thinking required throughout your class. This could also be posed as "to what extent has this class challenged you to do your best work?" Ask them to give an average amount of time spent weekly preparing for class, on specific activities (e.g., reading), or on specific projects.

  • Formative feedback for hybrid or flipped class (in-class, online, or both): If you are trying a new course format for the first time (e.g., hybrid or flipped), you probably will not want to wait until the end of the term to find out what is working and what needs to change. Use the plus-delta exercise to focus specifically on aspects of the class format that are working or that require changes by you or the students. You can also use this opportunity to add your own comments if students do not hold themselves completely accountable. For example, as a plus, you can state that you appreciate when students come prepared for the flipped class session by reviewing the required materials. As a change, you can state that when some students come unprepared it can delay the activities and decrease the chances that everyone will reach the desired outcomes.

Want more information?

Peers @ SCU: If you're doing this, let us know!

Diana Morlang (Political Science), Theresa Conefrey (English), and Tonya Nilsson (Civil Engineering) have developed strategies for managing the new classroom spaces and use these rooms to improve student engagement and facilitate active learning.

Resources for Deeper Learning

Rimland, E. (2013). Assessing affective learning using a student response system. Libraries and the Academy, 13(4), 385-401. 

  • Describes a study of student attitudes about information competence modules and practices, using clickers to collect the data.

Saxon, D. P., Levine-Brown, P., & Boylan, H. R. (2008). Affective Assessment for Developmental Students, Part 1. Research in Developmental Education, 22(1).

  • Lists a wide variety of affective assessment instruments with information about each one.

UCLA Higher Education Research Institute

  • Shares research on Students' First College Year, Diverse Learning Environments, Freshmen and Senior Experiences, and more.

  • Communicates aggregated survey results via infographics

University of Minnesota. (n.d.). Student Engagement Instrument – Adaptations for Postsecondary Students.


What is it?

Learning from assessment, or assessment for learning, is a formative assessment strategy that allows you and your students to evaluate progress and possible areas for improvement. Assessment can be used for two purposes.

  • In assessment OF learning scenarios, you evaluate students' competencies at that point in time (e.g., midterm or final exams).
  • In assessment FOR learning scenarios, assessment strategies are a part of the students' learning process.

 

Why should you do it?

Being involved in the learning process through iterative assessment strategies can increase student motivation and achievement. If students identify which concepts they do not understand before high-stakes midterm or final exams, they can take steps to improve before it is too late. Assessment for learning strategies also provide you with information about which course topics students do and do not understand well.

How do you do it?

Process

Assessment for learning can be done through a classroom assessment technique; a low-stakes (or no-stakes) assessment or student self-assessment, conducted online or as an assignment; or a peer review activity.

1. Plan assessment for learning activities

Some activities lend themselves to formative assessment more than others.

2. Engage students

It is important to let students know a) that you are using assessment for learning strategies and b) how they can use the results to improve.

You can include students in the assessment for learning process in a number of ways, such as:

  • Create a rubric to guide students through a self-assessment or peer review activity.
  • Create dummy examples or use examples with permission from students who took your class previously.
  • Discuss assessment strategies before and after. For example, prepare students to conduct a self-assessment or peer review exercise by going over the rubric with them and using the rubric to evaluate an example paper. After the activity, ask them what worked and what they would change about the self-assessment or peer review process. Be sure to ask how they think it helped them to learn the underlying course concepts.
  • Ask students to create a learning journal with entries about how assessment for learning activities did or did not help them reach the learning objectives and why or why not.
Applications/Examples
  • Immediate feedback (online): Most quiz tools—inside or outside a learning management system (LMS) like Canvas (SCU's Camino)—allow you to construct feedback for correct and incorrect responses. Include tips for where students can get the information they need to answer the question correctly—e.g., "Incorrect. Review textbook Chapter 6, pp. 34-36, to find what you need to answer this question correctly next time." In the quiz settings, allow students to attempt the quiz multiple times so they can use the feedback and a low-stakes assessment process as learning tools (a schematic sketch of one such question appears after this list).
  • Assessment for learning with ePortfolios (example): The Journalism Department at San Francisco State University uses ePortfolios for students to collect their work. Some instructors require students to track and reflect on their own progress at each stage of the iterative writing process:
  • Outline
  • First draft
  • Peer review feedback
  • Second draft with revisions based on peer feedback
  • Final draft with revisions based on self-assessment with rubric
  • Instructor feedback
  • Post-final draft with revisions based on instructor feedback (not for grade, but for professional portfolio)
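As promised above, here is a schematic sketch in Python of the pieces most quiz tools let you configure for immediate feedback. This is not Canvas's actual data format or API; the field names are hypothetical placeholders used only to illustrate answer-specific feedback and multiple attempts.

```python
# Schematic only: hypothetical field names, not Canvas's real quiz format.
question = {
    "prompt": "Sample multiple-choice question goes here.",
    "choices": {"A": "First option", "B": "Second option", "C": "Third option"},
    "correct": "B",
    "feedback": {
        "correct": "Correct. Nice work.",
        "incorrect": ("Incorrect. Review textbook Chapter 6, pp. 34-36, to find "
                      "what you need to answer this question correctly next time."),
    },
    "allowed_attempts": 3,  # multiple attempts let students learn from the feedback
}

def respond(answer, q):
    """Return the feedback message a student would see for a given answer."""
    return q["feedback"]["correct" if answer == q["correct"] else "incorrect"]

print(respond("A", question))  # shows the remediation tip
print(respond("B", question))  # shows the confirmation
```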

Want more information?

Peers @ SCU: If you're doing this, let us know!

Jill Goodman Gould, English
Theresa Conefrey, English
Shelby McIntyre, Marketing (Academic)
Katy Bruchmann, Psychology
Sharmila Lodhia, Women's and Gender Studies
Robin Tremblay-McGaw, English
Deborah Ross, Ministerial Formation, Jesuit School of Theology

Resources for Tips & Tricks

Brown University – Sheridan Center. Assessment for Learning: Top Ten Tips.

  • Provides tips for creating and facilitating assessment for learning activities.
Resources for Deeper Learning

Education Services Australia. Assessment for Learning website.

  • Contains numerous assessment tasks (activities) in multiple disciplines as well as faculty professional development materials and related research.

Sambell, K. (2011, July). Rethinking feedback in higher education: An assessment for learning perspective. Bristol, England: ESCalate – HEA Subject Centre for Education.


What is it?

Similar to student ePortfolios, faculty ePortfolios are digital collections created by instructors over time that make visible their growth and achievements in teaching, research, service, and other aspects of their professional lives. A variety of artifacts, reflective statements, and feedback can be included to represent their experience and/or innovations in teaching, research, service, and professional development—e.g., syllabi, recorded lectures, instructional materials, course redesign reflections, conference presentations, publications, grant project applications and reports, committee reports, sample student work from their classes, and feedback from students and colleagues.

Why should you do it?

ePortfolios for faculty can serve formal functions, such as streamlining the retention, tenure, and/or promotion processes through digitizing and linking the required portfolio elements. They can also serve informal functions, such as reflecting critically on your own professional work; inviting feedback from peers, students, or other colleagues; or showcasing your innovations and successful strategies publicly to encourage adoption by others.

How do you do it?

Process

1. Collect - Pull together components of your teaching ePortfolio (Vanderbilt University, n.d.):

  • Personal statements, such as a teaching philosophy and/or your professional goals
  • Teaching artifacts, such as course descriptions, instructional materials, lecture or mini-lecture recordings, assessment strategies, and reflections about how each choice was designed to help students achieve specific learning outcomes.
  • Feedback, such as student evaluations of teaching, peer instructor feedback, and reflections about how you have made changes based on specific feedback
  • Evidence of student learning, such as links to student ePortfolios, specific project work from your classes (with the students' permission), and reflections about the levels of performance in your classes or how your guidance helped specific students achieve the outcomes.
  • Professional development, such as participating in workshops, teaching with others, mentoring, reading books or journals about some aspect of your work, and reflecting about how these activities contributed to course redesign efforts or other changes you have made.
  • Contributions on and beyond the campus, such as committee work, community service efforts, publications, conference presentations, and reflections on how these contributions connect to your goals and growth as a professional.

2. Select - Identify your audience(s)

  • You may want to tell different stories to different audiences. You will use different artifacts to help tell each story.

3. Reflect

  • While you may reuse some artifacts across the various contexts, your reflective statements should change to show how those artifacts are relevant to the reviewer(s).

4. Build & Publish - Decide how you want to organize your work.

  • You can use multiple ePortfolios or ePortfolio pages to tell different stories to different audiences, as described above. Some ePortfolios or pages may follow a prescribed format, such as for retention, tenure, or promotion committee reviews. Other ePortfolios or pages may be more open, including making the pages public for the world to see.
Applications/Examples

California State University system. (n.d.). Course Redesign with Technology - ePortfolio Showcase.

  • Links to dozens of faculty course redesign ePortfolios in a variety of disciplines.

Clemson University – ePortfolio Gallery

Penn State Brandywine – Dr. Laura Guertin's ePortfolio 

Want more information?

Peers @ SCU: If you're doing this, let us know!

Deborah Ross, Ministerial Formation, Jesuit School of Theology
Michael Whalen, Communication
Leslie Gray, Environmental Studies and Sciences

Resources @ SCU

SCU ePortfolio site 

Resources for Tips & Tricks

Reis, R. (n.d.). Building Your Teaching Portfolio. Tomorrow's Professor #483.

  • Outlines a process for building a teaching portfolio, which can be used for a teaching ePortfolio as well.

Reis, R. (1998, April 2). Items for Inclusion in a Teaching Portfolio. Tomorrow's Professor #13.

  • Lists around 50 items to consider sharing through your teaching ePortfolio.

Vanderbilt University. (n.d.). Teaching Portfolios.

  • Provides guidelines and considerations for creating a teaching portfolio or teaching ePortfolio.
Resources for Deeper Learning

Danowitz, E.S. (2012). On the Right Track: Using ePortfolios as Tenure Files. International Journal of ePortfolio, 2(1), 113-124.

  • Describes the benefits and challenges of using an ePortfolio for the tenure process.

Seldin, P.; Miller, J.E.; & Seldin, C.A. (2010). The Teaching Portfolio: A Practical Guide to Improved Performance and Promotion/Tenure Decisions. San Francisco: Jossey-Bass.

  • Shares ideas related to creating, sharing, and evaluating teaching portfolios, increasing buy-in, and more. Includes 20 sample portfolios in as many disciplines.

Zubizarreta, J. (n.d.). What is a Teaching Portfolio? A Primer.

  • Describes teaching portfolios and their contents, mentoring with teaching portfolios, and related references.


What is it?

In reflection activities, students consciously document their thoughts about or evaluation of different aspects related to learning, such as their own actions, experiences, or work products; collections of work in an ePortfolio; or the learning process itself. Evaluations of student reflections can be performed by instructors, students' peers, and the students themselves. However, reflection activities can be difficult to evaluate, as doing so involves assessing the reflective process as much as the content of the reflection itself.

Kember, McKay, Sinclair, and Wong (2008, pp. 370-5) outlined four categories for assessing the level of student reflection:

  • habitual action (non-reflection): "a procedure is followed without significant thought about it"
  • understanding: "attempting to reach an understanding of a concept or topic…[which] does not imply reflection"
  • reflection: "takes a concept and considers it in relation to personal experiences"
  • critical reflection: "a change in perspective over a fundamental belief of the understanding of a key concept or phenomenon"

Why should you do it?

Why use student reflection?

Dalal, Hakel, Sliter, and Kirkendall (2012, p. 75) stated that reflections are "recommended for enhancing retention and transfer of learned material." The University of Michigan Sweetland Center for Writing (n.d.) found that assigning reflection activities also can help students become more self-directed as learners (also see metacognition).

Why evaluate student reflection?

Since many students do not have experience with reflection, your evaluation and feedback guides them through an unfamiliar process. Receiving grades—even small numbers of points—for reflections also encourages students to complete reflections on a regular basis.

How do you do it?

Process

1. Prepare

  • Identify opportunities for assigning student reflections. Create prompts that urge students to consider course content in relation to their own experiences, or even challenge their own beliefs.
  • Find, modify, or create a rubric to evaluate student reflections.

2. Engage

University of Michigan (n.d.) suggests a number of student reflection strategies:

  • Prepare students to go through reflective activities.
  • Make the reflection process social or collaborative, so students can benefit from learning from others' experiences.
  • Ask students to reflect during an experience or activity, in addition to after it takes place.
Applications/Examples

Louisiana State University. (n.d.). Sample Reflection Paper Rubric

  • Example rubric provides criteria and definitions based on a service-learning context.

Van Gyn, G. (2013, May 6). The Little Assignment With the Big Impact: Reading, writing, critical reflection, and meaningful discussion. Faculty Focus [website].

  • Brief article describing a writing assignment that promotes students' critical thinking.

Tanner, K.D. (2012). Promoting Student Metacognition. CBE Life Sci. Educ. 11(2): 113-120. Available online.

  • The author argues that faculty should model metacognition in our teaching to help students learn to think like professionals in our disciplines (in her case, biology).

Want more information?

Peers @ SCU: If you're doing this, let us know!

Andrea Brewster, Undergraduate Studies
Deborah Ross, Ministerial Formation, Jesuit School of Theology
Michael Whalen, Communication
Leslie Gray, Environmental Studies and Sciences
Maura Tarnhoff, English

Resources for Tips & Tricks

Dalal, D.K.; Hakel, M.D.; Sliter, M.T.; & Kirkendall, S.R. (2012). Analysis of a Rubric for Assessing Depth of Classroom Reflections. International Journal of ePortfolio, 2(1), 75-85.

  • Shares a rubric to assess the depth of student reflections on a scale of 0 (not a reflection) to 5 (deep reflection), with definition and examples for each part of the scale. 

Jones, S. (n.d.). Assessment Rubric for Student Reflections from Howard University.

  • Offers a holistic rubric to assess student reflections according to five elements: clarity, relevance, analysis, interconnections, and self-criticism.
Resources for Deeper Learning

Kember, D., McKay, J., Sinclair, K., & Wong, F. K. Y. (2008, August). A four-category scheme for coding and assessing the level of reflection in written work. Assessment & Evaluation in Higher Education, 33(4), 363-379.

  • Outlines four categories for assessing the level of student reflection.

University of Michigan. (n.d.). Metacognition: Cultivating Reflection to Help Students Become Self-Directed Learners.

  • Lists five practices that lead to effective student reflection.
References

Kember, D., McKay, J., Sinclair, K., & Wong, F. K. Y. (2008, August). A four-category scheme for coding and assessing the level of reflection in written work. Assessment & Evaluation in Higher Education, 33(4), 363-379.

University of Michigan. (n.d.). Metacognition: Cultivating Reflection to Help Students Become Self-Directed Learners.


What is it?

We assess our students on a regular basis, but we rarely stop to evaluate the assessment! Evaluating tests—usually those with true-false, multiple-choice, and short-answer questions—involves investigating the test as a whole (e.g., measuring reliability and validity) and its individual questions (e.g., measuring difficulty and discrimination). The following descriptions offer more detail about each method of measurement:

  • Reliability – How consistently does the test assess student achievement of the learning outcomes?
  • Validity – How well does the test represent the knowledge or skills students need to achieve the learning outcomes?
  • Difficulty – How hard is each question to answer? Calculated as the percentage of students who answered the question correctly.
  • Discrimination – How well does each test question differentiate between students who perform well on the test—e.g., highest quartile—and those who perform poorly—e.g., lowest quartile?
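As a concrete illustration, here is a minimal Python sketch, not tied to Canvas or any particular testing tool, that computes difficulty and discrimination from a simple correct/incorrect response matrix. The sample data and the 25% quartile split are illustrative assumptions.

```python
# A minimal sketch: responses[s][q] is 1 if student s answered question q
# correctly, 0 otherwise. Data and the quartile fraction are made-up examples.

def item_statistics(responses, fraction=0.25):
    """Return (difficulty, discrimination) for each question.

    difficulty     = proportion of all students who answered correctly
    discrimination = correct rate in the top quartile (by total score)
                     minus correct rate in the bottom quartile
    """
    n_students = len(responses)
    n_items = len(responses[0])
    totals = [sum(row) for row in responses]
    order = sorted(range(n_students), key=lambda s: totals[s])
    k = max(1, int(n_students * fraction))
    bottom, top = order[:k], order[-k:]

    stats = []
    for q in range(n_items):
        difficulty = sum(responses[s][q] for s in range(n_students)) / n_students
        discrimination = (sum(responses[s][q] for s in top) / len(top)
                          - sum(responses[s][q] for s in bottom) / len(bottom))
        stats.append((difficulty, discrimination))
    return stats

# Example: 4 students, 3 questions
responses = [[1, 1, 0],
             [1, 0, 0],
             [1, 1, 1],
             [0, 0, 0]]
for i, (diff, disc) in enumerate(item_statistics(responses), start=1):
    print(f"Q{i}: difficulty={diff:.2f}, discrimination={disc:+.2f}")
```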

Why should you do it?

For the most part, you should evaluate tests so you can improve them and/or your instruction. This does not mean you are trying to make the tests easier. It means you are making sure each test is fair, covers material students should have experienced through your class, and consistently assesses achievement of specific learning outcomes. Through the process, you may find questions that are more difficult for specific students, such as non-native speakers of English, or are culturally biased. 

How do you do it?

Process

Learning management systems like Canvas (SCU's Camino) simplify the process quite a bit, by allowing you to review statistics related to the quiz results and even download a quiz item analysis.

  • If analysis shows a question is very difficult, check the wording of both the question and response options. If the question itself is misleading, then you may choose to throw out that question. If a response option other than the correct one is partially true or correct in a different context, then you may choose to rescore that question and count the other response as correct or partially correct.
  • In some cases, you may be drawing from pools or banks of test questions provided by a textbook publisher. Be sure to vet the questions before using them in a test, or evaluate those tests when you first use them.

If you use student response systems, or "clickers," as polling tools, you can conduct custom analysis by having students enter demographic information, such as gender or class level. You can ask these questions at any time and should only have to ask them once, after which you can reuse the data. Then you can cross-tabulate by showing slices of results—e.g., compare freshmen and sophomores, those who reported reading the optional article and those who did not, native and non-native English speakers, or first generation to go to college with second or third generation.
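For example, here is a hedged sketch using pandas of the kind of cross-tabulation described above. The column names and values are hypothetical placeholders; substitute whatever your clicker or survey export actually contains.

```python
import pandas as pd

# Hypothetical export: one row per student, demographics asked once and reused,
# plus a correctness flag for a given poll question.
df = pd.DataFrame({
    "class_level": ["freshman", "freshman", "sophomore", "sophomore"],
    "read_optional_article": [True, False, True, False],
    "q1_correct": [1, 0, 1, 1],
})

# Share of correct answers on question 1, broken out by class level
print(df.groupby("class_level")["q1_correct"].mean())

# Correctness cross-tabulated by whether students read the optional article
print(pd.crosstab(df["read_optional_article"], df["q1_correct"], normalize="index"))
```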

Applications/Examples
  • Quiz item analysis (online): In Canvas (SCU's Camino), open the Quiz Statistics for a quiz or test you want to evaluate. Download the Item Analysis file. Notes from the Canvas website:
    • Quiz analysis only applies to True/False, Multiple Choice, and Multiple Fill-in-the-Blanks questions within quizzes that have more than 15 responses.
    • If a quiz allows multiple attempts, the calculations will only consider the student's first attempt--to rule out practice effects.
    • If you generate the Item Analysis file for quizzes that pull from a question group, the spreadsheet will generate a line representing each version of the quiz (unless there are fewer than 15 students who responded to each version).

Want more information?

Peers @ SCU: If you're doing this, let us know!

Stephen Reaney, Chemistry & Biochemistry
Ana Maria Pineda, Religious Studies
Marina Hsieh, Law

Resources @ SCU

SCU Office of Assessment

  • Assists faculty with test-item analysis and interpretation of assessment results.
Resources for Tips & Tricks

Penn State University - Schreyer Institute for Teaching Excellence. (n.d.). Improve multiple choice tests using item analysis.

  • Describes question difficulty and discrimination, and suggests ways to improve tests.

Rudner, L.M. (1993, December). Test Evaluation.

  • Outlines the questions to ask yourself when evaluating your tests.

University of Texas. (n.d.). Interpreting Test Results.

  • Offers a statistics-based approach to evaluating tests.
  • Provides guidelines for ideal question difficulty, discrimination and reliability.
Resources for Deeper Learning

Dinero, S.L., Dinero, T.E., & Cuevas, N.M. (2003). Evaluating tests. Orientation to College Teaching.

  • Provides guidance on evaluating the validity and reliability of a test, and making improvements.

Dinero, S.L., Dinero, T.E., & Cuevas, N.M. (2003). Measuring Student Learning. Orientation to College Teaching.

  • Includes a section on analyzing a test via item analysis (e.g., item difficulty, item discrimination).

Kitao, K., & Kitao, S.K. (1996). Evaluating the Test Results. ERIC Document ED398255.

  • Provides a simple overview of evaluating test results and then improving the test.

What is it?

Mid-term student feedback is a formative process of collecting data from students about how your course is going for them and what could be done to improve the learning experience. As the name suggests, the data collection happens in the middle of an academic term. The process is less formal than student evaluations at the end of the term.

Why should you do it?

Collecting feedback around the middle of the term allows you to make changes that benefit your students while they are still in your class. Harris and Stevens (2013, p. 537) found that "employing midterm student feedback has been found to be instrumental in informing faculty about instructional quality and improving [achievement of] student learning outcomes." Students appreciate having an opportunity to provide feedback as well.

How do you do it?

Process

1. Make a teaching effectiveness plan and/or ask peers to review your course before the term

2. Collect formative feedback throughout the term and discuss with students (Kelly, 2008):

  • Conduct a teaching self-assessment using benchmarks
  • Create an online "suggestion box" for your class
  • Use Classroom Assessment Techniques, such as the one-minute paper, to gauge not only student progress, but also teaching effectiveness
  • Conduct a mid-term evaluation survey
  • Create a teaching effectiveness rubric for students to complete
  • Conduct a plus-delta exercise about specific teaching strategies or practices

3. Make small changes based on formative feedback during the term

  • Basically, don't ask for feedback if you are not prepared to do something about it!
  • You do not have to change everything about the course, but there will probably be some strategies you can adopt to help your students.
Applications/Examples
  • Formative feedback for hybrid or flipped class (in-class, online, or both): If you are trying a new course format for the first time (e.g., hybrid or flipped), you do not want to wait until the end of the semester to find out what is working and what needs to change. Use the plus-delta exercise to focus specifically on aspects of the class format that are working or that require changes by you or the students. You also can use this opportunity to add your own comments if students do not hold themselves completely accountable. For example, as a plus, you can state that you appreciate when students come prepared for the flipped class session by reviewing the required materials. As a change, you can state that when some students come unprepared it can delay the activities and decrease the chances that everyone will reach the desired outcomes.
Tools for collecting mid-term student feedback

Student Assessment of Learning Gains

  • Allows instructors to collect learning-focused feedback from students
  • Cost: Free

Free Assessment Summary Tool

  • Allows anyone to develop an anonymous, online assessment questionnaire by creating new questions or drawing from a database of over 350 questions related to teaching effectiveness
  • Cost: Free

Want more information?

Peers @ SCU: If you're doing this, let us know!
Resources for Tips & Tricks

Reis, R. (n.d.). Using Mid-Term Evaluation Sources of Student Feedback on Teaching. Tomorrow's Professor Message #313.

  • Makes recommendations about how to collect and use feedback in the middle of a term.

University of Waterloo. (n.d.). Using mid-term student feedback.

  • Shares general strategies for preparing, collecting feedback, and responding to feedback.

Washington University in St. Louis. (n.d.). Incorporating Midterm Course Evaluations.

  • Outlines strategies for designing evaluation questionnaires, administering them, and responding to student concerns.
Resources for Deeper Learning

Dartmouth Center for the Advancement of Learning. (n.d.). Interpreting Your Course Evaluations and Using Them for Professional Development.

  • While this site is specific to Dartmouth, it provides good advice for addressing issues raised by student evaluations.

Duquesne University. (n.d.). Benefits, Impact, and Process of Early Course Evaluations.

  • Provides rationale and strategies for conducting course evaluations early in the term.

Kelly, K. (2008). Evaluating and improving your online teaching effectiveness. In S. Hirtz, D.G. Harper, and S. Mackenzie (Eds.), Education for a Digital World: Advice, Guidelines, and Effective Practice from Around the Globe (pp. 365-377).

  • Shares practical strategies for a) collecting direct and indirect feedback from peers and students about online teaching effectiveness before, during, and after the course; and b) using this feedback to make changes to your online course or course environment, both during the course and for future iterations.
References

Harris, G.L.A. & Stevens, D.D. (2013, Summer). The Value of Midterm Student Feedback in Cross-Disciplinary Graduate Programs. Journal of Public Affairs Education, 19(3), 537-558.

Kelly, K. (2008). Evaluating and improving your online teaching effectiveness. In S. Hirtz, D.G. Harper, and S. Mackenzie (Eds.), Education for a Digital World: Advice, Guidelines, and Effective Practice from Around the Globe (pp. 365-377).

  • Note: The designated pages are in section 4 of the ebook.

Stanford University. (1997, Fall). Using Student Evaluations to Improve Teaching. Speaking of Teaching, 9(1), 1-4.


What is it?

Normally used as a classroom assessment technique (CAT), one-minute papers ask students to write three things in one minute—the clearest point, the "muddiest" point, and any other comments they might have. With only a minute to write these three things, students provide short, concentrated answers rather than lengthy passages. This makes it easy to identify what concepts students feel they do or do not understand from a reading assignment, a lecture, an entire class meeting, or some other aspect of your class.

Why should you do it?

With the information from the one-minute paper exercise, you can address common problem areas at the beginning of the next class meeting before moving to new material. For less common comments, rather than using class time, the instructor may opt to post additional resources, such as journal articles or links to websites that cover a problem area in more depth or from another perspective, while remaining open to additional questions. Angelo and Cross (1993) explain the concept of the one-minute paper in their book about CATs, while Chizmar and Ostrosky (1998) cover its benefits in detail.

How do you do it?

Process

This process can be anonymous or not, depending on how you plan to use the results. Start by selecting the resource (e.g., reading passage) or activity (e.g., lecture) for which you want to collect information about what students understood, or tell students to think back over the entire class session. During class, at the end of class, or after class in an online forum, ask students to write three things in one minute:

  • what they felt was clear, helpful, or most meaningful from a course reading, lecture, or classroom meeting
  • what they felt was "muddy," unclear, challenging, or least meaningful from a course reading, lecture, or classroom meeting
  • any additional comments.

You can use the technique to solicit specific information, such as

  • summarize the main point of the last lecture
  • list the top three things you took from the reading or homework assignment, and any challenges you had with it
  • relate the reading or lecture topic to a real-world example or recent news story.

In the classroom setting, the instructor collects all of the papers and looks for patterns, or areas that are clear or unclear to several students. Online, set a deadline for the discussion forum and review in advance of the next class meeting. If a large percentage of students did not understand something specific, you can address that concept in your next lecture or engage students in a quick exercise to see what exactly they did not get. If only one student or a small percentage of the class did not understand something, you might post an additional resource or link and tell the class to review it as they wish.

Applications/Examples

  • One-minute Tweets (in-class): At the end of a mini-lecture or class activity, ask students to Tweet the main point while there is still time left. To make it easy to list them all in one place, tell students they must include a specific hashtag with your course name (e.g., #SCU-ACTG11 or #SCU-Law376F). To make it easy to determine which question they have answered, tell them to use a hashtag to describe what they have entered (e.g., #meaningful, #clearest, #challenging). On your computer connected to the projector, pull up Twitter and display the results as they are entered. If not all students have a smart phone, ask students to form pairs or groups of three, where each pair or group has at least one smart phone, tablet or laptop, as well as a Twitter account. Prepare to revisit the items marked with hashtags like #unclear and #challenging, either right then and there, at the next class meeting, or online in between class meetings.
  • One-minute threads (online): Ask students to go through the one-minute paper exercise in an online discussion forum. This will accommodate students with different learning preferences as well as international students who may need more time to think about what they did not understand. Writing their thoughts right at the end of the class meeting does not give them a chance to digest what the class has done. They often want to go over their notes from the face-to-face class, to translate any unfamiliar terms and ideas, and sometimes even to discuss the concepts in a small group. By going online, they can have more time to process their thoughts and still give feedback before the next class meeting. You may also offer credit for students to answer each other's questions before you have a chance to reply. An online community can form around this classroom assessment technique that is traditionally facilitated by the instructor alone.

When creating the settings for "one-minute threads" discussions, do not allow students to post their own original threads or discussion topics. Otherwise, the threads will be hard to sort, since they may not have clear subject lines, and will be added in a fairly random order. Instead, ask them to reply to three specific questions (clearest point, muddiest point, and additional comments). This organizes the responses for you. If your LMS or other discussion forum engine does not allow this, write clear instructions for giving the specific responses you are trying to elicit.

Want more information?

Peers @ SCU: If you're doing this, let us know!
Resources for Tips & Tricks

Nahornick, A. (2013, November 12). One-Minute Papers: A Way to Further Design Thinking. [blog post].

  • Applies design thinking strategies along with the one-minute paper to "reframe and rethink the way we approach our subject expertise to be more innovative in our classrooms."

Tollefson, S. (n.d.). Gone in Sixty Seconds: The One-Minute Paper as a Tool for Evaluating Instructor and Students.

  • Contains examples for using the one-minute paper in different ways.
References

Angelo, T. A., & Cross, K. P. (1993). Classroom Assessment Techniques: A Handbook for College Teachers (2nd ed.). San Francisco, CA: Jossey-Bass Publishers.

Chizmar, J. A., & Ostrosky, A. L. (1998). The one-minute paper: Some empirical findings. Journal of Economic Education, 29(1), 1–8.


What is it?

Peer review is a process by which students are asked to look at other students' work and provide feedback. While most commonly used for writing assignments, peer review can be used for any project or assignment where feedback would help students improve their work, such as student presentations, graphic design projects, software programming code projects, research studies, lab notebooks, infographics, videos or other media. Small groups can also provide peer review feedback to other groups. Using a rubric improves the consistency of feedback and guides students to look for specific criteria (e.g., required information, formatting, proper citation to prevent plagiarism) or to answer specific questions (e.g., Does the work answer the assignment prompt?).

Why should you do it?

Students can learn more about the assignment topic itself by seeing other students' approaches and perspectives during peer review. Students also learn how to provide constructive feedback and how to incorporate feedback into their own work—skills they will need throughout their academic career and in the workforce. Depending on your class size, it may be difficult to provide comprehensive feedback on each student's work. Assigning peer review means all students get feedback before submitting a final version for a grade. You also benefit by receiving stronger work to evaluate after students complete revisions spurred by peer review.

How do you do it?

Process

1. Prepare

  • Create a rubric to guide the peer review process. Provide the same rubric that you will use to assess the final submissions.
  • Create a peer review timeline with deadlines for first draft, peer review, and final draft after revisions. As peer review can take more time, you may want to reserve it for more difficult or complex assignments, where it is easier for students to make mistakes, omit information, etc.
  • Identify any technologies that might help with some or all aspects of the peer review process, including but not limited to:
    • Tools with in-line comments and/or tracked changes (e.g., Microsoft Office, Google apps): for documents, spreadsheets and presentations, features such as tracked changes and comments can be used for feedback.
    • Apps: If your class uses iPads, consider using the iAnnotate app, which allows you and your students to highlight passages in a document and leave voice or text comments.
    • Peer review tools: Calibrated Peer Review is a web-based tool created by UCLA.
    • Plagiarism prevention tools: Some plagiarism tools have peer review capabilities, such as Turnitin's PeerMark system.
    • Discussion forums
    • ePortfolios: Students can post drafts, receive feedback via comments or drafts with tracked changes, post final drafts for grading, and write reflective statements about how their work changed based on the feedback.

2. Engage students

  • Go through the peer review process with the students before asking them to do it.
  • Consider using a rubric and/or calibrated peer review strategies to further improve the results. You do not need to use UCLA's calibrated peer review tool to follow the principles (see Applications, below, for details).
  • Model evaluating an exemplar in front of students (in-class) or create a screencast outlining your ratings for the exemplar and why you gave those ratings.
  • Ask students for feedback about the use of peer review, perhaps with a plus-delta exercise.
Applications/Examples
  • Calibrated Peer Review (in-class, online or both): First, create a rubric that students will use for peer review and you will use to evaluate students' final work. Then, share that rubric with your students and discuss it with them. Have them use the rubric—in-class or as an assignment—to review a dummy assignment that you create or an assignment you have permission to use from a previous term. Next, use polling strategies (e.g., raise hands, clickers, Twitter) or an online tool (e.g., Google forms) to learn what ratings students gave to the exemplar assignments. Last, share your own ratings and comments and ask students why they rated more harshly or more leniently than you did. Come to an agreement as a class on a fair rating for each criterion. This exercise is designed to make the peer review process more consistent, regardless of who reviews a particular student's assignment.
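Below is a minimal, hypothetical sketch in Python of the calibration comparison described above, done by hand rather than with UCLA's Calibrated Peer Review tool. The criteria, the 1-4 scale, and the student names are illustrative assumptions.

```python
# Compare each student's ratings of the exemplar with the instructor's ratings.
# Positive deviation = more lenient than the instructor; negative = harsher.

instructor_ratings = {"thesis": 3, "evidence": 2, "citations": 4}  # 1-4 scale

student_ratings = {
    "Student A": {"thesis": 2, "evidence": 2, "citations": 3},
    "Student B": {"thesis": 4, "evidence": 3, "citations": 4},
}

for student, ratings in student_ratings.items():
    deviations = {c: ratings[c] - instructor_ratings[c] for c in instructor_ratings}
    average = sum(deviations.values()) / len(deviations)
    print(f"{student}: per-criterion deviation {deviations}, average {average:+.2f}")
```

Sharing a summary like this with the class can anchor the discussion about why some reviewers rate more harshly or leniently before the real peer review begins.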

Want more information?

Peers @ SCU: If you're doing this, let us know!
Resources for Tips & Tricks

Jamsen, K. (2014). Making peer review work

  • Offers specific suggestions for instructors who want to use peer review for the first time.
Resources for Deeper Learning

Colorado State University. What is peer review and how do I use it?

  • Outlines in detail the potential steps for using peer review in first-year seminar courses. Can be applied to any course.

Pearce, J.; Mulder, R. & Baik, C. (2009). Involving students in peer review: Case studies and practical strategies for university teaching. Melbourne, Australia: Centre for the Study of Higher Education - University of Melbourne.

  • Provides four case studies for incorporating peer review, as well as reflections on the practice and suggestions for online tools.


What is it?

This group exercise is used in a variety of settings: corporate meetings, professional development workshops, closing sessions at conferences, and, of course, classrooms. The purpose is to identify publicly what people think about a particular shared experience. The name "plus-delta" comes from the two symbols—plus (+), signifying positive aspects of the experience, and delta (∆), a common symbol for change in math and science, signifying aspects that people would change—that sit atop two blank columns. In a group setting, students add items to each column as the teacher solicits responses, clarifications, suggestions and additional discussion.

Why should you do it?

The plus-delta activity is a quick and easy way to conduct a formative evaluation for both you and your students. It allows you to assess students' reactions to the learning experience and models metacognitive reflection in a group setting. It also gives students a voice in the learning process, which can lead to increased motivation and participation. The exercise can also be an opportunity for you to add your own thoughts along with the students, as a way to publicly note issues you have observed over time.

How do you do it?

Process

First, identify when you want to conduct the exercise and how often. You may want to let students know in advance, as not all students have the same abilities to generate ideas on the fly. You could also conduct the data collection online, and then refine and discuss the results as a group. A simple two-column activity looks at what both the teacher and the students a) are doing that works and b) could change to improve (see directly below).

 

Teacher/Course

  • Plus (+): Positive actions by the teacher that students feel help them learn
  • Delta (∆): Changes students would like to see the teacher make to help them learn

Student

  • Plus (+): Positive actions by the students that they feel help them learn
  • Delta (∆): Changes students would like to make themselves to help them learn

To facilitate the activity, you can give each person a chance to either add an item or pass, or go with a looser approach, letting students call out items while you or someone else marks them down in the correct column. Usually the list is generated publicly—i.e., using a computer connected to a projector, writing on a whiteboard, or writing on large pieces of paper on an easel or taped to the wall—so everyone in the room can see the growing lists. If you use the whiteboard, assign a scribe to copy what's written on the board, or take a photo with a phone afterward and transcribe it later, since a photo alone is not accessible and may not always be easy to read.

Last, post the final digital version online, along with a list of tasks you have created for yourself to address the requested changes, so everyone can review it before the next class meeting. It is important that students see that you are willing to take action to remedy at least some of the items they noted, especially if a number of students noted a particular need or learning challenge. You can also make this a discussion forum assignment, where you list what you will change in the course and each student must list what he or she will do to improve his or her own learning.

Applications/Examples
  • Formative feedback for hybrid or flipped class (in-class, online, or both): If you are trying a new course format for the first time (e.g., hybrid or flipped), you do not want to wait until the end of the quarter or semester to find out what is working and what needs to change. Use the plus-delta exercise to focus specifically on aspects of the class format that are working or that require changes by you or the students. Again, you can use this opportunity to add your own comments if students do not hold themselves completely accountable. For example, as a plus, you can state that you appreciate when students come prepared for the flipped class session by reviewing the required materials. As a change, you can state that when some students come unprepared it can delay the activities and decrease the chances that everyone will reach the desired outcomes.

Want more information?

Peers @ SCU: If you're doing this, let us know!
Resources for Tips & Tricks

Iowa State University - Center for Excellence in Learning and Teaching. Mid-term Formative Evaluation: Using a PLUS/DELTA Assessment Technique.

  • Shares a four-quadrant chart for students to evaluate the class and their own learning.
Resources for Deeper Learning

Helminski, L. & Koberna, S. (1995). Total quality in instruction: A systems approach. In H. V. Roberts (Ed.), Academic initiatives in total quality for higher education (pp. 309-362). Milwaukee, WI: ASQC Quality Press.


What is plagiarism?

Plagiarism occurs when one person uses another person's work—from basic ideas to exact wording—and presents it as his/her own. Sometimes this is done intentionally and flagrantly, while in other cases the person may not realize it is plagiarism. The Internet, a culture of mashups, and other factors have made it easy to find and use information without consideration for the original work's creator. Without learning and adopting academic integrity practices, students and even faculty can engage in plagiarism, unknowingly or not.

Why should you promote academic integrity?

Promoting academic integrity is every faculty member's responsibility. It is less painful for everyone when it is woven into every class and assignment at the beginning and throughout the course, so expectations and consequences are clear. As students navigate their academic career, they also learn skills that they will need in the workforce. Avoiding plagiarism is one of those skills.

How do you promote academic integrity?

Process

Plagiarism can be a big problem for students, who often do not fully understand how to cite a source properly or what it means to plagiarize. The new Camino learning management system allows electronic submissions to be checked immediately with Turnitin, a plagiarism checker, or checked when the document is graded within Camino. But how do we teach students what plagiarism is and how to avoid it? Below are a number of resources, including some from SCU's Office of Student Life and the School of Law.

1. Identify activities where plagiarism might occur and prepare the students

  • Add an academic integrity policy to your syllabus. This will set the tone for your class. Include a question about the policy in your syllabus quiz, if you use one.
  • Add language to the instructions for each activity where students might do research and borrow from others' work. Provide examples of invented or anonymized student work or published scholarship that demonstrate good practice. Many writing handbooks (used in Critical Thinking and Writing and Advanced Writing courses) have examples of effective undergraduate research papers that model good citation practices in several disciplines and style formats (APA, MLA, Chicago). Raise questions about good research and citation practices as a topic of discussion in class or in an online discussion forum.
  • Adding an academic integrity criterion to an assignment rubric is one of the easiest ways to make sure students know what you want them to do (e.g., "citations properly formatted using APA guidelines").
  • Provide citation resources for students' reference, such as Purdue University's Online Writing Lab (OWL), which has multiple sections related to proper citation, summarization, paraphrasing and more. Keep in mind that student presentations and other assignment types often will include images as well as text taken from the web, so also share resources like Foter's infographic about proper attributions for Creative Commons images.
  • Most incidents of egregious plagiarism happen at the end of the term. Acknowledge that with your students early in the term and again in the later weeks of your courses; offer clear, friendly reminders that you take academic integrity seriously—even and especially as the course ends. Remind students of resources and options if they feel overwhelmed by assignments (tutoring, the HUB, your office hours, the Drahmann Center).

2. Engage the students

  • Use plagiarism detection tools to guide or monitor the writing process. Ask students to run their assignments through a plagiarism detection tool, such as Turnitin, before submitting the final version. This allows them to see where they may have forgotten to cite a source or paraphrased a passage too closely to the original. They can then revise, run the check again, and submit the report along with the assignment.
  • Require revision and make it difficult to plagiarize. Design scaffolded writing assignments with several intermediate due dates (e.g., for informal writing, topic conceptualization, outlining, drafting, and editing). This keeps students on task (so you do not have to read papers written in a single draft the night before they are due), builds metacognition into the assignment by inviting students to notice their process as writers, and creates a record of their drafts. Have students submit each stage as a separate assignment in Camino so that you and your students can track their progress toward a final, finished paper.
  • Use peer review activities that include checking each other's work for plagiarism.
Applications/Examples
  • Sample syllabus statements
    • Academic Integrity refers to the "integral" quality of the search for knowledge that a student undertakes. The work you produce, therefore, ought to be wholly yours; it should result completely from your own efforts. A student will be guilty of violating Academic Integrity if he/she a) knowingly represents work of others as his/her own, b) uses or obtains unauthorized assistance in the execution of any academic work, or c) gives fraudulent assistance to another student.
Plagiarism is a form of cheating or fraud; it occurs when a student misrepresents the work of another as his or her own. Plagiarism may consist of using the ideas, sentences, paragraphs, or the whole text of another without appropriate acknowledgment, but it also includes employing or allowing another person to write or substantially alter work that a student then submits as his or her own. Any assignment found to be plagiarized will be given an "F" grade, and may result in an "F" in the course. Any violation of academic integrity in this class, including plagiarism, will be reported to the Department Chair and the Office of Student Life. Student Life will pursue the matter as a violation of the Student Conduct Code through the University judicial process. For more see Student Life Policies and Procedures.
  • Learning to locate, evaluate, select, and properly cite sources is a learning outcome for the course. If you have questions or concerns about how or when to cite sources, be sure to bring them up in class discussions. If your questions aren't resolved by our work on these issues, be sure to speak to me before you hand in your paper. Cite sources in papers and presentations. Any violation of academic integrity in this class will result in a report to the Department Chair and the Office of Student Life. Student Life will pursue the matter as a violation of the Student Conduct Code through the University judicial process. For more see Student Life Policies and Procedures

Want more information?

Peers @ SCU: If you're doing this, let us know!
Resources @ SCU

SCU policy on academic integrity and plagiarism

Faculty Development Teaching Resources Page on Academic Integrity (NOT LINKED)

Lester Deanes, Assistant Dean for Student Life

Santa Clara University Library. (n.d.). Plagiarism: don't let it happen to you!  (NOT LINKED)

  • Provides links to citation tools, tutorials, videos, and on-campus resources, such as the HUB Writing Center.

The HUB (SCU's writing center). Director Denise Krane offers resources on assignment design features that discourage plagiarism, support student learning and inspire students to do their best work.  (NOT LINKED)

Santa Clara University School of Law. (n.d.). Plagiarism guidelines for students from SCU's School of Law (NOT LINKED)

  • (See section 3.2)
Resources for Tips & Tricks

Clark College – Information & Research Instruction Site (IRIS). (n.d.). Plagiarism (And How to Avoid It): Tutorial & Handouts

  • The IRIS program offers a plagiarism quiz and useful information and tutorials about the research process. Although it is designed for students at a two-year college, it is also helpful for lower-division students.

Indiana University Bloomington. (n.d.). How to Recognize Plagiarism.

  • A resource that allows students to test themselves.

Johns Hopkins University. (2012, November 5). Teaching Your Students to Avoid Plagiarism. [blog post].

  • Contains great links to faculty and student resources.

University of Southern Mississippi. (n.d.). Plagiarism Tutorial

  • A useful site for teaching students about plagiarism and how to avoid it.  
Resources for Deeper Learning

Foter. (2012, November 14). How to Attribute Creative Commons Photos [blog post].

  • Provides guidelines for properly citing Creative Commons images.

Schulman, M. (1998, Winter). Cheating Themselves. Issues in Ethics, 9(1).

  • Article by the editor of SCU's Issues in Ethics.   
Interested in getting your hands DRTy with us? Add to this page, create a new page or suggest new topics here! We thank Kevin Kelly at San Francisco State for this page.

What is it?

Rubrics are scoring guides that allow anyone—you, peer instructors, teaching assistants, and the students themselves—to evaluate performance based on specific criteria and performance levels. Holistic rubrics assign one score to each student's work as a whole; they can be valuable for quick, broad judgments about overall quality. Analytic rubrics consist of two elements that form a matrix:

  • The criteria are the objectives, individual tasks, or activity components that you want to review. For example, if you use a rubric to evaluate student presentations, then criteria might include "organized information," "supported main points," "prepared to answer questions," "made eye contact," or "used appropriate visual media."
  • The levels of achievement define the extent to which a given student meets each criterion. A numeric value is assigned—usually a 3-point, 4-point or 5-point scale. Within the matrix, you can also write text descriptors letting you and the students know what performance is expected at each level, for each criterion (see example below).

A student's grade for the activity is calculated by adding up the scores he or she receives for each criterion.
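
To make that arithmetic concrete, here is a minimal sketch (an illustration added here, not part of the original resources) that totals a student's criterion scores, with optional weights to reflect the relative importance of each criterion. The criterion names, point values, and weights are hypothetical.

    # Minimal sketch: totaling an analytic rubric score (hypothetical criteria and weights).
    def rubric_total(scores, weights=None):
        """Sum the level earned for each criterion, optionally weighted."""
        weights = weights or {}
        return sum(level * weights.get(criterion, 1)
                   for criterion, level in scores.items())

    presentation_scores = {
        "organized information": 4,   # level earned on a 0-4 scale
        "supported main points": 3,
        "made eye contact": 2,
    }

    print(rubric_total(presentation_scores))                                # 9 (unweighted)
    print(rubric_total(presentation_scores, {"supported main points": 2}))  # 12 (weighted)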

Why should you do it?

Rubrics are good for learning: Rubrics make it possible to be more objective and consistent during the assessment process. They allow you to measure learning according to real-world criteria.

Rubrics are good for students: Students can use the rubric both to prepare and to evaluate their own work before submitting the final draft. If the rubric is weighted, students can see the relative importance of specific criteria or requirements for a project or assignment. Rubrics guide students in the peer review process. When you return their work with a rubric, they receive critical feedback outlining where they did well and where they need work.

Rubrics are good for you: Rubrics can save you time by providing prepared comments you can modify or use for each student, and focusing your efforts on what is most important for students to learn.

Rubrics are good for the institution: Across General Education courses, rubrics can provide consistency regarding how the institution defines proficiency in writing, quantitative reasoning, critical thinking, etc. At the department or program level, rubrics can be used to evaluate capstone projects or student ePortfolios containing different work products for completion of a major.

How do you do it?

Process

To construct an analytic rubric, follow these steps:

1. Draft evaluation criteria

Make a list of everything that you feel is part of an assignment or activity, then pick the five to ten that are the most important. These can range from specific learning outcomes to general categories (e.g., critical thinking, communication, organization). Constructing the criteria with your students involves them in the learning process and makes the activity more meaningful.

2. Create levels of achievement and descriptors

Make the levels of achievement clear with a numeric value for scoring and a simple description, such as Exemplary (3), Meets Requirements (2), and Needs Improvement (1), or Advanced (3), Proficient (2), and Novice (1). Zero (0) can be used when a student does not earn any credit for a particular task. These levels then apply to each criterion. Text descriptors further clarify what each level means in concrete terms.

3. Test the rubric

If you want to be sure the rubric will accurately reflect what you want to evaluate, you can score a sample of student work from a previous semester, or ask a colleague to review it with you.

4. Share the rubric with students

When you assign a project or activity, provide the rubric at the same time, so students know what they are expected to do. As noted above, you may decide to include the students in making the rubric.

5. Use the rubric for assessment, peer assessment or student self-assessment

Note: Several learning management systems (including Camino) and ePortfolio solutions (including Digication) provide built-in rubric functionality.

Applications/Examples

One criterion from a student presentation rubric: Organization

  • 4 - Exemplary: Presentation is well organized. The argument is clear and easy to follow. The topic was clearly stated, and supporting information was helpful.
  • 3: Presentation is organized fairly well. Some deviation from the topic and outline interrupted the organizational scheme.
  • 2: Presentation is poorly organized. It takes effort to follow the argument. Off-topic items were frequent, and the structure was often violated.
  • 1 - Novice: Presentation is not organized at all. The argument is unclear or too difficult to follow. No clarification or internal structure is in place to avoid confusion.

See links to full rubric samples in the Resources for Tips and Tricks and Resources for Deeper Learning, below.

Want more information?

Peers @ SCU: If you're doing this, let us know!

Elizabeth Day (Liberal Studies) has her students create iMovies to demonstrate their content mastery over a given topic and uses the following rubric to assess these projects. 

Click here to find Marco Bravo's digital book assignment and rubric. Marco is an Associate Professor in the Department of Education.

Tonya Nilsson (Civil Engineering) uses rubrics for any writing assignments including laboratory reports. She also uses "cut sheets" for grading exams, which provides consistency in point allocations.

Stephen Carroll (English) uses rubrics to raise the quality of the work submitted by his students.

Chris Bachen (Communication) uses rubrics to provide students with a guide to what she feels are critical points in the assignment and minimum proficiency.

Jill Goodman, English

Resources for Tips & Tricks

Carnegie Mellon – Eberly Center for Teaching Excellence and Educational Innovation. Grading and Performance Rubrics

  • Provides sample rubrics for writing assignments, oral presentations, projects, and class participation.

Teachnology. Rubrics and Rubric Makers.

  • Lists links to several rubric creation tools for K-12 and higher education teachers.

University of Wisconsin – Stout. Rubrics for Assessment

  • Offers numerous rubrics for assessing a wide range of student work products or assignments (e.g., online discussions, oral presentations, cooperative learning, social media projects)
Resources for Deeper Learning

AAC&U. Valid Assessment of Learning in Undergraduate Education (VALUE) rubrics

  • Shares numerous "metarubrics," or rubrics derived by combining the best aspects of rubrics from around North America. Anyone may download, modify, and use these metarubrics for free. Metarubric topics include critical thinking, creative thinking, problem solving, quantitative literacy, teamwork, oral and written communication, integrative and applied learning, global learning, civic engagement, and more.

National Institute for Learning Outcomes Assessment. Rubrics

  • Includes a list of related books and articles, as well as sample rubrics.

Palo Alto College. Outcomes/Assessment

  • Contains links to core assessment, General Education assessment, and program assessment rubrics, all aligned with institutional assessment values.
Interested in getting your hands DRTy with us? Add to this page, create a new page or suggest new topics here! We thank Kevin Kelly at San Francisco State for this page.

What is it?

Student evaluations of teaching effectiveness, sometimes called course evaluations or SETs (Student Evaluations of Teaching), are a regular part of the teaching and learning process each term. Official evaluation surveys conducted by the campus include standardized questions used for all instructors at the institution.

Why should you do it?

"…probably the most important benefit of student evaluations is the feedback the forms provide directly to instructors, so that they can refine their courses and teaching practices to provide students with better learning experiences" (Stanford University, 1997).

How do you do it?

Process

1. Make a teaching effectiveness plan and/or ask peers to review your course before the term

2. Collect formative feedback throughout the term (see mid-term student feedback)

3. Make small changes based on formative feedback during the term

4. Participate in a formal course evaluation process at the end of the term and after it ends

  • Ask your students to take the official course evaluation process seriously—i.e., offer constructive feedback and suggest specific actions to overcome the challenges they had
  • Look for patterns in student responses to see what went well and what might be improved

5. Make changes—from small tweaks to course redesign—based on summative feedback. Common student requests for improvement include (Stanford University, 1997; Vanderbilt University, n.d.):

  • Make in-class communications clear
  • Clean up organization of course content
  • Convey learning outcomes clearly and follow them
  • Plan assignments that help students reach the learning outcomes
Applications/Examples
  • Teaching improvement checklist (example): Use a reflective teaching journal or teaching portfolio to document changes in specific learning activities, your courses, or your teaching style. Discuss your classes with faculty development staff or peer faculty.

Want more information?

Peers @ SCU: If you're doing this, let us know!
Resources @ SCU

Course evaluations are conducted online at Santa Clara University.

Resources for Tips & Tricks

Stanford University. (1997, Fall). Using Student Evaluations to Improve Teaching. Speaking of Teaching, 9(1), 1-4.

  • Offers tips for making sense of student comments and improving particular teaching and learning areas.

Vanderbilt University. (n.d.). Student Evaluations

  • Includes tips for making sense of student evaluation feedback, resources on interpreting student evaluations, and summaries of research on student evaluations.
Resources for Deeper Learning

Dartmouth Center for the Advancement of Learning. (n.d.). End-of-Term Evaluations

  • While this site is specific to Dartmouth, it provides good advice for addressing issues raised by student evaluations.

Kelly, K. (2008). Evaluating and improving your online teaching effectiveness. In S. Hirtz, D.G. Harper, and S. Mackenzie (Eds.), Education for a Digital World: Advice, Guidelines, and Effective Practice from Around the Globe (pp. 365-377).

  • Shares practical strategies for a) collecting direct and indirect feedback from peers and students about online teaching effectiveness before, during, and after the course; and b) using this feedback to make changes to your online course or course environment, both during the course and for future iterations.
  • Note: The designated pages are in section 4 of the ebook.
References

Stanford University. (1997, Fall). Using Student Evaluations to Improve Teaching. Speaking of Teaching, 9(1), 1-4. 

Vanderbilt University. (n.d.). Student Evaluations.    

Interested in getting your hands DRTy with us? Add to this page, create a new page or suggest new topics here! We thank Kevin Kelly at San Francisco State for this page.

What is it?

Whether done individually or in groups, presentations are an assessment strategy through which students can demonstrate achievement of learning outcomes in almost any course or discipline. Assessment goals may include, but are not limited to:

  • showing understanding of course concepts, or ability to consider those concepts in other contexts
  • outlining research results, process steps, etc.
  • demonstrating specific knowledge or skills (e.g., mastery of another language)
  • giving an artistic performance (e.g., monologue, instrumental or vocal recital, dance)
  • synthesizing various perspectives or viewpoints
  • exhibiting oral communication skills

Student presentations can be facilitated in-person or online, and can be reviewed asynchronously via audio, screencast, or video recordings.

Why should you do it?

After reviewing numerous sources, Alison Wolfe (n.d.) listed pros and cons of student presentations. Pros include preparing students with skills they will need in graduate studies and the workforce, using deep cognitive strategies to increase understanding, and shifting to active learning. Cons, which you can view as things to avoid, include presentation activities with unclear goals, little or no guidance for students, poor preparation for peer review, and difficulties implementing in large classes. Virginia Commonwealth University (n.d.) listed numerous benefits of group presentations, such as student engagement, teamwork development, assessment, concept application, community building, peer teaching, and collaborative learning.

How do you do it?

Process

1. Prepare

  • Create a rubric to guide students through creating the presentation content and delivering the presentation itself. Criteria related to content can tie directly to the learning outcomes, your prompts, or generic categories (e.g., clarity of argument, use of supporting evidence). Criteria related to delivery might include mechanics (e.g., eye contact, audible volume), use of technology (e.g., whether it augmented or distracted from the presentation), and the ability to answer follow-up questions posed by you and the other students.
  • Provide students with a checklist for effective presentations (e.g., Ten Steps to Preparing an Effective Oral Presentation by Princeton).

2. Engage

  • Make it an iterative process, with reviews of the presentation outline first, the slides next, and the presentation itself last. This gives students feedback on their content and argument separate from feedback on their delivery.
  • If there is not enough class time for all students to give a presentation, ask them to record and post the presentations online using a tool like VoiceThread, which allows students to record audio and/or post text-based notes. The presentations can also be done as a screencast. If students use a tool like Google Slides, then they should use the Notes feature to share the "script" for each slide.
  • Create a peer review activity, where students use the rubric to review all classroom presentations or a limited number of online presentations (you can assign specific presentations for each student to review or tell the students to "review three presentations that do not have three reviews yet"). In addition to rubric comments, you can ask students to note the most relevant points of each presentation or generate a follow-up question for the presenter.
  • If you have access to a digital video camera, record in-class presentations so students can assess themselves and review along with peer feedback and your evaluation.
Applications/Examples
  • Reflective process for student presentations (in-class, online or both):
  • Before: Ask students to reflect on best and worst presentations they have seen (idea drawn from example by Laura Goering in Resources for Tips & Tricks, below)
  • After: Ask students to conduct a self-assessment, based on review of a recording or reflecting on the presentation experience in general (e.g., If you could do it over, what would you change? What did you notice as you presented?)

Want more information?

Peers @ SCU: If you're doing this, let us know!

Al Bruno, Marketing (Academic)
Katy Bruchmann, Psychology
Amy Eriksson, Communication
Marina Hsieh, Law
Theresa Conefrey, English
Brett Solomon, Liberal Studies
Shelby McIntyre, Marketing (Academic)
Sharmila Lodhia, Women's and Gender Studies
Ana Maria Pineda, Religious Studies
Sandy Piderit, Management (Academic)
Steve Levy, Management (Academic)

Resources for Tips & Tricks

Online presentation tools

Teaching strategies for using student presentations

Goering, L. (n.d.). Planning student presentations

  • Provides tips on improving student presentations by making it a reflective process.

Helvie-Mason, L. (2011, October 10). Helping Students Find Their Voices: Four Corners of the Classroom. Faculty Focus.

  • Outlines a student presentation rehearsal strategy for small to medium-sized classes.

Princeton University. (n.d.). Teaching Oral Presentation Skills to Undergraduates

  • Shares simple, practical tips for implementing student presentations in your class.
Resources for Deeper Learning

Brown University. (n.d.). Student Presentations

  • Provides links to sites that support faculty who want to use student presentations in their courses.

Weimer, M. (2013, February 21). Student Presentations: Do They Benefit Those Who Listen? Faculty Focus

  • Suggests peer review activities to encourage students to pay attention to their peers' presentations.

Wolfe, A.M. (n.d.). Pros & Cons of Student Presentations and Effective Student Generated Content

  • Lists pros and cons of student presentations, which can be turned into guidelines for conducting them in your class.

   

Interested in getting your hands DRTy with us? Add to this page, create a new page or suggest new topics here! We thank Kevin Kelly at San Francisco State for this page.

What is it?

According to McMillan and Hearn (2008, p. 40), student self-assessment "occurs when students judge their own work to improve performance as they identify discrepancies between current and desired performance." They can use informal or formal self-assessment strategies, either on their own or with guidance from an instructor, tutor, or peer mentor. Typically, self-assessment is a formative assessment strategy, allowing students a chance to make adjustments. The process can cover almost any part of the learning process, from goal setting to reflection on past performance.

Why should you do it?

Students can use self-assessment strategies to identify their current status—learning goals, gaps, progress, or areas of focus. Studies have shown that self-assessment can improve students' learning and achievement (Andrade & Valtcheva, 2009), as well as their motivation (McMillan & Hearn, 2008; Sitzmann et al., 2010). Self-assessment is also a critical component of metacognition, in which students evaluate their learning and take deliberate steps to improve it. Sitzmann et al. (2010) argued that teaching self-assessment strategies and encouraging their use also prepare students to be lifelong learners.

How do you do it?

Process

1. Prepare

  • Identify aspects of your class that lend themselves to the self-assessment process, such as assignments or projects that can be done in stages. You may also decide to have students assess their learning in general throughout the term, rather than focus on specific assignments.

2. Engage the students

  • Discuss the self-assessment process before each assignment where it will be used, along with your expectations for the exercise. Share with students how self-assessment prepares them for the workforce, lifelong learning, or both.
  • Provide a rubric for a class assignment or project, and ask students to turn in their work with a completed rubric and comments to themselves. When you evaluate their work, use the same rubric and provide appropriate comments, so they can see how their assessment compares to yours. As a variation, you can ask your students to submit their first draft as version 1, their self-assessment rubric, and the final draft as version 2. ePortfolios are ideal for multiple iterations of assignments.
  • Ask students to perform their own personal plus-delta exercise, writing what they like and what they would change about an assignment they completed.
Applications/Examples
  • Self-assessment journal (in-class, online or both): Ask the students to keep a journal of self-assessment activities throughout the term. You can require them to use a blog or other online environment, or give them the option to select the journal format. Provide global prompts at the beginning of the term, such as "Make notes both when you feel you did something well and when you do not know what to do next." Throughout the term, provide additional prompts, such as "What revisions did you make to your work after using the rubric to assess it?" or "How has your approach to learning changed in this class since you began? What effect have the changes had on your performance?"

Want more information?

Peers @ SCU: If you're doing this, let us know!

Katy Bruchmann, Psychology
Amy Eriksson, Communication

Resources for Tips & Tricks

Duquesne University – Center for Teaching Excellence. (n.d.). Student Self-Assessment

  • Provides concrete student self-assessment strategies, such as an exam wrapper, with steps for their use.

Georgia State University. (n.d.). Helping students self assess their learning

  • Offers four uses of self-assessment, with simple steps for practicing each: student preparation, comparison of student work at different stages, learning reflection, and reflective blogging.

Texas A&M University Writing Center. (n.d.). Value Added: Critical Reflection and Self-Assessment

  • Shares a number of reflection exercises designed for self-assessment.
Resources for Deeper Learning

Foundation for Critical Thinking. (n.d.). Structures for Student Self-Assessment

  • Looks at self-assessment for writing, listening, speaking, and reading assignments, as well as for global self-assessment.
References

Andrade, H. & Valtcheva, A. (2009). Promoting learning and achievement through self-assessment. Theory Into Practice, 48(1), 12-19.

McMillan, J. & Hearn, J. (2008, Fall). Student Self-Assessment: The Key to Stronger Student Motivation and Higher Achievement. Educational Horizons, 87(1), 40-49.

Sitzmann, T., Ely, K., Brown, K.G., & Bauer, K.N. (2010). Self-Assessment of Knowledge: A Cognitive Learning or Affective Measure? Academy of Management Learning & Education, 9(2), 169–191.

   

Interested in getting your hands DRTy with us? Add to this page, create a new page or suggest new topics here! We thank Kevin Kelly at San Francisco State for this page.

What is it?

One classroom assessment technique involves asking students to create their own test questions and answers. The student-generated questions may be used for individual or collaborative quiz or exam preparation, and/or as questions for actual quizzes or exams. This strategy informs teaching, assessment and course design decisions, and acts as a student self-assessment and feedback opportunity.

Why should you do it?

According to Angelo and Cross (1993, pp. 240-242), student-generated test questions allow you to see what students find important or memorable, what they understand as fair and useful test questions, and how well they can answer the questions that they themselves have posed. Pittenger and Lounsbery (2011) identified several additional benefits. Although the activity was a negative experience for a small group of students (15%) and a more difficult process than anticipated for some, student-generated questions resulted in more time spent with the content and deeper engagement with it, fostered personal exploration of the content, and led to a better understanding of the profession and the dispelling of myths about it.

How do you do it?

Process
  • Decide the number and types of questions students should generate, what topics they should cover, and what you hope students will gain from the experience. Begin this strategy two or more weeks before an actual test so that students have time to incorporate the questions into their exam preparation (Angelo & Cross, 1993, pp. 240-242).
  • Let students know how the questions will be used—i.e., for review purposes only, for the upcoming test or a future exam, or both. Provide exemplar entries (questions, answers, source(s), and any other information you require) to guide your students.
  • Use a rubric to evaluate each question's usability for a quiz or test, based on a) completeness—e.g., it includes the question, four answer options with a correct answer, feedback, and the question source—and b) importance—e.g., high score: addresses a learning outcome; medium score: addresses pertinent information; low score: does not meet requirements (Pittenger & Lounsbery, 2011). A minimal scoring sketch follows this list.
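
As a minimal illustration of that two-part screening (an example added here, not a tool from Pittenger and Lounsbery), the sketch below checks a submitted question for completeness and assigns an importance level; the field names and the 0-2 importance scale are hypothetical.

    # Minimal sketch: screening a student-generated question for quiz use.
    # Field names and the 0-2 importance scale are hypothetical.
    REQUIRED_FIELDS = ("question", "answer_options", "correct_answer", "feedback", "source")

    def is_complete(entry):
        """Complete = every required field present and exactly four answer options."""
        return (all(entry.get(field) for field in REQUIRED_FIELDS)
                and len(entry.get("answer_options", [])) == 4)

    def importance_score(entry):
        """2 = addresses a learning outcome, 1 = pertinent information, 0 = neither."""
        if entry.get("addresses_learning_outcome"):
            return 2
        if entry.get("pertinent"):
            return 1
        return 0

    def usable_for_quiz(entry):
        return is_complete(entry) and importance_score(entry) >= 1

    sample = {
        "question": "Which type of assessment occurs while learning is still in progress?",
        "answer_options": ["Formative", "Summative", "Placement", "Normative"],
        "correct_answer": "Formative",
        "feedback": "Formative assessment happens during instruction, so there is time to adjust.",
        "source": "Week 3 lecture notes",
        "addresses_learning_outcome": True,
    }
    print(usable_for_quiz(sample))  # True
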
Applications/Examples
  • Game show quiz review (in-class, Jeopardy style): Break the class into five or more teams and tell them to create one test question for each of ten different categories. If you do not want to use class time for this, make this portion a homework assignment in which each team uses a Google Doc to share its questions. If you plan to use the whiteboard for the game, tell the teams they must turn in ten clearly printed index cards, with the category at the top, the question and answer in the middle, and the source at the bottom. Select the five best questions for each category, assign point values based on their difficulty, and organize them by category. If you use a Jeopardy PowerPoint template, ask students to share their Google Docs with you so you can copy and paste the best questions into the template.

For the in-class quiz review, you can use the same teams to capitalize on the competition. Use the index cards or the PowerPoint template to run the game like the television version. Give each team the points earned by collaboratively answering a question correctly. Be sure to announce in advance what the winning team will receive.

Want more information?

Peers @ SCU: If you're doing this, let us know!

Shelby McIntyre, Marketing (Academic)

Resources for Tips & Tricks

Kelly, R. (2013, December 3). Have Students Generate Content to Improve Learning. Faculty Focus. 

  • Provides rationale for using student-generated test questions, as well as a unique strategy for using screencasts to provide feedback to the students about the quality of their questions.

Linse, A. (n.d.). Classroom Assessment Techniques.

  • Summarizes the information from Angelo and Cross (1993).
  • Article on the minute paper
Resources for Deeper Learning

Hutchinson, D. & Wells, J. (2013). An Inquiry into the Effectiveness of Student Generated MCQs as a Method of Assessment to Improve Teaching and Learning. Creative Education, 4, 117-125. 

  • Describes a research study on asking students to create multiple-choice questions; the activity was found to improve students' knowledge and understanding of the relevant concepts and practices.
References

Angelo, T. A., & Cross, K. P. (1993). Classroom Assessment Techniques: A Handbook for College Teachers (2nd ed.). San Francisco, CA: Jossey-Bass.

Pittenger, A.L. & Lounsbery, J.L. (2011, June 10). Student-Generated Questions to Assess Learning in an Online Pharmacy Course. American Journal of Pharmaceutical Education, 75(5). 

   

Interested in getting your hands DRTy with us? Add to this page, create a new page or suggest new topics here! We thank Kevin Kelly at San Francisco State for this page.

What is it?

Teaching observations work in both directions—you may ask to have a trained observer visit your class to give you faculty peer review feedback, or you may request to visit the class of another experienced teacher, after which you can discuss effective teaching strategies. The process is confidential—i.e., results are not used for performance evaluation purposes. Formative teaching observations occur early or in the middle of the term, so there is time to make changes based on the feedback provided.

Why should you do it?

"Have you ever taught a new course and wondered how it was going? Or felt a nagging sense that something was going wrong in your class but you didn't know what it was? You don't have to wait until the end of the quarter to find out." (Dreher, Healy, & Parrella, 2012). Atul Gawande (2011) outlined a number of reasons why people in every profession—not just top athletes and singers—should seek coaching through observations, feedback and dialog. Stuhlman et al. (n.d., p. 2) characterized teaching observation as a component of improving teacher effectiveness, and "providing feedback within a supportive relationship."

How do you do it?

Process

1. Request and collect feedback

  • Set up a time for someone to visit your class
    • If you want the observer to look for something specific or you are trying a new teaching strategy, let him or her know in advance. For example, if you are flipping your class for the first time, the observer may want to review the recorded materials so he or she can give better feedback on facilitating related activities in the classroom. If you have been working on a project or case studies as a class, provide background information for the observer to get up to speed.
  • Plan for a shortened class period
    • The observer may only watch the first half of class, so be sure you can fit everything you want the observer to see into that window. For example, consider using a mini-lecture followed by one or more active learning exercises.
    • The observer may want to conduct a focus group with the students. If not, consider conducting a plus-delta exercise or collecting mid-term student feedback on your own. This will provide feedback from multiple sources.

2. Discuss and make changes

  • Set up a time to speak with the observer soon after the observation. Discuss what changes you might make right away, before the end of the term.
  • Talk to your students about the process.

Want more information?

Peers @ SCU: If you're doing this, let us know!
Resources @ SCU

SCU Faculty Development Program 

Resources for Tips & Tricks

Schweizer, B. (2009, December 10). The Dreaded Peer-Teaching Observation

  • Provides guidelines and strategies for the peer observation process.

Stuhlman, M., Hamre, B., Downer, J., & Pianta, R. (n.d.). How Classroom Observations Can Support Systematic Improvement in Teacher Effectiveness. University of Virginia. 

  • Provides guiding principles for teacher observations, as well as case studies.
Resources for Deeper Learning

Chism, N. (2007). Peer Review of Teaching: A Sourcebook. (2nd Edition). San Francisco: Jossey-Bass.

  • Contains a chapter on classroom observation, with guidelines, preparation strategies, log ideas and forms.
References

Dreher, D., Healy, T., & Parrella, F. (2012, Fall). Checking in with Your Class: Classroom Visits with Small Group Instructional Response

Gawande, A. (2011, October 3). Personal Best: Top athletes and singers have coaches. Should you? The New Yorker

Stuhlman, M., Hamre, B., Downer, J., & Pianta, R. (n.d.). How Classroom Observations Can Support Systematic Improvement in Teacher Effectiveness. University of Virginia. 


Interested in getting your hands DRTy with us? Add to this page, create a new page or suggest new topics here! We thank Kevin Kelly at San Francisco State for this page.

What is it?

Teaching evaluations are conducted as annual performance appraisals and at specific milestones related to retention, tenure and promotion.

Why should you do it?

Hoyt and Pallett (1999) caution against putting too much emphasis on scores from student evaluations of teaching. A more holistic approach balances data collected from students, peers, the chair, and the instructor him- or herself. Consider including elements of dialogue as well, such as an interview with the instructor or an invitation for him or her to contribute a statement.

How do you do it?

Process

If you are performing a review:

  • In an online workshop on effective and fair faculty evaluations, Jeff Buller (n.d.) outlined some best practices:
    • Base the review on observable behaviors and documented results, not on personality, attitudes, or opinions.
    • Don't over-interpret statistics.
    • Don't treat all items in a review process equally.
    • Both oral and written forms of evaluation have their place.
  • Balance the review process by going beyond scores from student evaluations of teaching effectiveness:
    • Ask if the instructor has created a teaching ePortfolio.

If your work is being reviewed:

  • Prepare
    • Talk to a mentor or peer who has gone through the process. Ask what to expect and how to prepare.
    • Consider creating a teaching ePortfolio. This allows you to organize your work, share reflections about different artifacts and link to relevant information.

Want more information?

Peers @ SCU: If you're doing this, let us know!
Resources for Deeper Learning

Buller, J.L. (2012). Best Practices in Faculty Evaluation: A Practical Guide for Academic Leaders. San Francisco: Jossey-Bass.

Hoyt, D.P. & Pallett, W.H. (1999). Appraising Teaching Effectiveness: Beyond Student Ratings. IDEA Paper #36. 

  • Describes limitations to using student rating systems alone to evaluate teaching effectiveness and strategies to collect additional information, such as observation and peer feedback.

Tobin, T.J. (2004). Best Practices for Administrative Evaluation of Online Faculty. Paper presented at DLA 2004, Jekyll Island, Georgia, May 23-26. 

  • Provides guidelines for evaluating online faculty work, based on the seven principles of effective teaching.

 

Interested in getting your hands DRTy with us? Add to this page, create a new page or suggest new topics here! We thank Kevin Kelly at San Francisco State for this page.