Practices in Action

Rubrics

In this section:

    • Consider different kinds of rubrics
    • Evidence-based strategies for AI-resistant assessment
    • Language equity and accessibility
    • Disciplinary conventions
    • Variations by assignment type

Consider different kinds of rubrics

| Rubric Type | Analytic Rubrics | Holistic Rubrics | Single-Point Rubrics |
| --- | --- | --- | --- |
| Purpose | Break down the assignment into multiple criteria with separate scores | Provide a single score based on overall impression | Describe only proficient performance, with space for feedback |
| Best for | Detailed feedback, skill development, identifying specific strengths and areas for growth | Quick grading, summative assessments, when criteria are difficult to separate | Promoting a growth mindset, reducing student focus on points, encouraging revision |
| Structure | Multiple criteria (rows) × performance levels (columns) | Single scale with descriptions of overall performance at each level | Single column of criteria with space on either side for “areas of concern” and “evidence of exceeding” |
| Example use | Research papers where you assess thesis, evidence, analysis, and writing separately | Portfolio assessment or creative projects requiring integrated evaluation | Formative assessments, drafts, or when emphasizing individualized feedback |


Evidence-based strategies for AI-resistant assessment

Research demonstrates that certain rubric designs naturally promote authentic work by requiring forms of engagement that AI cannot replicate (Perkins et al., 2024; Bearman & Ajjawi, 2023).

Cumulative Knowledge Building – Design criteria that require students to build on previous low-stakes, ungraded classroom activities:

Example criterion: “Integrates findings from three ungraded in-class quick writes to support the thesis, showing how understanding evolved across the semester”

Why it works: AI can’t access the accumulation of small, undocumented classroom moments. Only students who participated have this intellectual history to draw upon.

Implementation: Collect weekly index cards, exit tickets, or quick writes. Don’t grade them, but require students to reference them in major assignments.

Verified Personal Application – Require connections to specific, verifiable experiences within your course structure:

Example criterion: “Applies theory to your assigned community partner organization, incorporating specific details from site visit and supervisor interview”

Why it works: Each student has unique, documented assignments (internship sites, lab partners, community placements) that create non-replicable contexts.

Implementation: Maintain records of individual student placements/partners. Reference these in rubric criteria.

Synchronous Thinking Assessment – Build real-time components into major assignments:

Examples:

    • Presentation Q&A: “Responds to audience questions without notes, demonstrating internalized understanding”
    • Live peer review: “Provides specific, constructive feedback referencing course concepts during workshop”
    • In-class components: “Completes analytical framework during class using only memory and notes”

Why it works: Time-bounded, contextual tasks require immediate cognitive processing that can’t be delegated.

Version History Analysis – Require students to submit and discuss their revision history:

Example criterion: “Annotates significant changes between drafts, explaining why specific feedback led to particular revisions and identifying which changes were most difficult”

Why it works: Authentic revision leaves digital fingerprints. AI-generated work typically appears fully formed or with superficial changes. Platforms like Google Docs timestamp every edit.

Implementation: Require drafts in Google Docs with version history intact. Ask students to screenshot and annotate key revision moments.

Collaborative Thinking Traces – Design criteria that value intellectual exchange with verified peers:

Example criterion: “Synthesizes peer perspectives from discussions, showing how [specific student names]’s ideas challenged or enriched initial understanding”

Why it works: Real collaborative learning creates reciprocal traces. You can verify whether Student A actually influenced Student B by checking both students’ work.

Progressive Skill Demonstration – Create criteria that assess skills built incrementally across multiple small assignments:

Example criterion: “Demonstrates mastery of [specific technique] by showing progression from Week 3 exercise (basic application) through Week 7 lab (troubleshooting) to final project (innovation)”

Why it works: AI can’t fake a learning trajectory across multiple timepoints with increasing sophistication. The progression must be authentic.

Implementation: Design scaffolded assignments where each builds on previous work. Rubric rewards visible growth, not just final achievement.


Language equity and accessibility

Different rhetorical traditions value different argumentation styles. Western linear logic isn’t the only valid approach. Consider accepting:

    • Narrative evidence alongside empirical data
    • Collective voice alongside individual argumentation
    • Various definitions of “professional presentation” across cultural contexts

Prioritize idea development over grammatical perfection unless mechanics are a specific learning outcome. Consider weighted criteria that value thinking over surface features:

    • Content Understanding (70%): Demonstrates grasp of concepts through clear explanation
    • Communication Effectiveness (20%): Organizes ideas so readers can follow reasoning
    • Mechanics (10%): Uses conventions appropriate for context
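As a quick illustration, a weighting scheme like the one above reduces to a simple weighted average of criterion scores. This sketch is hypothetical (the function name and the 0–4 scoring scale are assumptions, not part of any particular grading system):

```python
# Hypothetical sketch: combine criterion scores (here on a 0-4 scale)
# into one grade using the 70/20/10 weighting described above.
def weighted_rubric_score(scores, weights):
    """scores and weights are dicts keyed by criterion name; weights sum to 1."""
    assert abs(sum(weights.values()) - 1.0) < 1e-9, "weights must sum to 1"
    return sum(weights[c] * scores[c] for c in weights)

weights = {"content": 0.70, "communication": 0.20, "mechanics": 0.10}
scores = {"content": 3.5, "communication": 3.0, "mechanics": 2.0}
print(round(weighted_rubric_score(scores, weights), 2))  # 3.25
```

Note how the heavy content weight dominates: weak mechanics (2.0) barely moves a grade driven by strong conceptual work, which is the point of weighting thinking over surface features.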

Design for linguistic diversity by recognizing that multilingual students bring valuable perspectives. A student explaining thermodynamics using translingual practices (code-switching between languages for precision) demonstrates sophisticated thinking that shouldn’t be penalized.

Ensure universal access by using plain language, defining disciplinary terms, and checking compatibility with screen readers. Test your rubric: Could a student using assistive technology understand and meet these criteria?


Disciplinary conventions

The same skill manifests differently across fields. For example:

| Skill | Sciences | Humanities | Professional Fields |
| --- | --- | --- | --- |
| Evidence use | “Cites specific data with error margins” | “Contextualizes sources within historical debates” | “Applies industry benchmarks to case analysis” |
| Argumentation | “Eliminates alternative hypotheses systematically” | “Acknowledges interpretive tensions” | “Weighs stakeholder perspectives” |


Variations by assignment type

Different assignment formats require distinct rubric criteria that align with their specific learning goals and modes of expression. This section provides targeted rubric considerations for common assignment types, designed to motivate deep engagement by emphasizing ownership, relevance, relationships, and appropriate challenge.

Research Papers/Essays

  • Personal inquiry development: Student’s own research question emerges from their interests/experiences
  • Synthesis with course dialogue: Integration of class discussions, peer feedback, and evolving understanding
  • Evidence evaluation: Critical assessment of sources’ relevance to their specific argument
  • Reflective component: How this research changes their thinking or future goals

Presentations

  • Audience-specific design: Adaptation for actual classmates’ interests and backgrounds
  • Interactive engagement: Facilitation of meaningful dialogue, not just information delivery
  • Personal stake: Clear connection to presenter’s goals, experiences, or future applications
  • Responsive thinking: Authentic engagement with questions showing real-time processing

Group Projects

  • Negotiated roles: How members used individual strengths and interests
  • Conflict navigation: Evidence of working through disagreements productively
  • Collective meaning-making: Integration that goes beyond dividing tasks
  • Process documentation: Reflection on group dynamics and individual growth

Creative Works

  • Personal vision: How the work expresses individual perspective or cultural identity
  • Risk documentation: Evidence of pushing beyond comfort zone with instructor support
  • Iterative development: Response to feedback from peers and instructor
  • Contextualized reflection: How creating this work shaped understanding of course concepts

 

return to Designing Effective Rubrics