From Theory to Practice: Designing Assignments in an AI World

Generative AI is rapidly becoming part of how students learn, and many instructors are exploring new ways to assess learning in this evolving landscape. But how do we design AI-aware assignments that meaningfully support student learning?

This article presents a practical traffic light framework for designing assignments in an AI-aware classroom. This system helps instructors make intentional choices about when AI might hinder learning and when it can be leveraged to support it.

Below are practical ideas you can easily adapt to your courses. Like a traffic light, the framework guides intentional decision-making about how AI will affect student learning: red light assignments prohibit AI use, yellow light assignments signal moments where learning needs protection, and green light assignments invite AI use that fosters deeper thinking and professional judgment.

Red Light Assignments

These are assignments where students should not use generative AI at any point in the process, from brainstorming to drafting and editing to final presentation. But it’s not enough to simply ban the use of AI on an assignment; instructors need to engage in a dialogue with students about why AI would hinder learning. Clearly articulating the learning goals of an assignment, why doing things “the hard way” matters, and the development that would be missed by using AI helps students understand the red-light designation.

Yellow Light Assignments

Yellow light assignments are designed to protect key learning moments when generative AI may interfere with students’ development of essential learning skills.

Importantly, a yellow light approach does not aim to make assignments AI-proof. Instead, these designs encourage students to slow down, pay attention, and proceed with caution and intention, ensuring that AI tools provide limited value unless students have engaged meaningfully with the course's concepts and learning objectives.

Common Yellow Light Design Strategies

Emphasizing the Learning Process over a Final Product

  • Submit your final answer along with a short decision log describing your initial approach, two alternative approaches you considered, and why you rejected them.
  • Solve the problem and annotate each step to explain why you chose that method and how it connects to course concepts.

Grounding Work in Course‑Specific Context

  • Choose one argument raised during a class discussion and explain how it influenced or challenged your thinking.
  • Apply one idea from our guest speaker’s talk about a case or problem of your choice.
  • Using the dataset your lab section collected, analyze one unexpected pattern and explain what might account for it.

Introducing Meaningful Constraints Such as Limiting Methods, Tools, or Audiences

  • Solve this problem using only techniques introduced before Week 4.
  • Explain this concept without using equations, code, or technical jargon.
  • Write this explanation for a non‑expert audience.
Requiring Comparison and Justification

Students evaluate multiple approaches and explain their choices.

  • Explain why Solution A works under Condition X but fails under Condition Y.

  • This dataset contains missing and inconsistent values.
    • What assumptions must you make to proceed, and how do those assumptions affect your conclusions?
    • Identify at least two pieces of information you would need to feel confident in your conclusions and explain why.
  • You are given sources that point to different conclusions.
    • How do you reconcile those differences?

Incorporating a Low‑Stakes Oral Explanation of the Task

  • Record a two- to three-minute video explaining how you approached this assignment, where you struggled, and what you did to move forward.
  • Describe your approach to a partner, then revise your explanation based on their questions.
  • Briefly discuss one decision you made that had the biggest impact on your results.

Why Yellow Light Design Strategies Work

Across all these examples, the common thread is that student learning is visible. Students must demonstrate reasoning, judgment, and engagement with course material, making AI shortcuts far less useful.

Green Light Assignments

Green light assignments help students make AI use visible, deliberate, and accountable.

Common Green Light Strategies

AI as a Thought Partner

Generate options, then evaluate them using course criteria. 

  • Use an AI tool to generate three possible approaches to this problem. Evaluate each using criteria from the course and explain which approach you chose and why.
  • Ask an AI tool to propose three research questions related to this topic. Select one, revise it, and explain why it best aligns with the course’s learning goals.
  • Use AI to suggest multiple design solutions. Critique the feasibility, risks, and assumptions of each before selecting one to develop further.

Prompting as a Learning Outcome

Revise prompts and reflect on improvements. 

  • Students submit an initial prompt, a revised prompt, and a short reflection explaining how changes to the prompt improved the output.
  • Revise your prompt to better reflect disciplinary conventions (e.g., tone, evidence, assumptions). Explain what you changed and why.
  • Critique and selectively accept AI suggestions.

AI Audit and Verification

Identify errors, bias, or weak reasoning. 

  • Ask an AI tool to summarize an argument or position related to a topic or debate in the course. Identify any implicit assumptions, bias, or missing perspectives in the response.
  • Identify at least two claims in the AI output that could be incorrect, incomplete, or misleading.
  • Explain why this reasoning would be problematic in a real‑world or disciplinary context.

Discipline‑Specific Judgment

Evaluate AI output through a professional lens. 

  • Use an AI tool to propose assumptions for this model. Identify which assumptions would be unacceptable or risky in professional practice. Explain your reasoning using standards or concepts from this course.
  • Ask an AI tool to generate an explanation of (topic of your course). Evaluate the response through a social‑science lens by identifying any missing variables, oversimplifications, or theoretical misalignments. Explain how these limitations affect the explanation’s validity.

AI Transparency Statements

Explain how AI was used and what was verified. 

  • In your AI use statement, identify one specific AI‑generated claim or suggestion you verified using course materials or an external source. Explain what you found.
  • Briefly describe a moment when you disagreed with the AI output and how you resolved that disagreement.

Why Green Light Strategies Work

In a green light framework, AI use is expected, but students’ judgment is what is being assessed. The objective is not simply to produce an answer but to question, verify, and improve an output. It’s not about preventing cheating, but about creating an environment similar to what students will face after graduation.

While AI challenges many traditional assignment strategies, it also offers a moment for reflection, prompting us to consider what a student can bring to their learning experience that a machine cannot. By intentionally inserting yellow light moments throughout a course, instructors can safeguard the effort of thinking, and by pairing them with green light opportunities for real-world practice, they can shift the focus to student engagement.