Lesson Plan: How AI Is Changing Email—Teach Students to Evaluate AI-Generated Writing
Turn Gmail’s 2026 AI features into hands-on lessons: evaluate and improve human vs. AI emails for clarity, persuasion, and ethics.
Hook: Turn the anxiety about AI-written email into a classroom win
Teachers and students are drowning in drafts they didn’t write, inboxes that summarize messages for them, and marketing copy that can feel soulless. If you’ve wondered how to teach digital literacy, ethics, and persuasive writing in the age of AI — fast — this lesson plan turns current Gmail updates and real-world email marketing trends into hands-on classroom activities. By the end of this unit students will be able to evaluate human vs. AI email drafts for clarity, persuasion, and ethical transparency.
Why this matters in 2026
In late 2025 Google rolled Gmail into the Gemini 3 era, embedding features like AI Overviews and stronger generative tools directly into the inbox. With roughly 3 billion Gmail users worldwide, that means AI isn’t just an optional tool — it’s shaping how most people read and respond to email. At the same time, marketing professionals are warning about AI slop — low-quality, generic content that damages trust and hurts engagement. Evidence from the field suggests AI-sounding language can lower open and conversion rates, so marketers and educators alike must teach how to spot and improve AI-generated writing.
Learning objectives
- Students will compare and contrast human and AI email drafts using measurable criteria.
- Students will score emails for clarity, persuasion, audience fit, and ethical transparency.
- Students will revise AI drafts to improve readability and authenticity.
- Students will debate the ethical implications of using AI in mass communication and email marketing.
Skills students practice
- Critical reading and textual analysis
- Rhetorical analysis and persuasive strategy
- Digital literacy and AI detection
- Peer review and evidence-based revision
- Ethical reasoning and public communication
Overview: 3-lesson unit (can be condensed to 1 long class)
- Lesson 1 — Intro & baseline: What AI in Gmail means for readers. Short practice comparing drafts.
- Lesson 2 — Deep evaluation and peer review: Apply a rubric to blind human vs. AI emails.
- Lesson 3 — Revision lab and ethics debate: Improve AI drafts and hold a structured ethics discussion.
Time estimates
- Three 45–60 minute lessons (recommended)
- One extended 90–120 minute workshop (condensed alternative)
Materials & tech
- Set of paired email drafts (human and AI versions) — sample bank included below.
- Rubric and peer review sheet (printable)
- Access to Gmail (to demo AI Overviews) and one AI text generator (teacher-controlled) for examples.
- Readability tool (e.g., Hemingway, Flesch–Kincaid) or built-in reader stats.
- Projector or shared screen for group discussion.
Preparation: sourcing and creating authentic pairs
Authenticity matters. Create pairs that vary by audience (students, parents, donors, small business customers) and by intent (inform, persuade, remind, upsell). Use the following process:
- Collect real sample emails from volunteer educators or publicly available marketing examples.
- Generate an AI version of each email with a consistent prompt. Note the prompt and model (e.g., Gemini 3 or classroom-safe tool).
- Edit both drafts lightly so they are the same length; ensure the only major difference is voice and structure.
Sample paired emails (classroom-ready)
Below are short, classroom-friendly examples teachers can paste into worksheets.
Pair A: Volunteer request (school newsletter)
Human draft: "Hi Ms. Rivera — I hope you are well. Our spring book fair is next month and we could use three volunteers for setup on Friday, May 8. If you’re available, reply ‘Yes’ and I’ll send the schedule. Thank you — your time helps our students choose books they love."
AI draft: "Hello valued community member, The Spring Book Fair is scheduled for May 8. We require volunteers. Reply to confirm availability. Thank you for your support."
Pair B: Marketing promotion (local business)
Human draft: "Hi Jordan — We’re hosting a free pastry tasting this Thursday 4–6 pm. Pop in for a sample and a 10% ‘first-visit’ coupon. No RSVP needed — hope to see you! — Ana, Bakery Manager"
AI draft: "Subject: Free pastry tasting this Thursday. Join us this Thursday for a free pastry tasting from 4–6 pm. Receive a 10% coupon for your first visit. Don’t miss out."
Lesson 1: Baseline & warm-up (45 mins)
Hook (10 mins)
Begin with a short demo of Gmail’s AI Overviews (or a screenshot). Ask: "If an AI tells you the gist of a message, what could you miss?" Collect quick responses.
Activity: Quick compare (25 mins)
- Distribute 2 paired emails (A & B). Students read each pair silently.
- Students rate each draft for clarity and friendliness on a 1–10 scale.
- In pairs, students justify their top and bottom picks with 2–3 textual reasons (word choice, specificity, call-to-action).
Wrap-up (10 mins)
Collect 3 representative reasons that students used to judge drafts. Save those insights for the rubric discussion in Lesson 2.
Lesson 2: Rubric-based blind review (60 mins)
Introduce a clear rubric (10 mins)
Use a simple 4-criteria rubric: Clarity (0–5), Persuasion/Call to action (0–5), Audience fit (0–5), and Ethical transparency (0–5). Explain each criterion and provide anchor examples.
Blind review activity (35 mins)
- Distribute anonymized emails (remove headers and signatures so students don’t know which is AI).
- Students score each email independently and add textual evidence (direct quotes and specific word choices).
- After scoring, students discuss in groups and reach a consensus score and label which version they think is AI or human.
Data discussion & short reflection (15 mins)
Collect group verdicts and reveal which drafts were AI. Discuss where students were surprised and why. Introduce the concept of AI slop — generic wording and missing structure — and connect to real-world email marketing impacts (lower engagement, trust).
Sample rubric (print or digital)
- Clarity: 0 (confusing) — 5 (crystal clear)
- Persuasion: 0 (no clear action) — 5 (compelling CTA and benefits)
- Audience fit: 0 (mismatched tone) — 5 (tailored voice and details)
- Ethical transparency: 0 (misleading/omits key facts) — 5 (clear, honest, respects privacy)
Add short evidence fields like: "Quote that shows clarity:" and "Why this felt AI/human:"
Lesson 3: Revision lab + ethics debate (45–60 mins)
Revision lab (25–35 mins)
- Assign groups an AI draft they previously scored poorly.
- Each group rewrites the email to improve scores. Use explicit tactics: add specificity, humanizing details, audience references, and a clear CTA.
- Optional: measure readability scores before and after to document improvement.
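The optional before/after readability check doesn't require an external tool. Below is a minimal Python sketch of the Flesch Reading Ease formula (the metric behind many classroom readability checkers), using a rough vowel-group heuristic for syllable counting; real tools use dictionary lookups, so treat these scores as relative, not absolute:

```python
import re

def count_syllables(word: str) -> int:
    # Rough heuristic: one syllable per group of consecutive vowels.
    groups = re.findall(r"[aeiouy]+", word.lower())
    return max(1, len(groups))

def flesch_reading_ease(text: str) -> float:
    # Flesch Reading Ease: higher = easier to read (60-70 is plain English).
    sentences = max(1, len(re.findall(r"[.!?]+", text)))
    words = re.findall(r"[A-Za-z']+", text)
    n_words = max(1, len(words))
    syllables = sum(count_syllables(w) for w in words)
    return 206.835 - 1.015 * (n_words / sentences) - 84.6 * (syllables / n_words)
```

Have groups run their AI draft and their revision through the same function and report the change, so every group's numbers come from an identical measurement.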
Ethics debate (20–25 mins)
Frame a structured debate: "Should marketers disclose when an email is AI-written?" Assign pro/con roles. Encourage students to cite user trust, legal transparency, and practical concerns (efficiency vs. authenticity). Connect to real-world policy discussion and recent trends in 2025–26 around disclosure and labeling.
Assessment & evidence of learning
Assess students on rubric score improvement, quality of revisions, and participation in the ethics debate. A summative assessment could be a portfolio that includes:
- Original scores and rationale for a pair
- Revised email and a 150–300 word reflection on revision choices
- Short position statement from the debate
Differentiation and accessibility
- Provide leveled email pairs (simpler language for younger students, complex marketing copy for advanced classes).
- Allow students with reading challenges to use text-to-speech for emails.
- Offer sentence starters for peer review and debate to scaffold participation.
Tools and prompts — practical teacher cheatsheet
When generating AI drafts for class, use consistent prompts. Example prompt to produce an AI email you can critique:
"Write a concise email from a local bakery inviting neighborhood residents to a free pastry tasting this Thursday 4–6 pm. Keep it professional, use 3–4 sentences, include a simple call to action, and do not include personalization."
To get varied AI voices, change these parameters:
- Audience specificity (parents vs. donors)
- Tone (friendly, formal, urgent)
- Level of detail (vague vs. specific times, names, benefits)
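One way to keep prompts consistent while varying a single parameter is a small template function. This is a hypothetical sketch (the function name and wording are illustrative, not tied to any specific AI tool); changing one argument at a time lets students isolate exactly one variable per comparison:

```python
def build_prompt(audience: str, tone: str, detail: str) -> str:
    # Hypothetical classroom prompt template: hold everything constant
    # except the one parameter you want students to compare.
    return (
        f"Write a concise email for {audience} about a free pastry tasting "
        f"this Thursday 4-6 pm. Use a {tone} tone and a {detail} level of "
        "detail, in 3-4 sentences, with a simple call to action."
    )

# A small prompt bank that varies only tone.
prompts = [build_prompt("neighborhood parents", tone, "specific")
           for tone in ("friendly", "formal", "urgent")]
```

Record each prompt alongside the draft it produced; that log becomes evidence during the blind-review reveal.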
Quick teacher tips to avoid pitfalls
- Always label AI-generated drafts for internal records; don’t disclose to students which is which until after analysis to keep blind review valid.
- Ensure student safety: sanitize any real contact info and avoid sending live marketing emails from class accounts.
- Model respectful critique—students should focus on text features, not the person who wrote it.
Real classroom case study (example)
At Lincoln High (fictional), a 10th-grade English teacher ran this unit in January 2026. Pre-unit, 78% of students believed they could always tell AI from human writing. After three lessons, only 42% were confident — but their actual detection accuracy rose from 55% to 81% based on blind tests. Crucially, students learned to improve AI drafts: average rubric scores rose 2.1 points per email after revisions. The teacher reported richer ethical conversations, especially after showing how AI Overviews can omit nuance or subjective tone.
Connections to standards and broader learning goals
This unit maps to common learning goals in many curricula: rhetorical analysis, evidence-based writing, digital citizenship, and media literacy. It also prepares students for careers where evaluating AI outputs is a necessary workplace skill.
Extensions and interdisciplinary hooks
- Computer science: Have students write prompts and measure differences across models (Gemini 3 vs. other APIs).
- Marketing/economics: Run A/B tests (classroom-scale) and analyze engagement metrics.
- History/ethics: Compare historical persuasion techniques to modern AI-enabled messaging.
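For the classroom-scale A/B extension above, engagement can be compared with simple proportions before introducing any statistics. A minimal sketch with hypothetical numbers (the counts are invented for illustration):

```python
def open_rate(opens: int, sent: int) -> float:
    # Fraction of sent emails that were opened; 0.0 if nothing was sent.
    return opens / sent if sent else 0.0

# Hypothetical classroom A/B test: human-voiced vs. generic AI draft.
rate_human = open_rate(18, 30)
rate_ai = open_rate(11, 30)
lift = rate_human - rate_ai  # positive = human draft performed better
```

With samples this small the difference is suggestive, not conclusive; that limitation is itself a good discussion point for the marketing/economics tie-in.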
Addressing the ethics of AI in email
Important ethical questions include: Should recipients be told content was AI-generated? Who is accountable for misleading claims in automated emails? How do privacy and consent apply when AI summarizes or rewrites private messages? Use the debate to surface these issues, and connect them to recent policy conversations in 2025–2026 where transparency and human oversight are central themes.
Final assessment rubric (teacher-ready)
Combine rubric scores with qualitative evidence. A simple scoring guide:
- 16–20 points: Exceeds expectations — clear revision and ethical understanding
- 11–15 points: Meets expectations — solid analysis and revision
- 6–10 points: Approaching — needs stronger evidence or clearer revisions
- 0–5 points: Beginning — incomplete or missing key analysis
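For teachers tallying results in a script or spreadsheet export, the scoring guide above maps directly to a small helper. A sketch assuming the four-criteria, 0–5-each rubric from Lesson 2 (maximum 20 points):

```python
def score_band(total: int) -> str:
    # Map a 0-20 rubric total (four criteria, 0-5 each) to the bands
    # in the final assessment scoring guide.
    if total >= 16:
        return "Exceeds expectations"
    if total >= 11:
        return "Meets expectations"
    if total >= 6:
        return "Approaching"
    return "Beginning"
```

Pair the band with the qualitative evidence fields ("Quote that shows clarity:") so the number never stands alone.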
Closing: What students take away
After this unit students will not only spot generic or manipulative AI email copy, they’ll know how to improve it and think critically about the ethics of automation. They’ll have practical tools — rubrics, revision strategies, and debate skills — that apply beyond the inbox.
Call to action
Ready to bring this lesson to your classroom? Download the printable rubrics, sample paired emails, and editable slides at classroom.top/ai-email-lesson (teacher resources). Try a one-day workshop this week: demo Gmail’s AI Overviews, run a blind review, and lead a 10-minute ethics snapshot. Share your classroom results with us — we’ll feature standout lessons and student projects in our educator community.