
Welcome — this lesson zooms in on where AI meets sexuality education, social‑emotional learning (SEL) and health literacy. We’ll look at the real opportunities AI can bring into classrooms and products for young people, and we’ll call out the real risks — bias, exclusion, over‑filtering and misrepresentation. Most importantly, you’ll get practical ways to design, evaluate and govern tools so they’re inclusive, accountable and centered on youth and educators.
This lesson is for educators, designers, and policy makers who want to move past buzzwords and into concrete practice: no advanced AI degree required — just curiosity, care, and a commitment to young people’s well‑being.
What you’ll get out of this lesson
- Understand concrete ways AI can support sexuality education, SEL and health literacy.
- Spot common harms and trade‑offs (bias, privacy, over‑moderation, misinformation).
- Learn participatory design approaches that include young people and educators.
- Know the basics of accountable governance and ethical tool development.
- See example tools and expert reactions in a short demo.
- Practice an inclusive design checklist you can adapt and reuse.
How this lesson is structured
- Short readings and prompts to ground the ideas.
- A demo showing example tools and expert responses.
- A hands‑on activity: an inclusive design checklist you’ll use to evaluate or prototype a tool.
- Reflection questions you can use alone or with colleagues/students.
Topics at a glance
- Opportunities in sexuality education, SEL and health literacy — practical examples of how AI (e.g., personalized learning, chat assistants, analytics) can support understanding, engagement and access to trusted information.
- Risks: bias, exclusion, misrepresentation and over‑filtering — how models and systems can unintentionally harm young people through stereotyping, censoring valid content, amplifying misinformation, or excluding marginalized voices.
- Inclusive, participatory and co‑design approaches with youth and educators — methods for involving learners and teachers in design and testing so tools actually meet their needs and respect their rights.
- Accountability, governance and ethical tool development — practical governance patterns: documentation, privacy safeguards, redress mechanisms, and when to say “no” to a tool or feature.
- Demo: example tools and expert reactions — short walkthroughs of sample tools used for sexual health, SEL or literacy, with quick reactions from educators and subject experts.
- Activity: inclusive design checklist — a ready‑to‑use checklist to assess accessibility, safety, cultural relevance, bias risk, privacy and governance, plus prompts to adapt it for your context.
How to use this lesson
- If you’re an individual learner: follow the topics in order, try the checklist, and reflect on how AI shows up in your setting.
- If you’re a facilitator or PD leader: use the demo and checklist as discussion prompts; run the activity in small groups and capture emergent design criteria.
- If you’re a policy maker: focus on the risk and governance topics, and consider using the checklist as a starting point for procurement criteria or guidance.
Time estimate & materials
- Estimated time: 60–90 minutes (can be split across sessions).
- You’ll need: internet access, a device for demos, and a copy of the checklist (editable doc recommended). If running co‑design activities with youth, bring informed‑consent templates and follow safeguarding protocols.
Before we start
This lesson deals with sensitive topics (sexuality, health, emotions) and with technologies that affect minors. Set clear ground rules, protect privacy, and follow local laws and safeguarding policies when involving young people. If you’re unsure, pause and consult your institution’s safeguarding or legal lead.
Ready? Let’s explore how to make AI tools that actually help young people thrive — thoughtfully, safely and with their voices front and center.
