
Copilot

Clinical Definition

The recommended mental model for using LLMs in clinical work: the AI organizes, structures, and drafts, but you make every clinical decision. Think of it like a well-prepared student clinician. They can pull together background information, format a note, and suggest phrasing, but they don't determine the diagnosis, set the goals, or sign the report. You review everything. You change what needs changing. Your name is on it.

Technical Definition

A human-in-the-loop AI assistance paradigm where the model's role is constrained to support tasks while a human retains decision authority. In copilot architectures, the AI generates suggestions or drafts that require explicit human review and approval before taking effect. This contrasts with autonomous agent architectures where the model acts independently.
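The approval constraint described above (the model drafts, but nothing takes effect until a human reviews and explicitly approves) can be sketched in a few lines of Python. This is a minimal illustration of the pattern, not a real API; every name here (`DraftNote`, `generate_draft`, `review_gate`, `commit_to_record`) is hypothetical.

```python
from dataclasses import dataclass

@dataclass
class DraftNote:
    """A draft produced by the AI; inert until a human approves it."""
    text: str
    approved: bool = False

def generate_draft(prompt: str) -> DraftNote:
    # Stand-in for an LLM call. The model's role ends at producing a draft.
    return DraftNote(text=f"[SOAP skeleton for: {prompt}]")

def review_gate(draft: DraftNote, clinician_edit) -> DraftNote:
    # The human edits the draft and explicitly approves it.
    draft.text = clinician_edit(draft.text)
    draft.approved = True
    return draft

def commit_to_record(draft: DraftNote) -> str:
    # The architectural guarantee: unreviewed output cannot take effect.
    if not draft.approved:
        raise PermissionError("Unreviewed AI output cannot be committed.")
    return draft.text

note = generate_draft("artic session, /r/ at word level")
note = review_gate(note, clinician_edit=lambda t: t + " Clinician observations added.")
record = commit_to_record(note)
```

The point of the sketch is the `PermissionError` path: in a copilot architecture the gate is structural, not a matter of user discipline, which is what distinguishes it from an autonomous agent that writes to the record directly.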

Also known as: AI assistant, human-in-the-loop, AI copilot, augmented intelligence

Why SLPs Need to Know This

The copilot framing answers the biggest anxiety SLPs have about AI: “Is it replacing me?” No. It is doing the parts of your job that don’t require your degree (formatting, organizing, first-drafting). You are doing the parts that do (observing, reasoning, deciding, judging). The risk comes when the boundary blurs: when you start accepting AI output without review because you’re tired and the caseload is 80 students deep. The copilot model only works if you actually fly the plane.

Practical Guide

  1. Never submit AI output without reading every word. If your name is on it, you’ve asserted that it’s accurate.
  2. Use AI for structure, yourself for substance. Let the model set up the SOAP format; you fill in the clinical observations.
  3. Treat AI suggestions as hypotheses. A draft goal is a starting point, not a recommendation.
  4. Set a personal rule for review depth. Decide in advance how carefully you will review AI output, and hold to it, especially on heavy caseload days.
  5. Remember scope of practice. AI has no scope of practice, no license, and no liability. You have all three.

The Clinical Analogy

A copilot is a student clinician with infinite stamina and zero clinical judgment. They’ll prep materials, write draft notes at 2 AM, and never complain, but they’ll also miss a tongue thrust, write a goal for the wrong client, or recommend a treatment they read about once in a blog post. You supervise. You correct. You sign.

  • Clinical Voice: the copilot model preserves your voice because you remain the author
  • Informed Consent: even in a copilot model, clients should know AI is part of the workflow
  • Hallucination: the copilot model’s value depends entirely on your ability to catch errors in the AI’s output
