University & Supervision
Clinical education, CF mentoring, student feedback, and academic integrity in the age of AI.
Clinical supervisors, CF mentors, and university clinic directors face a distinct challenge: preparing the next generation of SLPs to use AI responsibly while upholding academic integrity and fostering independent clinical thinking.
Key Considerations
- Students may over-rely on LLMs before developing foundational clinical reasoning
- The line between “AI-assisted” and “AI-dependent” is critical during training
- Academic integrity policies may or may not address AI use in clinical documentation
- CFs need to develop their own clinical voice, not an AI-polished version of one
Red Flags Specific to This Setting
- Students submitting AI-generated reports as their own clinical work
- Using AI to bypass the learning process (e.g., writing goals without understanding the rationale behind them)
- CFs relying on AI for clinical decisions that should develop through mentored practice
- Supervisors unable to distinguish student clinical thinking from AI output
Where LLMs Can Help
- Teaching prompt engineering as a clinical skill
- Demonstrating the difference between AI-generated and clinician-authored documentation
- Using AI output as a teaching tool (e.g., "critique this AI-generated goal")
- Scaffolding clinical writing (use AI to organize a draft, then revise it into your own voice)
- Modeling responsible AI disclosure in clinical settings
ASHA Practice Portal Alignment
This content aligns with guidance from the following ASHA Practice Portal topics. Always consult the portal for the most current clinical standards.