Clinician Learning Brief

Communication Has Entered the Skills Lab

Topics: Communication skills, Learning design, Outcomes planning
Coverage: 2025-02-24–2025-03-02

Abstract

This week’s clinician-learning signal is more specific than a generic call for empathy: communication is being treated as a skill to practice and assess, while content quality is also being judged by how clearly it teaches.

Key Takeaways

  • Communication is being framed less as professional demeanor and more as a trainable, observable clinical skill.
  • That raises the bar for CME design: communication activities need rehearsal, feedback, and outcomes plans tied to performance, not just awareness.
  • A secondary signal is that clinicians and educators are also judging content quality through teachability and usability, not information volume alone.

Communication education is being framed less as something clinicians absorb by example and more as something they should practice, review, and assess. The evidence this week is cross-specialty but still source-limited, so the right read is a recurring pattern with clear design implications, not a claim of universal adoption.

Communication is moving from virtue to competency

Across this week’s sources, communication was discussed in concrete, trainable terms: difficult conversations can be improved with coaching, shared decision-making can be practiced with feedback, and language-appropriate communication can be taught and assessed. That framing appears in a JAMA discussion, an oncology conversation describing video review and feedback for hard conversations (Kidney Cancer Unfiltered), and a pulmonology discussion that emphasized teaching patients how to use action plans rather than simply handing them over (Keeping Current CME).

For CME providers, the implication is straightforward: lecture-only communication content is harder to justify when the skill itself is being framed as observable performance. If the objective is better counseling, shared decisions, or clearer patient instructions, the activity likely needs rehearsal, feedback, or some form of guided review.

This is distinct from generic empathy programming. It points toward communication as a clinical skill that benefits from scenarios, faculty observation, and outcomes measures tied to specific tasks. CME teams should ask where their current communication portfolio still teaches principles without giving learners a chance to demonstrate them.

Clinicians are also screening for whether content actually teaches

A second, narrower pattern this week is that educational quality is being judged through visible teachability. In an RSNA review, strong education exhibits were described as timely, image-rich, and easy to absorb rather than dense research summaries. A conference-style oncology program on YouTube also foregrounded interactivity and downloadable materials (Medscape), while a provider-owned CME example highlighted practice aids alongside the teaching itself (PeerView).

This is not broad proof of an all-specialty consensus, and part of the support here comes from provider-owned educational content. Still, it is a useful editorial cue. Clinicians and educators are not treating “good science update” and “good teaching” as the same thing. Visual clarity, digestibility, and usable support tools are part of how educational quality is being recognized.

That lands less as a new packaging thesis than as a faculty and content-standard issue. As an earlier brief on why dense education fails at first glance argued, content that is accurate but hard to absorb can still miss the mark. The operator question is whether review workflows explicitly test for teachability: can a busy clinician grasp the point quickly, apply it in practice, and return to a visual or aid later without re-consuming the full session?

What CME Providers Should Do Now

  • Audit communication activities for where they rely on expert explanation without rehearsal, observation, or feedback.
  • Update faculty and instructional-design briefs so communication objectives are written as specific skills to demonstrate and measure, not broad intentions to improve bedside manner.
  • Add a simple teachability check to content review: visual clarity, slide density, practical takeaways, and at least one usable scaffold such as a scenario, heuristic, or aid.

Watchlist

  • New-therapy education may need to cover team learning curves, including prophylaxis, supportive care routines, and dose-adjustment know-how, not just efficacy and safety facts. For now this remains a single-source, oncology-heavy watch item anchored in an oncology discussion of ADC toxicity management.
  • Legal and safety discussions around apology, regret, and adverse outcomes may turn into clearer demand for disclosure training. The current evidence is still policy-heavy, but a JAMA legal discussion is worth watching alongside this week’s broader communication-competency framing.

Turn learner questions into outcomes data

ChatCME surfaces the questions clinicians actually ask — so you can build activities that close real knowledge gaps.

Request a demo