Clinician Learning Brief

Can Learning Change the Workplace?

Topics: Workflow-based education, Learning design, Outcomes planning
Coverage: clinician conversation observed March 17–23, 2025

Abstract

A narrow but important CPD argument: education may increasingly be judged by whether it helps change practice, not just by whether it delivers updates.

Key Takeaways

  • An emerging CPD argument says education should be built around care delivery, team function, and skill maintenance within the practice setting, not just knowledge transfer.
  • Format choice is being framed as audience- and culture-dependent, with feedback loops mattering more than loyalty to any one teaching tactic.
  • For CME providers, the near-term implication is to test where activities need implementation support, local adaptation, and outcomes plans tied to practice-process change.

The clearest idea this week is a challenge to how CME defines value: if education is supposed to improve care, content updates alone may not be enough. The evidence is narrow and comes mainly from educator and CPD voices rather than broad clinician conversation, so this is best treated as an emerging direction, not settled market consensus.

Education is being judged against care delivery

A CPD-focused discussion argued that continuing education still leans too heavily on individual knowledge transfer, even though care is delivered by teams inside specific systems and settings. The case was that education should help clinicians apply new information where they work, maintain capabilities over time, and translate learning into team and process changes—not just attend a lecture on the latest update (source).

This is a single-source CPD-insider argument, so it should not be overstated as broad market consensus. But the strategic implication for CME providers is real. If education leaders or enterprise buyers start asking whether an activity can help change care delivery, lecture-first planning may look incomplete. This connects to our earlier brief on value being judged by what happens after the activity, but the emphasis here is narrower: workplace change, not just stronger follow-through.

The decision for CME teams is practical: which planned activities actually need workplace rehearsal, team participation, or follow-through support to affect practice?

Stop treating format as a universal answer

A separate education conversation, this time from radiology, made a simpler point: no format works everywhere. What landed best depended on learner level, local culture, and whether the setting felt safe enough for people to participate honestly. The speaker also stressed active feedback and iteration over allegiance to any single engagement method (source).

This too is narrow, single-source evidence and should not be generalized too aggressively across specialties. Still, the lesson is useful beyond radiology. Standardizing one “high-engagement” format across audiences can miss what actually makes an activity work in a given setting. In practice, segmentation and lightweight feedback may matter more than picking the most fashionable modality.

The decision for providers: where are format choices being made by habit, and where should local audience evidence carry more weight?

What CME Providers Should Do Now

  • Audit upcoming activities to identify which ones depend mainly on content delivery and which ones require implementation support, team learning, or skills rehearsal.
  • Add a simple post-activity feedback loop that captures context-specific usefulness by audience segment, not just overall satisfaction scores.
  • Review one outcomes plan this quarter and test whether it measures any care-process or practice-change effect rather than stopping at knowledge gain alone.

Watchlist

  • For procedural education, small-group hands-on training with strong faculty access looks worth watching. Current evidence is still too specialty-bound to generalize, but a urology course discussion made a clear case that in-person time is justified when it centers deliberate practice and case-based transfer back to practice (source).

Turn learner questions into outcomes data

ChatCME surfaces the questions clinicians actually ask — so you can build activities that close real knowledge gaps.

Request a demo