Earlier coverage of accreditation operations and its implications for CME providers.
Clinicians in radiation oncology and cardiology are moving from high-stakes MOC exams toward longitudinal assessments that fit their workflow, remain private, and supply immediate references plus gap data. Educator conversations this week made a parallel point about faculty development: sustained identity formation and deliberate practice outperform isolated workshops.
The clearest signal came from maintenance of certification. In a radiation oncology discussion of ABR board certification, clinicians contrasted the old high-stakes MOC exam with Online Longitudinal Assessment: weekly questions, private completion, immediate rationale and references, and a structure that can be both formative and summative. The same discussion noted that ABR’s OLA data are being used as public gap information for CME development, including areas such as treatment planning, bioethics, evidence interpretation, and patterns of failure (ACRO podcast on ABR certification).
That extends an earlier brief on MOC frustrations driving micro-CME and new certifying boards, but the issue is sharper now. The complaint is no longer only that certification is burdensome. It is that clinicians can now point to formats that feel closer to how they actually keep current: distributed, lower-friction, reference-supported, and linked to observable gaps.
The NBPAS discussion adds the alternative-board angle, though it should be read as provider-owned policy commentary rather than broad clinician consensus. In that conversation, accredited specialty CME is positioned as the core mechanism for staying current, with a pathway built around recent CME documentation rather than exam performance (Healthcare Unfiltered on NBPAS).
For CME providers, the implication is not simply “offer more MOC credit.” It is to treat certification as a learning workflow. If longitudinal assessments generate gap data and clinicians prefer low-friction proof of learning, CME portfolios need cleaner links among assessment, credit, documentation, and follow-up education. The question for teams: which activities could be personalized from board-level gap data without asking clinicians to do another disconnected task?
A second, narrower signal came from academic educator discussion of faculty development as professional identity formation. The conversation focused on whether “faculty developer” is a distinct role, what expertise it requires, and how people move from delivering sessions to belonging to a community of practice. Participants debated content expertise, process expertise, needs assessment, curriculum planning, deliberate practice, and peer relationships as part of that role (The PAPERs Podcast).
This evidence is mainly an academic podcast discussion of a single paper, with limited independent practicing-clinician corroboration. Still, it matters for CME providers because many organizations design and deliver faculty development as if the job were just session delivery. The discussion points to a more demanding model: faculty developers need repeated practice, reflection, organizational validation, and communities that help them see the role as legitimate.
The provider implication is operational. If faculty development is meant to improve teaching quality, curriculum design, facilitation, and learner support, attendance alone is a thin outcome. CME teams should ask whether faculty programs create a path from novice participation to confident application: practice opportunities, feedback, peer exchange, and evidence that skills are being used after the session.
The common thread this week is continuity. Certification conversations are moving toward assessment systems that produce feedback and gap data. Faculty-development conversations are asking whether people can grow into a role through repeated practice and community. For CME teams, that makes the event-by-event portfolio look incomplete. The stronger question is which programs still work if clinicians expect learning to connect across assessment, credit, feedback, identity, and follow-up—not as extra burden, but as the normal shape of professional learning.
Practicing clinicians describe traditional MOC as high-stakes, anxiety-inducing, and misaligned with site-specific practice while praising OLA-style longitudinal assessments for privacy, flexibility, and immediate feedback.
"Literally did it today because if I still practiced exactly how I was trained it would now be malpractice"
Discussion of NBPAS traction as a less onerous certification pathway focused on accredited CME rather than exam performance.
Radiation oncology examples highlight a preference for formative/summative longitudinal assessment over high-stakes exams.
Emphasizes that domain-specific expertise in needs assessment, curriculum design, and facilitation is required beyond generic skills, and that identity formation benefits from communities of practice.
Highlights logistical barriers and the need for organizational validation and peer relationships to support faculty developer identity.
Stresses the need for public release of training data specifications and clear accountability to build trust in medical AI.
Documents concentration of payments in key specialties and calls for excluding heavily conflicted individuals from guideline panels.
"From 2013-2022 pharma paid 12 billion dollars to US physicians. That’s mind boggling. Insane. That’s how silence is bought, the minds of physicians influenced, and ultimately patient care/prescribing patterns influenced."