
In Defense of Step 2 Clinical Skills

How do you learn to be a good doctor? Sure, you study anatomy and make flashcards and memorize nerve pathways, but what makes someone a good doctor is the way they apply all that knowledge to real patients with compassion and diagnostic expertise. Strong “doctoring” skills like listening carefully, taking a comprehensive history, explaining medication and treatment options clearly and patiently, and writing notes that colleagues can understand are what add the art to the science of medicine.

In my role as Senior Associate Dean for Graduate Medical Education and Accreditation at the Medical College of Wisconsin, I see firsthand just how important it is to train our students in all of these critical skills. But we can’t just teach these skills; we have to make sure students can demonstrate their competence in all aspects of a physician’s role so that — wherever they match or whomever they work with — their colleagues can be sure they share the same baseline of knowledge and know-how.

That’s where the USMLE comes in. None of us looks forward to an expensive series of high-stakes exams, but those exams exist to protect the public and to ensure that all of our future colleagues will not only have the skills and judgment we can rely on but will also be practitioners with whom we would entrust our own families if need be.

In the past two months, a group of Harvard medical students has launched the “End Step 2 CS” campaign, an effort to do away with the portion of the U.S. Medical Licensing Exam that tests clinical and communication skills in a hands-on, day-long clinic simulation using standardized patients. The students argue that the test is too expensive and too inconvenient, and that there is little proof the exam improves practice. While I sympathize with the burden felt by students who already carry the much greater weight of their medical school costs, their petition fails to recognize a point of critical importance: Not only is the Step 2 Clinical Skills exam a necessary public safeguard, it has greatly strengthened the curriculum of medical schools nationwide.

For many years, the AAMC urged medical schools to expand their instruction in clinical skills, but it didn’t happen in a widespread way until it was clear that Step 2 CS would become reality in 2004. Even on our campus, though we had experience with standardized patients as part of our curriculum, we built our current clinical skills facility, the STAR Center (Standardized Teaching Assessment Resource Center), specifically because I wanted our young men and women to be prepared for Step 2 CS. We wanted to put some heft behind this area of instruction and make it more rigorous. Twelve years later, not only have our students’ clinical skills improved, but the intended emphasis on communication and interpersonal skills has permeated the whole culture of our medical school. The importance of what used to be considered ancillary expertise is now integrated into every aspect of our curriculum — an example of positive impact that every patient would applaud.

But having students tested on their own campuses is not enough. I hate to admit it, but we all know the truth: every campus has one or two brilliant medical scientists with inadequate interpersonal skills. It’s all too easy to pass them on thinking, “they’ll learn in their residency program,” which soon becomes, “they’ll learn when they are fellows,” and so forth. The end result? A physician who cannot communicate with patients, which leads to poor compliance, poor outcomes, and, most unfortunately, poor patient relationships. It is not a coincidence that the majority of actions brought before state medical boards involve failures in communication.

Having an objective national standard for the demonstration of clinical skills is also critical because of another factor: variability. Variability means all medical schools are not identical, which allows students to find the right fit for them. But it also means medical schools have different curricula, different modes of instruction, different facilities, and different standards. We cannot assume a student is proficient in clinical skills just because they have been tested on their home campus any more than we would license them without testing their knowledge of pharmacology or the management of diabetes mellitus. Assessments in medical school are teaching tools. Licensing assessments set a national standard that must be met to ensure public safety. Their subject matter is similar but their goals are not the same.

Licensing bodies don’t like to say that their exams drive curriculum — that is not their purpose — but they certainly do set the standard that we train our students to meet or exceed. And of course, our schools, along with accrediting bodies like the Liaison Committee on Medical Education (LCME), use the USMLE as but one outcome measure for evaluating curriculum on our campuses. We all strive to teach much more: professionalism, wellness, analytical skills, scholarly inquiry, social determinants of health, etc.

The USMLE allows us to avoid any conflict of interest in assessing our students and guides medical education by establishing common standards that patients and medical colleagues alike can rely on every time they walk into a physician’s office. The past decade’s additional emphasis on clinical skills has enhanced medical education, guiding us to prepare even more well-rounded practitioners of the art and science of medicine.

Editor’s note: A version of this piece first appeared at KevinMD on April 22, 2016.

Ken Simons, MD

Attending Physician Guest Writer

Medical College of Wisconsin


Ken Simons, MD serves as the Senior Associate Dean for Graduate Medical Education and Accreditation at the Medical College of Wisconsin in Milwaukee, WI. A medical graduate of Boston University, he completed his residency training in ophthalmology at The University of North Carolina School of Medicine and his fellowship training at the Jules Stein Eye Institute, University of California, Los Angeles.