ARRS: Radiologists don't feel 'competent' on some business matters

May 03, 2012
by Brendon Nafziger, DOTmed News Associate Editor
Radiologists report a smidgen less confidence than other physicians in their understanding of health policy and quality issues, such as imaging costs and malpractice, according to a new survey. They also report, on average, a less-than-competent understanding of core business topics that could help them stay competitive in a volatile health care environment, such as marketing and equipment costs.

In two related reports presented at the American Roentgen Ray Society's annual meeting in Vancouver this week, researchers found that, on average, radiologists gave themselves slightly lower marks for competence than their non-radiologist peers on four of five policy issues, as well as lower marks on all four business concepts surveyed.

In fact, the study also found that the only subject in the survey about which the participating radiologists, on average, reported feeling truly "competent" was patient safety.

The study's co-authors, Dr. Richard Sharpe, chief radiology resident at Thomas Jefferson University in Philadelphia, and Dr. Rajni Natesan, a resident physician at Northwestern University in Chicago, Ill., argue this lack of knowledge could be hurting radiologists: they likely need more business know-how to run a successful practice in troubled times, and they need to be savvier about the health policies affecting them.

"I think if radiologists are not involved in health care policy, there will be policy initiatives acted by nonradiologists that are not favorable," Sharpe told DOTmed News. "If radiologists aren't involved in advocating for radiology, then no one else would be."

Nonrads vs. rads

The online surveys drew responses from every medical residency program in the country, involving 3,396 doctors: 711 radiologists and 2,685 non-radiologists. The radiologists included 247 attending physicians and 464 trainees.

The survey asked the doctors to report how competent they felt on a subject on a 0-to-5 scale, with 0 representing near-complete ignorance and 5 the highest competence. To count as feeling "competent" in a subject, a participant needed to rate it at least a 3.

According to the study, radiologists reported less-than-competent scores for everything but patient safety. On average, radiologists gave themselves a 2.17 for knowledge of imaging costs, while non-radiologists reported a 2.32. For malpractice knowledge, radiologists gave themselves a 2.3, compared with 2.36 for other doctors. Quality assurance drew a 2.57 from rads and a 2.62 from their non-rad peers. Patient safety got a 3.1 from the imaging specialists and a 3.33 from the others; both scores count as self-reported "competence."

Of the five policy areas measured, health care policy as a whole was the only one on which radiologists reported higher competence than other doctors, although their average still fell below the "competent" threshold. Here, radiologists gave themselves an average of 2.43, compared with 2.33 for other doctors.

For the four business topics, radiologists reported, on average, across-the-board less-than-competent rankings. For IT, they gave themselves a weighted average rating of 2.48 (nonradiologists: 2.69); for knowledge of equipment and test costs, a 2.15 (nonrads: 2.23); for practice management, a 1.97 (nonrads: 2.16); and for marketing, a 1.94 (nonrads: 1.95).

Wondering whether attending radiologists, who have been in practice longer, might be more knowledgeable, the researchers separately compared attending rads with nonrads, and trainee rads with nonrads. Here, some, but not all, of the differences disappeared. Attending radiologists were still significantly less likely than nonradiology attendings to report competence in information technology, practice management, test costs and patient safety. Trainee radiologists reported less competence in information technology and practice management than nonrad trainees, although, curiously enough, they reported slightly but significantly higher competence in health care policy than their nonrad peers.

Study limitations

The researchers acknowledged some limitations of their studies, which have not yet been published in a journal. In surveys like this, respondents tend to inflate their competence, so actual understanding could be lower than reported here, and self-reported competence could differ from the results of other measures, such as objective assessments of knowledge.

However, the researchers didn't think it plausible that radiologists were somehow more self-critical or more likely to underrate their competence than doctors from the roughly 35 other specialties included in the surveys.

"We all went through the same medical school training," Sharpe said. "All kinds of personalities are represented in radiology, so presumably they would have similar types of self evaluation" as other physicians.

Education

If there is a difference between radiologists and other specialties, what could account for it? Possibly the nature of radiology education, the researchers suggested.

"There's so much information that needs to be known on the interpretive side, it leaves less time for the noninterpretive topics," said study co-author Natesan. "There's not as much time or effort put in for formalizing education in these areas."

While greater efforts are needed to ensure that newly minted radiologists have a better grasp of policy, business and quality assurance issues, the researchers pointed to positive recent moves: the American College of Radiology's leadership institute, which launches this summer; the Journal of the American College of Radiology's quality-oriented features; and new conference courses, such as one offered at ARRS this week on "pitfalls" in radiology that addressed patient safety, quality and malpractice issues.

The researchers said that in addition to writing up their results for journal submission, they are also looking back at the data to see whether some other specialties scored notably high for competence.

"Obviously if radiologist are below the curve and nonradiologists are middle-of-the-road, then there must be high achievers in this, and what can radiologists learn from those types of specialties?" Sharpe said.