(Courtesy ContextVision)

Another day in digital radiology

October 21, 2013
By Peter Kövamees

From the 19th century to the present day, the X-ray has played a pivotal role in medical imaging. Like nearly any other technology, X-ray imaging has evolved significantly since its inception, most recently with the shift from analogue to digital.

For decades, analogue X-ray was the de facto market standard, but it has long been plagued by problems and challenges. Not only has digital become the gold standard in developed countries; analogue is also slowly but surely disappearing from developing countries in favour of new digital technology. In fact, the global digital X-ray market is estimated to reach $4.82 billion by 2018, growing at a CAGR of 4.5% from 2013 to 2018, according to a recent MarketsandMarkets report.

Benefits of shifting from analogue to digital

The ideal imaging system should produce high-quality images with minimal radiation exposure. Digital radiology has the potential to achieve this, and further advances may lower the radiation dose and use higher-sensitivity plates to provide better resolution and sharper images.

The benefits of digital radiology are many: financial, environmental, and clinical. These include:
1. High-quality images and faster turnaround, making it much easier to review a patient's previous imaging;
2. Minimal radiation exposure, for both patient and personnel;
3. Resource efficiency, with lower film costs, less storage space, and fewer staff needed to handle the archive;
4. Opening the door to future improvements in image processing and analysis; and,
5. Easy sending and sharing of difficult cases with colleagues.

Minimizing radiation exposure

In the early days of X-ray, when film-screen radiology was the standard, the relationship between the blackness of the film and the detector (film) exposure was direct. Every X-ray technologist could adjust the dose (mAs) to achieve the requested film blackness, so exposure could be set precisely for each image.

Radiology has played an important role in the diagnosis and management of patients for more than 110 years. Traditional screen-film systems use overall film density as an exposure indicator. Direct feedback to the technologist regarding exposure is obtained from the appearance of the processed film image. Optimized technique factors (kVp and mAs) are based upon patient size, the body part, and the radiographic speed of the screen-film combination in use. Particularly where automatic exposure control is not used (for instance, in the majority of small pediatric patients), fixed exposure parameters require the technologist to use experience and appropriate judgment to set radiographic techniques.

Since the mid-1990s, a steady replacement of analogue screen-film detectors with digital radiology detectors has occurred, along with an expectation of lower dose because of minimal retakes and consistent image quality. This is because computed radiography (CR) and direct radiography (DR) devices have wide exposure latitude/dynamic range, and image post-processing capabilities that provide a consistent image appearance even for underexposed and overexposed images. Determining correct exposure parameter settings and patient exposure from image appearance (e.g., density on a film image) is no longer possible. While underexposed images absorb fewer X-rays at the digital detector and can be recognized by a noisy appearance, overexposed images can easily go unnoticed, resulting in unnecessary radiation for the patient.

Today, in the digital X-ray world, modern systems use automatic image processing, so the relationship between detector exposure and image brightness is no longer always clear. The old visual check for excessive dose is gone: because brightness is corrected automatically, over- and underexposures may go unnoticed on screen.
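Why automatic brightness correction hides exposure errors can be shown with a toy simulation. The sketch below uses an idealized flat-field detector model with Poisson quantum noise; the gain factor and image size are arbitrary illustrative choices, not any vendor's actual detector response.

```python
import numpy as np

rng = np.random.default_rng(0)

def acquire(dose):
    """Simulate a flat-field digital detector image.

    Detector signal is proportional to dose; quantum (Poisson) noise
    scales with the square root of the expected counts.
    (Illustrative model only, not a real detector characteristic.)
    """
    expected_counts = 1000.0 * dose  # arbitrary gain factor
    return rng.poisson(expected_counts, size=(64, 64)).astype(float)

def display(image):
    """Auto-rescale to an 8-bit display range, as digital systems do."""
    lo, hi = image.min(), image.max()
    return (255 * (image - lo) / (hi - lo)).astype(np.uint8)

normal = display(acquire(dose=1.0))
overexposed = display(acquire(dose=4.0))  # four times the dose

# After automatic rescaling, both images have nearly the same mean
# brightness, so the overexposure is invisible on screen.
print(normal.mean(), overexposed.mean())
```

On screen the two images look alike, even though one patient received four times the dose; only the raw detector counts (or an exposure indicator derived from them) reveal the difference.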

Maintaining quality and consistency

So how can X-ray examinations maintain quality and ensure that the equipment's radiation output does not drift over time? The Exposure Index, a method by which digital radiology manufacturers give the technologist feedback on the estimated exposure at the detector, is an indirect measure of the image signal-to-noise ratio and thus an indirect indicator of digital image quality. It gives the operator assurance that the exposure achieved optimal image quality and that no radiation beyond the manufacturer's recommendation was used.
Manufacturers have begun introducing exposure indicators for digital radiology equipment. The International Electrotechnical Commission (IEC) explains:

"While considerable underexposure results in an increased level of noise, the more alarming aspect (from a radiation protection point of view) is that overexposure cannot be recognized easily in the displayed image.

Therefore, various manufacturers of digital radiology systems have introduced so-called exposure indicators for their equipment. These are numbers, determined from the original image data of each image taken, which allow conclusions about the level of the exposure at the image receptor. However, the exposure indicators are manufacturer- or system-specific, i.e. they differ for the systems of different manufacturers in their definition and scaling. A unified EXPOSURE INDEX for all digital radiology systems is needed to simplify its usage, e.g. for the establishment of exposure guidelines, particularly when systems of different manufacturers are used within the same department."
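The unified standard the IEC refers to (IEC 62494-1) pairs the Exposure Index (EI) with a Deviation Index (DI) that tells the technologist, on a logarithmic scale, how far the actual exposure landed from the department's target. A minimal sketch of that relationship, with hypothetical EI values chosen purely for illustration:

```python
import math

def deviation_index(ei, ei_target):
    """Deviation Index as defined in IEC 62494-1:

        DI = 10 * log10(EI / EI_T)

    DI = 0 means the exposure hit the target exactly; +3 corresponds
    to roughly double the target exposure, -3 to roughly half.
    """
    return 10.0 * math.log10(ei / ei_target)

# Hypothetical values: target EI of 400, actual EI of 800 (2x overexposed).
print(round(deviation_index(800, 400), 2))  # → 3.01
```

Because the DI is defined relative to a locally chosen target, it restores the kind of immediate per-image feedback that film blackness once provided, independently of any one manufacturer's EI scaling.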

In 2012, Germany was the first country to have regulatory demands that all manufacturers are to have Exposure Index for digital X-ray units sold in the country, a trend that more countries will follow in the coming years as Exposure Index brings quality assurance to digital X-ray examinations.

Today's state-of-the-art exposure index adapts to the actual tissue being imaged, improving accuracy and reliability.

Manufacturers tune image quality to make the best possible image available, but image tastes can differ considerably between continents, countries, and even between doctors at the same clinic, so technology that allows doctors to modify image quality themselves is essential.

One of the advantages of digital imaging is that OEMs can develop or buy image enhancement software and tune the image to the tastes of a specific market. OEMs now also offer clinicians the ability to do the same: adjust the image to their own preference locally. This capability originated with ultrasound clinicians but has since spread to the X-ray field, so technicians and doctors can tailor image enhancement to their taste and optimize reading.

X-ray tomorrow

Demands on image quality will continue to rise, making standardization ever more important. Image control will be handed to the end user to a greater extent. For instance, a radiologist can adjust an image to bring out the details needed for a proper diagnosis; a second radiologist, offering a second opinion, may want the image to look slightly different and can apply those adjustments directly. Image control will therefore become more important.

The question is who will have this control. Can manufacturers "give up" image control to clinical users? Do end users really want a variety of choices, or just a one-button machine? How much will tastes differ? Will there be an international consensus on what "good image quality" is? When it comes to dose control, how little dose will be needed in the future to get a good image? The future promises enhanced image quality with minimal radiation, possibly using mathematical simulation algorithms for noise subtraction. Image resolution is determined by the number of pixels: the more pixels, the finer the detail. But at a fixed dose, more pixels means fewer photons per pixel and therefore more noise, so what will the low-dose noise limit be? Good image enhancement products are already available to address this problem. In the future, image enhancement software will become even more sophisticated, making it possible for manufacturers to integrate it more seamlessly and with fewer hardware resources.
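The pixels-versus-noise trade-off follows directly from Poisson counting statistics: at a fixed dose, per-pixel SNR equals the square root of the counts per pixel. A short sketch, using arbitrary illustrative photon numbers and an idealized model that ignores detector electronics and scatter:

```python
import math

def pixel_snr(total_photons, n_pixels):
    """Quantum-limited per-pixel SNR for a uniform exposure.

    With Poisson statistics, SNR = sqrt(counts per pixel), so at a
    fixed dose, quadrupling the pixel count halves the per-pixel SNR.
    (Idealized model; real detectors add electronic noise and scatter.)
    """
    counts_per_pixel = total_photons / n_pixels
    return math.sqrt(counts_per_pixel)

dose = 1e10  # total detected photons; arbitrary illustrative figure
print(round(pixel_snr(dose, 1024**2)))  # 1-megapixel detector
print(round(pixel_snr(dose, 2048**2)))  # 4 megapixels: half the SNR
```

This is why finer pixels alone do not buy better images at low dose, and why noise-suppressing enhancement software matters more as resolution grows.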

The future of digital radiology is moving towards more mobile systems, individually customized to fit clinician and patient needs. Low dose will remain a main concern, so future systems will offer low-dose mobile imaging adjusted for each customer. All this at low cost, since prices will continue to decrease. At the same time, new ways of combining modalities will become more and more common. The X-ray image of tomorrow will be better optimized for the clinician's specific clinical question. The purpose of good image quality with good image enhancement is to reach a correct diagnosis quickly.

In the end, analogue and digital radiology systems share the same purpose, even as technology keeps progressing: to save lives through fast, correct diagnosis.

Peter Kövamees is manager of strategic planning and insourcing at ContextVision. He holds an MSc in Engineering and has held leading positions at global medical imaging companies for most of his professional career, with more than 20 years of experience in the global medical imaging market.