
Image quality versus optimization

Image quality plays an essential role in radiographic dose optimisation. Optimisation involves producing an image of acceptable quality at a low patient radiation dose (the ALARP principle) (Department of Health, 2007; ICRP, 2006). Estimates of the radiation dose received by patients are relatively easy to make; by contrast, image quality assessments can be difficult and time consuming (Martin, Sutton, & Sharp, 1999). It is well known that establishing an accurate and reliable diagnosis from radiographic images requires a certain level of image quality. In this context, image optimisation is generally concerned with producing an image that is fit for purpose (Tingberg et al., 2000), although the term "fit for purpose" is rarely defined adequately within clinical journal papers.

In practice, the quality of an image refers to the subjective analysis of the visual data contained within it (Jessen, 2004). Any image quality measure other than one based on the eyes of an observer can therefore be regarded as a supportive or predictive measure (i.e. a physical measure). This is because image perception is almost always based on the visualisation of anatomical features within an image, whereas a physical measure relates to the detectability of relevant features but does not directly measure their fidelity. When defining the quality of an image, the purpose of the image should also be considered (Lemoigne, Caner, & Rahal, 2007). It is widely agreed that image quality can be defined in terms of its acceptability for answering the primary clinical questions (Sharp, 1990; Kundel, 1979; Shet, Chen, & Siegel, 2011).
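To make the distinction between perceptual and physical measures concrete, one widely used physical (objective) measure is the contrast-to-noise ratio (CNR), computed from pixel values in a feature region and a nearby background region. The sketch below is illustrative only; the region-of-interest values are synthetic and not taken from any study cited above:

```python
import numpy as np

def cnr(signal_roi, background_roi):
    """Contrast-to-noise ratio: a common physical image quality measure.
    Arguments are arrays of pixel values sampled from a feature of
    interest and from adjacent uniform background."""
    contrast = abs(np.mean(signal_roi) - np.mean(background_roi))
    noise = np.std(background_roi)  # background standard deviation as the noise estimate
    return contrast / noise

# Synthetic example: background mean 100, feature mean 120, noise sigma 5,
# so the CNR should come out close to (120 - 100) / 5 = 4.
rng = np.random.default_rng(0)
background = rng.normal(100.0, 5.0, size=(50, 50))
feature = rng.normal(120.0, 5.0, size=(20, 20))
print(f"CNR = {cnr(feature, background):.2f}")
```

A measure like this predicts how detectable a feature should be, but, as the paragraph above notes, it does not substitute for an observer's judgement of whether the anatomy is rendered with adequate fidelity for the clinical question.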

