AprendBlog » Clinical hardware


Jun 12 2015

Beam hardening 1

Tags: Clinical hardware, Physics | admin @ 2:50 pm
Beam hardening artifacts were seen soon after the introduction of CT. Radiologists noticed a ring of increased Hounsfield numbers against the inside of the skull. At first they thought the increase was due to the difference between the white matter in the interior and the gray matter in the cortex of the brain, but images of skulls filled only with water also showed the ring, so it was clear the increased values were an artifact.
The EMI corporation, which produced the first CT scanners, must have known about the artifact, but they were notoriously close-mouthed about the scanner design. In their first scanner, the patient stuck his head into a plastic bladder filled with water and the x-ray system measured through the head surrounded by the water. This reduced the dynamic range requirements for the electronics, but it also reduced the beam hardening nonlinearity as well as other artifacts, as I will show.
In Al Macovski’s group at Stanford, we quickly figured out that the change in average energy of the transmitted photons as the object thickness increases, the spectral shift as we called it, would produce a nonlinear relationship between the logarithm of the measurements and the line integral of the attenuation coefficient. We also showed that this nonlinearity could produce the artifact. We were quite interested in it because it was an effect of x-ray energy on the image and we wanted to extract energy-dependent information.
Fig. 1↓ shows that the changes in average energy and effective attenuation coefficient as object thickness increases are both quite large. In this post, I will show how this change leads to a nonlinearity between the log of the measurements and the line integral of the object, and I will derive expressions for the magnitude of the nonlinearity. These will lead to ways to reduce the nonlinearity and therefore the artifacts. In later posts, I will show that the nonlinearity cannot in general be corrected using a lookup table, the no-linearize theorem. I will then describe a general way to understand the effect of the nonlinearity on the reconstructed CT image. Finally, I will examine whether iterative reconstruction methods can correct the artifacts by making the projections and the image consistent.
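The nonlinearity is easy to reproduce numerically. The sketch below uses a hypothetical two-energy spectrum with equal photon fractions and approximate water attenuation coefficients, both assumptions chosen only for illustration. It shows the effective attenuation coefficient, the local slope of the log measurement versus thickness, falling as the beam hardens:

```python
import numpy as np

# Hypothetical two-energy spectrum (an assumption, not a measured tube spectrum)
energies = np.array([40.0, 80.0])     # keV
fractions = np.array([0.5, 0.5])      # incident photon fractions
mu_water = np.array([0.27, 0.18])     # 1/cm, approximate water attenuation

thickness = np.linspace(0.0, 30.0, 7)  # cm of water

# Transmitted fraction at each thickness, summed over the spectrum
trans = fractions @ np.exp(-np.outer(mu_water, thickness))
log_meas = -np.log(trans)

# For a monoenergetic beam, log_meas would be exactly linear in thickness.
# With a spectrum, the effective attenuation coefficient drops with thickness:
mu_eff = np.gradient(log_meas, thickness)
print(mu_eff)  # decreases from ~0.22 toward 0.18 as the beam hardens
```

The same calculation with a realistic spectrum and tabulated attenuation coefficients gives the curves in Fig. 1.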



Nov 09 2014

Improve noise by throwing away photons?

Tags: Clinical hardware, Noise, Physics | admin @ 11:48 am
Photon counting systems with pulse height analysis (PHA) count the number of photons whose energies fall within a set of energy ranges, which I will call bins. Usually the bins are contiguous, non-overlapping, and span the incident energy spectrum, so each photon falls within exactly one bin. A paper[6] by Wang and Pelc showed that the A-vector noise variance can be decreased by using bins that are not contiguous. That is, if we use bins that cover only the low and high energy regions and exclude intermediate energies, we can lower the noise variance. Photons with energies in these intermediate regions are not counted, i.e., they are thrown away. Improving noise by throwing away photons is an interesting concept, and I will discuss it in this post. It turns out to be an example, common in practice, where the choice of quality measure fundamentally changes the hardware design, so it is important to study it.
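The two binning schemes can be sketched as follows. The flat spectrum and the bin edges here are assumptions made only to keep the example short; a real transmitted spectrum depends on the tube, the filtration, and the object:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical transmitted photon energies in keV (flat spectrum assumed)
energies = rng.uniform(20.0, 120.0, size=100_000)

# Contiguous two-bin PHA: every photon lands in exactly one bin.
contig_counts, _ = np.histogram(energies, bins=[20.0, 70.0, 120.0])

# Non-contiguous bins: keep only the low- and high-energy ends of the
# spectrum and discard the intermediate energies.
low_bin = int(np.sum((energies >= 20.0) & (energies < 50.0)))
high_bin = int(np.sum((energies >= 90.0) & (energies < 120.0)))
discarded = energies.size - low_bin - high_bin

print(contig_counts.sum(), discarded)  # all photons counted vs. ~40% thrown away
```

The sketch only shows the counting geometry; demonstrating the variance reduction itself requires the A-vector covariance calculation in the Wang and Pelc paper.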



Dec 26 2013

Parameters for the estimator

You may ask, what is the fundamental advantage of the new estimator? Yes, it is faster than the iterative method, but so what? With Moore’s law, we can just throw silicon at the problem by doing the processing in parallel. I have two responses. The first is that the iterative estimator is not only slow but also takes a random time to complete the calculation. This is a substantial problem since CT scanners are real-time systems: the calculations have to be done in a fixed time or the data are lost. The gantry cannot be stopped to wait for a long iteration to complete!
The second problem is that, as implemented in the research literature, the iterative estimator requires measurements of the x-ray tube spectrum and the detector energy response to compute the likelihood for a given measurement. These are difficult measurements that cannot be done at most medical institutions. Because the system components drift, the measurements have to be repeated periodically to ensure accurate results. There may be a way to implement an iterative estimator with simpler measurements, but I am not aware of one.
In this post, I will show how the parameters required for the new estimator can be determined from measurements on a phantom placed in the system. This could be done easily by personnel at medical institutions and is similar to quality assurance measurements now done routinely on other medical systems.
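To make the idea concrete, here is a minimal sketch of a polynomial calibration. The effective attenuation coefficients, the quadratic term standing in for the spectral shift, the phantom thicknesses, and the polynomial order are all assumptions for illustration; on a real scanner, the log measurements would come from the phantom itself:

```python
import numpy as np

# Calibration phantom: a grid of known thickness pairs of two materials
# (hypothetical units and values)
a1, a2 = np.meshgrid(np.linspace(0.0, 20.0, 6), np.linspace(0.0, 2.0, 6))
A_true = np.column_stack([a1.ravel(), a2.ravel()])

# Simulated log measurements for two spectra; the small quadratic term
# stands in for the beam-hardening nonlinearity (an assumption)
M = np.array([[0.20, 1.0], [0.15, 2.0]])  # effective attenuation coefficients
L = A_true @ M.T
L = L + 0.01 * L**2

# Calibrate: fit each A component as a low-order polynomial in the two logs
basis = np.column_stack([np.ones(len(L)), L, L**2, L[:, 0] * L[:, 1]])
coeffs, *_ = np.linalg.lstsq(basis, A_true, rcond=None)

# The estimator is then just a fast, fixed-time polynomial evaluation
A_hat = basis @ coeffs
rms = float(np.sqrt(np.mean((A_hat - A_true) ** 2)))
print(rms)  # small residual compared with the thickness scale
```

Because the fit uses only the phantom thicknesses and the measured logs, no direct measurement of the tube spectrum or detector energy response is needed.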



Mar 12 2013

Summary ebook available

I have prepared an ebook that compiles and organizes the posts in this blog to today’s date. You can access it by sending me an email:
Energy-selective x-ray imaging and other topics.

I plan to update the book regularly and I will post an entry to the blog when an updated version is available.