Coherent diffraction imaging


Coherent diffractive imaging (CDI) is a computational microscopy method that reconstructs images from coherent diffraction patterns without the use of lenses. It was first experimentally demonstrated in 1999 by Miao and collaborators using synchrotron X-rays and iterative phase retrieval. CDI has been applied to image structures such as nanotubes, nanocrystals, porous nanocrystalline layers, defects, and potentially proteins. In CDI, a highly coherent beam of X-rays, electrons, or other wavelike particles is incident on an object. The beam scattered by the object produces a diffraction pattern downstream, which is collected by a detector. This recorded pattern is then used to reconstruct an image via an iterative feedback algorithm. In effect, the objective lens of a typical microscope is replaced with software that converts the reciprocal-space diffraction pattern into a real-space image. The advantage of using no lenses is that the final image is aberration-free, so resolution is limited only by diffraction and dose (dependent on wavelength, aperture size, and exposure). Applying a simple inverse Fourier transform to the measured intensities alone is insufficient for creating an image from the diffraction pattern because the phase information is missing. This is called the phase problem.

The overall imaging process can be broken down into four steps:

1. A coherent beam scatters from the sample.
2. The modulus of the Fourier transform is measured.
3. Computational algorithms are used to retrieve the phases.
4. The image is recovered by an inverse Fourier transform.

In CDI, the objective lens used in a traditional microscope is replaced with computational algorithms and software that convert the diffraction data from reciprocal space into real space. The diffraction pattern recorded by the detector lies in reciprocal space, while the final image must be in real space to be of any use to the human eye.

To begin, a highly coherent beam of X-rays, electrons, or other wavelike particles must be incident on an object. Although X-rays are the most common choice, electron beams are attractive because their shorter wavelength allows for higher resolution and, thus, a sharper final image. However, electron beams are limited in penetration depth compared to X-rays, as electrons have an inherent mass and interact strongly with matter. The incident beam illuminates a spot on the object and is scattered, producing a diffraction pattern representative of the Fourier transform of the object. The detector collects this diffraction pattern, which encodes the Fourier transform of the features in the illuminated region of the object's surface. Because this information resides in the frequency domain, it is not directly interpretable by the human eye and looks very different from what we are used to observing with normal microscopy techniques.

A reconstructed image is then made using an iterative feedback phase-retrieval algorithm; a few hundred diffraction patterns may be recorded with overlapping illumination to provide sufficient redundancy in the reconstruction process. Finally, a computer algorithm transforms the diffraction information back into real space and produces an image observable by the human eye, comparable to what traditional microscopy would show. The hope is that CDI produces a higher-resolution image thanks to its aberration-free, lens-free design and computational algorithms.

There are two relevant parameters for diffracted waves: amplitude and phase. In typical microscopy using lenses there is no phase problem, as phase information is retained when waves are refracted. When a diffraction pattern is collected, the data is described in terms of absolute counts of photons or electrons, a measurement which describes amplitudes but loses phase information. This results in an ill-posed inverse problem as any phase could be assigned to the amplitudes prior to an inverse Fourier transform to real space.
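The loss of phase can be seen in a short numerical sketch (the 1-D signal and all sizes here are illustrative, using NumPy's FFT):

```python
import numpy as np

# Hypothetical 1-D "object": a simple asymmetric signal.
obj = np.zeros(32)
obj[5:9] = [1.0, 3.0, 2.0, 0.5]

spectrum = np.fft.fft(obj)
amplitudes = np.abs(spectrum)    # what a detector records (as counts)
phases = np.angle(spectrum)      # lost in the measurement

# Inverting the amplitudes alone does not recover the object ...
wrong = np.fft.ifft(amplitudes).real

# ... but the amplitudes combined with the correct phases do.
right = np.fft.ifft(amplitudes * np.exp(1j * phases)).real

print(np.allclose(right, obj))   # True
print(np.allclose(wrong, obj))   # False
```

Because the modulus of the Fourier transform of a real signal is symmetric, the phaseless inversion yields a symmetric function that bears no resemblance to the asymmetric original; any set of phases could be attached to the same amplitudes, which is what makes the inverse problem ill-posed.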

Three ideas developed that enabled the reconstruction of real-space images from diffraction patterns. The first was the realization by Sayre in 1952 that Bragg diffraction under-samples the diffracted intensity relative to Shannon's theorem. If the diffraction pattern is sampled at twice the Nyquist frequency (the inverse of the sample size) or denser, it can yield a unique real-space image. The second was the increase in computing power in the 1980s, which enabled the iterative hybrid input-output (HIO) algorithm, introduced by Fienup, to optimize and extract phase information from adequately sampled intensity data with feedback. In 1998, Miao and collaborators used numerical simulations to demonstrate that when the number of independently measured intensity points exceeds the number of unknown variables, the phase can in principle be retrieved from the diffraction pattern via iterative algorithms. These developments culminated in 1999, when Miao et al. demonstrated CDI experimentally by reconstructing micrometer-sized, non-crystalline specimens from synchrotron X-ray diffraction patterns using iterative algorithms.
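The sampling criterion above is often expressed as an oversampling ratio; a minimal sketch, with array and support sizes chosen purely for illustration:

```python
# Oversampling ratio sigma: total pixels in the diffraction array
# divided by the pixels occupied by the object (its support).
# Phase retrieval requires sigma > 2, so that the measured intensity
# points outnumber the unknown variables describing the object.
array_pixels = 64 * 64      # illustrative detector/array size
object_pixels = 16 * 16     # illustrative object-support size
sigma = array_pixels / object_pixels

print(sigma)                # 16.0, comfortably above the threshold of 2
```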

In a typical reconstruction the first step is to generate random phases and combine them with the amplitude information from the reciprocal space pattern. Then a Fourier transform is applied back and forth to move between real space and reciprocal space with the modulus squared of the diffracted wave field set equal to the measured diffraction intensities in each cycle. By applying various constraints in real and reciprocal space the pattern evolves into an image after enough iterations of the HIO process. To ensure reproducibility the process is typically repeated with new sets of random phases with each run having typically hundreds to thousands of cycles. The constraints imposed in real and reciprocal space typically depend on the experimental setup and the sample to be imaged. The real space constraint is to restrict the imaged object to a confined region called the "support". For example, the object to be imaged can be initially assumed to reside in a region no larger than roughly the beam size. In some cases this constraint may be more restrictive, such as in a periodic support region for a uniformly spaced array of quantum dots. Other researchers have investigated imaging extended objects, that is, objects that are larger than the beam size, by applying other constraints.
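The cycle described above can be sketched as follows; this is a minimal illustration of an HIO-style loop, not any particular published implementation, and the test object, support size, iteration count, and feedback parameter beta are all illustrative choices:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical test object: a bright square embedded in a zero-padded
# field, so the diffraction pattern is adequately oversampled.
n = 64
obj = np.zeros((n, n))
obj[28:36, 28:36] = 1.0

# "Measured" data: diffraction amplitudes only; the phases are lost.
measured_amp = np.abs(np.fft.fft2(obj))

# Real-space support constraint: the object is assumed to lie in a
# confined region no larger than roughly the beam size.
support = np.zeros((n, n), dtype=bool)
support[24:40, 24:40] = True

beta = 0.9                     # HIO feedback parameter
g = rng.random((n, n))         # start from random phases/guess

for _ in range(500):
    G = np.fft.fft2(g)
    # Reciprocal-space constraint: impose the measured amplitudes
    # while keeping the current phase estimate.
    G = measured_amp * np.exp(1j * np.angle(G))
    g_new = np.real(np.fft.ifft2(G))
    # Real-space constraint with HIO feedback: pixels outside the
    # support (or negative) are pushed toward zero gradually rather
    # than zeroed outright.
    violates = ~support | (g_new < 0)
    g = np.where(violates, g - beta * g_new, g_new)
```

In practice this loop would be wrapped in several independent runs with fresh random starting phases, and the resulting reconstructions compared to check reproducibility.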
