
Device downsizing via signal enhancement

To facilitate diagnostic radiology at the point of care, improvements in imaging hardware and processing software that raise the signal above the noise floor are being leveraged to make devices more portable and accessible.

To diagnose and monitor aneurysms, vascular malformations, stroke, or the progression of neurodegenerative diseases, it is essential to acquire anatomically accurate molecular or physiological information via radiological examination. For this, head scans need to provide images with sufficient contrast resolution and spatial resolution. Similarly, for detecting or staging the progression of prevalent diseases of the liver — most prominently, non-alcoholic fatty liver disease and non-alcoholic steatohepatitis — the signals used to measure spin–lattice and spin–spin relaxation times (via magnetic resonance) or shear-wave amplitudes (via elastography) have to rise above noise levels.

Figure reproduced with permission from: top, Article by Cooley and colleagues; bottom left, Article by Cima and colleagues; bottom right, Article by Tanter and colleagues.

Most often quantified in the form of a signal-to-noise ratio (for a particular molecular or physical biomarker), signal quality can, in general, be enhanced via specific molecular, nanoscopic or microscopic agents, by updating the hardware to improve signal acquisition (for instance, via a larger field of view, the generation of optimal pulses and gradients of radiofrequency or pressure, or improved detector sensitivity), and by enhancing the signal through data processing. Unsurprisingly, improvements in signal are typically used to enhance the quality of the information obtained. Yet they can also be leveraged to simplify the instrumentation, with the intention of reducing its cost or size or increasing device portability or accessibility. Downsizing the infrastructure requirements of devices for imaging or sensing the human body and increasing their portability can facilitate and broaden their use and availability at the bedside or at points of care. This is exemplified by three research Articles included in this issue.
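As a minimal illustration of the signal-to-noise metric (not taken from any of the Articles), the SNR of an image can be estimated from the mean signal in a region of interest and the standard deviation of a background, noise-only region; the 20·log10 decibel convention used below is one common choice for amplitude signals:

```python
import numpy as np

def snr_db(signal_roi, noise_roi):
    """Estimate SNR (in dB) as mean signal amplitude over noise standard deviation."""
    signal = np.mean(signal_roi)
    noise = np.std(noise_roi)
    return 20.0 * np.log10(signal / noise)

# Synthetic example: a uniform signal of amplitude 1 with additive noise,
# and a noise-only background patch with the same noise level.
rng = np.random.default_rng(0)
roi = 1.0 + 0.1 * rng.standard_normal(1000)   # signal region
background = 0.1 * rng.standard_normal(1000)  # noise-only region
print(round(snr_db(roi, background), 1))      # prints a value near 20 (dB)
```

Raising the signal (better agents, hardware or processing) or lowering the noise both move this ratio in the same direction, which is why the trade-off against device size discussed above exists at all.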

Clarissa Cooley and colleagues report in their Article a prototype compact scanner (pictured) for performing magnetic resonance imaging of the brain. Partly because it is designed to scan only the head, the device is sufficiently small and light (230 kg; the use of lighter components could reduce this to 160 kg) to be mounted on a cart and transported to the bedside. It is also inexpensive compared with the typical budgets required to purchase and operate a clinical scanner, can be powered from a standard wall socket, and generates T1-weighted, T2-weighted, proton-density-weighted and diffusion-weighted contrasts. Although its imaging resolution is lower than that typical of high-magnetic-field scanners, it is sufficient for detecting brain lesions and haemorrhages. Rather than generating a homogeneous field and switchable field gradients (and using the appropriate readout system), Cooley and colleagues used an array of permanent rare-earth magnets that provides a static field (of strength in the 50–200 mT range) with a built-in vertical gradient and that eliminates the need for cryogenic cooling. For image reconstruction, they used knowledge from the prior characterization of the inhomogeneous magnetic field as well as generalized algorithms to partially correct for spatial deformations. Images of the heads of healthy volunteers had to be acquired in an electromagnetically shielded room (as with clinical scanners), yet external reference coils and image post-processing techniques could, in principle, be used to suppress artefacts from electromagnetic interference. As noted in an accompanying News & Views article by Michael Tyszka, the power consumption of the prototype device is low enough for it to be supported by a solar battery system or a diesel generator.
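The idea of reconstructing images from a characterized inhomogeneous field can be caricatured in one dimension: if the static field profile is known beforehand, each position maps to a distinct precession frequency, and the object can be recovered by inverting the resulting linear model. The sketch below is a toy in arbitrary units and is not the authors' reconstruction pipeline:

```python
import numpy as np

# Toy 1D 'generalized reconstruction' with a known inhomogeneous field.
# Without switchable gradients, spatial encoding comes from the static,
# nonlinear field profile: each voxel precesses at its own frequency.
# All quantities are in arbitrary units.
n = 64
u = np.linspace(0.0, 1.0, n)
freqs = 50.0 * (0.6 * u + 0.4 * u**2)        # monotonic, nonlinear 'field map'
t = np.arange(512) * 0.05                    # sample times
A = np.exp(-1j * np.outer(t, freqs))         # known encoding matrix

rho = np.zeros(n)
rho[20:30] = 1.0                             # toy object (spin density)
s = A @ rho                                  # simulated signal

# Model-based reconstruction: least-squares inversion of the encoding matrix.
rho_hat = np.linalg.lstsq(A, s, rcond=None)[0]
print(np.allclose(rho_hat.real, rho, atol=1e-6))  # noiseless toy: expect True
```

In practice the field map is three-dimensional and imperfectly known, which is why the reconstruction only partially corrects for spatial deformations.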

A portable and low-cost sensor for grading liver fibrosis and steatosis (the abnormal accumulation of fat in the liver) would facilitate the screening of individuals at risk, and help guide interventions before patients progress to end-stage liver disease. Ultrasound has limited specificity for detecting steatosis, and magnetic resonance (either imaging or elastography) is expensive and not widely accessible. Michael Cima and collaborators describe in their Article that diffusion-weighted T2 relaxometry, implemented in a one-sided portable sensor (pictured), can accurately grade steatosis in anaesthetized mice (with an average acquisition time of 10 min) and in human liver samples. The researchers estimated the fraction of fat by decomposing the T2-relaxometry signal and attributing the amplitude of each constituent relaxation time to the appropriate fluid component within the liver (intracellular water, extracellular water and fat; with the slowest relaxation time corresponding to fat). A magnetic-resonance sensor for clinical use would need to balance device portability, the depth of signal penetration (a main constraint in the current preclinical prototype) and the strength of the magnetic field (higher strengths are needed for increased penetration depths, which would increase device size and weight, and thus reduce portability). Optimization of the pulse sequences and signal processing could help tilt these trade-offs in favour of a portable form factor.
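The amplitude decomposition described above can be sketched as a multi-exponential fit of the relaxation decay. In the minimal example below, the three T2 values and the fat fraction are hypothetical placeholders (not the authors' measured values), and the noiseless decay signal is synthetic:

```python
import numpy as np

# Hypothetical T2 relaxation times (ms) for three liver fluid compartments;
# the slowest component is attributed to fat.
t2_times = {"intracellular water": 40.0, "extracellular water": 120.0, "fat": 400.0}

# Synthetic relaxation decay: amplitudes sum to 1, with a fat fraction of 0.3.
true_amps = np.array([0.5, 0.2, 0.3])
echoes = np.linspace(1, 1000, 200)  # echo times (ms)
basis = np.exp(-echoes[:, None] / np.array(list(t2_times.values()))[None, :])
signal = basis @ true_amps

# Recover the component amplitudes by least squares against the known basis,
# then read off the fat fraction from the slowest-relaxing component.
amps, *_ = np.linalg.lstsq(basis, signal, rcond=None)
fat_fraction = amps[-1] / amps.sum()
print(round(fat_fraction, 2))  # → 0.3
```

With real, noisy data the fit is harder (the relaxation times themselves must be estimated, and the components overlap), which is where the diffusion weighting and signal-processing optimization mentioned above come in.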

Because the skull strongly attenuates sound waves, clinical ultrasound imaging of brain vasculature is largely restricted to neonates through the open fontanelle, and to adults during brain surgery through a skull flap. In their Article, Mickael Tanter and colleagues now show that a handheld ultrasonic transducer positioned over the temporal-bone window (the thinnest area of the skull, about 20 mm in diameter; pictured) can be used to image, at microscopic resolution, the haemodynamics of vasculature deep in the adult brain. The researchers used transcranial ultrasound localization microscopy, which offers a much better trade-off between sound-wave penetration and imaging resolution than clinical ultrasound. To enhance the contrast, the researchers leveraged intravenously injected microbubbles; by tracking the speckle patterns that the microbubbles generated in the raw data, they corrected for micrometric artefacts arising from the natural motion of the brain and for sound-wave aberrations arising from mismatches in the speed of sound between brain tissue and bone. The researchers show that the technique can be used to detect a distinct blood-flow vortex within an aneurysm deep in the brain of a patient. This implementation of transcranial ultrafast ultrasound localization microscopy does not allow for real-time imaging, and is constrained by access through the temporal-bone window; yet parallelization of the image-processing algorithms and large ultrasonic helmets operating at a low frequency (and thus with deeper sound-wave penetration) might eventually enable whole-brain ultrasonic imaging in real time.
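At the heart of ultrasound localization microscopy is the idea that each microbubble, although imaged through a diffraction-limited system, can be localized with sub-pixel precision, and that accumulating many such localizations over time yields a super-resolved vascular map. A minimal, hypothetical localization step (not the authors' pipeline) could look like:

```python
import numpy as np

def localize_bubbles(frame, threshold=0.5):
    """Return sub-pixel (row, col) positions of isolated bright spots
    (microbubble echoes) by centroiding a 3x3 patch around each local maximum."""
    peaks = []
    h, w = frame.shape
    for r in range(1, h - 1):
        for c in range(1, w - 1):
            patch = frame[r - 1:r + 2, c - 1:c + 2]
            if frame[r, c] >= threshold and frame[r, c] == patch.max():
                total = patch.sum()
                dr = (patch * np.arange(-1, 2)[:, None]).sum() / total
                dc = (patch * np.arange(-1, 2)[None, :]).sum() / total
                peaks.append((r + dr, c + dc))
    return peaks

# Synthetic frame: one Gaussian-blurred 'bubble' centred at (10.3, 20.7).
yy, xx = np.mgrid[0:32, 0:32]
frame = np.exp(-((yy - 10.3) ** 2 + (xx - 20.7) ** 2) / 2.0)
(r, c), = localize_bubbles(frame)
print(round(r, 1), round(c, 1))  # near the true position (10.3, 20.7)
```

Repeating this over thousands of frames, and compensating each frame for tissue motion and aberration, is what turns a diffraction-limited transcranial acquisition into a microscopic-resolution map of blood flow.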

These three proof-of-concept demonstrations of non-invasive medical technology for diagnostics are a promising signal of what is in store for point-of-care diagnostics in radiology. Eventually, their impact in healthcare will also make some noise.


Device downsizing via signal enhancement. Nat Biomed Eng 5, 195–196 (2021).
