Quick Fire Session
SCMR 22nd Annual Scientific Sessions
Daniel Kim, PhD
Professor
Northwestern University Feinberg School of Medicine
Nivedita Naresh, PhD
Assistant Research Professor
Children's Hospital Colorado
Daniel Lee, MD
Associate Professor of Medicine (Cardiology) and Radiology
Northwestern University Feinberg School of Medicine
Background: Myocardial blood flow (MBF) quantification from perfusion MR images requires accurate measurement of the gadolinium concentration [Gd] in the arterial input function (AIF). Unfortunately, a standard perfusion MRI protocol with a saturation-to-center-of-k-space time (TS) of approximately 100 ms severely underestimates the AIF, because the blood signal at peak enhancement approaches the asymptotic (nonlinear) portion of the signal-versus-[Gd] relationship. Both dual-bolus [1] and dual-imaging [2,3] methods are designed to avoid this nonlinearity; however, while their effectiveness is well established, the exact mechanisms of error remain unclear. We hypothesized that T2* decay and noise are major sources of error in AIF quantification. The purpose of this study was to test this hypothesis with a theoretical analysis.
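To illustrate this nonlinearity, the following minimal sketch evaluates the ideal saturation-recovery signal, Mz/M0 = 1 - exp(-TS*R1), over a range of [Gd] for the two TS values considered here. The pre-contrast blood T1 and gadobutrol r1 relaxivity are assumed, illustrative values, not taken from the abstract's Figure 1.

```python
import numpy as np

# Sketch of the ideal saturation-recovery signal (Eq. 1), showing why
# the AIF saturates at high [Gd] when TS ~ 100 ms.
# Pre-contrast blood T1 and r1 relaxivity are assumed values.
R1_0 = 1.0 / 1.6   # pre-contrast blood R1 in 1/s (assumed T1 = 1.6 s)
R1_REL = 5.0       # gadobutrol r1 relaxivity in L/mmol/s (assumed)

def mz_over_m0(gd_mM, ts_s):
    """Ideal saturation recovery: Mz/M0 = 1 - exp(-TS * R1)."""
    return 1.0 - np.exp(-ts_s * (R1_0 + R1_REL * gd_mM))

gd = np.array([0.0, 1.0, 2.0, 5.0, 10.0])  # ground-truth [Gd] in mM
for ts in (0.024, 0.100):                   # TS = 24 ms vs. 100 ms
    print(f"TS = {ts * 1e3:.0f} ms:", np.round(mz_over_m0(gd, ts), 3))
```

Under these assumed constants, the TS = 100 ms signal rises only from about 0.92 to 0.99 between 5 and 10 mM, i.e., it sits on the asymptote, whereas TS = 24 ms preserves a usable dynamic range over the same interval.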
Methods: We performed our theoretical analysis using imaging parameters from an optimized dual-imaging sequence [4]: TE = 1 ms, with TS = 24 ms for the AIF and TS = 100 ms for tissue enhancement. For dual-bolus, we considered TS = 100 ms with peak [Gd] = 1 mM; for dual-imaging, we considered peak [Gd] ranging from 5 to 10 mM. Although T2* is more relevant in vivo, we modeled T2 as a surrogate. Assuming a perfect saturation pulse [4] and ignoring excitation pulses, we modeled the signal with an ideal saturation-recovery equation (Eq. 1 in Figure 1). For [Gd] ranging from 0 to 10 mM, assuming fast water exchange with gadobutrol (Gadavist) [5], we calculated R1 and R2 of blood using Eqs. 5-10 in Figure 1. To account for T2 decay, Eq. 1 was attenuated as shown in Eq. 2. For the noise analysis, white Gaussian noise with mean = 0 and standard deviation = 0.15 was added (Eq. 3) such that the M0 (proton density) image signal-to-noise ratio was 11, a typical value. For each iteration, we defined a region of interest of 100 voxels representing the left ventricular cavity and computed its mean signal; this experiment was repeated 1,000 times, and reported values represent mean ± SD of the per-iteration means. To convert normalized signal (Mz/M0) to [Gd], we assumed T1 relaxation only (Eq. 1), calculating R1 and then [Gd].
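A minimal Monte Carlo sketch of this pipeline follows. Since Eqs. 5-10 of Figure 1 are not reproduced here, a simple linear relaxivity model stands in, and the pre-contrast R1, R2, and relaxivity constants are assumed, illustrative values.

```python
import numpy as np

rng = np.random.default_rng(0)

# Assumed illustrative constants (linear relaxivity stand-in for Eqs. 5-10).
R1_0, R2_0 = 1.0 / 1.6, 1.0 / 0.25  # pre-contrast blood R1, R2 in 1/s (assumed)
R1_REL, R2_REL = 5.0, 6.0           # gadobutrol r1, r2 in L/mmol/s (assumed)
TE = 0.001                          # echo time, 1 ms
SIGMA = 0.15                        # Gaussian noise SD on normalized signal

def forward_signal(gd_mM, ts_s):
    """Eqs. 1-2: saturation recovery attenuated by T2 decay at TE."""
    sr = 1.0 - np.exp(-ts_s * (R1_0 + R1_REL * gd_mM))  # Eq. 1
    return sr * np.exp(-TE * (R2_0 + R2_REL * gd_mM))   # Eq. 2

def measured_gd(gd_mM, ts_s, n_voxels=100, n_iter=1000):
    """Eq. 3 + inversion: add noise, average the ROI, invert Eq. 1 (T1 only)."""
    clean = forward_signal(gd_mM, ts_s)
    noisy = clean + rng.normal(0.0, SIGMA, size=(n_iter, n_voxels))  # Eq. 3
    roi_mean = noisy.mean(axis=1)                  # 100-voxel LV cavity ROI
    roi_mean = np.clip(roi_mean, 1e-6, 1 - 1e-6)   # keep the log defined
    r1_est = -np.log(1.0 - roi_mean) / ts_s        # invert Eq. 1 (T1-only)
    gd_est = (r1_est - R1_0) / R1_REL
    return gd_est.mean(), gd_est.std()

for ts in (0.024, 0.100):
    for gd in (1.0, 5.0, 10.0):
        mean, sd = measured_gd(gd, ts)
        print(f"TS={ts * 1e3:.0f} ms, true [Gd]={gd:4.1f} mM -> "
              f"measured {mean:5.2f} +/- {sd:4.2f} mM")
```

Under these assumed constants, the T1-only inversion recovers [Gd] = 1 mM accurately at TS = 100 ms but substantially underestimates 5-10 mM, while TS = 24 ms stays much closer to ground truth, consistent with the trends reported below.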
Results: Figures 2 and 3 show plots of Mz/M0 and measured [Gd], respectively, as a function of ground-truth [Gd] for the two TS values. Errors induced by noise and T2 decay in both the normalized signal and the measured [Gd] were smaller for TS = 24 ms than for TS = 100 ms at high [Gd] values (5-10 mM) representing peak blood enhancement. At [Gd] = 1 mM, representing peak blood enhancement with a diluted bolus, the errors were negligible.
Conclusion: Our analysis shows that both T2 (T2*) decay and noise are major sources of error for AIF quantification using standard perfusion MRI, and it confirms that both dual-bolus (diluted injection, TS = 100 ms) and dual-imaging (normal injection, TS = 24 ms) approaches are insensitive to these effects and produce accurate AIF measurements.