SIR 2024
Men's Health
Sheng Xu, PhD
Staff Scientist
National Institutes of Health Clinical Center
Financial relationships: Full list of relationships is listed on the CME information page.
Charisse Garcia, RN
Research Nurse
National Institutes of Health
Disclosure information not submitted.
Lindsey Hazen
Clinical Research Nurse
National Institutes of Health
Disclosure information not submitted.
Tabea Borde, MD, PhD
Clinical Research Fellow
National Institutes of Health
Financial relationships: Full list of relationships is listed on the CME information page.
Bradford J. Wood, MD, FSIR
Director NIH Center for IO, Chief IR
NIH
Financial relationships: Full list of relationships is listed on the CME information page.
Conventional B-mode ultrasound is not sensitive to many cancers, including prostate cancer. Incorporating magnetic resonance imaging (MRI) to detect areas suspicious for prostate cancer, along with hardware-based MRI-ultrasound image fusion platforms, allows physicians to biopsy MRI targets in an office setting, outside of an MRI suite. While fusion-guided prostate biopsy can significantly improve biopsy yield, most prostate biopsies are still performed without this advanced technology, which has initially been deployed in suburban and well-resourced metropolitan communities. To benefit the large at-risk patient population with fewer resources and less access, and to reduce health disparities, this project aims to develop and clinically translate an innovative lesion visualization technology enabled by artificial intelligence (AI), without the hardware investment and ergonomic barriers that impede adoption.
Materials and Methods:
A conventional 2D ultrasound probe (mc7-2, Philips Healthcare) was used to scan the prostate from the patient's perineum by performing a manual sweeping arc motion, and the ultrasound image frames were recorded. A pre-trained AI model [1] was applied to the recorded frames to estimate the inter-frame motion of the ultrasound probe during the sweep from the images alone. A 3D ultrasound volume was then reconstructed and registered to the preoperative MRI, and all MRI targets were overlaid onto the recorded 2D ultrasound frames, enabling targeted biopsy. The physician first selected a recorded ultrasound frame containing an MRI target, then matched the real-time ultrasound to the selected frame. The MRI target was subsequently mapped onto the real-time ultrasound to guide needle insertion with real-time fusion.
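The geometric core of this pipeline can be illustrated with a minimal sketch: per-frame relative motions (here hypothetical placeholders standing in for the pre-trained network's predictions) are chained into absolute frame poses, and an MRI target point is mapped through an assumed MRI-to-ultrasound registration into a frame's coordinates. The function and variable names below are illustrative, not part of the described system.

```python
import numpy as np

def pose_matrix(rx, ry, rz, tx, ty, tz):
    """Build a 4x4 rigid transform from Euler angles (rad) and translation (mm)."""
    cx, sx = np.cos(rx), np.sin(rx)
    cy, sy = np.cos(ry), np.sin(ry)
    cz, sz = np.cos(rz), np.sin(rz)
    R = (np.array([[cz, -sz, 0], [sz, cz, 0], [0, 0, 1]]) @
         np.array([[cy, 0, sy], [0, 1, 0], [-sy, 0, cy]]) @
         np.array([[1, 0, 0], [0, cx, -sx], [0, sx, cx]]))
    T = np.eye(4)
    T[:3, :3] = R
    T[:3, 3] = [tx, ty, tz]
    return T

def chain_poses(relative_poses):
    """Compose per-frame relative motions into absolute frame-to-world poses."""
    absolute = [np.eye(4)]
    for rel in relative_poses:
        absolute.append(absolute[-1] @ rel)
    return absolute

# Hypothetical motion-estimation output: small inter-frame motions during the
# perineal sweep (in the actual system these come from the pre-trained model).
rel = [pose_matrix(0.01, 0.0, 0.0, 0.0, 0.2, 0.1) for _ in range(100)]
frame_poses = chain_poses(rel)

# Once the reconstructed volume is registered to MRI, a target defined in MRI
# coordinates can be mapped into any recorded frame's coordinate system.
mri_to_us = np.eye(4)                         # placeholder registration result
target_mri = np.array([10.0, 20.0, 30.0, 1.0])  # homogeneous MRI target point
target_us = np.linalg.inv(frame_poses[50]) @ mri_to_us @ target_mri
```

This mirrors the described workflow at the level of coordinate transforms only; the actual system additionally performs volume reconstruction, deformable registration, and real-time frame matching.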
Results:
The system was evaluated in a bystander setting, alongside the standard workflow, in a prostate cancer patient with two MRI targets. An interventional radiologist and a urologist each operated the ultrasound probe to find real-time ultrasound images containing the MRI targets. Post-procedure imaging assessment was performed.
Conclusion: It is feasible to biopsy MRI fusion targets using conventional ultrasound enabled by AI, without additional tracking devices or hardware. This entirely software-based approach can be integrated with existing ultrasound systems to potentially reduce cost, setup time, and hardware requirements, thereby enhancing access. While demonstrated here in prostate fusion biopsy, the approach is generic and can be extended to fusion-guided percutaneous biopsy and ablation in other organs.