
Abhishek Tyagi – Regional Anesthesia

Abhishek Tyagi, supervised by Prof. Jayanthi Sivaswamy, received his Master of Science in Computer Science and Engineering (CSE). Here’s a summary of his research work on Nerve Block Target Localization and Needle Guidance for Autonomous Robotic Ultrasound-Guided Regional Anesthesia:

Ultrasound-guided regional anesthesia (UGRA) involves guiding a needle to target nerves under real-time ultrasound imaging, enabling precise deposition of the drug with increased success rates and fewer complications. Autonomous robotic systems capable of administering UGRA are desirable for remote settings and localities where anesthesiologists are unavailable. Developing such a system requires real-time segmentation of nerves, needle tip localization, and needle trajectory extrapolation.

In the first part of this thesis, we developed models to localize nerves in the ultrasound domain using a large dataset. Our prospective study enrolled 227 subjects who were systematically scanned for brachial plexus nerves in various settings using three different ultrasound machines, yielding a dataset of 227 unique videos. In total, 41,000 video frames were annotated by experienced anesthesiologists, with partial automation provided by object tracking and active contour algorithms. Four baseline neural network models were trained on the dataset and evaluated on object detection and segmentation tasks. Generalizability of the best-suited model was then tested on datasets constructed from separate ultrasound scanners, with and without fine-tuning. The results demonstrate that deep learning models can be leveraged for real-time segmentation of the brachial plexus in neck ultrasonography videos with high accuracy and reliability. Using these nerve segmentation predictions, we define automated anesthesia needle targets by fitting an ellipse to the nerve contours.

The second part of this thesis focuses on localizing the needle and developing a framework to guide it toward the target. For needle segmentation, a neural network pre-trained on natural RGB images is first fine-tuned on a large ultrasound dataset for domain transfer and then adapted to the needle using a small dataset.
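The ellipse-fitting step for defining a needle target from a predicted nerve contour can be sketched as follows. This is an illustrative reconstruction, not the thesis code: the function name, the least-squares conic fit, and the synthetic contour are assumptions made for the example.

```python
import numpy as np

def fit_ellipse_center(points):
    """Least-squares conic fit A x^2 + B xy + C y^2 + D x + E y = 1
    to 2-D contour points; returns the fitted ellipse centre (x0, y0),
    which can serve as the needle target."""
    x, y = points[:, 0], points[:, 1]
    M = np.column_stack([x * x, x * y, y * y, x, y])
    A, B, C, D, E = np.linalg.lstsq(M, np.ones_like(x), rcond=None)[0]
    # The centre solves the gradient system: 2A x + B y = -D, B x + 2C y = -E
    return np.linalg.solve([[2 * A, B], [B, 2 * C]], [-D, -E])

# Synthetic nerve contour: an ellipse centred at (60, 45), semi-axes 20 x 12,
# rotated 30 degrees, with mild noise standing in for a predicted mask edge.
rng = np.random.default_rng(0)
t = np.linspace(0, 2 * np.pi, 200)
cx, cy, a, b, th = 60.0, 45.0, 20.0, 12.0, np.deg2rad(30)
xs = cx + a * np.cos(t) * np.cos(th) - b * np.sin(t) * np.sin(th)
ys = cy + a * np.cos(t) * np.sin(th) + b * np.sin(t) * np.cos(th)
pts = np.column_stack([xs, ys]) + rng.normal(0, 0.3, (200, 2))
print(fit_ellipse_center(pts))  # close to [60, 45]
```

In practice the contour points would come from the boundary of the model's segmentation mask; in OpenCV terms, `cv2.findContours` followed by `cv2.fitEllipse` achieves the same end.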
The segmented needle’s trajectory angle is computed using the Radon transform, and the trajectory is extrapolated from the needle tip. The intersection of the extrapolated trajectory with the needle target guides needle navigation for drug delivery. The needle trajectory’s average angle error was 2°, the average error in the trajectory’s distance from the image center was 10 pixels (2 mm), and the average needle tip error was 19 pixels (3.8 mm), which is within the acceptable range of 5 mm according to experienced anesthesiologists. The entire dataset has been released publicly for further study by the research community.
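The Radon-based angle estimate can be illustrated with a minimal point-projection variant of the Radon transform, a sketch under assumed names rather than the thesis implementation: project the needle mask's foreground pixels onto each candidate normal direction and pick the direction whose projections collapse onto a single offset, which is perpendicular to the needle's axis.

```python
import numpy as np

def needle_angle_radon(mask, angles=range(180)):
    """Estimate a segmented needle's trajectory angle (degrees, in image
    coordinates) via a point-based Radon transform: for each candidate
    normal angle, project foreground pixels onto that normal and count
    the most-populated integer offset bin. The needle direction is the
    best normal rotated by 90 degrees."""
    ys, xs = np.nonzero(mask)
    best_angle, best_peak = 0, -1
    for theta in angles:
        t = np.deg2rad(theta)
        # Signed offsets of pixels from lines with normal (cos t, sin t)
        rho = np.round(xs * np.cos(t) + ys * np.sin(t)).astype(int)
        peak = np.bincount(rho - rho.min()).max()
        if peak > best_peak:
            best_peak, best_angle = peak, theta
    return (best_angle + 90) % 180

# Synthetic needle: a straight line at ~30 degrees in a 128x128 mask.
mask = np.zeros((128, 128), dtype=np.uint8)
for x in range(20, 110):
    y = int(10 + np.tan(np.deg2rad(30)) * (x - 20))
    if 0 <= y < 128:
        mask[y, x] = 1
print(needle_angle_radon(mask))  # approximately 30
```

With the angle in hand, extrapolating the trajectory from the detected needle tip is a straight line, and its intersection with the ellipse-derived target provides the navigation goal described above.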

November 2023
