Ultrasound guidance is used for a variety of surgical needle insertion procedures, but there is currently no standard for the teaching of ultrasound skills. Recently, computer ultrasound simulation has been introduced as an alternative teaching method to traditional manikin and cadaver training because of its ability to provide diverse scenario training, quantitative feedback, and objective assessment. Current computer ultrasound training simulation is limited in its ability to image tissue deformation caused by needle insertions, even though tissue deformation identification is a critical skill in performing an ultrasound-guided needle insertion. To fill this need for improved simulation, a novel method of simulating ultrasound tissue–needle deformation is proposed and evaluated. First, a cadaver study is conducted to obtain ultrasound video of a peripheral nerve block. Then, optical flow analysis is conducted on this video to characterize the tissue movement due to the needle insertion. Tissue movement is characterized into three zones of motion: tissue near the needle being pulled, and zones above and below the needle where the tissue rolls. The rolling zones were centered 1.34 mm above and below the needle and 4.53 mm behind the needle. Using this characterization, a vector field is generated mimicking these zones. This vector field is then applied to an ultrasound image using inverse mapping to simulate tissue movement. The resulting simulation can be processed at 3.1 frames per second. With future optimized graphics processing, this methodology can enable accurate real time needle–tissue simulation.
The implementation of ultrasound guidance in needle insertion medical procedures has resulted in improved patient outcomes across a wide variety of medical specialties. In anesthesia, studies have shown that applying ultrasonography to peripheral nerve blocks is associated with a lower rate of serious complications, such as local anesthetic systemic toxicity. It also results in a higher rate of successful nerve blocks, improved patient safety, higher quality of pain relief, shorter procedure and onset times, and longer block durations [2–4]. In surgery and interventional radiology, the use of ultrasound in jugular central venous catheterization resulted in a significant decrease in overall patient complications of up to 71% compared to the traditional landmark insertion technique [5–8]. Using ultrasound to visualize the needle insertion site for thoracentesis and paracentesis decreases the risk of serious bleeding complications and pneumothorax. Because of these advantages, ultrasound guidance has become standard in many procedures [3,10]. However, progress can still be made to further decrease the number of complications currently associated with needle insertion procedures. For example, ultrasound-guided central venous catheterization still reports complication rates ranging from 5.0% to 21%, including dangerous complications due to erroneous insertion, such as accidental arterial puncture and pneumothorax [11–13]. These injuries result in increased hospital stays and costs for the patient as well as significant costs for the hospital. Central venous catheterization injury claims due to arterial puncture average $41,000, while pneumothorax claims average $143,000. Despite these ongoing complication concerns, there is currently no standard method for teaching ultrasound skills [15–18].
Initial training for ultrasound-guided needle insertion procedures typically uses specially designed manikins [19,20]. Manikins provide a safe, controlled environment for practicing, but often replicate ideal patients, failing to portray less common patients, such as those who are severely injured or obese. Cadavers have also been utilized in the training of ultrasound-guided anesthesia procedures, but can be expensive and are not always available. Both of these training methods also require the presence of a trained resident or attending to oversee progress and evaluate the trainees. Recently, improvements in computing power have resulted in the development of computer-based ultrasound simulators as an alternative to training with manikins and cadavers. This new type of training allows for diverse scenario training, quantitative feedback, and assessment [22–25].
Many methods for simulating ultrasound imaging have been developed that provide realistic imaging but do not allow for instrument insertion. For example, Sonosim (Santa Monica, CA) Ultrasound Training Solution is a commercially available ultrasound simulator that uses a technique called three-dimensional (3D) ultrasound reconstruction to generate a real time ultrasound image from pre-acquired ultrasound images [26,27]. This device can only simulate the rotations of an ultrasound probe around a fixed point, not full 3D motion. It also does not allow for the simulation of tissue interactions and movement caused by interventions, such as needle insertion. Another common ultrasound simulation method uses computed tomography scans of living patients to generate 3D virtual anatomy. The anatomy is then assigned physical properties, such as ultrasound reflectivity. These simulations combine the physical ultrasound properties of tissues and organs with image filtering techniques that simulate features such as ultrasound shadowing artifacts, speckle noise, and radial blurring [28,29]. While these simulation techniques provide realistic virtual ultrasound images, they do not account for the effects of tissue deformation due to instrument interactions. Simulating this type of interaction is critical to effectively training ultrasound-guided needle insertion procedures since tissue deformation provides an indication of the location and orientation of the needle. An example of the importance of tissue deformation can be observed during an ultrasound-guided peripheral nerve block. During this type of insertion, the needle is visible only when it crosses the ultrasound plane, which can be difficult to achieve. One method of obtaining proper visualization is using short, rapid back-and-forth movements of the needle to create tissue movement. This tissue movement aids the user in finding and properly aligning the ultrasound probe over the needle so that it can be properly visualized.
One method of simulating ultrasound image deformation due to instrument interactions is to first simulate the mechanical deformation of soft tissue and use this information to update a tissue model. This updated model is then used in an ultrasound simulation. Mass-spring-damper models are commonly used to simulate mechanical deformation due to their low computational requirements, but these simulations are often only able to show tissue deformation in the path of the needle [30,32]. This limits the model's ability to simulate how different layers of tissue interact outside of the direct needle path. During a needle insertion, varying tissue properties cause different tissue layers to deform at differing rates, producing additional tissue deformation outside of the direction of the needle insertion. Ultrasound tissue deformation has also been simulated using a fixed real ultrasound image or solid gray image as a background and manipulating foreground objects representing anatomy, such as veins and arteries. The haptic robotic central venous catheterization simulator utilizes this method. A six degrees-of-freedom motion tracker is used as a mock ultrasound probe. The orientation and position of the probe determine the position and shape of the simulated vessels in the ultrasound foreground, causing them to transform realistically. Tissue deformation is then simulated using vertical deformation of the background image and target vessels. While this simulator has been shown to be an effective teaching tool for central venous catheterization, its deformation model is extremely simplistic, and there is significant room for improvement in its ultrasound simulation. To fill the need for improved computer simulation of ultrasound images and tissue–needle interactions, this paper proposes a novel technique for simulating tissue–needle deformation in ultrasound images utilizing data acquired from real ultrasound-guided needle insertions.
A high-fidelity interactive ultrasound model was developed using a four-step process. First, ultrasound video was captured from an ultrasound-guided needle insertion into a cadaver. Second, tissue motion in this ultrasound video was measured using optical flow analysis and characterized into three zones of motion. Third, a method of emulating these three zones of tissue motion using vector fields was developed. Finally, inverse mapping using these vector fields and real ultrasound images was used to simulate tissue–needle interactions in an ultrasound image. Computational performance of the simulation was then evaluated by measuring the time required to process each image frame, and ways of optimizing this processing time are proposed. Overall, this unique simulation process can allow for accurate training of ultrasound-guided procedures and provides deeper insight into tissue–needle interactions.
Cadaver Experiment Methods.
The cadaver experiment was conducted to obtain ultrasound video from an ultrasound-guided peripheral nerve block so that tissue movement due to the needle insertion could be characterized. For the cadaver experiment, a needle insertion was conducted into an adult male fresh-frozen cadaver. A sciatic peripheral nerve block was chosen because it is a common ultrasound-guided needle insertion procedure, and it is a deep insertion relative to procedures such as jugular central lines. This allowed for greater overall movement in the ultrasound video, thereby allowing for more accurate characterization of tissue movement. The needle used was a 20 gauge, 100 mm, 30 deg bevel, B. Braun Medical, Inc. (Bethlehem, PA) Stimuplex Ultra 360 Insulated Echogenic Needle. This needle was chosen because it is often used in peripheral nerve block procedures and it is echogenic, making it clearly visible in ultrasound imaging. The needle insertion was conducted into the thigh near the sciatic nerve to emulate a sciatic peripheral nerve block. The insertion was performed by an anesthesiologist with over 15 years of experience. The anesthesiologist was directed to insert the needle at a constant velocity so that the tissue movement could be clearly observed and characterized. Ultrasound video was captured using a Sonosite (Bothell, WA) L38xp linear ultrasound probe at a frame rate of approximately 12 frames/s and a resolution of 580 px × 600 px. For this experiment, the pixel scale was 15.2 px/mm.
The captured ultrasound video was analyzed using Mathworks (Natick, MA) MATLAB and MATLAB's Farnebäck optical flow estimation algorithm with a local neighborhood size around each pixel of 9 and 3 pyramid levels. Farnebäck optical flow approximates the neighborhood around the pixels of two consecutive frames using a polynomial expansion transform. The algorithm then generates an image pyramid to better estimate pixel movement greater than the size of the neighborhood. The result of this analysis is a vector field with the estimated relative motion of each pixel between frames of the video.
Cadaver Experiment Results.
The recorded ultrasound video contains minimal movement of the ultrasound probe and a nearly horizontal needle orientation, allowing for accurate characterization of the tissue movement. Figure 1 shows the calculated optical flow for one of the ultrasound video frames. The optical flow vector field revealed a trend in the motion of the tissue: there are three distinct zones of deformation. Zone 1 is near the needle, where tissue is pulled in the direction of needle movement. This pulling in zone 1 is caused by friction between the needle and the tissue. Zone 2 is above the needle, where the tissue rolls in a counterclockwise motion as the needle moves from left to right across the screen. Zone 3 is below the needle, where the tissue rolls in a clockwise motion as the needle moves from left to right. Zones 2 and 3 are caused by the needle's friction pulling tissue to the right while connective tissue resists this pull. The rotations of zones 2 and 3 were found on average to be centered 20.3 px (1.34 mm) above and below the needle, respectively, and 68.8 px (4.53 mm) behind the needle tip.
The optical flow magnitude in each vertical column was fit with a sum of Gaussian terms of the form Σ_{i=1}^{n} a_i exp(−((y − b_i)/c_i)²), where y is the row location in a given column. The parameters of the Gaussian distribution fit were ai, the amplitude of the distribution peak; bi, the peak centroid; and ci, a value relating to the peak width. The number of Gaussian terms n was set to 2 to prevent an unnecessary increase in computational demand that would provide only a marginal improvement in fit. Minimizing computational complexity is critical to decreasing computational time for real time simulation. For all frames, bi was found to be equal to or very close to the vertical pixel location of the needle. An average overall Gaussian distribution was calculated by averaging the Gaussian distribution parameters for all of the optical flow plots calculated between ultrasound video frames. These parameters are shown in Table 1. The R2 for the Gaussian distribution fit was 0.916. Utilizing the Gaussian magnitude distribution from Fig. 2 in the Cadaver Experiment Results section and the three distinct zones of optical flow motion, an accurate simulation method to create realistic ultrasound image deformation is developed and discussed in the Simulating Ultrasound Tissue Deformation section.
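The column-wise magnitude fit can be reproduced with a two-term Gaussian model, the same functional form as MATLAB's `gauss2` fit; here SciPy's `curve_fit` stands in for MATLAB's fitting routine. The amplitudes, centers, widths, and noise level below are hypothetical illustration values, not the parameters reported in Table 1.

```python
import numpy as np
from scipy.optimize import curve_fit

def gauss_sum(y, a1, b1, c1, a2, b2, c2):
    """Two-term Gaussian, the same form as MATLAB's 'gauss2' fit."""
    return (a1 * np.exp(-((y - b1) / c1) ** 2)
            + a2 * np.exp(-((y - b2) / c2) ** 2))

# Hypothetical column of optical-flow magnitudes (px): a sharp and a
# broad component, both centered on the needle row (row 300 here).
y = np.arange(600, dtype=float)
true_params = (4.0, 300.0, 60.0, 1.5, 300.0, 200.0)
mag = gauss_sum(y, *true_params)
mag += np.random.default_rng(1).normal(0.0, 0.05, y.size)  # noise

# Fit; p0 is a rough initial guess, which curve_fit needs for
# multi-term Gaussian models to converge reliably.
params, _ = curve_fit(gauss_sum, y, mag, p0=(3, 250, 50, 1, 250, 150))
# params[1] (b1) should recover the needle row, consistent with the
# paper's observation that bi tracks the needle's vertical location.
```

Averaging such fitted parameters over all frame pairs would yield the overall distribution reported in Table 1.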
Simulating Ultrasound Tissue Deformation.
The proposed method of simulating needle-induced tissue deformation in an ultrasound image uses inverse mapping and bilinear interpolation to transform an original ultrasound image. There are three main parts to this process. First, three vector fields representing the three zones of tissue deflection are generated and merged. Second, the magnitude of the vector field is scaled using the Gaussian distribution measured in the cadaver study. Third, a base ultrasound image is transformed according to the vector field to simulate tissue deformation.
Building Vector Fields.
A vector field representing the simulated motion of the pixels is generated by combining three vector fields. Each vector field represents one of the three zones of tissue motion described in the cadaver study. The flowchart in Fig. 3 shows how the three different vector fields are combined with a Gaussian distribution and tissue stiffness factor matrix to make a single vector field that simulates the overall ultrasound tissue deformation. This method is applied to each generated frame in the ultrasound simulation. For each frame, the simulated needle position in the frame is known. For all of the following matrices, the rows and columns of the matrices correspond directly to a pixel location in the simulated ultrasound image.
The vector field around the needle tip has a parabolic shape, with a maximum value of 1, to prevent a rapid decrease in vector magnitude between two pixels. This prevents a disjointed appearance at the edge of the vector field from appearing in the simulated image. A final operation is performed to prevent the vector field from reversing direction at locations far from the needle due to Eqs. (2) and (3). For insertions in positive directions, values lower than 0 are set to 0, while for insertions in negative directions, values greater than 0 are set to 0.
Now that the unit vector fields for the tissue rolling have been formed, magnitudes are applied to these unit vectors. The magnitudes are determined using the Gaussian distribution found during the cadaver study, except that bi, the location of the distribution peak, is altered to equal the y location of the needle for each vertical column of pixels in the simulated image. This change in bi causes the peak magnitude to occur at the location of the needle in every vertical line of pixels. Figure 4 shows a magnitude plot of the Gaussian matrix, G, for a given needle position. Each vertical column in matrix G has a Gaussian distribution shape with a peak at the row where the column crosses the needle. For the full matrix G, the magnitude is greatest along the direction of the needle and decreases according to the Gaussian distribution in the direction perpendicular to the needle.
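A minimal sketch of one rolling-zone field and its Gaussian scaling, assuming an arbitrary needle position and illustrative amplitude/width values; the vortex-center offsets of 20.3 px and 68.8 px come from the cadaver study above, while everything else (needle position, a, c) is a placeholder:

```python
import numpy as np

H, W = 580, 600                  # simulated image size (rows, cols)
needle_row, tip_col = 300, 350   # assumed needle position (sketch only)

My, Mx = np.mgrid[0:H, 0:W].astype(float)

# Zone 2 rolling motion: a vortex centered 20.3 px above the needle and
# 68.8 px behind the tip (the offsets measured in the cadaver study).
cy, cx = needle_row - 20.3, tip_col - 68.8
Ax, Ay = -(My - cy), Mx - cx          # field perpendicular to the radius
norm = np.hypot(Ax, Ay) + 1e-9        # guard against division by zero
Ax, Ay = Ax / norm, Ay / norm         # unit vectors (flip both signs to
                                      # reverse the roll direction)

# Gaussian magnitude matrix G: every column peaks at the needle row.
a, c = 4.0, 60.0                      # illustrative amplitude and width
G = a * np.exp(-((My - needle_row) / c) ** 2)

# Zone 2 pixel-movement matrices: unit vortex scaled by G.
U2, V2 = Ax * G, Ay * G
```

Zone 3 would use a second vortex center mirrored below the needle, and zone 1 a field aligned with the insertion direction, before all three are merged.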
The final factor in the simulation considers the stiffness of different layers of tissue in the ultrasound image. The stiffness factor matrix, S, assigns different zones a value between 1 and 0, where 1 corresponds to tissue that is unaltered by the stiffness factor and 0 corresponds to tissue that will not move in the simulation. For example, Fig. 5 shows a real ultrasound image and an ultrasound stiffness factor matrix used to estimate the stiffness of the different tissues seen in the image. The stiffness factor matrix is visualized as a grayscale image where white corresponds to matrix element values of 1 and black corresponds to 0. The white region has no additional stiffness factor, while the region at the top has a stiffness factor of 0.3, which reduces tissue deflection in that region by 70%. A two-dimensional Gaussian filter is applied to S to smooth the edges between the different stiffness zones and prevent a disjoint image effect from appearing at these transition zones in the simulation.
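The stiffness matrix and its smoothing can be sketched as follows, assuming a simple two-zone layout like the Fig. 5 example (a 0.3-factor superficial layer over fully mobile tissue); SciPy's `gaussian_filter` and the sigma value stand in for the two-dimensional Gaussian filter described above:

```python
import numpy as np
from scipy.ndimage import gaussian_filter

# Stiffness factor matrix S: 1 = tissue moves freely, 0 = immobile.
# Assumed layout: a stiffer superficial layer (factor 0.3) in the top
# 100 rows, over fully mobile tissue below.
S = np.ones((580, 600))
S[:100, :] = 0.3

# Smooth the zone boundary so the simulated image does not show a
# disjoint seam where the stiffness changes abruptly (sigma assumed).
S = gaussian_filter(S, sigma=10)
```

The smoothed S is multiplied element-wise into the merged deformation field, attenuating motion in stiffer layers.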
Figure 6 shows an example tissue deformation vector field created using the above methodology. This type of vector field is generated for each frame of the simulation, taking into account the updated needle position.
Inverse Mapping of the Ultrasound Image.
The pixel value to be determined is f(x, y), while the values f(Qij) are known at pixel locations xi and yj with Q11 = (x1, y1), Q12 = (x1, y2), Q21 = (x2, y1), and Q22 = (x2, y2). Inverse mapping and bilinear interpolation are applied to an original reference ultrasound image, using each frame's unique tissue deformation vector field, for each image frame in the ultrasound simulation. The reference image is not updated using the previous transformed image because continuously transforming the image generates a blurring effect over time.
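The per-frame warp can be sketched with SciPy's `map_coordinates`, whose `order=1` setting performs exactly the bilinear interpolation described; the sign convention, the toy 5×5 image, and the uniform 1 px displacement field are assumptions for illustration:

```python
import numpy as np
from scipy.ndimage import map_coordinates

def warp(image, U, V):
    """Inverse-map `image` through displacement field (U, V): each
    output pixel (x', y') samples the reference image at
    (x' - U, y' - V) with bilinear interpolation (order=1)."""
    H, W = image.shape
    My, Mx = np.mgrid[0:H, 0:W].astype(float)
    coords = np.array([My - V, Mx - U])   # (row, col) sample locations
    return map_coordinates(image, coords, order=1, mode='nearest')

# Toy example: a uniform 1 px rightward displacement shifts the image.
img = np.arange(25, dtype=float).reshape(5, 5)
U = np.ones((5, 5))    # horizontal pixel movement matrix
V = np.zeros((5, 5))   # vertical pixel movement matrix
out = warp(img, U, V)
```

Because the same reference image is warped every frame, repeated interpolation error does not accumulate into the blurring described above.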
Ultrasound Simulation Performance.
This simulation method was tested using a PC with an Intel (Santa Clara, CA) Core i7-5930K 3.50 GHz, six-core central processing unit (CPU) and 32 GB of RAM. A test program was written using Mathworks MATLAB 2018a. The ultrasound image and stiffness factor matrix from Fig. 5 were used at a resolution of 580 px × 600 px. A mock needle insertion path was created for the simulation, and the needle was simulated using a gray line. One hundred image frames were simulated to produce 5 s of simulated ultrasound video. Parallel processing utilizing six CPU cores was used within the simulation loop to improve processing time.
The simulation was conducted ten times, and the average time to simulate 100 frames was 32.5 s. One of these frames, with its vector field overlaid and a line indicating the needle position, is shown in Fig. 6. Figure 7 shows the processing time of each ultrasound frame. The longest time to process a single frame was 0.48 s, while the shortest time was 0.22 s. The major reason for the variation in frame processing time was the use of a matrix determinant when calculating the distance between the pixels in the image and the line segment representing the needle for matrix D1. For pixels located in front of the needle path, using a determinant is not necessary when calculating the distance between the line segment representing the needle and the pixel, since this distance is equal to the Euclidean norm between the needle tip and the pixel coordinates. As a result, image frames that contain more of the needle take longer to process than image frames that contain less of it.
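The distance computation described above can be sketched as follows; the function name and the entry-point/tip parameterization are assumptions, but the sketch shows the determinant (cross-product) form used for pixels alongside the shaft versus the cheaper Euclidean norm used for pixels ahead of the tip:

```python
import numpy as np

def dist_to_needle(px_x, px_y, x0, y0, x1, y1):
    """Distance from pixel (px_x, px_y) to the needle segment running
    from entry point (x0, y0) to tip (x1, y1). Names and
    parameterization are illustrative, not the paper's exact code."""
    seg = np.array([x1 - x0, y1 - y0], dtype=float)
    past_tip = np.array([px_x - x1, px_y - y1], dtype=float)
    # Pixels ahead of the tip: the Euclidean norm to the tip suffices,
    # so no determinant is needed.
    if np.dot(seg, past_tip) > 0:
        return float(np.hypot(*past_tip))
    # Pixels alongside the shaft: perpendicular distance via the 2x2
    # determinant (cross-product magnitude) over the segment length.
    v = np.array([px_x - x0, px_y - y0], dtype=float)
    return abs(float(np.linalg.det(np.stack([seg, v])))) / float(np.hypot(*seg))
```

Since the determinant branch runs once per pixel alongside the shaft, frames containing more needle spend more time in the expensive branch, matching the timing variation reported above.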
A methodology for characterizing tissue deformation was explored and verified. Using the stated methodology, unique zones of motion were discovered, and it was found that the magnitude of the motion could be characterized as a Gaussian distribution. The proposed methodology is flexible and can include any tissue motion through additions to the deformation vector field. Additionally, the methodology allows complexity to be increased or decreased by altering the number of zones of motion to be simulated. Different needle insertion locations in the body would create different zones of motion and different stiffness factor matrices based on the specific tissue connectivity. However, this same methodology could be used to explore and properly characterize that tissue movement, thereby allowing virtual simulations to be developed.
The achieved processing rate of 3.1 frames per second falls short of the 30 frames per second (0.033 s per frame) required for real time simulation. This speed was restricted by the CPU processing rate. It is likely that by implementing multithreading using a graphical processing unit, the simulation speed could be increased to allow for real time simulation.
A new method of simulating ultrasound tissue–needle deformation was developed and tested. First, ultrasound video of a needle insertion was captured during a cadaver study to determine the characteristic tissue deformation seen during an ultrasound-guided needle insertion. Using this image data, an optical flow analysis was performed, which found three distinct zones of tissue deflection: one zone where tissue near the needle is pulled in the direction of needle insertion, and a second and third zone above and below the needle where tissue rolls. Using this information, a method of building a tissue deformation vector field to imitate these three zones of deformation was detailed. Finally, each frame of the ultrasound deformation simulation was created using inverse mapping and bilinear interpolation to map the tissue deformation vector fields onto an ultrasound image. While this simulation method is not fast enough to simulate real time ultrasound-guided needle insertion using only a CPU, it may be possible to achieve 30 frames per second by implementing multithreading with a graphics processing unit. The main limitation of this method is that it currently evaluates only in-plane tissue motion in a two-dimensional ultrasound environment. Further work exploring 3D ultrasound movement would expand the accuracy and understanding of more complex tissue motions.
Research reported in this publication was supported by the National Heart, Lung, and Blood Institute of the National Institutes of Health under Award Number R01HL127316 and by the Penn State College of Engineering.
National Heart, Lung, and Blood Institute (R01HL127316; Funder ID: 10.13039/100000050).
- ai = amplitude of Gaussian distribution peak, px
- Ax = x component of vector field vortex matrix for zone 2, px
- Ay = y component of vector field vortex matrix for zone 2, px
- bi = centroid of Gaussian distribution, px
- Bx = x component of vector field vortex matrix for zone 3, px
- By = y component of vector field vortex matrix for zone 3, px
- ci = Gaussian distribution peak width factor, px
- D = matrix where each element corresponds to perpendicular pixel distance from needle
- f = pixel color, 0 to 255
- G = Gaussian distribution matrix, px
- i = Gaussian distribution index, 1 or 2
- î = x direction unit vector
- ĵ = y direction unit vector
- Mx = x coordinate grid matrix
- My = y coordinate grid matrix
- n = number of Gaussian distribution terms, 2
- p = pixel color reference location, (x, y)
- px = pixels
- Qij = known pixel location, (xi, yj)
- S = stiffness factor matrix, 0 to 1
- T⁻¹ = inverse mapping transformation matrix
- U = simulation horizontal pixel movement matrix, px
- U1 = zone 1 horizontal pixel movement matrix, px
- U2 = zone 2 horizontal pixel movement matrix, px
- U3 = zone 3 horizontal pixel movement matrix, px
- V = simulation vertical pixel movement matrix, px
- ν1 = zone 2 vortex center location, px
- ν2 = zone 3 vortex center location, px
- V1 = zone 1 vertical pixel movement matrix, px
- V2 = zone 2 vertical pixel movement matrix, px
- V3 = zone 3 vertical pixel movement matrix, px
- x = image pixel column location in original image
- x' = image pixel column location in new image
- xi = pixel reference location (i = 1 or 2)
- y = image pixel row location in original image
- y' = image pixel row location in new image
- yj = pixel reference vertical location (j = 1 or 2)