Class for performing: US probe calibration; calibration quality assessment; voxel-array reconstruction
Methods
Adjust the original time vector of US images. The time delay set by setDevicesTimeDelay() will be subtracted from the original time vector extracted from US data.
Note
This method must be called before any method using optoelectronic data, such as calculatePoseForUSProbe().
Align US images in the global reference frame. This task can take some time, and computation time is proportional to the total number of US images to align.
Calculate the roto-translation matrix from the global reference frame to a convenient reference frame. Voxel-array dimensions are calculated in this new reference frame. This rotation matters whenever the silhouette of the US scans is markedly oblique to some axis of the global reference frame; in that case the voxel-array dimensions (the smallest parallelepiped wrapping all the realigned scans), if calculated in the global reference frame, would not be optimal, i.e. larger than necessary.
Parameters: | convR : mixed
|
---|
Estimate the delay between the US device and the optoelectronic device.
Parameters: | method: str
vertCoordIdx : int, optional
showGraphs : bool, optional
|
---|
Calculate the pose of the US images with respect to the global reference frame.
Note
In [Ref2], this is the product .
Calculate the attitude (or pose) of the marker-based US probe reference frame with respect to the global reference frame.
Note
In [Ref2], this is named .
After extracting markers data from the kinematics files set with method setKineFiles(), the data are concatenated and resampled using USTimeVector, if available. Otherwise, kinematics data are resampled based on the optoelectronic system frequency and the US system frequency. The US probe attitude is calculated only once kinematics and US data share a common time line.
Parameters: | mkrList : list
USProbePoseFun : mixed
USProbePoseFunArgs : mixed
globPoseFunArgs : mixed
globPoseFun : mixed
kineFilesReadOpts : dict
showMarkers : bool
|
---|
Estimate calibration accuracy.
Parameters: | acc : str
L : float
P : np.ndarray
|
---|
Estimate calibration precision.
Parameters: | prec : str
|
---|
Calculate dimensions for the voxel array. The convenient reference frame (see calculateConvPose()) is translated to a voxel-array reference frame optimally containing the US images in the first quadrant.
Calculate the attitude (or pose) of the US images with respect to the probe reference frame.
Note
In [Ref2], this is named .
Parameters: | init : dict
xtol : float
ftol : float
method : str
method_args : dict
fixed : list
correctResults : bool
|
---|
(static) Evaluate the calibration matrix with parameter values.
Parameters: | x : dict
|
---|---|
Returns: | prRim : np.ndarray
Tim : np.ndarray
|
Export US scan silhouette voxel-array to VTI file.
Parameters: | outFile : str
|
---|
Export grey-values voxel-array to VTI file.
VTI is the VTK XML image data file format.
Parameters: | outFile : str
|
---|
Extract features (points, lines, ...) from US images.
The used file will be the one indicated in method setUSFiles().
Parameters: | feature : str
segmentation : str
segParams : dict
showViewer : bool
featuresFile : mixed
|
---|
Get adjusted US time vector (see adjustUSTimeVector()).
Returns: | list
|
---|
Get estimated delay between the US device and the optoelectronic device (See method calculateDevicesTimeDelay()).
Returns: | float
|
---|
Create virtual 3D points for US images corners with respect to the global reference frame.
Returns: | dict
|
---|
Get the pose of the US images with respect to the global reference frame.
Returns: | np.ndarray
|
---|
Get estimated calibration accuracy data.
Parameters: | acc : str
|
---|---|
Returns: | listDA : np.ndarray
DA : float
|
Get calibration results.
Returns: | prRim : np.ndarray
Tim : np.ndarray
sx, sy : float
calib : dict
|
---|
Get estimated calibration precision data.
Parameters: | prec : str
|
---|---|
Returns: | float
|
Return roto-translation matrix from voxel array reference frame to global reference frame.
Returns: | np.ndarray
|
---|
Get physical size for a single voxel.
Returns: | list
|
---|
Initialize the voxel array. It instantiates data for the voxel-array grey values.
Set data source properties (for US and/or optoelectronic system).
Parameters: | kineFreq : int, optional
USFreq : int, optional
w : int, optional
h : int, optional
USTimeVector : list, optional
fromUSFiles : list, optional
pixel2mmX, pixel2mmY : float; optional
|
---|
Set delay between the US device and the optoelectronic device.
Parameters: | timeDelay : float
|
---|
Set parameters for gap filling.
Parameters: | method : str
maxS : int
minPct : float
blocksN : int
blockDir : str
distTh : int
|
---|
Notes
Only the gaps internal to the wrapper created by alignImages() will be considered. If a gap is not filled, its value will be considered the same as a completely black voxel. See chapter In case of MemoryError for tips about setting these parameters.
Set kinematics files list.
Parameters: | kineFiles : list
|
---|
Set probe calibration data.
Parameters: | prRim : np.ndarray
Tim : np.ndarray
|
---|
Set or calculate scale factors that multiply real voxel-array dimensions.
Parameters: | fxyz : mixed
voxFramesBounds : mixed
|
---|
Set parameters for US scans alignment in the global reference frame. See chapter In case of MemoryError for tips about setting these parameters.
Set the list of frames (US time line) of the images that can be contained in the voxel array. Frames are further filtered out based on the invalid kinematics frames calculated by calculatePoseForUSProbe().
Parameters: | voxFrames : mixed
voxFramesBounds : mixed
|
---|
Set parameters of vtkImageData object.
Whenever a vtkImageData has to be created (e.g. for exportation purpose) from the internal voxel-array structure, these parameters are used.
Parameters: | sxyz : mixed
|
---|
Emulate Matrix(MatrixSymbol(b, r, c)) as it behaved up to SymPy version 0.7.2.
From SymPy 0.7.5 onwards, M = Matrix(MatrixSymbol(b, r, c)) is not safe since:
Calculate Distance Accuracy (DA), as indicated in [Ref2]. It needs 2 single-point features to be extracted for some US images of a calibration quality assessment acquisition. These 2 points (each from a different US image) are reconstructed in the global reference frame and the distance between them is calculated. This process can be repeated for other pairs of US images. For instance, if one point is indicated for frames 1, 4, 10, 15, 25, 40, then 3 distances are calculated (1-4, 10-15, 25-40). DA is the mean of the differences between these distances and the gold-standard measured real distance L.
Parameters: | T : np.ndarray
sx, sy : float
points : dict
L : float
|
---|---|
Returns: | listDA : np.ndarray
DA : float
|
Calculate Reconstruction Accuracy (RA), as indicated in [Ref2]. It needs 1 single-point feature to be extracted for some US images of a calibration quality assessment acquisition. These points (each from a different US image) are reconstructed in the global reference frame. RA is the mean of the norm of the differences between these points and the gold-standard points P.
Parameters: | T : np.ndarray
sx, sy : float
points : dict
P : np.ndarray
|
---|---|
Returns: | dist : np.ndarray
DA : float
|
Calculate point reconstruction precision (RP), as in [Ref1]. It needs a single-point feature to be extracted for some US images of a calibration quality assessment acquisition. The points are then reconstructed in 3D space, creating a cloud of points. RP is the mean of the distances between each 3D point and the 3D average point.
Parameters: | T : np.ndarray
sx, sy : float
points : dict
|
---|---|
Returns: | float
|
Estimate the delay between two normalized signals by cross-correlation. Normalization consists of demeaning and dividing by the maximum of the rectified signal. From the cross-correlation signal, the maximum value within the time range (-lagsBounds, lagsBounds), in seconds, is found. The time instant at which that maximum occurs is the estimated time delay. If positive, s2 is early with respect to s1 (see the sketch after this entry).
Parameters: | s1, s2 : np.ndarray
s1Label, s2Label : str
timeVector : np.ndarray
step : float
lagsBounds : mixed
withPlots : bool
|
---|---|
Returns: | float
|
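A minimal sketch of this cross-correlation approach, assuming numpy only, a constant sampling frequency freq (instead of the timeVector/step parameters above) and a hypothetical helper name estimate_delay; the library's implementation may differ in details:

```python
import numpy as np

def estimate_delay(s1, s2, freq, lagsBounds=1.0):
    # Normalize: demean and divide by the maximum of the rectified signal
    n1 = s1 - s1.mean()
    n1 /= np.abs(n1).max()
    n2 = s2 - s2.mean()
    n2 /= np.abs(n2).max()
    # Full cross-correlation and the corresponding lags (in samples)
    xc = np.correlate(n1, n2, mode='full')
    lags = np.arange(-(len(n2) - 1), len(n1))
    # Keep only lags within (-lagsBounds, lagsBounds) seconds
    keep = np.abs(lags / float(freq)) <= lagsBounds
    bestLag = lags[keep][np.argmax(xc[keep])]
    # With this lag convention, a positive delay means s2 is early w.r.t. s1
    return bestLag / float(freq)
```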
Generate and return symbolic expression of 4 x 4 affine rotation matrix from US probe reference frame to US image reference frame.
Returns: | prTi : sympy.matrices.matrices.MutableMatrix
syms : list
|
---|
Generate and return symbolic calibration equations (1) in [Ref2].
Returns: | Pph : sympy.matrices.matrices.MutableMatrix
J : sympy.matrices.matrices.MutableMatrix*)
prTi : sympy.matrices.matrices.MutableMatrix*)
syms : dict
variables : list
mus : list
|
---|
Generate and return symbolic calibration roto-translation matrix in (3) in [Ref3].
Returns: | i2Ti1 : sympy.matrices.matrices.MutableMatrix*)
prTi : sympy.matrices.matrices.MutableMatrix*)
syms : dict
variables : list
mus : list
|
---|
Minimize a modification of expression (1) in [Ref3]. More specifically, it aims at maximizing the average Normalized Cross-Correlation of the intersections of pairs of US images.
Parameters: | i2Ti1 : sympy.core.add.Add:
syms : dict
variables : list
init : list
Rpr : np.ndarray
Tpr : np.ndarray
I : np.ndarray
pixel2mmX, pixel2mmY : float
frames : list
savePath : str
thZ : float
maxExpr : str
mask : mixed
|
---|---|
Returns: | scipy.optimize.Result
|
Same as maximizeNCC(), but with Cython implementation. Needs compilation first!
(Deprecated) Same as maximizeNCC(), but with a vectorized implementation. Although it is about 2x faster than maximizeNCC(), we found this function to be memory-intensive, so we suggest not using it yet.
(Deprecated) Minimize a modification of expression (1) in [Ref3]. More specifically, it aims at maximizing the average Normalized Cross-Correlation of the intersections of pairs of US images.
Parameters: | i2Ti1 : sympy.core.add.Add
syms : dict
variables : list
init : list
Rpr : np.ndarray
Tpr : np.ndarray
I : np.ndarray
pixel2mmX, pixel2mmY : float
frames : list
thZ : float
|
---|---|
Returns: | scipy.optimize.Result
|
Solve calibration equations (1) in [Ref2]. More specifically, a system of non-linear equations is created by copying the symbolic equation eq, replacing the experimental data for each time frame, and stacking it into the system to be solved. The iterative method used to solve the system is Levenberg-Marquardt (see the sketch after this entry).
Parameters: | eq : sympy.core.add.Add
J : sympy.core.add.Add
syms : dict
variables : list
init : list
xtol : float
ftol : float
Rpr : np.ndarray
N x 3 x 3 array, where R[i,:,:] represents the rotation matrix from the US probe reference frame to the global reference frame, for time frame i.
Tpr : np.ndarray
features: dict
regJ: bool
|
---|---|
Returns: | sol : scipy.optimize.Result
k : int
|
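The general pattern is standard non-linear least squares; below is a hedged sketch, assuming a hypothetical residual_for_frame callback that evaluates eq (e.g. a lambdified SymPy expression) on the data of one time frame:

```python
import numpy as np
from scipy.optimize import least_squares

def solve_stacked_equations(residual_for_frame, frames_data, x0, xtol=1e-8, ftol=1e-8):
    # Stack one residual block per time frame and solve by Levenberg-Marquardt
    def residuals(x):
        return np.concatenate([residual_for_frame(x, d) for d in frames_data])
    return least_squares(residuals, x0, method='lm', xtol=xtol, ftol=ftol)
```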
Calculate the CD2 similarity measure (logarithm of the ratio of Rayleigh noises). Images are supposed to be log-compressed.
Parameters: | I1, I2 : np.ndarray(uint8)
|
---|---|
Returns: | float
|
Calculate Normalized Cross-Correlation between 2 binary images.
Parameters: | I1, I2 : np.ndarray(uint8)
|
---|---|
Returns: | float
|
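For reference, a minimal NCC between two equally sized images could look like this (illustrative only, not the library's code):

```python
import numpy as np

def ncc(I1, I2):
    a = I1.astype(float) - I1.mean()
    b = I2.astype(float) - I2.mean()
    denom = np.sqrt((a ** 2).sum() * (b ** 2).sum())
    return (a * b).sum() / denom if denom > 0 else 0.0
```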
Create all pixel coordinates for a centered mask around a point. Center point is (cx, cy).
Parameters: | cx : int
cy : int
w : int
h : int
|
---|---|
Returns: | np.ndarray
|
Create all pixel coordinates for an image. Top-left corner is supposed to be the (0, 0) corner.
Parameters: | w : int
h : int
pixel2mmX, pixel2mmY : float
|
---|---|
Returns: | np.ndarray
|
Create corner coordinates for an image. Top-left corner is supposed to be the (0, 0) corner.
Parameters: | w : int
h : int
pixel2mmX, pixel2mmY : float
|
---|---|
Returns: | np.ndarray
|
Create random pixel coordinates for a centered mask around a point. Center point is (cx, cy).
Parameters: | cx : int
cy : int
w : int
h : int
N : int
|
---|---|
Returns: | np.ndarray
|
Create white mask in a grayscale frame, centered around (cx, cy).
Parameters: | frameGray : np.ndarray
cx : int
cy : int
w : int
h : int
|
---|---|
Returns: | tuple
|
Find Shi-Tomasi corners in a subpart of a frame. The search mask is centered around (cx, cy).
Parameters: | frameGray : np.ndarray
cx : int
cy : int
w : int
h : int
featureParams : dict
|
---|---|
Returns: | tuple
|
Calculate similarity measure between histograms.
Parameters: | H1, H2 : np.ndarray
dist : str
|
---|---|
Returns: | float
|
Execute template matching between a template and a search window, using different similarity measures (see the sketch after this entry).
Parameters: | SW : np.ndarray(H x W)
T : np.ndarray(h x w)
meas : mixed
**kwargs : dict
|
---|---|
Returns: | np.ndarray(H-h+1 x W-w+1)
|
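If the chosen measure is a normalized cross-correlation, OpenCV's built-in template matching produces the same kind of (H-h+1) x (W-w+1) response map; a sketch with made-up data:

```python
import cv2
import numpy as np

SW = np.random.randint(0, 256, (100, 120), dtype=np.uint8)   # search window (H x W)
T = SW[40:60, 50:80].copy()                                  # template (h x w)

res = cv2.matchTemplate(SW, T, cv2.TM_CCOEFF_NORMED)         # response map
_, maxVal, _, maxLoc = cv2.minMaxLoc(res)                    # best match location and score
print(maxLoc, maxVal)
```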
Convert pixel array to grey values.
Parameters: | D : np.ndarray
|
---|---|
Returns: | np.ndarray
|
Read DICOM file (containing multi-channel data with Nf frames and images of size Nr x Nc).
Parameters: | filePath : str
method : str
|
---|---|
Returns: | D : np.ndarray
ds : dicom.dataset.FileDataset
|
Helper for reading image sequence input file.
Parameters: | img : mixed
reader : str
**kwargs : dict
|
---|---|
Returns: | I : np.ndarray
metadata : dict
|
Helper for reading a SITK-compatible input file.
Parameters: | filePath : str
|
---|---|
Returns: | I : np.ndarray
image : sitk.Image
|
Convert RGB channels to grey levels using a luminance-weighting formula (see the sketch after this entry).
Parameters: | R, G, B : np.ndarray
|
---|---|
Returns: | np.ndarray
|
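The exact weights used by the library are not shown here; a common choice is the ITU-R BT.601 luma weighting, sketched below:

```python
import numpy as np

def rgb_to_grey(R, G, B):
    # BT.601 luma weights; the library's formula may use different coefficients
    return (0.299 * R + 0.587 * G + 0.114 * B).astype(np.uint8)
```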
Methods
Create a voxel array containing spheres with random positions and radii. Sphere voxels get the maximum grey level, the rest the minimum grey level. There is no internal check preventing spheres from overlapping each other (see the sketch after this entry).
Parameters: | xl : int
yl : int
zl : int
N : int
rMax : mixed
|
---|---|
Returns: | np.array(uint8)
|
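A possible sketch of such a generator, assuming the hypothetical name random_spheres and 8-bit grey levels:

```python
import numpy as np

def random_spheres(xl, yl, zl, N, rMax):
    V = np.zeros((xl, yl, zl), dtype=np.uint8)      # background: minimum grey level
    x, y, z = np.mgrid[0:xl, 0:yl, 0:zl]
    for _ in range(N):
        cx, cy, cz = (np.random.randint(0, s) for s in (xl, yl, zl))
        r = np.random.randint(1, rMax + 1)
        inside = (x - cx) ** 2 + (y - cy) ** 2 + (z - cz) ** 2 <= r ** 2
        V[inside] = 255                             # spheres: maximum grey level
    return V
```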
Create the convex hull for a list of points and the list of coordinates internal to it.
Parameters: | p : np.ndarray
|
---|---|
Returns: | np.ndarray
|
Create cube or parallelepiped coordinates.
Parameters: | S : mixed
|
---|---|
Returns: | list
|
Create sphere coordinates.
Parameters: | r : int
|
---|---|
Returns: | list
|
Transform a list of indices of a 1D array into coordinates of a 3D volume of given size (see the sketch after this entry).
Parameters: | idx : np.ndarray
xl, yl, zl : int
|
---|---|
Returns: | list
|
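Assuming C-ordering of the flattened volume (the library's ordering may differ), numpy already provides this mapping and its inverse:

```python
import numpy as np

xl, yl, zl = 10, 12, 8
idx = np.array([0, 5, 42])
x, y, z = np.unravel_index(idx, (xl, yl, zl))             # 1D indices -> 3D coordinates
idx_back = np.ravel_multi_index((x, y, z), (xl, yl, zl))  # and back (cf. xyz2idx)
```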
Transform a 1D numpy array into vtk.vtkImageData object. The object contains only one scalar component.
Parameters: | v : np.ndarray
d : list
s : list
vtkScalarType :
|
---|---|
Returns: | vtk.vtkImageData
|
Export a vtk.vtkImageData object to VTI file.
Parameters: | filePath : str
source : vtk.vtkImageData
|
---|
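A sketch of how the two helpers above can be reproduced with the standard VTK Python API (dimensions, spacing and file name below are made up):

```python
import numpy as np
import vtk
from vtk.util import numpy_support

d, s = (10, 12, 8), (0.5, 0.5, 0.5)          # dimensions and spacing (e.g. mm)
v = np.zeros(d[0] * d[1] * d[2], dtype=np.uint8)

img = vtk.vtkImageData()
img.SetDimensions(d[0], d[1], d[2])
img.SetSpacing(s[0], s[1], s[2])
arr = numpy_support.numpy_to_vtk(v, deep=True, array_type=vtk.VTK_UNSIGNED_CHAR)
img.GetPointData().SetScalars(arr)           # one scalar component

writer = vtk.vtkXMLImageDataWriter()         # VTI writer
writer.SetFileName('volume.vti')
writer.SetInputData(img)
writer.Write()
```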
Transform coordinates of a 3D volume of given size into a list of indices of a 1D array. This is the inverse of function idx2xyz().
Parameters: | x, y, z : np.ndarray
xl, yl, zl : int
idx: str
|
---|---|
Returns: | np.ndarray or list
|
Convert joint rotation matrix to ZXY Euler sequence.
Parameters: | Rvect : np.ndarray
|
---|---|
Returns: | list
|
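With SciPy, a Z-X-Y decomposition can be obtained as below; whether the library uses intrinsic or extrinsic angles (and radians or degrees) is an assumption to verify:

```python
import numpy as np
from scipy.spatial.transform import Rotation

R = np.eye(3)  # example joint rotation matrix
angles = Rotation.from_matrix(R).as_euler('ZXY', degrees=True)  # intrinsic Z-X-Y
```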
Helper class for reconstructing the stylus tip using source points rigidly connected to the stylus.
Methods
Get tip data.
Returns: | np.ndarray
|
---|
Set source points 3D coordinates.
Parameters: | P : dict
|
---|
Calculate roto-translation matrix from calcaneus to laboratory reference frame.
Parameters: | mkrs : dict
s : {‘R’, ‘L’}
|
---|---|
Returns: | R : np.ndarray
T : np.ndarray
|
References
Leardini A, Benedetti MG, Berti L, Bettinelli D, Nativo R, Giannini S. Rear-foot, mid-foot and fore-foot motion during the stance phase of gait. Gait Posture. 2007 Mar;25(3):453-62. Epub 2006 Sep 11. PubMed PMID: 16965916.
Calculate roto-translation matrix from calcaneus to laboratory reference frame, using a rigid segment-connected cluster of technical markers.
Parameters: | mkrs : dict
clusterMkrList : list
args : mixed
|
---|---|
Returns: | R : np.ndarray
T : np.ndarray
mkrsSeg : dict
|
Helper function for:
- markers cluster pose estimation (by SVD)
- reconstruction of the stylus tip in the cluster reference frame.
Parameters: | markers : dict
clusterMkrList : list
clusterArgs : mixed
|
---|---|
Returns: | np.ndarray
|
Express markers in another reference frame.
Parameters: | mkrs : dict
Rfull : np.ndarray
|
---|---|
Returns: | dict
|
Tip reconstruction function for M collinear points.
Parameters: | P : dict
args : dict
|
---|---|
Returns: | np.ndarray
|
Create affine roto-translation matrix from rotation matrix and translation vector.
Parameters: | R : np.ndarray
T : np.ndarray
|
---|---|
Returns: | np.ndarray
|
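A minimal sketch of the composition (hypothetical helper name):

```python
import numpy as np

def compose_affine(R, T):
    M = np.eye(4)
    M[:3, :3] = R                     # 3 x 3 rotation block
    M[:3, 3] = np.asarray(T).ravel()  # translation column
    return M
```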
Create cluster template data from existing markers data.
Parameters: | markers : dict
mkrList : list
timeWin : mixed
|
---|---|
Returns: | dict
|
Extract rotation matrix and translation vector from affine roto-translation matrix.
Parameters: | Rfull : np.ndarray
|
---|---|
Returns: | R : np.ndarray
T : np.ndarray
|
Compute K matrix products between an M x N array and a K x N x P array in a vectorized way.
Parameters: | a : np.ndarray
b : np.ndarray
K x N x P array
|
---|---|
Returns: | np.ndarray
|
Compute K matrix products between a K x M x N array and K x N x P array in a vectorized way.
Parameters: | a : np.ndarray
b : np.ndarray
|
---|---|
Returns: | np.ndarray
|
Calculate roto-translation matrix from foot (ISB conventions) to laboratory reference frame.
Parameters: | mkrs : dict
s : {‘R’, ‘L’}
|
---|---|
Returns: | R : np.ndarray
T : np.ndarray
|
References
Leardini A, Benedetti MG, Berti L, Bettinelli D, Nativo R, Giannini S. Rear-foot, mid-foot and fore-foot motion during the stance phase of gait. Gait Posture. 2007 Mar;25(3):453-62. Epub 2006 Sep 11. PubMed PMID: 16965916.
Calculate roto-translation matrix from foot (ISB conventions) to laboratory reference frame, using rigid segment-connected cluster of technical markers.
Parameters: | mkrs : dict
clusterMkrList : list
args : mixed
|
---|---|
Returns: | R : np.ndarray
T : np.ndarray
mkrsSeg : dict
|
Calculate Grood & Suntay anatomical joint angles from proximal and distal segment rotation matrices. Angles are related to the flexion-extension (FE) axis of the proximal segment, the internal-external rotation (IE) axis of the distal segment, and the ab-adduction (AA) floating axis.
Parameters: | Rvect : np.ndarray
|
---|---|
Returns: | list
|
References
Grood ES, Suntay WJ. A joint coordinate system for the clinical description of three-dimensional motion: application to the knee. J Biomech Eng. 1983;105(2):136-144.
Calculate 3 joint angles between 2 rigid bodies.
Parameters: | R1 : np.ndarray
R2 : np.ndarray
R2anglesFun : func
|
---|---|
Returns: | np.ndarray
|
Calculate versors (unit vectors) of an array.
Parameters: | a : np.ndarray
|
---|---|
Returns: | np.ndarray
|
Interpolate data array, with extrapolation. Data can contain NaNs. The gaps will not be filled.
Parameters: | D : np.ndarray
x : np.ndarray
xNew : np.ndarray
kSpline : mixed
|
---|---|
Returns: | np.ndarray
|
Behaves like np.linalg.inv for multiple matrices, but does not raise exceptions if a matrix contains NaNs and is not invertible.
Parameters: | R : np.ndarray
|
---|---|
Returns: | np.ndarray
|
Default function for calculating a roto-translation matrix from a cluster of markers to the laboratory reference frame. It is based on the global positions of the markers only, with no rigid-body assumption. The reference frame is defined as:
Parameters: | mkrs : dict
mkrList : list
|
---|---|
Returns: | R : np.ndarray
T : np.ndarray
|
Run Principal Component Analysis on a data matrix. It performs SVD on the data covariance matrix (see the sketch after this entry).
Parameters: | D : np.ndarray
|
---|---|
Returns: | list
|
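A minimal PCA-by-SVD sketch (samples on rows; illustrative only):

```python
import numpy as np

def pca(D):
    Dc = D - D.mean(axis=0)            # center the data
    C = np.cov(Dc, rowvar=False)       # covariance matrix
    U, S, Vt = np.linalg.svd(C)
    return U, S                        # principal axes and associated variances
```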
Read C3D file.
Parameters: | fileName : str
sections : list
opts : dict
|
---|---|
Returns: | dict
|
Resample marker data. The function first checks whether the new time scale x, together with origFreq (to build the old scale) or origX, is available. If not, the resampling takes one frame every step frames.
Parameters: | M : np.ndarray
x : np.ndarray
origFreq : double
origX : np.ndarray
step : int
|
---|---|
Returns: | Mout : np.ndarray
ind : np.ndarray
|
Resample markers data.
Parameters: | M : dict
**kwargs
|
---|---|
Returns: | resM : dict
ind : np.ndarray
|
Function for calculating the optimal roto-translation matrix from a rigid cluster of markers to the laboratory reference frame. The computation, by using SVD, minimizes the RMSE between the markers in the laboratory reference frame and the positions of the markers in the local reference frame. See rigidBodyTransformation() for more details.
Parameters: | mkrs : dict
mkrList : list
args : mixed
|
---|---|
Returns: | R : np.ndarray
T : np.ndarray
|
Estimate the rigid rotation and translation between x and y such that y = Rx + t + e is optimal in a least-squares sense. A sketch of the algorithm is given after this entry.
Parameters: | x : np.ndarray
y : np.ndarray
|
---|---|
Returns: | R : np.ndarray
t : np.ndarray
e : np.ndarray
|
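A sketch of the usual SVD-based (Kabsch) solution, with points on rows; the library's conventions for point layout may differ:

```python
import numpy as np

def rigid_body_transformation(x, y):
    cx, cy = x.mean(axis=0), y.mean(axis=0)
    H = (x - cx).T @ (y - cy)          # cross-covariance of centered points
    U, S, Vt = np.linalg.svd(H)
    R = Vt.T @ U.T
    if np.linalg.det(R) < 0:           # avoid reflections
        Vt[-1, :] *= -1
        R = Vt.T @ U.T
    t = cy - R @ cx
    e = y - (x @ R.T + t)              # residuals
    return R, t, e
```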
Calculate roto-translation matrix from shank (ISB conventions) to laboratory reference frame.
Parameters: | mkrs : dict
s : {‘R’, ‘L’}
|
---|---|
Returns: | R : np.ndarray
T : np.ndarray
|
References
Leardini A, Benedetti MG, Berti L, Bettinelli D, Nativo R, Giannini S. Rear-foot, mid-foot and fore-foot motion during the stance phase of gait. Gait Posture. 2007 Mar;25(3):453-62. Epub 2006 Sep 11. PubMed PMID: 16965916.
Calculate roto-translation matrix from shank (ISB conventions) to laboratory reference frame, using rigid segment-connected cluster of technical markers.
Parameters: | mkrs : dict
clusterMkrList : list
args : mixed
|
---|---|
Returns: | R : np.ndarray
T : np.ndarray
mkrsSeg : dict
|
Compute dot product in a vectorized way.
Parameters: | a : np.ndarray
b : np.ndarray
|
---|---|
Returns: | np.ndarray
|
Write to C3D file.
Parameters: | fileName : str
data : dict
copyFromFile : str
|
---|
Class for visualizing 2D image frames and manually creating a mask.
Parameters: | maskParams : int
data : dict
*args :
**kwargs :
|
---|
Methods
Class adding manual masking capabilities to class MaskImageUI or a derived class.
Parameters: | viewer : ViewerUI
maskParams : int
data : dict
|
---|
Methods
Callback for selector type.
Parameters: | label : str
|
---|
Allow loading masks data from file via a user dialog.
Parameters: | event : matplotlib.backend_bases.MouseEvent
|
---|
Allow saving masks data to file via a user dialog.
Parameters: | event : matplotlib.backend_bases.MouseEvent
|
---|
Allow toggling between add and remove modes.
Parameters: | event : matplotlib.backend_bases.MouseEvent
|
---|
Class adding automatic (Hough transform) line segmentation capabilities to class ViewerUI or a derived class. The images are supposed to have two areas of different grey levels, divided by a single line. For details on the automatic line detection algorithm, see function detectHoughLongestLine().
Parameters: | autoSegParams : dict
dataConstr : list
*args
|
---|
Methods
Create points on the automatically detected line.
Parameters: | event : matplotlib.backend_bases.MouseEvent:
|
---|
Callback for Canny edge detector Sobel kernel size.
Parameters: | val : float
|
---|
Callback for dilation kernel width.
Parameters: | val : float
|
---|
Callback for dilation kernel height.
Parameters: | val : float
|
---|
Callback for probabilistic Hough transform maximum line gap.
Parameters: | val : float
|
---|
Callback for probabilistic Hough transform minimum line length.
Parameters: | val : float
|
---|
Callback for Canny edge detector lower threshold.
Parameters: | val : float
|
---|
Callback for Canny edge detector higher threshold.
Parameters: | val : float
|
---|
Callback for probabilistic Hough transform threshold.
Parameters: | val : float
|
---|
Class adding manual point extraction capabilities to class ViewerUI or a derived class.
Parameters: | viewer : ViewerUI
Npoints : int
data : dict
|
---|
Methods
Allow manually clicking Npoints points in the current image.
Parameters: | event : matplotlib.backend_bases.MouseEvent
|
---|
Allow loading points data from file via a user dialog.
Parameters: | event : matplotlib.backend_bases.MouseEvent
|
---|
Allow saving points data to file via a user dialog.
Class for visualization of 2D image frames and automatically segmentable points lying on a line. The images are supposed to have two areas of different grey levels, divided by a single line. For details on the automatic line detection algorithm, see function detectHoughLongestLine(). Automatically detected points can be manually adjusted.
Parameters: | Npoints : int
autoSegParams : dict
dataConstr : list
data : dict
*args:
**kwargs:
|
---|
Methods
Class for visualization of 2D image frames and manually segmentable points.
Parameters: | Npoints : int
data : dict
*args
**kwargs:
|
---|
Methods
(deprecated) Class for performing manual point feature extraction.
Parameters: | I : np.array
data : dict
Nclicks : int
block : bool
title : str
|
---|
Methods
Allow manually clicking Nclicks points in the current image.
Parameters: | event : matplotlib.backend_bases.MouseEvent
|
---|
Allow loading points data from file via a user dialog.
Parameters: | event : matplotlib.backend_bases.MouseEvent
|
---|
Allow saving points data to file via a user dialog.
Parameters: | event : matplotlib.backend_bases.MouseEvent
|
---|
Get clicked points data.
Returns: | dict
|
---|
Show next image.
Parameters: | event : matplotlib.backend_bases.MouseEvent
|
---|
Class for visualization of 2D image frames.
Parameters: | I : np.array
title : str
|
---|
Methods
Callback for frame index slider.
Parameters: | val : float
|
---|
Show next image.
Parameters: | event : param matplotlib.backend_bases.MouseEvent
|
---|
Class for visualization of 2D image frames and image features.
Parameters: | *args
**kwargs
|
---|
Methods
Given a noisy grey-scale image containing two main blocks separated by a straight line, this function detects that line (see the sketch after this entry). The algorithm performs the following steps:
Parameters: | I : np.ndarray(uint8)
thI : float
thCan1 : int
thCan2 : int
kerSizeCan : int
kerSizeDil : list
thHou : int
minLineLength : int
maxLineGap : int
|
---|---|
Returns: | a, b : float
bw : np.ndarray
edges : np.ndarray
dilate : np.ndarray
|
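A rough sketch of such a Canny + probabilistic Hough pipeline with OpenCV; parameter names mirror the ones above, but the library's exact steps (image thresholding, dilation) are omitted:

```python
import cv2
import numpy as np

def detect_longest_line(I, thCan1=50, thCan2=150, thHou=50,
                        minLineLength=50, maxLineGap=10):
    edges = cv2.Canny(I, thCan1, thCan2)
    lines = cv2.HoughLinesP(edges, 1, np.pi / 180, thHou,
                            minLineLength=minLineLength, maxLineGap=maxLineGap)
    if lines is None:
        return None
    # Keep the longest segment and return its slope/intercept (a, b)
    x1, y1, x2, y2 = max(lines[:, 0, :],
                         key=lambda l: np.hypot(l[2] - l[0], l[3] - l[1]))
    a = (y2 - y1) / (x2 - x1) if x2 != x1 else np.inf
    b = y1 - a * x1
    return a, b
```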
Read feature file data.
Parameters: | filePath : str
|
---|---|
Returns: | dict
|
Transform a single-point features structure into a matrix containing 3D point data.
Parameters: | fea : dict
u, v : float
idx : list
|
---|---|
Returns: | np.ndarray
|
Convert gray-scale Numpy 3D image array to AVI file (use moviepy).
Parameters: | fileName : str
M : np.ndarray(uint8)
fps : int
|
---|
Convert gray-scale Numpy 3D image array to AVI file (use OpenCV).
Parameters: | fileName : str
M : np.ndarray(uint8)
fps : int
|
---|
Convert gray-scale Numpy 3D image array to gray-scale image-sequence DICOM file.
Parameters: | I : np.ndarray
fileOut : str
|
---|
Convert gray-scale Numpy 3D image array to image sequence file.
Parameters: | fileName : str
M : np.ndarray(uint8)
fps : int
|
---|
Convert gray-scale AVI file to gray-scale image-sequence DICOM file. Frame rate is not added as CineRate tag in the DICOM file.
Parameters: | fileIn : str
fileOut : str
|
---|
Convert voxel-array from VTI to Numpy 3D image array.
Parameters: | fileIn : str
fileOut : str
|
---|---|
Returns: | V : np.ndarray
metadata : dict
|
Calculate muscle-tendon junction (MTJ) lengths based on insertion and MTJ positions.
Parameters: | P1 : np.ndarray
P2 : np.ndarray
P3 : np.ndarray
|
---|---|
Returns: | dict
|
Implement the method described by Lee et al. (2008) for calculating the muscle-tendon junction (MTJ) velocity vector (u, v) between one frame and the next.
Parameters: | allDx, allDy : np.ndarray
pctEx : float
pctIn : float
direction : string
|
---|---|
Returns: | tuple
|
Function for enhancing the image containing the muscle-tendon junction (MTJ) for tracking purposes. Two characteristics can be enhanced:
Parameters: | img : np.ndarray
method : str
enhanceArgs : tuple
enhanceKwargs : dict
|
---|---|
Returns: | np.ndarray
|
Create quality indices prior to optical flow computation.
Note
These considerations are important:
Parameters: | I : np.ndarray
show : bool
|
---|
Function for tracking muscle-tendon junction (MTJ) (semi)automatically.
Parameters: | img : mixed
plotRawInputImage : bool
f1, f2 : mixed
y1, y2 : mixed
x1, x2 : mixed
enhanceImage : bool
enhancePars : list
lowpassFilterTime : bool
lowpassFilterTimePars : tuple
cx, cy : int
h, w : int
adjustManuallyCxy : bool
adjustManuallyCxyOnTracking : bool
adjustManuallyCxyOnTrackingCond : list
cxyHelp : dict
cxOffset, cyOffset : int
adjustManuallyCxyOffset : bool
stepFramesN : int
technique : str
techniquePars : tuple
plotImageInTracking : bool
winName : str
timePerImage : mixed
plotTrackedFeatures : bool
plotCircle : str
plotTechniqueRes : bool
plotTechniqueResAddData : dict
saveTechniqueResTo : mixed
outFiles : list
|
---|---|
Returns: | tuple
|
Function for tracking the muscle-tendon junction (MTJ) (semi)automatically, using N different trackers. Only one tracker is run at a time.
Parameters: | img : mixed
trackersN : int
trackersPars : list
f1, f2 : mixed
y1, y2 : mixed
x1, x2 : mixed
plotImageInTracking : bool
timePerImage : mixed
plotTrackedFeatures : bool
outFiles : list
|
---|---|
Returns: | tuple
|
Calculate the weighted average (expectation estimator) of samples.
Parameters: | x : np.ndarray (nS x nP)
w : np.ndarray (nP)
|
---|---|
Returns: | np.ndarray (nS)
|
Calculate the maximum likelihood estimate from samples (i.e. the sample with the highest weight).
Parameters: | x : np.ndarray (nS x nP)
w : np.ndarray (nP)
|
---|---|
Returns: | np.ndarray (nS)
|
Particle filter implementation.
These are the name conventions for variables:
Methods
Estimate and return state for next iteration.
Returns: | np.ndarray (nS)
|
---|
Get the number of effective particles, calculated by the inverse of the sum of squared normalized weights.
Returns: | int
|
---|
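Equivalently, for weights w (normalized inside the helper):

```python
import numpy as np

def n_effective(w):
    wn = w / w.sum()                 # ensure normalization
    return 1.0 / np.sum(wn ** 2)     # effective number of particles
```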
Get state for next iteration, from particle weights only.
Returns: | np.ndarray (nS)
|
---|
Get weights for updated particles.
Returns: | np.ndarray (nP)
|
---|
Perform the resampling step.
Parameters: | mode : str
|
---|---|
Returns: | np.ndarray (nS x nP)
|
Set additional data, to be passed to and used by model functions.
Parameters: | data : mixed
|
---|
Set particle weights for the current iteration.
Parameters: | w : np.ndarray (nP)
|
---|
Set particle data for the current iteration.
Parameters: | x : np.ndarray (nS x nP)
|
---|
Set the a-posteriori probability model (i.e. p(x|Y))
Parameters: | model : fun
|
---|
Set the observation model (i.e. p(y|x))
Parameters: | model : fun
|
---|
Set the probability/state transition model (i.e. p(x+1|x))
Parameters: | model : fun
|
---|
Set estimated states history.
Parameters: | xEst : np.ndarray (nS x nI)
|
---|
Set the model for state estimation from weights (e.g. maximum likelihood, expectation, etc.).
Parameters: | model : fun
|
---|
Implement a rectified NCC-based observation model for image tracking. Rectified NCC (RNCC) is the same as NCC, but truncated to 0 when NCC is lower than 0. The probability distribution is directly expressed as the RNCC value (0 <= RNCC <= 1) between a target patch and patches centered around the positional part of particles. The first 2 state variables must be x and y position of the patch center under tracking.
Parameters: | xNext : np.ndarray (nS x nP)
xEst : np.ndarray (nS x nI)
w : np.ndarray (nS).
addData : dict
|
---|---|
Returns: | np.ndarray (nS)
|
Use a-posteriori probability model described in arm01ConstVelEstFun() and perform target template histogram update with a certain learning rate, for image tracking purpose. This function has to be used in combination with any histogram-distance-based observation model (e.g. bhattDistObsFun()). The first 2 state variables must be x and y position of the patch center under tracking.
Parameters: | xEst : np.ndarray (nS x nI)
xNext : np.ndarray (nS x nP)
wNextNorm : np.ndarray (nP)
xNextFromWeights : np.ndarray (nS).
addData : dict
|
---|---|
Returns: | np.ndarray (nS)
|
Use a-posteriori probability model described in arm01ConstVelEstFun() and perform target template update with a certain learning rate, for image tracking purpose. This function has to be used in combination with any template-match-based observation model (e.g. RNCCObsFun()). The first 2 state variables must be x and y position of the patch center under tracking.
Parameters: | xEst : np.ndarray (nS x nI)
xNext : np.ndarray (nS x nP)
wNextNorm : np.ndarray (nP)
xNextFromWeights : np.ndarray (nS).
addData : dict
|
---|---|
Returns: | np.ndarray (nS)
|
Implement an a-posteriori probability model, with adaptive state, for the first-order autoregressive (constant velocity) model for a 2D point (see arm01ConstVelFun()). The estimated state is a weighted average between xNextFromWeights and the last estimated state.
Parameters: | xEst : np.ndarray (nS x nI)
xFromWeights : np.ndarray (nS).
pars : dict
|
---|---|
Returns: | np.ndarray (nS)
|
Implement a first-order autoregressive (constant velocity) model for a 2D point. The state vector is composed of x and y position, followed by x and y velocity. Gaussian noise is added to both position and velocity (see the sketch after this entry).
Parameters: | x : np.ndarray (nS x nP)
xEst : np.ndarray (nS x nI)
addData : dict
|
---|---|
Returns: | np.ndarray (nS x nP)
|
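A sketch of such a propagation step, with particles stored as an nS x nP array [x, y, vx, vy] and purely illustrative noise levels:

```python
import numpy as np

def const_vel_step(x, sigmaPos=1.0, sigmaVel=0.5):
    xNext = x.copy()
    xNext[0:2] += x[2:4]                                       # position += velocity
    xNext[0:2] += np.random.normal(0, sigmaPos, xNext[0:2].shape)
    xNext[2:4] += np.random.normal(0, sigmaVel, xNext[2:4].shape)
    return xNext
```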
Implement a Bhattacharyya distance-based observation model for image tracking. The probability distribution is modelled as a Gaussian distribution of the Bhattacharyya histogram distance between a target patch and patches centered around the positional part of particles. The first 2 state variables must be x and y position of the patch center under tracking.
Parameters: | xNext : np.ndarray (nS x nP)
xEst : np.ndarray (nS x nI)
w : np.ndarray (nS).
addData : dict
|
---|---|
Returns: | np.ndarray (nS)
|
Create particles inside a bounding box when the box center is given manually or when there are no estimated states yet. The first 2 state variables must be the x and y position of the box center.
Parameters: | nParticles : int
xEst : np.ndarray (nS x nI)
addData : dict
|
---|---|
Returns: | xParticles : np.ndarray (nS x nP)
|
Perform data resampling based on inverse transform sampling (see the sketch after this entry).
Parameters: | x : np.ndarray (nS x nP)
pdf : np.ndarray (nP)
nSamples : int
|
---|---|
Returns: | np.ndarray (nS x nP)
|
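A minimal sketch of inverse transform resampling for an nS x nP particle array:

```python
import numpy as np

def resample_inverse_transform(x, pdf, nSamples):
    cdf = np.cumsum(pdf / pdf.sum())            # cumulative distribution
    u = np.random.rand(nSamples)                # uniform draws in [0, 1)
    idx = np.searchsorted(cdf, u)               # invert the CDF
    return x[:, idx]
```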