ENH: Update to follow modern LaTeX2e recommendations
Search for and replace obsolete font commands in all `.tex` files:
- Replace `{\bf ...}` with `\textbf{...}`
- Replace `{\it ...}` with `\textit{...}`
- Replace `{\tt ...}` with `\texttt{...}`
- Replace `{\rm ...}` with `\textrm{...}`
- Replace `{\sc ...}` with `\textsc{...}`
- Replace `{\sf ...}` with `\textsf{...}`
- Replace `{\sl ...}` with `\textsl{...}`
- Replace `{\cal ...}` with `\mathcal{...}`
- Replace `{\em ...}` with `\emph{...}`
Search for and replace other obsolete commands:
- Replace `\over` with `\frac{...}{...}` (or `\overline{...}` where an overbar was intended)
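The font-command replacements above are mechanical enough to script. A minimal sketch (assuming Python; the `modernize` helper and its regex are illustrative and not part of this PR) that handles the simple, non-nested cases:

```python
import re

# Map obsolete two-letter font declarations to their LaTeX2e replacements.
# Note \cal becomes \mathcal (math mode) and \em becomes \emph.
REPLACEMENTS = {
    "bf": r"\textbf", "it": r"\textit", "tt": r"\texttt",
    "rm": r"\textrm", "sc": r"\textsc", "sf": r"\textsf",
    "sl": r"\textsl", "cal": r"\mathcal", "em": r"\emph",
}

# Match a group like {\bf some text} with no nested braces inside.
# The \s+ requires whitespace after the switch, so broken markup such as
# {\itCalculator} is deliberately left alone for manual review.
PATTERN = re.compile(r"\{\\(cal|bf|it|tt|rm|sc|sf|sl|em)\s+([^{}]*)\}")

def modernize(tex: str) -> str:
    r"""Rewrite obsolete {\bf ...}-style groups as their LaTeX2e forms."""
    return PATTERN.sub(
        lambda m: REPLACEMENTS[m.group(1)] + "{" + m.group(2) + "}", tex
    )
```

Groups with nested braces and every use of `\over` still need manual rewriting, since `\over` takes its numerator and denominator from the surrounding group rather than from brace-delimited arguments.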
SoftwareGuide/Latex/DesignAndFunctionality/AnisotropicDiffusionFiltering.tex (3 additions, 5 deletions)

```diff
@@ -36,8 +36,7 @@
 conductance at areas of large $|\nabla g|$, and can be any one of a number of
 functions. The literature has shown \begin{equation} c(|\nabla g|) =
 e^{-\frac{|\nabla g|^{2}}{2k^{2}}}\end{equation} to be quite effective.
-Notice that conductance term introduces a free parameter $k$, the {\em
-conductance parameter}, that controls the sensitivity of the process to edge
+Notice that the conductance term introduces a free parameter $k$, the \emph{conductance parameter}, that controls the sensitivity of the process to edge
 contrast. Thus, anisotropic diffusion entails two free parameters: the
 conductance parameter, $k$, and the time parameter, $t$, that is analogous to
 $\sigma$, the effective width of the filter when using Gaussian kernels.
@@ -63,9 +62,8 @@
 data (such as the color cryosection data of the Visible Human Project).

 For a vector-valued input $\vec{F}:U \mapsto\Re^{m}$ the process takes the
-form \begin{equation}\vec{F}_{t} = \nabla\cdot c({\cal D}\vec{F}) \vec{F},
-\label{eq:vector_diff}\end{equation} where ${\cal D}\vec{F}$ is a {\em
-dissimilarity} measure of $\vec{F}$, a generalization of the gradient magnitude
+form \begin{equation}\vec{F}_{t} = \nabla\cdot c(\mathcal{D}\vec{F}) \vec{F},
+\label{eq:vector_diff}\end{equation} where $\mathcal{D}\vec{F}$ is a \emph{dissimilarity} measure of $\vec{F}$, a generalization of the gradient magnitude
 to vector-valued images, that can incorporate linear and nonlinear coordinate
 transformations on the range of $\vec{F}$. In this way, the smoothing of the
 multiple images associated with vector-valued data is coupled through the
```
SoftwareGuide/Latex/DesignAndFunctionality/ConfidenceConnectedOnBrainWeb.tex (1 addition, 1 deletion)

```diff
@@ -1,5 +1,5 @@
 \subsubsection{Application of the Confidence Connected filter on the Brain Web Data}
-This section shows some results obtained by applying the Confidence Connected filter on the BrainWeb database. The filter was applied on a 181 $\times$ 217 $\times$ 181 crosssection of the {\itbrainweb165a10f17} dataset. The data is a MR T1 acquisition, with an intensity non-uniformity of 20\% and a slice thickness 1mm. The dataset may be obtained from
+This section shows some results obtained by applying the Confidence Connected filter on the BrainWeb database. The filter was applied on a 181 $\times$ 217 $\times$ 181 cross-section of the \textit{brainweb165a10f17} dataset. The data is an MR T1 acquisition, with an intensity non-uniformity of 20\% and a slice thickness of 1 mm. The dataset may be obtained from
```
```diff
 Load the Deformation field in Paraview. (The deformation field must be capable of handling vector data, such as MetaImages). Paraview shows a color map of the magnitudes of the deformation fields as shown in \ref{fig:ParaviewScreenshot1}.

-Convert the deformation field to 3D vector data using a {\itCalculator}. The Calculator may be found in the {\itFilter} pull down menu. A screenshot of the calculator tab is shown in Figure \ref{fig:ParaviewScreenshot2}. Although the deformation field is a 2D vector, we will generate a 3D vector with the third component set to 0 since Paraview generates glyphs only for 3D vectors. You may now apply a glyph of arrows to the resulting 3D vector field by using {\itGlyph} on the menu bar. The glyphs obtained will be very dense since a glyph is generated for each point in the data set. To better visualize the deformation field, you may adopt one of the following approaches.
+Convert the deformation field to 3D vector data using a \textit{Calculator}. The Calculator may be found in the \textit{Filter} pull-down menu. A screenshot of the calculator tab is shown in Figure \ref{fig:ParaviewScreenshot2}. Although the deformation field is a 2D vector, we will generate a 3D vector with the third component set to 0 since Paraview generates glyphs only for 3D vectors. You may now apply a glyph of arrows to the resulting 3D vector field by using \textit{Glyph} on the menu bar. The glyphs obtained will be very dense since a glyph is generated for each point in the data set. To better visualize the deformation field, you may adopt one of the following approaches.

-Reduce the number of glyphs by reducing the number in {\itMax. Number of Glyphs} to a reasonable amount. This uniformly downsamples the number of glyphs. Alternatively, you may apply a {\itThreshold} filter to the {\itMagnitude} of the vector dataset and then glyph the vector data that lie above the threshold. This eliminates the smaller deformation fields that clutter the display. You may now reduce the number of glyphs to a reasonable value.
+Reduce the number of glyphs by reducing the number in \textit{Max. Number of Glyphs} to a reasonable amount. This uniformly downsamples the number of glyphs. Alternatively, you may apply a \textit{Threshold} filter to the \textit{Magnitude} of the vector dataset and then glyph the vector data that lie above the threshold. This eliminates the smaller deformation fields that clutter the display. You may now reduce the number of glyphs to a reasonable value.

 Figure \ref{fig:ParaviewScreenshot3} shows the vector field visualized using Paraview by thresholding the vector magnitudes by 2.1 and restricting the number of glyphs to 100.
```

```diff
-Let us create a 3D deformation field. We will use Thin Plate Splines to warp a 3D dataset and create a deformation field. We will pick a set of point landmarks and translate them to provide a specification of correspondences at point landmarks. Note that the landmarks have been picked randomly for purposes of illustration and are not intended to portray a true deformation. The landmarks may be used to produce a deformation field in several ways. Most techniques minimize some regularizing functional representing the irregularity of the deformation field, which is usually some function of the spatial derivatives of the field. Here will we use {\itthin plate splines}. Thin plate splines minimize the regularizing functional
+Let us create a 3D deformation field. We will use Thin Plate Splines to warp a 3D dataset and create a deformation field. We will pick a set of point landmarks and translate them to provide a specification of correspondences at point landmarks. Note that the landmarks have been picked randomly for purposes of illustration and are not intended to portray a true deformation. The landmarks may be used to produce a deformation field in several ways. Most techniques minimize some regularizing functional representing the irregularity of the deformation field, which is usually some function of the spatial derivatives of the field. Here we will use \textit{thin plate splines}. Thin plate splines minimize the regularizing functional
```
SoftwareGuide/Latex/DesignAndFunctionality/WatershedSegmentation.tex (2 additions, 3 deletions)

```diff
@@ -87,8 +87,7 @@ \subsection{Overview}
 operators, and the $f$ in question will therefore have floating point
 values. The bottom-up strategy starts with seeds at the local minima in the
 image and grows regions outward and upward at discrete intensity levels
-(equivalent to a sequence of morphological operations and sometimes called {\em
-morphological watersheds} \cite{Serra1982}.) This limits the accuracy by
+(equivalent to a sequence of morphological operations and sometimes called \emph{morphological watersheds} \cite{Serra1982}.) This limits the accuracy by
 enforcing a set of discrete gray levels on the image.

 \begin{figure}
@@ -109,7 +108,7 @@ \subsection{Overview}
 initial segmentation is passed to a second sub-filter that generates a
 hierarchy of basins to a user-specified maximum watershed depth. The
 relabeler object at the end of the mini-pipeline uses the hierarchy and the
-initial segmentation to produce an output image at any scale {\embelow} the
+initial segmentation to produce an output image at any scale \emph{below} the
 user-specified maximum. Data objects are cached in the mini-pipeline so that
 changing watershed depths only requires a (fast) relabeling of the basic
 segmentation. The three parameters that control the filter are shown in
```