
Numerical results


Information given by the user

Since the classification is supervised, the user has to provide the number of classes (textures), as well as the parameters of each class (the first- and second-order moments of the energy distribution in each sub-band of the wavelet packet decomposition).
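As a rough illustration of these class parameters, the sketch below estimates the first- and second-order moments of the sub-band energies from sample patches of one texture. It is only a minimal sketch: it uses a single-level 2D Haar decomposition into four sub-bands, whereas the paper uses a (deeper) wavelet packet decomposition; the function names are hypothetical.

```python
import numpy as np

def haar_subbands(img):
    """One-level 2D Haar decomposition into four sub-bands (LL, LH, HL, HH).
    Stand-in for the paper's wavelet packet decomposition (an assumption)."""
    a = img[0::2, 0::2]; b = img[0::2, 1::2]
    c = img[1::2, 0::2]; d = img[1::2, 1::2]
    ll = (a + b + c + d) / 2.0
    lh = (a + b - c - d) / 2.0
    hl = (a - b + c - d) / 2.0
    hh = (a - b - c + d) / 2.0
    return [ll, lh, hl, hh]

def class_parameters(sample_patches):
    """First- and second-order moments of the sub-band energies,
    estimated over sample patches of a single texture class."""
    energies = np.array([[np.mean(sb ** 2) for sb in haar_subbands(p)]
                         for p in sample_patches])  # (n_patches, n_subbands)
    return energies.mean(axis=0), energies.var(axis=0)
```

A training step would call `class_parameters` once per texture class, on patches cut from a reference image of that texture.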

Parameters:

In our experiments, we always choose $ e_{1}=\dots =e_{K}=1.0$ and $ \gamma_{1}=\dots =\gamma_{K}= \gamma$ ( $ \gamma \in {\mathbb{R}}$). There remain only two parameters to set: the partition term coefficient $ \lambda$, and the common value $ \gamma$ of the contour regularization terms.

Initialization

We first performed a manual initialization with circles or squares. Each circle then represents the zero level set of one of the classes (see figure 7).

Figure 7: Manual initialization
\includegraphics[scale=0.8]{initmanuangl.eps}

To obtain an automatic initialization that is independent of the user, we then used ``seeds'': we split the initial image into small sub-images (in practice, 5x5 sub-images). In each sub-image, for each class $ k$, we compute the data term by assuming that all the pixels of the sub-image belong to class $ k$. We then assign all the pixels of the sub-image to the class $ k$ for which the energy of the whole sub-image is smallest (see figure 8). We used this initialization in the examples presented hereafter.
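The seed procedure above can be sketched as follows. This is only an illustrative sketch, not the paper's exact data term: here the data term is taken as a Gaussian log-likelihood on the raw sub-image energy, whereas the paper evaluates it on the energies of the wavelet packet sub-bands; the function name and the `class_params` format `(mean, variance)` are assumptions.

```python
import numpy as np

def seed_initialization(image, class_params, block=5):
    """Assign each block x block sub-image to the class whose data term is
    smallest, assuming all its pixels belong to that class.
    class_params: list of (mean, variance) of the sub-image energy per class
    (a simplification; the paper uses per-sub-band energy moments)."""
    h, w = image.shape
    labels = np.zeros((h, w), dtype=int)
    for i in range(0, h, block):
        for j in range(0, w, block):
            sub = image[i:i + block, j:j + block]
            e = np.mean(sub ** 2)  # energy of the sub-image
            # Gaussian log-likelihood data term for each class k
            terms = [(e - mu) ** 2 / (2 * s2) + 0.5 * np.log(s2)
                     for mu, s2 in class_params]
            labels[i:i + block, j:j + block] = int(np.argmin(terms))
    return labels
```

On an image whose left half has low energy and whose right half has high energy, the two halves receive different seed labels, giving the level set evolution a data-driven starting point.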

Figure 8: Seeds initialization
\includegraphics[scale=0.8]{initauto.eps}

Synthetic image with four textures

Figure 9: Classification of a synthetic image composed of four textures ( animation)
Image to classify Data term
\includegraphics[scale=1]{im.PS} \includegraphics[scale=1]{attache696.PS}
Contours (696 iterations) Classification (696 iterations)
\includegraphics[scale=1]{res-cont696.PS} \includegraphics[scale=1]{res-niv696.PS}

In this example (see figure 9), one clearly sees that our model can handle triple junctions. Moreover, as in the classical approach with the Mumford-Shah functional, a junction of four textures yields two triple junctions (with 120-degree angles) in the classified image.

Synthetic image with two textures

Figure 10: Classification of a synthetic image composed of two textures ( animation)
Image to classify Data term
\includegraphics[scale=1]{textordue.PS} \includegraphics[scale=1]{attache645.PS}
Contours (645 iterations) Classification (645 iterations)
\includegraphics[scale=1]{res-cont645.PS} \includegraphics[scale=1]{res-niv645.PS}

This example (see figure 10) shows that our model can handle arbitrary geometrical shapes.

Synthetic image with six textures

Figure 11: Classification of a synthetic image with six textures ( animation)
Image to classify Data term
\includegraphics[scale=0.8]{texcomplique.PS} \includegraphics[scale=0.8]{attachetexcomp6g3.PS}
Contours (714 iterations) Classification (714 iterations)
\includegraphics[scale=0.8]{res-conttexcomp6.PS} \includegraphics[scale=0.8]{res-nivtexcomp6.PS}

This example (see figure 11) shows that our model can handle complex textured images. Here, some of the textures are visually very close to one another, and yet the geometrical shapes of the contours are detected quite well. To obtain more homogeneous classes, we applied a Gaussian mask to the data term here.
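The Gaussian masking of the data term can be sketched as a separable Gaussian smoothing of the per-pixel data term, so that neighbouring pixels tend to favour the same class. This is an assumption about what the mask does; the kernel size (three standard deviations) and the edge padding are illustrative choices, not taken from the paper.

```python
import numpy as np

def gaussian_smooth(data_term, sigma=1.0):
    """Smooth a per-pixel data term image with a normalized, separable
    Gaussian kernel (truncated at 3 sigma), using edge padding."""
    radius = int(3 * sigma)
    x = np.arange(-radius, radius + 1)
    k = np.exp(-x ** 2 / (2 * sigma ** 2))
    k /= k.sum()
    pad = np.pad(data_term, radius, mode='edge')
    # separable convolution: first along rows, then along columns
    rows = np.apply_along_axis(lambda r: np.convolve(r, k, mode='valid'), 1, pad)
    return np.apply_along_axis(lambda c: np.convolve(c, k, mode='valid'), 0, rows)
```

Smoothing each class's data term this way before the level set evolution reduces isolated misclassified pixels, at the cost of slightly blurring the class boundaries.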


Jean-Francois Aujol 2002-12-03