Publications of 2006
Result of the query in the list of publications :
8 Articles
1 - SAR Image Filtering Based on the Heavy-Tailed Rayleigh Model. A. Achim and E.E. Kuruoglu and J. Zerubia. IEEE Trans. on Image Processing, 15(9): pages 2686-2693, September 2006. Keywords : SAR Images.
@ARTICLE{jz_ieee_tr_ip_06,
  author  = {Achim, A. and Kuruoglu, E.E. and Zerubia, J.},
  title   = {SAR Image Filtering Based on the Heavy-Tailed Rayleigh Model},
  year    = {2006},
  month   = {September},
  journal = {IEEE Trans. on Image Processing},
  volume  = {15},
  number  = {9},
  pages   = {2686-2693},
  pdf     = {http://dx.doi.org/10.1109/TIP.2006.877362},
  keyword = {SAR Images}
}
Abstract :
Synthetic aperture radar (SAR) images are inherently affected by a signal-dependent noise known as speckle, which is due to the radar wave coherence. In this paper, we propose a novel adaptive despeckling filter and derive a maximum a posteriori (MAP) estimator for the radar cross section (RCS). We first employ a logarithmic transformation to change the multiplicative speckle into additive noise. We model the RCS using the recently introduced heavy-tailed Rayleigh density function, which was derived based on the assumption that the real and imaginary parts of the received complex signal are best described using the alpha-stable family of distributions. We estimate model parameters from noisy observations by means of second-kind statistics theory, which relies on the Mellin transform. Finally, we compare the proposed algorithm with several classical speckle filters applied on actual SAR images. Experimental results show that the homomorphic MAP filter based on the heavy-tailed Rayleigh prior for the RCS is among the best for speckle removal.
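To make the homomorphic scheme concrete, the Python sketch below illustrates the log-transform / estimate / exponentiate pipeline only; it is not the paper's MAP estimator (whose closed form follows from the heavy-tailed Rayleigh prior and its alpha-stable parameters), and the median-filter stand-in and the name despeckle_homomorphic are illustrative assumptions.

import numpy as np
from scipy.ndimage import median_filter

def despeckle_homomorphic(amplitude, size=5):
    # Log transform: multiplicative speckle becomes additive noise.
    log_img = np.log(np.maximum(amplitude, 1e-10))
    # Any additive-noise estimator can be plugged in here; the paper derives a
    # MAP estimator under a heavy-tailed Rayleigh prior, a median filter is
    # used below purely as a placeholder.
    log_est = median_filter(log_img, size=size)
    # Back to the amplitude domain.
    return np.exp(log_est)

# Toy usage: constant radar cross section corrupted by Rayleigh speckle.
rcs = np.ones((64, 64))
noisy = rcs * np.random.rayleigh(scale=1.0, size=rcs.shape)
estimate = despeckle_homomorphic(noisy)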
2 - Higher Order Active Contours. M. Rochery and I. H. Jermyn and J. Zerubia. International Journal of Computer Vision, 69(1): pages 27--42, August 2006. Keywords : Active contour, Shape, Higher-order, Prior, Road network.
@ARTICLE{mr_ijcv_06,
  author  = {Rochery, M. and Jermyn, I. H. and Zerubia, J.},
  title   = {Higher Order Active Contours},
  year    = {2006},
  month   = {August},
  journal = {International Journal of Computer Vision},
  volume  = {69},
  number  = {1},
  pages   = {27--42},
  url     = {http://dx.doi.org/10.1007/s11263-006-6851-y},
  pdf     = {ftp://ftp-sop.inria.fr/ariana/Articles/2006_mr_ijcv_06.pdf},
  keyword = {Active contour, Shape, Higher-order, Prior, Road network}
}
Abstract :
We introduce a new class of active contour models that hold great promise for region and shape modelling, and we apply a special case of these models to the extraction of road networks from satellite and aerial imagery. The new models are arbitrary polynomial functionals on the space of boundaries, and thus greatly generalize the linear functionals used in classical contour energies. While classical energies are expressed as single integrals over the contour, the new energies incorporate multiple integrals, and thus describe long-range interactions between different sets of contour points. As prior terms, they describe families of contours that share complex geometric properties, without making reference to any particular shape, and they require no pose estimation. As likelihood terms, they can describe multi-point interactions between the contour and the data. To optimize the energies, we use a level set approach. The forces derived from the new energies are non-local, however, thus necessitating an extension of standard level set methods. Networks are a shape family of great importance in a number of applications, including remote sensing imagery. To model them, we make a particular choice of prior quadratic energy that describes reticulated structures, and augment it with a likelihood term that couples the data at pairs of contour points to their joint geometry. Promising experimental results are shown on real images.
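The contrast between single and multiple integrals can be written schematically as follows; this is a generic quadratic form consistent with the abstract, not necessarily the exact functional of the paper, with gamma the contour, gamma' its tangent, L the length, A the enclosed area, Psi an interaction function, and lambda, alpha, beta weights:

E_{\text{classical}}(\gamma) = \int f\bigl(\gamma(t)\bigr)\,\lvert\gamma'(t)\rvert\,dt ,
\qquad
E_{\text{quadratic}}(\gamma) = \lambda\,L(\gamma) + \alpha\,A(\gamma)
  - \frac{\beta}{2}\iint \gamma'(t)\cdot\gamma'(t')\,
  \Psi\bigl(\lvert\gamma(t)-\gamma(t')\rvert\bigr)\,dt\,dt' .

The double integral couples the tangent vectors at every pair of contour points, which is what produces the long-range interactions mentioned above.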
3 - SAR amplitude probability density function estimation based on a generalized Gaussian model. G. Moser and J. Zerubia and S.B. Serpico. IEEE Trans. on Image Processing, 15(6): pages 1429-1442, June 2006. Keywords : SAR Images, Generalised Gaussians, Mellin transform. Copyright : IEEE
@ARTICLE{moser_ieeeip05,
  author  = {Moser, G. and Zerubia, J. and Serpico, S.B.},
  title   = {SAR amplitude probability density function estimation based on a generalized Gaussian model},
  year    = {2006},
  month   = {June},
  journal = {IEEE Trans. on Image Processing},
  volume  = {15},
  number  = {6},
  pages   = {1429-1442},
  url     = {http://ieeexplore.ieee.org/xpl/articleDetails.jsp?arnumber=1632197},
  pdf     = {http://hal.archives-ouvertes.fr/inria-00561372/en/},
  keyword = {SAR Images, Generalised Gaussians, Mellin transform}
}
Abstract :
In the context of remotely sensed data analysis, an important problem is the development of accurate models for the statistics of the pixel intensities. Focusing on synthetic aperture radar (SAR) data, this modeling process turns out to be a crucial task, for instance, for classification or for denoising purposes. In this paper, an innovative parametric estimation methodology for SAR amplitude data is proposed that adopts a generalized Gaussian (GG) model for the complex SAR backscattered signal. A closed-form expression for the corresponding amplitude probability density function (PDF) is derived and a specific parameter estimation algorithm is developed in order to deal with the proposed model. Specifically, the recently proposed “method-of-log-cumulants” (MoLC) is applied, which stems from the adoption of the Mellin transform (instead of the usual Fourier transform) in the computation of characteristic functions and from the corresponding generalization of the concepts of moment and cumulant. For the developed GG-based amplitude model, the resulting MoLC estimates turn out to be numerically feasible and are also analytically proved to be consistent. The proposed parametric approach was validated by using several real ERS-1, XSAR, E-SAR, and NASA/JPL airborne SAR images, and the experimental results prove that the method models the amplitude PDF better than several previously proposed parametric models for backscattering phenomena.
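The method of log-cumulants mentioned here rests on replacing the Fourier transform by the Mellin transform in the definition of the characteristic function. For a pdf p(u) on u > 0, the second-kind characteristic function and log-cumulants are (standard definitions, written here for reference):

\phi(s) = \int_0^{\infty} u^{\,s-1}\,p(u)\,du , \qquad
\psi(s) = \ln \phi(s) , \qquad
\tilde{\kappa}_r = \frac{d^r \psi}{ds^r}\Big|_{s=1} ,

so that, for samples u_1, \dots, u_N, the first two log-cumulants are estimated by
\hat{\tilde{\kappa}}_1 = \frac{1}{N}\sum_i \ln u_i and
\hat{\tilde{\kappa}}_2 = \frac{1}{N}\sum_i \bigl(\ln u_i - \hat{\tilde{\kappa}}_1\bigr)^2 ,
and parameter estimates follow by matching these to their analytical expressions for the chosen pdf.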
4 - Richardson-Lucy Algorithm with Total Variation Regularization for 3D Confocal Microscope Deconvolution. N. Dey and L. Blanc-Féraud and C. Zimmer and Z. Kam and P. Roux and J.C. Olivo-Marin and J. Zerubia. Microscopy Research Technique, 69: pages 260-266, April 2006. Keywords : Confocal microscopy, Variational methods, Total variation, Deconvolution.
@ARTICLE{dey_mrt_05,
  author  = {Dey, N. and Blanc-Féraud, L. and Zimmer, C. and Kam, Z. and Roux, P. and Olivo-Marin, J.C. and Zerubia, J.},
  title   = {Richardson-Lucy Algorithm with Total Variation Regularization for 3D Confocal Microscope Deconvolution},
  year    = {2006},
  month   = {April},
  journal = {Microscopy Research Technique},
  volume  = {69},
  pages   = {260-266},
  url     = {http://dx.doi.org/10.1002/jemt.20294},
  keyword = {Confocal microscopy, Variational methods, Total variation, Deconvolution}
}
Abstract :
Confocal laser scanning microscopy is a powerful and popular technique for 3D imaging of biological specimens. Although confocal microscopy images are much sharper than standard epifluorescence ones, they are still degraded by residual out-of-focus light and by Poisson noise due to photon-limited detection. Several deconvolution methods have been proposed to reduce these degradations, including the Richardson-Lucy iterative algorithm, which computes a maximum likelihood estimation adapted to Poisson statistics. As this algorithm tends to amplify noise, regularization constraints based on some prior knowledge on the data have to be applied to stabilize the solution. Here, we propose to combine the Richardson-Lucy algorithm with a regularization constraint based on Total Variation, which suppresses unstable oscillations while preserving object edges. We show on simulated and real images that this constraint improves the deconvolution results as compared to the unregularized Richardson-Lucy algorithm, both visually and quantitatively.
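As a rough 2D sketch of the kind of iteration involved (the paper works in 3D and derives the exact update; the multiplicative placement of the TV factor below is one common form and the parameter values lam, n_iter are arbitrary assumptions):

import numpy as np
from scipy.signal import fftconvolve

def richardson_lucy_tv(image, psf, n_iter=50, lam=0.002, eps=1e-8):
    psf_mirror = psf[::-1, ::-1]
    x = np.full(image.shape, image.mean(), dtype=float)
    for _ in range(n_iter):
        # Standard Richardson-Lucy multiplicative update for Poisson noise.
        blurred = fftconvolve(x, psf, mode='same')
        correction = fftconvolve(image / (blurred + eps), psf_mirror, mode='same')
        # Total-variation factor: divergence of the normalized gradient of x.
        gx, gy = np.gradient(x)
        norm = np.sqrt(gx**2 + gy**2) + eps
        div = np.gradient(gx / norm, axis=0) + np.gradient(gy / norm, axis=1)
        x = x * correction / (1.0 - lam * div)
    return x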
5 - A study of Gaussian mixture models of colour and texture features for image classification and segmentation. H. Permuter and J.M. Francos and I. H. Jermyn. Pattern Recognition, 39(4): pages 695--706, April 2006. Keywords : Classification, Segmentation, Texture, Colour, Gaussian mixture, Decision fusion.
@ARTICLE{permuter_pr06,
  author  = {Permuter, H. and Francos, J.M. and Jermyn, I. H.},
  title   = {A study of Gaussian mixture models of colour and texture features for image classification and segmentation},
  year    = {2006},
  month   = {April},
  journal = {Pattern Recognition},
  volume  = {39},
  number  = {4},
  pages   = {695--706},
  url     = {http://dx.doi.org/10.1016/j.patcog.2005.10.028},
  pdf     = {ftp://ftp-sop.inria.fr/ariana/Articles/2006_permuter_pr06.pdf},
  keyword = {Classification, Segmentation, Texture, Colour, Gaussian mixture, Decision fusion}
}
Abstract :
The aims of this paper are two-fold: to define Gaussian mixture models of coloured texture on several feature spaces, and to compare the performance of these models in various classification tasks, both with each other and with other models popular in the literature. We construct Gaussian mixture models over a variety of different colour and texture feature spaces, with a view to the retrieval of textured colour images from databases. We compare supervised classification results for different choices of colour and texture features using the Vistex database, and explore the best set of features and the best GMM configuration for this task. In addition we introduce several methods for combining the 'colour' and 'structure' information in order to improve the classification performance. We then apply the resulting models to the classification of texture databases and to the classification of man-made and natural areas in aerial images. We compare the GMM model with other models in the literature, and show an overall improvement in performance.
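A minimal sketch of the classification scheme described above, assuming feature vectors have already been extracted per pixel or patch; the feature spaces themselves, the choice of 8 components, and the helper names are placeholders, not the paper's exact configuration.

import numpy as np
from sklearn.mixture import GaussianMixture

def fit_class_gmms(features_by_class, n_components=8):
    # One Gaussian mixture per class, fitted on that class's feature vectors.
    return {label: GaussianMixture(n_components=n_components,
                                   covariance_type='full',
                                   random_state=0).fit(X)
            for label, X in features_by_class.items()}

def classify(gmms, X):
    # Assign each feature vector to the class whose mixture gives it the
    # highest log-likelihood.
    labels = list(gmms)
    log_lik = np.stack([gmms[c].score_samples(X) for c in labels], axis=1)
    return [labels[i] for i in np.argmax(log_lik, axis=1)]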
6 - Dictionary-Based Stochastic Expectation-Maximization for SAR Amplitude Probability Density Function Estimation. G. Moser and J. Zerubia and S.B. Serpico. IEEE Trans. Geoscience and Remote Sensing, 44(1): pages 188-200, January 2006. Keywords : SAR Images, Stochastic EM (SEM), Dictionary. Copyright : IEEE
@ARTICLE{moser_ieeetgrs_05,
  author  = {Moser, G. and Zerubia, J. and Serpico, S.B.},
  title   = {Dictionary-Based Stochastic Expectation-Maximization for SAR Amplitude Probability Density Function Estimation},
  year    = {2006},
  month   = {January},
  journal = {IEEE Trans. Geoscience and Remote Sensing},
  volume  = {44},
  number  = {1},
  pages   = {188-200},
  url     = {http://dx.doi.org/10.1109/TGRS.2005.859349},
  pdf     = {http://hal.archives-ouvertes.fr/inria-00561369/en/},
  keyword = {SAR Images, Stochastic EM (SEM), Dictionary}
}
Abstract :
In remotely sensed data analysis, a crucial problem is represented by the need to develop accurate models for the statistics of the pixel intensities. This paper deals with the problem of probability density function (pdf) estimation in the context of synthetic aperture radar (SAR) amplitude data analysis. Several theoretical and heuristic models for the pdfs of SAR data have been proposed in the literature, which have been proved to be effective for different land-cover typologies, thus making the choice of a single optimal parametric pdf a hard task, especially when dealing with heterogeneous SAR data. In this paper, an innovative estimation algorithm is described, which faces such a problem by adopting a finite mixture model for the amplitude pdf, with mixture components belonging to a given dictionary of SAR-specific pdfs. The proposed method automatically integrates the procedures of selection of the optimal model for each component, of parameter estimation, and of optimization of the number of components by combining the stochastic expectation–maximization iterative methodology with the recently developed “method-of-log-cumulants” for parametric pdf estimation in the case of nonnegative random variables. Experimental results on several real SAR images are reported, showing that the proposed method accurately models the statistics of SAR amplitude data.
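A compressed sketch of the dictionary-based SEM loop, for intuition only: the dictionary below contains generic positive-valued families rather than the paper's SAR-specific pdfs, maximum-likelihood fits replace the method of log-cumulants, and the function name sem_dictionary_mixture is an assumption.

import numpy as np
from scipy import stats

# Stand-in dictionary of amplitude pdf families.
DICTIONARY = [stats.rayleigh, stats.lognorm, stats.weibull_min]

def sem_dictionary_mixture(x, K=3, n_iter=30, seed=0):
    rng = np.random.default_rng(seed)
    # Initial hard labels: split the samples into K amplitude quantile bins.
    z = np.digitize(x, np.quantile(x, np.linspace(0, 1, K + 1)[1:-1]))
    for _ in range(n_iter):
        # M-step: refit each component with every family in the dictionary and
        # keep the best-fitting one; (nearly) empty components are dropped.
        comps, weights = [], []
        for k in np.unique(z):
            xk = x[z == k]
            if xk.size < 10:
                continue
            fits = [(fam, fam.fit(xk)) for fam in DICTIONARY]
            fam, par = max(fits, key=lambda fp:
                           np.sum(np.log(fp[0].pdf(xk, *fp[1]) + 1e-300)))
            comps.append((fam, par))
            weights.append(xk.size / x.size)
        weights = np.array(weights) / np.sum(weights)
        # E-step: posterior probability of each component for each sample.
        dens = np.stack([w * fam.pdf(x, *par)
                         for w, (fam, par) in zip(weights, comps)], axis=1)
        post = dens / (dens.sum(axis=1, keepdims=True) + 1e-300)
        # S-step: draw one hard label per sample from its posterior.
        cum = np.cumsum(post, axis=1)
        z = np.minimum((rng.random((x.size, 1)) > cum).sum(axis=1),
                       len(comps) - 1)
    return weights, comps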
7 - An approximation of the Mumford-Shah energy by a family of discrete edge-preserving functionals. G. Aubert and L. Blanc-Féraud and R. March. Nonlinear Analysis, 64: pages 1908-1930, 2006. Keywords : Gamma Convergence, Finite Element, Segmentation.
@ARTICLE{laure-na05,
  author  = {Aubert, G. and Blanc-Féraud, L. and March, R.},
  title   = {An approximation of the Mumford-Shah energy by a family of discrete edge-preserving functionals},
  year    = {2006},
  journal = {Nonlinear Analysis},
  volume  = {64},
  pages   = {1908-1930},
  pdf     = {ftp://ftp-sop.inria.fr/ariana/Articles/2006_laure-na05.pdf},
  keyword = {Gamma Convergence, Finite Element, Segmentation}
}
Abstract :
We show the Gamma-convergence of a family of discrete functionals to the Mumford and Shah image segmentation functional. The functionals of the family are constructed by modifying the elliptic approximating functionals proposed by Ambrosio and Tortorelli. The quadratic term of the energy related to the edges of the segmentation is replaced by a nonconvex functional.
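For reference, the two functionals named in the abstract can be written, up to the choice of constants, as the Mumford-Shah segmentation energy and its elliptic Ambrosio-Tortorelli approximation; the quadratic edge term (the integral involving v) is the part the paper replaces by a nonconvex functional:

E_{MS}(u, K) = \int_{\Omega\setminus K} \lvert\nabla u\rvert^2\,dx
  + \alpha\,\mathcal{H}^1(K) + \beta\int_{\Omega}(u-g)^2\,dx ,

E_\varepsilon(u, v) = \int_{\Omega} v^2\lvert\nabla u\rvert^2\,dx
  + \alpha\int_{\Omega}\Bigl(\varepsilon\lvert\nabla v\rvert^2 + \frac{(1-v)^2}{4\varepsilon}\Bigr)dx
  + \beta\int_{\Omega}(u-g)^2\,dx ,

where g is the observed image, K the discontinuity set, and E_\varepsilon Gamma-converges to E_{MS} as \varepsilon \to 0.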
8 - Automatic building 3D reconstruction from DEMs. F. Lafarge and X. Descombes and J. Zerubia and M. Pierrot-Deseilligny. Revue Française de Photogrammétrie et de Télédétection (SFPT), 184: pages 48--53, 2006. Keywords : 3D-reconstruction, Digital Elevation Model, Building extraction, dense urban areas.
@ARTICLE{lafarge_sfpt06,
  author  = {Lafarge, F. and Descombes, X. and Zerubia, J. and Pierrot-Deseilligny, M.},
  title   = {Automatic building 3D reconstruction from DEMs},
  year    = {2006},
  journal = {Revue Française de Photogrammétrie et de Télédétection (SFPT)},
  volume  = {184},
  pages   = {48--53},
  url     = {http://isprs.free.fr/documents/Papers/T07-32.pdf},
  keyword = {3D-reconstruction, Digital Elevation Model, Building extraction, dense urban areas}
}
Abstract :
This paper presents an example of a PLEIADES application: 3D building reconstruction. The future PLEIADES satellites are especially well adapted to 3D building reconstruction thanks to the sub-metric resolution of the images and their stereoscopic capabilities. We propose a fully automatic parametric approach for producing 3D city models of dense urban areas. First, a Digital Elevation Model (DEM) is generated from three views using an algorithm based on a maximum-flow formulation. Then, building footprints are extracted from the DEM by an automatic method based on marked point processes: they are represented by an association of rectangles, which we regularize by improving the connection of neighbouring rectangles and the detection of facade discontinuities. Finally, a 3D reconstruction method based on a skeleton process, which allows the rooftops to be modelled, is applied to the DEM and the building footprints. The various building heights are parameters which are estimated and then regularized by the "K-means" algorithm including an entropy term.
PhD Thesis and Habilitation
1 - Etude du couvert forestier par processus ponctuels marqués. G. Perrin. PhD Thesis, Ecole Centrale Paris, October 2006. Keywords : Tree Crown Extraction, Marked point process, Stochastic geometry, Object extraction, RJMCMC.
@PHDTHESIS{perrin_phd06,
  author  = {Perrin, G.},
  title   = {Etude du couvert forestier par processus ponctuels marqués},
  year    = {2006},
  month   = {October},
  school  = {Ecole Centrale Paris},
  url     = {http://www-sop.inria.fr/ariana/personnel/Guillaume.Perrin/resume.php},
  pdf     = {http://www-sop.inria.fr/ariana/personnel/Guillaume.Perrin/DOWNLOADS/these_perrin_2006.pdf},
  keyword = {Tree Crown Extraction, Marked point process, Stochastic geometry, Object extraction, RJMCMC}
}
Résumé :
This thesis addresses the problem of extracting trees from Colour InfraRed (CIR) aerial images of forests. Our models rely on object processes, also known as marked point processes. These are random variables whose realizations are configurations of geometrical objects. Once the reference geometrical object has been chosen, we define the energy of the process through a prior term, which models the constraints on the objects and their interactions, and an image term. We sample the object process with a Reversible Jump Markov Chain Monte Carlo (RJMCMC) algorithm, optimized by simulated annealing, in order to extract the best configuration of objects, which gives the desired extraction.
In this manuscript, we propose different tree crown extraction models, which extract information at the scale of the individual tree according to the density of the stand. In dense stands, we present an ellipse process, and in areas of lower density, an ellipsoid process. We thereby obtain the number of trees, their location, the crown diameter and, for non-dense areas, their height. The automatic algorithms resulting from this modelling are tested on very high resolution CIR images provided by the French National Forest Inventory (IFN).
Abstract :
This thesis addresses the problem of tree crown extraction from Colour InfraRed (CIR) aerial images of forests. Our models are based on object processes, otherwise known as marked point processes. These mathematical objects are random variables whose realizations are configurations of geometrical shapes. This approach yields an energy minimization problem, where the energy is composed of a regularization term (prior density), which introduces some constraints on the objects and their interactions, and a data term, which links the objects to the features to be extracted. Once the reference object has been chosen, we sample the process and extract the best configuration of objects with respect to the energy, using a Reversible Jump Markov Chain Monte Carlo (RJMCMC) algorithm embedded in a Simulated Annealing scheme.
We propose different models for tree crown extraction depending on the density of the stand. In dense areas, we use an ellipse process, while in sparse vegetation an ellipsoid process is used. As a result we obtain the number of stems, their position, the diameters of the crowns and the heights of the trees for sparse areas. The resulting algorithms are tested on high resolution CIR aerial images provided by the French National Forest Inventory (IFN).
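The sampling scheme described in the abstract can be sketched as follows; everything below (the toy energy, the circle marks, the Poisson reference mean lam, the cooling schedule) is a simplified illustration of a birth/death RJMCMC sampler embedded in simulated annealing, not the sampler actually used in the thesis.

import numpy as np

rng = np.random.default_rng(0)

def energy(config, image):
    # Toy energy: a data term rewarding objects centred on bright pixels and
    # a prior term penalizing pairs of overlapping circles.
    data = -sum(image[int(y), int(x)] for x, y, r in config)
    overlap = sum(1.0 for i, a in enumerate(config) for b in config[i + 1:]
                  if (a[0] - b[0])**2 + (a[1] - b[1])**2 < (a[2] + b[2])**2)
    return data + 10.0 * overlap

def rjmcmc_annealing(image, n_iter=20000, T0=1.0, cooling=0.9995, lam=50.0):
    # Target: exp(-U(config)/T) w.r.t. a Poisson reference with mean lam
    # objects uniformly distributed over the image; objects are circles (x, y, r).
    h, w = image.shape
    config, T = [], T0
    for _ in range(n_iter):
        T *= cooling                               # geometric annealing schedule
        if rng.random() < 0.5:                     # birth move
            new = config + [(rng.uniform(0, w), rng.uniform(0, h), rng.uniform(2, 10))]
            log_ratio = (energy(config, image) - energy(new, image)) / T \
                        + np.log(lam) - np.log(len(new))
        elif config:                               # death move
            i = rng.integers(len(config))
            new = config[:i] + config[i + 1:]
            log_ratio = (energy(config, image) - energy(new, image)) / T \
                        + np.log(len(config)) - np.log(lam)
        else:                                      # death proposed on an empty configuration
            continue
        if np.log(rng.random() + 1e-300) < log_ratio:
            config = new
    return config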
15 Conference articles
1 - An improved 'gas of circles' higher-order active contour model and its application to tree crown extraction. P. Horvath and I. H. Jermyn and Z. Kato and J. Zerubia. In Proc. Indian Conference on Computer Vision, Graphics, and Image Processing (ICVGIP), Madurai, India, December 2006. Keywords : Tree Crown Extraction, Aerial images, Higher-order, Active contour, Gas of circles, Shape.
@INPROCEEDINGS{Horvath06_icvgip,
  author    = {Horvath, P. and Jermyn, I. H. and Kato, Z. and Zerubia, J.},
  title     = {An improved 'gas of circles' higher-order active contour model and its application to tree crown extraction},
  year      = {2006},
  month     = {December},
  booktitle = {Proc. Indian Conference on Computer Vision, Graphics, and Image Processing (ICVGIP)},
  address   = {Madurai, India},
  url       = {http://dx.doi.org/10.1007/11949619_14},
  pdf       = {ftp://ftp-sop.inria.fr/ariana/Articles/2006_Horvath06_icvgip.pdf},
  keyword   = {Tree Crown Extraction, Aerial images, Higher-order, Active contour, Gas of circles, Shape}
}
Abstract :
A central task in image processing is to find the region in the image corresponding to an entity. In a number of problems, the region takes the form of a collection of circles, e.g. tree crowns in remote sensing imagery, or cells in biological and medical imagery. In [Horvath06b], a model of such regions, the 'gas of circles' model, was developed based on higher-order active contours, a recently developed framework for the inclusion of prior knowledge in active contour energies. However, the model suffers from a defect. In [Horvath06b], the model parameters were adjusted so that the circles were local energy minima. Gradient descent can become stuck in these minima, producing phantom circles even with no supporting data. We solve this problem by calculating, via a Taylor expansion of the energy, parameter values that make circles into energy inflection points rather than minima. As a bonus, the constraint halves the number of model parameters, and severely constrains one of the two that remain, a major advantage for an energy-based model. We use the model for tree crown extraction from aerial images. Experiments show that despite the lack of parametric freedom, the new model performs better than the old, and much better than a classical active contour.
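Schematically, if E(r) denotes the energy of a circle of radius r under the model, the expansion alluded to above takes the form (written here only to illustrate the role of the two conditions; the actual expansion in the paper is over radial perturbation functions):

E(r_0 + \delta r) \;\approx\; E(r_0) + e_1(r_0)\,\delta r + \tfrac{1}{2}\,e_2(r_0)\,\delta r^2 + \dots

The original model imposes e_1(r_0) = 0 with e_2(r_0) > 0, making circles of radius r_0 local minima; the improved model instead imposes e_1(r_0) = 0 and e_2(r_0) = 0, making them inflection points, and it is this pair of conditions that, as stated in the abstract, halves the number of free parameters.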