


Algorithm

We have defined a probabilistic model for urban areas. The estimate we use is the configuration of rectangles that maximises the previously built density.

The algorithm is a point process sampler coupled with simulated annealing.


Point process sampler

We use an MCMC method. The sampler we built is a Metropolis-Hastings-Green sampler. The basic idea underlying this sampler was proposed and justified by Geyer and Møller (see [8]).

We build a Markov chain that converges ergodically to the distribution we have defined through its density. The Metropolis-Hastings algorithm uses a proposition kernel and an acceptance rate.

Proposition kernel

We first need a proposition kernel. It corresponds to stochastic perturbations that can be applied to a given configuration of rectangles. The perturbations we have implemented can be divided into two classes:
  • Birth (B) and death (D) of a rectangle. These perturbations either add a rectangle to the current configuration x, or kill one of the rectangles belonging to x.
    These two basic kernels are sufficient to explore the space of all configurations. However, using only these two kernels produces a highly correlated Markov chain. As shown in [2,12], while working with MCMC one has to be careful with the proposition kernel involved in the sampler. [20] provides a good overview of other techniques that can be used to improve Markov chain behavior. So, in order to improve the results, we use the work of Green in [10]. This work has been analyzed for point processes by Geyer in [7] and leads us to add the following transformations.

  • Other perturbations. We have implemented a translation (T), a rotation (R), and two dilations (Dl, DL), as shown in figure k).
The proposition kernel is a mixture of these six perturbation methods. We denote by Q the proposition kernel. For a given state x, it can be written as:

    Q(x, .) = sum_m p_m q_m(x, .)

where the q_m are the six perturbation kernels and p_m is the probability of selecting kernel q_m.

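To make the mixture concrete, here is a minimal Python sketch of such a family of perturbation kernels acting on a configuration of rectangles. The rectangle parameterisation (centre, orientation, two half-lengths), the perturbation magnitudes and the equal mixture weights are assumptions made only for this illustration, not the settings of the actual model.

    import math
    import random

    # Illustrative rectangle: (cx, cy, theta, half_w, half_h) in a unit image
    # (this parameterisation is an assumption made for the example).
    def birth(config):
        """Birth (B): add a uniformly drawn rectangle to the configuration."""
        new = (random.random(), random.random(), random.uniform(0, math.pi),
               random.uniform(0.01, 0.1), random.uniform(0.01, 0.1))
        return config + [new]

    def death(config):
        """Death (D): remove a uniformly chosen rectangle, if any."""
        if not config:
            return config
        i = random.randrange(len(config))
        return config[:i] + config[i + 1:]

    def _modify_one(config, change):
        """Apply an elementary change to one uniformly chosen rectangle."""
        if not config:
            return config
        i = random.randrange(len(config))
        out = list(config)
        out[i] = change(out[i])
        return out

    def translation(config):  # (T): jitter the centre
        return _modify_one(config, lambda r: (r[0] + random.gauss(0, 0.01),
                                              r[1] + random.gauss(0, 0.01),
                                              r[2], r[3], r[4]))

    def rotation(config):     # (R): jitter the orientation
        return _modify_one(config, lambda r: (r[0], r[1],
                                              r[2] + random.gauss(0, 0.1),
                                              r[3], r[4]))

    def dilation_w(config):   # (Dl): jitter the first half-length
        return _modify_one(config, lambda r: (r[0], r[1], r[2],
                                              abs(r[3] + random.gauss(0, 0.01)),
                                              r[4]))

    def dilation_h(config):   # (DL): jitter the second half-length
        return _modify_one(config, lambda r: (r[0], r[1], r[2], r[3],
                                              abs(r[4] + random.gauss(0, 0.01))))

    # Mixture Q: each kernel q_m is selected with probability p_m (equal here).
    KERNELS = [birth, death, translation, rotation, dilation_w, dilation_h]
    WEIGHTS = [1.0 / 6] * 6

    def propose(config):
        """Draw one perturbation kernel from the mixture and apply it."""
        q_m = random.choices(KERNELS, weights=WEIGHTS, k=1)[0]
        return q_m(config)
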
Green ratio

If we apply one of the kernels q_m to a configuration x, we obtain a new configuration y. We then have to compute a ratio R_m(x, y), depending on x and y:

    R_m(x, y) = [ p(y) q_m(y, x) ] / [ p(x) q_m(x, y) ]

This ratio is computed analytically and ensures the convergence of the Markov chain to the desired distribution.
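
As a purely illustrative sketch (the function name and its arguments are assumptions, not the notation used here), this ratio can be coded directly from the densities of the forward and reverse moves:

    def green_ratio(p, x, y, q_forward, q_reverse):
        """Metropolis-Hastings-Green ratio for a move x -> y proposed by q_m.

        p          -- unnormalised target density p(.)
        q_forward  -- proposal density of the move x -> y under q_m
        q_reverse  -- proposal density of the reverse move y -> x
        """
        return (p(y) * q_reverse) / (p(x) * q_forward)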

Algorithm

The algorithm can be described as follows. We suppose we want to sample a point process according to the density p(.).

Given a configuration of points X_t = x :

  1. we randomly select a proposition kernel q_m according to the mixture distribution.

  2. we generate y according to the selected kernel :

     y ~ q_m(x, .)

  3. we compute the associated Green ratio :

     R = R_m(x, y)

  4. we compute the acceptance rate alpha given by :

     alpha = min(1, R)

    and,
    • with probability alpha the proposition is accepted :

      X_{t+1} = y

    • otherwise the proposition is rejected :

      X_{t+1} = x

Under some assumptions on the density, the Markov chain built in this way converges ergodically to the desired distribution.
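
Putting these steps together, one iteration of the sampler could be sketched in Python as follows. The interface of propose, which returns the candidate y together with the ratio q_m(y, x) / q_m(x, y), is an assumption made so the example stays self-contained; it is not the exact implementation used here.

    import random

    def mhg_step(x, p, propose):
        """One Metropolis-Hastings-Green iteration.

        x        -- current configuration of points
        p        -- unnormalised target density p(.)
        propose  -- draws a kernel q_m from the mixture and returns
                    (y, q_m(y, x) / q_m(x, y))  [illustrative convention]
        """
        # steps 1 and 2: select a kernel and generate a candidate y
        y, proposal_ratio = propose(x)
        # step 3: Green ratio R = p(y) q_m(y, x) / [ p(x) q_m(x, y) ]
        ratio = p(y) / p(x) * proposal_ratio
        # step 4: acceptance rate alpha = min(1, R)
        alpha = min(1.0, ratio)
        # accept with probability alpha, otherwise keep the current state
        return y if random.random() < alpha else x

    def sample(x0, p, propose, n_iter=10000):
        """Run the chain for n_iter iterations (example value)."""
        x = x0
        for _ in range(n_iter):
            x = mhg_step(x, p, propose)
        return x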



Figure k). Perturbation kernels T, R, DL and Dl.








Simulated annealing

Once a sampler has been defined, it is possible to optimize the density using simulated annealing, as described by Winkler in [26]. A good overview of simulated annealing techniques can be found in [23]. We replace the density p(.) by a time-dependent density. The energy U(.) associated with the density is replaced by:

    U(.) / T_t

where T_t is a temperature that depends on time.

In theory, if we use the above sampler and simultaneously decrease the temperature from a starting value to 0 according to a logarithmic law, the chain converges in total variation to a measure whose mass is uniformly distributed on the global minima of U(.).

In practice, we use faster decreasing schemes.
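
For illustration only, the sketch below wraps such a sampler in a geometric cooling schedule, one common example of a faster decreasing scheme; the energy U(.), the proposal interface and all numerical values are assumptions made for the example, not the settings used here.

    import math
    import random

    def simulated_annealing(x0, energy, propose,
                            t_start=10.0, t_factor=0.9999, n_iter=200000):
        """Minimise U(.) by sampling the tempered density exp(-U(.) / T_t)
        while the temperature T_t decreases geometrically (example values).

        propose(x) returns (y, q_m(y, x) / q_m(x, y)), as in the sampler sketch.
        """
        x, u_x, temperature = x0, energy(x0), t_start
        for _ in range(n_iter):
            y, proposal_ratio = propose(x)
            u_y = energy(y)
            # Green ratio for the tempered density exp(-U(.) / T_t)
            ratio = math.exp((u_x - u_y) / temperature) * proposal_ratio
            if random.random() < min(1.0, ratio):
                x, u_x = y, u_y
            temperature *= t_factor  # geometric cooling, faster than logarithmic
        return x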



