
Timo Koski: Likelihood-free inference using Jensen-Shannon divergence

Abstract: This talk gives an analytical study of the technique of likelihood-free inference (LFI) for simulator-based modeling developed by Michael Gutmann and Jukka Corander. Simulator-based models are also known as implicit statistical models, since the likelihood function cannot be written down explicitly due to the complexity of the simulator model. A survey of simulator-based modeling can be found in

J. Lintusaari, M. U. Gutmann, R. Dutta, S. Kaski and J. Corander: Fundamentals and recent developments in approximate Bayesian computation. Systematic Biology, 66, 2017, e66--e82.


We restrict attention here to simulator models that output categorical data with a known finite number of categories. The observed data are summarized by their empirical distribution, which is compared with the empirical distribution of simulated data, i.e., the output of the simulator model given a parameter value of the model and a random variable as input. Several synthetic empirical distributions are simulated under different parameter values of the simulator model and compared with the empirical distribution of the observed data.
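As a minimal sketch of this setup in Python (NumPy assumed), with a hypothetical one-parameter, three-category simulator standing in for the kind of implicit model discussed; the parameterization is purely illustrative:

```python
import numpy as np

def simulator(theta, n, rng):
    """Hypothetical categorical simulator: n draws from 3 categories
    whose probabilities depend on the parameter theta."""
    probs = [theta, (1 - theta) / 2, (1 - theta) / 2]
    return rng.choice(3, size=n, p=probs)

def empirical_distribution(samples, n_categories):
    """Summarize categorical data by the relative frequency of each category."""
    counts = np.bincount(samples, minlength=n_categories)
    return counts / counts.sum()

rng = np.random.default_rng(0)
observed = simulator(0.5, n=1000, rng=rng)      # stand-in for observed data
p_obs = empirical_distribution(observed, 3)

# synthetic empirical distributions under a few candidate parameter values
for theta in (0.3, 0.5, 0.7):
    q_sim = empirical_distribution(simulator(theta, n=1000, rng=rng), 3)
    print(theta, q_sim)
```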

The degree of disagreement between the observed empirical distribution and a synthetic empirical distribution is here measured by a numerical quantity called the Jensen-Shannon divergence (JSD). The minimum-JSD estimate of the model parameters can be computed with BOLFI, software for Bayesian optimization for LFI, using the simulated summary statistics.
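To make the estimation step concrete, a minimal sketch assuming NumPy and SciPy are available; the plain grid search below is only a stand-in for the Bayesian optimization performed by BOLFI, and the toy simulator is hypothetical:

```python
import numpy as np
from scipy.spatial.distance import jensenshannon

def jsd(p, q):
    """Jensen-Shannon divergence between two discrete distributions.
    SciPy's jensenshannon returns the JS *distance* (square root of the
    divergence), so it is squared here; base=2 keeps JSD in [0, 1]."""
    return jensenshannon(p, q, base=2) ** 2

def min_jsd_estimate(p_obs, simulate_empirical, theta_grid):
    """Pick the parameter whose synthetic empirical distribution is closest
    to the observed one in JSD (grid search; BOLFI would instead propose
    candidate parameters via Bayesian optimization)."""
    divergences = [jsd(p_obs, simulate_empirical(theta)) for theta in theta_grid]
    return theta_grid[int(np.argmin(divergences))]

# toy usage with a hypothetical one-parameter categorical simulator
rng = np.random.default_rng(0)

def simulate_empirical(theta, n=1000):
    samples = rng.choice(3, size=n, p=[theta, (1 - theta) / 2, (1 - theta) / 2])
    return np.bincount(samples, minlength=3) / n

p_obs = simulate_empirical(0.5)                 # stand-in for observed data
theta_grid = np.linspace(0.05, 0.95, 19)
print(min_jsd_estimate(p_obs, simulate_empirical, theta_grid))
```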


JSD is defined as a blend of two Kullback-Leibler divergences (KLD). We develop a series of inequalities between KLD and total variation distance (TVD) based on the fact that JSD, KLD and TVD are all examples of f-divergences. These inequalities are applied to prove that the minimum JSD-estimate of the model parameters is in fact asymptotically equivalent to the maximum likelihood estimate w.r.t. the implicit model, assuming that the simulator model is rich enough to express the true underlying distribution of the data. Simulation results will be presented to demonstrate these asymptotics. Similar techniques can be used to prove that the minimum JSD-estimate of the model parameters exists and is a measurable random variable.
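For reference, the standard definition of the JSD as an equal-weight blend of two KL divergences (this is the general textbook formulation, not a result specific to the talk):

```latex
% Jensen-Shannon divergence as a blend of two KL divergences,
% where M is the midpoint (equal-weight mixture) distribution:
\[
  \mathrm{JSD}(P \,\|\, Q)
    = \tfrac{1}{2}\, D_{\mathrm{KL}}\!\left(P \,\middle\|\, M\right)
    + \tfrac{1}{2}\, D_{\mathrm{KL}}\!\left(Q \,\middle\|\, M\right),
  \qquad M = \tfrac{1}{2}(P + Q).
\]
```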

If time permits, the talk will discuss various significant properties and interpretations of the JSD underlying this technique of LFI. This is joint work with Jukka Corander, University of Oslo, University of Helsinki and HIIT, and Ulpu Remes, University of Oslo.

Speaker: Timo Koski

Affiliation: KTH Royal Institute of Technology, University of Helsinki and HIIT

Place of Seminar:  Zoom