KissABC 3.0
Index: KissABC.ApproxKernelizedPosterior, KissABC.ApproxPosterior, KissABC.CommonLogDensity, KissABC.Factored, Base.length, Base.rand, Distributions.logpdf, Distributions.pdf, KissABC.cdf_g_inv, KissABC.sample_g, KissABC.smc, StatsBase.sample
KissABC.ApproxKernelizedPosterior — Type
ApproxKernelizedPosterior(
prior::Distribution,
cost::Function,
target_average_cost::Real
)
This function returns a type which can be used in the sample function as an ABC density; it works by assuming Gaussian-distributed errors 𝒩(0, ϵ), where ϵ is specified in the variable target_average_cost.
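For example, a kernelized ABC plan could be constructed along these lines (a minimal sketch; the prior, the toy cost function and the tolerance value below are illustrative, not part of the API):
using KissABC
prior = Normal(0, 5)                                 # illustrative prior for a single parameter
cost(x) = abs(x + 0.05 * randn() - 1.0)              # toy distance between a noisy simulation and the target value 1
plan = ApproxKernelizedPosterior(prior, cost, 0.01)  # assume Gaussian errors with target average cost ϵ = 0.01
# `plan` can then be passed to `sample`, e.g. sample(plan, AIS(50), 1000)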
KissABC.ApproxPosterior — Type
ApproxPosterior(
prior::Distribution,
cost::Function,
max_cost::Real
)
This function returns a type which can be used in the sample function as an ABC density; it works by assuming uniformly distributed errors in [-ϵ, ϵ], where ϵ is specified in the variable max_cost.
KissABC.CommonLogDensity — Type
CommonLogDensity(nparameters, sample_init, lπ)
This function returns a type for performing classical MCMC via the sample function.
nparameters: total number of parameters per sample.
sample_init: function which accepts an RNG::AbstractRNG and returns a sample for lπ.
lπ: function which accepts a sample, and returns a log-density float value.
KissABC.Factored — Type
Factored{N} <: Distribution{Multivariate, MixedSupport}
A Distribution type that can be used to combine multiple UnivariateDistribution's and sample from them. Example: it can be used as prior = Factored(Normal(0,1), Uniform(-1,1)).
Base.length — Method
length(p::Factored)
Returns the number of distributions contained in p.
Base.rand — Method
rand(rng::AbstractRNG, factoreddist::Factored)
Function to sample one element from a Factored object.
Distributions.logpdf — Method
logpdf(d::Factored, x)
Function to evaluate the logpdf of a Factored distribution object.
Distributions.pdf — Method
pdf(d::Factored, x)
Function to evaluate the pdf of a Factored distribution object.
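A short sketch of how the Factored methods above fit together (the drawn values in the comments are illustrative):
using KissABC
using Random
p = Factored(Normal(0, 1), Uniform(-1, 1))   # joint prior over two independent parameters
length(p)                                    # -> 2, the number of component distributions
x = rand(Random.GLOBAL_RNG, p)               # one joint sample, e.g. (0.3, -0.7)
pdf(p, x)                                    # joint pdf: product of the component pdfs
logpdf(p, x)                                 # joint log-pdf: sum of the component log-pdfs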
KissABC.cdf_g_inv — Method
Inverse CDF of the g-pdf, see eq. 10 of Foreman-Mackey et al. 2013.
KissABC.sample_g — Method
Sample from g using inverse transform sampling. a = 2.0 is recommended.
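The two helpers above refer to the stretch-move proposal density g(z) ∝ 1/√z on [1/a, a]; their exact signatures are internal to KissABC, but the inverse-transform step can be sketched standalone as follows (hypothetical helper names, shown only to illustrate eq. 10 of Foreman-Mackey et al. 2013):
using Random
# Inverse CDF of g(z) ∝ 1/sqrt(z) on [1/a, a]:  z = (1 + u*(a - 1))^2 / a
my_cdf_g_inv(u, a) = (1 + u * (a - 1))^2 / a
# Draw z by inverse transform sampling; a = 2.0 is the recommended value
my_sample_g(rng, a = 2.0) = my_cdf_g_inv(rand(rng), a)
z = my_sample_g(Random.GLOBAL_RNG)   # a stretch factor in [1/a, a] = [0.5, 2.0]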
KissABC.smc — Method
Adaptive SMC algorithm from P. Del Moral (2012), with an affine-invariant proposal mechanism; faster than AIS for ABC targets.
function smc(
prior::Distribution,
cost::Function;
rng::AbstractRNG = Random.GLOBAL_RNG,
nparticles::Int = 100,
M::Int = 1,
alpha = 0.95,
mcmc_retrys::Int = 0,
mcmc_tol = 0.015,
epstol = 0.0,
r_epstol = (1 - alpha)^1.5 / 50,
min_r_ess = alpha^2,
max_stretch = 2.0,
verbose::Bool = false,
parallel::Bool = false,
)
prior: a Distribution object representing the parameters' prior.
cost: a function that, given a prior sample, returns the cost for said sample (e.g. a distance between simulated data and target data).
rng: an AbstractRNG object which is used by SMC for inference (it can be useful to make an inference reproducible).
nparticles: number of total particles to use for inference.
M: number of cost evaluations per particle; increasing this can reduce the chance of rejecting good particles.
alpha: used for adaptive tolerance, by solving ESS(n, ϵ(n)) = α ESS(n-1, ϵ(n-1)) for ϵ(n) at step n.
mcmc_retrys: if set > 0, whenever the fraction of accepted particles drops below the tolerance mcmc_tol, the MCMC step is repeated (no more than mcmc_retrys times).
mcmc_tol: stopping condition for SMC; if the fraction of accepted particles drops below mcmc_tol, the algorithm terminates.
epstol: stopping condition for SMC; if the adaptive cost threshold drops below epstol, the algorithm has converged and thus it terminates.
min_r_ess: whenever the fractional effective sample size drops below min_r_ess, a systematic resampling step is performed.
max_stretch: the proposal distribution of smc is the stretch move of Foreman-Mackey et al. 2013; the larger this parameter, the wider the proposal distribution becomes.
verbose: if set to true, enables verbosity.
parallel: if set to true, threaded parallelism is enabled; keep in mind that in this case the cost function must be thread-safe.
Example
using KissABC
prior=Factored(Normal(0,5), Normal(0,5))
cost((x,y)) = 50*(x+randn()*0.01-y^2)^2+(y-1+randn()*0.01)^2
results = smc(prior, cost, alpha=0.5, nparticles=5000).P
output:
2-element Array{Particles{Float64,5000},1}:
1.0 ± 0.029
0.999 ± 0.012
StatsBase.sample — Function
sample(model, AIS(N), Ns[; optional args])
sample(model, AIS(N), MCMCThreads(), Ns, Nc[; optional keyword args])
sample(model, AIS(N), MCMCDistributed(), Ns, Nc[; optional keyword args])
Generalities
This function runs an Affine Invariant MCMC sampler and returns a Particles object for each parameter. The mandatory parameters are:
model: a subtype of AbstractDensity; see ApproxPosterior, ApproxKernelizedPosterior, CommonLogDensity.
N: number of particles in the ensemble; these particles will be evolved to generate new samples.
Ns: total number of samples which must be recorded.
Nc: total number of chains to run in parallel if MCMCThreads or MCMCDistributed is enabled.
The optional arguments available are:
discard_initial: number of mcmc particles to discard before saving any sample.
ntransitions: number of mcmc steps per particle between each sample.
retry_sampling: maximum number of attempts to resample an initial particle whose cost (or log-density) is ±∞ or NaN.
progress: a boolean flag controlling whether progress is displayed.
Minimal Example for CommonLogDensity:
using KissABC
D = CommonLogDensity(
2, #number of parameters
rng -> randn(rng, 2), # initial sampling strategy
x -> -100 * (x[1] - x[2]^2)^2 - (x[2] - 1)^2, # rosenbrock banana log-density
)
res = sample(D, AIS(50), 1000, ntransitions = 100, discard_initial = 500, progress = false)
println(res)
output:
Particles{Float64,1000}[1.43 ± 1.4, 0.99 ± 0.67]
Minimal Example for ApproxKernelizedPosterior (ApproxPosterior):
using KissABC
prior = Uniform(-10, 10) # prior distribution for parameter
sim(μ) = μ + rand((randn() * 0.1, randn())) # simulator function
cost(x) = abs(sim(x) - 0.0) # cost function to compare simulations to target data, in this case simply '0'
plan = ApproxPosterior(prior, cost, 0.01) # Approximate model of log-posterior density (ABC)
# ApproxKernelizedPosterior can be used in the same fashion
res = sample(plan, AIS(100), 2000, discard_initial = 10000, progress = false)
println(res)
output:
0.0 ± 0.46
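As listed in the signatures above, the same plan can also be sampled with several chains in parallel; a minimal sketch, reusing plan from the previous example and assuming Julia was started with multiple threads and that MCMCThreads (from AbstractMCMC.jl, on which sample builds) is in scope:
# 4 chains of 1000 recorded samples each, one chain per thread
res_threads = sample(plan, AIS(100), MCMCThreads(), 1000, 4, discard_initial = 10000, progress = false)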