An example using Nestle

Here we show a standalone example of using Nestle to estimate the parameters of a straight line model from data with Gaussian noise. The data and model used in this example are defined in createdata.py, which can be downloaded from here. The script shown below can be downloaded from here.
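For reference, a minimal sketch of what createdata.py might contain is shown below. The straight_line model function and the variables x, data, sigma and M are inferred from how they are used in the script; the true parameter values, noise level and number of data points shown here are placeholder assumptions, and the actual file may differ.

# createdata.py: a minimal sketch of the assumed contents (the real file may differ)
import numpy as np

def straight_line(x, m, c):
    """A straight line model: y = m*x + c."""
    return m*x + c

# true parameters of the model (placeholder values)
m = 3.5 # gradient of the line
c = 1.2 # y-intercept of the line

sigma = 2.0 # standard deviation of the Gaussian noise (placeholder value)
M = 100     # number of data points (placeholder value)

# create the data: a straight line with additive Gaussian noise
x = np.linspace(0., 10., M)
np.random.seed(123) # seed fixed for reproducibility
data = straight_line(x, m, c) + sigma*np.random.randn(M)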

Example code

#!/usr/bin/env python
# -*- coding: utf-8 -*-

"""
Example of running Nestle to fit the parameters of a straight line.
"""

from __future__ import print_function, division

import os
import sys
from scipy.special import ndtri
import numpy as np

# import Nestle
import nestle

# import model and data
from createdata import *

LN2PI = np.log(2.*np.pi)
LNSIGMA = np.log(sigma)

def prior_transform(theta):
    """
    A function defining the transform between the parameterisation in the unit hypercube
    and the true parameters.

    Args:
        theta (tuple): a tuple containing the parameters in their unit hypercube form.
        
    Returns:
        tuple: a new tuple or array with the transformed parameters.
    """

    mprime, cprime = theta # unpack the parameters (in their unit hypercube form)

    cmin = -10.  # lower bound on uniform prior on c
    cmax = 10.   # upper bound on uniform prior on c

    mmu = 0.     # mean of Gaussian prior on m
    msigma = 10. # standard deviation of Gaussian prior on m

    m = mmu + msigma*ndtri(mprime) # convert back to m
    c = cprime*(cmax-cmin) + cmin  # convert back to c

    return (m, c)

def loglikelihood_nestle(theta):
    """
    The log-likelihood function.
    """

    m, c = theta # unpack the parameters

    # normalisation
    norm = -0.5*M*LN2PI - M*LNSIGMA

    # chi-squared (data, sigma and x are global variables imported from createdata.py)
    chisq = np.sum(((data-straight_line(x, m, c))/sigma)**2)

    return norm - 0.5*chisq

nlive = 1024     # number of live points
method = 'multi' # use the MultiNest-like multi-ellipsoidal sampling algorithm
ndims = 2        # two parameters
tol = 0.5        # the stopping criterion (this is the nestle default, so doesn't need to be set)

res = nestle.sample(loglikelihood_nestle, prior_transform, ndims, method=method, npoints=nlive, dlogz=tol)

logZnestle = res.logz                         # value of logZ
infogainnestle = res.h                        # value of the information gain in nats
logZerrnestle = np.sqrt(infogainnestle/nlive) # estimate of the statistical uncertainty on logZ

# output marginal likelihood
print('Marginalised evidence is {} ± {}'.format(logZnestle, logZerrnestle))

# re-scale weights to have a maximum of one
nweights = res.weights/np.max(res.weights)

# get the indices of samples to keep, using the normalised weights as acceptance probabilities
keepidx = np.where(np.random.rand(len(nweights)) < nweights)[0]

# get the posterior samples
postsamples = res.samples[keepidx,:]

print('Number of posterior samples is {}'.format(postsamples.shape[0]))

# plot posterior samples (if corner.py is installed)
try:
    import matplotlib as mpl
    mpl.use("Agg") # force Matplotlib backend to Agg
    import corner # import corner.py
except ImportError:
    sys.exit(1) # exit if Matplotlib or corner.py is not installed

# create the corner plot (m and c here are the true values imported from createdata.py)
fig = corner.corner(postsamples, labels=[r"$m$", r"$c$"], truths=[m, c])
fig.savefig('Nestle.png')
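The posterior can also be summarised directly from the weighted nested samples. The snippet below is a minimal sketch of one way to do this using the res object returned by nestle.sample above; it assumes the nestle.mean_and_cov helper (which computes weighted means and covariances from the samples) is available in your installed version of Nestle.

# sketch: summarise the posterior using the weighted nested samples
# (assumes nestle.mean_and_cov exists in the installed version of Nestle)
pmean, pcov = nestle.mean_and_cov(res.samples, res.weights)
print('m = {} ± {}'.format(pmean[0], np.sqrt(pcov[0, 0])))
print('c = {} ± {}'.format(pmean[1], np.sqrt(pcov[1, 1])))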

Running the code

A description of installing Nestle is given here. If you have downloaded the createdata.py and test_Nestle.py scripts into the directory ${HOME}, then you can run the example using:

python test_Nestle.py

If you have Matplotlib installed then the script will produce a plot of the posterior distributions on the straight line parameters $m$ and $c$.

A Python 3 Docker image with Nestle installed is available, which can be used with:

docker run -it -v ${HOME}:/work mattpitkin/samplers:python3

to enter an interactive container, and then within the container the test script can be run with:

python test_Nestle.py

Comments