==============
Random numbers
==============

There are four considerations for random number generation and consumption in
PyNN:

**Reproducibility**: When comparing simulations with different backends, we may
wish to ensure that all backends use the same sequence of random numbers, so
that the only differences between simulations arise from the numerics of the
simulators.

**Performance**: All simulators have their own built-in facilities for random
number generation, and it may be faster to use these than to use random
numbers generated by PyNN.

**Distributed simulations**: When distributing simulations across multiple
processors using MPI, we may wish to ensure that the sequence of random
numbers is independent of the number of computation nodes.

**Quality**: Different models have different requirements for the quality of
the (pseudo-)random number generator used. For models that are not strongly
dependent on this, we may wish to use a generator that is faster but of lower
quality. For models that are highly sensitive, a slower but higher-quality
generator may be desired.

Because of these considerations, PyNN aims to provide a great deal of
flexibility in specifying random number generation for those who need it,
while hiding the details entirely for those who do not.

:class:`RNG` classes
====================

All functions and methods in the PyNN API that can make use of random numbers
have an optional *rng* argument, which should be an instance of a subclass of
:class:`pyNN.random.AbstractRNG`. PyNN provides three such sub-classes:

:class:`~pyNN.random.NumpyRNG`:
    Uses the :class:`numpy.random.RandomState` class (Mersenne Twister).

:class:`~pyNN.random.GSLRNG`:
    Uses the `GNU Scientific Library random number generators`_.

:class:`~pyNN.random.NativeRNG`:
    Signals that the simulator's own built-in RNG should be used.

If you wish to use your own random number generator, it is reasonably
straightforward to do so: see :doc:`reference/random` in the API reference.

.. note::

    If the *rng* argument is not supplied (or is `None`), then the method or
    function creates a new :class:`~pyNN.random.NumpyRNG` for its own use.

All :class:`RNG` classes accept a *seed* argument and a *parallel_safe*
argument. The latter is `True` by default, and ensures that the simulation
results will **not** depend on the number of MPI nodes in a distributed
simulation. This independence can be computationally costly, however, so it is
possible to set *parallel_safe=False*, accepting that the results will depend
on the number of nodes, in order to get better performance.

.. note::

    *parallel_safe* may or may not have any effect when using a
    :class:`~pyNN.random.NativeRNG`, depending on the simulator.

The :meth:`next` method
-----------------------

Apart from the constructor, :class:`RNG` classes have only one important
method: :meth:`next`, which returns a NumPy array containing random numbers
from the requested distribution:

.. testsetup::

    from pyNN.random import NumpyRNG, GSLRNG, RandomDistribution

.. doctest::

    >>> rng = NumpyRNG(seed=824756)
    >>> rng.next(5, 'normal', {'mu': 1.0, 'sigma': 0.2})
    array([ 0.65866423, 0.87500017, 0.90755753, 0.93793779, 0.94839735])
    >>> rng = GSLRNG(seed=824756, type='ranlxd2')  # RANLUX algorithm of Luescher
    >>> rng.next(5, 'normal', {'mu': 1.0, 'sigma': 0.2})
    array([ 0.61104097, 0.83086026, 0.87072741, 0.7513628 , 1.12875371])
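As a minimal sketch of the reproducibility consideration listed at the start of
this section, two generators constructed with the same seed draw the same
sequence, and either one can then be handed to the API via the *rng* argument.
The connector call in the final comment is hypothetical and only indicates
where such a generator would typically be passed:

.. code-block:: python

    from pyNN.random import NumpyRNG

    # Two generators with the same seed; parallel_safe=True (the default)
    # keeps the drawn sequence independent of the number of MPI processes.
    rng_a = NumpyRNG(seed=824756, parallel_safe=True)
    rng_b = NumpyRNG(seed=824756, parallel_safe=True)

    # Identical seeds give identical sequences, so two simulations sharing
    # a seed differ only in the numerics of their backends.
    assert (rng_a.next(5, 'normal', {'mu': 1.0, 'sigma': 0.2}) ==
            rng_b.next(5, 'normal', {'mu': 1.0, 'sigma': 0.2})).all()

    # Hypothetical usage: pass the generator wherever the API accepts an
    # rng argument, e.g. a connector (signature assumed, not verified here):
    # connector = sim.FixedProbabilityConnector(p_connect=0.1, rng=rng_a)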
In versions of PyNN prior to 0.8, distribution names and parameterisations
were not standardized: for example, :class:`GSLRNG` required 'gaussian' rather
than 'normal'. As of PyNN 0.8, the following standardized names are used:

========================== ==================== ===============================================
Name                       Parameters           Comments
-------------------------- -------------------- -----------------------------------------------
binomial                   n, p
gamma                      k, theta
exponential                beta
lognormal                  mu, sigma
normal                     mu, sigma
normal_clipped             mu, sigma, low, high Values outside (low, high) are redrawn
normal_clipped_to_boundary mu, sigma, low, high Values below/above low/high are set to low/high
poisson                    lambda
uniform                    low, high
uniform_int                low, high
vonmises                   mu, kappa
========================== ==================== ===============================================

The :class:`~pyNN.random.RandomDistribution` class
==================================================

The :class:`~pyNN.random.RandomDistribution` class encapsulates a choice of
random number generator and a choice of distribution, so that its :meth:`next`
method requires only the number of values as an argument:

.. doctest::

    >>> gamma = RandomDistribution('gamma', (2.0, 0.3), rng=NumpyRNG(seed=72386))
    >>> gamma.next(5)
    array([ 0.4325809 , 0.12952503, 1.58510406, 0.81182457, 0.07577787])

Alternatively, you can provide the parameters as keyword arguments, e.g.:

.. doctest::

    >>> gamma = RandomDistribution('gamma', k=2.0, theta=0.3, rng=NumpyRNG(seed=72386))

Note that :meth:`~pyNN.random.RandomDistribution.next` called without any
arguments returns a single number, not an array:

.. doctest::

    >>> gamma.next()
    0.52020946027308368
    >>> gamma.next(1)
    array([ 0.4863944])

.. note::

    The apparent difference in precision between the single number and the
    array is not real: NumPy only *displays* a limited number of digits, but
    the numbers in the array have full precision.

.. _`GNU Scientific Library random number generators`: http://pygsl.sourceforge.net/reference/pygsl/module-pygsl.rng.html
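As a further sketch, any of the standardized names from the table above can be
used with :class:`~pyNN.random.RandomDistribution` in the same way as 'gamma';
the parameter values below are arbitrary and chosen only for illustration:

.. code-block:: python

    from pyNN.random import NumpyRNG, RandomDistribution

    rng = NumpyRNG(seed=72386)

    # 'normal_clipped' redraws any value that falls outside (low, high),
    # as noted in the table of standardized distribution names above.
    weight_distr = RandomDistribution('normal_clipped',
                                      mu=0.05, sigma=0.01, low=0.0, high=0.1,
                                      rng=rng)

    samples = weight_distr.next(5)   # an array of five clipped samples
    single = weight_distr.next()     # a single number, not an array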