5.1.6.3. numdifftools.nd_algopy.Jacobian
- class Jacobian(fun, n=1, method='forward', full_output=False)
Calculate the Jacobian with an algorithmic differentiation method
- Parameters
- fun: function
function of one array fun(x, *args, **kwds)
- method: string, optional {‘forward’, ‘reverse’}
defines the method used in the approximation
- Returns
- jacob: array
Jacobian
Notes
Algorithmic differentiation is a set of techniques to numerically evaluate the derivative of a function specified by a computer program. AD exploits the fact that every computer program, no matter how complicated, executes a sequence of elementary arithmetic operations (addition, subtraction, multiplication, division, etc.) and elementary functions (exp, log, sin, cos, etc.). By applying the chain rule repeatedly to these operations, derivatives of arbitrary order can be computed automatically, accurately to working precision, and using at most a small constant factor more arithmetic operations than the original program.
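The forward mode described above can be illustrated with a hand-rolled dual-number class (a sketch of the general technique, not part of AlgoPy or numdifftools): each value carries its derivative along, and the elementary operations propagate both via the product and chain rules.

```python
import math

class Dual:
    """Minimal dual number: value + eps * derivative, with eps**2 == 0."""
    def __init__(self, val, der=0.0):
        self.val, self.der = val, der

    def __add__(self, other):
        other = other if isinstance(other, Dual) else Dual(other)
        return Dual(self.val + other.val, self.der + other.der)

    __radd__ = __add__

    def __mul__(self, other):
        other = other if isinstance(other, Dual) else Dual(other)
        # product rule applied automatically
        return Dual(self.val * other.val,
                    self.der * other.val + self.val * other.der)

    __rmul__ = __mul__

def exp(x):
    # chain rule: d/dt exp(x(t)) = exp(x) * x'
    return Dual(math.exp(x.val), math.exp(x.val) * x.der)

# derivative of f(x) = x * exp(x) at x = 1; analytically f'(1) = 2*e
x = Dual(1.0, 1.0)  # seed the input derivative with 1
y = x * exp(x)
print(y.val, y.der)  # e and 2*e, accurate to working precision
```

Reverse mode instead records the operations on a tape and propagates adjoints backwards, which is cheaper when the function has many inputs and few outputs.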
References
Sebastian F. Walter and Lutz Lehmann (2013), "Algorithmic differentiation in Python with AlgoPy", Journal of Computational Science, vol. 4, no. 5, pp. 334-344, http://www.sciencedirect.com/science/article/pii/S1877750311001013
https://en.wikipedia.org/wiki/Automatic_differentiation
Examples
>>> import numpy as np
>>> import numdifftools.nd_algopy as nda
# Nonlinear least squares:
>>> xdata = np.arange(0, 1, 0.1)
>>> ydata = 1 + 2 * np.exp(0.75 * xdata)
>>> fun = lambda c: (c[0] + c[1] * np.exp(c[2] * xdata) - ydata)**2
Jfun = nda.Jacobian(fun)  # Todo: This does not work
Jfun([1, 2, 0.75]).T  # should be numerically zero
array([[ 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.],
       [ 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.],
       [ 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.]])
>>> Jfun2 = nda.Jacobian(fun, method='reverse')
>>> res = Jfun2([1, 2, 0.75]).T  # should be numerically zero
>>> np.allclose(res,
...             [[ 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.],
...              [ 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.],
...              [ 0., 0., 0., 0., 0., 0., 0., 0., 0., 0.]])
True
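The zero Jacobian above is expected: ydata was generated with exactly c = [1, 2, 0.75], so every residual r_i(c) = c[0] + c[1]*exp(c[2]*x_i) - y_i vanishes there, and by the chain rule each entry of the Jacobian of the squared residuals is 2*r_i*(dr_i/dc_j) = 0. A plain-NumPy check of this reasoning (independent of AlgoPy, with the partial derivatives written out by hand):

```python
import numpy as np

xdata = np.arange(0, 1, 0.1)
ydata = 1 + 2 * np.exp(0.75 * xdata)

# residuals r_i(c) = c0 + c1*exp(c2*x_i) - y_i vanish at the true parameters
c = [1.0, 2.0, 0.75]
r = c[0] + c[1] * np.exp(c[2] * xdata) - ydata
print(np.allclose(r, 0))  # True

# dr/dc0 = 1, dr/dc1 = exp(c2*x), dr/dc2 = c1*x*exp(c2*x);
# chain rule: d(r_i**2)/dc_j = 2 * r_i * dr_i/dc_j, zero whenever r is zero
dr = np.column_stack([np.ones_like(xdata),
                      np.exp(c[2] * xdata),
                      c[1] * xdata * np.exp(c[2] * xdata)])
J = 2 * r[:, None] * dr
print(np.allclose(J, 0))  # True
```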
>>> f2 = lambda x: x[0] * x[1] * x[2]**2
>>> Jfun2 = nda.Jacobian(f2)
>>> np.allclose(Jfun2([1., 2., 3.]), [[ 18., 9., 12.]])
True
>>> Jfun21 = nda.Jacobian(f2, method='reverse')
>>> np.allclose(Jfun21([1., 2., 3.]), [[ 18., 9., 12.]])
True
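As a sanity check on the values above, the Jacobian of f2 can also be written down analytically: for f(x) = x0*x1*x2**2 the gradient is [x1*x2**2, x0*x2**2, 2*x0*x1*x2]. A small NumPy helper (illustrative only, not part of numdifftools) reproduces the same row vector:

```python
import numpy as np

def jac_f2(x):
    # analytic gradient of f2(x) = x0 * x1 * x2**2, as a 1 x 3 Jacobian
    x0, x1, x2 = x
    return np.array([[x1 * x2**2, x0 * x2**2, 2 * x0 * x1 * x2]])

print(jac_f2([1., 2., 3.]))  # [[18., 9., 12.]], matching the AD result
```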
>>> def fun3(x):
...     n = int(np.prod(np.shape(x[0])))
...     out = nda.algopy.zeros((2, n), dtype=x)
...     out[0] = x[0]*x[1]*x[2]**2
...     out[1] = x[0]*x[1]*x[2]
...     return out
>>> Jfun3 = nda.Jacobian(fun3)
>>> np.allclose(Jfun3([1., 2., 3.]), [[[18., 9., 12.]], [[6., 3., 2.]]])
True
>>> np.allclose(Jfun3([4., 5., 6.]), [[[180., 144., 240.]],
...                                   [[30., 24., 20.]]])
True
>>> np.allclose(Jfun3(np.array([[1., 2., 3.], [4., 5., 6.]]).T),
...             [[[18., 0., 9., 0., 12., 0.],
...               [0., 180., 0., 144., 0., 240.]],
...              [[6., 0., 3., 0., 2., 0.],
...               [0., 30., 0., 24., 0., 20.]]])
True
Methods
__call__: callable with the following parameters:
    x: array_like
        value at which function derivative is evaluated
    args: tuple
        Arguments for function fun.
    kwds: dict
        Keyword arguments for function fun.
- __init__(fun, n=1, method='forward', full_output=False)
Methods

__init__(fun[, n, method, full_output])
computational_graph(x, *args, **kwds)

Attributes

fun