Autograd
swMATH ID: 22077
Software Authors: Maclaurin, Dougal; Duvenaud, David; Johnson, Matt
Description: Autograd can automatically differentiate native Python and Numpy code. It can handle a large subset of Python’s features, including loops, ifs, recursion and closures, and it can even take derivatives of derivatives of derivatives. It supports reverse-mode differentiation (a.k.a. backpropagation), which means it can efficiently take gradients of scalar-valued functions with respect to array-valued arguments, as well as forward-mode differentiation, and the two can be composed arbitrarily. The main intended application of Autograd is gradient-based optimization. For more information, check out the tutorial and the examples directory.
Homepage: https://github.com/HIPS/autograd
Source Code: https://github.com/HIPS/autograd
Related Software: TensorFlow; PyTorch; Python; Theano; NumPy; SciPy; JAX; DiffSharp; Adam; GitHub; Scikit; Keras; Julia; Tangent; CVXPY; Pymanopt; Stan; Matlab; UCI-ml; OpenFermion
Cited in: 24 Documents
Cited by: 75 Authors
Cited in: 17 Serials
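A minimal usage sketch, in the spirit of the example in the project's README: `grad` turns an ordinary scalar-valued Python/NumPy function into a function that computes its reverse-mode derivative, and it can be applied repeatedly for higher-order derivatives. The numerical values in the comments are approximate.

```python
import autograd.numpy as np   # thinly wrapped NumPy
from autograd import grad     # reverse-mode differentiation operator

def tanh(x):
    """A scalar-valued function written in plain Python/NumPy."""
    y = np.exp(-2.0 * x)
    return (1.0 - y) / (1.0 + y)

d_tanh = grad(tanh)           # first derivative
dd_tanh = grad(d_tanh)        # derivative of a derivative

print(d_tanh(1.0))            # approx.  0.4200
print(dd_tanh(1.0))           # approx. -0.6397

# Finite-difference check of the first derivative:
print((tanh(1.0001) - tanh(0.9999)) / 0.0002)
```

The same `grad` operator also handles scalar-valued functions of array-valued arguments, which is the typical setting for the gradient-based optimization applications mentioned in the description.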