Autograd: Automatic differentiator for Python and Numpy

Autograd can automatically differentiate native Python and Numpy code. It handles a large subset of Python's features, including loops, conditionals, recursion, and closures, and it can even take derivatives of derivatives of derivatives. It uses reverse-mode differentiation (a.k.a. backpropagation), which means it can efficiently take gradients of scalar-valued functions with respect to array-valued arguments. The main intended application is gradient-based optimization. For more information, check out the tutorial and the examples directory.
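The reverse-mode differentiation the description refers to can be sketched in a few lines of plain Python. This is only an illustrative toy, not Autograd's actual implementation or API: the `Var` class, `tanh` helper, and `backward` function are hypothetical names. Each operation records its inputs and local derivatives, and `backward` propagates gradients through the recorded graph in reverse order.

```python
import math

class Var:
    """A value plus the computation-graph bookkeeping for reverse mode."""
    def __init__(self, value, parents=()):
        self.value = value
        self.parents = parents  # pairs of (parent Var, local derivative)
        self.grad = 0.0

    def __add__(self, other):
        other = other if isinstance(other, Var) else Var(other)
        # d(a+b)/da = 1, d(a+b)/db = 1
        return Var(self.value + other.value, [(self, 1.0), (other, 1.0)])

    __radd__ = __add__

    def __mul__(self, other):
        other = other if isinstance(other, Var) else Var(other)
        # d(a*b)/da = b, d(a*b)/db = a
        return Var(self.value * other.value,
                   [(self, other.value), (other, self.value)])

    __rmul__ = __mul__

def tanh(x):
    t = math.tanh(x.value)
    return Var(t, [(x, 1.0 - t * t)])  # d tanh(x)/dx = 1 - tanh(x)^2

def backward(out):
    """Accumulate d(out)/d(node) into node.grad for every node."""
    order, seen = [], set()
    def topo(v):
        if v not in seen:
            seen.add(v)
            for parent, _ in v.parents:
                topo(parent)
            order.append(v)
    topo(out)
    out.grad = 1.0
    for v in reversed(order):  # reverse topological order = backprop
        for parent, local in v.parents:
            parent.grad += local * v.grad

# Example: f(x) = x * tanh(x), so f'(x) = tanh(x) + x * (1 - tanh(x)^2)
x = Var(0.5)
y = x * tanh(x)
backward(y)
expected = math.tanh(0.5) + 0.5 * (1.0 - math.tanh(0.5) ** 2)
```

The one backward sweep computes the gradient of the scalar output with respect to every input at once, which is why reverse mode is the efficient choice for scalar-valued functions of many (or array-valued) arguments.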
Submitted by elementlist on Jan 13, 2017