Details
Autograd: Automatic differentiator for Python and Numpy
Autograd can automatically differentiate native Python and NumPy code. It handles a large subset of Python's features, including loops, conditionals, recursion, and closures, and it can even take derivatives of derivatives of derivatives. It uses reverse-mode differentiation (a.k.a. backpropagation), which means it can efficiently take gradients of scalar-valued functions with respect to array-valued arguments. The main intended application is gradient-based optimization. For more information, check out the tutorial and the examples directory.
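The reverse-mode differentiation the description mentions can be illustrated with a minimal pure-Python sketch. This is a hypothetical toy, not Autograd's actual implementation or API: it builds a computation graph via operator overloading, then propagates gradients backward in reverse topological order.

```python
# Minimal sketch of reverse-mode differentiation (illustrative only;
# Autograd's real implementation traces NumPy operations instead).

class Var:
    def __init__(self, value, parents=()):
        self.value = value
        self.parents = parents   # pairs of (parent Var, local gradient)
        self.grad = 0.0

    def __add__(self, other):
        return Var(self.value + other.value, [(self, 1.0), (other, 1.0)])

    def __mul__(self, other):
        return Var(self.value * other.value,
                   [(self, other.value), (other, self.value)])

def backward(out):
    # Order the graph topologically, then accumulate gradients in reverse.
    order, seen = [], set()
    def visit(node):
        if id(node) not in seen:
            seen.add(id(node))
            for parent, _ in node.parents:
                visit(parent)
            order.append(node)
    visit(out)
    out.grad = 1.0
    for node in reversed(order):
        for parent, local in node.parents:
            parent.grad += node.grad * local

# f(x) = x*x + x, so f'(x) = 2x + 1; at x = 3, f(3) = 12 and f'(3) = 7.
x = Var(3.0)
f = x * x + x
backward(f)
print(f.value, x.grad)  # 12.0 7.0
```

The reverse pass visits each node once and distributes its accumulated gradient to its parents, which is why a single backward sweep yields the gradient of a scalar output with respect to every input.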
Tags: python, machine learning
Submitted by elementlist on Jan 13, 2017