path: root/math/Makefile
author Ruslan Makhmatkhanov <rm@FreeBSD.org> 2019-02-27 22:11:15 +0000
committer Ruslan Makhmatkhanov <rm@FreeBSD.org> 2019-02-27 22:11:15 +0000
commit edc98fd01e771202aa5b15b23421e6318f37f156 (patch)
tree c33c1e687df9a895b5ee28da9c42dd5d0af5a6cd /math/Makefile
parent c7f0ef2cc127ee95eddaf51262de00291cb8391d (diff)
Autograd can automatically differentiate native Python and NumPy code. It can handle a large subset of Python's features, including loops, ifs, recursion, and closures, and it can even take derivatives of derivatives of derivatives. It supports reverse-mode differentiation (a.k.a. backpropagation), which means it can efficiently take gradients of scalar-valued functions with respect to array-valued arguments, as well as forward-mode differentiation, and the two can be composed arbitrarily. The main intended application of Autograd is gradient-based optimization.

WWW: https://github.com/HIPS/autograd
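As a minimal sketch of what the ported package provides (the function name tanh_example and the sample input are illustrative, not taken from the port), reverse-mode differentiation of an ordinary NumPy function looks like this:

    # Hypothetical usage sketch of autograd's grad() on a plain NumPy function.
    import autograd.numpy as np   # autograd's thin wrapper around NumPy
    from autograd import grad

    def tanh_example(x):
        # An ordinary scalar-valued Python/NumPy function.
        return np.tanh(x)

    d_tanh = grad(tanh_example)   # reverse-mode derivative of tanh_example
    print(d_tanh(1.0))            # ~0.419974, i.e. 1 - tanh(1.0)**2

Here grad() returns a new function that evaluates the derivative of the scalar-valued input function at a given argument.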
Notes: svn path=/head/; revision=494091
Diffstat (limited to 'math/Makefile')
-rw-r--r--  math/Makefile  1
1 file changed, 1 insertion, 0 deletions
diff --git a/math/Makefile b/math/Makefile
index 815b23289681..4689fbe81a6e 100644
--- a/math/Makefile
+++ b/math/Makefile
@@ -690,6 +690,7 @@
SUBDIR += py-algopy
SUBDIR += py-altgraph
SUBDIR += py-apgl
+ SUBDIR += py-autograd
SUBDIR += py-basemap
SUBDIR += py-basemap-data
SUBDIR += py-bayesian-optimization