| field | value | |
|---|---|---|
| author | Ruslan Makhmatkhanov <rm@FreeBSD.org> | 2019-02-27 22:11:15 +0000 |
| committer | Ruslan Makhmatkhanov <rm@FreeBSD.org> | 2019-02-27 22:11:15 +0000 |
| commit | edc98fd01e771202aa5b15b23421e6318f37f156 (patch) | |
| tree | c33c1e687df9a895b5ee28da9c42dd5d0af5a6cd | /math/py-autograd |
| parent | c7f0ef2cc127ee95eddaf51262de00291cb8391d (diff) | |
Diffstat (limited to 'math/py-autograd')
| mode | file | lines added |
|---|---|---|
| -rw-r--r-- | math/py-autograd/Makefile | 22 |
| -rw-r--r-- | math/py-autograd/distinfo | 3 |
| -rw-r--r-- | math/py-autograd/pkg-descr | 10 |
3 files changed, 35 insertions, 0 deletions
```diff
diff --git a/math/py-autograd/Makefile b/math/py-autograd/Makefile
new file mode 100644
index 000000000000..9e9f5b8da2b3
--- /dev/null
+++ b/math/py-autograd/Makefile
@@ -0,0 +1,22 @@
+# $FreeBSD$
+
+PORTNAME=	autograd
+DISTVERSION=	1.2
+CATEGORIES=	math python
+MASTER_SITES=	CHEESESHOP
+PKGNAMEPREFIX=	${PYTHON_PKGNAMEPREFIX}
+
+MAINTAINER=	rm@FreeBSD.org
+COMMENT=	Efficiently computes derivatives of numpy code
+
+LICENSE=	MIT
+
+RUN_DEPENDS=	${PYNUMPY} \
+		${PYTHON_PKGNAMEPREFIX}future>=0.15.2:devel/py-future@${PY_FLAVOR}
+
+USES=		python
+USE_PYTHON=	autoplist distutils
+
+NO_ARCH=	yes
+
+.include <bsd.port.mk>
diff --git a/math/py-autograd/distinfo b/math/py-autograd/distinfo
new file mode 100644
index 000000000000..1cee31731d1c
--- /dev/null
+++ b/math/py-autograd/distinfo
@@ -0,0 +1,3 @@
+TIMESTAMP = 1551302910
+SHA256 (autograd-1.2.tar.gz) = a08bfa6d539b7a56e7c9f4d0881044afbef5e75f324a394c2494de963ea4a47d
+SIZE (autograd-1.2.tar.gz) = 32540
diff --git a/math/py-autograd/pkg-descr b/math/py-autograd/pkg-descr
new file mode 100644
index 000000000000..6f472f4ef6f7
--- /dev/null
+++ b/math/py-autograd/pkg-descr
@@ -0,0 +1,10 @@
+Autograd can automatically differentiate native Python and Numpy code. It can
+handle a large subset of Python's features, including loops, ifs, recursion and
+closures, and it can even take derivatives of derivatives of derivatives. It
+supports reverse-mode differentiation (a.k.a. backpropagation), which means it
+can efficiently take gradients of scalar-valued functions with respect to
+array-valued arguments, as well as forward-mode differentiation, and the two
+can be composed arbitrarily. The main intended application of Autograd is
+gradient-based optimization.
+
+WWW: https://github.com/HIPS/autograd
```
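For context, the pkg-descr above is what the port installs as its description. A minimal usage sketch of the packaged module (not part of the port itself, and assuming only the documented `autograd.grad` and `autograd.numpy` interface of the upstream project) illustrates the reverse-mode differentiation and the composition of derivatives it describes:

```python
# Minimal sketch of the upstream autograd API; the tanh example follows the
# project's README. Requires the package this port installs plus numpy.
import autograd.numpy as np   # thinly wrapped numpy provided by autograd
from autograd import grad     # reverse-mode derivative of a scalar-valued function

def tanh(x):
    return (1.0 - np.exp(-x)) / (1.0 + np.exp(-x))

d_tanh = grad(tanh)       # df/dx, an ordinary Python function
dd_tanh = grad(d_tanh)    # derivatives compose, so higher orders are nested calls

print(tanh(1.0), d_tanh(1.0), dd_tanh(1.0))
```

Because the wrapped code is plain Python over numpy arrays, the same `grad` call also handles scalar-valued functions of array arguments, which is the gradient-based-optimization use case mentioned in the description.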