author    Sunpoet Po-Chuan Hsieh <sunpoet@FreeBSD.org>    2020-07-09 18:08:06 +0000
committer Sunpoet Po-Chuan Hsieh <sunpoet@FreeBSD.org>    2020-07-09 18:08:06 +0000
commit    4d0e669f8dc658817239bee59b9596acd2eceaf8 (patch)
tree      19627229200c86f348277b54bd8cddcbaea2f4ca /math/py-jax
parent    25b4607212d5feafbdbc78821d3c257e9703a568 (diff)
Add py-jax 0.1.72
JAX is Autograd and XLA, brought together for high-performance machine learning
research.

With its updated version of Autograd, JAX can automatically differentiate native
Python and NumPy functions. It can differentiate through loops, branches,
recursion, and closures, and it can take derivatives of derivatives of
derivatives. It supports reverse-mode differentiation (a.k.a. backpropagation)
via grad as well as forward-mode differentiation, and the two can be composed
arbitrarily to any order.

What's new is that JAX uses XLA to compile and run your NumPy programs on GPUs
and TPUs. Compilation happens under the hood by default, with library calls
getting just-in-time compiled and executed. But JAX also lets you just-in-time
compile your own Python functions into XLA-optimized kernels using a
one-function API, jit. Compilation and automatic differentiation can be composed
arbitrarily, so you can express sophisticated algorithms and get maximal
performance without leaving Python. You can even program multiple GPUs or TPU
cores at once using pmap, and differentiate through the whole thing.

Dig a little deeper, and you'll see that JAX is really an extensible system for
composable function transformations. Both grad and jit are instances of such
transformations. Others are vmap for automatic vectorization and pmap for
single-program multiple-data (SPMD) parallel programming of multiple
accelerators, with more to come.

WWW: https://github.com/google/jax
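
As an aside, not part of the commit message: a minimal sketch of the grad/jit
composition described above, assuming jax 0.1.72 plus a matching jaxlib wheel
for the CPU backend are installed; the names loss, grad_loss, and fast_grad are
illustrative only.

    # Illustrative only: grad and jit compose as ordinary function transformations.
    import jax
    import jax.numpy as jnp

    def loss(x):
        return jnp.sum(jnp.tanh(x) ** 2)   # scalar output, so grad applies directly

    grad_loss = jax.grad(loss)             # reverse-mode derivative of loss
    fast_grad = jax.jit(grad_loss)         # XLA-compile the gradient function

    print(fast_grad(jnp.arange(4.0)))      # same values as grad_loss(jnp.arange(4.0))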
Notes: svn path=/head/; revision=541767
Diffstat (limited to 'math/py-jax')
-rw-r--r--  math/py-jax/Makefile   | 24
-rw-r--r--  math/py-jax/distinfo   |  3
-rw-r--r--  math/py-jax/pkg-descr  | 26
3 files changed, 53 insertions, 0 deletions
diff --git a/math/py-jax/Makefile b/math/py-jax/Makefile
new file mode 100644
index 000000000000..17acf6d916cb
--- /dev/null
+++ b/math/py-jax/Makefile
@@ -0,0 +1,24 @@
+# Created by: Po-Chuan Hsieh <sunpoet@FreeBSD.org>
+# $FreeBSD$
+
+PORTNAME= jax
+PORTVERSION= 0.1.72
+CATEGORIES= math python
+MASTER_SITES= CHEESESHOP
+PKGNAMEPREFIX= ${PYTHON_PKGNAMEPREFIX}
+
+MAINTAINER= sunpoet@FreeBSD.org
+COMMENT= Differentiate, compile, and transform Numpy code
+
+LICENSE= APACHE20
+
+RUN_DEPENDS= ${PYTHON_PKGNAMEPREFIX}absl-py>=0:devel/py-absl-py@${PY_FLAVOR} \
+ ${PYNUMPY} \
+ ${PYTHON_PKGNAMEPREFIX}opt-einsum>=0:math/py-opt-einsum@${PY_FLAVOR}
+
+USES= python:3.6+
+USE_PYTHON= autoplist concurrent distutils
+
+NO_ARCH= yes
+
+.include <bsd.port.mk>
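
Not part of the patch: a hypothetical smoke test matching the RUN_DEPENDS list
above, assuming the installed module names are absl, numpy, opt_einsum, and jax,
and that a jaxlib build (which jax needs at runtime but which is not listed in
this Makefile) is also available.

    # Hypothetical post-install smoke test for the declared runtime dependencies.
    import absl         # devel/py-absl-py
    import numpy        # ${PYNUMPY}
    import opt_einsum   # math/py-opt-einsum

    import jax          # assumes a jaxlib CPU backend is present
    import jax.numpy as jnp

    print("jax", jax.__version__)
    print(jax.jit(lambda x: jnp.dot(x, x))(jnp.arange(3.0)))   # expected 5.0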
diff --git a/math/py-jax/distinfo b/math/py-jax/distinfo
new file mode 100644
index 000000000000..56bfd32e33c5
--- /dev/null
+++ b/math/py-jax/distinfo
@@ -0,0 +1,3 @@
+TIMESTAMP = 1594308022
+SHA256 (jax-0.1.72.tar.gz) = b551a7b9fee31e744449191f83e0121d9a6a5a04755494df5b3cd468477f2119
+SIZE (jax-0.1.72.tar.gz) = 398705
diff --git a/math/py-jax/pkg-descr b/math/py-jax/pkg-descr
new file mode 100644
index 000000000000..abc6fb51f8d8
--- /dev/null
+++ b/math/py-jax/pkg-descr
@@ -0,0 +1,26 @@
+JAX is Autograd and XLA, brought together for high-performance machine learning
+research.
+
+With its updated version of Autograd, JAX can automatically differentiate native
+Python and NumPy functions. It can differentiate through loops, branches,
+recursion, and closures, and it can take derivatives of derivatives of
+derivatives. It supports reverse-mode differentiation (a.k.a. backpropagation)
+via grad as well as forward-mode differentiation, and the two can be composed
+arbitrarily to any order.
+
+What's new is that JAX uses XLA to compile and run your NumPy programs on GPUs
+and TPUs. Compilation happens under the hood by default, with library calls
+getting just-in-time compiled and executed. But JAX also lets you just-in-time
+compile your own Python functions into XLA-optimized kernels using a
+one-function API, jit. Compilation and automatic differentiation can be composed
+arbitrarily, so you can express sophisticated algorithms and get maximal
+performance without leaving Python. You can even program multiple GPUs or TPU
+cores at once using pmap, and differentiate through the whole thing.
+
+Dig a little deeper, and you'll see that JAX is really an extensible system for
+composable function transformations. Both grad and jit are instances of such
+transformations. Others are vmap for automatic vectorization and pmap for
+single-program multiple-data (SPMD) parallel programming of multiple
+accelerators, with more to come.
+
+WWW: https://github.com/google/jax
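
A short, illustrative sketch of the vmap transformation mentioned in the
description (not shipped with the port); predict, w, and xs are made-up names,
and running it needs jaxlib alongside jax.

    # Illustrative only: vmap maps a per-example function over a batch axis.
    import jax
    import jax.numpy as jnp

    def predict(w, x):
        return jnp.tanh(jnp.dot(w, x))                  # toy per-example computation

    w = jnp.ones(3)
    xs = jnp.stack([jnp.arange(3.0), jnp.ones(3)])      # batch of two inputs

    batched = jax.vmap(predict, in_axes=(None, 0))      # broadcast w, map over xs
    print(batched(w, xs))                               # shape (2,)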