Date:      Thu, 9 Jul 2020 18:08:06 +0000 (UTC)
From:      Sunpoet Po-Chuan Hsieh <sunpoet@FreeBSD.org>
To:        ports-committers@freebsd.org, svn-ports-all@freebsd.org, svn-ports-head@freebsd.org
Subject:   svn commit: r541767 - in head/math: . py-jax
Message-ID:  <202007091808.069I86cw036835@repo.freebsd.org>

Author: sunpoet
Date: Thu Jul  9 18:08:06 2020
New Revision: 541767
URL: https://svnweb.freebsd.org/changeset/ports/541767

Log:
  Add py-jax 0.1.72
  
  JAX is Autograd and XLA, brought together for high-performance machine learning
  research.
  
  With its updated version of Autograd, JAX can automatically differentiate native
  Python and NumPy functions. It can differentiate through loops, branches,
  recursion, and closures, and it can take derivatives of derivatives of
  derivatives. It supports reverse-mode differentiation (a.k.a. backpropagation)
  via grad as well as forward-mode differentiation, and the two can be composed
  arbitrarily to any order.
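  
  As a rough sketch of the grad behaviour described above (illustration only,
  not part of the commit; assumes the packaged module is importable as jax):
  
      import jax
      import jax.numpy as jnp
  
      def f(x):
          return jnp.sin(x) * x ** 2      # ordinary NumPy-style Python code
  
      df = jax.grad(f)                    # reverse-mode derivative of f
      ddf = jax.grad(jax.grad(f))         # derivatives compose to any order
  
      print(df(2.0), ddf(2.0))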
  
  What's new is that JAX uses XLA to compile and run your NumPy programs on GPUs
  and TPUs. Compilation happens under the hood by default, with library calls
  getting just-in-time compiled and executed. But JAX also lets you just-in-time
  compile your own Python functions into XLA-optimized kernels using a
  one-function API, jit. Compilation and automatic differentiation can be composed
  arbitrarily, so you can express sophisticated algorithms and get maximal
  performance without leaving Python. You can even program multiple GPUs or TPU
  cores at once using pmap, and differentiate through the whole thing.
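  
  For example, jit and grad compose as described (rough sketch only; this
  assumes a usable jaxlib/XLA backend is installed separately, since jaxlib is
  not among this port's RUN_DEPENDS):
  
      import jax
      import jax.numpy as jnp
  
      def loss(w, x, y):
          return jnp.mean((jnp.dot(x, w) - y) ** 2)   # plain NumPy-style loss
  
      fast_grad = jax.jit(jax.grad(loss))             # JIT-compile the gradient
      w = jnp.zeros(3)
      x = jnp.ones((10, 3))
      y = jnp.ones(10)
      print(fast_grad(w, x, y))                       # gradient w.r.t. w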
  
  Dig a little deeper, and you'll see that JAX is really an extensible system for
  composable function transformations. Both grad and jit are instances of such
  transformations. Others are vmap for automatic vectorization and pmap for
  single-program multiple-data (SPMD) parallel programming of multiple
  accelerators, with more to come.
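  
  A minimal illustration of vmap (not part of the commit; the function and
  shapes below are arbitrary examples):
  
      import jax
      import jax.numpy as jnp
  
      def dot(a, b):
          return jnp.dot(a, b)          # written for single vectors
  
      batched_dot = jax.vmap(dot)       # now maps over a leading batch axis
      a = jnp.ones((8, 3))
      b = jnp.ones((8, 3))
      print(batched_dot(a, b).shape)    # (8,)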
  
  WWW: https://github.com/google/jax

Added:
  head/math/py-jax/
  head/math/py-jax/Makefile   (contents, props changed)
  head/math/py-jax/distinfo   (contents, props changed)
  head/math/py-jax/pkg-descr   (contents, props changed)
Modified:
  head/math/Makefile

Modified: head/math/Makefile
==============================================================================
--- head/math/Makefile	Thu Jul  9 18:08:00 2020	(r541766)
+++ head/math/Makefile	Thu Jul  9 18:08:06 2020	(r541767)
@@ -762,6 +762,7 @@
     SUBDIR += py-hdbscan
     SUBDIR += py-hdmedians
     SUBDIR += py-intspan
+    SUBDIR += py-jax
     SUBDIR += py-keras
     SUBDIR += py-keras-applications
     SUBDIR += py-keras-preprocessing

Added: head/math/py-jax/Makefile
==============================================================================
--- /dev/null	00:00:00 1970	(empty, because file is newly added)
+++ head/math/py-jax/Makefile	Thu Jul  9 18:08:06 2020	(r541767)
@@ -0,0 +1,24 @@
+# Created by: Po-Chuan Hsieh <sunpoet@FreeBSD.org>
+# $FreeBSD$
+
+PORTNAME=	jax
+PORTVERSION=	0.1.72
+CATEGORIES=	math python
+MASTER_SITES=	CHEESESHOP
+PKGNAMEPREFIX=	${PYTHON_PKGNAMEPREFIX}
+
+MAINTAINER=	sunpoet@FreeBSD.org
+COMMENT=	Differentiate, compile, and transform Numpy code
+
+LICENSE=	APACHE20
+
+RUN_DEPENDS=	${PYTHON_PKGNAMEPREFIX}absl-py>=0:devel/py-absl-py@${PY_FLAVOR} \
+		${PYNUMPY} \
+		${PYTHON_PKGNAMEPREFIX}opt-einsum>=0:math/py-opt-einsum@${PY_FLAVOR}
+
+USES=		python:3.6+
+USE_PYTHON=	autoplist concurrent distutils
+
+NO_ARCH=	yes
+
+.include <bsd.port.mk>

Added: head/math/py-jax/distinfo
==============================================================================
--- /dev/null	00:00:00 1970	(empty, because file is newly added)
+++ head/math/py-jax/distinfo	Thu Jul  9 18:08:06 2020	(r541767)
@@ -0,0 +1,3 @@
+TIMESTAMP = 1594308022
+SHA256 (jax-0.1.72.tar.gz) = b551a7b9fee31e744449191f83e0121d9a6a5a04755494df5b3cd468477f2119
+SIZE (jax-0.1.72.tar.gz) = 398705

Added: head/math/py-jax/pkg-descr
==============================================================================
--- /dev/null	00:00:00 1970	(empty, because file is newly added)
+++ head/math/py-jax/pkg-descr	Thu Jul  9 18:08:06 2020	(r541767)
@@ -0,0 +1,26 @@
+JAX is Autograd and XLA, brought together for high-performance machine learning
+research.
+
+With its updated version of Autograd, JAX can automatically differentiate native
+Python and NumPy functions. It can differentiate through loops, branches,
+recursion, and closures, and it can take derivatives of derivatives of
+derivatives. It supports reverse-mode differentiation (a.k.a. backpropagation)
+via grad as well as forward-mode differentiation, and the two can be composed
+arbitrarily to any order.
+
+What's new is that JAX uses XLA to compile and run your NumPy programs on GPUs
+and TPUs. Compilation happens under the hood by default, with library calls
+getting just-in-time compiled and executed. But JAX also lets you just-in-time
+compile your own Python functions into XLA-optimized kernels using a
+one-function API, jit. Compilation and automatic differentiation can be composed
+arbitrarily, so you can express sophisticated algorithms and get maximal
+performance without leaving Python. You can even program multiple GPUs or TPU
+cores at once using pmap, and differentiate through the whole thing.
+
+Dig a little deeper, and you'll see that JAX is really an extensible system for
+composable function transformations. Both grad and jit are instances of such
+transformations. Others are vmap for automatic vectorization and pmap for
+single-program multiple-data (SPMD) parallel programming of multiple
+accelerators, with more to come.
+
+WWW: https://github.com/google/jax


