From owner-svn-ports-all@freebsd.org Thu Jul  9 18:08:09 2020
Message-Id: <202007091808.069I86cw036835@repo.freebsd.org>
From: Sunpoet Po-Chuan Hsieh <sunpoet@FreeBSD.org>
Date: Thu, 9 Jul 2020 18:08:06 +0000 (UTC)
To: ports-committers@freebsd.org, svn-ports-all@freebsd.org, svn-ports-head@freebsd.org
Subject: svn commit: r541767 - in head/math: . py-jax
List-Id: SVN commit messages for the ports tree

Author: sunpoet
Date: Thu Jul  9 18:08:06 2020
New Revision: 541767

URL: https://svnweb.freebsd.org/changeset/ports/541767

Log:
  Add py-jax 0.1.72

  JAX is Autograd and XLA, brought together for high-performance machine
  learning research.

  With its updated version of Autograd, JAX can automatically differentiate
  native Python and NumPy functions. It can differentiate through loops,
  branches, recursion, and closures, and it can take derivatives of
  derivatives of derivatives. It supports reverse-mode differentiation
  (a.k.a. backpropagation) via grad as well as forward-mode differentiation,
  and the two can be composed arbitrarily to any order.

  What's new is that JAX uses XLA to compile and run your NumPy programs on
  GPUs and TPUs. Compilation happens under the hood by default, with library
  calls getting just-in-time compiled and executed. But JAX also lets you
  just-in-time compile your own Python functions into XLA-optimized kernels
  using a one-function API, jit. Compilation and automatic differentiation
  can be composed arbitrarily, so you can express sophisticated algorithms
  and get maximal performance without leaving Python. You can even program
  multiple GPUs or TPU cores at once using pmap, and differentiate through
  the whole thing.

  Dig a little deeper, and you'll see that JAX is really an extensible
  system for composable function transformations. Both grad and jit are
  instances of such transformations. Others are vmap for automatic
  vectorization and pmap for single-program multiple-data (SPMD) parallel
  programming of multiple accelerators, with more to come.
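The grad/jit composition described in the log above can be sketched in a few lines. This is a minimal illustration, assuming a working JAX installation; the function `loss` and its inputs are made-up examples, not part of this port.

```python
# Minimal sketch of composing JAX transformations, assuming JAX is
# installed; `loss` is an illustrative example function.
import jax

def loss(x):
    # simple scalar function: x^2 + 3x
    return x ** 2 + 3.0 * x

# reverse-mode derivative via grad: d/dx (x^2 + 3x) = 2x + 3
dloss = jax.grad(loss)

# transformations compose: jit-compile the gradient with XLA
fast_dloss = jax.jit(dloss)

print(float(dloss(2.0)))       # 7.0
print(float(fast_dloss(2.0)))  # 7.0, same value but compiled
```

Because grad and jit are ordinary function transformations, they can be nested to any order, e.g. `jax.grad(jax.grad(loss))` for a second derivative.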
  WWW: https://github.com/google/jax

Added:
  head/math/py-jax/
  head/math/py-jax/Makefile   (contents, props changed)
  head/math/py-jax/distinfo   (contents, props changed)
  head/math/py-jax/pkg-descr   (contents, props changed)
Modified:
  head/math/Makefile

Modified: head/math/Makefile
==============================================================================
--- head/math/Makefile	Thu Jul  9 18:08:00 2020	(r541766)
+++ head/math/Makefile	Thu Jul  9 18:08:06 2020	(r541767)
@@ -762,6 +762,7 @@ SUBDIR += py-hdbscan
     SUBDIR += py-hdmedians
     SUBDIR += py-intspan
+    SUBDIR += py-jax
     SUBDIR += py-keras
     SUBDIR += py-keras-applications
     SUBDIR += py-keras-preprocessing

Added: head/math/py-jax/Makefile
==============================================================================
--- /dev/null	00:00:00 1970	(empty, because file is newly added)
+++ head/math/py-jax/Makefile	Thu Jul  9 18:08:06 2020	(r541767)
@@ -0,0 +1,24 @@
+# Created by: Po-Chuan Hsieh
+# $FreeBSD$
+
+PORTNAME=	jax
+PORTVERSION=	0.1.72
+CATEGORIES=	math python
+MASTER_SITES=	CHEESESHOP
+PKGNAMEPREFIX=	${PYTHON_PKGNAMEPREFIX}
+
+MAINTAINER=	sunpoet@FreeBSD.org
+COMMENT=	Differentiate, compile, and transform Numpy code
+
+LICENSE=	APACHE20
+
+RUN_DEPENDS=	${PYTHON_PKGNAMEPREFIX}absl-py>=0:devel/py-absl-py@${PY_FLAVOR} \
+		${PYNUMPY} \
+		${PYTHON_PKGNAMEPREFIX}opt-einsum>=0:math/py-opt-einsum@${PY_FLAVOR}
+
+USES=		python:3.6+
+USE_PYTHON=	autoplist concurrent distutils
+
+NO_ARCH=	yes
+
+.include <bsd.port.mk>

Added: head/math/py-jax/distinfo
==============================================================================
--- /dev/null	00:00:00 1970	(empty, because file is newly added)
+++ head/math/py-jax/distinfo	Thu Jul  9 18:08:06 2020	(r541767)
@@ -0,0 +1,3 @@
+TIMESTAMP = 1594308022
+SHA256 (jax-0.1.72.tar.gz) = b551a7b9fee31e744449191f83e0121d9a6a5a04755494df5b3cd468477f2119
+SIZE (jax-0.1.72.tar.gz) = 398705

Added: head/math/py-jax/pkg-descr
==============================================================================
--- /dev/null	00:00:00 1970	(empty, because file is newly added)
+++ head/math/py-jax/pkg-descr	Thu Jul  9 18:08:06 2020	(r541767)
@@ -0,0 +1,26 @@
+JAX is Autograd and XLA, brought together for high-performance machine learning
+research.
+
+With its updated version of Autograd, JAX can automatically differentiate native
+Python and NumPy functions. It can differentiate through loops, branches,
+recursion, and closures, and it can take derivatives of derivatives of
+derivatives. It supports reverse-mode differentiation (a.k.a. backpropagation)
+via grad as well as forward-mode differentiation, and the two can be composed
+arbitrarily to any order.
+
+What's new is that JAX uses XLA to compile and run your NumPy programs on GPUs
+and TPUs. Compilation happens under the hood by default, with library calls
+getting just-in-time compiled and executed. But JAX also lets you just-in-time
+compile your own Python functions into XLA-optimized kernels using a
+one-function API, jit. Compilation and automatic differentiation can be composed
+arbitrarily, so you can express sophisticated algorithms and get maximal
+performance without leaving Python. You can even program multiple GPUs or TPU
+cores at once using pmap, and differentiate through the whole thing.
+
+Dig a little deeper, and you'll see that JAX is really an extensible system for
+composable function transformations. Both grad and jit are instances of such
+transformations. Others are vmap for automatic vectorization and pmap for
+single-program multiple-data (SPMD) parallel programming of multiple
+accelerators, with more to come.
+
+WWW: https://github.com/google/jax
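The vmap transformation mentioned in the description above can be sketched as follows. This is an illustrative example assuming a working JAX install; `square` is a made-up function, not part of the port itself.

```python
# Sketch of automatic vectorization with jax.vmap, assuming JAX is
# installed; `square` is an illustrative example function.
import jax
import jax.numpy as jnp

def square(x):
    # written for a single scalar input
    return x * x

# vmap maps square over the leading axis, with no Python-level loop
batched_square = jax.vmap(square)

xs = jnp.arange(4.0)        # [0., 1., 2., 3.]
print(batched_square(xs))   # [0., 1., 4., 9.]
```

pmap has the same mapping interface but splits the batch across multiple accelerator devices (SPMD), so on multi-GPU/TPU hardware the same pattern scales out.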