Date: Wed, 27 Feb 2019 22:11:15 +0000 (UTC)
From: Ruslan Makhmatkhanov <rm@FreeBSD.org>
To: ports-committers@freebsd.org, svn-ports-all@freebsd.org, svn-ports-head@freebsd.org
Subject: svn commit: r494091 - in head/math: . py-autograd
Message-ID: <201902272211.x1RMBFkZ060269@repo.freebsd.org>
Author: rm
Date: Wed Feb 27 22:11:15 2019
New Revision: 494091

URL: https://svnweb.freebsd.org/changeset/ports/494091

Log:
  Autograd can automatically differentiate native Python and Numpy code. It can
  handle a large subset of Python's features, including loops, ifs, recursion
  and closures, and it can even take derivatives of derivatives of derivatives.
  It supports reverse-mode differentiation (a.k.a. backpropagation), which means
  it can efficiently take gradients of scalar-valued functions with respect to
  array-valued arguments, as well as forward-mode differentiation, and the two
  can be composed arbitrarily. The main intended application of Autograd is
  gradient-based optimization.

  WWW: https://github.com/HIPS/autograd

Added:
  head/math/py-autograd/
  head/math/py-autograd/Makefile   (contents, props changed)
  head/math/py-autograd/distinfo   (contents, props changed)
  head/math/py-autograd/pkg-descr   (contents, props changed)
Modified:
  head/math/Makefile

Modified: head/math/Makefile
==============================================================================
--- head/math/Makefile	Wed Feb 27 22:09:42 2019	(r494090)
+++ head/math/Makefile	Wed Feb 27 22:11:15 2019	(r494091)
@@ -690,6 +690,7 @@ SUBDIR += py-algopy
 	SUBDIR += py-altgraph
 	SUBDIR += py-apgl
+	SUBDIR += py-autograd
 	SUBDIR += py-basemap
 	SUBDIR += py-basemap-data
 	SUBDIR += py-bayesian-optimization

Added: head/math/py-autograd/Makefile
==============================================================================
--- /dev/null	00:00:00 1970	(empty, because file is newly added)
+++ head/math/py-autograd/Makefile	Wed Feb 27 22:11:15 2019	(r494091)
@@ -0,0 +1,22 @@
+# $FreeBSD$
+
+PORTNAME=	autograd
+DISTVERSION=	1.2
+CATEGORIES=	math python
+MASTER_SITES=	CHEESESHOP
+PKGNAMEPREFIX=	${PYTHON_PKGNAMEPREFIX}
+
+MAINTAINER=	rm@FreeBSD.org
+COMMENT=	Efficiently computes derivatives of numpy code
+
+LICENSE=	MIT
+
+RUN_DEPENDS=	${PYNUMPY} \
+		${PYTHON_PKGNAMEPREFIX}future>=0.15.2:devel/py-future@${PY_FLAVOR}
+
+USES=		python
+USE_PYTHON=	autoplist distutils
+
+NO_ARCH=	yes
+
+.include <bsd.port.mk>

Added: head/math/py-autograd/distinfo
==============================================================================
--- /dev/null	00:00:00 1970	(empty, because file is newly added)
+++ head/math/py-autograd/distinfo	Wed Feb 27 22:11:15 2019	(r494091)
@@ -0,0 +1,3 @@
+TIMESTAMP = 1551302910
+SHA256 (autograd-1.2.tar.gz) = a08bfa6d539b7a56e7c9f4d0881044afbef5e75f324a394c2494de963ea4a47d
+SIZE (autograd-1.2.tar.gz) = 32540

Added: head/math/py-autograd/pkg-descr
==============================================================================
--- /dev/null	00:00:00 1970	(empty, because file is newly added)
+++ head/math/py-autograd/pkg-descr	Wed Feb 27 22:11:15 2019	(r494091)
@@ -0,0 +1,10 @@
+Autograd can automatically differentiate native Python and Numpy code. It can
+handle a large subset of Python's features, including loops, ifs, recursion and
+closures, and it can even take derivatives of derivatives of derivatives. It
+supports reverse-mode differentiation (a.k.a. backpropagation), which means it
+can efficiently take gradients of scalar-valued functions with respect to
+array-valued arguments, as well as forward-mode differentiation, and the two
+can be composed arbitrarily. The main intended application of Autograd is
+gradient-based optimization.
+
+WWW: https://github.com/HIPS/autograd
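
As an illustration of the reverse-mode differentiation described in the pkg-descr,
here is a minimal Python sketch using autograd's documented grad() entry point;
the tanh function and the printed value are examples for this note only, not part
of the port:

    # Minimal autograd sketch: differentiate a scalar-valued Python/Numpy function.
    import autograd.numpy as np   # autograd's thinly wrapped numpy
    from autograd import grad     # builds a gradient function via reverse mode

    def tanh(x):
        y = np.exp(-2.0 * x)
        return (1.0 - y) / (1.0 + y)

    grad_tanh = grad(tanh)        # d(tanh)/dx as a plain Python callable
    print(grad_tanh(1.0))         # ~0.41997, i.e. 1 - tanh(1)**2

Because grad() returns an ordinary function, it can be applied again, e.g.
grad(grad(tanh)), to take derivatives of derivatives as the description notes.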