Date: Thu, 21 Mar 2019 19:40:06 +0000 (UTC)
From: Sunpoet Po-Chuan Hsieh <sunpoet@FreeBSD.org>
To: ports-committers@freebsd.org, svn-ports-all@freebsd.org, svn-ports-head@freebsd.org
Subject: svn commit: r496485 - in head/math: . py-nevergrad
Message-ID: <201903211940.x2LJe6F6055609@repo.freebsd.org>
Author: sunpoet
Date: Thu Mar 21 19:40:05 2019
New Revision: 496485

URL: https://svnweb.freebsd.org/changeset/ports/496485

Log:
  Add py-nevergrad 0.1.6

  Nevergrad is a gradient-free optimization platform.

  The goals of this package are to provide:
  - gradient/derivative-free optimization algorithms, including algorithms
    able to handle noise.
  - tools to instrument any code, making it painless to optimize your
    parameters/hyperparameters, whether they are continuous, discrete or a
    mixture of continuous and discrete variables.
  - functions on which to test the optimization algorithms.
  - benchmark routines in order to compare algorithms easily.

  WWW: https://github.com/facebookresearch/nevergrad

Added:
  head/math/py-nevergrad/
  head/math/py-nevergrad/Makefile   (contents, props changed)
  head/math/py-nevergrad/distinfo   (contents, props changed)
  head/math/py-nevergrad/pkg-descr   (contents, props changed)
Modified:
  head/math/Makefile

Modified: head/math/Makefile
==============================================================================
--- head/math/Makefile	Thu Mar 21 19:18:16 2019	(r496484)
+++ head/math/Makefile	Thu Mar 21 19:40:05 2019	(r496485)
@@ -739,6 +739,7 @@ SUBDIR += py-munkres10
 	SUBDIR += py-networkx
 	SUBDIR += py-networkx1
+	SUBDIR += py-nevergrad
 	SUBDIR += py-numarray
 	SUBDIR += py-numeric
 	SUBDIR += py-numexpr

Added: head/math/py-nevergrad/Makefile
==============================================================================
--- /dev/null	00:00:00 1970	(empty, because file is newly added)
+++ head/math/py-nevergrad/Makefile	Thu Mar 21 19:40:05 2019	(r496485)
@@ -0,0 +1,36 @@
+# Created by: Po-Chuan Hsieh <sunpoet@FreeBSD.org>
+# $FreeBSD$
+
+PORTNAME=	nevergrad
+PORTVERSION=	0.1.6
+CATEGORIES=	math python
+MASTER_SITES=	CHEESESHOP
+PKGNAMEPREFIX=	${PYTHON_PKGNAMEPREFIX}
+
+MAINTAINER=	sunpoet@FreeBSD.org
+COMMENT=	Python toolbox for performing gradient-free optimization
+
+LICENSE=	MIT
+LICENSE_FILE=	${WRKSRC}/LICENSE
+
+RUN_DEPENDS=	${PYTHON_PKGNAMEPREFIX}bayesian-optimization>=0.6.0:math/py-bayesian-optimization@${PY_FLAVOR} \
+		${PYTHON_PKGNAMEPREFIX}cma>=2.6.0:math/py-cma@${PY_FLAVOR} \
+		${PYTHON_PKGNAMEPREFIX}matplotlib>=2.2.3:math/py-matplotlib@${PY_FLAVOR} \
+		${PYTHON_PKGNAMEPREFIX}numpy>=1.15.0:math/py-numpy@${PY_FLAVOR} \
+		${PYTHON_PKGNAMEPREFIX}pandas>=0.23.4:math/py-pandas@${PY_FLAVOR} \
+		${PYTHON_PKGNAMEPREFIX}requests>=2.21.0:www/py-requests@${PY_FLAVOR} \
+		${PYTHON_PKGNAMEPREFIX}xlrd>=1.2.0:textproc/py-xlrd@${PY_FLAVOR} \
+		${PYTHON_PKGNAMEPREFIX}xlwt>=1.3.0:textproc/py-xlwt@${PY_FLAVOR}
+
+USES=		python:3.6+
+USE_PYTHON=	autoplist distutils
+
+NO_ARCH=	yes
+
+.include <bsd.port.pre.mk>
+
+.if ${PYTHON_REL} < 3700
+RUN_DEPENDS+=	${PYTHON_PKGNAMEPREFIX}typing-extensions>=3.6.6:devel/py-typing-extensions@${PY_FLAVOR}
+.endif
+
+.include <bsd.port.post.mk>

Added: head/math/py-nevergrad/distinfo
==============================================================================
--- /dev/null	00:00:00 1970	(empty, because file is newly added)
+++ head/math/py-nevergrad/distinfo	Thu Mar 21 19:40:05 2019	(r496485)
@@ -0,0 +1,3 @@
+TIMESTAMP = 1553188278
+SHA256 (nevergrad-0.1.6.tar.gz) = ffc900006535aa5590ed04f359d4a141e59ce3ec4245ef3740f9057daf8abb45
+SIZE (nevergrad-0.1.6.tar.gz) = 103150

Added: head/math/py-nevergrad/pkg-descr
==============================================================================
--- /dev/null	00:00:00 1970	(empty, because file is newly added)
+++ head/math/py-nevergrad/pkg-descr	Thu Mar 21 19:40:05 2019	(r496485)
@@ -0,0 +1,12 @@
+Nevergrad is a gradient-free optimization platform.
+
+The goals of this package are to provide:
+- gradient/derivative-free optimization algorithms, including algorithms able to
+  handle noise.
+- tools to instrument any code, making it painless to optimize your
+  parameters/hyperparameters, whether they are continuous, discrete or a mixture
+  of continuous and discrete variables.
+- functions on which to test the optimization algorithms.
+- benchmark routines in order to compare algorithms easily.
+
+WWW: https://github.com/facebookresearch/nevergrad
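
For anyone wanting to smoke-test the installed package, here is a minimal
sketch against the nevergrad 0.1.x optimizer API as documented in the
upstream README of that era; the module layout and method names changed in
later nevergrad releases, so treat the exact import paths as an assumption
for this version:

    # smoke_test.py -- minimal gradient-free optimization run
    # NOTE: nevergrad 0.1.x API (as shipped by this port); later
    # releases restructured the package around ng.optimizers.
    from nevergrad.optimization import optimizerlib

    # Toy objective: squared distance of each coordinate from 0.5.
    def square(x):
        return sum((x - 0.5) ** 2)

    # OnePlusOne is one of the registered derivative-free optimizers;
    # budget caps the number of objective evaluations.
    optimizer = optimizerlib.OnePlusOne(dimension=2, budget=100)
    recommendation = optimizer.optimize(square)
    print(recommendation)  # best point found (a numpy array)

Running it with the port's python3 interpreter should print a point close
to (0.5, 0.5) and exercises the numpy and cma run dependencies pulled in
by the Makefile above.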