Date: Sat, 11 Jan 2025 09:30:53 GMT
From: Yuri Victorovich <yuri@FreeBSD.org>
To: ports-committers@FreeBSD.org, dev-commits-ports-all@FreeBSD.org, dev-commits-ports-main@FreeBSD.org
Subject: git: f5ce4ca4dd2c - main - misc/llama-cpp: update 4419 → 4458
Message-ID: <202501110930.50B9UrJS026661@gitrepo.freebsd.org>
The branch main has been updated by yuri:

URL: https://cgit.FreeBSD.org/ports/commit/?id=f5ce4ca4dd2cac5bf4af4203ed771d279c234871

commit f5ce4ca4dd2cac5bf4af4203ed771d279c234871
Author:     Yuri Victorovich <yuri@FreeBSD.org>
AuthorDate: 2025-01-11 06:33:49 +0000
Commit:     Yuri Victorovich <yuri@FreeBSD.org>
CommitDate: 2025-01-11 09:30:41 +0000

    misc/llama-cpp: update 4419 → 4458

    Reported by:	portscout
---
 misc/llama-cpp/Makefile  | 4 ++--
 misc/llama-cpp/distinfo  | 6 +++---
 misc/llama-cpp/pkg-plist | 3 ++-
 3 files changed, 7 insertions(+), 6 deletions(-)

diff --git a/misc/llama-cpp/Makefile b/misc/llama-cpp/Makefile
index 55dfc1bcb378..c2555f13efcf 100644
--- a/misc/llama-cpp/Makefile
+++ b/misc/llama-cpp/Makefile
@@ -1,6 +1,6 @@
 PORTNAME=	llama-cpp
 DISTVERSIONPREFIX=	b
-DISTVERSION=	4419
+DISTVERSION=	4458
 CATEGORIES=	misc # machine-learning
 
 MAINTAINER=	yuri@FreeBSD.org
@@ -50,6 +50,6 @@ do-test-ci: # build of tests fails, see https://github.com/ggerganov/llama.cpp/i
 	@cd ${WRKSRC} && \
 		${SETENV} ${MAKE_ENV} bash ci/run.sh ./tmp/results ./tmp/mnt
 
-# tests as of 4404: 97% tests passed, 1 tests failed out of 31, see https://github.com/ggerganov/llama.cpp/issues/11036
+# tests as of 4458: 97% tests passed, 1 tests failed out of 31, see https://github.com/ggerganov/llama.cpp/issues/11036
 
 .include <bsd.port.mk>
diff --git a/misc/llama-cpp/distinfo b/misc/llama-cpp/distinfo
index 7a00a1575eec..12febbecef6c 100644
--- a/misc/llama-cpp/distinfo
+++ b/misc/llama-cpp/distinfo
@@ -1,5 +1,5 @@
-TIMESTAMP = 1736139395
-SHA256 (ggerganov-llama.cpp-b4419_GH0.tar.gz) = 2071344152c0e685a7e14a9868e977c3ca0f28951621db7e6d1d9ba8d4bf198d
-SIZE (ggerganov-llama.cpp-b4419_GH0.tar.gz) = 20608062
+TIMESTAMP = 1736573693
+SHA256 (ggerganov-llama.cpp-b4458_GH0.tar.gz) = 62e698f1a34e851d448db0d85123bf6f1c3f6eec5a25e29ddc8a4a71aeda5230
+SIZE (ggerganov-llama.cpp-b4458_GH0.tar.gz) = 20428216
 SHA256 (nomic-ai-kompute-4565194_GH0.tar.gz) = 95b52d2f0514c5201c7838348a9c3c9e60902ea3c6c9aa862193a212150b2bfc
 SIZE (nomic-ai-kompute-4565194_GH0.tar.gz) = 13540496
diff --git a/misc/llama-cpp/pkg-plist b/misc/llama-cpp/pkg-plist
index 4becea008f43..134491812ff6 100644
--- a/misc/llama-cpp/pkg-plist
+++ b/misc/llama-cpp/pkg-plist
@@ -8,8 +8,8 @@ bin/convert_hf_to_gguf.py
 %%EXAMPLES%%bin/llama-embedding
 %%EXAMPLES%%bin/llama-eval-callback
 %%EXAMPLES%%bin/llama-export-lora
-%%EXAMPLES%%bin/llama-gen-docs
 %%EXAMPLES%%bin/llama-gbnf-validator
+%%EXAMPLES%%bin/llama-gen-docs
 %%EXAMPLES%%bin/llama-gguf
 %%EXAMPLES%%bin/llama-gguf-hash
 %%EXAMPLES%%bin/llama-gguf-split
@@ -53,6 +53,7 @@ include/ggml-rpc.h
 include/ggml-sycl.h
 include/ggml-vulkan.h
 include/ggml.h
+include/gguf.h
 include/llama-cpp.h
 include/llama.h
 lib/cmake/llama/llama-config.cmake
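[Editor's note: a minimal sketch of the ports workflow that typically produces an update commit like this one. The commit itself does not record the commands used; the targets below are the standard ports(7) ones, and the path is illustrative.]

    cd /usr/ports/misc/llama-cpp
    # after bumping DISTVERSION in the Makefile:
    make makesum        # fetch the new distfile and regenerate distinfo (TIMESTAMP, SHA256, SIZE)
    make stage          # build and install into the staging area
    make check-plist    # compare staged files against pkg-plist; surfaces new files such as include/gguf.h
    make stage-qa       # basic packaging QA checks before commit

The pkg-plist hunks above follow from such a check: include/gguf.h is newly installed in b4458, and the llama-gbnf-validator/llama-gen-docs lines appear to be reordered simply to keep the list alphabetical.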