Date: Sat, 03 Jan 2026 10:27:40 +0000
From: Yuri Victorovich <yuri@FreeBSD.org>
To: ports-committers@FreeBSD.org, dev-commits-ports-all@FreeBSD.org, dev-commits-ports-main@FreeBSD.org
Subject: git: 614ed6178989 - main - misc/llama-cpp: update 7134 → 7617
Message-ID: <6958ef1c.23b02.5d2181ec@gitrepo.freebsd.org>
The branch main has been updated by yuri:

URL: https://cgit.FreeBSD.org/ports/commit/?id=614ed6178989239704432400614e79f473d919bc

commit 614ed6178989239704432400614e79f473d919bc
Author:     Yuri Victorovich <yuri@FreeBSD.org>
AuthorDate: 2026-01-03 07:52:04 +0000
Commit:     Yuri Victorovich <yuri@FreeBSD.org>
CommitDate: 2026-01-03 10:27:29 +0000

    misc/llama-cpp: update 7134 → 7617
---
 misc/llama-cpp/Makefile  |  2 +-
 misc/llama-cpp/distinfo  |  6 +++---
 misc/llama-cpp/pkg-plist | 16 ++++++++++------
 3 files changed, 14 insertions(+), 10 deletions(-)

diff --git a/misc/llama-cpp/Makefile b/misc/llama-cpp/Makefile
index 7b6618d8bf25..f5f4847a6303 100644
--- a/misc/llama-cpp/Makefile
+++ b/misc/llama-cpp/Makefile
@@ -1,6 +1,6 @@
 PORTNAME=	llama-cpp
 DISTVERSIONPREFIX=	b
-DISTVERSION=	7134
+DISTVERSION=	7617
 CATEGORIES=	misc # machine-learning
 
 MAINTAINER=	yuri@FreeBSD.org

diff --git a/misc/llama-cpp/distinfo b/misc/llama-cpp/distinfo
index bdfc515e9117..cedc4c2502f1 100644
--- a/misc/llama-cpp/distinfo
+++ b/misc/llama-cpp/distinfo
@@ -1,5 +1,5 @@
-TIMESTAMP = 1763919085
-SHA256 (ggerganov-llama.cpp-b7134_GH0.tar.gz) = a5e97acb5dfe4a47fb161ca9183c394bd4aeaa44a95ba4c7b184ecc70496723b
-SIZE (ggerganov-llama.cpp-b7134_GH0.tar.gz) = 27279622
+TIMESTAMP = 1767423354
+SHA256 (ggerganov-llama.cpp-b7617_GH0.tar.gz) = dac116426aea2d338f8a278f73e870231f02e40fc5cfc867abe73440e877256c
+SIZE (ggerganov-llama.cpp-b7617_GH0.tar.gz) = 28627696
 SHA256 (nomic-ai-kompute-4565194_GH0.tar.gz) = 95b52d2f0514c5201c7838348a9c3c9e60902ea3c6c9aa862193a212150b2bfc
 SIZE (nomic-ai-kompute-4565194_GH0.tar.gz) = 13540496

diff --git a/misc/llama-cpp/pkg-plist b/misc/llama-cpp/pkg-plist
index 9fe87d206de3..4abe13694c3c 100644
--- a/misc/llama-cpp/pkg-plist
+++ b/misc/llama-cpp/pkg-plist
@@ -3,6 +3,7 @@ bin/convert_hf_to_gguf.py
 %%EXAMPLES%%bin/llama-batched-bench
 %%EXAMPLES%%bin/llama-bench
 %%EXAMPLES%%bin/llama-cli
+%%EXAMPLES%%bin/llama-completion
 %%EXAMPLES%%bin/llama-convert-llama2c-to-ggml
 %%EXAMPLES%%bin/llama-cvector-generator
 %%EXAMPLES%%bin/llama-diffusion-cli
@@ -10,10 +11,12 @@ bin/convert_hf_to_gguf.py
 %%EXAMPLES%%bin/llama-eval-callback
 %%EXAMPLES%%bin/llama-export-lora
 %%EXAMPLES%%bin/llama-finetune
+%%EXAMPLES%%bin/llama-fit-params
 %%EXAMPLES%%bin/llama-gen-docs
 %%EXAMPLES%%bin/llama-gguf
 %%EXAMPLES%%bin/llama-gguf-hash
 %%EXAMPLES%%bin/llama-gguf-split
+%%EXAMPLES%%bin/llama-idle
 %%EXAMPLES%%bin/llama-imatrix
 %%EXAMPLES%%bin/llama-logits
 %%EXAMPLES%%bin/llama-lookahead
@@ -49,6 +52,7 @@ include/ggml-rpc.h
 include/ggml-sycl.h
 include/ggml-vulkan.h
 include/ggml-webgpu.h
+include/ggml-zendnn.h
 include/ggml.h
 include/gguf.h
 include/llama-cpp.h
@@ -61,20 +65,20 @@ lib/cmake/llama/llama-config.cmake
 lib/cmake/llama/llama-version.cmake
 lib/libggml-base.so
 lib/libggml-base.so.0
-lib/libggml-base.so.0.9.4-dirty
+lib/libggml-base.so.0.9.5
 lib/libggml-cpu.so
 lib/libggml-cpu.so.0
-lib/libggml-cpu.so.0.9.4-dirty
+lib/libggml-cpu.so.0.9.5
 %%VULKAN%%lib/libggml-vulkan.so
 %%VULKAN%%lib/libggml-vulkan.so.0
-%%VULKAN%%lib/libggml-vulkan.so.0.9.4-dirty
+%%VULKAN%%lib/libggml-vulkan.so.0.9.5
 lib/libggml.so
 lib/libggml.so.0
-lib/libggml.so.0.9.4-dirty
+lib/libggml.so.0.9.5
 lib/libllama.so
 lib/libllama.so.0
-lib/libllama.so.0.0.7134
+lib/libllama.so.0.0.7617
 lib/libmtmd.so
 lib/libmtmd.so.0
-lib/libmtmd.so.0.0.7134
+lib/libmtmd.so.0.0.7617
 libdata/pkgconfig/llama.pc
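[Editor's note, not part of the commit: the distinfo hunk above records the SHA-256 checksum and byte size of the new b7617 distfile. In a ports tree these lines are regenerated with `make makesum` in the port directory; the sketch below only illustrates how the two values are derived, using a hypothetical stand-in file rather than the real llama.cpp tarball.]

```shell
# Sketch of how a distinfo SHA256/SIZE pair is computed.
# The stand-in file and the "example_GH0.tar.gz" label are hypothetical;
# maintainers normally just run `make makesum` in the port directory.
distfile=$(mktemp)                 # stand-in for the real distfile
printf 'hello\n' > "$distfile"
# FreeBSD ships sha256(1); fall back to GNU coreutils' sha256sum elsewhere.
sum=$(sha256 -q "$distfile" 2>/dev/null || sha256sum "$distfile" | cut -d' ' -f1)
size=$(wc -c < "$distfile" | tr -d ' ')
printf 'SHA256 (example_GH0.tar.gz) = %s\n' "$sum"
printf 'SIZE (example_GH0.tar.gz) = %s\n' "$size"
rm -f "$distfile"
```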
