Date:      Sat, 28 Feb 2026 16:57:37 +0000
From:      Yuri Victorovich <yuri@FreeBSD.org>
To:        ports-committers@FreeBSD.org, dev-commits-ports-all@FreeBSD.org, dev-commits-ports-main@FreeBSD.org
Subject:   git: 15a257d5f8b3 - main - misc/llama-cpp: update 8132 → 8182
Message-ID:  <69a31e81.3b963.5acfdac1@gitrepo.freebsd.org>


The branch main has been updated by yuri:

URL: https://cgit.FreeBSD.org/ports/commit/?id=15a257d5f8b3fd93cf23cf61361253024a651413

commit 15a257d5f8b3fd93cf23cf61361253024a651413
Author:     Yuri Victorovich <yuri@FreeBSD.org>
AuthorDate: 2026-02-28 16:54:53 +0000
Commit:     Yuri Victorovich <yuri@FreeBSD.org>
CommitDate: 2026-02-28 16:57:34 +0000

    misc/llama-cpp: update 8132 → 8182
    
    It now uses the ggml package as a dependency instead of the bundled one.
    
    PR:             293496 (for ggml part)
    Reported by:    mord0d@firemail.cc
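    
    With LLAMA_USE_SYSTEM_GGML enabled and the ggml headers/libraries removed
    from this port's pkg-plist, one quick way to confirm the switch after
    upgrading is to check that llama-cpp records misc/ggml as a run-time
    dependency and that libllama links against the shared libggml-base. A
    sketch, assuming the package is installed and the default LOCALBASE of
    /usr/local (paths may differ on your system):
    
    ```shell
    # List registered dependencies of the installed llama-cpp package;
    # ggml should now appear here.
    pkg info -d llama-cpp | grep ggml
    
    # Confirm libllama resolves libggml-base.so from the system ggml
    # package rather than a bundled copy.
    ldd /usr/local/lib/libllama.so | grep libggml-base
    ```
    
    If the second grep prints a line resolving to /usr/local/lib/libggml-base.so.*,
    the port is using the system ggml.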
---
 misc/llama-cpp/Makefile  |  8 +++++---
 misc/llama-cpp/distinfo  |  6 +++---
 misc/llama-cpp/pkg-plist | 35 ++---------------------------------
 3 files changed, 10 insertions(+), 39 deletions(-)

diff --git a/misc/llama-cpp/Makefile b/misc/llama-cpp/Makefile
index c2612c1e54b8..bd914adab036 100644
--- a/misc/llama-cpp/Makefile
+++ b/misc/llama-cpp/Makefile
@@ -1,6 +1,6 @@
 PORTNAME=	llama-cpp
 DISTVERSIONPREFIX=	b
-DISTVERSION=	8132
+DISTVERSION=	8182
 CATEGORIES=	misc # machine-learning
 
 MAINTAINER=	yuri@FreeBSD.org
@@ -13,6 +13,8 @@ LICENSE_FILE=	${WRKSRC}/LICENSE
 BROKEN_armv7=	clang crashes, see https://bugs.freebsd.org/bugzilla/show_bug.cgi?id=278810
 BROKEN_i386=	compilation fails, see https://github.com/ggerganov/llama.cpp/issues/9545
 
+LIB_DEPENDS=	libggml-base.so:misc/ggml
+
 USES=		cmake:testing compiler:c++11-lang python:run shebangfix
 USE_LDCONFIG=	yes
 
@@ -23,8 +25,8 @@ GH_TUPLE=	nomic-ai:kompute:4565194:kompute/kompute
 
 SHEBANG_GLOB=	*.py
 
-CMAKE_ON=	BUILD_SHARED_LIBS
-		#LLAMA_USE_SYSTEM_GGML
+CMAKE_ON=	BUILD_SHARED_LIBS \
+		LLAMA_USE_SYSTEM_GGML
 CMAKE_OFF=	GGML_NATIVE \
 		FREEBSD_ALLOW_ADVANCED_CPU_FEATURES \
 		LLAMA_BUILD_TESTS
diff --git a/misc/llama-cpp/distinfo b/misc/llama-cpp/distinfo
index d75ba99c1802..e88a96bf93ca 100644
--- a/misc/llama-cpp/distinfo
+++ b/misc/llama-cpp/distinfo
@@ -1,5 +1,5 @@
-TIMESTAMP = 1771831071
-SHA256 (ggerganov-llama.cpp-b8132_GH0.tar.gz) = 346e3ec5c8146947d4cdda6bf340384006c590b0c4f8a3c89f265dd076351f57
-SIZE (ggerganov-llama.cpp-b8132_GH0.tar.gz) = 29059325
+TIMESTAMP = 1772296554
+SHA256 (ggerganov-llama.cpp-b8182_GH0.tar.gz) = 3bfffdc4fde15b90578010714ca0721c33f13c544cf2d8dc170c71a139b60777
+SIZE (ggerganov-llama.cpp-b8182_GH0.tar.gz) = 29080587
 SHA256 (nomic-ai-kompute-4565194_GH0.tar.gz) = 95b52d2f0514c5201c7838348a9c3c9e60902ea3c6c9aa862193a212150b2bfc
 SIZE (nomic-ai-kompute-4565194_GH0.tar.gz) = 13540496
diff --git a/misc/llama-cpp/pkg-plist b/misc/llama-cpp/pkg-plist
index 4a4098d35727..75c675c1a959 100644
--- a/misc/llama-cpp/pkg-plist
+++ b/misc/llama-cpp/pkg-plist
@@ -38,47 +38,16 @@ bin/convert_hf_to_gguf.py
 %%EXAMPLES%%bin/llama-speculative-simple
 %%EXAMPLES%%bin/llama-tokenize
 %%EXAMPLES%%bin/llama-tts
-include/ggml-alloc.h
-include/ggml-backend.h
-include/ggml-blas.h
-include/ggml-cann.h
-include/ggml-cpp.h
-include/ggml-cpu.h
-include/ggml-cuda.h
-include/ggml-metal.h
-include/ggml-opt.h
-include/ggml-rpc.h
-include/ggml-sycl.h
-include/ggml-virtgpu.h
-include/ggml-vulkan.h
-include/ggml-webgpu.h
-include/ggml-zendnn.h
-include/ggml.h
-include/gguf.h
 include/llama-cpp.h
 include/llama.h
 include/mtmd-helper.h
 include/mtmd.h
-lib/cmake/ggml/ggml-config.cmake
-lib/cmake/ggml/ggml-version.cmake
 lib/cmake/llama/llama-config.cmake
 lib/cmake/llama/llama-version.cmake
-lib/libggml-base.so
-lib/libggml-base.so.0
-lib/libggml-base.so.0.9.7
-lib/libggml-cpu.so
-lib/libggml-cpu.so.0
-lib/libggml-cpu.so.0.9.7
-lib/libggml-vulkan.so
-lib/libggml-vulkan.so.0
-lib/libggml-vulkan.so.0.9.7
-lib/libggml.so
-lib/libggml.so.0
-lib/libggml.so.0.9.7
 lib/libllama.so
 lib/libllama.so.0
-lib/libllama.so.0.0.8132
+lib/libllama.so.0.0.8182
 lib/libmtmd.so
 lib/libmtmd.so.0
-lib/libmtmd.so.0.0.8132
+lib/libmtmd.so.0.0.8182
 libdata/pkgconfig/llama.pc

