Date: Mon, 6 Jan 2025 04:01:17 GMT
From: Yuri Victorovich <yuri@FreeBSD.org>
To: ports-committers@FreeBSD.org, dev-commits-ports-all@FreeBSD.org, dev-commits-ports-main@FreeBSD.org
Subject: git: 4e02f4adba52 - main - misc/llama-cpp: update 4409 → 4418
Message-ID: <202501060401.50641HVT030250@gitrepo.freebsd.org>
The branch main has been updated by yuri:

URL: https://cgit.FreeBSD.org/ports/commit/?id=4e02f4adba52c79983d9b5e378500571de5a5f2d

commit 4e02f4adba52c79983d9b5e378500571de5a5f2d
Author:     Yuri Victorovich <yuri@FreeBSD.org>
AuthorDate: 2025-01-06 01:11:21 +0000
Commit:     Yuri Victorovich <yuri@FreeBSD.org>
CommitDate: 2025-01-06 04:01:04 +0000

    misc/llama-cpp: update 4409 → 4418

    Reported by:	portscout
---
 misc/llama-cpp/Makefile                            |  2 +-
 misc/llama-cpp/distinfo                            |  6 +++---
 misc/llama-cpp/files/patch-ggml_src_CMakeLists.txt | 20 ++++++++++++++++++++
 misc/llama-cpp/pkg-message                         | 17 +++++++++++++++++
 4 files changed, 41 insertions(+), 4 deletions(-)

diff --git a/misc/llama-cpp/Makefile b/misc/llama-cpp/Makefile
index 1eeab118f76e..ed8f01264362 100644
--- a/misc/llama-cpp/Makefile
+++ b/misc/llama-cpp/Makefile
@@ -1,6 +1,6 @@
 PORTNAME=	llama-cpp
 DISTVERSIONPREFIX=	b
-DISTVERSION=	4409
+DISTVERSION=	4418
 CATEGORIES=	misc # machine-learning
 
 MAINTAINER=	yuri@FreeBSD.org
diff --git a/misc/llama-cpp/distinfo b/misc/llama-cpp/distinfo
index 1077df5ceb17..45282e0c4897 100644
--- a/misc/llama-cpp/distinfo
+++ b/misc/llama-cpp/distinfo
@@ -1,5 +1,5 @@
-TIMESTAMP = 1735962918
-SHA256 (ggerganov-llama.cpp-b4409_GH0.tar.gz) = 71a62315c73b1f00bdd5fdec7d16a77a0d6e1150171953fde230f04e7d5adfa1
-SIZE (ggerganov-llama.cpp-b4409_GH0.tar.gz) = 20608272
+TIMESTAMP = 1736116712
+SHA256 (ggerganov-llama.cpp-b4418_GH0.tar.gz) = b1d13215cbdb076ead3ed86043edc5c96999a204b95ad778fce8cd229d4d5427
+SIZE (ggerganov-llama.cpp-b4418_GH0.tar.gz) = 20608757
 SHA256 (nomic-ai-kompute-4565194_GH0.tar.gz) = 95b52d2f0514c5201c7838348a9c3c9e60902ea3c6c9aa862193a212150b2bfc
 SIZE (nomic-ai-kompute-4565194_GH0.tar.gz) = 13540496
diff --git a/misc/llama-cpp/files/patch-ggml_src_CMakeLists.txt b/misc/llama-cpp/files/patch-ggml_src_CMakeLists.txt
new file mode 100644
index 000000000000..8135824e1b8b
--- /dev/null
+++ b/misc/llama-cpp/files/patch-ggml_src_CMakeLists.txt
@@ -0,0 +1,20 @@
+- workaround for https://github.com/ggerganov/llama.cpp/issues/11095
+
+--- ggml/src/CMakeLists.txt.orig	2025-01-06 00:37:35 UTC
++++ ggml/src/CMakeLists.txt
+@@ -152,15 +152,6 @@ endif()
+ # posix_memalign came in POSIX.1-2001 / SUSv3
+ # M_PI is an XSI extension since POSIX.1-2001 / SUSv3, came in XPG1 (1985)
+ 
+-# Somehow in OpenBSD whenever POSIX conformance is specified
+-# some string functions rely on locale_t availability,
+-# which was introduced in POSIX.1-2008, forcing us to go higher
+-if (CMAKE_SYSTEM_NAME MATCHES "OpenBSD")
+-    add_compile_definitions(_XOPEN_SOURCE=700)
+-else()
+-    add_compile_definitions(_XOPEN_SOURCE=600)
+-endif()
+-
+ # Data types, macros and functions related to controlling CPU affinity and
+ # some memory allocation are available on Linux through GNU extensions in libc
+ if (CMAKE_SYSTEM_NAME MATCHES "Linux" OR CMAKE_SYSTEM_NAME MATCHES "Android")
diff --git a/misc/llama-cpp/pkg-message b/misc/llama-cpp/pkg-message
new file mode 100644
index 000000000000..071e82665d9a
--- /dev/null
+++ b/misc/llama-cpp/pkg-message
@@ -0,0 +1,17 @@
+[
+{ type: install
+  message: <<EOM
+You installed LLaMA-cpp: Facebook's LLaMA model runner.
+
+In order to experience LLaMA-cpp please download some
+AI model in the GGUF format, for example from huggingface.com,
+run the script below, and open localhost:9011 in your browser
+to communicate with this AI model.
+
+$ llama-server -m $MODEL \
+    --host 0.0.0.0 \
+    --port 9011
+
+EOM
+}
+]
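For readers wanting to try the workflow the new pkg-message describes, a minimal session might look like the sketch below. The model URL and file name are purely illustrative placeholders (any GGUF-format model will do), and fetch(1) is simply the stock FreeBSD downloader; only the llama-server flags come from the pkg-message itself.

  # hypothetical example: download some GGUF model and serve it on port 9011
  $ fetch -o model.gguf https://example.org/some-model.Q4_K_M.gguf
  $ llama-server -m ./model.gguf \
        --host 0.0.0.0 \
        --port 9011
  # then open http://localhost:9011 in a browser to chat with the model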