[#405966] DONE (try 2) llama.cpp.git=7819-alt1

Girar pender (vt) girar-builder at altlinux.org
Sun Jan 25 04:22:09 MSK 2026


https://git.altlinux.org/tasks/archive/done/_396/405966/logs/events.2.1.log
https://packages.altlinux.org/tasks/405966

subtask  name       aarch64  i586  x86_64
   #100  llama.cpp        -     -   10:50

2026-Jan-25 01:04:01 :: task #405966 for sisyphus resumed by vt:
#100 build 7819-alt1 from /people/vt/packages/llama.cpp.git fetched at 2026-Jan-24 02:11:58
2026-Jan-25 01:04:04 :: [aarch64] #100 llama.cpp.git 7819-alt1: build start
2026-Jan-25 01:04:04 :: [x86_64] #100 llama.cpp.git 7819-alt1: build start
2026-Jan-25 01:04:04 :: [i586] #100 llama.cpp.git 7819-alt1: build start
2026-Jan-25 01:04:16 :: [i586] #100 llama.cpp.git 7819-alt1: build SKIPPED
2026-Jan-25 01:04:23 :: [aarch64] #100 llama.cpp.git 7819-alt1: build OK (cached)
build/100/x86_64/log:[00:06:22] debuginfo.req: WARNING: /usr/lib64/libcublas.so.12 is not yet debuginfo-enabled
build/100/x86_64/log:[00:06:22] debuginfo.req: WARNING: /usr/lib64/libcudart.so.12 is not yet debuginfo-enabled
2026-Jan-25 01:14:54 :: [x86_64] #100 llama.cpp.git 7819-alt1: build OK
2026-Jan-25 01:15:03 :: 100: build check OK
2026-Jan-25 01:15:04 :: build check OK
2026-Jan-25 01:15:17 :: #100: llama.cpp.git 7819-alt1: version check OK
2026-Jan-25 01:15:17 :: build version check OK
--- llama.cpp-cpu-7819-alt1.x86_64.rpm.share	2026-01-25 01:15:20.823072220 +0000
+++ llama.cpp-cpu-7819-alt1.aarch64.rpm.share	2026-01-25 01:15:21.917082491 +0000
@@ -8,3 +8,3 @@
 /usr/share/doc/llama.cpp/README.md	100644	UTF-8 Unicode English text, with very long lines
-/usr/share/doc/llama.cpp/build-options.txt	100644	ASCII English text, with very long lines
+/usr/share/doc/llama.cpp/build-options.txt	100644	ASCII English text
 /usr/share/doc/llama.cpp/docs	40755	directory
warning (#100): non-identical /usr/share part
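The warning above comes from girar diffing the per-architecture `/usr/share` manifests: `file(1)` classifies the x86_64 copy of `build-options.txt` as having "very long lines" while the aarch64 copy does not, so the CPU-flavour doc files differ between arches. A rough sketch of that kind of line-length classification (the threshold here is an assumption for illustration, not `file(1)`'s actual constant):

```python
# Sketch of a file(1)-style "with very long lines" annotation.
# THRESHOLD is a made-up illustrative value; file(1)'s real cutoff may differ.
THRESHOLD = 300

def classify(text: str) -> str:
    base = "ASCII English text"
    if any(len(line) > THRESHOLD for line in text.splitlines()):
        return base + ", with very long lines"
    return base

print(classify("short line\n"))        # ASCII English text
print(classify("x" * 1000 + "\n"))     # ASCII English text, with very long lines
```

Since only the classification string differs (not file modes or paths), girar reports it as a warning rather than a failure.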
2026-Jan-25 01:15:35 :: noarch check OK
2026-Jan-25 01:15:37 :: plan: src +1 -1 =21505, aarch64 +8 -8 =38015, x86_64 +10 -10 =39032
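The `plan:` line summarizes the repository change per component: `+N` packages added, `-N` removed, `=N` resulting total (here, one source package replaced, eight binary packages replaced on aarch64 and ten on x86_64). A small hypothetical parser for that summary format, assuming the `arch +a -r =t` layout shown above:

```python
import re

# Hypothetical parser for girar "plan:" lines such as:
#   plan: src +1 -1 =21505, aarch64 +8 -8 =38015, x86_64 +10 -10 =39032
def parse_plan(line: str) -> dict:
    plan = {}
    for arch, added, removed, total in re.findall(
            r'(\S+) \+(\d+) -(\d+) =(\d+)', line):
        plan[arch] = {'added': int(added),
                      'removed': int(removed),
                      'total': int(total)}
    return plan

line = "plan: src +1 -1 =21505, aarch64 +8 -8 =38015, x86_64 +10 -10 =39032"
print(parse_plan(line)['x86_64'])  # {'added': 10, 'removed': 10, 'total': 39032}
```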
#100 llama.cpp 7388-alt1 -> 1:7819-alt1
 Sat Jan 24 2026 Vitaly Chikunov <vt at altlinux> 1:7819-alt1
 - Update to b7819 (2026-01-23).
 - Responses API support (partial).
2026-Jan-25 01:16:19 :: patched apt indices
2026-Jan-25 01:16:28 :: created next repo
2026-Jan-25 01:16:38 :: duplicate provides check OK
2026-Jan-25 01:17:17 :: dependencies check OK
2026-Jan-25 01:17:49 :: [x86_64 aarch64] ELF symbols check OK
2026-Jan-25 01:18:05 :: [x86_64] #100 libllama: install check OK
2026-Jan-25 01:18:09 :: [aarch64] #100 libllama: install check OK (cached)
2026-Jan-25 01:18:14 :: [x86_64] #100 libllama-debuginfo: install check OK
2026-Jan-25 01:18:18 :: [aarch64] #100 libllama-debuginfo: install check OK (cached)
	x86_64: libllama-devel=1:7819-alt1 post-install unowned files:
 /usr/lib64/cmake
2026-Jan-25 01:18:22 :: [x86_64] #100 libllama-devel: install check OK
	aarch64: libllama-devel=1:7819-alt1 post-install unowned files:
 /usr/lib64/cmake
2026-Jan-25 01:18:26 :: [aarch64] #100 libllama-devel: install check OK (cached)
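The "post-install unowned files" note above means libllama-devel installs content under `/usr/lib64/cmake` but no installed package owns that directory itself. A common remedy is to claim the directory in the spec's `%files` list; the entries below are a hypothetical sketch, not the package's actual spec:

	# Sketch only: exact paths depend on the real llama.cpp spec.
	%files -n libllama-devel
	%dir %_libdir/cmake
	%_libdir/cmake/llama/

Owning the directory ensures it is removed when the last package using it is uninstalled; girar treats the unowned directory as informational, so the install check still passes.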
2026-Jan-25 01:18:34 :: [aarch64] #100 llama.cpp: install check OK (cached)
2026-Jan-25 01:18:42 :: [aarch64] #100 llama.cpp-cpu: install check OK (cached)
2026-Jan-25 01:18:48 :: [x86_64] #100 llama.cpp: install check OK
2026-Jan-25 01:18:50 :: [aarch64] #100 llama.cpp-cpu-debuginfo: install check OK (cached)
2026-Jan-25 01:18:58 :: [x86_64] #100 llama.cpp-cpu: install check OK
2026-Jan-25 01:18:59 :: [aarch64] #100 llama.cpp-vulkan: install check OK (cached)
2026-Jan-25 01:19:07 :: [aarch64] #100 llama.cpp-vulkan-debuginfo: install check OK (cached)
2026-Jan-25 01:19:12 :: [x86_64] #100 llama.cpp-cpu-debuginfo: install check OK
2026-Jan-25 01:19:38 :: [x86_64] #100 llama.cpp-cuda: install check OK
2026-Jan-25 01:20:06 :: [x86_64] #100 llama.cpp-cuda-debuginfo: install check OK
2026-Jan-25 01:20:17 :: [x86_64] #100 llama.cpp-vulkan: install check OK
2026-Jan-25 01:20:30 :: [x86_64] #100 llama.cpp-vulkan-debuginfo: install check OK
2026-Jan-25 01:20:48 :: [x86_64-i586] generated apt indices
2026-Jan-25 01:20:48 :: [x86_64-i586] created next repo
2026-Jan-25 01:20:59 :: [x86_64-i586] dependencies check OK
2026-Jan-25 01:21:01 :: gears inheritance check OK
2026-Jan-25 01:21:01 :: srpm inheritance check OK
girar-check-perms: access to llama.cpp ALLOWED for vt: project leader
check-subtask-perms: #100: llama.cpp: allowed for vt
2026-Jan-25 01:21:02 :: acl check OK
2026-Jan-25 01:21:13 :: created contents_index files
2026-Jan-25 01:21:21 :: created hash files: aarch64 src x86_64
2026-Jan-25 01:21:24 :: task #405966 for sisyphus TESTED
2026-Jan-25 01:21:25 :: task is ready for commit
2026-Jan-25 01:21:30 :: repo clone OK
2026-Jan-25 01:21:30 :: packages update OK
2026-Jan-25 01:21:36 :: [x86_64 aarch64] update OK
2026-Jan-25 01:21:36 :: repo update OK
2026-Jan-25 01:21:46 :: repo save OK
2026-Jan-25 01:21:46 :: src index update OK
2026-Jan-25 01:21:49 :: updated /gears/l/llama.cpp.git branch `sisyphus'
2026-Jan-25 01:22:09 :: gears update OK
2026-Jan-25 01:22:09 :: task #405966 for sisyphus DONE