[#408089] DONE (try 2) llama.cpp.git=8018-alt1
Girar pender (vt)
girar-builder at altlinux.org
Sat Feb 14 01:05:26 MSK 2026
https://git.altlinux.org/tasks/archive/done/_398/408089/logs/events.2.1.log
https://packages.altlinux.org/tasks/408089
2026-Feb-13 21:59:25 :: task #408089 for sisyphus resumed by vt:
#100 build 8018-alt1 from /people/vt/packages/llama.cpp.git fetched at 2026-Feb-13 05:16:13
2026-Feb-13 21:59:26 :: [x86_64] #100 llama.cpp.git 8018-alt1: build start
2026-Feb-13 21:59:26 :: [aarch64] #100 llama.cpp.git 8018-alt1: build start
2026-Feb-13 21:59:26 :: [i586] #100 llama.cpp.git 8018-alt1: build start
2026-Feb-13 21:59:33 :: [i586] #100 llama.cpp.git 8018-alt1: build SKIPPED
2026-Feb-13 21:59:41 :: [aarch64] #100 llama.cpp.git 8018-alt1: build OK (cached)
build/100/x86_64/log:[00:02:13] debuginfo.req: WARNING: /usr/lib64/libcublas.so.12 is not yet debuginfo-enabled
build/100/x86_64/log:[00:02:13] debuginfo.req: WARNING: /usr/lib64/libcudart.so.12 is not yet debuginfo-enabled
2026-Feb-13 21:59:42 :: [x86_64] #100 llama.cpp.git 8018-alt1: build OK (cached)
2026-Feb-13 21:59:43 :: 100: build check OK (cached)
2026-Feb-13 21:59:44 :: build check OK
2026-Feb-13 21:59:57 :: #100: llama.cpp.git 8018-alt1: version check OK
2026-Feb-13 21:59:57 :: build version check OK
--- llama.cpp-cpu-8018-alt1.x86_64.rpm.share 2026-02-13 22:00:00.393958199 +0000
+++ llama.cpp-cpu-8018-alt1.aarch64.rpm.share 2026-02-13 22:00:01.454966900 +0000
@@ -8,3 +8,3 @@
/usr/share/doc/llama.cpp/README.md 100644 UTF-8 Unicode English text, with very long lines
-/usr/share/doc/llama.cpp/build-options.txt 100644 ASCII English text, with very long lines
+/usr/share/doc/llama.cpp/build-options.txt 100644 ASCII English text
/usr/share/doc/llama.cpp/docs 40755 directory
warning (#100): non-identical /usr/share part
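The only payload difference flagged above between the x86_64 and aarch64 llama.cpp-cpu packages is file(1)'s classification of build-options.txt: the x86_64 copy is tagged "with very long lines" and the aarch64 copy is not, meaning the generated file's longest line happens to fall on opposite sides of file(1)'s cutoff. A minimal sketch of that distinction, assuming file(1)'s historical long-line threshold of roughly 300 characters (the exact cutoff and the label-building are assumptions, not file(1) itself):

```python
# Hypothetical re-creation of file(1)'s "very long lines" tag,
# assuming a ~300-character threshold (an assumption).

def longest_line(text: str) -> int:
    """Return the length of the longest line in `text` (0 if empty)."""
    return max((len(line) for line in text.splitlines()), default=0)

def file_style_label(text: str, threshold: int = 300) -> str:
    """Build a file(1)-like label for plain ASCII text."""
    label = "ASCII English text"
    if longest_line(text) > threshold:
        label += ", with very long lines"
    return label
```

On this model, a build-options.txt whose longest line shrinks below the threshold on aarch64 (e.g. different compiler-flag lists per arch) would flip the label and trigger the non-identical /usr/share warning even though the content is semantically equivalent.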
2026-Feb-13 22:00:16 :: noarch check OK
2026-Feb-13 22:00:18 :: plan: src +1 -1 =21591, aarch64 +8 -8 =38227, x86_64 +10 -10 =39241
#100 llama.cpp 7819-alt1 -> 1:8018-alt1
Fri Feb 13 2026 Vitaly Chikunov <vt at altlinux> 1:8018-alt1
- Update to b8018 (2026-02-12).
2026-Feb-13 22:01:05 :: patched apt indices
2026-Feb-13 22:01:15 :: created next repo
2026-Feb-13 22:01:27 :: duplicate provides check OK
2026-Feb-13 22:02:09 :: dependencies check OK
2026-Feb-13 22:02:46 :: [x86_64 aarch64] ELF symbols check OK
2026-Feb-13 22:02:55 :: [x86_64] #100 libllama: install check OK (cached)
2026-Feb-13 22:02:57 :: [x86_64] #100 libllama-debuginfo: install check OK (cached)
x86_64: libllama-devel=1:8018-alt1 post-install unowned files:
/usr/lib64/cmake
2026-Feb-13 22:03:00 :: [x86_64] #100 libllama-devel: install check OK (cached)
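The "post-install unowned files" notice above means that after installing libllama-devel, the /usr/lib64/cmake directory belongs to no installed package, so it would be left behind on erase. The usual remedy is to own the directory in the spec's %files list; a hypothetical fragment sketching that fix (the subpackage name and paths below are assumptions about the spec's layout):

```
%files -n libllama-devel
# Own the parent directory so it is tracked and removed with the
# package instead of being left unowned (hypothetical paths).
%dir %{_libdir}/cmake
%{_libdir}/cmake/llama/
```

Alternatively, the package could require a filesystem-layout package that already owns %{_libdir}/cmake, if one exists in the repository.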
2026-Feb-13 22:03:01 :: [aarch64] #100 libllama: install check OK (cached)
2026-Feb-13 22:03:04 :: [x86_64] #100 llama.cpp: install check OK (cached)
2026-Feb-13 22:03:07 :: [x86_64] #100 llama.cpp-cpu: install check OK (cached)
2026-Feb-13 22:03:07 :: [aarch64] #100 libllama-debuginfo: install check OK (cached)
2026-Feb-13 22:03:10 :: [x86_64] #100 llama.cpp-cpu-debuginfo: install check OK (cached)
aarch64: libllama-devel=1:8018-alt1 post-install unowned files:
/usr/lib64/cmake
2026-Feb-13 22:03:13 :: [aarch64] #100 libllama-devel: install check OK (cached)
2026-Feb-13 22:03:13 :: [x86_64] #100 llama.cpp-cuda: install check OK (cached)
2026-Feb-13 22:03:17 :: [x86_64] #100 llama.cpp-cuda-debuginfo: install check OK (cached)
2026-Feb-13 22:03:19 :: [aarch64] #100 llama.cpp: install check OK (cached)
2026-Feb-13 22:03:20 :: [x86_64] #100 llama.cpp-vulkan: install check OK (cached)
2026-Feb-13 22:03:23 :: [x86_64] #100 llama.cpp-vulkan-debuginfo: install check OK (cached)
2026-Feb-13 22:03:25 :: [aarch64] #100 llama.cpp-cpu: install check OK (cached)
2026-Feb-13 22:03:31 :: [aarch64] #100 llama.cpp-cpu-debuginfo: install check OK (cached)
2026-Feb-13 22:03:37 :: [aarch64] #100 llama.cpp-vulkan: install check OK (cached)
2026-Feb-13 22:03:43 :: [aarch64] #100 llama.cpp-vulkan-debuginfo: install check OK (cached)
2026-Feb-13 22:04:01 :: [x86_64-i586] generated apt indices
2026-Feb-13 22:04:01 :: [x86_64-i586] created next repo
2026-Feb-13 22:04:13 :: [x86_64-i586] dependencies check OK
2026-Feb-13 22:04:14 :: gears inheritance check OK
2026-Feb-13 22:04:14 :: srpm inheritance check OK
girar-check-perms: access to llama.cpp ALLOWED for vt: project leader
check-subtask-perms: #100: llama.cpp: allowed for vt
2026-Feb-13 22:04:14 :: acl check OK
2026-Feb-13 22:04:27 :: created contents_index files
2026-Feb-13 22:04:36 :: created hash files: aarch64 src x86_64
2026-Feb-13 22:04:39 :: task #408089 for sisyphus TESTED
2026-Feb-13 22:04:40 :: task is ready for commit
2026-Feb-13 22:04:45 :: repo clone OK
2026-Feb-13 22:04:46 :: packages update OK
2026-Feb-13 22:04:53 :: [x86_64 aarch64] update OK
2026-Feb-13 22:04:53 :: repo update OK
2026-Feb-13 22:05:05 :: repo save OK
2026-Feb-13 22:05:05 :: src index update OK
2026-Feb-13 22:05:06 :: updated /gears/l/llama.cpp.git branch `sisyphus'
2026-Feb-13 22:05:25 :: gears update OK
2026-Feb-13 22:05:25 :: task #408089 for sisyphus DONE
More information about the Sisyphus-incominger mailing list