[#400709] TESTED llama.cpp.git=7127-alt1
Girar awaiter (vt) girar-builder at altlinux.org
Fri Nov 21 23:41:49 MSK 2025
https://git.altlinux.org/tasks/400709/logs/events.1.1.log
https://packages.altlinux.org/tasks/400709
 subtask  name       aarch64  i586  x86_64
    #100  llama.cpp     8:18     -    7:53
2025-Nov-21 20:27:12 :: test-only task #400709 for sisyphus started by vt:
#100 build 7127-alt1 from /people/vt/packages/llama.cpp.git fetched at 2025-Nov-21 20:27:10
2025-Nov-21 20:27:14 :: [x86_64] #100 llama.cpp.git 7127-alt1: build start
2025-Nov-21 20:27:14 :: [aarch64] #100 llama.cpp.git 7127-alt1: build start
2025-Nov-21 20:27:14 :: [i586] #100 llama.cpp.git 7127-alt1: build start
2025-Nov-21 20:27:25 :: [i586] #100 llama.cpp.git 7127-alt1: build SKIPPED
build/100/x86_64/log:[00:04:18] debuginfo.req: WARNING: /usr/lib64/libcublas.so.12 is not yet debuginfo-enabled
build/100/x86_64/log:[00:04:18] debuginfo.req: WARNING: /usr/lib64/libcudart.so.12 is not yet debuginfo-enabled
2025-Nov-21 20:35:07 :: [x86_64] #100 llama.cpp.git 7127-alt1: build OK
2025-Nov-21 20:35:32 :: [aarch64] #100 llama.cpp.git 7127-alt1: build OK
2025-Nov-21 20:35:40 :: 100: build check OK
2025-Nov-21 20:35:42 :: build check OK
2025-Nov-21 20:35:55 :: #100: llama.cpp.git 7127-alt1: version check OK
2025-Nov-21 20:35:55 :: build version check OK
2025-Nov-21 20:36:08 :: noarch check OK
2025-Nov-21 20:36:10 :: plan: src +1 -1 =21047, aarch64 +8 -9 =37062, x86_64 +10 -11 =37900
#100 llama.cpp 6869-alt1 -> 1:7127-alt1
Fri Nov 21 2025 Vitaly Chikunov <vt at altlinux> 1:7127-alt1
- Update to b7127 (2025-11-21).
- spec: Remove llama.cpp-convert package.
- model: detect GigaChat3-10-A1.8B as deepseek lite.
2025-Nov-21 20:36:52 :: patched apt indices
2025-Nov-21 20:37:01 :: created next repo
2025-Nov-21 20:37:11 :: duplicate provides check OK
2025-Nov-21 20:37:51 :: dependencies check OK
2025-Nov-21 20:38:26 :: [x86_64 aarch64] ELF symbols check OK
2025-Nov-21 20:38:41 :: [x86_64] #100 libllama: install check OK
2025-Nov-21 20:38:49 :: [x86_64] #100 libllama-debuginfo: install check OK
2025-Nov-21 20:38:49 :: [aarch64] #100 libllama: install check OK
x86_64: libllama-devel=1:7127-alt1 post-install unowned files:
/usr/lib64/cmake
2025-Nov-21 20:38:56 :: [x86_64] #100 libllama-devel: install check OK
2025-Nov-21 20:39:02 :: [aarch64] #100 libllama-debuginfo: install check OK
aarch64: libllama-devel=1:7127-alt1 post-install unowned files:
/usr/lib64/cmake
2025-Nov-21 20:39:14 :: [aarch64] #100 libllama-devel: install check OK
2025-Nov-21 20:39:20 :: [x86_64] #100 llama.cpp: install check OK
2025-Nov-21 20:39:28 :: [aarch64] #100 llama.cpp: install check OK
2025-Nov-21 20:39:29 :: [x86_64] #100 llama.cpp-cpu: install check OK
2025-Nov-21 20:39:41 :: [aarch64] #100 llama.cpp-cpu: install check OK
2025-Nov-21 20:39:43 :: [x86_64] #100 llama.cpp-cpu-debuginfo: install check OK
2025-Nov-21 20:40:02 :: [aarch64] #100 llama.cpp-cpu-debuginfo: install check OK
2025-Nov-21 20:40:07 :: [x86_64] #100 llama.cpp-cuda: install check OK
2025-Nov-21 20:40:16 :: [aarch64] #100 llama.cpp-vulkan: install check OK
2025-Nov-21 20:40:32 :: [x86_64] #100 llama.cpp-cuda-debuginfo: install check OK
2025-Nov-21 20:40:34 :: [aarch64] #100 llama.cpp-vulkan-debuginfo: install check OK
2025-Nov-21 20:40:41 :: [x86_64] #100 llama.cpp-vulkan: install check OK
2025-Nov-21 20:40:52 :: [x86_64] #100 llama.cpp-vulkan-debuginfo: install check OK
2025-Nov-21 20:41:11 :: [x86_64-i586] generated apt indices
2025-Nov-21 20:41:11 :: [x86_64-i586] created next repo
2025-Nov-21 20:41:22 :: [x86_64-i586] dependencies check OK
2025-Nov-21 20:41:23 :: gears inheritance check OK
2025-Nov-21 20:41:24 :: srpm inheritance check OK
girar-check-perms: access to llama.cpp ALLOWED for vt: project leader
check-subtask-perms: #100: llama.cpp: allowed for vt
2025-Nov-21 20:41:24 :: acl check OK
2025-Nov-21 20:41:37 :: created contents_index files
2025-Nov-21 20:41:45 :: created hash files: aarch64 src x86_64
2025-Nov-21 20:41:49 :: task #400709 for sisyphus TESTED