[#388129] TESTED llama.cpp.git=5753-alt1
Girar awaiter (vt) girar-builder at altlinux.org
Wed Jun 25 11:33:41 MSK 2025
https://git.altlinux.org/tasks/388129/logs/events.1.1.log
https://packages.altlinux.org/tasks/388129
subtask  name       aarch64  i586  x86_64
#100     llama.cpp  12:17    -     8:00
2025-Jun-25 08:14:56 :: test-only task #388129 for sisyphus started by vt:
#100 build 5753-alt1 from /people/vt/packages/llama.cpp.git fetched at 2025-Jun-25 08:14:53
2025-Jun-25 08:14:58 :: [i586] #100 llama.cpp.git 5753-alt1: build start
2025-Jun-25 08:14:58 :: [x86_64] #100 llama.cpp.git 5753-alt1: build start
2025-Jun-25 08:14:58 :: [aarch64] #100 llama.cpp.git 5753-alt1: build start
2025-Jun-25 08:15:09 :: [i586] #100 llama.cpp.git 5753-alt1: build SKIPPED
build/100/x86_64/log:[00:04:18] debuginfo.req: WARNING: /usr/lib64/libcublas.so.12 is not yet debuginfo-enabled
build/100/x86_64/log:[00:04:18] debuginfo.req: WARNING: /usr/lib64/libcudart.so.12 is not yet debuginfo-enabled
2025-Jun-25 08:22:58 :: [x86_64] #100 llama.cpp.git 5753-alt1: build OK
2025-Jun-25 08:27:15 :: [aarch64] #100 llama.cpp.git 5753-alt1: build OK
2025-Jun-25 08:27:36 :: #100: llama.cpp.git 5753-alt1: build check OK
2025-Jun-25 08:27:38 :: build check OK
2025-Jun-25 08:28:05 :: noarch check OK
2025-Jun-25 08:28:07 :: plan: src +1 -1 =20274, aarch64 +8 -8 =35233, x86_64 +10 -10 =36055
#100 llama.cpp 5332-alt1 -> 1:5753-alt1
Wed Jun 25 2025 Vitaly Chikunov <vt at altlinux> 1:5753-alt1
- Update to b5753 (2025-06-24).
- Install an experimental rpc backend and server. The rpc code is a
proof-of-concept, fragile, and insecure.
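For reference only: the RPC backend packaged above is used by running a standalone rpc-server on the machine that does the computation and pointing other llama.cpp tools at it over the network. A minimal usage sketch, assuming the ALT packages keep the upstream binary names (rpc-server, llama-cli) and that model.gguf is any locally available model; since the protocol is unauthenticated, this should only be done on a trusted network:

  # worker host: expose the local backend over TCP (upstream default port 50052)
  rpc-server -p 50052

  # client host: offload work to the worker while running inference as usual
  llama-cli -m model.gguf -p "Hello" -n 64 --rpc 192.168.1.10:50052 -ngl 99

Multiple workers can be given to --rpc as a comma-separated host:port list.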
2025-Jun-25 08:28:44 :: patched apt indices
2025-Jun-25 08:28:53 :: created next repo
2025-Jun-25 08:29:03 :: duplicate provides check OK
2025-Jun-25 08:29:39 :: dependencies check OK
2025-Jun-25 08:30:10 :: [x86_64 aarch64] ELF symbols check OK
2025-Jun-25 08:30:25 :: [x86_64] #100 libllama: install check OK
2025-Jun-25 08:30:33 :: [aarch64] #100 libllama: install check OK
2025-Jun-25 08:30:33 :: [x86_64] #100 libllama-debuginfo: install check OK
x86_64: libllama-devel=1:5753-alt1 post-install unowned files:
/usr/lib64/cmake
2025-Jun-25 08:30:40 :: [x86_64] #100 libllama-devel: install check OK
2025-Jun-25 08:30:46 :: [aarch64] #100 libllama-debuginfo: install check OK
aarch64: libllama-devel=1:5753-alt1 post-install unowned files:
/usr/lib64/cmake
2025-Jun-25 08:30:57 :: [aarch64] #100 libllama-devel: install check OK
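The unowned-files warnings above mean that libllama-devel installs its CMake files under /usr/lib64/cmake while no installed package owns that parent directory. A quick way to confirm this on an installed system (the llama subdirectory name below is an illustrative guess, not taken from this log):

  # the parent directory has no owner
  rpm -qf /usr/lib64/cmake
  # -> file /usr/lib64/cmake is not owned by any package

  # the package owns only its own subtree (hypothetical path)
  rpm -qf /usr/lib64/cmake/llama

The usual remedies are to own the directory in the devel subpackage or to require a package that already owns it; which is appropriate here is up to the packager.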
2025-Jun-25 08:31:05 :: [x86_64] #100 llama.cpp: install check OK
2025-Jun-25 08:31:12 :: [aarch64] #100 llama.cpp: install check OK
2025-Jun-25 08:31:14 :: [x86_64] #100 llama.cpp-cpu: install check OK
2025-Jun-25 08:31:27 :: [aarch64] #100 llama.cpp-cpu: install check OK
2025-Jun-25 08:31:35 :: [x86_64] #100 llama.cpp-cpu-debuginfo: install check OK
2025-Jun-25 08:31:58 :: [aarch64] #100 llama.cpp-cpu-debuginfo: install check OK
2025-Jun-25 08:32:00 :: [x86_64] #100 llama.cpp-cuda: install check OK
2025-Jun-25 08:32:13 :: [aarch64] #100 llama.cpp-vulkan: install check OK
2025-Jun-25 08:32:26 :: [x86_64] #100 llama.cpp-cuda-debuginfo: install check OK
2025-Jun-25 08:32:31 :: [aarch64] #100 llama.cpp-vulkan-debuginfo: install check OK
2025-Jun-25 08:32:35 :: [x86_64] #100 llama.cpp-vulkan: install check OK
2025-Jun-25 08:32:47 :: [x86_64] #100 llama.cpp-vulkan-debuginfo: install check OK
2025-Jun-25 08:33:04 :: [x86_64-i586] generated apt indices
2025-Jun-25 08:33:04 :: [x86_64-i586] created next repo
2025-Jun-25 08:33:14 :: [x86_64-i586] dependencies check OK
2025-Jun-25 08:33:16 :: gears inheritance check OK
2025-Jun-25 08:33:17 :: srpm inheritance check OK
girar-check-perms: access to llama.cpp ALLOWED for vt: project leader
check-subtask-perms: #100: llama.cpp: allowed for vt
2025-Jun-25 08:33:18 :: acl check OK
2025-Jun-25 08:33:30 :: created contents_index files
2025-Jun-25 08:33:38 :: created hash files: aarch64 src x86_64
2025-Jun-25 08:33:41 :: task #388129 for sisyphus TESTED