[#383655] TESTED (try 2) llama.cpp.git=5332-alt1
Girar awaiter (vt)
girar-builder at altlinux.org
Sat May 10 03:09:12 MSK 2025
https://git.altlinux.org/tasks/383655/logs/events.2.1.log
https://packages.altlinux.org/tasks/383655
subtask  name       aarch64  i586  x86_64
#200     llama.cpp  9:44     -     7:19
2025-May-09 23:52:25 :: test-only task #383655 for sisyphus resumed by vt:
#100 removed
#200 build 5332-alt1 from /people/vt/packages/llama.cpp.git fetched at 2025-May-09 23:52:22
2025-May-09 23:52:27 :: [i586] #200 llama.cpp.git 5332-alt1: build start
2025-May-09 23:52:27 :: [aarch64] #200 llama.cpp.git 5332-alt1: build start
2025-May-09 23:52:27 :: [x86_64] #200 llama.cpp.git 5332-alt1: build start
2025-May-09 23:52:38 :: [i586] #200 llama.cpp.git 5332-alt1: build SKIPPED
build/200/x86_64/log:[00:03:44] debuginfo.req: WARNING: /usr/lib64/libcublas.so.12 is not yet debuginfo-enabled
build/200/x86_64/log:[00:03:44] debuginfo.req: WARNING: /usr/lib64/libcudart.so.12 is not yet debuginfo-enabled
2025-May-09 23:59:46 :: [x86_64] #200 llama.cpp.git 5332-alt1: build OK
2025-May-10 00:02:11 :: [aarch64] #200 llama.cpp.git 5332-alt1: build OK
2025-May-10 00:02:38 :: #200: llama.cpp.git 5332-alt1: build check OK
2025-May-10 00:02:40 :: build check OK
2025-May-10 00:03:05 :: noarch check OK
2025-May-10 00:03:07 :: plan: src +1 -1 =20214, aarch64 +8 -6 =35105, x86_64 +10 -8 =35906
#200 llama.cpp 4855-alt1 -> 1:5332-alt1
Sat May 10 2025 Vitaly Chikunov <vt at altlinux> 1:5332-alt1
- Update to b5332 (2025-05-09), with vision support in llama-server.
- Enable Vulkan backend (for GPU) in llama.cpp-vulkan package.
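(A hedged usage sketch for the Vulkan-enabled build, not part of the original report: the package name is taken from this task's plan, model.gguf is a placeholder, and -m/-ngl/--port are standard upstream llama.cpp options.)

```shell
# Install the Vulkan-enabled build (package name from this task's plan)
apt-get install llama.cpp-vulkan

# Serve a local GGUF model; -ngl 99 asks to offload all layers to the
# GPU via Vulkan, and model.gguf is a placeholder path
llama-server -m model.gguf -ngl 99 --port 8080
```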
2025-May-10 00:03:52 :: patched apt indices
2025-May-10 00:04:02 :: created next repo
2025-May-10 00:04:13 :: duplicate provides check OK
2025-May-10 00:04:54 :: dependencies check OK
2025-May-10 00:05:34 :: [x86_64 aarch64] ELF symbols check OK
2025-May-10 00:05:50 :: [x86_64] #200 libllama: install check OK
2025-May-10 00:05:57 :: [aarch64] #200 libllama: install check OK
2025-May-10 00:05:58 :: [x86_64] #200 libllama-debuginfo: install check OK
x86_64: libllama-devel=1:5332-alt1 post-install unowned files:
/usr/lib64/cmake
2025-May-10 00:06:05 :: [x86_64] #200 libllama-devel: install check OK
2025-May-10 00:06:10 :: [aarch64] #200 libllama-debuginfo: install check OK
aarch64: libllama-devel=1:5332-alt1 post-install unowned files:
/usr/lib64/cmake
2025-May-10 00:06:21 :: [aarch64] #200 libllama-devel: install check OK
2025-May-10 00:06:31 :: [x86_64] #200 llama.cpp: install check OK
2025-May-10 00:06:35 :: [aarch64] #200 llama.cpp: install check OK
2025-May-10 00:06:41 :: [x86_64] #200 llama.cpp-cpu: install check OK
2025-May-10 00:06:50 :: [aarch64] #200 llama.cpp-cpu: install check OK
2025-May-10 00:07:01 :: [x86_64] #200 llama.cpp-cpu-debuginfo: install check OK
2025-May-10 00:07:19 :: [aarch64] #200 llama.cpp-cpu-debuginfo: install check OK
2025-May-10 00:07:27 :: [x86_64] #200 llama.cpp-cuda: install check OK
2025-May-10 00:07:34 :: [aarch64] #200 llama.cpp-vulkan: install check OK
2025-May-10 00:07:52 :: [aarch64] #200 llama.cpp-vulkan-debuginfo: install check OK
2025-May-10 00:07:54 :: [x86_64] #200 llama.cpp-cuda-debuginfo: install check OK
2025-May-10 00:08:04 :: [x86_64] #200 llama.cpp-vulkan: install check OK
2025-May-10 00:08:16 :: [x86_64] #200 llama.cpp-vulkan-debuginfo: install check OK
2025-May-10 00:08:34 :: [x86_64-i586] generated apt indices
2025-May-10 00:08:34 :: [x86_64-i586] created next repo
2025-May-10 00:08:46 :: [x86_64-i586] dependencies check OK
2025-May-10 00:08:47 :: gears inheritance check OK
2025-May-10 00:08:48 :: srpm inheritance check OK
girar-check-perms: access to llama.cpp ALLOWED for vt: project leader
check-subtask-perms: #200: llama.cpp: disapproved by vt
2025-May-10 00:08:49 :: acl check IGNORED
2025-May-10 00:09:01 :: created contents_index files
2025-May-10 00:09:09 :: created hash files: aarch64 src x86_64
2025-May-10 00:09:12 :: task #383655 for sisyphus TESTED