[#398582] DONE (try 2) llama.cpp.git=6869-alt1

Girar pender (vt) <girar-builder@altlinux.org>
Wed Oct 29 02:38:23 MSK 2025


https://git.altlinux.org/tasks/archive/done/_389/398582/logs/events.2.1.log
https://packages.altlinux.org/tasks/398582

2025-Oct-28 23:32:24 :: task #398582 for sisyphus resumed by vt:
#100 build 6869-alt1 from /people/vt/packages/llama.cpp.git fetched at 2025-Oct-28 23:03:52
2025-Oct-28 23:32:26 :: [i586] #100 llama.cpp.git 6869-alt1: build start
2025-Oct-28 23:32:26 :: [aarch64] #100 llama.cpp.git 6869-alt1: build start
2025-Oct-28 23:32:26 :: [x86_64] #100 llama.cpp.git 6869-alt1: build start
2025-Oct-28 23:32:37 :: [i586] #100 llama.cpp.git 6869-alt1: build SKIPPED
2025-Oct-28 23:32:45 :: [aarch64] #100 llama.cpp.git 6869-alt1: build OK (cached)
build/100/x86_64/log:[00:04:11] debuginfo.req: WARNING: /usr/lib64/libcublas.so.12 is not yet debuginfo-enabled
build/100/x86_64/log:[00:04:11] debuginfo.req: WARNING: /usr/lib64/libcudart.so.12 is not yet debuginfo-enabled
2025-Oct-28 23:32:47 :: [x86_64] #100 llama.cpp.git 6869-alt1: build OK (cached)
2025-Oct-28 23:32:47 :: 100: build check OK (cached)
2025-Oct-28 23:32:48 :: build check OK
2025-Oct-28 23:33:02 :: #100: llama.cpp.git 6869-alt1: version check OK
2025-Oct-28 23:33:02 :: build version check OK
2025-Oct-28 23:33:14 :: noarch check OK
2025-Oct-28 23:33:16 :: plan: src +1 -1 =20859, aarch64 +9 -9 =36861, x86_64 +11 -11 =37705
#100 llama.cpp 6397-alt1 -> 1:6869-alt1
 Tue Oct 28 2025 Vitaly Chikunov <vt@altlinux> 1:6869-alt1
 - Update to b6869 (2025-10-28).
2025-Oct-28 23:33:56 :: patched apt indices
2025-Oct-28 23:34:04 :: created next repo
2025-Oct-28 23:34:14 :: duplicate provides check OK
2025-Oct-28 23:34:51 :: dependencies check OK
2025-Oct-28 23:35:21 :: [x86_64 aarch64] ELF symbols check OK
2025-Oct-28 23:35:33 :: [x86_64] #100 libllama: install check OK (cached)
2025-Oct-28 23:35:38 :: [x86_64] #100 libllama-debuginfo: install check OK (cached)
2025-Oct-28 23:35:41 :: [aarch64] #100 libllama: install check OK (cached)
	x86_64: libllama-devel=1:6869-alt1 post-install unowned files:
 /usr/lib64/cmake
2025-Oct-28 23:35:43 :: [x86_64] #100 libllama-devel: install check OK (cached)
2025-Oct-28 23:35:49 :: [x86_64] #100 llama.cpp: install check OK (cached)
2025-Oct-28 23:35:49 :: [aarch64] #100 libllama-debuginfo: install check OK (cached)
2025-Oct-28 23:35:53 :: [x86_64] #100 llama.cpp-convert: install check OK (cached)
	aarch64: libllama-devel=1:6869-alt1 post-install unowned files:
 /usr/lib64/cmake
2025-Oct-28 23:35:57 :: [aarch64] #100 libllama-devel: install check OK (cached)
2025-Oct-28 23:35:58 :: [x86_64] #100 llama.cpp-cpu: install check OK (cached)
2025-Oct-28 23:36:03 :: [x86_64] #100 llama.cpp-cpu-debuginfo: install check OK (cached)
2025-Oct-28 23:36:05 :: [aarch64] #100 llama.cpp: install check OK (cached)
2025-Oct-28 23:36:09 :: [x86_64] #100 llama.cpp-cuda: install check OK (cached)
2025-Oct-28 23:36:13 :: [aarch64] #100 llama.cpp-convert: install check OK (cached)
2025-Oct-28 23:36:15 :: [x86_64] #100 llama.cpp-cuda-debuginfo: install check OK (cached)
2025-Oct-28 23:36:19 :: [x86_64] #100 llama.cpp-vulkan: install check OK (cached)
2025-Oct-28 23:36:21 :: [aarch64] #100 llama.cpp-cpu: install check OK (cached)
2025-Oct-28 23:36:24 :: [x86_64] #100 llama.cpp-vulkan-debuginfo: install check OK (cached)
2025-Oct-28 23:36:29 :: [aarch64] #100 llama.cpp-cpu-debuginfo: install check OK (cached)
2025-Oct-28 23:36:37 :: [aarch64] #100 llama.cpp-vulkan: install check OK (cached)
2025-Oct-28 23:36:45 :: [aarch64] #100 llama.cpp-vulkan-debuginfo: install check OK (cached)
2025-Oct-28 23:37:04 :: [x86_64-i586] generated apt indices
2025-Oct-28 23:37:04 :: [x86_64-i586] created next repo
2025-Oct-28 23:37:14 :: [x86_64-i586] dependencies check OK
2025-Oct-28 23:37:15 :: gears inheritance check OK
2025-Oct-28 23:37:16 :: srpm inheritance check OK
girar-check-perms: access to llama.cpp ALLOWED for vt: project leader
check-subtask-perms: #100: llama.cpp: allowed for vt
2025-Oct-28 23:37:16 :: acl check OK
2025-Oct-28 23:37:28 :: created contents_index files
2025-Oct-28 23:37:36 :: created hash files: aarch64 src x86_64
2025-Oct-28 23:37:39 :: task #398582 for sisyphus TESTED
2025-Oct-28 23:37:39 :: task is ready for commit
2025-Oct-28 23:37:44 :: repo clone OK
2025-Oct-28 23:37:45 :: packages update OK
2025-Oct-28 23:37:50 :: [x86_64 aarch64] update OK
2025-Oct-28 23:37:50 :: repo update OK
2025-Oct-28 23:38:00 :: repo save OK
2025-Oct-28 23:38:00 :: src index update OK
2025-Oct-28 23:38:03 :: updated /gears/l/llama.cpp.git branch `sisyphus'
2025-Oct-28 23:38:23 :: gears update OK
2025-Oct-28 23:38:23 :: task #398582 for sisyphus DONE
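
A note on the install checks above: on both x86_64 and aarch64, libllama-devel is reported to leave /usr/lib64/cmake as an unowned directory. One possible way to address this, sketched below purely as an illustration (the %files section name, the cmake subdirectory name, and the absence of another owner of %_libdir/cmake are assumptions, not facts taken from this log), is to claim the directory in the -devel subpackage:

    # Hypothetical %files fragment for libllama-devel; assumes the cmake
    # config files land under %_libdir/cmake/llama and that no package
    # already required by libllama-devel owns %_libdir/cmake itself.
    %files -n libllama-devel
    %dir %{_libdir}/cmake
    %{_libdir}/cmake/llama/

Alternatively, depending on a package that already owns /usr/lib64/cmake would silence the same warning without touching the file list.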

