[#377221] TESTED (try 5) llama.cpp.git=4855-alt1

Girar awaiter (vt) girar-builder at altlinux.org
Sun Mar 9 04:07:59 MSK 2025


https://git.altlinux.org/tasks/377221/logs/events.5.1.log
https://packages.altlinux.org/tasks/377221

subtask  name       aarch64  i586  x86_64
   #500  llama.cpp     4:37     -    6:49

2025-Mar-09 00:55:41 :: test-only task #377221 for sisyphus resumed by vt:
#100 removed
#200 removed
#300 removed
#400 removed
#500 build 4855-alt1 from /people/vt/packages/llama.cpp.git fetched at 2025-Mar-09 00:55:38
2025-Mar-09 00:55:42 :: [i586] #500 llama.cpp.git 4855-alt1: build start
2025-Mar-09 00:55:42 :: [x86_64] #500 llama.cpp.git 4855-alt1: build start
2025-Mar-09 00:55:42 :: [aarch64] #500 llama.cpp.git 4855-alt1: build start
2025-Mar-09 00:55:54 :: [i586] #500 llama.cpp.git 4855-alt1: build SKIPPED
2025-Mar-09 01:00:19 :: [aarch64] #500 llama.cpp.git 4855-alt1: build OK
build/500/x86_64/log:[00:03:44] debuginfo.req: WARNING: /usr/lib64/libcublas.so.12 is not yet debuginfo-enabled
build/500/x86_64/log:[00:03:44] debuginfo.req: WARNING: /usr/lib64/libcudart.so.12 is not yet debuginfo-enabled
2025-Mar-09 01:02:31 :: [x86_64] #500 llama.cpp.git 4855-alt1: build OK
2025-Mar-09 01:02:55 :: #500: llama.cpp.git 4855-alt1: build check OK
2025-Mar-09 01:02:56 :: build check OK
2025-Mar-09 01:03:20 :: noarch check OK
2025-Mar-09 01:03:22 :: plan: src +1 -1 =19914, aarch64 +5 -5 =34622, x86_64 +5 -5 =35423
#500 llama.cpp 3441-alt1 -> 1:4855-alt1
 Fri Mar 07 2025 Vitaly Chikunov <vt at altlinux> 1:4855-alt1
 - Update to b4855 (2025-03-07).
 - test: Enable NVIDIA GPU.
2025-Mar-09 01:04:03 :: patched apt indices
2025-Mar-09 01:04:13 :: created next repo
2025-Mar-09 01:04:23 :: duplicate provides check OK
2025-Mar-09 01:05:03 :: dependencies check OK
2025-Mar-09 01:05:35 :: [x86_64 aarch64] ELF symbols check OK
2025-Mar-09 01:05:51 :: [x86_64] #500 libllama: install check OK
2025-Mar-09 01:05:57 :: [aarch64] #500 libllama: install check OK
2025-Mar-09 01:06:00 :: [x86_64] #500 libllama-debuginfo: install check OK
	x86_64: libllama-devel=1:4855-alt1 post-install unowned files:
 /usr/lib64/cmake
2025-Mar-09 01:06:08 :: [x86_64] #500 libllama-devel: install check OK
2025-Mar-09 01:06:10 :: [aarch64] #500 libllama-debuginfo: install check OK
	aarch64: libllama-devel=1:4855-alt1 post-install unowned files:
 /usr/lib64/cmake
2025-Mar-09 01:06:21 :: [aarch64] #500 libllama-devel: install check OK
	x86_64: llama.cpp=1:4855-alt1 post-install unowned files:
 /usr/lib/llama
2025-Mar-09 01:06:32 :: [x86_64] #500 llama.cpp: install check OK
	aarch64: llama.cpp=1:4855-alt1 post-install unowned files:
 /usr/lib/llama
2025-Mar-09 01:06:35 :: [aarch64] #500 llama.cpp: install check OK
2025-Mar-09 01:07:05 :: [aarch64] #500 llama.cpp-debuginfo: install check OK
2025-Mar-09 01:07:08 :: [x86_64] #500 llama.cpp-debuginfo: install check OK
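Note on the "post-install unowned files" warnings above (/usr/lib64/cmake for libllama-devel, /usr/lib/llama for llama.cpp): these are typically silenced by making the package own the directory in its %files list. A minimal sketch of the spec change, assuming the subpackage names shown in the log (the exact paths under the directories are hypothetical):

```spec
# libllama-devel: own the cmake directory itself, not only the
# files installed beneath it (directory contents are assumed).
%files -n libllama-devel
%dir %{_libdir}/cmake
%{_libdir}/cmake/llama/

# llama.cpp: own /usr/lib/llama (noarch libdir path, as in the log).
%files
%dir %{_prefix}/lib/llama
%{_prefix}/lib/llama/*
```

Whether the directory should instead be owned by (or required from) another package, such as cmake itself, depends on repository policy.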
2025-Mar-09 01:07:25 :: [x86_64-i586] generated apt indices
2025-Mar-09 01:07:25 :: [x86_64-i586] created next repo
2025-Mar-09 01:07:36 :: [x86_64-i586] dependencies check OK
2025-Mar-09 01:07:36 :: gears inheritance check OK
2025-Mar-09 01:07:37 :: srpm inheritance check OK
girar-check-perms: access to llama.cpp ALLOWED for vt: project leader
check-subtask-perms: #500: llama.cpp: disapproved by vt
2025-Mar-09 01:07:37 :: acl check IGNORED
2025-Mar-09 01:07:48 :: created contents_index files
2025-Mar-09 01:07:56 :: created hash files: aarch64 src x86_64
2025-Mar-09 01:07:58 :: task #377221 for sisyphus TESTED