[#377221] TESTED (try 8) llama.cpp.git=4855-alt1

Girar awaiter (vt) <girar-builder@altlinux.org>
Mon Mar 10 03:44:28 MSK 2025


https://git.altlinux.org/tasks/377221/logs/events.8.1.log
https://packages.altlinux.org/tasks/377221

subtask  name       aarch64  i586  x86_64
  #1000  llama.cpp     4:37     -    5:59

2025-Mar-10 00:32:30 :: test-only task #377221 for sisyphus resumed by vt:
#100 removed
#200 removed
#300 removed
#400 removed
#500 removed
#600 removed
#700 removed
#1000 build 4855-alt1 from /people/vt/packages/llama.cpp.git fetched at 2025-Mar-10 00:32:28
2025-Mar-10 00:32:32 :: [aarch64] #1000 llama.cpp.git 4855-alt1: build start
2025-Mar-10 00:32:32 :: [x86_64] #1000 llama.cpp.git 4855-alt1: build start
2025-Mar-10 00:32:32 :: [i586] #1000 llama.cpp.git 4855-alt1: build start
2025-Mar-10 00:32:44 :: [i586] #1000 llama.cpp.git 4855-alt1: build SKIPPED
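(A SKIPPED status on girar normally means the spec restricts the build
architectures rather than that the build failed. A minimal sketch of the
usual mechanism, assuming, hypothetically, that llama.cpp.spec excludes
32-bit x86; the real spec may use a different construct:

    # Hypothetical fragment of llama.cpp.spec. Excluding 32-bit x86
    # makes girar report the i586 build as SKIPPED instead of
    # attempting it.
    ExcludeArch: %ix86
)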
2025-Mar-10 00:37:09 :: [aarch64] #1000 llama.cpp.git 4855-alt1: build OK
build/1000/x86_64/log:[00:03:18] debuginfo.req: WARNING: /usr/lib64/libcublas.so.12 is not yet debuginfo-enabled
build/1000/x86_64/log:[00:03:18] debuginfo.req: WARNING: /usr/lib64/libcudart.so.12 is not yet debuginfo-enabled
2025-Mar-10 00:38:31 :: [x86_64] #1000 llama.cpp.git 4855-alt1: build OK
2025-Mar-10 00:38:48 :: #1000: llama.cpp.git 4855-alt1: build check OK
2025-Mar-10 00:38:49 :: build check OK
2025-Mar-10 00:39:09 :: noarch check OK
2025-Mar-10 00:39:11 :: plan: src +1 -1 =19927, aarch64 +6 -5 =34648, x86_64 +8 -5 =35451
#1000 llama.cpp 3441-alt1 -> 1:4855-alt1
 Mon Mar 10 2025 Vitaly Chikunov <vt@altlinux> 1:4855-alt1
 - Update to b4855 (2025-03-07).
 - Enable CUDA backend (for NVIDIA GPU) in llama.cpp-cuda package.
 - Disable BLAS backend (issues/12282).
 - Install bash-completions.
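(The CUDA and BLAS changelog items above correspond to upstream
llama.cpp build switches. A minimal sketch of the matching CMake
configuration, assuming the b4855 option names; the spec's actual
%cmake invocation is not shown in this log and may differ:

    # Upstream llama.cpp (b4855) CMake options matching the changelog:
    # CUDA backend on for the llama.cpp-cuda package, BLAS backend off
    # because of upstream issues/12282.
    cmake -B build -DGGML_CUDA=ON -DGGML_BLAS=OFF
    cmake --build build -j
)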
2025-Mar-10 00:39:49 :: patched apt indices
2025-Mar-10 00:39:58 :: created next repo
2025-Mar-10 00:40:08 :: duplicate provides check OK
2025-Mar-10 00:40:45 :: dependencies check OK
2025-Mar-10 00:41:23 :: [x86_64 aarch64] ELF symbols check OK
2025-Mar-10 00:41:38 :: [x86_64] #1000 libllama: install check OK
2025-Mar-10 00:41:45 :: [aarch64] #1000 libllama: install check OK
2025-Mar-10 00:41:47 :: [x86_64] #1000 libllama-debuginfo: install check OK
	x86_64: libllama-devel=1:4855-alt1 post-install unowned files:
 /usr/lib64/cmake
2025-Mar-10 00:41:55 :: [x86_64] #1000 libllama-devel: install check OK
2025-Mar-10 00:41:58 :: [aarch64] #1000 libllama-debuginfo: install check OK
	aarch64: libllama-devel=1:4855-alt1 post-install unowned files:
 /usr/lib64/cmake
2025-Mar-10 00:42:09 :: [aarch64] #1000 libllama-devel: install check OK
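(The post-install check above flags /usr/lib64/cmake as unowned on both
architectures: libllama-devel drops files into that directory without
claiming the directory itself. A minimal sketch of the usual spec-level
fix, assuming this subpackage layout; the file paths are illustrative:

    # Own the directory in the libllama-devel %files section so it is
    # created and removed together with the package.
    %files -n libllama-devel
    %dir %{_libdir}/cmake
    %{_libdir}/cmake/llama/
)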
2025-Mar-10 00:42:17 :: [x86_64] #1000 llama.cpp: install check OK
2025-Mar-10 00:42:23 :: [aarch64] #1000 llama.cpp: install check OK
2025-Mar-10 00:42:26 :: [x86_64] #1000 llama.cpp-cpu: install check OK
2025-Mar-10 00:42:37 :: [aarch64] #1000 llama.cpp-cpu: install check OK
2025-Mar-10 00:42:47 :: [x86_64] #1000 llama.cpp-cpu-debuginfo: install check OK
2025-Mar-10 00:43:07 :: [aarch64] #1000 llama.cpp-cpu-debuginfo: install check OK
2025-Mar-10 00:43:10 :: [x86_64] #1000 llama.cpp-cuda: install check OK
2025-Mar-10 00:43:35 :: [x86_64] #1000 llama.cpp-cuda-debuginfo: install check OK
2025-Mar-10 00:43:52 :: [x86_64-i586] generated apt indices
2025-Mar-10 00:43:52 :: [x86_64-i586] created next repo
2025-Mar-10 00:44:04 :: [x86_64-i586] dependencies check OK
2025-Mar-10 00:44:05 :: gears inheritance check OK
2025-Mar-10 00:44:05 :: srpm inheritance check OK
girar-check-perms: access to llama.cpp ALLOWED for vt: project leader
check-subtask-perms: #1000: llama.cpp: allowed for vt
2025-Mar-10 00:44:05 :: acl check OK
2025-Mar-10 00:44:17 :: created contents_index files
2025-Mar-10 00:44:25 :: created hash files: aarch64 src x86_64
2025-Mar-10 00:44:28 :: task #377221 for sisyphus TESTED

