[#394153] DONE (try 2) llama.cpp.git=6397-alt1
Girar pender (vt)
girar-builder at altlinux.org
Sun Sep 7 07:05:40 MSK 2025
https://git.altlinux.org/tasks/archive/done/_384/394153/logs/events.2.2.log
https://packages.altlinux.org/tasks/394153
2025-Sep-07 03:59:06 :: task #394153 for sisyphus resumed by vt:
#100 build 6397-alt1 from /people/vt/packages/llama.cpp.git fetched at 2025-Sep-06 06:11:21
2025-Sep-07 03:59:07 :: [i586] #100 llama.cpp.git 6397-alt1: build start
2025-Sep-07 03:59:07 :: [x86_64] #100 llama.cpp.git 6397-alt1: build start
2025-Sep-07 03:59:07 :: [aarch64] #100 llama.cpp.git 6397-alt1: build start
2025-Sep-07 03:59:26 :: [aarch64] #100 llama.cpp.git 6397-alt1: build OK (cached)
build/100/x86_64/log:[00:03:46] debuginfo.req: WARNING: /usr/lib64/libcublas.so.12 is not yet debuginfo-enabled
build/100/x86_64/log:[00:03:46] debuginfo.req: WARNING: /usr/lib64/libcudart.so.12 is not yet debuginfo-enabled
2025-Sep-07 03:59:28 :: [x86_64] #100 llama.cpp.git 6397-alt1: build OK (cached)
2025-Sep-07 03:59:34 :: [i586] #100 llama.cpp.git 6397-alt1: build SKIPPED
2025-Sep-07 03:59:56 :: #100: llama.cpp.git 6397-alt1: build check OK
2025-Sep-07 03:59:58 :: build check OK
2025-Sep-07 04:00:11 :: noarch check OK
2025-Sep-07 04:00:13 :: plan: src +1 -1 =20597, aarch64 +9 -8 =36382, x86_64 +11 -10 =37210
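(Note on reading the plan line above: "+N -M =T" gives the number of packages added, removed, and the resulting per-component package count, so the previous count can be recovered as T - N + M. A minimal sketch of that arithmetic in Python; the field interpretation is an assumption of this note, not taken from girar documentation.)

# Minimal sketch: interpret a girar "plan:" line, assuming "+added -removed =total"
# means total = previous + added - removed (assumed interpretation, not girar docs).
plan = {
    "src":     (1, 1, 20597),
    "aarch64": (9, 8, 36382),
    "x86_64":  (11, 10, 37210),
}

for component, (added, removed, total) in plan.items():
    previous = total - added + removed
    print(f"{component}: {previous} -> {total} ({added} added, {removed} removed)")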
#100 llama.cpp 6121-alt1 -> 1:6397-alt1
Sat Sep 06 2025 Vitaly Chikunov <vt at altlinux> 1:6397-alt1
- Update to b6397 (2025-09-06).
- Python-based model conversion scripts are shipped in a subpackage
  (llama.cpp-convert). Note that they are not supported and are provided as-is.
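(As an illustration of what "provided as-is" means for the conversion subpackage seen in the install checks below: upstream llama.cpp ships convert_hf_to_gguf.py as its HF-to-GGUF entry point. The sketch below assumes that script ends up on PATH under that name after installing llama.cpp-convert, which this log does not confirm; the model path is hypothetical.)

# Minimal sketch: run the upstream HF->GGUF conversion script from Python.
# Assumptions not confirmed by this log: llama.cpp-convert installs
# convert_hf_to_gguf.py on PATH; ~/models/Example-HF-Model is a local
# Hugging Face checkpoint directory (hypothetical path).
import subprocess
from pathlib import Path

model_dir = Path("~/models/Example-HF-Model").expanduser()
out_file = model_dir / "model-f16.gguf"

subprocess.run(
    [
        "convert_hf_to_gguf.py",   # assumed installed script name
        str(model_dir),            # positional: HF model directory
        "--outfile", str(out_file),
        "--outtype", "f16",        # one of the upstream output types
    ],
    check=True,
)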
2025-Sep-07 04:00:59 :: patched apt indices
2025-Sep-07 04:01:09 :: created next repo
2025-Sep-07 04:01:20 :: duplicate provides check OK
2025-Sep-07 04:02:01 :: dependencies check OK
2025-Sep-07 04:02:35 :: [x86_64 aarch64] ELF symbols check OK
2025-Sep-07 04:02:47 :: [x86_64] #100 libllama: install check OK (cached)
2025-Sep-07 04:02:52 :: [x86_64] #100 libllama-debuginfo: install check OK (cached)
2025-Sep-07 04:02:55 :: [aarch64] #100 libllama: install check OK (cached)
x86_64: libllama-devel=1:6397-alt1 post-install unowned files:
/usr/lib64/cmake
2025-Sep-07 04:02:57 :: [x86_64] #100 libllama-devel: install check OK (cached)
2025-Sep-07 04:03:02 :: [aarch64] #100 libllama-debuginfo: install check OK (cached)
2025-Sep-07 04:03:03 :: [x86_64] #100 llama.cpp: install check OK (cached)
2025-Sep-07 04:03:08 :: [x86_64] #100 llama.cpp-convert: install check OK (cached)
aarch64: libllama-devel=1:6397-alt1 post-install unowned files:
/usr/lib64/cmake
2025-Sep-07 04:03:10 :: [aarch64] #100 libllama-devel: install check OK (cached)
2025-Sep-07 04:03:12 :: [x86_64] #100 llama.cpp-cpu: install check OK (cached)
2025-Sep-07 04:03:17 :: [x86_64] #100 llama.cpp-cpu-debuginfo: install check OK (cached)
2025-Sep-07 04:03:18 :: [aarch64] #100 llama.cpp: install check OK (cached)
2025-Sep-07 04:03:24 :: [x86_64] #100 llama.cpp-cuda: install check OK (cached)
2025-Sep-07 04:03:26 :: [aarch64] #100 llama.cpp-convert: install check OK (cached)
2025-Sep-07 04:03:30 :: [x86_64] #100 llama.cpp-cuda-debuginfo: install check OK (cached)
2025-Sep-07 04:03:34 :: [aarch64] #100 llama.cpp-cpu: install check OK (cached)
2025-Sep-07 04:03:35 :: [x86_64] #100 llama.cpp-vulkan: install check OK (cached)
2025-Sep-07 04:03:40 :: [x86_64] #100 llama.cpp-vulkan-debuginfo: install check OK (cached)
2025-Sep-07 04:03:42 :: [aarch64] #100 llama.cpp-cpu-debuginfo: install check OK (cached)
2025-Sep-07 04:03:50 :: [aarch64] #100 llama.cpp-vulkan: install check OK (cached)
2025-Sep-07 04:03:58 :: [aarch64] #100 llama.cpp-vulkan-debuginfo: install check OK (cached)
2025-Sep-07 04:04:17 :: [x86_64-i586] generated apt indices
2025-Sep-07 04:04:17 :: [x86_64-i586] created next repo
2025-Sep-07 04:04:29 :: [x86_64-i586] dependencies check OK
2025-Sep-07 04:04:30 :: gears inheritance check OK
2025-Sep-07 04:04:30 :: srpm inheritance check OK
girar-check-perms: access to llama.cpp ALLOWED for vt: project leader
check-subtask-perms: #100: llama.cpp: allowed for vt
2025-Sep-07 04:04:31 :: acl check OK
2025-Sep-07 04:04:44 :: created contents_index files
2025-Sep-07 04:04:53 :: created hash files: aarch64 src x86_64
2025-Sep-07 04:04:56 :: task #394153 for sisyphus TESTED
2025-Sep-07 04:04:57 :: task is ready for commit
2025-Sep-07 04:05:02 :: repo clone OK
2025-Sep-07 04:05:02 :: packages update OK
2025-Sep-07 04:05:08 :: [x86_64 aarch64] update OK
2025-Sep-07 04:05:09 :: repo update OK
2025-Sep-07 04:05:20 :: repo save OK
2025-Sep-07 04:05:20 :: src index update OK
2025-Sep-07 04:05:21 :: updated /gears/l/llama.cpp.git branch `sisyphus'
2025-Sep-07 04:05:40 :: gears update OK
2025-Sep-07 04:05:40 :: task #394153 for sisyphus DONE