[#412943] p11 EPERM (try 2) llama.cpp.git=8470-alt1 del=python3-module-llama-cpp-python

Girar awaiter (vt) girar-builder at altlinux.org
Mon Mar 30 09:36:05 MSK 2026


https://git.altlinux.org/tasks/412943/logs/events.2.1.log
https://packages.altlinux.org/tasks/412943

2026-Mar-30 06:30:35 :: task #412943 for p11 resumed by vt:
2026-Mar-30 06:30:35 :: message: ALT#58422
#100 build 8470-alt1 from /gears/l/llama.cpp.git fetched at 2026-Mar-28 22:06:17 from sisyphus
#200 delete python3-module-llama-cpp-python
2026-Mar-30 06:30:36 :: [i586] #100 llama.cpp.git 8470-alt1: build start
2026-Mar-30 06:30:36 :: [aarch64] #100 llama.cpp.git 8470-alt1: build start
2026-Mar-30 06:30:36 :: [x86_64] #100 llama.cpp.git 8470-alt1: build start
2026-Mar-30 06:30:47 :: [i586] #100 llama.cpp.git 8470-alt1: build SKIPPED
2026-Mar-30 06:30:54 :: [aarch64] #100 llama.cpp.git 8470-alt1: build OK (cached)
build/100/x86_64/log:[00:05:38] debuginfo.req: WARNING: /usr/lib64/libcublas.so.12 is not yet debuginfo-enabled
build/100/x86_64/log:[00:05:39] debuginfo.req: WARNING: /usr/lib64/libcudart.so.12 is not yet debuginfo-enabled
2026-Mar-30 06:30:58 :: [x86_64] #100 llama.cpp.git 8470-alt1: build OK (cached)
2026-Mar-30 06:30:58 :: 100: build check OK (cached)
2026-Mar-30 06:30:59 :: build check OK
2026-Mar-30 06:31:11 :: #100: llama.cpp.git 8470-alt1: version check OK
2026-Mar-30 06:31:11 :: build version check OK
--- llama.cpp-cpu-8470-alt1.x86_64.rpm.share	2026-03-30 06:31:15.183362456 +0000
+++ llama.cpp-cpu-8470-alt1.aarch64.rpm.share	2026-03-30 06:31:16.478374001 +0000
@@ -8,3 +8,3 @@
 /usr/share/doc/llama.cpp/README.md	100644	UTF-8 Unicode English text, with very long lines
-/usr/share/doc/llama.cpp/build-options.txt	100644	ASCII English text, with very long lines
+/usr/share/doc/llama.cpp/build-options.txt	100644	ASCII English text
 /usr/share/doc/llama.cpp/docs	40755	directory
warning (#100): non-identical /usr/share part
2026-Mar-30 06:31:32 :: noarch check OK
2026-Mar-30 06:31:33 :: plan: src +1 -2 =20200, aarch64 +8 -9 =36013, x86_64 +10 -11 =36988
#100 llama.cpp 5753-alt1 -> 1:8470-alt1
 Sun Mar 22 2026 Vitaly Chikunov <vt at altlinux> 1:8470-alt1
 - Update to b8470 (2026-03-22).
 Tue Mar 03 2026 Vitaly Chikunov <vt at altlinux> 1:8192-alt1
 - Update to b8192 (2026-03-03).
 Fri Feb 13 2026 Vitaly Chikunov <vt at altlinux> 1:8018-alt1
 - Update to b8018 (2026-02-12).
 Sat Jan 24 2026 Vitaly Chikunov <vt at altlinux> 1:7819-alt1
 - Update to b7819 (2026-01-23).
 - Responses API support (partial).
 Sun Dec 14 2025 Vitaly Chikunov <vt at altlinux> 1:7388-alt1
 [...]
2026-Mar-30 06:32:18 :: patched apt indices
2026-Mar-30 06:32:28 :: created next repo
2026-Mar-30 06:32:38 :: duplicate provides check OK
2026-Mar-30 06:33:18 :: dependencies check OK
2026-Mar-30 06:33:55 :: [x86_64 aarch64] ELF symbols check OK
2026-Mar-30 06:34:08 :: [x86_64] #100 libllama: install check OK (cached)
2026-Mar-30 06:34:12 :: [x86_64] #100 libllama-debuginfo: install check OK (cached)
2026-Mar-30 06:34:14 :: [aarch64] #100 libllama: install check OK (cached)
	x86_64: libllama-devel=1:8470-alt1 post-install unowned files:
 /usr/lib64/cmake
2026-Mar-30 06:34:17 :: [x86_64] #100 libllama-devel: install check OK (cached)
2026-Mar-30 06:34:22 :: [aarch64] #100 libllama-debuginfo: install check OK (cached)
2026-Mar-30 06:34:23 :: [x86_64] #100 llama.cpp: install check OK (cached)
2026-Mar-30 06:34:27 :: [x86_64] #100 llama.cpp-cpu: install check OK (cached)
	aarch64: libllama-devel=1:8470-alt1 post-install unowned files:
 /usr/lib64/cmake
2026-Mar-30 06:34:30 :: [aarch64] #100 libllama-devel: install check OK (cached)
2026-Mar-30 06:34:32 :: [x86_64] #100 llama.cpp-cpu-debuginfo: install check OK (cached)
2026-Mar-30 06:34:38 :: [aarch64] #100 llama.cpp: install check OK (cached)
2026-Mar-30 06:34:38 :: [x86_64] #100 llama.cpp-cuda: install check OK (cached)
2026-Mar-30 06:34:44 :: [x86_64] #100 llama.cpp-cuda-debuginfo: install check OK (cached)
2026-Mar-30 06:34:46 :: [aarch64] #100 llama.cpp-cpu: install check OK (cached)
2026-Mar-30 06:34:49 :: [x86_64] #100 llama.cpp-vulkan: install check OK (cached)
2026-Mar-30 06:34:54 :: [aarch64] #100 llama.cpp-cpu-debuginfo: install check OK (cached)
2026-Mar-30 06:34:54 :: [x86_64] #100 llama.cpp-vulkan-debuginfo: install check OK (cached)
2026-Mar-30 06:35:01 :: [aarch64] #100 llama.cpp-vulkan: install check OK (cached)
2026-Mar-30 06:35:09 :: [aarch64] #100 llama.cpp-vulkan-debuginfo: install check OK (cached)
2026-Mar-30 06:35:27 :: [x86_64-i586] generated apt indices
2026-Mar-30 06:35:27 :: [x86_64-i586] created next repo
2026-Mar-30 06:35:38 :: [x86_64-i586] dependencies check OK
2026-Mar-30 06:35:39 :: gears inheritance check OK
2026-Mar-30 06:35:39 :: srpm inheritance check OK
girar-check-perms: access to @maint ALLOWED for cas: member of approved group
check-subtask-perms: #100: llama.cpp: approved by cas, needs an approval from a member of @tester group
girar-check-perms: access to python3-module-llama-cpp-python DENIED for vt: project `python3-module-llama-cpp-python' is not listed in the acl file for repository `p11', and the policy for such projects in `p11' is to deny
check-subtask-perms: #200: python3-module-llama-cpp-python: needs approvals from members of @maint and @tester groups
2026-Mar-30 06:35:42 :: acl check FAILED
2026-Mar-30 06:35:54 :: created contents_index files
2026-Mar-30 06:36:02 :: created hash files: aarch64 src x86_64
2026-Mar-30 06:36:05 :: task #412943 for p11 EPERM