[cyber] I: Sisyphus-20231022 x86_64 beehive_status: +11 -10 (144)
ALT beekeeper
hiver at altlinux.org
Sun Oct 22 11:42:20 MSK 2023
11 NEW error logs
automake_1.16-1.16.5-alt1
# SKIP: 134
# XFAIL: 38
# FAIL: 1
# XPASS: 0
# ERROR: 0
See ./test-suite.log
curl-8.4.0-alt1
TESTDONE: 1406 tests out of 1407 reported OK: 99%
TESTFAIL: These test cases failed: 1474
make[1]: *** [Makefile:829: full-test] Error 1
perl-Data-Dump-Streamer-2.42-alt1
* Cpanel::JSON::XS is not installed
ERRORS/WARNINGS FOUND IN PREREQUISITES. You may wish to install the versions
of the modules indicated above before proceeding with this installation
--
Result: FAIL
Failed 1/26 test programs. 0/365 subtests failed.
perl-Mail-Box-3.010-alt1
t/504parser-bodyd.t ...... ok
# Failed test '1 lines 0'
# at t/505parser-bodymp.t line 68.
--
# expected: 21
# Failed test '1 lines 1'
# at t/505parser-bodymp.t line 68.
--
# expected: 82
# Failed test '1 lines 2'
# at t/505parser-bodymp.t line 68.
--
t/505parser-bodymp.t (Wstat: 768 Tests: 313 Failed: 3)
Failed tests: 45, 58, 65
Non-zero exit status: 3
--
Result: FAIL
Failed 1/42 test programs. 3/5339 subtests failed.
make: *** [Makefile:863: test_dynamic] Error 3
python3-module-easyprocess-1.1-alt2
tests/test_fast/test_proc.py::test_start3 PASSED [ 21%]
tests/test_fast/test_proc.py::test_alive FAILED [ 23%]
tests/test_fast/test_proc.py::test_std PASSED [ 26%]
--
tests/test_fast/test_proc.py::test_wrap PASSED [ 31%]
tests/test_fast/test_proc.py::test_with FAILED [ 34%]
tests/test_fast/test_proc.py::test_parse PASSED [ 36%]
--
=========================== short test summary info ============================
FAILED tests/test_fast/test_proc.py::test_alive - assert False
FAILED tests/test_fast/test_proc.py::test_with - assert False
=================== 2 failed, 36 passed in 77.84s (0:01:17) ====================
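Both failures are bare "assert False", i.e. a liveness check on a started child process came back negative. The test bodies are not shown above; the sketch below only illustrates the kind of check test_alive and test_with make, with an arbitrary sleep command standing in for whatever the real tests spawn.

    from easyprocess import EasyProcess

    # test_alive: start a long-running child, check it is still running, stop it.
    proc = EasyProcess(['sleep', '10']).start()
    print(proc.is_alive())   # the test presumably asserts this is True
    proc.stop()

    # test_with: the same check through the context-manager form,
    # which starts the process on entry and stops it on exit.
    with EasyProcess(['sleep', '10']) as proc:
        print(proc.is_alive())

If is_alive() returns False for a freshly started child inside the hasher chroot, it shows up exactly as the two "assert False" failures above.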
python3-module-pulsectl-asyncio-1.1.1-alt1
tests/test_async_with_dummy_instance.py::AsyncDummyTests::test_get_peak_sample
hasher-privd: parent: handle_io: idle time limit (3600 seconds) exceeded
python3-module-pydantic-2.4.2-alt1
f'Expected hash {_EXPECTED_NORTH_STAR_DATA_MD5} for north star data, but generated
{data_md5}'
E ValueError: Expected hash e0fb021af00010f90e9348d8c7fc8da4 for north
star data, but generated 0ff34599a0861026cf25b6cdbb4bbe81
tests/benchmarks/test_north_star.py:93: ValueError
______________ ERROR at setup of test_north_star_validate_python _______________
@pytest.fixture(scope='module')
--
f'Expected hash {_EXPECTED_NORTH_STAR_DATA_MD5} for north star data, but generated
{data_md5}'
E ValueError: Expected hash e0fb021af00010f90e9348d8c7fc8da4 for north
star data, but generated 0ff34599a0861026cf25b6cdbb4bbe81
tests/benchmarks/test_north_star.py:93: ValueError
___________ ERROR at setup of test_north_star_validate_python_strict ___________
@pytest.fixture(scope='module')
--
f'Expected hash {_EXPECTED_NORTH_STAR_DATA_MD5} for north star data, but generated
{data_md5}'
E ValueError: Expected hash e0fb021af00010f90e9348d8c7fc8da4 for north
star data, but generated 0ff34599a0861026cf25b6cdbb4bbe81
tests/benchmarks/test_north_star.py:93: ValueError
________________ ERROR at setup of test_north_star_dump_python _________________
@pytest.fixture(scope='module')
--
f'Expected hash {_EXPECTED_NORTH_STAR_DATA_MD5} for north star data, but generated
{data_md5}'
E ValueError: Expected hash e0fb021af00010f90e9348d8c7fc8da4 for north
star data, but generated 0ff34599a0861026cf25b6cdbb4bbe81
tests/benchmarks/test_north_star.py:93: ValueError
_________________ ERROR at setup of test_north_star_json_loads _________________
@pytest.fixture(scope='module')
--
f'Expected hash {_EXPECTED_NORTH_STAR_DATA_MD5} for north star data, but generated
{data_md5}'
E ValueError: Expected hash e0fb021af00010f90e9348d8c7fc8da4 for north
star data, but generated 0ff34599a0861026cf25b6cdbb4bbe81
tests/benchmarks/test_north_star.py:93: ValueError
_________________ ERROR at setup of test_north_star_json_dumps _________________
@pytest.fixture(scope='module')
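All of these setup errors come from a single module-scoped fixture: it regenerates the benchmark ("north star") dataset, hashes it, and raises when the MD5 no longer matches the constant recorded in the test. A minimal sketch of that guard follows; only the names and the message text are taken from the log, and generate_data() is a stand-in for the real generator, which is not shown here.

    import hashlib
    import json

    import pytest

    _EXPECTED_NORTH_STAR_DATA_MD5 = 'e0fb021af00010f90e9348d8c7fc8da4'  # value from the log

    def generate_data():
        # Stand-in for the real benchmark-data generator (not shown in the log).
        return [{'id': i, 'name': f'user{i}'} for i in range(1000)]

    @pytest.fixture(scope='module')
    def north_star_data():
        data = generate_data()
        data_md5 = hashlib.md5(json.dumps(data).encode()).hexdigest()
        if data_md5 != _EXPECTED_NORTH_STAR_DATA_MD5:
            # The ValueError repeated above: the regenerated data no longer
            # hashes to the recorded value.
            raise ValueError(
                f'Expected hash {_EXPECTED_NORTH_STAR_DATA_MD5} for north star data, '
                f'but generated {data_md5}'
            )
        return data

Because the fixture is module-scoped, one hash mismatch surfaces as a setup ERROR for every north-star test rather than as individual failures.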
python3-module-pyshark-0.6-alt1
==================================== ERRORS ====================================
________________ ERROR at teardown of test_count_packets[False] ________________
request = <SubRequest 'simple_xml_and_json_capture' for <Function
test_count_packets[False]>>
--
return await asyncio.wait_for(process.wait(), 1)
except asyncTimeoutError:
self._log.debug(
"Waiting for process to close failed, may have zombie process.")
except ProcessLookupError:
pass
except OSError:
if os.name != "nt":
--
raise RuntimeError('This event loop is already running')
RuntimeError: This event loop is already running
warnings.warn(pytest.PytestUnraisableExceptionWarning(msg))
--
=========================== short test summary info ============================
ERROR ../tests/test_basic_parsing.py::test_count_packets[False] - pyshark.cap...
=================== 73 passed, 2 warnings, 1 error in 20.75s ===================
ERROR: InvocationError for command
/usr/src/RPM/BUILD/python3-module-pyshark-0.6/src/.tox/py3/bin/pytest .. -s (exited
with code 1)
py3 finish: run-test after 20.96 seconds
--
___________________________________ summary ____________________________________
ERROR: py3: commands failed
cleanup
/usr/src/RPM/BUILD/python3-module-pyshark-0.6/src/.tox/.tmp/package/1/pyshark-0.6-py3-none-any.whl
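The teardown error is an asyncio re-entrancy problem: while the capture cleanup above is awaiting the tshark child process, a synchronous code path tries to drive the same, already running, event loop again (most likely via run_until_complete). This is not pyshark code, just a minimal illustration of how CPython produces that exact RuntimeError.

    import asyncio

    async def main():
        loop = asyncio.get_running_loop()
        coro = asyncio.sleep(0)
        try:
            # Re-entering a loop that is already running is rejected outright.
            loop.run_until_complete(coro)
        except RuntimeError as exc:
            print(exc)   # -> This event loop is already running
        finally:
            coro.close() # avoid a "coroutine was never awaited" warning

    asyncio.run(main())

An exception raised in such a context cannot propagate normally, which is why pytest reports it through the PytestUnraisableExceptionWarning seen above.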
python3-module-websockets-11.0.3-alt1
....................................................................................................................................................................................................................................................ss......s.....................................................................................................................s......................................................................................................................................s.............................................................................................................E.EFsFF..................s...........................................E.sFssF..................s...........................E.E..................................................................
hasher-privd: parent: handle_io: idle time limit (3600 seconds) exceeded
qemu-checkinstall-2-alt1
2.70user 1.53system 0:03.58elapsed 118%CPU (0avgtext+0avgdata 3354308maxresident)k
0inputs+0outputs (0major+15236minor)pagefaults 0swaps
+ toilet -f bigascii12 TCG
+ '[' -d /usr/lib/vm-run ']'
+ timeout 300 vm-run --tcg toilet -w222 'Native TCG OK'
+ time qemu-system-x86_64 -accel tcg -m 178369 -smp cores=4 -serial mon:stdio -nodefaults
-nographic -no-reboot -fsdev local,id=root,path=/,security_model=none,multidevs=remap
-device virtio-9p-pci,fsdev=root,mount_tag=virtio-9p:/ -device virtio-rng-pci -kernel
/boot/vmlinuz-6.5.8-un-def-alt1 -initrd /usr/src/tmp/initramfs-6.5.8-un-def-alt1.img
-sandbox on,spawn=deny -bios bios.bin -append 'console=ttyS0 mitigations=off nokaslr
quiet panic=-1 SCRIPT=/usr/src/tmp/vm.GujgIWwC4J no_timer_check'
Command terminated by signal 11
10.71user 0.54system 0:10.75elapsed 104%CPU (0avgtext+0avgdata 3344536maxresident)k
0inputs+0outputs (0major+17840minor)pagefaults 0swaps
NOTICE: This is a crash of qemu-system-x86_64, not of the Linux kernel!
watchexec-1.22.3-alt1
test globs ... ok
test scopes ... FAILED
failures:
--
scopes
test result: FAILED. 5 passed; 1 failed; 0 ignored; 0 measured; 0 filtered out; finished
in 0.00s
error: test failed, to rerun pass `-p watchexec-filterer-ignore --test filtering`
10 error logs REMOVED from the list
codeblocks-20.03-alt9
efi-memtest86-5.0-alt3
fleet-commander-admin-0.15.1-alt12
fprintd-1.94.2-alt1
guile22-2.2.7-alt1
libgupnp-igd-1.6.0-alt1
prometheus-simpleclient-java-0.12.0-alt1_4jpp11
python3-module-Cython-0.29.36-alt1
python3-module-oslo.rootwrap-7.0.1-alt1.1
python3-module-typer-0.9.0-alt1
Total 144 error logs.
More information about the Sisyphus-cybertalk mailing list