Task 38579434

Name: wu_83f1af30-GIANNI_GPROTO7-0-1-RND2208_0
Workunit: 31544813
Created: 30 Sep 2025, 6:17:43 UTC
Sent: 30 Sep 2025, 6:17:56 UTC
Report deadline: 5 Oct 2025, 6:17:56 UTC
Received: 30 Sep 2025, 8:09:07 UTC
Server state: Over
Outcome: Computation error
Client state: Compute error
Exit status: 195 (0x000000C3) EXIT_CHILD_FAILED
Computer ID: 644391
Run time: 3 min 25 sec
CPU time: 34 sec
Validate state: Invalid
Credit: 0.00
Device peak FLOPS: 55,489.00 GFLOPS
Application version: LLM: LLMs for chemistry v1.01 (cuda124L), windows_x86_64
Peak working set size: 776.12 MB
Peak swap size: 2.48 GB
Peak disk usage: 6.35 GB

Stderr output

<core_client_version>8.2.4</core_client_version>
<![CDATA[
<message>
The operating system cannot run (null).
 (0xc3) - exit code 195 (0xc3)</message>
<stderr_txt>
04:05:19 (17368): wrapper (7.9.26016): starting
04:05:19 (17368): wrapper: running Library/usr/bin/tar.exe (xjvf input.tar.bz2)
tasks.json
run.bat
conf.yaml
main_generation-0.1.0-py3-none-any.whl
run.sh
04:05:20 (17368): Library/usr/bin/tar.exe exited; CPU time 0.000000
04:05:20 (17368): wrapper: running C:/Windows/system32/cmd.exe (/c call Scripts\activate.bat && Scripts\conda-unpack.exe && run.bat)

Generating train split: 0 examples [00:00, ? examples/s]
Generating train split: 2500 examples [00:00, 332364.26 examples/s]
C:\ProgramData\BOINC\slots\0\Lib\site-packages\huggingface_hub\file_download.py:144: UserWarning: `huggingface_hub` cache-system uses symlinks by default to efficiently store duplicated files but your machine does not support them in C:\ProgramData\BOINC\slots\.cache\hub\models--Acellera--proto. Caching files will still work but in a degraded version that might require more space on your disk. This warning can be disabled by setting the `HF_HUB_DISABLE_SYMLINKS_WARNING` environment variable. For more details, see https://huggingface.co/docs/huggingface_hub/how-to-cache#limitations.
To support symlinks on Windows, you either need to activate Developer Mode or to run Python as an administrator. In order to activate developer mode, see this article: https://docs.microsoft.com/en-us/windows/apps/get-started/enable-your-device-for-development
  warnings.warn(message)
C:\ProgramData\BOINC\slots\0\Lib\site-packages\huggingface_hub\file_download.py:144: UserWarning: `huggingface_hub` cache-system uses symlinks by default to efficiently store duplicated files but your machine does not support them in C:\ProgramData\BOINC\slots\.cache\hub\models--unsloth--Qwen2.5-14B-Instruct-bnb-4bit. Caching files will still work but in a degraded version that might require more space on your disk. This warning can be disabled by setting the `HF_HUB_DISABLE_SYMLINKS_WARNING` environment variable. For more details, see https://huggingface.co/docs/huggingface_hub/how-to-cache#limitations.
To support symlinks on Windows, you either need to activate Developer Mode or to run Python as an administrator. In order to activate developer mode, see this article: https://docs.microsoft.com/en-us/windows/apps/get-started/enable-your-device-for-development
  warnings.warn(message)
C:\ProgramData\BOINC\slots\0\Lib\site-packages\torch\cuda\__init__.py:235: UserWarning: 
NVIDIA GeForce RTX 5090 with CUDA capability sm_120 is not compatible with the current PyTorch installation.
The current PyTorch install supports CUDA capabilities sm_50 sm_60 sm_61 sm_70 sm_75 sm_80 sm_86 sm_90.
If you want to use the NVIDIA GeForce RTX 5090 GPU with PyTorch, please check the instructions at https://pytorch.org/get-started/locally/

  warnings.warn(
[W930 04:07:04.000000000 socket.cpp:759] [c10d] The client socket has failed to connect to [MainGaming]:52041 (system error: 10049 - The requested address is not valid in its context.).
[rank0]: Traceback (most recent call last):
[rank0]:   File "wheel_contents/aiengine/main_generation.py", line 87, in <module>
[rank0]:   File "wheel_contents/aiengine/model.py", line 36, in __init__
[rank0]:   File "C:\ProgramData\BOINC\slots\0\Lib\site-packages\vllm\utils.py", line 1096, in inner
[rank0]:     return fn(*args, **kwargs)
[rank0]:            ^^^^^^^^^^^^^^^^^^^
[rank0]:   File "C:\ProgramData\BOINC\slots\0\Lib\site-packages\vllm\entrypoints\llm.py", line 243, in __init__
[rank0]:     self.llm_engine = LLMEngine.from_engine_args(
[rank0]:                       ^^^^^^^^^^^^^^^^^^^^^^^^^^^
[rank0]:   File "C:\ProgramData\BOINC\slots\0\Lib\site-packages\vllm\engine\llm_engine.py", line 521, in from_engine_args
[rank0]:     return engine_cls.from_vllm_config(
[rank0]:            ^^^^^^^^^^^^^^^^^^^^^^^^^^^^
[rank0]:   File "C:\ProgramData\BOINC\slots\0\Lib\site-packages\vllm\engine\llm_engine.py", line 497, in from_vllm_config
[rank0]:     return cls(
[rank0]:            ^^^^
[rank0]:   File "C:\ProgramData\BOINC\slots\0\Lib\site-packages\vllm\engine\llm_engine.py", line 281, in __init__
[rank0]:     self.model_executor = executor_class(vllm_config=vllm_config, )
[rank0]:                           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
[rank0]:   File "C:\ProgramData\BOINC\slots\0\Lib\site-packages\vllm\executor\executor_base.py", line 52, in __init__
[rank0]:     self._init_executor()
[rank0]:   File "C:\ProgramData\BOINC\slots\0\Lib\site-packages\vllm\executor\uniproc_executor.py", line 47, in _init_executor
[rank0]:     self.collective_rpc("load_model")
[rank0]:   File "C:\ProgramData\BOINC\slots\0\Lib\site-packages\vllm\executor\uniproc_executor.py", line 56, in collective_rpc
[rank0]:     answer = run_method(self.driver_worker, method, args, kwargs)
[rank0]:              ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
[rank0]:   File "C:\ProgramData\BOINC\slots\0\Lib\site-packages\vllm\utils.py", line 2359, in run_method
[rank0]:     return func(*args, **kwargs)
[rank0]:            ^^^^^^^^^^^^^^^^^^^^^
[rank0]:   File "C:\ProgramData\BOINC\slots\0\Lib\site-packages\vllm\worker\worker.py", line 184, in load_model
[rank0]:     self.model_runner.load_model()
[rank0]:   File "C:\ProgramData\BOINC\slots\0\Lib\site-packages\vllm\worker\model_runner.py", line 1113, in load_model
[rank0]:     self.model = get_model(vllm_config=self.vllm_config)
[rank0]:                  ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
[rank0]:   File "C:\ProgramData\BOINC\slots\0\Lib\site-packages\vllm\model_executor\model_loader\__init__.py", line 14, in get_model
[rank0]:     return loader.load_model(vllm_config=vllm_config)
[rank0]:            ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
[rank0]:   File "C:\ProgramData\BOINC\slots\0\Lib\site-packages\vllm\model_executor\model_loader\loader.py", line 1278, in load_model
[rank0]:     model = _initialize_model(vllm_config=vllm_config)
[rank0]:             ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
[rank0]:   File "C:\ProgramData\BOINC\slots\0\Lib\site-packages\vllm\model_executor\model_loader\loader.py", line 127, in _initialize_model
[rank0]:     return model_class(vllm_config=vllm_config, prefix=prefix)
[rank0]:            ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
[rank0]:   File "C:\ProgramData\BOINC\slots\0\Lib\site-packages\vllm\model_executor\models\qwen2.py", line 431, in __init__
[rank0]:     self.model = Qwen2Model(vllm_config=vllm_config,
[rank0]:                  ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
[rank0]:   File "C:\ProgramData\BOINC\slots\0\Lib\site-packages\vllm\compilation\decorators.py", line 151, in __init__
[rank0]:     old_init(self, vllm_config=vllm_config, prefix=prefix, **kwargs)
[rank0]:   File "C:\ProgramData\BOINC\slots\0\Lib\site-packages\vllm\model_executor\models\qwen2.py", line 300, in __init__
[rank0]:     self.start_layer, self.end_layer, self.layers = make_layers(
[rank0]:                                                     ^^^^^^^^^^^^
[rank0]:   File "C:\ProgramData\BOINC\slots\0\Lib\site-packages\vllm\model_executor\models\utils.py", line 610, in make_layers
[rank0]:     maybe_offload_to_cpu(layer_fn(prefix=f"{prefix}.{idx}"))
[rank0]:                          ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
[rank0]:   File "C:\ProgramData\BOINC\slots\0\Lib\site-packages\vllm\model_executor\models\qwen2.py", line 302, in <lambda>
[rank0]:     lambda prefix: Qwen2DecoderLayer(config=config,
[rank0]:                    ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
[rank0]:   File "C:\ProgramData\BOINC\slots\0\Lib\site-packages\vllm\model_executor\models\qwen2.py", line 206, in __init__
[rank0]:     self.self_attn = Qwen2Attention(
[rank0]:                      ^^^^^^^^^^^^^^^
[rank0]:   File "C:\ProgramData\BOINC\slots\0\Lib\site-packages\vllm\model_executor\models\qwen2.py", line 153, in __init__
[rank0]:     self.rotary_emb = get_rope(
[rank0]:                       ^^^^^^^^^
[rank0]:   File "C:\ProgramData\BOINC\slots\0\Lib\site-packages\vllm\model_executor\layers\rotary_embedding.py", line 1180, in get_rope
[rank0]:     rotary_emb = RotaryEmbedding(head_size, rotary_dim, max_position, base,
[rank0]:                  ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
[rank0]:   File "C:\ProgramData\BOINC\slots\0\Lib\site-packages\vllm\model_executor\layers\rotary_embedding.py", line 99, in __init__
[rank0]:     cache = self._compute_cos_sin_cache()
[rank0]:             ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
[rank0]:   File "C:\ProgramData\BOINC\slots\0\Lib\site-packages\vllm\model_executor\layers\rotary_embedding.py", line 116, in _compute_cos_sin_cache
[rank0]:     inv_freq = self._compute_inv_freq(self.base)
[rank0]:                ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
[rank0]:   File "C:\ProgramData\BOINC\slots\0\Lib\site-packages\vllm\model_executor\layers\rotary_embedding.py", line 110, in _compute_inv_freq
[rank0]:     inv_freq = 1.0 / (base**(torch.arange(
[rank0]:                              ^^^^^^^^^^^^^
[rank0]:   File "C:\ProgramData\BOINC\slots\0\Lib\site-packages\torch\utils\_device.py", line 104, in __torch_function__
[rank0]:     return func(*args, **kwargs)
[rank0]:            ^^^^^^^^^^^^^^^^^^^^^
[rank0]: RuntimeError: CUDA error: no kernel image is available for execution on the device
[rank0]: CUDA kernel errors might be asynchronously reported at some other API call, so the stacktrace below might be incorrect.
[rank0]: For debugging consider passing CUDA_LAUNCH_BLOCKING=1
[rank0]: Compile with `TORCH_USE_CUDA_DSA` to enable device-side assertions.

04:07:07 (17368): C:/Windows/system32/cmd.exe exited; CPU time 34.843750
04:07:07 (17368): app exit status: 0x16
04:07:07 (17368): called boinc_finish(195)
0 bytes in 0 Free Blocks.
256 bytes in 6 Normal Blocks.
1144 bytes in 1 CRT Blocks.
0 bytes in 0 Ignore Blocks.
0 bytes in 0 Client Blocks.
Largest number used: 0 bytes.
Total allocations: 1450657 bytes.
Dumping objects ->
{1601465} normal block at 0x000001D1EBD8DA60, 48 bytes long.
 Data: <PATH=C:\ProgramD> 50 41 54 48 3D 43 3A 5C 50 72 6F 67 72 61 6D 44 
{1601454} normal block at 0x000001D1EBD8DBB0, 48 bytes long.
 Data: <HOME=C:\ProgramD> 48 4F 4D 45 3D 43 3A 5C 50 72 6F 67 72 61 6D 44 
{1601443} normal block at 0x000001D1EBD8DD70, 48 bytes long.
 Data: <TMP=C:\ProgramDa> 54 4D 50 3D 43 3A 5C 50 72 6F 67 72 61 6D 44 61 
{1601432} normal block at 0x000001D1EBD8DDE0, 48 bytes long.
 Data: <TEMP=C:\ProgramD> 54 45 4D 50 3D 43 3A 5C 50 72 6F 67 72 61 6D 44 
{1601421} normal block at 0x000001D1EBD8D7C0, 48 bytes long.
 Data: <TMPDIR=C:\Progra> 54 4D 50 44 49 52 3D 43 3A 5C 50 72 6F 67 72 61 
{1601390} normal block at 0x000001D1EBE31D10, 64 bytes long.
 Data: <PATH=C:\ProgramD> 50 41 54 48 3D 43 3A 5C 50 72 6F 67 72 61 6D 44 
..\api\boinc_api.cpp(309) : {1601377} normal block at 0x000001D1EBD61010, 8 bytes long.
 Data: <  ÐëÑ   > 00 00 D0 EB D1 01 00 00
{1599912} normal block at 0x000001D1EBD60DE0, 8 bytes long.
 Data: <pëãëÑ   > 70 EB E3 EB D1 01 00 00
..\zip\boinc_zip.cpp(122) : {303} normal block at 0x000001D1EBD56120, 260 bytes long.
 Data: <                > 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00
{288} normal block at 0x000001D1EBD54180, 80 bytes long.
 Data: </c call Scripts\> 2F 63 20 63 61 6C 6C 20 53 63 72 69 70 74 73 5C
{287} normal block at 0x000001D1EBD6A720, 16 bytes long.
 Data: <8­ÖëÑ           > 38 AD D6 EB D1 01 00 00 00 00 00 00 00 00 00 00
{286} normal block at 0x000001D1EBD6A0E0, 16 bytes long.
 Data: < ­ÖëÑ           > 10 AD D6 EB D1 01 00 00 00 00 00 00 00 00 00 00
{285} normal block at 0x000001D1EBD6A950, 16 bytes long.
 Data: <è¬ÖëÑ           > E8 AC D6 EB D1 01 00 00 00 00 00 00 00 00 00 00
{284} normal block at 0x000001D1EBD6A090, 16 bytes long.
 Data: <À¬ÖëÑ           > C0 AC D6 EB D1 01 00 00 00 00 00 00 00 00 00 00
{283} normal block at 0x000001D1EBD69EB0, 16 bytes long.
 Data: < ¬ÖëÑ           > 98 AC D6 EB D1 01 00 00 00 00 00 00 00 00 00 00
{282} normal block at 0x000001D1EBD69C80, 16 bytes long.
 Data: <p¬ÖëÑ           > 70 AC D6 EB D1 01 00 00 00 00 00 00 00 00 00 00
{281} normal block at 0x000001D1EBD63350, 48 bytes long.
 Data: <ComSpec=C:\Windo> 43 6F 6D 53 70 65 63 3D 43 3A 5C 57 69 6E 64 6F 
{280} normal block at 0x000001D1EBD69C30, 16 bytes long.
 Data: < èÔëÑ           > 98 E8 D4 EB D1 01 00 00 00 00 00 00 00 00 00 00
{279} normal block at 0x000001D1EBD64DE0, 32 bytes long.
 Data: <SystemRoot=C:\Wi> 53 79 73 74 65 6D 52 6F 6F 74 3D 43 3A 5C 57 69 
{278} normal block at 0x000001D1EBD6A400, 16 bytes long.
 Data: <pèÔëÑ           > 70 E8 D4 EB D1 01 00 00 00 00 00 00 00 00 00 00
{276} normal block at 0x000001D1EBD6A4F0, 16 bytes long.
 Data: <HèÔëÑ           > 48 E8 D4 EB D1 01 00 00 00 00 00 00 00 00 00 00
{275} normal block at 0x000001D1EBD6A6D0, 16 bytes long.
 Data: < èÔëÑ           > 20 E8 D4 EB D1 01 00 00 00 00 00 00 00 00 00 00
{274} normal block at 0x000001D1EBD69AA0, 16 bytes long.
 Data: <øçÔëÑ           > F8 E7 D4 EB D1 01 00 00 00 00 00 00 00 00 00 00
{273} normal block at 0x000001D1EBD6A4A0, 16 bytes long.
 Data: <ÐçÔëÑ           > D0 E7 D4 EB D1 01 00 00 00 00 00 00 00 00 00 00
{272} normal block at 0x000001D1EBD6A630, 16 bytes long.
 Data: <¨çÔëÑ           > A8 E7 D4 EB D1 01 00 00 00 00 00 00 00 00 00 00
{271} normal block at 0x000001D1EBD64BA0, 32 bytes long.
 Data: <CUDA_DEVICE=0 PU> 43 55 44 41 5F 44 45 56 49 43 45 3D 30 00 50 55 
{270} normal block at 0x000001D1EBD6A450, 16 bytes long.
 Data: < çÔëÑ           > 80 E7 D4 EB D1 01 00 00 00 00 00 00 00 00 00 00
{269} normal block at 0x000001D1EBD4E780, 320 bytes long.
 Data: <P¤ÖëÑ    KÖëÑ   > 50 A4 D6 EB D1 01 00 00 A0 4B D6 EB D1 01 00 00
{268} normal block at 0x000001D1EBD69E10, 16 bytes long.
 Data: <P¬ÖëÑ           > 50 AC D6 EB D1 01 00 00 00 00 00 00 00 00 00 00
{267} normal block at 0x000001D1EBD69D20, 16 bytes long.
 Data: <(¬ÖëÑ           > 28 AC D6 EB D1 01 00 00 00 00 00 00 00 00 00 00
{266} normal block at 0x000001D1EBD650E0, 32 bytes long.
 Data: <C:/Windows/syste> 43 3A 2F 57 69 6E 64 6F 77 73 2F 73 79 73 74 65 
{265} normal block at 0x000001D1EBD69A50, 16 bytes long.
 Data: < ¬ÖëÑ           > 00 AC D6 EB D1 01 00 00 00 00 00 00 00 00 00 00
{264} normal block at 0x000001D1EBD64720, 32 bytes long.
 Data: <xjvf input.tar.b> 78 6A 76 66 20 69 6E 70 75 74 2E 74 61 72 2E 62 
{263} normal block at 0x000001D1EBD69FF0, 16 bytes long.
 Data: <H«ÖëÑ           > 48 AB D6 EB D1 01 00 00 00 00 00 00 00 00 00 00
{262} normal block at 0x000001D1EBD69E60, 16 bytes long.
 Data: < «ÖëÑ           > 20 AB D6 EB D1 01 00 00 00 00 00 00 00 00 00 00
{261} normal block at 0x000001D1EBD69FA0, 16 bytes long.
 Data: <øªÖëÑ           > F8 AA D6 EB D1 01 00 00 00 00 00 00 00 00 00 00
{260} normal block at 0x000001D1EBD6A310, 16 bytes long.
 Data: <ÐªÖëÑ           > D0 AA D6 EB D1 01 00 00 00 00 00 00 00 00 00 00
{259} normal block at 0x000001D1EBD69F50, 16 bytes long.
 Data: <¨ªÖëÑ           > A8 AA D6 EB D1 01 00 00 00 00 00 00 00 00 00 00
{258} normal block at 0x000001D1EBD69DC0, 16 bytes long.
 Data: < ªÖëÑ           > 80 AA D6 EB D1 01 00 00 00 00 00 00 00 00 00 00
{256} normal block at 0x000001D1EBD6A360, 16 bytes long.
 Data: <ð.ÖëÑ           > F0 2E D6 EB D1 01 00 00 00 00 00 00 00 00 00 00
{255} normal block at 0x000001D1EBD62EF0, 40 bytes long.
 Data: <`£ÖëÑ     ãëÑ   > 60 A3 D6 EB D1 01 00 00 10 1D E3 EB D1 01 00 00
{254} normal block at 0x000001D1EBD69BE0, 16 bytes long.
 Data: <`ªÖëÑ           > 60 AA D6 EB D1 01 00 00 00 00 00 00 00 00 00 00
{253} normal block at 0x000001D1EBD6A040, 16 bytes long.
 Data: <8ªÖëÑ           > 38 AA D6 EB D1 01 00 00 00 00 00 00 00 00 00 00
{252} normal block at 0x000001D1EBD64360, 32 bytes long.
 Data: <Library/usr/bin/> 4C 69 62 72 61 72 79 2F 75 73 72 2F 62 69 6E 2F 
{251} normal block at 0x000001D1EBD6A590, 16 bytes long.
 Data: < ªÖëÑ           > 10 AA D6 EB D1 01 00 00 00 00 00 00 00 00 00 00
{250} normal block at 0x000001D1EBD6AA10, 992 bytes long.
 Data: < ¥ÖëÑ   `CÖëÑ   > 90 A5 D6 EB D1 01 00 00 60 43 D6 EB D1 01 00 00
{94} normal block at 0x000001D1EBD64C00, 32 bytes long.
 Data: <windows_x86_64__> 77 69 6E 64 6F 77 73 5F 78 38 36 5F 36 34 5F 5F 
{93} normal block at 0x000001D1EBD61150, 16 bytes long.
 Data: <`/ÖëÑ           > 60 2F D6 EB D1 01 00 00 00 00 00 00 00 00 00 00
{92} normal block at 0x000001D1EBD62F60, 40 bytes long.
 Data: <P ÖëÑ    LÖëÑ   > 50 11 D6 EB D1 01 00 00 00 4C D6 EB D1 01 00 00
{71} normal block at 0x000001D1EBD60A70, 16 bytes long.
 Data: < ê  ö           > 80 EA 20 20 F6 7F 00 00 00 00 00 00 00 00 00 00
{70} normal block at 0x000001D1EBD61100, 16 bytes long.
 Data: <@é  ö           > 40 E9 20 20 F6 7F 00 00 00 00 00 00 00 00 00 00
{69} normal block at 0x000001D1EBD610B0, 16 bytes long.
 Data: <øW  ö           > F8 57 1D 20 F6 7F 00 00 00 00 00 00 00 00 00 00
{68} normal block at 0x000001D1EBD60D90, 16 bytes long.
 Data: <ØW  ö           > D8 57 1D 20 F6 7F 00 00 00 00 00 00 00 00 00 00
{67} normal block at 0x000001D1EBD616A0, 16 bytes long.
 Data: <P   ö           > 50 04 1D 20 F6 7F 00 00 00 00 00 00 00 00 00 00
{66} normal block at 0x000001D1EBD61330, 16 bytes long.
 Data: <0   ö           > 30 04 1D 20 F6 7F 00 00 00 00 00 00 00 00 00 00
{65} normal block at 0x000001D1EBD615B0, 16 bytes long.
 Data: <à   ö           > E0 02 1D 20 F6 7F 00 00 00 00 00 00 00 00 00 00
{64} normal block at 0x000001D1EBD61240, 16 bytes long.
 Data: <    ö           > 10 04 1D 20 F6 7F 00 00 00 00 00 00 00 00 00 00
{63} normal block at 0x000001D1EBD61600, 16 bytes long.
 Data: <p   ö           > 70 04 1D 20 F6 7F 00 00 00 00 00 00 00 00 00 00
{62} normal block at 0x000001D1EBD60CF0, 16 bytes long.
 Data: < À  ö           > 18 C0 1B 20 F6 7F 00 00 00 00 00 00 00 00 00 00
Object dump complete.

</stderr_txt>
]]>
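
The traceback above shows the task failing while vLLM loads the Qwen2.5 model: PyTorch warns that the host's GeForce RTX 5090 reports compute capability sm_120, while this build only ships kernels for sm_50 through sm_90, and the first GPU kernel launch (the torch.arange call inside vLLM's rotary-embedding setup) then fails with "no kernel image is available for execution on the device". A minimal sketch in Python of how that mismatch can be checked up front, assuming the same PyTorch build that the cuda124L app version bundles (the script is illustrative and not part of the task's files):

    # Compare the GPU's compute capability with the architectures the
    # installed PyTorch build was compiled for. No kernels are launched here,
    # so this runs even on a mismatched build.
    import torch

    if torch.cuda.is_available():
        major, minor = torch.cuda.get_device_capability(0)   # RTX 5090 reports (12, 0)
        gpu_arch = f"sm_{major}{minor}"
        build_archs = torch.cuda.get_arch_list()              # e.g. ['sm_50', ..., 'sm_90']
        print(f"GPU arch: {gpu_arch}; build supports: {build_archs}")
        # Simple membership test; builds that embed PTX for an older arch can
        # sometimes still JIT for a newer GPU, but this log shows that did not happen.
        if gpu_arch not in build_archs:
            print("Mismatch: expect 'no kernel image is available' on any kernel launch.")
    else:
        print("CUDA is not available to this PyTorch build.")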

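The same failure can be reproduced outside vLLM with any small kernel launch; the stderr also suggests setting CUDA_LAUNCH_BLOCKING=1 so the error is reported at the offending call rather than asynchronously. A minimal sketch, again assuming the slot's Python environment rather than anything shipped with the task:

    # Reproduce the 'no kernel image' error with the smallest possible kernel launch.
    import os
    os.environ["CUDA_LAUNCH_BLOCKING"] = "1"   # per the log's hint: report CUDA errors synchronously
    import torch

    try:
        # Mirrors the failing call in vLLM's RotaryEmbedding._compute_inv_freq,
        # which builds inv_freq with torch.arange on the CUDA device.
        x = torch.arange(0, 128, 2, dtype=torch.float32, device="cuda")
        print("Kernel launched fine:", x.shape)
    except RuntimeError as err:
        print("Reproduced:", err)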

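The huggingface_hub symlink warnings earlier in the log are unrelated to the failure: caching still works, only without deduplication, and the warning text itself names the switch that silences it. A minimal sketch of that switch, for hosts where enabling Developer Mode is not an option:

    # Silence the (harmless) Windows cache-symlink warning, as the warning text suggests.
    # This has no effect on the CUDA 'no kernel image' error above.
    import os
    os.environ["HF_HUB_DISABLE_SYMLINKS_WARNING"] = "1"   # must be set before huggingface_hub is imported

    from huggingface_hub import snapshot_download   # later cache writes no longer emit the warning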