Name | wu_2d2c1760-GIANNI_GPROTO7-0-1-RND8822_0 |
Workunit | 31542607 |
Created | 24 Sep 2025, 15:02:53 UTC |
Sent | 24 Sep 2025, 15:04:37 UTC |
Report deadline | 29 Sep 2025, 15:04:37 UTC |
Received | 24 Sep 2025, 15:08:02 UTC |
Server state | Over |
Outcome | Computation error |
Client state | Compute error |
Exit status | 195 (0x000000C3) EXIT_CHILD_FAILED |
Computer ID | 642471 |
Run time | 19 sec |
CPU time | 9 sec |
Validate state | Invalid |
Credit | 0.00 |
Device peak FLOPS | 53,039.31 GFLOPS |
Application version | LLM: LLMs for chemistry v1.01 (cuda124L) windows_x86_64 |
Peak working set size | 689.64 MB |
Peak swap size | 1.40 GB |
Peak disk usage | 6.01 GB |
<core_client_version>8.2.4</core_client_version>
<![CDATA[
<message>
The operating system cannot run (null). (0xc3) - exit code 195 (0xc3)
</message>
<stderr_txt>
17:06:16 (2524): wrapper (7.9.26016): starting
17:06:16 (2524): wrapper: running Library/usr/bin/tar.exe (xjvf input.tar.bz2)
tasks.json
run.bat
conf.yaml
main_generation-0.1.0-py3-none-any.whl
run.sh
17:06:17 (2524): Library/usr/bin/tar.exe exited; CPU time 0.000000
17:06:17 (2524): wrapper: running C:/Windows/system32/cmd.exe (/c call Scripts\activate.bat && Scripts\conda-unpack.exe && run.bat)
Generating train split: 0 examples [00:00, ? examples/s]
Generating train split: 2500 examples [00:00, 384939.79 examples/s]
C:\ProgramData\BOINC\slots\1\Lib\site-packages\torch\cuda\__init__.py:235: UserWarning:
NVIDIA GeForce RTX 5090 with CUDA capability sm_120 is not compatible with the current PyTorch installation.
The current PyTorch install supports CUDA capabilities sm_50 sm_60 sm_61 sm_70 sm_75 sm_80 sm_86 sm_90.
If you want to use the NVIDIA GeForce RTX 5090 GPU with PyTorch, please check the instructions at https://pytorch.org/get-started/locally/
  warnings.warn(
[W924 17:06:44.000000000 socket.cpp:759] [c10d] The client socket has failed to connect to [Medimoscas]:58776 (system error: 10049 - The requested address is not valid in this context.).
[rank0]: Traceback (most recent call last):
[rank0]:   File "wheel_contents/aiengine/main_generation.py", line 87, in <module>
[rank0]:   File "wheel_contents/aiengine/model.py", line 36, in __init__
[rank0]:   File "C:\ProgramData\BOINC\slots\1\Lib\site-packages\vllm\utils.py", line 1096, in inner
[rank0]:     return fn(*args, **kwargs)
[rank0]:            ^^^^^^^^^^^^^^^^^^^
[rank0]:   File "C:\ProgramData\BOINC\slots\1\Lib\site-packages\vllm\entrypoints\llm.py", line 243, in __init__
[rank0]:     self.llm_engine = LLMEngine.from_engine_args(
[rank0]:                       ^^^^^^^^^^^^^^^^^^^^^^^^^^^
[rank0]:   File "C:\ProgramData\BOINC\slots\1\Lib\site-packages\vllm\engine\llm_engine.py", line 521, in from_engine_args
[rank0]:     return engine_cls.from_vllm_config(
[rank0]:            ^^^^^^^^^^^^^^^^^^^^^^^^^^^^
[rank0]:   File "C:\ProgramData\BOINC\slots\1\Lib\site-packages\vllm\engine\llm_engine.py", line 497, in from_vllm_config
[rank0]:     return cls(
[rank0]:            ^^^^
[rank0]:   File "C:\ProgramData\BOINC\slots\1\Lib\site-packages\vllm\engine\llm_engine.py", line 281, in __init__
[rank0]:     self.model_executor = executor_class(vllm_config=vllm_config, )
[rank0]:                           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
[rank0]:   File "C:\ProgramData\BOINC\slots\1\Lib\site-packages\vllm\executor\executor_base.py", line 52, in __init__
[rank0]:     self._init_executor()
[rank0]:   File "C:\ProgramData\BOINC\slots\1\Lib\site-packages\vllm\executor\uniproc_executor.py", line 47, in _init_executor
[rank0]:     self.collective_rpc("load_model")
[rank0]:   File "C:\ProgramData\BOINC\slots\1\Lib\site-packages\vllm\executor\uniproc_executor.py", line 56, in collective_rpc
[rank0]:     answer = run_method(self.driver_worker, method, args, kwargs)
[rank0]:              ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
[rank0]:   File "C:\ProgramData\BOINC\slots\1\Lib\site-packages\vllm\utils.py", line 2359, in run_method
[rank0]:     return func(*args, **kwargs)
[rank0]:            ^^^^^^^^^^^^^^^^^^^^^
[rank0]:   File "C:\ProgramData\BOINC\slots\1\Lib\site-packages\vllm\worker\worker.py", line 184, in load_model
[rank0]:     self.model_runner.load_model()
[rank0]:   File "C:\ProgramData\BOINC\slots\1\Lib\site-packages\vllm\worker\model_runner.py", line 1113, in load_model
[rank0]:     self.model = get_model(vllm_config=self.vllm_config)
[rank0]:                  ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
[rank0]:   File "C:\ProgramData\BOINC\slots\1\Lib\site-packages\vllm\model_executor\model_loader\__init__.py", line 14, in get_model
[rank0]:     return loader.load_model(vllm_config=vllm_config)
[rank0]:            ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
[rank0]:   File "C:\ProgramData\BOINC\slots\1\Lib\site-packages\vllm\model_executor\model_loader\loader.py", line 1278, in load_model
[rank0]:     model = _initialize_model(vllm_config=vllm_config)
[rank0]:             ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
[rank0]:   File "C:\ProgramData\BOINC\slots\1\Lib\site-packages\vllm\model_executor\model_loader\loader.py", line 127, in _initialize_model
[rank0]:     return model_class(vllm_config=vllm_config, prefix=prefix)
[rank0]:            ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
[rank0]:   File "C:\ProgramData\BOINC\slots\1\Lib\site-packages\vllm\model_executor\models\qwen2.py", line 431, in __init__
[rank0]:     self.model = Qwen2Model(vllm_config=vllm_config,
[rank0]:                  ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
[rank0]:   File "C:\ProgramData\BOINC\slots\1\Lib\site-packages\vllm\compilation\decorators.py", line 151, in __init__
[rank0]:     old_init(self, vllm_config=vllm_config, prefix=prefix, **kwargs)
[rank0]:   File "C:\ProgramData\BOINC\slots\1\Lib\site-packages\vllm\model_executor\models\qwen2.py", line 300, in __init__
[rank0]:     self.start_layer, self.end_layer, self.layers = make_layers(
[rank0]:                                                     ^^^^^^^^^^^^
[rank0]:   File "C:\ProgramData\BOINC\slots\1\Lib\site-packages\vllm\model_executor\models\utils.py", line 610, in make_layers
[rank0]:     maybe_offload_to_cpu(layer_fn(prefix=f"{prefix}.{idx}"))
[rank0]:                          ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
[rank0]:   File "C:\ProgramData\BOINC\slots\1\Lib\site-packages\vllm\model_executor\models\qwen2.py", line 302, in <lambda>
[rank0]:     lambda prefix: Qwen2DecoderLayer(config=config,
[rank0]:                    ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
[rank0]:   File "C:\ProgramData\BOINC\slots\1\Lib\site-packages\vllm\model_executor\models\qwen2.py", line 206, in __init__
[rank0]:     self.self_attn = Qwen2Attention(
[rank0]:                      ^^^^^^^^^^^^^^^
[rank0]:   File "C:\ProgramData\BOINC\slots\1\Lib\site-packages\vllm\model_executor\models\qwen2.py", line 153, in __init__
[rank0]:     self.rotary_emb = get_rope(
[rank0]:                       ^^^^^^^^^
[rank0]:   File "C:\ProgramData\BOINC\slots\1\Lib\site-packages\vllm\model_executor\layers\rotary_embedding.py", line 1180, in get_rope
[rank0]:     rotary_emb = RotaryEmbedding(head_size, rotary_dim, max_position, base,
[rank0]:                  ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
[rank0]:   File "C:\ProgramData\BOINC\slots\1\Lib\site-packages\vllm\model_executor\layers\rotary_embedding.py", line 99, in __init__
[rank0]:     cache = self._compute_cos_sin_cache()
[rank0]:             ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
[rank0]:   File "C:\ProgramData\BOINC\slots\1\Lib\site-packages\vllm\model_executor\layers\rotary_embedding.py", line 116, in _compute_cos_sin_cache
[rank0]:     inv_freq = self._compute_inv_freq(self.base)
[rank0]:                ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
[rank0]:   File "C:\ProgramData\BOINC\slots\1\Lib\site-packages\vllm\model_executor\layers\rotary_embedding.py", line 110, in _compute_inv_freq
[rank0]:     inv_freq = 1.0 / (base**(torch.arange(
[rank0]:                              ^^^^^^^^^^^^^
[rank0]:   File "C:\ProgramData\BOINC\slots\1\Lib\site-packages\torch\utils\_device.py", line 104, in __torch_function__
[rank0]:     return func(*args, **kwargs)
[rank0]:            ^^^^^^^^^^^^^^^^^^^^^
[rank0]: RuntimeError: CUDA error: no kernel image is available for execution on the device
[rank0]: CUDA kernel errors might be asynchronously reported at some other API call, so the stacktrace below might be incorrect.
[rank0]: For debugging consider passing CUDA_LAUNCH_BLOCKING=1
[rank0]: Compile with `TORCH_USE_CUDA_DSA` to enable device-side assertions.
17:06:45 (2524): C:/Windows/system32/cmd.exe exited; CPU time 9.468750
17:06:45 (2524): app exit status: 0x16
17:06:45 (2524): called boinc_finish(195)
0 bytes in 0 Free Blocks.
256 bytes in 6 Normal Blocks.
1144 bytes in 1 CRT Blocks.
0 bytes in 0 Ignore Blocks.
0 bytes in 0 Client Blocks.
Largest number used: 0 bytes.
Total allocations: 661881 bytes.
Dumping objects ->
{1601524} normal block at 0x0000019606619900, 48 bytes long. Data: <PATH=C:\ProgramD> 50 41 54 48 3D 43 3A 5C 50 72 6F 67 72 61 6D 44
{1601513} normal block at 0x0000019606619DD0, 48 bytes long. Data: <HOME=C:\ProgramD> 48 4F 4D 45 3D 43 3A 5C 50 72 6F 67 72 61 6D 44
{1601502} normal block at 0x000001960661A3F0, 48 bytes long. Data: <TMP=C:\ProgramDa> 54 4D 50 3D 43 3A 5C 50 72 6F 67 72 61 6D 44 61
{1601491} normal block at 0x0000019606619890, 48 bytes long. Data: <TEMP=C:\ProgramD> 54 45 4D 50 3D 43 3A 5C 50 72 6F 67 72 61 6D 44
{1601480} normal block at 0x000001960661A070, 48 bytes long. Data: <TMPDIR=C:\Progra> 54 4D 50 44 49 52 3D 43 3A 5C 50 72 6F 67 72 61
{1601449} normal block at 0x00000196082CCF20, 64 bytes long. Data: <PATH=C:\ProgramD> 50 41 54 48 3D 43 3A 5C 50 72 6F 67 72 61 6D 44
..\api\boinc_api.cpp(309) : {1601436} normal block at 0x00000196065EFF10, 8 bytes long. Data: < – > 00 00 90 06 96 01 00 00
{1599912} normal block at 0x00000196065EFAB0, 8 bytes long. Data: < ßf – > 90 DF 66 06 96 01 00 00
..\zip\boinc_zip.cpp(122) : {303} normal block at 0x00000196065E26F0, 260 bytes long. Data: < > 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00
{288} normal block at 0x00000196065DF6A0, 80 bytes long. Data: </c call Scripts\> 2F 63 20 63 61 6C 6C 20 53 63 72 69 70 74 73 5C
{287} normal block at 0x00000196065F6DC0, 16 bytes long. Data: <Xv_ – > 58 76 5F 06 96 01 00 00 00 00 00 00 00 00 00 00
{286} normal block at 0x00000196065F6730, 16 bytes long. Data: <0v_ – > 30 76 5F 06 96 01 00 00 00 00 00 00 00 00 00 00
{285} normal block at 0x00000196065F6370, 16 bytes long. Data: < v_ – > 08 76 5F 06 96 01 00 00 00 00 00 00 00 00 00 00
{284} normal block at 0x00000196065F7180, 16 bytes long. Data: <àu_ – > E0 75 5F 06 96 01 00 00 00 00 00 00 00 00 00 00
{283} normal block at 0x00000196065F6690, 16 bytes long. Data: <¸u_ – > B8 75 5F 06 96 01 00 00 00 00 00 00 00 00 00 00
{282} normal block at 0x00000196065F7130, 16 bytes long. Data: < u_ – > 90 75 5F 06 96 01 00 00 00 00 00 00 00 00 00 00
{281} normal block at 0x00000196065F4AB0, 48 bytes long. Data: <ComSpec=C:\Windo> 43 6F 6D 53 70 65 63 3D 43 3A 5C 57 69 6E 64 6F
{280} normal block at 0x00000196065F69B0, 16 bytes long. Data: < b_ – > 08 62 5F 06 96 01 00 00 00 00 00 00 00 00 00 00
{279} normal block at 0x00000196065F0F70, 32 bytes long. Data: <SystemRoot=C:\Wi> 53 79 73 74 65 6D 52 6F 6F 74 3D 43 3A 5C 57 69
{278} normal block at 0x00000196065F6FF0, 16 bytes long. Data: <àa_ – > E0 61 5F 06 96 01 00 00 00 00 00 00 00 00 00 00
{276} normal block at 0x00000196065F6FA0, 16 bytes long. Data: <¸a_ – > B8 61 5F 06 96 01 00 00 00 00 00 00 00 00 00 00
{275} normal block at 0x00000196065F6D70, 16 bytes long. Data: < a_ – > 90 61 5F 06 96 01 00 00 00 00 00 00 00 00 00 00
{274} normal block at 0x00000196065F6D20, 16 bytes long. Data: <ha_ – > 68 61 5F 06 96 01 00 00 00 00 00 00 00 00 00 00
{273} normal block at 0x00000196065F70E0, 16 bytes long. Data: <@a_ – > 40 61 5F 06 96 01 00 00 00 00 00 00 00 00 00 00
{272} normal block at 0x00000196065F6960, 16 bytes long. Data: < a_ – > 18 61 5F 06 96 01 00 00 00 00 00 00 00 00 00 00
{271} normal block at 0x00000196065F1450, 32 bytes long. Data: <CUDA_DEVICE=0 PU> 43 55 44 41 5F 44 45 56 49 43 45 3D 30 00 50 55
{270} normal block at 0x00000196065F6F50, 16 bytes long. Data: <ð`_ – > F0 60 5F 06 96 01 00 00 00 00 00 00 00 00 00 00
{269} normal block at 0x00000196065F60F0, 320 bytes long. Data: <Po_ – P _ – > 50 6F 5F 06 96 01 00 00 50 14 5F 06 96 01 00 00
{268} normal block at 0x00000196065F6F00, 16 bytes long. Data: <pu_ – > 70 75 5F 06 96 01 00 00 00 00 00 00 00 00 00 00
{267} normal block at 0x00000196065F6E60, 16 bytes long. Data: <Hu_ – > 48 75 5F 06 96 01 00 00 00 00 00 00 00 00 00 00
{266} normal block at 0x00000196065F0E50, 32 bytes long. Data: <C:/Windows/syste> 43 3A 2F 57 69 6E 64 6F 77 73 2F 73 79 73 74 65
{265} normal block at 0x00000196065F6640, 16 bytes long. Data: < u_ – > 20 75 5F 06 96 01 00 00 00 00 00 00 00 00 00 00
{264} normal block at 0x00000196065F13F0, 32 bytes long. Data: <xjvf input.tar.b> 78 6A 76 66 20 69 6E 70 75 74 2E 74 61 72 2E 62
{263} normal block at 0x00000196065F7090, 16 bytes long. Data: <ht_ – > 68 74 5F 06 96 01 00 00 00 00 00 00 00 00 00 00
{262} normal block at 0x00000196065F6410, 16 bytes long. Data: <@t_ – > 40 74 5F 06 96 01 00 00 00 00 00 00 00 00 00 00
{261} normal block at 0x00000196065F72C0, 16 bytes long. Data: < t_ – > 18 74 5F 06 96 01 00 00 00 00 00 00 00 00 00 00
{260} normal block at 0x00000196065F7270, 16 bytes long. Data: <ðs_ – > F0 73 5F 06 96 01 00 00 00 00 00 00 00 00 00 00
{259} normal block at 0x00000196065F6CD0, 16 bytes long. Data: <Ès_ – > C8 73 5F 06 96 01 00 00 00 00 00 00 00 00 00 00
{258} normal block at 0x00000196065F6870, 16 bytes long. Data: < s_ – > A0 73 5F 06 96 01 00 00 00 00 00 00 00 00 00 00
{256} normal block at 0x00000196065F7220, 16 bytes long. Data: <ðH_ – > F0 48 5F 06 96 01 00 00 00 00 00 00 00 00 00 00
{255} normal block at 0x00000196065F48F0, 40 bytes long. Data: < r_ – Ï, – > 20 72 5F 06 96 01 00 00 20 CF 2C 08 96 01 00 00
{254} normal block at 0x00000196065F6B40, 16 bytes long. Data: < s_ – > 80 73 5F 06 96 01 00 00 00 00 00 00 00 00 00 00
{253} normal block at 0x00000196065F7040, 16 bytes long. Data: <Xs_ – > 58 73 5F 06 96 01 00 00 00 00 00 00 00 00 00 00
{252} normal block at 0x00000196065F1810, 32 bytes long. Data: <Library/usr/bin/> 4C 69 62 72 61 72 79 2F 75 73 72 2F 62 69 6E 2F
{251} normal block at 0x00000196065F6AF0, 16 bytes long. Data: <0s_ – > 30 73 5F 06 96 01 00 00 00 00 00 00 00 00 00 00
{250} normal block at 0x00000196065F7330, 992 bytes long. Data: <ðj_ – _ – > F0 6A 5F 06 96 01 00 00 10 18 5F 06 96 01 00 00
{94} normal block at 0x00000196065F12D0, 32 bytes long. Data: <windows_x86_64__> 77 69 6E 64 6F 77 73 5F 78 38 36 5F 36 34 5F 5F
{93} normal block at 0x00000196065EF2E0, 16 bytes long. Data: <ÀM_ – > C0 4D 5F 06 96 01 00 00 00 00 00 00 00 00 00 00
{92} normal block at 0x00000196065F4DC0, 40 bytes long. Data: <àò^ – Ð _ – > E0 F2 5E 06 96 01 00 00 D0 12 5F 06 96 01 00 00
{71} normal block at 0x00000196065EF560, 16 bytes long. Data: < ꢭ÷ > 80 EA A2 AD F7 7F 00 00 00 00 00 00 00 00 00 00
{70} normal block at 0x00000196065EF4C0, 16 bytes long. Data: <@颭÷ > 40 E9 A2 AD F7 7F 00 00 00 00 00 00 00 00 00 00
{69} normal block at 0x00000196065EF6A0, 16 bytes long. Data: <øWŸ­÷ > F8 57 9F AD F7 7F 00 00 00 00 00 00 00 00 00 00
{68} normal block at 0x00000196065EF100, 16 bytes long. Data: <ØWŸ­÷ > D8 57 9F AD F7 7F 00 00 00 00 00 00 00 00 00 00
{67} normal block at 0x00000196065EF5B0, 16 bytes long. Data: <P Ÿ­÷ > 50 04 9F AD F7 7F 00 00 00 00 00 00 00 00 00 00
{66} normal block at 0x00000196065EF1A0, 16 bytes long. Data: <0 Ÿ­÷ > 30 04 9F AD F7 7F 00 00 00 00 00 00 00 00 00 00
{65} normal block at 0x00000196065EFA60, 16 bytes long. Data: <à Ÿ­÷ > E0 02 9F AD F7 7F 00 00 00 00 00 00 00 00 00 00
{64} normal block at 0x00000196065EF420, 16 bytes long. Data: < Ÿ­÷ > 10 04 9F AD F7 7F 00 00 00 00 00 00 00 00 00 00
{63} normal block at 0x00000196065EFA10, 16 bytes long. Data: <p Ÿ­÷ > 70 04 9F AD F7 7F 00 00 00 00 00 00 00 00 00 00
{62} normal block at 0x00000196065EFBF0, 16 bytes long. Data: < À ­÷ > 18 C0 9D AD F7 7F 00 00 00 00 00 00 00 00 00 00
Object dump complete.
</stderr_txt>
]]>
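Note on the failure: the stderr points to a GPU/framework mismatch rather than a fault in the workunit itself. The host's NVIDIA GeForce RTX 5090 reports CUDA compute capability sm_120, while the bundled PyTorch build only ships kernels for sm_50 through sm_90, so the first CUDA kernel launch raises "no kernel image is available for execution on the device" and the wrapper exits with code 195 (EXIT_CHILD_FAILED). The sketch below is a minimal, hypothetical pre-flight check, not part of the project's wrapper or run.bat; it only assumes PyTorch is importable in the task's environment, and all names in it are illustrative.

    # Hypothetical pre-flight check: compare the GPU's compute capability with the
    # architectures compiled into the installed PyTorch build. Sketch only; not part
    # of the BOINC wrapper or the "LLMs for chemistry" app.
    import sys
    import torch

    def gpu_is_supported(device_index: int = 0) -> bool:
        """Return True if the installed PyTorch ships kernels for this GPU."""
        if not torch.cuda.is_available():
            print("No CUDA device visible to PyTorch.")
            return False
        major, minor = torch.cuda.get_device_capability(device_index)
        device_arch = f"sm_{major}{minor}"            # e.g. sm_120 for an RTX 5090
        compiled_archs = torch.cuda.get_arch_list()   # e.g. ['sm_50', ..., 'sm_90']
        supported = device_arch in compiled_archs
        print(f"device={torch.cuda.get_device_name(device_index)} "
              f"arch={device_arch} compiled_for={compiled_archs} supported={supported}")
        return supported

    if __name__ == "__main__":
        # Fail fast with a non-zero exit instead of loading the model and hitting
        # "CUDA error: no kernel image is available for execution on the device".
        sys.exit(0 if gpu_is_supported() else 1)

Run inside the slot's Python environment, a check like this would report arch=sm_120 against a compiled_for list ending at sm_90 and exit non-zero before vLLM tries to build the rotary-embedding cache on the GPU; the underlying fix is an application build whose PyTorch/vLLM includes Blackwell (sm_120) kernels.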