[IR] Implement replace_all_uses_with #6125

GitHub Actions / Test Results failed Apr 20, 2024 in 0s

8 fail, 1 877 skipped, 3 955 pass in 3h 47m 28s

     28 files      28 suites   3h 47m 28s ⏱️
  5 840 tests    3 955 ✅ passed    1 877 💤 skipped     8 ❌ failed
508 973 runs   80 590 ✅ passed  428 350 💤 skipped    33 ❌ failed

Results for commit 796b46c.

Annotations

Check warning on line 0 in tests.function_libs.torch_lib.ops_test.TestOutputConsistencyFullGraphCPU

1 out of 27 runs failed: test_output_match_opinfo__linalg_vector_norm_cpu_float32 (tests.function_libs.torch_lib.ops_test.TestOutputConsistencyFullGraphCPU)

artifacts/Test Results (py310-ubuntu-latest)/pytest.xml [took 46s]
Raw output
tests/function_libs/torch_lib/ops_test.py:242: in run_test_output_match
    function_output = function_executor(test_name, reference_torch_outputs)(
tests/function_libs/torch_lib/ops_test_common.py:546: in _capture_graph_and_evaluate_torch_script_evaluator
    return _safe_ort_session_run(onnx_model.SerializeToString(), ort_inputs)
tests/function_libs/torch_lib/ops_test_common.py:353: in _safe_ort_session_run
    return_dict = manager.dict()
/opt/hostedtoolcache/Python/3.10.14/x64/lib/python3.10/multiprocessing/managers.py:724: in temp
    proxy = proxytype(
/opt/hostedtoolcache/Python/3.10.14/x64/lib/python3.10/multiprocessing/managers.py:792: in __init__
    self._incref()
/opt/hostedtoolcache/Python/3.10.14/x64/lib/python3.10/multiprocessing/managers.py:846: in _incref
    conn = self._Client(self._token.address, authkey=self._authkey)
/opt/hostedtoolcache/Python/3.10.14/x64/lib/python3.10/multiprocessing/connection.py:508: in Client
    answer_challenge(c, authkey)
/opt/hostedtoolcache/Python/3.10.14/x64/lib/python3.10/multiprocessing/connection.py:752: in answer_challenge
    message = connection.recv_bytes(256)         # reject large message
/opt/hostedtoolcache/Python/3.10.14/x64/lib/python3.10/multiprocessing/connection.py:216: in recv_bytes
    buf = self._recv_bytes(maxlength)
/opt/hostedtoolcache/Python/3.10.14/x64/lib/python3.10/multiprocessing/connection.py:414: in _recv_bytes
    buf = self._recv(4)
/opt/hostedtoolcache/Python/3.10.14/x64/lib/python3.10/multiprocessing/connection.py:383: in _recv
    raise EOFError
E   EOFError
tests/function_libs/torch_lib/ops_test.py:242: in run_test_output_match
    function_output = function_executor(test_name, reference_torch_outputs)(
tests/function_libs/torch_lib/ops_test_common.py:546: in _capture_graph_and_evaluate_torch_script_evaluator
    return _safe_ort_session_run(onnx_model.SerializeToString(), ort_inputs)
tests/function_libs/torch_lib/ops_test_common.py:352: in _safe_ort_session_run
    manager = multiprocessing.Manager()
/opt/hostedtoolcache/Python/3.10.14/x64/lib/python3.10/multiprocessing/context.py:57: in Manager
    m.start()
/opt/hostedtoolcache/Python/3.10.14/x64/lib/python3.10/multiprocessing/managers.py:566: in start
    self._address = reader.recv()
/opt/hostedtoolcache/Python/3.10.14/x64/lib/python3.10/multiprocessing/connection.py:250: in recv
    buf = self._recv_bytes()
/opt/hostedtoolcache/Python/3.10.14/x64/lib/python3.10/multiprocessing/connection.py:414: in _recv_bytes
    buf = self._recv(4)
/opt/hostedtoolcache/Python/3.10.14/x64/lib/python3.10/multiprocessing/connection.py:383: in _recv
    raise EOFError
E   EOFError
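
For context on this failure mode: the helper in the trace runs each ONNX Runtime session in a child process and shuttles results back through a multiprocessing.Manager, so the EOFError means the manager's server process died (for example, a crash or out-of-memory kill on the runner) and its control connection was torn down mid-handshake. A minimal sketch of that pattern, with illustrative names (run_session, safe_ort_session_run) rather than the repository's actual implementation:

    import multiprocessing

    def run_session(serialized_model, ort_inputs, return_dict):
        # Worker body: import and run ONNX Runtime inside the child process
        # so a native crash cannot take down the pytest process itself.
        import onnxruntime as ort

        session = ort.InferenceSession(serialized_model)
        return_dict["outputs"] = session.run(None, ort_inputs)

    def safe_ort_session_run(serialized_model, ort_inputs):
        # A Manager-backed dict carries results out of the child process.
        # If the manager's server process dies, reads on its connection
        # raise EOFError -- which is what both tracebacks above show.
        manager = multiprocessing.Manager()
        return_dict = manager.dict()
        process = multiprocessing.Process(
            target=run_session, args=(serialized_model, ort_inputs, return_dict)
        )
        process.start()
        process.join()
        return return_dict.get("outputs")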

Check warning on line 0 in tests.function_libs.torch_lib.ops_test.TestOutputConsistencyEagerCPU

All 2 runs failed: test_output_match_opinfo__nn_functional_nll_loss_weight_cpu_float16 (tests.function_libs.torch_lib.ops_test.TestOutputConsistencyEagerCPU)

artifacts/Test Results (py311-torch-nightly-windows-latest)/pytest.xml [took 1s]
artifacts/Test Results (py312-torch-nightly-windows-latest)/pytest.xml [took 1s]
Raw output
tests\function_libs\torch_lib\ops_test.py:279: in run_test_output_match
    torch.testing.assert_close(
E   AssertionError: Scalars are not close!
E   
E   Expected -1.087890625 but got -1.08984375.
E   Absolute difference: 0.001953125 (up to 1e-05 allowed)
E   Relative difference: 0.0017953321364452424 (up to 0.001 allowed)
tests\function_libs\torch_lib\ops_test.py:279: in run_test_output_match
    torch.testing.assert_close(
E   AssertionError: Scalars are not close!
E   
E   Expected -0.0015745162963867188 but got 0.0006766319274902344.
E   Absolute difference: 0.002251148223876953 (up to 1e-05 allowed)
E   Relative difference: 1.429739551786796 (up to 0.001 allowed)
tests\function_libs\torch_lib\ops_test.py:279: in run_test_output_match
    torch.testing.assert_close(
E   AssertionError: Scalars are not close!
E   
E   Expected -1.4326171875 but got -1.4345703125.
E   Absolute difference: 0.001953125 (up to 1e-05 allowed)
E   Relative difference: 0.0013633265167007499 (up to 0.001 allowed)
tests\function_libs\torch_lib\ops_test.py:279: in run_test_output_match
    torch.testing.assert_close(
E   AssertionError: Scalars are not close!
E   
E   Expected -0.2271728515625 but got -0.228515625.
E   Absolute difference: 0.0013427734375 (up to 1e-05 allowed)
E   Relative difference: 0.005910800644814616 (up to 0.001 allowed)
tests\function_libs\torch_lib\ops_test.py:279: in run_test_output_match
    torch.testing.assert_close(
E   AssertionError: Scalars are not close!
E   
E   Expected 21.34375 but got 21.3125.
E   Absolute difference: 0.03125 (up to 1e-05 allowed)
E   Relative difference: 0.0014641288433382138 (up to 0.001 allowed)
tests\function_libs\torch_lib\ops_test.py:279: in run_test_output_match
    torch.testing.assert_close(
E   AssertionError: Scalars are not close!
E   
E   Expected -11.90625 but got -11.9375.
E   Absolute difference: 0.03125 (up to 1e-05 allowed)
E   Relative difference: 0.0026246719160104987 (up to 0.001 allowed)
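
The numbers in these messages follow the documented torch.testing.assert_close criterion: a pair passes only if |actual - expected| <= atol + rtol * |expected|, with float16 defaults of atol=1e-5 and rtol=1e-3 (the "allowed" values printed above). A quick sketch of that check against the first failing pair:

    import torch

    # Reproduce the tolerance check from the first failure above.
    expected = torch.tensor(-1.087890625, dtype=torch.float16)
    actual = torch.tensor(-1.08984375, dtype=torch.float16)

    atol, rtol = 1e-5, 1e-3  # torch's float16 defaults, as reported
    abs_diff = (actual - expected).abs().item()     # 0.001953125
    allowed = atol + rtol * expected.abs().item()   # ~0.0011
    print(abs_diff <= allowed)                      # False -> assert_close raises

The observed 0.001953125 is exactly two float16 ulps for magnitudes in [1, 2) (2 × 2⁻¹⁰), so a one-to-two-ulp divergence in the float16 loss kernel is already enough to cross the default threshold.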

Check warning on line 0 in tests.function_libs.torch_lib.ops_test.TestOutputConsistencyEagerCPU

All 2 runs failed: test_output_match_opinfo__nn_functional_cross_entropy_cpu_float16 (tests.function_libs.torch_lib.ops_test.TestOutputConsistencyEagerCPU)

artifacts/Test Results (py311-torch-nightly-windows-latest)/pytest.xml [took 4s]
artifacts/Test Results (py312-torch-nightly-windows-latest)/pytest.xml [took 5s]
Raw output
tests\function_libs\torch_lib\ops_test.py:279: in run_test_output_match
    torch.testing.assert_close(
E   AssertionError: Scalars are not close!
E   
E   Expected 1.2529296875 but got 1.2509765625.
E   Absolute difference: 0.001953125 (up to 1e-05 allowed)
E   Relative difference: 0.001558846453624318 (up to 0.001 allowed)

Check warning on line 0 in tests.function_libs.torch_lib.ops_test.TestOutputConsistencyFullGraphCPU

All 2 runs failed: test_output_match_opinfo__nn_functional_cross_entropy_cpu_float16 (tests.function_libs.torch_lib.ops_test.TestOutputConsistencyFullGraphCPU)

artifacts/Test Results (py311-torch-nightly-windows-latest)/pytest.xml [took 1s]
artifacts/Test Results (py312-torch-nightly-windows-latest)/pytest.xml [took 1s]
Raw output
tests\function_libs\torch_lib\ops_test.py:279: in run_test_output_match
    torch.testing.assert_close(
E   AssertionError: Scalars are not close!
E   
E   Expected 1.2529296875 but got 1.2509765625.
E   Absolute difference: 0.001953125 (up to 1e-05 allowed)
E   Relative difference: 0.001558846453624318 (up to 0.001 allowed)

Check warning on line 0 in tests.function_libs.torch_lib.ops_test.TestOutputConsistencyFullGraphCPU

All 2 runs failed: test_output_match_opinfo__nn_functional_nll_loss_weight_cpu_float16 (tests.function_libs.torch_lib.ops_test.TestOutputConsistencyFullGraphCPU)

artifacts/Test Results (py311-torch-nightly-windows-latest)/pytest.xml [took 1s]
artifacts/Test Results (py312-torch-nightly-windows-latest)/pytest.xml [took 1s]
Raw output
tests\function_libs\torch_lib\ops_test.py:279: in run_test_output_match
    torch.testing.assert_close(
E   AssertionError: Scalars are not close!
E   
E   Expected -1.087890625 but got -1.08984375.
E   Absolute difference: 0.001953125 (up to 1e-05 allowed)
E   Relative difference: 0.0017953321364452424 (up to 0.001 allowed)
tests\function_libs\torch_lib\ops_test.py:279: in run_test_output_match
    torch.testing.assert_close(
E   AssertionError: Scalars are not close!
E   
E   Expected -0.0015745162963867188 but got 0.0006766319274902344.
E   Absolute difference: 0.002251148223876953 (up to 1e-05 allowed)
E   Relative difference: 1.429739551786796 (up to 0.001 allowed)
tests\function_libs\torch_lib\ops_test.py:279: in run_test_output_match
    torch.testing.assert_close(
E   AssertionError: Scalars are not close!
E   
E   Expected -1.4326171875 but got -1.4345703125.
E   Absolute difference: 0.001953125 (up to 1e-05 allowed)
E   Relative difference: 0.0013633265167007499 (up to 0.001 allowed)
tests\function_libs\torch_lib\ops_test.py:279: in run_test_output_match
    torch.testing.assert_close(
E   AssertionError: Scalars are not close!
E   
E   Expected -0.2271728515625 but got -0.228515625.
E   Absolute difference: 0.0013427734375 (up to 1e-05 allowed)
E   Relative difference: 0.005910800644814616 (up to 0.001 allowed)
tests\function_libs\torch_lib\ops_test.py:279: in run_test_output_match
    torch.testing.assert_close(
E   AssertionError: Scalars are not close!
E   
E   Expected 21.34375 but got 21.3125.
E   Absolute difference: 0.03125 (up to 1e-05 allowed)
E   Relative difference: 0.0014641288433382138 (up to 0.001 allowed)
tests\function_libs\torch_lib\ops_test.py:279: in run_test_output_match
    torch.testing.assert_close(
E   AssertionError: Scalars are not close!
E   
E   Expected -11.90625 but got -11.9375.
E   Absolute difference: 0.03125 (up to 1e-05 allowed)
E   Relative difference: 0.0026246719160104987 (up to 0.001 allowed)

Check warning on line 0 in onnxscript.function_libs.torch_lib.graph_building.graph_building_test.TestModelSaving

test_experimental_function_value_info_are_stored_in_graph_value_info (onnxscript.function_libs.torch_lib.graph_building.graph_building_test.TestModelSaving) failed

artifacts/Test Results (py312-torch-nightly-macos-latest)/pytest.xml [took 1s]
Raw output
.nox/test_torch_nightly/lib/python3.12/site-packages/torch/onnx/_internal/exporter.py:1427: in dynamo_export
    ).export()
.nox/test_torch_nightly/lib/python3.12/site-packages/torch/onnx/_internal/exporter.py:1170: in export
    graph_module = self.options.fx_tracer.generate_fx(
.nox/test_torch_nightly/lib/python3.12/site-packages/torch/onnx/_internal/fx/dynamo_graph_extractor.py:213: in generate_fx
    graph_module, graph_guard = torch._dynamo.export(
.nox/test_torch_nightly/lib/python3.12/site-packages/torch/_dynamo/eval_frame.py:1205: in inner
    check_if_dynamo_supported()
.nox/test_torch_nightly/lib/python3.12/site-packages/torch/_dynamo/eval_frame.py:597: in check_if_dynamo_supported
    raise RuntimeError("Python 3.12+ not yet supported for torch.compile")
E   RuntimeError: Python 3.12+ not yet supported for torch.compile

The above exception was the direct cause of the following exception:
onnxscript/function_libs/torch_lib/graph_building/graph_building_test.py:201: in test_experimental_function_value_info_are_stored_in_graph_value_info
    model_proto = torch.onnx.dynamo_export(model, x).model_proto
<@beartype(torch.onnx._internal.exporter.dynamo_export) at 0x1234276a0>:53: in dynamo_export
    ???
.nox/test_torch_nightly/lib/python3.12/site-packages/torch/onnx/_internal/exporter.py:1438: in dynamo_export
    raise OnnxExporterError(
E   torch.onnx.OnnxExporterError: Failed to export the model to ONNX. Generating SARIF report at 'report_dynamo_export.sarif'. SARIF is a standard format for the output of static analysis tools. SARIF logs can be loaded in VS Code SARIF viewer extension, or SARIF web viewer (https://microsoft.github.io/sarif-web-component/). Please report a bug on PyTorch Github: https://github.com/pytorch/pytorch/issues
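
The root cause is the torch._dynamo guard visible in the trace: torch.compile, and therefore torch.onnx.dynamo_export, rejects Python 3.12+ in this nightly build. One possible guard for such tests, sketched with an illustrative test body rather than the repository's actual fix:

    import sys
    import unittest

    import torch

    # Skip dynamo_export-based tests on interpreters torch._dynamo rejects.
    skip_if_no_dynamo = unittest.skipIf(
        sys.version_info >= (3, 12),
        "torch.compile / torch._dynamo does not support Python 3.12+",
    )

    class TestModelSaving(unittest.TestCase):
        @skip_if_no_dynamo
        def test_dynamo_export_smoke(self):  # illustrative test, not from the PR
            model = torch.nn.Linear(4, 2)
            x = torch.randn(1, 4)
            model_proto = torch.onnx.dynamo_export(model, x).model_proto
            self.assertGreater(len(model_proto.graph.node), 0)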

Check warning on line 0 in onnxscript.function_libs.torch_lib.graph_building.graph_building_test.TestModelSaving

test_save_initializer_to_files_for_large_model (onnxscript.function_libs.torch_lib.graph_building.graph_building_test.TestModelSaving) failed

artifacts/Test Results (py312-torch-nightly-macos-latest)/pytest.xml [took 7s]
Raw output
.nox/test_torch_nightly/lib/python3.12/site-packages/torch/onnx/_internal/exporter.py:1427: in dynamo_export
    ).export()
.nox/test_torch_nightly/lib/python3.12/site-packages/torch/onnx/_internal/exporter.py:1170: in export
    graph_module = self.options.fx_tracer.generate_fx(
.nox/test_torch_nightly/lib/python3.12/site-packages/torch/onnx/_internal/fx/dynamo_graph_extractor.py:213: in generate_fx
    graph_module, graph_guard = torch._dynamo.export(
.nox/test_torch_nightly/lib/python3.12/site-packages/torch/_dynamo/eval_frame.py:1205: in inner
    check_if_dynamo_supported()
.nox/test_torch_nightly/lib/python3.12/site-packages/torch/_dynamo/eval_frame.py:597: in check_if_dynamo_supported
    raise RuntimeError("Python 3.12+ not yet supported for torch.compile")
E   RuntimeError: Python 3.12+ not yet supported for torch.compile

The above exception was the direct cause of the following exception:
onnxscript/function_libs/torch_lib/graph_building/graph_building_test.py:173: in test_save_initializer_to_files_for_large_model
    model_proto = torch.onnx.dynamo_export(model, x).model_proto
<@beartype(torch.onnx._internal.exporter.dynamo_export) at 0x1234276a0>:53: in dynamo_export
    ???
.nox/test_torch_nightly/lib/python3.12/site-packages/torch/onnx/_internal/exporter.py:1438: in dynamo_export
    raise OnnxExporterError(
E   torch.onnx.OnnxExporterError: Failed to export the model to ONNX. Generating SARIF report at 'report_dynamo_export.sarif'. SARIF is a standard format for the output of static analysis tools. SARIF logs can be loaded in VS Code SARIF viewer extension, or SARIF web viewer (https://microsoft.github.io/sarif-web-component/). Please report a bug on PyTorch Github: https://github.com/pytorch/pytorch/issues

Check warning on line 0 in onnxscript.function_libs.torch_lib.graph_building.graph_building_test.TestModelSaving

test_input_output_and_initializer_are_not_stored_in_value_info (onnxscript.function_libs.torch_lib.graph_building.graph_building_test.TestModelSaving) failed

artifacts/Test Results (py312-torch-nightly-macos-latest)/pytest.xml [took 0s]
Raw output
.nox/test_torch_nightly/lib/python3.12/site-packages/torch/onnx/_internal/exporter.py:1427: in dynamo_export
    ).export()
.nox/test_torch_nightly/lib/python3.12/site-packages/torch/onnx/_internal/exporter.py:1170: in export
    graph_module = self.options.fx_tracer.generate_fx(
.nox/test_torch_nightly/lib/python3.12/site-packages/torch/onnx/_internal/fx/dynamo_graph_extractor.py:213: in generate_fx
    graph_module, graph_guard = torch._dynamo.export(
.nox/test_torch_nightly/lib/python3.12/site-packages/torch/_dynamo/eval_frame.py:1205: in inner
    check_if_dynamo_supported()
.nox/test_torch_nightly/lib/python3.12/site-packages/torch/_dynamo/eval_frame.py:597: in check_if_dynamo_supported
    raise RuntimeError("Python 3.12+ not yet supported for torch.compile")
E   RuntimeError: Python 3.12+ not yet supported for torch.compile

The above exception was the direct cause of the following exception:
onnxscript/function_libs/torch_lib/graph_building/graph_building_test.py:182: in test_input_output_and_initializer_are_not_stored_in_value_info
    model_proto = torch.onnx.dynamo_export(model, x).model_proto
<@beartype(torch.onnx._internal.exporter.dynamo_export) at 0x1234276a0>:53: in dynamo_export
    ???
.nox/test_torch_nightly/lib/python3.12/site-packages/torch/onnx/_internal/exporter.py:1438: in dynamo_export
    raise OnnxExporterError(
E   torch.onnx.OnnxExporterError: Failed to export the model to ONNX. Generating SARIF report at 'report_dynamo_export.sarif'. SARIF is a standard format for the output of static analysis tools. SARIF logs can be loaded in VS Code SARIF viewer extension, or SARIF web viewer (https://microsoft.github.io/sarif-web-component/). Please report a bug on PyTorch Github: https://github.com/pytorch/pytorch/issues