[torch-ort-infer] Aten fallback doesn't work #139
Comments
Can you please try with the latest PyTorch nightly?
With PyTorch nightly, ATen fallback is no longer valid, since numpy_T now converts to Transpose (pytorch/pytorch#79269).
Thanks @saipj. Do you have another model we can test with?
@saipj I had a play with this with the
@natke: Can you add this example to this repo, explaining how to set the type when type inference fails in ORT?
@askhade Yes, sure
The ATen op doesn't fall back to the native PyTorch runtime as expected.
Versions:
PyTorch - 1.12.0
ONNX Runtime - 1.12.0
torch-ort-infer - 1.12.0
Reproduction steps:
Error log
Traceback (most recent call last):
File "unit_test_atenop.py", line 23, in <module>
test_numpy_T([3, 2, 5])
File "unit_test_atenop.py", line 20, in test_numpy_T
ort_prediction = run_step(ort_model, ort_input)
File "unit_test_atenop.py", line 16, in run_step
prediction = model(input)
File "/ort_aten_fb/lib/python3.8/site-packages/torch/nn/modules/module.py", line 1130, in _call_impl
return forward_call(*input, **kwargs)
File "/ort_aten_fb/lib/python3.8/site-packages/torch_ort/ortinferencemodule/_utils_infer.py", line 98, in _forward
return ortinferencemodule._forward_call(*inputs, **kwargs)
File "/ort_aten_fb/lib/python3.8/site-packages/torch_ort/ortinferencemodule/ortinferencemodule.py", line 107, in _forward_call
self._inference_session = onnxruntime.InferenceSession(
File "/ort_aten_fb/lib/python3.8/site-packages/onnxruntime/capi/onnxruntime_inference_collection.py", line 347, in __init__
self._create_inference_session(providers, provider_options, disabled_optimizers)
File "/ort_aten_fb/lib/python3.8/site-packages/onnxruntime/capi/onnxruntime_inference_collection.py", line 386, in _create_inference_session
sess = C.InferenceSession(session_options, self._model_bytes, False, self._read_config_from_model)
onnxruntime.capi.onnxruntime_pybind11_state.Fail: [ONNXRuntimeError] : 1 : FAIL : Node (ATen_0) output arg (data) type inference failed.
Also tested with the symbolic shape inference call from ORTModule (ref: symbolic_shape); it fails with `Exception("Incomplete symbolic shape inference")`.
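For context on the numpy_T behavior discussed in the comments: `Tensor.T` is backed by the ATen op `aten::numpy_T`, which for an n-dimensional tensor reverses the axis order, and the exporter change referenced above (pytorch/pytorch#79269) maps it to an ONNX Transpose with a reversed permutation instead of an ATen fallback node. A minimal sketch of that semantics, using NumPy purely for illustration (the shape `[3, 2, 5]` is taken from the test in the traceback):

```python
import numpy as np

# Build an array with the same shape as the test input, [3, 2, 5].
x = np.arange(3 * 2 * 5).reshape(3, 2, 5)

# .T reverses all axes, like aten::numpy_T: shape (3, 2, 5) -> (5, 2, 3).
reversed_axes = x.T

# The equivalent explicit ONNX-style Transpose uses perm = [2, 1, 0],
# i.e. the axis indices in reverse order.
perm = list(range(x.ndim))[::-1]
transposed = np.transpose(x, axes=perm)

print(reversed_axes.shape)                        # (5, 2, 3)
print(np.array_equal(reversed_axes, transposed))  # True
```

This is why, on recent nightlies, the exporter no longer emits an ATen node for this model and the fallback path is not exercised.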