What happened?

For the given IR

module {
  func.func @torch_jit(%arg1: !torch.vtensor<[64,4,144,144],f32>, %arg2: !torch.vtensor<[1,4,144,144],f32>) -> !torch.vtensor<[64,4,144,144],f32> attributes {torch.onnx_meta.ir_version = 7 : si64, torch.onnx_meta.opset_version = 21 : si64, torch.onnx_meta.producer_name = "pytorch", torch.onnx_meta.producer_version = "1.12.1"} {
    %1 = torch.operator "onnx.Add"(%arg1, %arg2) : (!torch.vtensor<[64,4,144,144],f32>, !torch.vtensor<[1,4,144,144],f32>) -> !torch.vtensor<[64,4,144,144],f32>
    %2 = torch.operator "onnx.Softmax"(%1) {torch.onnx.axis = -1 : si64} : (!torch.vtensor<[64,4,144,144],f32>) -> !torch.vtensor<[64,4,144,144],f32>
    return %2 : !torch.vtensor<[64,4,144,144],f32>
  }
}

compiling for the ROCm backend fails with

error: <unknown>:0:0: stack frame size (294916) exceeds limit (131056) in function 'torch_jit$async_dispatch_1_softmax_64x4x144x144xf32_dispatch_tensor_store'

while the same IR compiles fine for the CPU backend.
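For context, the failing dispatch is just a broadcasted add followed by a softmax over the last dimension. Below is a minimal sketch of an equivalent PyTorch module that could be exported to ONNX to regenerate a comparable input; the module name, file name, and export settings are illustrative assumptions (the original model reports producer pytorch 1.12.1 and opset 21), not the actual model this IR came from.

# Hypothetical reproducer sketch: same computation as the IR above,
# a broadcasted Add followed by Softmax along the last axis.
import torch

class AddSoftmax(torch.nn.Module):
    def forward(self, x, bias):
        # x: [64, 4, 144, 144], bias: [1, 4, 144, 144] broadcast over dim 0
        return torch.softmax(x + bias, dim=-1)

x = torch.randn(64, 4, 144, 144)
bias = torch.randn(1, 4, 144, 144)
torch.onnx.export(AddSoftmax(), (x, bias), "model.onnx")  # output file name is illustrative

The exported ONNX file would then be imported (for example with iree-import-onnx) to obtain a torch-onnx MLIR file like the one above.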
Steps to reproduce your issue

command:
iree-compile --iree-hal-target-backends=rocm --iree-hip-target=gfx942 -o abc.vmfb model.torch_onnx.mlir

version: IREE compiler version 3.0.0rc20241117 @ 29c451b

detail log:
dump.log
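The same compile step can also be scripted; the sketch below assumes the iree.compiler Python package is installed and that its compile_file helper forwards extra flags the same way the CLI above does. File names and the output path are copied from the command for illustration.

# Sketch of the same compile invocation via the Python API
# (assumed to mirror the iree-compile flags shown above).
import iree.compiler as ireec

ireec.compile_file(
    "model.torch_onnx.mlir",
    target_backends=["rocm"],
    extra_args=["--iree-hip-target=gfx942"],
    output_file="abc.vmfb",
)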
What component(s) does this issue relate to?

Compiler

Version information

No response

Additional context

No response
@qedawkins I believe patch #19212 solves the issue.