[Web] Demucs model won't run in both WASM and WGPU #22031
Comments
For the WebGPU EP, the problem is related to the Unsqueeze op. According to the ONNX spec (https://onnx.ai/onnx/operators/onnx__Unsqueeze.html), the axes input of Unsqueeze is a list of integers, but in your model it's just a scalar "1".
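To illustrate the spec mismatch, here is a minimal sketch of Unsqueeze semantics in NumPy (the function name and the scalar-tolerant behavior are illustrative assumptions, mirroring what the CPU EP reportedly does by accepting a scalar where the spec requires a 1-D list):

```python
import numpy as np

def unsqueeze(data, axes):
    # ONNX spec: axes must be a 1-D list of ints referring to the
    # output shape. Tolerate a scalar (as in the problematic model)
    # by wrapping it in a list, as the CPU EP reportedly does.
    if np.isscalar(axes):
        axes = [axes]
    out = data
    for axis in sorted(axes):
        out = np.expand_dims(out, axis)
    return out

x = np.zeros((3, 4))
print(unsqueeze(x, 1).shape)    # (3, 1, 4) -- scalar axes, tolerated
print(unsqueeze(x, [1]).shape)  # (3, 1, 4) -- spec-conformant 1-D axes
```

Both calls produce the same result; the question in this thread is whether the WebGPU EP should accept the scalar form as the CPU EP does.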
So the problem is related to the torch dynamo export?
Technically, axes should always be a 1-D tensor. However, in practice the CPU code has loosened this restriction; perhaps WebGPU should match the CPU behavior.
This is to fix issue microsoft#22031 so that the demucs model can run. For ConvTranspose, outputPadding.length could be 1 while spatialRank is 2; the fix is to append enough 0s to outputPadding. For Conv the issue is similar: kernelShape.length can sometimes be 1 while inputs[1].dims.length is 4; the fix is likewise to append enough 0s to kernelShape.
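The fix described above can be sketched as a small helper that pads an attribute array with zeros up to the expected spatial rank (the function name is hypothetical; the real change lives in the onnxruntime-web WebGPU EP's Conv/ConvTranspose attribute handling):

```python
def pad_with_zeros(attr, target_len):
    """Append zeros so a short attribute list (e.g. outputPadding or
    kernelShape) matches the expected length (e.g. the spatial rank)."""
    return list(attr) + [0] * max(0, target_len - len(attr))

# outputPadding [1] with spatialRank 2 becomes [1, 0]
print(pad_with_zeros([1], 2))     # [1, 0]
# an already-complete attribute is left unchanged
print(pad_with_zeros([3, 3], 2))  # [3, 3]
```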
@gyagp With the latest 1.20.0-dev.20240917-afd642a194, which should include both fixes, I still cannot run the model in WebGPU; the runtime just aborts after displaying the WebGPU experimental warning.
I also hit some issues with the latest code, and I will take a further look.
Your model successfully runs with the latest @dev, with timings (60 s of audio with 10 s chunks): wasm: wgpu: onnx python cpu:
Describe the issue
I converted the model from PyTorch to ONNX as described here, with some issues. The model works in ONNX Python, but in wasm/webgpu the runtime dies without an error. The optimized version of the model runs in wasm, but not in webgpu. I don't know whether this problem is related to the model conversion or to the runtime. I have tested with both @latest and @dev.
To reproduce
Here's a link to a sample repo, instructions in README.
Urgency
Urgent, as this project is related to my thesis
ONNX Runtime Installation
Released Package
ONNX Runtime Version or Commit ID
1.19.2, 1.20.0-dev.20240907-ad9afbb042
Execution Provider
'wasm'/'cpu' (WebAssembly CPU), 'webgpu' (WebGPU)