ORTModelForCustomTasks lacks attributes #1992

Open · 2 of 4 tasks
TheMattBin opened this issue Aug 19, 2024 · 1 comment
Labels: bug (Something isn't working)

Comments

@TheMattBin

System Info

python: 3.11
OS: Linux
torch: 2.4.0
optimum: 1.21.3
onnx: 1.16.2
onnxruntime: 1.18.1
onnxruntime-gpu: 1.19.0

Who can help?

@JingyaHuang @echarlaix

Information

  • The official example scripts
  • My own modified scripts

Tasks

  • An officially supported task in the examples folder (such as GLUE/SQuAD, ...)
  • My own task or dataset (give details below)

Reproduction (minimal, reproducible, runnable)

Hi, I'm running my customized ONNX model with Optimum on CUDA using ORTModelForCustomTasks. I can run the model on both CPU and Intel OpenVINO. However, when I run the script below, it raises AttributeError: 'ORTModelForCustomTasks' object has no attribute 'inputs_names', which I assume is related to optimum/onnxruntime/io_binding/io_binding_helper.py:160. It would be helpful if you could provide some hints on this issue.

import requests
import torch
from PIL import Image
from transformers import AutoProcessor
from optimum.onnxruntime import ORTModelForCustomTasks

image_processor = AutoProcessor.from_pretrained("deta_onnx2_gpu")
model = ORTModelForCustomTasks.from_pretrained("deta_onnx2_gpu", provider="CUDAExecutionProvider")

url = 'http://images.cocodataset.org/val2017/000000039769.jpg'
image = Image.open(requests.get(url, stream=True).raw)

inputs = image_processor(images=image, return_tensors="pt")
print(inputs.keys())
# assert model.providers == ["CUDAExecutionProvider", "CPUExecutionProvider"]
with torch.no_grad():
    outputs = model(**inputs)
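
A possible workaround sketch (unverified on my side): since the AttributeError originates in the IO-binding helper, loading the model with IO binding disabled via the use_io_binding argument of from_pretrained might sidestep it.

# Unverified workaround sketch: disable IO binding so the
# io_binding_helper path that reads `inputs_names` is not exercised.
from optimum.onnxruntime import ORTModelForCustomTasks

model = ORTModelForCustomTasks.from_pretrained(
    "deta_onnx2_gpu",
    provider="CUDAExecutionProvider",
    use_io_binding=False,  # assumption: this avoids the failing attribute lookup
)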

Expected behavior

I would expect the code to run and produce detection output similar to what I get when running the ONNX model on CPU.

TheMattBin added the bug label on Aug 19, 2024
@TheMattBin (Author)

I also tested with DETR (ONNX with CUDA), which is supported by optimum-cli, and got the same issue.

optimum-cli export onnx -m facebook/detr-resnet-50 --task 'object-detection' --framework 'pt' detr_gpu  --device cuda
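
For reference, a minimal sketch of how the exported DETR model can be loaded on CUDA (the loading code below is assumed, not taken verbatim from my run; it goes through the same ORTModelForCustomTasks path and hits the same 'inputs_names' error):

# Sketch only (assumed loading code) for the exported DETR model on CUDA.
import requests
import torch
from PIL import Image
from transformers import AutoImageProcessor
from optimum.onnxruntime import ORTModelForCustomTasks

processor = AutoImageProcessor.from_pretrained("facebook/detr-resnet-50")
model = ORTModelForCustomTasks.from_pretrained("detr_gpu", provider="CUDAExecutionProvider")

url = "http://images.cocodataset.org/val2017/000000039769.jpg"
image = Image.open(requests.get(url, stream=True).raw)
inputs = processor(images=image, return_tensors="pt")

with torch.no_grad():
    outputs = model(**inputs)  # raises the same AttributeError: 'inputs_names'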
