
Failing to create TorchScript module using torch.jit.script() #1148

Open
Mark-M2L opened this issue Nov 3, 2022 · 0 comments
Labels: enhancement (New feature or request), help wanted (Extra attention is needed)

Comments


Mark-M2L commented Nov 3, 2022

Hello all,

Thank you very much for this wonderful library; it is really useful. I have trained the fastai unet model with a ResNet34 backbone and am now trying to port it to C++ (and perhaps later convert it to torch_tensorrt). To compile the model into a TorchScript module, I use torch.jit.script(), but this call fails.

The code is as follows:

import torch
import torchvision
from icevision.models import *
from icevision.all import *

# Image sizes used during training.
presize, size = 384, 224
presize, size = ImgSize(presize, int(presize * 1.0)), ImgSize(size, int(size * 1.0))

# fastai unet with a ResNet34 backbone.
model_type = models.fastai.unet
backbone = model_type.backbones.resnet34()
model = model_type.model(backbone=backbone, num_classes=9, img_size=size)

# Compile to TorchScript -- this is the call that fails.
scripted_model = torch.jit.script(model)

torch.jit.save(scripted_model, 'resnet_unet_224.pt')

The error obtained:

RuntimeError: Hook 'hook_fn' on module 'ReLU' was expected to only have exactly 3 inputs but it had 4 inputs. This error occured while scripting the forward hook 'hook_fn' on module ReLU. If you did not want to script this hook remove it from the original NN module before scripting. This hook was expected to have the following signature: hook_fn(self, input: Tuple[Tensor], output: Tensor). The type of the output arg is the returned type from either the forward method or the previous hook if it exists. Note that hooks can return anything, but if the hook is on a submodule the outer module is expecting the same return type as the submodule's forward.

I also get the same error when trying to compile the model with torch_tensorrt. Could you help me create the TorchScript module? Thanks!
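
As a possible workaround (not from the original report): the hook that trips up the scripting pass comes from fastai's DynamicUnet, which registers forward hooks on the encoder to collect activations for the skip connections, so the hooks cannot simply be removed. Tracing the model instead of scripting it sidesteps the hook-signature check, since torch.jit.trace records the operations executed on an example input rather than compiling the Python source. A minimal sketch, assuming a 3-channel 224x224 input and the same model construction as above:

import torch
from icevision.all import *  # provides ImgSize and models, as in the snippet above

# Build the model exactly as in the reproduction above.
size = ImgSize(224, 224)
model_type = models.fastai.unet
backbone = model_type.backbones.resnet34()
model = model_type.model(backbone=backbone, num_classes=9, img_size=size)
model.eval()

# Optional: list the submodules that carry forward hooks. fastai's DynamicUnet
# registers them to feed encoder activations into the decoder, so deleting
# them would break the forward pass.
for name, module in model.named_modules():
    if module._forward_hooks:
        print(name, list(module._forward_hooks.values()))

# torch.jit.trace records the operations run on an example input instead of
# compiling the module's Python source, so the hook-signature check is not hit.
example = torch.rand(1, 3, 224, 224)
with torch.no_grad():
    traced_model = torch.jit.trace(model, example)

torch.jit.save(traced_model, 'resnet_unet_224_traced.pt')

Note that tracing fixes the input shape and does not capture data-dependent control flow, so it may not be a drop-in replacement for scripting in every case.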

Mark-M2L added the enhancement (New feature or request) and help wanted (Extra attention is needed) labels on Nov 3, 2022