feat: expose IResizeLayer in dynamo.conversion.impl by bowang007 · Pull Request #2488 · pytorch/TensorRT

feat: expose IResizeLayer in dynamo.conversion.impl #2488

Merged
bowang007 merged 1 commit into main from dynamo_resize_converter on Dec 9, 2023

Conversation

bowang007
Collaborator

Description

#2219

Type of change

Expose IResizeLayer in dynamo.conversion.impl
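
For reference, the new impl helper is essentially a thin wrapper around TensorRT's IResizeLayer (created via network.add_resize) that sets either an explicit output shape or per-dimension scale factors. A minimal sketch follows; the function name and exact signature are illustrative, not the literal code in this PR:

import tensorrt as trt

def upsample(
    network: trt.INetworkDefinition,
    input: trt.ITensor,
    out_shape=None,        # explicit spatial output size, e.g. [H_out, W_out]
    scale_factors=None,    # per-spatial-dim scales, e.g. [2.0, 2.0]
    resize_mode: str = "nearest",
    align_corners: bool = False,
) -> trt.ITensor:
    # IResizeLayer is created through the network definition.
    resize_layer = network.add_resize(input)

    # Exactly one of out_shape / scale_factors is expected (PyTorch validates this upstream).
    if out_shape is not None:
        # Keep N and C unchanged; only the trailing spatial dims are resized.
        resize_layer.shape = list(input.shape)[:2] + list(out_shape)
    elif scale_factors is not None:
        resize_layer.scales = [1.0, 1.0] + list(scale_factors)

    # Interpolation mode: nearest-neighbor vs. linear (bilinear for 4-D inputs).
    resize_layer.resize_mode = (
        trt.ResizeMode.LINEAR if "linear" in resize_mode else trt.ResizeMode.NEAREST
    )
    if align_corners:
        resize_layer.coordinate_transformation = (
            trt.ResizeCoordinateTransformation.ALIGN_CORNERS
        )
    return resize_layer.get_output(0)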

Checklist:

  • My code follows the style guidelines of this project (You can use the linters)
  • I have performed a self-review of my own code
  • I have commented my code, particularly in hard-to-understand areas and hacks
  • I have made corresponding changes to the documentation
  • I have added tests to verify my fix or my feature
  • New and existing unit tests pass locally with my changes
  • I have added the relevant labels to my PR so that the relevant reviewers are notified

@bowang007 bowang007 requested a review from gs-olive November 28, 2023 06:17
@github-actions github-actions bot added component: api [Python] Issues re: Python API component: conversion Issues re: Conversion stage component: converters Issues re: Specific op converters component: dynamo Issues relating to the `torch.compile` or `torch._dynamo.export` paths component: tests Issues re: Tests labels Nov 28, 2023
@bowang007 bowang007 requested a review from narendasan November 28, 2023 06:17
@bowang007 bowang007 marked this pull request as draft November 28, 2023 06:18
github-actions bot: Code conforms to C++ style guidelines
github-actions bot: Code conforms to Python style guidelines

@bowang007 bowang007 self-assigned this Nov 28, 2023
github-actions bot: Code conforms to Python style guidelines
github-actions bot: Code conforms to C++ style guidelines

@bowang007 bowang007 marked this pull request as ready for review November 29, 2023 06:01
Comment on lines 26 to 29
if out_shape is not None:
resize_layer.shape = list(input.shape)[:2] + list(out_shape)
else:
resize_layer.scales = [1, 1] + list(scale_factor)
Collaborator
@gs-olive gs-olive left a comment

Should both branches of the conditional ensure that both shape and scales are set, or can TRT handle cases where only one is set?

Collaborator Author
@bowang007 bowang007 left a comment

@gs-olive
Looks like there are some internal assertions in PyTorch that check the shape and scales arguments, so PyTorch will throw an error if neither of them is set or if both are set.
Since this is a converter, we assume the arguments are valid; otherwise PyTorch would have thrown an error already, right?
Please correct me if I'm wrong, thanks!
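
For context, the upstream validation being relied on here is visible directly from torch.nn.functional.interpolate, which these upsample ops are lowered from. A quick illustrative check (error messages paraphrased, since they vary slightly across PyTorch versions):

import torch
import torch.nn.functional as F

x = torch.randn(1, 3, 8, 8)

# Valid: exactly one of size / scale_factor is provided.
F.interpolate(x, size=(16, 16), mode="nearest")
F.interpolate(x, scale_factor=2.0, mode="nearest")

# PyTorch rejects the ambiguous cases before the converter ever runs,
# so the converter can assume exactly one of the two is set.
try:
    F.interpolate(x, size=(16, 16), scale_factor=2.0)
except ValueError as e:
    print(e)  # roughly: "only one of size or scale_factor should be defined"

try:
    F.interpolate(x)
except ValueError as e:
    print(e)  # roughly: "either size or scale_factor should be defined"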

Collaborator
@zewenli98 zewenli98 left a comment

Our converters come from the schema. The decorator's key should be one of the schema's functions.

@bowang007 bowang007 force-pushed the dynamo_resize_converter branch from 9e2cc48 to f538a97 Compare December 4, 2023 22:50
Collaborator
@gs-olive gs-olive left a comment

Added a few suggestions, but overall looks good! Could you also add support for the non-vec variants, such as upsample_bilinear2d?

Collaborator
@gs-olive gs-olive left a comment

Looks good overall - added a minor comment about mypy linting. Also needs a rebase onto the latest main to resolve merge conflicts.

@bowang007
Collaborator Author

Added a few suggestions, but overall looks good! Could you also add support for the non-vec variants, such as upsample_bilinear2d?

@gs-olive Any idea why this happens?

AssertionError: Detected converter for OpOverloadPacket aten.upsample_nearest2d. We do not support OpOverloadPacket-keyed converters with multiple overloads. Make sure to explicitly specify each converter overload. For instance aten.mean is not a valid key, but aten.mean.default is.

@gs-olive
Collaborator
gs-olive commented Dec 6, 2023

@bowang007 - actually, I realized the upsample_bilinear2d op is not in the Core ATen IR set, so it is okay to omit it. That error arises when the decorator specified for the converter is something like aten.upsample_bilinear2d instead of aten.upsample_bilinear2d.default
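
The distinction is between an OpOverloadPacket and one of its concrete OpOverloads, which is easy to inspect directly (illustrative check):

import torch

packet = torch.ops.aten.upsample_bilinear2d             # OpOverloadPacket (groups all overloads)
overload = torch.ops.aten.upsample_bilinear2d.default   # OpOverload (one specific schema)

print(type(packet).__name__)    # OpOverloadPacket
print(type(overload).__name__)  # OpOverload
print(packet.overloads())       # e.g. ['default', 'vec'] -- converters must key on one of these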

@bowang007
Collaborator Author

@bowang007 - actually, I realized the upsample_bilinear2d op is not in the Core ATen IR set, so it is okay to omit it. That error arises when the decorator specified for the converter is something like aten.upsample_bilinear2d instead of aten.upsample_bilinear2d.default

In that case, how can we add a converter for aten.upsample_bilinear2d?

@gs-olive
Collaborator
gs-olive commented Dec 7, 2023

We can add this converter using the decorator @torch.ops.aten.upsample_bilinear2d.default as opposed to @torch.ops.aten.upsample_bilinear2d.
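
A hypothetical sketch of what that registration looks like, modeled on the existing converters in py/torch_tensorrt/dynamo/conversion/aten_ops_converters.py. The import paths, the decorator name dynamo_tensorrt_converter, and the positional prelude of the impl call are assumptions; the keyword arguments follow the CI diff shown further below, and the argument indices would need to match the specific overload's schema:

import torch
from torch_tensorrt.dynamo._SourceIR import SourceIR                # path assumed
from torch_tensorrt.dynamo.conversion import impl                    # path assumed
from torch_tensorrt.dynamo.conversion.aten_ops_converters import (   # path assumed
    args_bounds_check,
    dynamo_tensorrt_converter,
)


# Correct: key the converter on the concrete overload (.default) ...
@dynamo_tensorrt_converter(torch.ops.aten.upsample_bilinear2d.default)
# ... not on the packet, which would trigger the OpOverloadPacket assertion quoted above:
# @dynamo_tensorrt_converter(torch.ops.aten.upsample_bilinear2d)
def aten_ops_upsample_bilinear2d(ctx, target, args, kwargs, name):
    return impl.upsample.upsample(
        ctx,
        target,
        SourceIR.ATEN,
        name,
        input=args[0],
        out_shape=args_bounds_check(args, 1),       # index per the overload's schema
        scale_factors=args_bounds_check(args, 3),    # index per the overload's schema
        resize_mode="bilinear",
        align_corners=args_bounds_check(args, 2),
    )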

@bowang007
Collaborator Author

Our converters come from the schema. The decorator's key should be one of the schema's functions.

Yeah, that makes sense! Thanks!

@bowang007 bowang007 force-pushed the dynamo_resize_converter branch from aeb73f3 to a2680d0 Compare December 8, 2023 23:34
@bowang007 bowang007 requested a review from gs-olive December 8, 2023 23:34
Collaborator
@gs-olive gs-olive left a comment

Just needs a rebase, then looks good to me

@bowang007 bowang007 force-pushed the dynamo_resize_converter branch from a2680d0 to f08e370 Compare December 9, 2023 01:40
@bowang007 bowang007 requested review from zewenli98 and removed request for zewenli98 December 9, 2023 01:41
github-actions bot: Code conforms to C++ style guidelines

github-actions bot: There are some changes that do not conform to Python style guidelines:

--- /home/runner/work/TensorRT/TensorRT/py/torch_tensorrt/dynamo/conversion/aten_ops_converters.py	2023-12-09 01:40:38.396894+00:00
+++ /home/runner/work/TensorRT/TensorRT/py/torch_tensorrt/dynamo/conversion/aten_ops_converters.py	2023-12-09 01:42:26.080943+00:00
@@ -2502,6 +2502,6 @@
        input=args[0],
        out_shape=args_bounds_check(args, 1),
        scale_factors=args_bounds_check(args, 3),
        resize_mode="bilinear",
        align_corners=args_bounds_check(args, 2),
-    )
\ No newline at end of file
+    )

github-actions bot: Code conforms to C++ style guidelines

github-actions bot: There are some changes that do not conform to Python style guidelines (the same missing trailing newline in aten_ops_converters.py as in the diff above).

@bowang007 bowang007 force-pushed the dynamo_resize_converter branch from f08e370 to 960458f Compare December 9, 2023 01:50
github-actions bot: Code conforms to C++ style guidelines
github-actions bot: Code conforms to Python style guidelines

@bowang007 bowang007 merged commit 623d2f7 into main Dec 9, 2023
Labels
cla signed component: api [Python] Issues re: Python API component: conversion Issues re: Conversion stage component: converters Issues re: Specific op converters component: dynamo Issues relating to the `torch.compile` or `torch._dynamo.export` paths component: tests Issues re: Tests