Allow resizing of parametrized tensors #60418
Conversation
💊 CI failures summary and remediations (as of commit 9a72174; more details on the Dr. CI page and at hud.pytorch.org/pr/60418):
1 failure not recognized by patterns.
This pull request was exported from Phabricator. Differential Revision: D29279442
Can you please update the tests in test/test_nn.py (in particular the ones that make sure an error is raised when the flag is not provided) to verify this is working as expected?
cc @lezcano if you have any opinion on the name of the flag.
torch/nn/utils/parametrize.py (outdated)
@@ -66,17 +66,20 @@ class ParametrizationList(ModuleList):
    Args:
        modules (iterable): an iterable of modules representing the parametrizations
        original (Parameter or Tensor): parameter or buffer that is parametrized
        allow_resize (bool): a boolean flag that denotes whether the parametrized tensor can be resized
I think we want to be a little more ominous here: warn that the current Module might not work anymore, and that users enable this flag at their own risk.
torch/nn/utils/parametrize.py (outdated)
@@ -231,6 +234,7 @@ def right_inverse(self, X: Tensor) -> Tensor
    tensor_name (str): name of the parameter or buffer on which to register
        the parametrization
    parametrization (nn.Module): the parametrization to register
    allow_resize (bool): a boolean flag that denotes whether the parametrized tensor can be resized
Same comment about the doc
I would wait first for #58488 to be landed, as that PR heavily extends the current scope of parametrizations. In that PR we support changing the size of […]. Could you confirm that your use case would still not be covered by what this new PR implements, @kazhou?
Now, in this new PR we do a number of checks to protect the user. Some of these checks could be relaxed. Perhaps we could add an […]
I'd be happy to submit this after we land #58488 if we figure that there's some use case for this functionality.
I'm afraid this one will not be covered by #60530 indeed.
Codecov Report
@@            Coverage Diff             @@
##           master   #60418      +/-   ##
==========================================
- Coverage   80.60%   76.21%    -4.39%
==========================================
  Files        1879     2061      +182
  Lines      202892   205073     +2181
==========================================
- Hits       163543   156305     -7238
- Misses      39349    48768     +9419
test/test_nn.py (outdated)
@@ -2171,6 +2176,41 @@ def forward(self, x):
        initial_bias_id = id(model.bias)
        initial_model = deepcopy(model)

        # Test allow_resize flag
        parametrize.register_parametrization(model, "weight", Resize())  # default allow_resize = False
        self.assertRaises(RuntimeError)
I don't think this is doing what you want. You can use

    with self.assertRaisesRegex(RuntimeError, "Your error msg"):
        the_code_that_should_fail()

to make sure you get the error you expect.
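The reviewer's point can be seen in a self-contained sketch using plain `unittest` (independent of the PR's test suite; the test names and messages here are invented for illustration). A bare `self.assertRaises(RuntimeError)` with no callable returns a context manager that is never entered, so nothing is checked and the test passes vacuously; the `with` form actually verifies the exception.

```python
import unittest

class AssertRaisesDemo(unittest.TestCase):
    def test_vacuous_assertion(self):
        # With no callable and outside a `with` block, this returns an
        # unused context manager: no code runs under it and nothing is
        # asserted, so the test passes even though no error was raised.
        self.assertRaises(RuntimeError)

    def test_checked_assertion(self):
        # The context-manager form verifies both the exception type and
        # that its message matches the given regex.
        with self.assertRaisesRegex(RuntimeError, "size mismatch"):
            raise RuntimeError("size mismatch in parametrization")
```

Both tests pass here, but only the second would fail if the guarded code stopped raising, which is the behavior the review asks for.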
Summary: Pull Request resolved: pytorch#60418. Modify `parametrize.py` to allow resizing of parametrized tensors.
Test Plan: `buck test mode/dev-nosan //caffe2/test:nn -- --exact 'caffe2/test:nn - test_register_and_remove_parametrization (test_nn.TestNN)'` https://pxl.cl/1KS5P
Differential Revision: D29279442
fbshipit-source-id: b185edee2bf66bbaa2969cef33c40a879bc517a8

Summary: Pull Request resolved: pytorch#60418. Modify `parametrize.py` to allow resizing of parametrized tensors.
Test Plan: `buck test mode/dev-nosan //caffe2/test:nn -- --exact 'caffe2/test:nn - test_register_and_remove_parametrization (test_nn.TestNN)'` https://pxl.cl/1KS5P
Differential Revision: D29279442
fbshipit-source-id: d91fffb4ce7e492cd6df1c618a25216e21aaceac
@lezcano this is now rebased on top of the other PR; could you take a second look?
I left a few suggestions regarding the wording of the documentation, but the functionality looks good to me, thanks!
torch/nn/utils/parametrize.py (outdated)
@@ -389,6 +394,9 @@ def right_inverse(self, X: Tensor) -> Union[Tensor, Sequence[Tensor]]
    tensor_name (str): name of the parameter or buffer on which to register
        the parametrization
    parametrization (nn.Module): the parametrization to register

    Keyword args:
        unsafe (bool): a boolean flag that denotes whether the parametrization breaks the invariants. Default: `False`
            Warning: the current Module might not work anymore, enable this flag at your own risk.
I think that the biggest problem of activating this flag is that any errors the parametrization may have are not detected early, and the error messages that one gets are notably more cryptic. So perhaps:

- Warning: the current Module might not work anymore, enable this flag at your own risk.
+ Warning: the parametrization is not checked for consistency upon registration. Enable this flag at your own risk.
I just realised that we also need to update the `.. note:` in L377-378 to reflect this change. A possible rewrite could be:

    If `unsafe=False` (default), both the `forward` and `right_inverse` methods will be called once to perform a number of consistency checks. If `unsafe=True`, then `right_inverse` will be called if the tensor is not parametrized, and nothing will be called otherwise.
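As a rough illustration of these semantics, here is a minimal sketch of a size-changing parametrization registered with `unsafe=True`. The class `TopRows` and the concrete shapes are invented for this example; the point is that without the flag, the registration-time consistency checks would reject a `forward` whose output shape differs from the original tensor's.

```python
import torch
import torch.nn as nn
import torch.nn.utils.parametrize as parametrize

class TopRows(nn.Module):
    """Hypothetical parametrization that keeps only the first two rows,
    so the parametrized tensor changes shape."""
    def forward(self, X: torch.Tensor) -> torch.Tensor:
        return X[:2]

    def right_inverse(self, Y: torch.Tensor) -> torch.Tensor:
        # Pad back to the original four rows with zeros.
        return torch.cat([Y, Y.new_zeros(4 - Y.size(0), Y.size(1))])

layer = nn.Linear(5, 4)  # weight has shape (4, 5)
# unsafe=True skips the registration-time consistency checks, so the
# shape mismatch between forward's output and the original is allowed;
# right_inverse is still called once on the unparametrized tensor.
parametrize.register_parametrization(layer, "weight", TopRows(), unsafe=True)
print(layer.weight.shape)  # parametrized weight is now (2, 5)
```

The trade-off the reviewers describe applies directly: if `TopRows` had a bug, it would surface only on first access to `layer.weight`, not at registration.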
@kazhou you marked these as resolved but perhaps forgot to push the changes?
It should be synced from Phabricator to this PR; it was re-exported about 2 hrs ago.
@kazhou has imported this pull request. If you are a Facebook employee, you can view this diff on Phabricator.
Summary: Modify `parametrize.py` to allow resizing of parametrized tensors. Pull Request resolved: pytorch#60418.
Test Plan: `buck test mode/dev-nosan //caffe2/test:nn -- --exact 'caffe2/test:nn - test_register_and_remove_parametrization (test_nn.TestNN)'` https://pxl.cl/1L0wh
Differential Revision: D29279442
Pulled By: kazhou
fbshipit-source-id: 96c2c5a6b40344710f91413d70c3b0439e3b89e2
Summary: Modify `parametrize.py` to allow resizing of parametrized tensors.
Test Plan: TODO
Differential Revision: D29279442