

Allow resizing of parametrized tensors #60418

Closed
wants to merge 1 commit

Conversation

@kazhou (Contributor) commented Jun 21, 2021

Summary: Modify parametrize.py to allow resizing of parametrized tensors

Test Plan: TODO

Differential Revision: D29279442

@facebook-github-bot (Contributor) commented Jun 21, 2021

💊 CI failures summary and remediations

As of commit 9a72174 (more details on the Dr. CI page and at hud.pytorch.org/pr/60418):


  • 1/1 failures introduced in this PR

1 failure not recognized by patterns:

Job: CircleCI pytorch_linux_xenial_py3_clang5_asan_test1 — Step: Run tests

This comment was automatically generated by Dr. CI.

@facebook-github-bot (Contributor)

This pull request was exported from Phabricator. Differential Revision: D29279442


@kazhou kazhou force-pushed the export-D29279442 branch from cd1bebf to 065493f Compare June 22, 2021 18:52
@albanD (Collaborator) left a comment


Can you please update the tests in test/test_nn.py (in particular the ones that check that an error is raised when the flag is not provided) to make sure this is working as expected?

cc @lezcano if you have any opinion on the name of the flag.

@@ -66,17 +66,20 @@ class ParametrizationList(ModuleList):
Args:
modules (iterable): an iterable of modules representing the parametrizations
original (Parameter or Tensor): parameter or buffer that is parametrized
allow_resize (bool): a boolean flag that denotes whether the parametrized tensor can be resized
Collaborator


I think we want to be a little more ominous here, warning that the current Module might not work anymore and that users enable this flag at their own risk.

torch/nn/utils/parametrize.py (outdated; resolved)
@@ -231,6 +234,7 @@ def right_inverse(self, X: Tensor) -> Tensor
tensor_name (str): name of the parameter or buffer on which to register
the parametrization
parametrization (nn.Module): the parametrization to register
allow_resize (bool): a boolean flag that denotes whether the parametrized tensor can be resized
Collaborator


Same comment about the doc

@lezcano (Collaborator) commented Jun 23, 2021

I would wait first for #58488 to be landed, as that PR heavily extends the current scope of parametrizations.

In that PR we support changing the size of original, but not the size of the returned weight. In other words, right_inverse may change the size of the parameter, but then forward should map this tensor with a new size to a tensor of the old size.

Could you confirm that your use case would still not be covered by what this new PR implements @kazhou ?

Now, in this new PR we do a number of checks to protect the user. Some of these checks could be relaxed. Perhaps we could add an unsafe=False kw-only flag. Then, we could document that, if you use this flag, you may be allowed to do a few more non-standard things, but if you find an error you are on your own.

I'd be happy to submit this after we land #58488 if we figure that there's some use case for this functionality
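The split lezcano describes — right_inverse may change the size of original, while forward must map the resized tensor back to the registered size — can be sketched with a hypothetical parametrization (not from this PR) that stores a symmetric matrix as a flat vector of its upper-triangular entries:

```python
import torch
import torch.nn as nn


class SymmetricFlat(nn.Module):
    """Hypothetical parametrization: ``right_inverse`` shrinks the stored
    ``original`` to a flat vector of upper-triangular entries, while
    ``forward`` restores the full (n, n) symmetric matrix, so the exposed
    weight keeps its registered size."""

    def __init__(self, n: int):
        super().__init__()
        self.n = n
        # Row/column indices of the upper triangle (including the diagonal)
        self.idx = torch.triu_indices(n, n)

    def forward(self, v: torch.Tensor) -> torch.Tensor:
        # Flat vector of n*(n+1)/2 entries -> full (n, n) symmetric matrix
        X = v.new_zeros(self.n, self.n)
        X[self.idx[0], self.idx[1]] = v
        return X + X.T - torch.diag(X.diagonal())

    def right_inverse(self, X: torch.Tensor) -> torch.Tensor:
        # Full (n, n) matrix -> flat vector; this is the size change
        return X[self.idx[0], self.idx[1]]
```

Under the behavior described above, registering this with `parametrize.register_parametrization(layer, "weight", SymmetricFlat(n))` would store the flat vector as `original` while `layer.weight` keeps its original (n, n) shape.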

@albanD (Collaborator) commented Jun 23, 2021

I'm afraid this one will not be covered by #60530 indeed.
The use case here is to use parametrization in conjunction with a post forward hook to be able to do less computation during the forward of this Module while still returning a full size output.
This will indeed not work for all Modules and is pretty unsafe in general, but it sounds ok for this particular use case.
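A rough sketch of that use case (all names here are hypothetical; `unsafe=True` is the flag this discussion converged on): a parametrization that exposes only the first k rows of the weight, so the matmul in forward is smaller, paired with a forward hook that pads the output back to full size:

```python
import torch
import torch.nn as nn
import torch.nn.functional as F
import torch.nn.utils.parametrize as parametrize


class KeepFirstRows(nn.Module):
    """Hypothetical parametrization that *changes the size* of the exposed
    weight: only the first k rows take part in the forward computation."""

    def __init__(self, k: int):
        super().__init__()
        self.k = k

    def forward(self, W: torch.Tensor) -> torch.Tensor:
        return W[: self.k]


layer = nn.Linear(4, 8, bias=False)
full_out = layer.out_features

# The size change breaks the usual invariants, hence unsafe=True.
parametrize.register_parametrization(layer, "weight", KeepFirstRows(3), unsafe=True)


def pad_to_full_size(module, inputs, output):
    # Post-forward hook: pad the narrower output back to the full width.
    return F.pad(output, (0, full_out - output.shape[-1]))


layer.register_forward_hook(pad_to_full_size)
```

The forward pass now multiplies by a (3, 4) weight instead of (8, 4), yet callers still see a width-8 output.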


@kazhou kazhou force-pushed the export-D29279442 branch from 065493f to 038e003 Compare June 23, 2021 19:51
@codecov (bot) commented Jun 24, 2021

Codecov Report

Merging #60418 (56eb0cf) into master (001ff3a) will decrease coverage by 4.39%.
The diff coverage is 89.25%.

❗ Current head 56eb0cf differs from pull request most recent head 9a72174. Consider uploading reports for the commit 9a72174 to get more accurate results

@@            Coverage Diff             @@
##           master   #60418      +/-   ##
==========================================
- Coverage   80.60%   76.21%   -4.39%     
==========================================
  Files        1879     2061     +182     
  Lines      202892   205073    +2181     
==========================================
- Hits       163543   156305    -7238     
- Misses      39349    48768    +9419     

test/test_nn.py (outdated; resolved)
test/test_nn.py (outdated)
@@ -2171,6 +2176,41 @@ def forward(self, x):
initial_bias_id = id(model.bias)
initial_model = deepcopy(model)

# Test allow_resize flag
parametrize.register_parametrization(model, "weight", Resize()) # default allow_resize = False
self.assertRaises(RuntimeError)
Collaborator

Choose a reason for hiding this comment

The reason will be displayed to describe this comment to others. Learn more.

I don't think this is doing what you want.

You can use

with self.assertRaisesRegex(RuntimeError, "Your error msg"):
    the_code_that_should_fail()

To make sure you get the error you expect.
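For reference, the pitfall is that a bare `self.assertRaises(ExcType)` call with no callable just returns a context manager that is immediately discarded, so the test silently checks nothing. A minimal stand-alone illustration (not the PR's test code):

```python
import unittest


class Demo(unittest.TestCase):
    def test_bare_assert_raises_checks_nothing(self):
        # Returns a context manager that is immediately discarded:
        # no exception is required, and the test still passes.
        self.assertRaises(RuntimeError)

    def test_context_manager_form(self):
        # The context-manager form actually verifies the error and message.
        with self.assertRaisesRegex(RuntimeError, "expected message"):
            raise RuntimeError("expected message")


result = unittest.TextTestRunner(verbosity=0).run(
    unittest.defaultTestLoader.loadTestsFromTestCase(Demo)
)
```

Both tests pass, which is exactly the problem with the first one: it never exercises the code that should fail.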

torch/nn/utils/parametrize.py (outdated; resolved)

@kazhou kazhou force-pushed the export-D29279442 branch from 038e003 to 992969d Compare June 24, 2021 16:48

@kazhou kazhou force-pushed the export-D29279442 branch from 992969d to c1b15f8 Compare June 24, 2021 17:01

@kazhou kazhou force-pushed the export-D29279442 branch from c1b15f8 to 50c37bc Compare June 24, 2021 18:33


@kazhou kazhou force-pushed the export-D29279442 branch from f675a3c to 47007ef Compare June 25, 2021 18:57
@kazhou kazhou requested review from albanD and lezcano June 25, 2021 18:58

@kazhou kazhou force-pushed the export-D29279442 branch from 47007ef to d535504 Compare June 25, 2021 19:03

@kazhou kazhou force-pushed the export-D29279442 branch from d535504 to 1b56884 Compare June 25, 2021 19:30

@kazhou kazhou force-pushed the export-D29279442 branch from 1b56884 to 56eb0cf Compare June 25, 2021 21:14
kazhou added a commit to kazhou/pytorch that referenced this pull request Jun 25, 2021
Summary:
Pull Request resolved: pytorch#60418

Modify `parametrize.py` to allow resizing of parametrized tensors

Test Plan:
`buck test mode/dev-nosan //caffe2/test:nn -- --exact 'caffe2/test:nn - test_register_and_remove_parametrization (test_nn.TestNN)'`

https://pxl.cl/1KS5P

Differential Revision: D29279442

fbshipit-source-id: b185edee2bf66bbaa2969cef33c40a879bc517a8
kazhou added a commit to kazhou/pytorch that referenced this pull request Jun 25, 2021
Summary:
Pull Request resolved: pytorch#60418

Modify `parametrize.py` to allow resizing of parametrized tensors

Test Plan:
`buck test mode/dev-nosan //caffe2/test:nn -- --exact 'caffe2/test:nn - test_register_and_remove_parametrization (test_nn.TestNN)'`

https://pxl.cl/1KS5P

Differential Revision: D29279442

fbshipit-source-id: d91fffb4ce7e492cd6df1c618a25216e21aaceac
@albanD (Collaborator) commented Jun 25, 2021

@lezcano this is now rebased on top of the other PR; can you take a second look?

@lezcano (Collaborator) left a comment


I left a few suggestions regarding the wording of the documentation, but the functionality looks good to me, thanks!

torch/nn/utils/parametrize.py (outdated; resolved)
torch/nn/utils/parametrize.py (outdated; resolved)
@@ -389,6 +394,9 @@ def right_inverse(self, X: Tensor) -> Union[Tensor, Sequence[Tensor]]
tensor_name (str): name of the parameter or buffer on which to register
the parametrization
parametrization (nn.Module): the parametrization to register
Keyword args:
unsafe (bool): a boolean flag that denotes whether the parametrization breaks the invariants. Default: `False`
Warning: the current Module might not work anymore, enable this flag at your own risk.
@lezcano (Collaborator) commented Jun 28, 2021


I think that the biggest problem with activating this flag is that errors in the parametrization are not detected early, and the error messages one then gets are notably more cryptic. So perhaps:

Suggested change
Warning: the current Module might not work anymore, enable this flag at your own risk.
Warning: the parametrization is not checked for consistency upon registration. Enable this flag at your own risk.

@lezcano (Collaborator) commented Jun 28, 2021


I just realised that we also need to update the .. note: in L377-378 to reflect this change. A possible rewrite could be:

If `unsafe=False` (default), both the `forward` and `right_inverse` methods will be called once to perform a number of consistency checks.
If `unsafe=True`, then `right_inverse` will be called if the tensor is not parametrized, and nothing will be called otherwise.

Collaborator


@kazhou you marked these as resolved but perhaps forgot to push the changes?

Contributor Author


It should be synced from Phabricator to this PR; it was re-exported about 2 hrs ago.

torch/nn/utils/parametrize.py (outdated; resolved)
torch/nn/utils/parametrize.py (outdated; resolved)
torch/nn/utils/parametrize.py (resolved)
torch/nn/utils/parametrize.py (resolved)
@facebook-github-bot (Contributor)

@kazhou has imported this pull request. If you are a Facebook employee, you can view this diff on Phabricator.

Summary:
Modify `parametrize.py` to allow resizing of parametrized tensors

Pull Request resolved: pytorch#60418

Test Plan:
`buck test mode/dev-nosan //caffe2/test:nn -- --exact 'caffe2/test:nn - test_register_and_remove_parametrization (test_nn.TestNN)'`

https://pxl.cl/1L0wh

Differential Revision: D29279442

Pulled By: kazhou

fbshipit-source-id: 96c2c5a6b40344710f91413d70c3b0439e3b89e2
@kazhou kazhou force-pushed the export-D29279442 branch from 56eb0cf to 9a72174 Compare June 28, 2021 16:42

@facebook-github-bot (Contributor)

@kazhou merged this pull request in 965dad2.
