
revert "raise Dynamo accumulated cache size limit" #53

Open
lchu6 opened this issue Mar 19, 2024 · 0 comments
lchu6 commented Mar 19, 2024

We recently added a commit to raise the Dynamo accumulated cache size limit so that torch.compile works with large models such as 70B, whose number of layers exceeds the default limit (64): #45 (comment).
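For context, here is a minimal sketch of what such an override typically looks like, assuming the `torch._dynamo.config.accumulated_cache_size_limit` knob; the limit value and layer count below are illustrative and not taken from the original commit:

```python
# Sketch of raising the Dynamo accumulated cache size limit (illustrative
# values only, not the literal change from #45).
import torch
import torch._dynamo

# When each transformer block is compiled separately, every block instance can
# add its own entry to the cache of the shared forward code object. A 70B-class
# model with more than 64 layers can therefore exceed the old default limit,
# after which Dynamo may skip compilation and fall back to eager execution.
torch._dynamo.config.accumulated_cache_size_limit = 128  # illustrative value


class Block(torch.nn.Module):
    def __init__(self, dim: int = 64):
        super().__init__()
        self.linear = torch.nn.Linear(dim, dim)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return torch.relu(self.linear(x))


# 80 layers stands in for the layer count of a 70B-class model.
blocks = [torch.compile(Block()) for _ in range(80)]
x = torch.randn(2, 64)
for block in blocks:
    x = block(x)
```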

This limit has now been raised upstream in PyTorch (related PR) from 64 to 256, so we can revert our commit once the new default is widely available.

I am holding off on the revert for now because the torch PR only landed in today's nightly, and most of our environments do not have this fix yet.

We should revisit this once that PR has reached most of the environments we use, and then revert the commit.

lchu6 self-assigned this Mar 19, 2024