This work introduces additional positional embeddings for inputs longer than 512 tokens.
PreSumm/src/models/model_builder.py, lines 150 to 154 in 70b810e
But this code does not seem to extend the Transformer encoder itself.
I think that if the subsequent encoder does not have additional parameters, the shapes will not match.
So I guess the Transformer automatically adds the extra parameters. Is this understanding correct?
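For context, here is a minimal sketch of how position embeddings are typically extended beyond BERT's 512 pretrained positions. The helper name is hypothetical and the HuggingFace `transformers` API is assumed; this is not necessarily the exact code at the lines referenced above.

```python
import torch.nn as nn
from transformers import BertModel  # assuming the HuggingFace transformers library


def extend_position_embeddings(bert: BertModel, max_pos: int) -> None:
    """Extend BERT's learned position embeddings from 512 to max_pos positions.

    Positions 0..511 keep the pretrained weights; positions 512..max_pos-1 are
    initialized by repeating the last pretrained position embedding.
    """
    old_emb = bert.embeddings.position_embeddings          # nn.Embedding(512, hidden)
    hidden_size = old_emb.weight.size(1)

    new_emb = nn.Embedding(max_pos, hidden_size)
    new_emb.weight.data[:512] = old_emb.weight.data
    new_emb.weight.data[512:] = old_emb.weight.data[-1][None, :].repeat(max_pos - 512, 1)

    bert.embeddings.position_embeddings = new_emb
    # Only the embedding table is replaced here. The encoder layers themselves
    # (self-attention and feed-forward) have no sequence-length-dependent weights,
    # so no extra encoder parameters are needed for longer inputs.
    # Depending on the transformers version, the registered position_ids buffer
    # may also need to be extended to cover max_pos positions.


# Example usage (hypothetical):
# bert = BertModel.from_pretrained("bert-base-uncased")
# extend_position_embeddings(bert, max_pos=1024)
```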