Checklist
- 1. I have searched related issues but cannot get the expected help.
- 2. The bug has not been fixed in the latest version.
- 3. Please note that if the issue you submit lacks corresponding environment info and a minimal reproducible demo, it will be hard for us to reproduce and resolve it, which reduces the likelihood of receiving feedback.
Describe the bug
Thank you for your outstanding work on this project!
I'm currently conducting research that requires access to the model's attention outputs, but despite trying several versions of the transformers library, I am unable to obtain the attention weights as expected.
Here's the approach I've been using, following the standard practice with transformers:
generated = model.forward(**inputs, output_attentions=True)
Unfortunately, the attention outputs remain None. Could you please advise if there's a specific method or configuration required to correctly obtain the model's attention weights? Any insights or suggestions would be greatly appreciated.
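For context, my understanding (please correct me if I'm wrong) is that recent transformers releases pick an attention backend at load time, and only the eager backend actually materializes attention weights; the SDPA and flash-attention paths do not return them. A quick way to check which backend the loaded model ended up with might be the following (`_attn_implementation` is the config attribute name in recent transformers releases, as far as I know):

```python
# Check which attention backend the loaded model is using; only the "eager"
# implementation returns attention weights with output_attentions=True.
# (_attn_implementation exists on recent transformers configs; treat this as a sketch.)
print(model.config._attn_implementation)  # e.g. "sdpa", "flash_attention_2", or "eager"
```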
Thanks in advance for your help!
Reproduction
generated = model.forward(**inputs, output_attentions=True)
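To make the snippet above easier to run end to end, here is a fuller sketch of the reproduction. The checkpoint name is a placeholder, and the use of AutoModelForCausalLM and attn_implementation="eager" are assumptions on my part (I don't know how this project's model code selects its attention backend), but forcing the eager implementation is the only configuration I'm aware of under which output_attentions=True can return non-None weights in recent transformers versions:

```python
# Hypothetical reproduction sketch: model name and model class are placeholders,
# substitute the actual checkpoint/class used in this project.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_name = "your-model-name"  # placeholder
tokenizer = AutoTokenizer.from_pretrained(model_name, trust_remote_code=True)
model = AutoModelForCausalLM.from_pretrained(
    model_name,
    trust_remote_code=True,
    torch_dtype=torch.float16,
    attn_implementation="eager",  # SDPA / flash-attention kernels do not return attention maps
).eval()

inputs = tokenizer("Hello", return_tensors="pt").to(model.device)
with torch.no_grad():
    outputs = model(**inputs, output_attentions=True, return_dict=True)

# Expected: a tuple with one tensor per layer, each of shape (batch, heads, seq, seq);
# in my runs this is None instead.
print(None if outputs.attentions is None else len(outputs.attentions))
```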
Environment
transformers >= 4.49
Error traceback
No error is raised; the forward call completes, but the returned attentions are None.