RuntimeError: FlashAttention backward for head dim > 64 requires A100 or H100 GPUs as the implementation needs a large amount of shared memory.
Is it referring to the head dimension of Vicuna-7B being more than 64?
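Yes. The head dimension is `hidden_size / num_attention_heads`; for the LLaMA/Vicuna-7B architecture (hidden size 4096, 32 attention heads) that works out to 128, which is above the 64 limit that FlashAttention's backward pass enforces on GPUs other than A100/H100. A minimal sketch of the arithmetic, using those standard 7B config values:

```python
# Standard LLaMA/Vicuna-7B config values.
hidden_size = 4096
num_attention_heads = 32

# Per-head dimension: hidden_size split evenly across attention heads.
head_dim = hidden_size // num_attention_heads
print(head_dim)  # 128 > 64, so FlashAttention backward needs an A100/H100
```

Since 128 > 64, training (which needs the backward pass) with FlashAttention enabled fails on older GPUs; inference-only use or disabling FlashAttention avoids the check.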