In newer versions of the transformers library, AutoModelForCausalLM can properly identify Llama models.
There is therefore no longer any need for the LlamaModel class. Llama models run with --model_name causal.
The only hiccup I experienced was an error about the generate function receiving token_type_ids. I fixed this by adding a couple of lines to my tokenizer_config.json.
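The exact lines weren't captured above, but one tokenizer_config.json setting known to have this effect is restricting model_input_names, which stops the tokenizer from emitting token_type_ids at all. A plausible reconstruction, not necessarily the author's exact change:

```json
{
  "model_input_names": ["input_ids", "attention_mask"]
}
```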
This could be addressed within flan-eval by setting return_token_type_ids=False in CausalModel's call to the tokenizer.
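A minimal sketch of that fix, with a stub standing in for the Hugging Face tokenizer (CausalModel's real call site in flan-eval is assumed, and the tokenize function below is a hypothetical stand-in): passing return_token_type_ids=False keeps token_type_ids out of the encoding dict, so **inputs no longer forwards it to model.generate().

```python
def tokenize(text, return_token_type_ids=True):
    """Hypothetical stand-in for a Hugging Face tokenizer call."""
    ids = list(range(len(text.split())))
    enc = {"input_ids": ids, "attention_mask": [1] * len(ids)}
    # Real tokenizers for BERT-style models include token_type_ids by
    # default; the flag suppresses that key.
    if return_token_type_ids:
        enc["token_type_ids"] = [0] * len(ids)
    return enc

# With the flag set, token_type_ids is absent, so generate(**enc) would
# no longer receive the unexpected keyword argument.
enc = tokenize("hello world", return_token_type_ids=False)
print(sorted(enc))  # → ['attention_mask', 'input_ids']
```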