
The error indicates the program failed to load the model configuration from the specified path /storage_fast/rhshui/llm/llama_hf/7B/ #5

Open
lucky0223 opened this issue Oct 28, 2024 · 1 comment


@lucky0223

Thanks for sharing your work! I'm running into the following error and would appreciate your help:
Traceback (most recent call last):
  File "/root/miniconda3/envs/test/lib/python3.10/site-packages/transformers/configuration_utils.py", line 675, in _get_config_dict
    resolved_config_file = cached_file(
  File "/root/miniconda3/envs/test/lib/python3.10/site-packages/transformers/utils/hub.py", line 428, in cached_file
    resolved_file = hf_hub_download(
  File "/root/miniconda3/envs/test/lib/python3.10/site-packages/huggingface_hub/utils/_validators.py", line 106, in _inner_fn
    validate_repo_id(arg_value)
  File "/root/miniconda3/envs/test/lib/python3.10/site-packages/huggingface_hub/utils/_validators.py", line 154, in validate_repo_id
    raise HFValidationError(
huggingface_hub.errors.HFValidationError: Repo id must be in the form 'repo_name' or 'namespace/repo_name': '/storage_fast/rhshui/llm/llama_hf/7B/'. Use repo_type argument if needed.

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "/root/DEALRec/code/prune/prune.py", line 15, in <module>
    effort = get_effort_score(args)
  File "/root/DEALRec/code/prune/effort_score.py", line 64, in get_effort_score
    model = Modified_LlamaForCausalLM.from_pretrained(
  File "/root/miniconda3/envs/test/lib/python3.10/site-packages/transformers/modeling_utils.py", line 2449, in from_pretrained
    config, model_kwargs = cls.config_class.from_pretrained(
  File "/root/miniconda3/envs/test/lib/python3.10/site-packages/transformers/configuration_utils.py", line 591, in from_pretrained
    config_dict, kwargs = cls.get_config_dict(pretrained_model_name_or_path, **kwargs)
  File "/root/miniconda3/envs/test/lib/python3.10/site-packages/transformers/configuration_utils.py", line 620, in get_config_dict
    config_dict, kwargs = cls._get_config_dict(pretrained_model_name_or_path, **kwargs)
  File "/root/miniconda3/envs/test/lib/python3.10/site-packages/transformers/configuration_utils.py", line 696, in _get_config_dict
    raise EnvironmentError(
OSError: Can't load the configuration of '/storage_fast/rhshui/llm/llama_hf/7B/'. If you were trying to load it from 'https://huggingface.co/models', make sure you don't have a local directory with the same name. Otherwise, make sure '/storage_fast/rhshui/llm/llama_hf/7B/' is the correct path to a directory containing a config.json file
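For context, transformers only treats the string as a local path when that directory exists and contains a config.json; otherwise huggingface_hub validates it as a Hub repo id, which fails for an absolute path like this one. Below is a minimal sketch (not the repository's code, and using the stock LlamaForCausalLM rather than Modified_LlamaForCausalLM) of checking the directory before loading; the path is the one from the traceback and should be replaced with your own local LLaMA-7B checkpoint in Hugging Face format:

```python
import os
from transformers import AutoConfig, LlamaForCausalLM

# Path to a local LLaMA-7B directory converted to Hugging Face format
# (must contain config.json, tokenizer files, and the model weights).
base_model = "/storage_fast/rhshui/llm/llama_hf/7B/"

# If config.json is missing, from_pretrained falls back to treating the
# string as a Hub repo id, which triggers the HFValidationError above.
if not os.path.isfile(os.path.join(base_model, "config.json")):
    raise FileNotFoundError(
        f"{base_model} does not contain config.json; point it to a directory "
        "with a Hugging Face-format LLaMA checkpoint."
    )

config = AutoConfig.from_pretrained(base_model)  # loads the local config.json
model = LlamaForCausalLM.from_pretrained(base_model, config=config)
```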

@Linxyhaha
Owner

Please refer to #3.
