runtime error
Exit code: 1. Reason: Access to model meta-llama/Llama-3.1-8B-Instruct is restricted. You must have access to it and be authenticated to access it. Please log in.

The above exception was the direct cause of the following exception:

Traceback (most recent call last):
  File "/app/app.py", line 79, in <module>
    llama_pipe = pipeline(
  File "/usr/local/lib/python3.10/site-packages/transformers/pipelines/__init__.py", line 922, in pipeline
    config = AutoConfig.from_pretrained(
  File "/usr/local/lib/python3.10/site-packages/transformers/models/auto/configuration_auto.py", line 1332, in from_pretrained
    config_dict, unused_kwargs = PretrainedConfig.get_config_dict(pretrained_model_name_or_path, **kwargs)
  File "/usr/local/lib/python3.10/site-packages/transformers/configuration_utils.py", line 662, in get_config_dict
    config_dict, kwargs = cls._get_config_dict(pretrained_model_name_or_path, **kwargs)
  File "/usr/local/lib/python3.10/site-packages/transformers/configuration_utils.py", line 721, in _get_config_dict
    resolved_config_file = cached_file(
  File "/usr/local/lib/python3.10/site-packages/transformers/utils/hub.py", line 322, in cached_file
    file = cached_files(path_or_repo_id=path_or_repo_id, filenames=[filename], **kwargs)
  File "/usr/local/lib/python3.10/site-packages/transformers/utils/hub.py", line 543, in cached_files
    raise OSError(
OSError: You are trying to access a gated repo. Make sure to have access to it at https://huggingface.co/meta-llama/Llama-3.1-8B-Instruct.
401 Client Error. (Request ID: Root=1-68ff7ead-1027c9d64bc8233f62060a4f;d512d3ad-18c5-4972-a4e2-6a1eda44e950)
Cannot access gated repo for url https://huggingface.co/meta-llama/Llama-3.1-8B-Instruct/resolve/main/config.json.
Access to model meta-llama/Llama-3.1-8B-Instruct is restricted. You must have access to it and be authenticated to access it. Please log in.
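The 401 above means the container reached the Hugging Face Hub unauthenticated while fetching a gated repo. A minimal sketch of one way to fix the call at `/app/app.py` line 79, assuming the token is supplied via an `HF_TOKEN` environment variable (the variable name and both helper functions are illustrative, not part of the original app; the `token=` keyword is the current `transformers` parameter for Hub authentication):

```python
import os
import sys

def get_hf_token() -> str:
    """Read a Hugging Face access token from the environment, failing fast
    with a clear message instead of a late 401 deep inside pipeline()."""
    token = os.environ.get("HF_TOKEN")
    if not token:
        sys.exit(
            "HF_TOKEN is not set. Create a token at "
            "https://huggingface.co/settings/tokens and make sure the "
            "account has been granted access to the gated repo."
        )
    return token

def build_llama_pipe():
    """Build the text-generation pipeline with authentication.

    The account behind the token must already have requested and been
    granted access to meta-llama/Llama-3.1-8B-Instruct on the Hub;
    a token alone does not bypass the gate.
    """
    from transformers import pipeline  # deferred: heavy import

    return pipeline(
        "text-generation",
        model="meta-llama/Llama-3.1-8B-Instruct",
        token=get_hf_token(),  # authenticates config.json and weight downloads
    )
```

Alternatively, running `huggingface-cli login` in the image (or mounting a cached token) authenticates all Hub calls without touching the code; passing `token=` explicitly is just easier to wire into a containerized deployment via an env var or secret.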