OSError: You are trying to access a gated repo. Make sure to have access to it at https://huggingface.co/google/gemma-3-27b-it. 401 Client Error. (Request ID: Root=1-67da7d97-6326b5f53a96415516d2c709;7a741876-1aae-4afd-8d26-afd122bc2c2d) Cannot access gated repo for url https://huggingface.co/google/gemma-3-27b-it/resolve/main/config.json. Access to model google/gemma-3-27b-it is restricted. You must have access to it and be authenticated to access it. Please log in.
This error indicates that you’re trying to access a model repository on Hugging Face (in this case, google/gemma-3-27b-it) that is gated, meaning it requires special access permissions. Here’s how to resolve it:
Steps to Fix:
- Log in to Hugging Face: make sure you’re signed in to your Hugging Face account.
- Request access: visit the repository page (google/gemma-3-27b-it) and, if there’s an option to request access, use it. Some gated repositories require explicit approval from the repository owner.
To authenticate with the Hugging Face CLI while using Conda and a Jupyter Notebook, follow these steps:
Install Hugging Face CLI
First, ensure that the Hugging Face CLI is installed in your Conda environment. Activate your environment and run:
pip install "huggingface_hub[cli]"
This installs the CLI along with extra dependencies for a smoother experience (the quotes keep shells such as zsh from interpreting the square brackets).
Authenticate in Jupyter Notebook
If you’re working in a Jupyter Notebook, you can authenticate programmatically using the login() function from the huggingface_hub library:
from huggingface_hub import login
# Enter your token here
mytoken = "your_access_token"
login(mytoken)
To create an access token, go to https://huggingface.co/settings/tokens and click Create new token. Remember to tick the option “Read access to contents of all public gated repos you can access” in the Repositories section; otherwise, you won’t be able to read the contents of gated repos.
After that, you can run the code as shown in the Gemma model card:
from transformers import pipeline
import torch
pipe = pipeline(
"image-text-to-text",
model="google/gemma-3-4b-it",
device="cuda",
torch_dtype=torch.bfloat16
)
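The pipeline above accepts chat-style messages in which each user turn can mix image and text parts. The structure below is a sketch of that format (the image URL is a placeholder); the actual call, left commented out, would download the weights and run on the GPU:

```python
# Chat-style input for an image-text-to-text pipeline (URL is a placeholder).
messages = [
    {
        "role": "user",
        "content": [
            {"type": "image", "url": "https://example.com/photo.jpg"},
            {"type": "text", "text": "Describe this image in one sentence."},
        ],
    },
]

# out = pipe(text=messages, max_new_tokens=64)  # downloads the model; needs a GPU
```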
Authenticate with the Hugging Face CLI
If you’re running a script rather than a notebook, you can authenticate from the terminal instead:
huggingface-cli login
Enter your Hugging Face access token when prompted.
This is what can happen when you don’t tick the option “Read access to contents of all public gated repos you can access” in the Repositories section when creating a token:
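Note the difference from the error at the top of this post: that one was a 401 (not authenticated at all), while the traceback below shows a 403 (authenticated, but the token lacks the gated-repo permission). That distinction can be summarized in a small sketch (the diagnose helper is purely illustrative):

```python
# Illustrative mapping from Hub HTTP status codes to the usual cause.
CAUSES = {
    401: "Not logged in: run huggingface-cli login or call login(token).",
    403: "Logged in, but the token lacks read access to public gated repos.",
    404: "The repo id is misspelled or the file does not exist.",
}

def diagnose(status_code: int) -> str:
    """Return a likely explanation for a Hub HTTP error code."""
    return CAUSES.get(status_code, "Unexpected status; read the full traceback.")
```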
HTTPError Traceback (most recent call last)
File ~\AppData\Roaming\Python\Python312\site-packages\huggingface_hub\utils\_http.py:409, in hf_raise_for_status(response, endpoint_name)
408 try:
--> 409 response.raise_for_status()
410 except HTTPError as e:
File C:\ProgramData\anaconda3\Lib\site-packages\requests\models.py:1024, in Response.raise_for_status(self)
1023 if http_error_msg:
-> 1024 raise HTTPError(http_error_msg, response=self)
HTTPError: 403 Client Error: Forbidden for url: https://huggingface.co/google/gemma-3-27b-it/resolve/main/config.json
The above exception was the direct cause of the following exception:
HfHubHTTPError Traceback (most recent call last)
File ~\AppData\Roaming\Python\Python312\site-packages\huggingface_hub\file_download.py:1376, in _get_metadata_or_catch_error(repo_id, filename, repo_type, revision, endpoint, proxies, etag_timeout, headers, token, local_files_only, relative_filename, storage_folder)
1375 try:
-> 1376 metadata = get_hf_file_metadata(
1377 url=url, proxies=proxies, timeout=etag_timeout, headers=headers, token=token
1378 )
1379 except EntryNotFoundError as http_error:
File ~\AppData\Roaming\Python\Python312\site-packages\huggingface_hub\utils\_validators.py:114, in validate_hf_hub_args.<locals>._inner_fn(*args, **kwargs)
112 kwargs = smoothly_deprecate_use_auth_token(fn_name=fn.__name__, has_token=has_token, kwargs=kwargs)
--> 114 return fn(*args, **kwargs)
File ~\AppData\Roaming\Python\Python312\site-packages\huggingface_hub\file_download.py:1296, in get_hf_file_metadata(url, token, proxies, timeout, library_name, library_version, user_agent, headers)
1295 # Retrieve metadata
-> 1296 r = _request_wrapper(
1297 method="HEAD",
1298 url=url,
1299 headers=hf_headers,
1300 allow_redirects=False,
1301 follow_relative_redirects=True,
1302 proxies=proxies,
1303 timeout=timeout,
1304 )
1305 hf_raise_for_status(r)
File ~\AppData\Roaming\Python\Python312\site-packages\huggingface_hub\file_download.py:280, in _request_wrapper(method, url, follow_relative_redirects, **params)
279 if follow_relative_redirects:
--> 280 response = _request_wrapper(
281 method=method,
282 url=url,
283 follow_relative_redirects=False,
284 **params,
285 )
287 # If redirection, we redirect only relative paths.
288 # This is useful in case of a renamed repository.
File ~\AppData\Roaming\Python\Python312\site-packages\huggingface_hub\file_download.py:304, in _request_wrapper(method, url, follow_relative_redirects, **params)
303 response = get_session().request(method=method, url=url, **params)
--> 304 hf_raise_for_status(response)
305 return response
File ~\AppData\Roaming\Python\Python312\site-packages\huggingface_hub\utils\_http.py:472, in hf_raise_for_status(response, endpoint_name)
467 message = (
468 f"\n\n{response.status_code} Forbidden: {error_message}."
469 + f"\nCannot access content at: {response.url}."
470 + "\nMake sure your token has the correct permissions."
471 )
--> 472 raise _format(HfHubHTTPError, message, response) from e
474 elif response.status_code == 416:
HfHubHTTPError: (Request ID: Root=1-67da8c81-5cfc507b7579482b4a616183;5e4029a1-b527-40bc-813f-35832a956bbe)
403 Forbidden: Please enable access to public gated repositories in your fine-grained token settings to view this repository..
Cannot access content at: https://huggingface.co/google/gemma-3-27b-it/resolve/main/config.json.
Make sure your token has the correct permissions.
The above exception was the direct cause of the following exception:
LocalEntryNotFoundError Traceback (most recent call last)
File ~\AppData\Roaming\Python\Python312\site-packages\transformers\utils\hub.py:342, in cached_file(path_or_repo_id, filename, cache_dir, force_download, resume_download, proxies, token, revision, local_files_only, subfolder, repo_type, user_agent, _raise_exceptions_for_gated_repo, _raise_exceptions_for_missing_entries, _raise_exceptions_for_connection_errors, _commit_hash, **deprecated_kwargs)
340 try:
341 # Load from URL or cache if already cached
--> 342 resolved_file = hf_hub_download(
343 path_or_repo_id,
344 filename,
345 subfolder=None if len(subfolder) == 0 else subfolder,
346 repo_type=repo_type,
347 revision=revision,
348 cache_dir=cache_dir,
349 user_agent=user_agent,
350 force_download=force_download,
351 proxies=proxies,
352 resume_download=resume_download,
353 token=token,
354 local_files_only=local_files_only,
355 )
356 except GatedRepoError as e:
File ~\AppData\Roaming\Python\Python312\site-packages\huggingface_hub\utils\_validators.py:114, in validate_hf_hub_args.<locals>._inner_fn(*args, **kwargs)
112 kwargs = smoothly_deprecate_use_auth_token(fn_name=fn.__name__, has_token=has_token, kwargs=kwargs)
--> 114 return fn(*args, **kwargs)
File ~\AppData\Roaming\Python\Python312\site-packages\huggingface_hub\file_download.py:862, in hf_hub_download(repo_id, filename, subfolder, repo_type, revision, library_name, library_version, cache_dir, local_dir, user_agent, force_download, proxies, etag_timeout, token, local_files_only, headers, endpoint, resume_download, force_filename, local_dir_use_symlinks)
861 else:
--> 862 return _hf_hub_download_to_cache_dir(
863 # Destination
864 cache_dir=cache_dir,
865 # File info
866 repo_id=repo_id,
867 filename=filename,
868 repo_type=repo_type,
869 revision=revision,
870 # HTTP info
871 endpoint=endpoint,
872 etag_timeout=etag_timeout,
873 headers=hf_headers,
874 proxies=proxies,
875 token=token,
876 # Additional options
877 local_files_only=local_files_only,
878 force_download=force_download,
879 )
File ~\AppData\Roaming\Python\Python312\site-packages\huggingface_hub\file_download.py:969, in _hf_hub_download_to_cache_dir(cache_dir, repo_id, filename, repo_type, revision, endpoint, etag_timeout, headers, proxies, token, local_files_only, force_download)
968 # Otherwise, raise appropriate error
--> 969 _raise_on_head_call_error(head_call_error, force_download, local_files_only)
971 # From now on, etag, commit_hash, url and size are not None.
File ~\AppData\Roaming\Python\Python312\site-packages\huggingface_hub\file_download.py:1489, in _raise_on_head_call_error(head_call_error, force_download, local_files_only)
1487 else:
1488 # Otherwise: most likely a connection issue or Hub downtime => let's warn the user
-> 1489 raise LocalEntryNotFoundError(
1490 "An error happened while trying to locate the file on the Hub and we cannot find the requested files"
1491 " in the local cache. Please check your connection and try again or make sure your Internet connection"
1492 " is on."
1493 ) from head_call_error
LocalEntryNotFoundError: An error happened while trying to locate the file on the Hub and we cannot find the requested files in the local cache. Please check your connection and try again or make sure your Internet connection is on.
The above exception was the direct cause of the following exception:
OSError Traceback (most recent call last)
Cell In[22], line 8
5 device = f'cuda:{cuda.current_device()}' if cuda.is_available() else 'cpu'
7 # begin initializing HF items, you need an access token
----> 8 model_config = transformers.AutoConfig.from_pretrained(
9 model_id,
10 use_auth_token=mytoken
11 )
File ~\AppData\Roaming\Python\Python312\site-packages\transformers\models\auto\configuration_auto.py:1090, in AutoConfig.from_pretrained(cls, pretrained_model_name_or_path, **kwargs)
1087 trust_remote_code = kwargs.pop("trust_remote_code", None)
1088 code_revision = kwargs.pop("code_revision", None)
-> 1090 config_dict, unused_kwargs = PretrainedConfig.get_config_dict(pretrained_model_name_or_path, **kwargs)
1091 has_remote_code = "auto_map" in config_dict and "AutoConfig" in config_dict["auto_map"]
1092 has_local_code = "model_type" in config_dict and config_dict["model_type"] in CONFIG_MAPPING
File ~\AppData\Roaming\Python\Python312\site-packages\transformers\configuration_utils.py:594, in PretrainedConfig.get_config_dict(cls, pretrained_model_name_or_path, **kwargs)
592 original_kwargs = copy.deepcopy(kwargs)
593 # Get config dict associated with the base config file
--> 594 config_dict, kwargs = cls._get_config_dict(pretrained_model_name_or_path, **kwargs)
595 if config_dict is None:
596 return {}, kwargs
File ~\AppData\Roaming\Python\Python312\site-packages\transformers\configuration_utils.py:653, in PretrainedConfig._get_config_dict(cls, pretrained_model_name_or_path, **kwargs)
649 configuration_file = kwargs.pop("_configuration_file", CONFIG_NAME) if gguf_file is None else gguf_file
651 try:
652 # Load from local folder or from cache or download from model Hub and cache
--> 653 resolved_config_file = cached_file(
654 pretrained_model_name_or_path,
655 configuration_file,
656 cache_dir=cache_dir,
657 force_download=force_download,
658 proxies=proxies,
659 resume_download=resume_download,
660 local_files_only=local_files_only,
661 token=token,
662 user_agent=user_agent,
663 revision=revision,
664 subfolder=subfolder,
665 _commit_hash=commit_hash,
666 )
667 if resolved_config_file is None:
668 return None, kwargs
File ~\AppData\Roaming\Python\Python312\site-packages\transformers\utils\hub.py:385, in cached_file(path_or_repo_id, filename, cache_dir, force_download, resume_download, proxies, token, revision, local_files_only, subfolder, repo_type, user_agent, _raise_exceptions_for_gated_repo, _raise_exceptions_for_missing_entries, _raise_exceptions_for_connection_errors, _commit_hash, **deprecated_kwargs)
379 if (
380 resolved_file is not None
381 or not _raise_exceptions_for_missing_entries
382 or not _raise_exceptions_for_connection_errors
383 ):
384 return resolved_file
--> 385 raise EnvironmentError(
386 f"We couldn't connect to '{HUGGINGFACE_CO_RESOLVE_ENDPOINT}' to load this file, couldn't find it in the"
387 f" cached files and it looks like {path_or_repo_id} is not the path to a directory containing a file named"
388 f" {full_filename}.\nCheckout your internet connection or see how to run the library in offline mode at"
389 " 'https://huggingface.co/docs/transformers/installation#offline-mode'."
390 ) from e
391 except EntryNotFoundError as e:
392 if not _raise_exceptions_for_missing_entries:
OSError: We couldn't connect to 'https://huggingface.co' to load this file, couldn't find it in the cached files and it looks like goole/gemma-3-27b-it is not the path to a directory containing a file named config.json.
Checkout your internet connection or see how to run the library in offline mode at 'https://huggingface.co/docs/transformers/installation#offline-mode'.