huggingface_hub
Feb 27, 2024
v0.21.0: dataclasses everywhere, file-system, PyTorchModelHubMixin, serialization and more.

Discuss the release in our Community Tab. Feedback welcome! 🤗

🖇️ Dataclasses everywhere!

All objects returned by the HfApi client are now dataclasses!

In the past, objects were variously dataclasses, typed dictionaries, untyped dictionaries, or even plain classes. This has now all been harmonized with the goal of improving the developer experience.

Kudos to the community for implementing and testing the harmonization process. Thanks again for the contributions!

  • Use dataclasses for all objects returned by HfApi #1911 by @Ahmedniz1 in #1974
  • Updating HfApi objects to use dataclass by @Ahmedniz1 in #1988
  • Dataclasses for objects returned hf api by @NouamaneELGueddarii in #1993
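To illustrate what the harmonization buys downstream code, here is a minimal, self-contained sketch. ModelInfoSketch is a made-up stand-in, not the actual huggingface_hub class: the point is that a dataclass gives typed attribute access, sensible defaults, and a readable repr where a plain dictionary would not.

```python
from dataclasses import dataclass, field
from typing import List, Optional

# Hypothetical stand-in for an HfApi return type; field names are illustrative only.
@dataclass
class ModelInfoSketch:
    id: str
    downloads: int = 0
    tags: List[str] = field(default_factory=list)
    pipeline_tag: Optional[str] = None

info = ModelInfoSketch(id="bert-base-uncased", downloads=12345, tags=["fill-mask"])
print(info.id)            # attribute access instead of info["id"]
print(info.pipeline_tag)  # unset fields default to None instead of raising KeyError
```

With plain dictionaries, a missing key raises at access time and IDEs cannot autocomplete fields; with dataclasses both problems disappear, which is the developer-experience gain described above.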

💾 FileSystem

The HfFileSystem class implements the fsspec interface, allowing files to be read and written through a filesystem-like API. The interface is heavily used by the datasets library, and this release further improves the efficiency and robustness of the integration.

  • Pass revision in path to AbstractBufferedFile init by @albertvillanova in #1948
  • [HfFileSystem] Fix rm on branch by @lhoestq in #1957
  • Retry fetching data on 502 error in HfFileSystem by @mariosasko in #1981
  • Add HfFileSystemStreamFile by @lhoestq in #1967
  • [HfFileSystem] Copy non lfs files by @lhoestq in #1996
  • Add HfFileSystem.url method by @mariosasko in #2027
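One of the fixes above passes the revision inside the path itself. As a rough, self-contained sketch of that path convention (parse_hub_path is hypothetical, not the library's actual parser), a path like datasets/user/repo@revision/file can be split as follows:

```python
# Hedged sketch of the "repo_type/namespace/repo@revision/path" convention
# used by HfFileSystem-style paths; not the real implementation.
def parse_hub_path(path: str):
    repo_type, namespace, rest = path.split("/", 2)
    if "@" in rest.split("/", 1)[0]:
        # an explicit revision rides along in the repo segment
        repo_part, _, remainder = rest.partition("/")
        repo, _, revision = repo_part.partition("@")
    else:
        repo, _, remainder = rest.partition("/")
        revision = "main"  # assumed default branch
    return {
        "repo_type": repo_type,
        "repo_id": f"{namespace}/{repo}",
        "revision": revision,
        "path_in_repo": remainder,
    }

print(parse_hub_path("datasets/user/my-repo@dev/data/train.csv"))
```

Carrying the revision in the path is what lets fsspec consumers like datasets address a branch or commit without extra keyword arguments.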

🧩 Pytorch Hub Mixin

The PyTorchModelHubMixin class lets you upload any PyTorch model to the Hub in a few lines of code. More precisely, it can be inherited by any nn.Module subclass to add the from_pretrained, save_pretrained and push_to_hub helpers to your class. It handles serialization and deserialization of weights and configs for you and enables download counts on the Hub.

With this release, we've fixed two pain points that were holding users back:

  1. Configs are now better handled. The mixin automatically detects if the base class defines a config, saves it on the Hub and then injects it at load time, either as a dictionary or a dataclass depending on the base class's expectations.
  2. Weights are now saved as .safetensors files instead of PyTorch pickles for safety reasons. Loading from existing PyTorch pickles is still supported, but we are moving toward fully deprecating them in the mid to long term.
  • Better config support in ModelHubMixin by @Wauplin in #2001
  • Use safetensors by default for PyTorchModelHubMixin by @bmuskalla in #2033
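For intuition, here is a minimal sketch of the mixin pattern the class relies on. HubMixinSketch is a toy stand-in, not the real PyTorchModelHubMixin: it only round-trips a JSON config locally, with no weights, Hub upload, or safetensors handling.

```python
import json
import os
import tempfile

# Hypothetical mixin: inheriting it adds save_pretrained / from_pretrained
# to any class that exposes a dict at self.config.
class HubMixinSketch:
    def save_pretrained(self, save_directory):
        os.makedirs(save_directory, exist_ok=True)
        with open(os.path.join(save_directory, "config.json"), "w") as f:
            json.dump(self.config, f)

    @classmethod
    def from_pretrained(cls, save_directory):
        # the config is injected back into the constructor at load time
        with open(os.path.join(save_directory, "config.json")) as f:
            config = json.load(f)
        return cls(**config)

class MyModel(HubMixinSketch):
    def __init__(self, hidden_size=16):
        self.config = {"hidden_size": hidden_size}

model = MyModel(hidden_size=32)
with tempfile.TemporaryDirectory() as tmp:
    model.save_pretrained(tmp)
    reloaded = MyModel.from_pretrained(tmp)
print(reloaded.config["hidden_size"])
```

The real mixin follows the same shape but additionally serializes weights (now as .safetensors) and wires in push_to_hub.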

✨ InferenceClient improvements

The audio-to-audio task is now supported by the InferenceClient!

>>> from huggingface_hub import InferenceClient
>>> client = InferenceClient()
>>> audio_output = client.audio_to_audio("audio.flac")
>>> for i, item in enumerate(audio_output):
...     with open(f"output_{i}.flac", "wb") as f:
...         f.write(item["blob"])
  • Added audio to audio in inference client by @Ahmedniz1 in #2020

Also fixed a few things:

  • Fix intolerance for new field in TGI stream response: 'index' by @danielpcox in #2006
  • Fix optional model in tabular tasks by @Wauplin in #2018
  • Added best_of to non-TGI ignored parameters by @dopc in #1949

📤 Model serialization

With the aim of harmonizing repo structures and file serialization on the Hub, we added a new serialization module with a first helper, split_state_dict_into_shards, that takes a state dict and splits it into shards. The implementation is mostly taken from transformers and is meant to be reused by other libraries in the ecosystem. It seamlessly supports torch, tensorflow and numpy weights, and can be easily extended to other frameworks.

This is a first step in the harmonization process and more loading/saving helpers will be added soon.

  • Framework-agnostic split_state_dict_into_shards helper by @Wauplin in #1938
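The core idea of the helper can be sketched in a few lines. split_into_shards below is a simplified stand-in, not the library function: it operates on tensor byte sizes rather than real tensors, greedily packing entries into a shard until a maximum size would be exceeded, then starting a new shard.

```python
# Hedged sketch of greedy state-dict sharding; sizes are in arbitrary byte units.
def split_into_shards(state_dict_sizes, max_shard_size):
    shards, current, current_size = [], {}, 0
    for name, size in state_dict_sizes.items():
        # start a new shard when adding this tensor would exceed the budget
        if current and current_size + size > max_shard_size:
            shards.append(current)
            current, current_size = {}, 0
        current[name] = size
        current_size += size
    if current:
        shards.append(current)
    return shards

sizes = {"embed": 40, "layer1": 30, "layer2": 30, "head": 10}
print(split_into_shards(sizes, max_shard_size=64))
# → [{'embed': 40}, {'layer1': 30, 'layer2': 30}, {'head': 10}]
```

Being framework-agnostic then mostly comes down to how each tensor's byte size is computed per backend, which is why the same splitting logic can serve torch, tensorflow and numpy weights.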

📚 Documentation

🌐 Translations

The community is actively working to translate the huggingface_hub documentation into other languages. Docs are now available in Simplified Chinese (here) and in French (here) to help democratize good machine learning!

  • [i18n-CN] Translated some files to simplified Chinese #1915 by @2404589803 in #1916
  • Update .github workflow to build cn docs on PRs by @Wauplin in #1931
  • [i18n-FR] Translated files in french and reviewed them by @JibrilEl in #2024

Docs misc

  • Document base_model in modelcard metadata by @Wauplin in #1936
  • Update the documentation of add_collection_item by @FremyCompany in #1958
  • Docs[i18n-en]: added pkgx as an installation method to the docs by @michaelessiet in #1955
  • Added hf_transfer extra into setup.py and docs/ by @jamesbraza in #1970
  • Documenting CLI default for download --repo-type by @jamesbraza in #1986
  • Update repository.md by @xmichaelmason in #2010

Docs fixes

  • Fix URL in get_safetensors_metadata docstring by @Wauplin in #1951
  • Fix grammar by @Anthonyg5005 in #2003
  • Fix doc by @jordane95 in #2013
  • typo fix by @Decryptu in #2035

🛠️ Misc improvements

Creating a commit with an invalid README will fail early instead of uploading all LFS files before failing to commit.

  • Fail early on invalid metadata by @Wauplin in #1934

Added a revision_exists helper, working similarly to repo_exists and file_exists:

>>> from huggingface_hub import revision_exists
>>> revision_exists("google/gemma-7b", "float16")
True
>>> revision_exists("google/gemma-7b", "not-a-revision")
False
  • Add revision_exists helper by @Wauplin in #2042

InferenceClient.wait(...) now raises an error if the endpoint is in a failed state.

  • raise on failed inference endpoint by @Wauplin in #1935

Improved progress bar when downloading a file

  • improve http_get by @Wauplin in #1954

Other stuff:

  • added will not echo message to the login token message by @vtrenton in #1925
  • Raise if repo is disabled by @Wauplin in #1965
  • Fix timezone in datetime parsing by @Wauplin in #1982
  • retry on any 5xx on upload by @Wauplin in #2026

💔 Breaking changes

  • Classes ModelFilter and DatasetFilter are deprecated when listing models and datasets in favor of a simpler API that lets you pass the parameters directly to list_models and list_datasets.
>>> from huggingface_hub import list_models, ModelFilter

# use
>>> list_models(language="zh")

# instead of
>>> list_models(filter=ModelFilter(language="zh"))

Cleaner, right? ModelFilter and DatasetFilter will still be supported until the v0.24 release.

  • Deprecate ModelFilter/DatasetFilter by @druvdub in #2028
  • List models tweaks by @julien-c in #2044
  • In the InferenceClient, ModelStatus.compute_type is no longer a string but a dictionary with more detailed information (instance type and number of replicas). This breaking change reflects a server-side update.
  • Fix ModelStatus compute type by @Wauplin in #2047

Small fixes and maintenance

⚙️ fixes

  • Make GitRefs backward comp by @Wauplin in #1960
  • Fix pagination when listing discussions by @Wauplin in #1962
  • Fix inconsistent warnings.warn in repocard.py by @Wauplin in #1980
  • fix: actual error won't be raised while force_download=True by @scruel in #1983
  • Fix download from private renamed repo by @Wauplin in #1999
  • Disable tqdm progress bar if no TTY attached by @mssalvatore in #2000
  • Deprecate legacy parameters in update_repo_visibility by @Wauplin in #2014
  • Fix getting widget_data from model_info by @Wauplin in #2041

⚙️ internal

  • prepare for 0.21.0 by @Wauplin in #1928
  • Remove PRODUCTION_TOKEN by @Wauplin in #1937
  • Add reminder for model card consistency by @Wauplin in #1979
  • Finished migration from setup.cfg to pyproject.toml by @jamesbraza in #1971
  • Newer pre-commit by @jamesbraza in #1987
  • Removed now unnecessary setup.cfg path variable by @jamesbraza in #1990
  • Added toml-sort tool by @jamesbraza in #1972
  • update name of dummy dataset user by @Wauplin in #2019

Significant community contributions

The following contributors have made significant changes to the library over the last release:

  • @2404589803
    • [i18n-CN] Translated some files to simplified Chinese #1915 (#1916)
  • @jamesbraza
    • Added hf_transfer extra into setup.py and docs/ (#1970)
    • Finished migration from setup.cfg to pyproject.toml (#1971)
    • Documenting CLI default for download --repo-type (#1986)
    • Newer pre-commit (#1987)
    • Removed now unnecessary setup.cfg path variable (#1990)
    • Added toml-sort tool (#1972)
  • @Ahmedniz1
    • Use dataclasses for all objects returned by HfApi #1911 (#1974)
    • Updating HfApi objects to use dataclass (#1988)
    • Added audio to audio in inference client (#2020)
  • @druvdub
    • Deprecate ModelFilter/DatasetFilter (#2028)
  • @JibrilEl
    • [i18n-FR] Translated files in french and reviewed them (#2024)
  • @bmuskalla
    • Use safetensors by default for PyTorchModelHubMixin (#2033)
Jan 22, 2024
0.20.3 hot-fix: Fix HfFolder login when env variable not set

This patch release fixes an issue when retrieving the locally saved token with huggingface_hub.HfFolder.get_token. For the record, this method is planned for deprecation in favor of huggingface_hub.get_token, which is more robust and versatile. The issue came from a breaking change introduced in https://github.com/huggingface/huggingface_hub/pull/1895, meaning only 0.20.x is affected.

For more details, please refer to https://github.com/huggingface/huggingface_hub/pull/1966.

Full Changelog: https://github.com/huggingface/huggingface_hub/compare/v0.20.2...v0.20.3

Jan 5, 2024
0.20.2 hot-fix: Fix concurrency issues in google colab login

A concurrency issue when using userdata.get to retrieve the HF_TOKEN token led to deadlocks when downloading files in parallel. This hot-fix release fixes the issue by acquiring a global lock before fetching the token from the secrets vault. More details in https://github.com/huggingface/huggingface_hub/pull/1953.

Full Changelog: https://github.com/huggingface/huggingface_hub/compare/v0.20.1...v0.20.2

Dec 20, 2023
0.20.1 hot-fix: Fix circular import

This hot-fix release fixes a circular import error that occurred when importing the login or logout helpers from huggingface_hub.

Related PR: https://github.com/huggingface/huggingface_hub/pull/1930

Full Changelog: https://github.com/huggingface/huggingface_hub/compare/v0.20.0...v0.20.1
