
Release v1.0.17


July 7, 2025

  • MobileNet-v5 backbone tweaks for improved Google Gemma 3n behaviour (to pair with updated official weights)
    • Add stem bias (zeroed in updated weights; a compatibility break with old weights)
    • Switch GELU -> GELU (tanh approximation), a minor change to more closely match the JAX implementation
  • Add two arguments to layer-decay support: a minimum scale clamp and a 'no optimization' scale threshold
  • Add 'Fp32' LayerNorm, RMSNorm, SimpleNorm variants that can be enabled to force norm computation in float32
  • Some typing and argument cleanup for norm and norm+act layers, done alongside the above
  • Support Naver ROPE-ViT (https://github.com/naver-ai/rope-vit) in eva.py; add RotaryEmbeddingMixed module for mixed mode; weights on the Hugging Face Hub
| model | img_size | top1 | top5 | param_count |
|---|---|---|---|---|
| vit_large_patch16_rope_mixed_ape_224.naver_in1k | 224 | 84.84 | 97.122 | 304.4 |
| vit_large_patch16_rope_mixed_224.naver_in1k | 224 | 84.828 | 97.116 | 304.2 |
| vit_large_patch16_rope_ape_224.naver_in1k | 224 | 84.65 | 97.154 | 304.37 |
| vit_large_patch16_rope_224.naver_in1k | 224 | 84.648 | 97.122 | 304.17 |
| vit_base_patch16_rope_mixed_ape_224.naver_in1k | 224 | 83.894 | 96.754 | 86.59 |
| vit_base_patch16_rope_mixed_224.naver_in1k | 224 | 83.804 | 96.712 | 86.44 |
| vit_base_patch16_rope_ape_224.naver_in1k | 224 | 83.782 | 96.61 | 86.59 |
| vit_base_patch16_rope_224.naver_in1k | 224 | 83.718 | 96.672 | 86.43 |
| vit_small_patch16_rope_224.naver_in1k | 224 | 81.23 | 95.022 | 21.98 |
| vit_small_patch16_rope_mixed_224.naver_in1k | 224 | 81.216 | 95.022 | 21.99 |
| vit_small_patch16_rope_ape_224.naver_in1k | 224 | 81.004 | 95.016 | 22.06 |
| vit_small_patch16_rope_mixed_ape_224.naver_in1k | 224 | 80.986 | 94.976 | 22.06 |
  • Some cleanup of ROPE modules, helpers, and FX tracing leaf registration
  • Preparing version 1.0.17 release
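
The exact-vs-tanh GELU distinction mentioned above is easy to inspect with PyTorch's built-in activation (the tanh form is what JAX uses by default); this is a generic illustration, not timm's internal code:

```python
import torch
import torch.nn as nn

# Exact (erf-based) GELU vs. the tanh approximation.
gelu_exact = nn.GELU()
gelu_tanh = nn.GELU(approximate='tanh')

x = torch.linspace(-4.0, 4.0, steps=9)
max_diff = (gelu_exact(x) - gelu_tanh(x)).abs().max().item()
# The two variants differ only slightly, but weights trained with one
# can shift subtly when evaluated with the other.
```

The difference is tiny pointwise, which is why this counts as a minor compatibility tweak rather than an architecture change.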
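
The two new layer-decay knobs can be sketched generically; the function name and exact semantics below are assumptions for illustration, not timm's actual API:

```python
def layer_decay_scales(num_layers, decay=0.75, min_scale=0.0, no_opt_scale=None):
    """Hypothetical sketch of layer-wise LR decay with two extra knobs:
    a minimum scale clamp, and a 'no optimization' threshold at or below
    which a layer's scale is zeroed (i.e. its params are left unoptimized)."""
    scales = []
    for i in range(num_layers + 1):
        scale = max(decay ** (num_layers - i), min_scale)  # clamp from below
        if no_opt_scale is not None and scale <= no_opt_scale:
            scale = 0.0  # treat as excluded from optimization
        scales.append(scale)
    return scales
```

Early layers get exponentially smaller learning-rate scales; the clamp keeps them from vanishing, while the threshold lets you freeze them outright instead.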
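
A float32-forced norm follows a common PyTorch pattern, sketched below; the class name is illustrative and this assumes the timm variants work along similar lines:

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class LayerNormFp32(nn.LayerNorm):
    """Illustrative sketch: run layer norm in float32 regardless of input
    dtype, then cast the result back. Helps numerical stability when the
    model otherwise runs in float16/bfloat16."""
    def forward(self, x):
        weight = self.weight.float() if self.weight is not None else None
        bias = self.bias.float() if self.bias is not None else None
        out = F.layer_norm(x.float(), self.normalized_shape, weight, bias, self.eps)
        return out.to(x.dtype)
```

The cast back to the input dtype keeps the module a drop-in replacement in a mixed-precision model.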

What's Changed

New Contributors

Full Changelog: https://github.com/huggingface/pytorch-image-models/compare/v1.0.16...v1.0.17
