A fix has been introduced for a breaking change in `PPOTrainer.push_to_hub()` and `DDPOTrainer.push_to_hub()`.
* [PPOTrainer / DDPOTrainer] Fix ppo & ddpo push to Hub by @younesbelkada in https://github.com/huggingface/trl/pull/1141

**Full Changelog**: https://github.com/huggingface/trl/compare/v0.7.6...v0.7.7