v0.28.0: DataLoaderConfig, XLA improvements, FSDP + QLORA foundations, Gradient Synchronization Tweaks, and Bug Fixes
DataLoaderConfiguration and begin deprecation of arguments in the Accelerator

Dataloader-related arguments such as split_batches and dispatch_batches should now be bundled into a DataLoaderConfiguration and passed to the Accelerator via dataloader_config; passing them directly to Accelerator is deprecated:

```diff
+from accelerate import DataLoaderConfiguration
+dl_config = DataLoaderConfiguration(split_batches=True, dispatch_batches=True)
-accelerator = Accelerator(split_batches=True, dispatch_batches=True)
+accelerator = Accelerator(dataloader_config=dl_config)
```
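As a rough mental model of what split_batches controls (a simplified sketch with a hypothetical helper, not Accelerate's actual implementation):

```python
# Hypothetical helper illustrating split_batches semantics; not part of
# the Accelerate API.
def per_process_batch(global_batch, num_processes, process_index, split_batches):
    if split_batches:
        # One batch from the dataloader is sliced across processes, so the
        # observed batch size stays the same regardless of process count.
        shard = len(global_batch) // num_processes
        return global_batch[process_index * shard:(process_index + 1) * shard]
    # Otherwise each process draws its own full batch, multiplying the
    # effective global batch size by num_processes.
    return global_batch

batch = list(range(8))
print(per_process_batch(batch, num_processes=4, process_index=1, split_batches=True))
# -> [2, 3]
```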
Gradient Synchronization Tweaks

GradientAccumulationPlugin gains a sync_each_batch option, which forces gradient synchronization on every batch during accumulation rather than only on the final one:

```python
from accelerate import Accelerator
from accelerate.utils import GradientAccumulationPlugin

plugin = GradientAccumulationPlugin(
    num_steps=2,
    sync_each_batch=True,  # sync gradients on every batch, not just at step boundaries
)
accelerator = Accelerator(gradient_accumulation_plugin=plugin)
```
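As a mental model of the trade-off (a simplified sketch; sync_points is a made-up helper, not Accelerate code), sync_each_batch changes which micro-batches trigger a gradient all-reduce:

```python
def sync_points(num_batches, num_steps, sync_each_batch):
    """Return the 1-indexed micro-batches that trigger a gradient sync
    in this simplified model of gradient accumulation."""
    if sync_each_batch:
        # Every micro-batch all-reduces: more communication, but lower
        # peak memory for pending gradients.
        return list(range(1, num_batches + 1))
    # Default behavior: skip synchronization until each accumulation
    # step boundary (every num_steps batches).
    return [i for i in range(1, num_batches + 1) if i % num_steps == 0]

print(sync_points(4, num_steps=2, sync_each_batch=False))  # -> [2, 4]
print(sync_points(4, num_steps=2, sync_each_batch=True))   # -> [1, 2, 3, 4]
```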
Launch changes

- mpirun for multi-cpu training by @dmsuehir in https://github.com/huggingface/accelerate/pull/2493
- is_torch_tensor over hasattr for torch.compile. by @PhilJd in https://github.com/huggingface/accelerate/pull/2387
- DataLoaderConfig by @muellerzr in https://github.com/huggingface/accelerate/pull/2441
- is_namedtuple implementation by @fxmarty in https://github.com/huggingface/accelerate/pull/2475
- os.path.sep.join path manipulations with a helper by @akx in https://github.com/huggingface/accelerate/pull/2446
- XLA device type by @will-cromar in https://github.com/huggingface/accelerate/pull/2467
- Accelerator to detect distributed type from the "LOCAL_RANK" env variable for XPU by @faaany in https://github.com/huggingface/accelerate/pull/2473
- accelerate launch by @muellerzr in https://github.com/huggingface/accelerate/pull/2498
- (----main_process_port to --main_process_port) by @DerrickWang005 in https://github.com/huggingface/accelerate/pull/2516
- PYTORCH_NVML_BASED_CUDA_CHECK when calling accelerate.utils.imports.is_cuda_available() by @luiscape in https://github.com/huggingface/accelerate/pull/2524
- env=os.environ.copy()s by @akx in https://github.com/huggingface/accelerate/pull/2449
- zero_grad(set_to_none=None) to align with PyTorch by @yongchanghao in https://github.com/huggingface/accelerate/pull/2472

Full Changelog: https://github.com/huggingface/accelerate/compare/v0.27.2...v0.28.0
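On the env=os.environ.copy() cleanup: a subprocess inherits the parent's environment by default, so copying os.environ is only needed when the child's environment must differ. A minimal sketch (DEMO_VAR is a made-up variable name for illustration):

```python
import os
import subprocess
import sys

# No env argument: the child simply inherits os.environ, so passing
# env=os.environ.copy() here would be redundant.
inherited = subprocess.run(
    [sys.executable, "-c", "import os; print(os.environ.get('DEMO_VAR'))"],
    capture_output=True, text=True,
)

# Copy-and-modify only when the child needs extra or overridden variables.
env = os.environ.copy()
env["DEMO_VAR"] = "42"
modified = subprocess.run(
    [sys.executable, "-c", "import os; print(os.environ['DEMO_VAR'])"],
    capture_output=True, text=True, env=env,
)

print(modified.stdout.strip())  # -> 42
```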