Distributed Training: Train BART/T5 for Summarization using 🤗 Transformers and Amazon SageMaker

https://huggingface.co/blog/sagemaker-distributed-training-seq2seq

In this tutorial, we will use the new Hugging Face DLCs and the Amazon SageMaker extension to train a distributed Seq2Seq transformer model on the summarization task using the transformers and datasets libraries, and then upload the model to huggingface.co and test it. As the distributed training strategy, we will use SageMaker Data Parallelism, which has been built into the Trainer API.
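As a minimal sketch of what launching such a job looks like, the snippet below configures the SageMaker Python SDK's HuggingFace estimator with SageMaker Data Parallelism enabled. The script name, source directory, IAM role, hyperparameters, and library versions are illustrative placeholders, not the exact values from the tutorial.

```python
from sagemaker.huggingface import HuggingFace

# Hyperparameters forwarded to the training script's CLI
# (names are illustrative; adapt them to your fine-tuning script)
hyperparameters = {
    'model_name_or_path': 'facebook/bart-large-cnn',
    'dataset_name': 'samsum',
    'do_train': True,
    'do_eval': True,
    'num_train_epochs': 3,
    'per_device_train_batch_size': 4,
    'output_dir': '/opt/ml/model',
}

# Enable SageMaker Data Parallelism, the distributed strategy
# that is built into the transformers Trainer API
distribution = {'smdistributed': {'dataparallel': {'enabled': True}}}

huggingface_estimator = HuggingFace(
    entry_point='run_summarization.py',      # assumed fine-tuning script
    source_dir='./scripts',                  # assumed local directory
    instance_type='ml.p3.16xlarge',          # multi-GPU instance; SM data parallelism
                                             # requires p3.16xlarge, p3dn.24xlarge, or p4d.24xlarge
    instance_count=2,
    role='<your-sagemaker-execution-role>',  # placeholder IAM role
    transformers_version='4.6',              # illustrative DLC versions
    pytorch_version='1.7',
    py_version='py36',
    distribution=distribution,
    hyperparameters=hyperparameters,
)

# Launch the distributed training job
huggingface_estimator.fit()
```

With two ml.p3.16xlarge instances, the training script is replicated across all GPUs and gradients are synchronized by the smdistributed backend; the Trainer picks this up automatically, so the training script itself needs no extra distributed-training code.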
