DeepSpeed
Resource URI: http://dbpedia.org/resource/DeepSpeed (an entity of type: Thing)
DeepSpeed is an open-source deep learning optimization library for PyTorch. The library is designed to reduce computing power and memory use and to train large distributed models with better parallelism on existing computer hardware. DeepSpeed is optimized for low-latency, high-throughput training. It includes the Zero Redundancy Optimizer (ZeRO) for training models with one trillion or more parameters. Features include mixed-precision training; single-GPU, multi-GPU, and multi-node training; and custom model parallelism. The DeepSpeed source code is licensed under the MIT License and is available on GitHub. The development team claimed up to a 6.2x throughput improvement, 2.8x faster convergence, and 4.6x less communication.
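In typical use, a PyTorch model is wrapped by a DeepSpeed engine that is configured through a JSON-style dictionary. The following is a minimal sketch of how the features mentioned above (ZeRO partitioning and fp16 mixed precision) might be enabled; the ToyModel class, batch size, and learning rate are illustrative placeholders rather than anything stated on this page, and running it assumes PyTorch, DeepSpeed, and a CUDA GPU are available.

import torch
import deepspeed


class ToyModel(torch.nn.Module):
    """Placeholder model standing in for a large network."""

    def __init__(self):
        super().__init__()
        self.net = torch.nn.Linear(1024, 1024)

    def forward(self, x):
        return self.net(x)


# JSON-style DeepSpeed configuration: enables fp16 mixed precision and
# ZeRO stage-2 partitioning of optimizer states and gradients.
ds_config = {
    "train_batch_size": 32,
    "fp16": {"enabled": True},
    "zero_optimization": {"stage": 2},
    "optimizer": {"type": "Adam", "params": {"lr": 1e-4}},
}

model = ToyModel()

# deepspeed.initialize wraps the model in an engine that manages
# distributed data parallelism, ZeRO, and fp16 loss scaling.
model_engine, optimizer, _, _ = deepspeed.initialize(
    model=model,
    model_parameters=model.parameters(),
    config=ds_config,
)

# One training step: the engine replaces the usual
# loss.backward() / optimizer.step() calls.
batch = torch.randn(32, 1024, dtype=torch.half, device=model_engine.device)
loss = model_engine(batch).float().mean()
model_engine.backward(loss)
model_engine.step()

For multi-GPU or multi-node training, a script like this would normally be started with DeepSpeed's command-line launcher, for example: deepspeed --num_gpus=8 train.py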
Name: DeepSpeed
Wikipedia page ID: 64396232
Wikipedia revision ID: 1109254514
Latest release: v0.7.0 (2022-08-01)
Logo: DeepSpeed logo.svg
Initial release: 2020-05-18
Wikipedia page length: 4228