Graphcore Bow IPU with Hugging Face Optimum adds 10 IPU-optimized transformers
AI Impact Summary
Graphcore and Hugging Face have expanded Hugging Face Optimum with 10 IPU-optimized transformer models spanning NLP, vision, and speech, each shipped with IPU-specific configurations and weights. The Bow IPU, paired with Poplar SDK 2.5, delivers up to 350 teraFLOPS with roughly 16% better power efficiency, and the bundled IPU configurations allow seamless migration from prior IPU generations with no code changes. This lowers the barrier to deploying large transformer workloads on IPUs, accelerating time-to-value for applications such as translation, sentiment analysis, and multimodal tasks, while tight integration with Hugging Face Hub datasets enables end-to-end experimentation and fine-tuning.
Affected Systems
- Date: not specified
- Change type: capability
- Severity: info