Hugging Face integrates Public AI as an Inference Provider
Action Required
Developers can now call Public AI's inference endpoints directly from their Hugging Face workflows, expanding access to a wider range of AI models.
AI Impact Summary
Hugging Face has integrated Public AI as a supported Inference Provider on the Hugging Face Hub, significantly expanding access to public and sovereign AI models. Developers can use Public AI's infrastructure (built on vLLM and distributed across partner clusters) directly from the Hub's model pages and client SDKs, simplifying model deployment and reducing operational complexity. Users can now reach models such as the Swiss AI Initiative's Apertus-70B through Public AI's platform without managing their own infrastructure.
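As a rough illustration of what "directly within the client SDKs" can look like, the sketch below builds an OpenAI-style chat-completions request against Hugging Face's Inference Providers router using only the standard library. The router URL, the `publicai` provider slug, and the `swiss-ai/Apertus-70B-Instruct-2509` model id are assumptions, not confirmed by this announcement; the network call only fires when an `HF_TOKEN` environment variable is set.

```python
import json
import os
import urllib.request

# Assumed router endpoint for Hugging Face Inference Providers.
ROUTER_URL = "https://router.huggingface.co/v1/chat/completions"


def build_request(prompt,
                  model="swiss-ai/Apertus-70B-Instruct-2509",
                  provider="publicai"):
    """Build an OpenAI-compatible chat payload, pinning the provider
    by suffixing the model id (provider slug is an assumption)."""
    return {
        "model": f"{model}:{provider}",
        "messages": [{"role": "user", "content": prompt}],
        "max_tokens": 128,
    }


payload = build_request("What is sovereign AI?")

token = os.environ.get("HF_TOKEN")
if token:
    # Send the request only when credentials are available.
    req = urllib.request.Request(
        ROUTER_URL,
        data=json.dumps(payload).encode("utf-8"),
        headers={
            "Authorization": f"Bearer {token}",
            "Content-Type": "application/json",
        },
    )
    with urllib.request.urlopen(req) as resp:
        print(json.load(resp)["choices"][0]["message"]["content"])
else:
    # Offline: just show which model/provider pair would be targeted.
    print(payload["model"])
```

In practice the same routing can also be reached through the `huggingface_hub` SDK's `InferenceClient` by passing a `provider` argument, which hides the raw HTTP details shown here.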
Affected Systems
- Date: not specified
- Change type: capability
- Severity: high