Hugging Face integrates Protect AI Guardian to auto-scan models for insecure serialization
AI Impact Summary
Hugging Face has partnered with Protect AI to embed Guardian into the Hub's security tooling, targeting unsafe serialization formats such as pickle that can lead to arbitrary code execution on load. Guardian, together with PickleScan, automatically scans all public model repositories on push and surfaces results in the frontend, including new Pickle-specific indicators. The rollout emphasizes safer model sharing at scale: hundreds of millions of files have already been scanned, though backfilling results across the more than one million existing repositories may take time. This reduces the risk of distributing exploitable artifacts through the HF Hub while guiding teams via the security docs and knowledge base.
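To see why pickle files warrant scanning at all, here is a minimal sketch of the underlying risk. The `Payload` class is purely illustrative (not from Guardian or the Hub): any object can override `__reduce__` so that merely calling `pickle.loads` on the bytes invokes an attacker-chosen callable; here the callable is the harmless `print`, but it could just as easily be `os.system`.

```python
import pickle

class Payload:
    """Illustrative malicious object: pickle stores a callable plus its
    arguments and invokes them at deserialization time."""
    def __reduce__(self):
        # The harmless print stands in for something like os.system("...").
        return (print, ("arbitrary code ran during pickle.loads()",))

blob = pickle.dumps(Payload())
result = pickle.loads(blob)  # the callable runs here, before the result is used
# `result` is just print's return value (None) -- the damage is the side effect.
```

This is why scanners like PickleScan inspect the opcode stream for suspicious imports and calls rather than loading the file, and why formats like safetensors, which store only tensor data, avoid the problem entirely.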
Affected Systems
- Date: not specified
- Change type: capability
- Severity: info