Graphormer-based graph classification in HuggingFace Transformers (MolHIV) workflow
AI Impact Summary
HuggingFace Transformers now demonstrates graph classification using Graphormer, with ogbg-molhiv as the baseline dataset. The workflow uses preprocess_item to encode individual graphs and GraphormerDataCollator to batch them, feeding GraphormerForGraphClassification for fine-tuning via Trainer and TrainingArguments. This provides a ready-to-use pathway for molecular property prediction, but 20 epochs of training on CPU takes roughly a day, so GPUs or clusters are recommended for production workloads and faster experimentation. Teams should plan around hub-hosted datasets, on-the-fly preprocessing options, and the need to adjust num_classes and the classification head when switching datasets.
Affected Systems
- Date: not specified
- Change type: capability
- Severity: info