Pricing Update: No-Packing Fine-Tuning Jobs — token usage now calculated as `len(dataset) * max_seq_length`
Action Required
Users employing the no-packing fine-tuning option must now actively manage the `max_seq_length` parameter to control costs, as the token calculation has shifted to `len(dataset) * max_seq_length`.
AI Impact Summary
Pricing for no-packing fine-tuning jobs has changed: training dataset token usage is now calculated as `len(dataset) * max_seq_length`. This reflects the compute cost of packing-free jobs, where each example occupies a full sequence, and gives users direct control over cost through the `max_seq_length` parameter. The change applies to all new and existing fine-tuning jobs that use the no-packing option.
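The new billing formula can be sketched as a simple estimator. This is a minimal illustration of the arithmetic only; the function name and the example figures are hypothetical, not part of any official API.

```python
def estimate_billed_tokens(num_examples: int, max_seq_length: int) -> int:
    """Estimate billed training tokens for a no-packing fine-tuning job.

    Under the new pricing, each example is billed at the full sequence
    length, so billed tokens = len(dataset) * max_seq_length.
    """
    return num_examples * max_seq_length


# Hypothetical example: 10,000 training examples at max_seq_length=2048
print(estimate_billed_tokens(10_000, 2048))  # 20480000
```

Lowering `max_seq_length` to just above the longest example in the dataset is the primary lever for reducing cost under this formula.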
Affected Systems
- Date: not specified
- Change type: pricing
- Severity: medium