Jack of All Trades (JAT): transformer-based generalist agent with JAT dataset and model
AI Impact Summary
Jack of All Trades (JAT) advances multi-domain capability by training a single Transformer to operate across vision, language, and control tasks, with environments spanning Atari, BabyAI, Meta-World, and MuJoCo. The project releases the JAT dataset and JAT model on the 🤗 Hub, so teams can evaluate or fine-tune a generalist agent without building domain-specific variants. Its architecture choices (interleaved observation/action embeddings, modality-aware loss functions, and GPT-2 tokenization paired with a ViT-style image encoder) highlight integration considerations for existing RL pipelines and cross-domain workloads.
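To make two of those architecture ideas concrete, here is a minimal, hypothetical sketch (not the released JAT code) of interleaving per-timestep observation and action embeddings into a single token stream, and of a modality-aware loss that scores each position with the objective matching its modality; all names and the loss dispatch logic are illustrative assumptions.

```python
def interleave(observations, actions):
    """Build the token stream [o_0, a_0, o_1, a_1, ...] with modality tags."""
    if len(observations) != len(actions):
        raise ValueError("expected one action per observation")
    stream = []
    for obs, act in zip(observations, actions):
        stream.append(("obs", obs))  # continuous observation embedding
        stream.append(("act", act))  # discrete or continuous action token
    return stream


def modality_aware_loss(stream, predictions, regression_loss, classification_loss):
    """Sum per-position losses, dispatching on modality: observations get a
    regression objective, actions a classification objective. The two loss
    functions are supplied by the caller (e.g. MSE and cross-entropy)."""
    total = 0.0
    for (tag, target), pred in zip(stream, predictions):
        if tag == "obs":
            total += regression_loss(pred, target)
        else:
            total += classification_loss(pred, target)
    return total
```

In a real pipeline the tagged values would be embedding vectors and the losses would be computed per head, but the dispatch-by-modality pattern is the same.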
Affected Systems
- Date: not specified
- Change type: capability
- Severity: info