Hugging Face: Mistral AI's Mixtral-8x7B Instruct MoE model, with 32k context, lands on the Hub; Transformers and Inference Endpoints integration | SignalBreak
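
Since the headline highlights the Transformers integration, here is a minimal sketch of loading the model through that library. The Hub id `mistralai/Mixtral-8x7B-Instruct-v0.1` is the published checkpoint name; the dtype and device settings are illustrative assumptions, and running this at full precision requires substantial GPU memory.

```python
# Minimal sketch: loading Mixtral-8x7B Instruct via Transformers.
# device_map="auto" assumes the `accelerate` package is installed.
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "mistralai/Mixtral-8x7B-Instruct-v0.1"

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype="auto",   # use the dtype stored in the checkpoint
    device_map="auto",    # shard the experts across available GPUs
)

# The Instruct variant expects its chat template, not raw prompts.
messages = [{"role": "user", "content": "Explain mixture-of-experts briefly."}]
inputs = tokenizer.apply_chat_template(messages, return_tensors="pt").to(model.device)

outputs = model.generate(inputs, max_new_tokens=128)
print(tokenizer.decode(outputs[0][inputs.shape[-1]:], skip_special_tokens=True))
```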