Description

Mixtral 8x7B is a sparse mixture-of-experts (MoE) large language model from Mistral AI. Each transformer layer contains 8 expert feed-forward networks, and a learned router selects 2 of them per token, so only about 13B of the model's roughly 47B parameters are active for any given token. This lets the model approach the quality of much larger dense models while keeping inference cost close to that of a 13B model.
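The top-2 routing idea can be sketched in a few lines. This is an illustrative toy, not Mixtral's actual implementation: the function name and scalar "expert outputs" are hypothetical stand-ins for real per-expert feed-forward networks.

```python
import math

def top2_route(gate_logits, expert_outputs):
    """Illustrative top-2 mixture-of-experts routing for one token.

    gate_logits: one router score per expert.
    expert_outputs: what each expert would produce for this token
    (scalars here; vectors in a real model).
    Only the two highest-scoring experts contribute, weighted by a
    softmax over their two logits.
    """
    # Indices of the two largest gate logits.
    top2 = sorted(range(len(gate_logits)),
                  key=lambda i: gate_logits[i], reverse=True)[:2]
    # Softmax over just the two selected logits.
    exps = [math.exp(gate_logits[i]) for i in top2]
    total = sum(exps)
    weights = [e / total for e in exps]
    # Weighted combination of the two selected experts' outputs.
    return sum(w * expert_outputs[i] for w, i in zip(weights, top2))
```

Because the router picks only 2 of the 8 experts, the other 6 expert networks are never evaluated for that token, which is where the compute savings come from.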