feat(mixtral): initial implementation of Mixtral MoE model, configs, and tests

- Add Mixtral architecture implementation with MoE support (llm/src/llm/models/mixtral/mixtral.py)
- Introduce a generic Mixture-of-Experts (MoE) block (llm/src/llm/core/moe.py); see the routing sketch after this list
- Create dedicated configuration files for Mixtral training and generation experiments
- Register and test Mixtral support in experiment runner (run_llm_experiment.py)
- Add unit tests for Mixtral API including forward, caching, and generation modes
- Include Jupyter notebook mixstral.ipynb for architectural exploration and research
- Ensure correct handling of torch bool masks in sampling (top-k, top-p) during generation
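
The generic MoE block referenced above routes each token to a small subset of expert MLPs selected by a learned gate. The sketch below illustrates a top-k-gated block of that kind; the class name, constructor arguments, and dense routing loop are illustrative assumptions, not the actual contents of llm/src/llm/core/moe.py:

    import torch
    import torch.nn as nn
    import torch.nn.functional as F

    class MoE(nn.Module):
        """Minimal sketch of a top-k gated Mixture-of-Experts block (hypothetical API)."""

        def __init__(self, dim, hidden_dim, num_experts=8, top_k=2):
            super().__init__()
            self.top_k = top_k
            self.gate = nn.Linear(dim, num_experts, bias=False)   # router producing expert scores
            self.experts = nn.ModuleList(
                nn.Sequential(nn.Linear(dim, hidden_dim), nn.SiLU(), nn.Linear(hidden_dim, dim))
                for _ in range(num_experts)
            )

        def forward(self, x):                       # x: (batch, seq, dim)
            scores = self.gate(x)                   # (batch, seq, num_experts)
            weights, idx = scores.topk(self.top_k, dim=-1)
            weights = F.softmax(weights, dim=-1)    # normalize over the selected experts
            out = torch.zeros_like(x)
            # Dense reference loop: for each routing slot, send the matching tokens
            # through their chosen expert and accumulate the weighted outputs.
            for k in range(self.top_k):
                for e, expert in enumerate(self.experts):
                    mask = idx[..., k] == e         # bool mask of tokens routed to expert e
                    if mask.any():
                        out[mask] += weights[..., k][mask].unsqueeze(-1) * expert(x[mask])
            return out

A production implementation would normally dispatch tokens to experts in batched form rather than looping over experts, but the loop keeps the routing arithmetic explicit.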
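
The last bullet concerns boolean masking during sampling. The helper below shows the common pattern of building torch.bool masks for top-k and top-p (nucleus) filtering and applying them with masked_fill; the function name and default values are assumptions, and the project's own sampling code may be structured differently:

    import torch

    def filter_logits(logits, top_k=50, top_p=0.9):
        """Hypothetical sketch of top-k / top-p filtering using torch.bool masks."""
        # Top-k: mask out everything strictly below the k-th largest logit.
        if top_k > 0:
            kth = torch.topk(logits, top_k, dim=-1).values[..., -1, None]
            remove = logits < kth                              # torch.bool mask
            logits = logits.masked_fill(remove, float("-inf"))
        # Top-p: mask tokens whose cumulative probability exceeds p.
        if top_p < 1.0:
            sorted_logits, sorted_idx = torch.sort(logits, descending=True, dim=-1)
            cumprobs = torch.softmax(sorted_logits, dim=-1).cumsum(dim=-1)
            sorted_remove = cumprobs > top_p                   # torch.bool mask
            sorted_remove[..., 1:] = sorted_remove[..., :-1].clone()  # keep the first token past the threshold
            sorted_remove[..., 0] = False
            remove = sorted_remove.scatter(-1, sorted_idx, sorted_remove)  # back to vocab order
            logits = logits.masked_fill(remove, float("-inf"))
        return logits

    # Example: probs = torch.softmax(filter_logits(raw_logits), dim=-1)
    #          next_token = torch.multinomial(probs, num_samples=1)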

BREAKING CHANGE: Modifies the experiment runner's model-loading logic to register Mixtral; also adds new model code and test coverage.
Author: Sergey Penkovsky
Date:   2025-10-20 08:12:11 +03:00
Parent: 1aba02cab9
Commit: b1737bbce2
8 changed files with 2007 additions and 0 deletions

run_llm_experiment.py

@@ -45,6 +45,9 @@ def load_model_class(model_name):
     elif model_name.lower() == 'mistral':
         from llm.models.mistral import Mistral
         return Mistral
+    elif model_name.lower() == 'mixtral':
+        from llm.models.mixtral import Mixtral
+        return Mixtral
     else:
         raise ValueError(f"Модель '{model_name}' не поддерживается.")
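
With the branch above in place, the runner can resolve the model class by name. A hypothetical call follows; the constructor arguments are illustrative, since the actual config handling in run_llm_experiment.py is not shown in this hunk:

    Mixtral = load_model_class('mixtral')   # resolves via the new elif branch
    model = Mixtral(config)                 # config drawn from the dedicated Mixtral config files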