Godala-moe Stable
Released 2025-02-11

Release Notes for Godala-moe-1102
Godala-moe-1102 is now available! This edge-ready Large Language Model (LLM) is designed to run efficiently on various platforms and is built on the Hugging Face HuggingFaceTB/SmolLM2-1.7B base model.

Key Features:
- Edge-Ready: Optimized for minimal resource requirements.
- Mixture of Experts: Enhanced performance through multiple expert models (see the routing sketch after this list).
- Continued Pretraining and Finetuning: Improved quality and task specialization.
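
The release notes do not describe the routing scheme, so the following is only a minimal sketch of how a top-k-gated mixture-of-experts layer typically combines expert outputs; the class name, expert count, and dimensions are illustrative assumptions, not Godala-moe's actual architecture.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class TopKMoE(nn.Module):
    """Toy mixture-of-experts layer: a learned gate routes each token to its
    top-k experts and mixes their outputs. Illustrative only, not Godala-moe's
    actual design."""

    def __init__(self, dim: int, n_experts: int = 4, k: int = 2):
        super().__init__()
        self.experts = nn.ModuleList([nn.Linear(dim, dim) for _ in range(n_experts)])
        self.gate = nn.Linear(dim, n_experts)
        self.k = k

    def forward(self, x: torch.Tensor) -> torch.Tensor:  # x: (tokens, dim)
        probs = F.softmax(self.gate(x), dim=-1)           # routing probabilities
        topw, topi = probs.topk(self.k, dim=-1)           # keep the k best experts
        topw = topw / topw.sum(dim=-1, keepdim=True)      # renormalize kept weights
        out = torch.zeros_like(x)
        for slot in range(self.k):
            for e, expert in enumerate(self.experts):
                mask = topi[:, slot] == e                 # tokens routed to expert e
                if mask.any():
                    w = topw[mask, slot].unsqueeze(-1)    # per-token mixing weight
                    out[mask] += w * expert(x[mask])
        return out

# Example: route 10 token embeddings through 4 experts, 2 experts per token.
layer = TopKMoE(dim=64)
y = layer(torch.randn(10, 64))                            # y has shape (10, 64)
```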
Current Status:
The model has been trained on 400 million tokens so far, with further training planned. Known limitation: code generation does not yet produce Markdown-formatted output.
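
Since Godala-moe-1102 builds on a standard Hugging Face base model, it should load through the usual transformers API. A minimal sketch, assuming the weights are published to a Hugging Face repo; the repo id below is a placeholder, not a confirmed path:

```python
from transformers import AutoModelForCausalLM, AutoTokenizer

# Placeholder repo id -- the release notes do not say where the weights are
# hosted; substitute the real path (or a local checkpoint directory).
model_id = "godala/Godala-moe-1102"

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id)

prompt = "Write a Python function that reverses a string."
inputs = tokenizer(prompt, return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=128)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```

Note that, per the status above, any generated code will not yet be wrapped in Markdown formatting.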
License:
Godala-moe-1102 is released under the Creative Commons Attribution 4.0 International License (CC BY 4.0). The original base model is licensed under the Apache License 2.0.
Happy coding!