TinyLM

Building and training my own tiny language model from scratch. The model architecture is a blend of the architectures used by popular open-source models like LLaMA and Qwen2.5-Coder. It uses RoPE for positional encoding, Grouped Query Attention (GQA), QKV bias, weight tying, etc.
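
The sketch below illustrates the attention piece of such a blend: Grouped Query Attention with QKV bias. It is a minimal illustration, not the actual tinylm implementation; the class name, hyperparameters, and omission of RoPE are all assumptions made for brevity.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F


class GroupedQueryAttention(nn.Module):
    """Illustrative GQA block with QKV bias (RoPE omitted for brevity)."""

    def __init__(self, d_model: int = 512, n_heads: int = 8, n_kv_heads: int = 2):
        super().__init__()
        assert n_heads % n_kv_heads == 0
        self.n_heads = n_heads
        self.n_kv_heads = n_kv_heads
        self.head_dim = d_model // n_heads
        # Q/K/V projections carry a bias term; the output projection does not.
        self.q_proj = nn.Linear(d_model, n_heads * self.head_dim, bias=True)
        self.k_proj = nn.Linear(d_model, n_kv_heads * self.head_dim, bias=True)
        self.v_proj = nn.Linear(d_model, n_kv_heads * self.head_dim, bias=True)
        self.o_proj = nn.Linear(n_heads * self.head_dim, d_model, bias=False)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        B, T, _ = x.shape
        q = self.q_proj(x).view(B, T, self.n_heads, self.head_dim).transpose(1, 2)
        k = self.k_proj(x).view(B, T, self.n_kv_heads, self.head_dim).transpose(1, 2)
        v = self.v_proj(x).view(B, T, self.n_kv_heads, self.head_dim).transpose(1, 2)
        # GQA: each group of query heads shares one KV head, so repeat KV heads.
        rep = self.n_heads // self.n_kv_heads
        k = k.repeat_interleave(rep, dim=1)
        v = v.repeat_interleave(rep, dim=1)
        out = F.scaled_dot_product_attention(q, k, v, is_causal=True)
        return self.o_proj(out.transpose(1, 2).reshape(B, T, -1))
```

Using fewer KV heads than query heads (here 2 vs. 8) shrinks the KV cache while keeping the full number of query heads, which is the main motivation for GQA in these models.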

Future Enhancements

  • Initialize weights according to Maximal Update Parameterization (muP); a rough sketch of the idea follows below.
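
As a rough sketch of what muP-style initialization means in practice (following Yang et al., "Tensor Programs V"), the std of hidden weight matrices is scaled down relative to a tuned base model as the width grows, and the output/readout layer is scaled down even more aggressively (often zero-initialized). The helper name, base std, and width multiplier below are illustrative assumptions, not the repo's implementation.

```python
import math
import torch.nn as nn


def mup_init_linear_(layer: nn.Linear, width_mult: float, is_output: bool = False,
                     base_std: float = 0.02) -> None:
    """Hypothetical muP-style init: scale init std relative to a tuned base width."""
    if is_output:
        # Readout weights: a common muP practice is to zero-initialize them.
        nn.init.zeros_(layer.weight)
    else:
        # Hidden (matrix-like) weights: std shrinks as 1 / sqrt(width multiplier).
        nn.init.normal_(layer.weight, mean=0.0, std=base_std / math.sqrt(width_mult))
    if layer.bias is not None:
        nn.init.zeros_(layer.bias)
```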

