Falcon is a 40B-parameter autoregressive decoder-only model trained on 1 trillion tokens. Training took two months on 384 GPUs on AWS.

https://falconllm.tii.ae

It currently ranks #1 on the Hugging Face Open LLM Leaderboard, outperforming competitors such as Meta's LLaMA and Stability AI's StableLM.
