Fine-tuning -> Inference Pipeline on Akash Network!
Incredible work by Ishan Dhanani, Anish Maddipoti, and Tom at Agora Labs, who built a fine-tuning and inference pipeline on top of Akash Console (which we at Overclock Labs, creators of Akash Network, open sourced earlier this year). It uses decentralized compute from akash.network (a decentralized EC2 of sorts) and decentralized storage from Storj (Storj DCS buckets, the decentralized equivalent of S3).
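Because Storj DCS exposes an S3-compatible gateway, the storage side looks just like working with an S3 bucket. Here is a minimal sketch of staging a fine-tuning dataset there; the endpoint, bucket name, keys, and paths are illustrative assumptions, not the pipeline's actual configuration:

```python
# Sketch: upload a training dataset to a Storj DCS bucket via the
# S3-compatible gateway. Credentials/bucket/paths are placeholders.
import boto3

s3 = boto3.client(
    "s3",
    endpoint_url="https://gateway.storjshare.io",  # Storj hosted S3-compatible gateway
    aws_access_key_id="YOUR_STORJ_ACCESS_KEY",
    aws_secret_access_key="YOUR_STORJ_SECRET_KEY",
)

# The fine-tuning job running on Akash can later pull this file back down
# with the same S3-style calls.
s3.upload_file("train.jsonl", "finetune-datasets", "llama2/train.jsonl")
```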
Take an open-source model like Llama 2, fine-tune it on your dataset, and deploy it for inference (all powered by decentralized compute) 🚀
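For a flavor of the fine-tuning step, here is a rough sketch using Hugging Face Transformers + PEFT (LoRA); the model name, dataset path, and hyperparameters are illustrative assumptions rather than Agora Labs' exact setup:

```python
# Sketch: LoRA fine-tune of Llama 2 on a JSONL dataset pulled from the
# Storj bucket. Placeholders throughout; not the pipeline's real config.
from datasets import load_dataset
from peft import LoraConfig, get_peft_model
from transformers import (
    AutoModelForCausalLM,
    AutoTokenizer,
    DataCollatorForLanguageModeling,
    Trainer,
    TrainingArguments,
)

base = "meta-llama/Llama-2-7b-hf"  # gated model; requires an accepted license
tokenizer = AutoTokenizer.from_pretrained(base)
tokenizer.pad_token = tokenizer.eos_token
model = AutoModelForCausalLM.from_pretrained(base, device_map="auto")

# LoRA adapters train only a small fraction of the weights, which keeps the
# job cheap enough to run on a single leased GPU.
model = get_peft_model(model, LoraConfig(r=16, lora_alpha=32, task_type="CAUSAL_LM"))

data = load_dataset("json", data_files="train.jsonl")["train"]
data = data.map(lambda x: tokenizer(x["text"], truncation=True, max_length=512))

trainer = Trainer(
    model=model,
    args=TrainingArguments(
        output_dir="llama2-ft",
        per_device_train_batch_size=2,
        num_train_epochs=1,
        logging_steps=10,
    ),
    train_dataset=data,
    data_collator=DataCollatorForLanguageModeling(tokenizer, mlm=False),
)
trainer.train()
model.save_pretrained("llama2-ft/adapter")  # artifact to push back to storage and serve for inference
```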