🚨 Today in Open Source 🚨

Natural language to SQL 🀩

A 15B-parameter model that outperforms GPT-3.5 out of the box, and outperforms GPT-4 once tuned to a given schema.

As I said earlier this year, FOSS wins the AI race hands down because of specialisation. AGI will be realised as a sprawling forest of lean specialist models, not as a gigantic, monolithic general model.

https://github.com/defog-ai/sqlcoder


Discussion

All that FOSS AI really needs is a standard tool for selecting which model should handle a given input.

The tech stack is…

> UI

> classifier / model selector

> specialist model

All 3 layers should consist of fully interchangeable and parallelisable modules.
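The three-layer stack described above can be sketched in a few lines. This is a minimal illustration, not any existing project's implementation: the names (`route_prompt`, `classify`, `SPECIALISTS`) are hypothetical, and the keyword classifier stands in for what would realistically be a small classification model sitting between the UI and the specialists.

```python
from typing import Callable, Dict

# Layer 3: specialist models, stubbed here as plain functions so the
# sketch is runnable. In practice each would wrap a model like sqlcoder.
def sql_specialist(prompt: str) -> str:
    return f"[sqlcoder] SELECT ... -- for: {prompt}"

def general_fallback(prompt: str) -> str:
    return f"[general] {prompt}"

# Registry of interchangeable specialist modules — swapping one out
# doesn't touch the other layers.
SPECIALISTS: Dict[str, Callable[[str], str]] = {
    "sql": sql_specialist,
    "general": general_fallback,
}

# Layer 2: the classifier / model selector. Keyword matching is a
# placeholder for a lightweight learned classifier.
def classify(prompt: str) -> str:
    sql_keywords = ("query", "table", "select", "sql", "database")
    if any(k in prompt.lower() for k in sql_keywords):
        return "sql"
    return "general"

# Layer 1: the UI entry point, which only ever talks to the selector.
def route_prompt(prompt: str) -> str:
    return SPECIALISTS[classify(prompt)](prompt)
```

Because each layer only depends on the interface of the one below it, specialists can be added, removed, or run in parallel without changes to the UI or the selector's dispatch logic.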

It outperforms GPT-4?!

Only on this very narrow task.

Ah I missed that part.

Is your thesis we'll have a range of LLMs that perform specific tasks to a high degree rather than a multi-purpose one like GPT?

Yes, a forest of specialists is vastly more capable than a monolithic general model.

It means you can go much further with less data, and also produce a much lighter model that runs faster.

At my company we copied a cephalopod architecture back in 2019 and it was a lot better.

In my mind an octopus forest is the best chance at making something that looks and feels like AGI.

But personally I don’t think general intelligence is a thing, I think humans approximate to a stack of ~5,000 skillsets.

Yes, that makes sense and we're already beginning to see niche LLMs for things like health. Large orgs will definitely want their own version trained on their IP and where the data doesn't leave their servers.