Like the LLM machines, they are stacks of transformer blocks, each of which pairs an attention layer with a feed-forward network.
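
Roughly, one such block looks like this in PyTorch. This is a minimal sketch only: real models also add causal masking, dropout, positional encodings, and a tokeniser for whatever the input modality is, and the sizes here are illustrative.

```python
import torch
import torch.nn as nn

class TransformerBlock(nn.Module):
    """One transformer block: self-attention followed by a feed-forward network."""
    def __init__(self, d_model=512, n_heads=8):
        super().__init__()
        self.attn = nn.MultiheadAttention(d_model, n_heads, batch_first=True)
        self.ff = nn.Sequential(
            nn.Linear(d_model, 4 * d_model),
            nn.GELU(),
            nn.Linear(4 * d_model, d_model),
        )
        self.norm1 = nn.LayerNorm(d_model)
        self.norm2 = nn.LayerNorm(d_model)

    def forward(self, x):
        # Self-attention sub-layer with a residual connection
        h = self.norm1(x)
        a, _ = self.attn(h, h, h)
        x = x + a
        # Feed-forward sub-layer with a residual connection
        x = x + self.ff(self.norm2(x))
        return x

# A GPT-style model is just a stack of these blocks; the input tokens can be
# text, audio codes, MIDI events, or anything else that can be tokenised.
model = nn.Sequential(*[TransformerBlock() for _ in range(6)])
tokens = torch.randn(1, 16, 512)   # (batch, sequence length, embedding dim)
out = model(tokens)
```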

Unlike LLMs, the music machines are trained on music, not language.

The phrase that covers all of them is Generative Pre-trained Transformer, or GPT, even though OpenAI's own "GPT" is just an LLM, and there are many other types of pre-trained transformers not trained on language at all.

Computer jargon is always stupid.

Discussion

Yeah, I'd thought that since music is essentially a language of combined sine waves, it would fall under the banner of an LLM, but I'm too far removed from the sources on such things. Appreciated the explanation!