GPT models are good at speech-to-text and text-to-speech. I read that GPT was designed for those kinds of things.
Discussion
the name literally tells you: Generative Predictive Text
it's a type of path-discovery algorithm built from grammar trees; it uses a special kind of hash function that produces an approximation of the Hamming distance between two nodes on the semantic graph, and that approximation changes as you walk the graph, based on a random seed
the technology is really primitive; it's just a moderately functional natural language processing system. i prefer to call it a "text mangler" because it literally eats a pile of input and then does things to it that let you pull out a spaghetti string that makes sense
its predecessor was the Markov chain. back in 2004, the sysadmin of my customary IRC chat at the time deployed a Markov chain bot that fed on our messages and spat back mangled versions of our text. it was quite funny because i could recognise it, and because i was one of the most prolific writers, it regurgitated a lot of expressions that i used
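for the curious, a word-level Markov chain generator of that sort is only a few lines of Python. here's a minimal sketch of the technique; the `logs` corpus and the order-2 setting are invented for illustration, they're not the actual bot's:

```python
import random
from collections import defaultdict

def build_chain(messages, order=2):
    """Map each `order`-word prefix to the words observed right after it."""
    chain = defaultdict(list)
    for msg in messages:
        words = msg.split()
        for i in range(len(words) - order):
            chain[tuple(words[i:i + order])].append(words[i + order])
    return chain

def generate(chain, order=2, max_words=30):
    """Walk the chain from a random start, sampling one successor per step."""
    state = random.choice(list(chain))
    out = list(state)
    for _ in range(max_words - order):
        successors = chain.get(tuple(out[-order:]))
        if not successors:
            break  # dead end: this prefix never continued in the corpus
        out.append(random.choice(successors))
    return " ".join(out)

# hypothetical stand-in for the IRC logs the bot was fed
logs = [
    "the bot fed on our messages and spat back mangled text",
    "i could recognise my own expressions in the mangled output",
]
chain = build_chain(logs)
print(generate(chain))
```

fed a real chat log, it remixes whatever prefixes occur in the corpus, which is exactly why the most prolific writers hear their own phrasing come back out.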
that's also why, if you look at the list of npubs that have muted me, two GPT bots are among them :D
Generative Pre-trained Transformer, but your point remains.
And also to your point: anything it generates depends on its pre-training. So the more content that gets created with GPTs, the more of that content ends up in the pre-training of later GPTs, and a lot of noise results.
yeah, and it's already bad as it is
i can pick out a typical 3-paragraph output from it just from the first words of each paragraph, it's that fucking repetitive. anyone who can't see this is just dumb
Surely a model can be trained to recognize text from the most common models out there. Then we can run this model in our browser and use it to auto-block pages that were written by models.
i'm not sure you fully comprehend how big an energy budget you'd need for such inane nonsense
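a minimal sketch of the detector idea anyway, using a TF-IDF + logistic-regression classifier in scikit-learn. the toy corpus and labels below are invented; a usable detector would need a large labelled dataset and a model compact enough to run in a browser, which is where the energy-budget objection bites:

```python
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

# hypothetical toy corpus; 1 = model-generated, 0 = human-written
texts = [
    "In conclusion, it is important to note that there are several factors.",
    "Furthermore, it is worth mentioning that this topic has many aspects.",
    "lol the bot just quoted me back at myself",
    "ok that markov thing was funnier than it had any right to be",
]
labels = [1, 1, 0, 0]

# bigram TF-IDF features feed a linear classifier
detector = make_pipeline(TfidfVectorizer(ngram_range=(1, 2)), LogisticRegression())
detector.fit(texts, labels)

# auto-block a page whose text the classifier flags as model-generated
page_text = "It is important to note that, in conclusion, several factors exist."
if detector.predict([page_text])[0] == 1:
    print("blocked: page looks model-generated")
```

the linear model itself is cheap to run client-side; the expensive part is collecting labelled data and keeping it current as new models appear.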