Today's Zinger: Random thoughts on AI and algorithms.

Have you ever noticed your content recommendation algorithms getting “stuck”?

Music is the clearest example for me. When I listen to a certain genre or artist, Spotify suggests more music from that artist or from similar ones. If I'm just listening to hit singles, it keeps suggesting more hit singles.

This can be a good thing, helping me discover more songs that sound like the ones I like. But it can also be a bad thing: what if I'm not in the mood for that genre right now? It becomes hard to discover something completely new.

The suggestion algorithms seem to ingrain your habits and preferences more deeply, corralling your content exposure. What is billed as a way to expose you to new and better content seems, over time, to do the opposite. Perhaps this is by design.
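Here's that feedback loop in miniature. This is just a toy sketch in Python, nothing like Spotify's real recommender: plays start out spread evenly across a few made-up genres, each recommendation is drawn in proportion to past plays, and every play feeds back into the weights.

```python
# Toy sketch of a recommendation feedback loop -- not how any real
# recommender works, just the rich-get-richer dynamic described above.
import random
from collections import Counter

genres = ["rock", "jazz", "hip-hop", "classical", "electronic"]
plays = Counter({g: 1 for g in genres})   # start with one play of everything

def recommend(plays):
    # pick the next song's genre in proportion to what I've already played
    total = sum(plays.values())
    weights = [plays[g] / total for g in genres]
    return random.choices(genres, weights=weights)[0]

random.seed(42)
for _ in range(1000):
    plays[recommend(plays)] += 1   # every play reinforces tomorrow's weights

print(plays.most_common())
```

Run it a few times and one genre usually snowballs and dominates the play counts, even though nothing about my “taste” ever changed. The loop just reinforces whatever got an early lead.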

A similar thought occurred to me about artificial intelligence. We know that these large language models are only as good as the data they are trained on. In theory, they synthesize the very best content that humans have written and learn how to produce content to the same standard, or better, on demand.

So what happens when they run out of human-generated literature to analyze? Does the rapid pace of improvement slow down?

And more importantly, what happens when people stop generating their own written content and simply rely on ChatGPT to write it for them?

Do we run into the same problem of written content getting “stuck” because people no longer develop their own writing skills and AI has no new human content to learn from?
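To make that worry concrete, here's another toy sketch. It has nothing to do with how real language models are actually trained; it's just a normal distribution repeatedly re-fit to its own samples, with no new human data coming in.

```python
# Toy sketch of "models training on model output" -- not a real LLM, just a
# distribution fit to its own samples, generation after generation.
import random
import statistics

random.seed(1)
data = [random.gauss(0, 1) for _ in range(15)]   # stand-in for human-written text

for gen in range(1, 201):
    mu, sigma = statistics.mean(data), statistics.stdev(data)
    # the next "model" sees only the previous model's output, no new human data
    data = [random.gauss(mu, sigma) for _ in range(15)]
    if gen % 40 == 0:
        print(f"generation {gen:3d}: spread = {sigma:.3f}")
```

Each generation looks plausible on its own, but the spread tends to shrink as the generations go by. That's roughly the “stuck” outcome I'm worried about, in miniature.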

Idk, I’m just rambling here. Let me know if this makes any sense. Unlike all of these Current Thing experts, I’m not an AI expert.

#ai #zinger
