I've been playing around with AI music generation, and it's genuinely interesting. In a roundabout way, it could actually make music "more human" again, depending on how it's used.
If you look back at the last couple of decades, so many producers default to MIDI, which makes music heavily quantized and linear: every drum hit and note lands exactly on the beat, snapped to a literal digital "grid". It's like making music in a spreadsheet. The most extreme example is probably EDM, which can feel super computery and inorganic.
In contrast, I can get an AI to make a drum beat that feels more like J Dilla: rough around the edges, full of little 'mistakes', because the model has picked up on the imperfections a human player delivers. Sometimes those imperfections are exactly what make something unique. It's sensitive enough to capture the human subtleties that make music great.
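For what it's worth, the "grid vs. human feel" contrast is easy to sketch in code: a basic humanizer just nudges perfectly quantized hits off the grid with a little swing and random timing jitter. This is a minimal illustration, not how any particular AI model works; the function name, jitter amounts, and the 120 BPM 16th-note grid are all made up for the example.

```python
import random

def humanize(onsets_ms, timing_jitter_ms=12.0, swing=0.06, grid_ms=125.0):
    """Nudge quantized note onsets (in milliseconds) off the grid.

    timing_jitter_ms: max random push/pull per hit (illustrative value)
    swing: fraction of the grid step to delay every off-beat hit
    grid_ms: grid step size (125 ms = 16th notes at 120 BPM)
    """
    humanized = []
    for t in onsets_ms:
        step = round(t / grid_ms)
        # Delay odd (off-beat) steps slightly for a crude swing feel
        swing_offset = swing * grid_ms if step % 2 else 0.0
        # Small random jitter so no two hits land identically
        jitter = random.uniform(-timing_jitter_ms, timing_jitter_ms)
        humanized.append(t + swing_offset + jitter)
    return humanized

# One bar of perfectly quantized 16th-note hi-hats at 120 BPM
grid = [i * 125.0 for i in range(16)]
loose = humanize(grid)
```

The point of the sketch is just that "human feel" is measurable: it lives in those few milliseconds of deviation from the grid, which rigid MIDI sequencing erases and which a model trained on real performances can learn to reproduce.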
Just a thought, curious to see where it goes from here.