Things I would use AI for in music production:
1. Generating track layouts (add tracks + empty audio/midi clips throughout)
2. Generating MIDI sequences
3. Generating Serum patches
4. Extracting stems from existing audio
5. Automating common workflows (e.g. sidechain compression)
6. Semantic search of sample library
That being said, I don't think I want a full agentic workflow for vibe-producing. Point solutions seem like a better fit for me, personally.
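To make item 2 concrete, here's a minimal sketch of what programmatic MIDI generation could look like, the kind of thing an AI tool would emit before you munge it in the DAW. Pure-stdlib Python; the scale choice, note numbers, and function name are just illustrative:

```python
import random

# MIDI note numbers for C natural minor, one octave from middle C
SCALE = [60, 62, 63, 65, 67, 68, 70, 72]

def generate_sequence(n_notes=16, seed=42):
    """Random-walk melody over a scale: each step moves at most two
    scale degrees on a 16th-note grid. Returns (note, start, duration)
    tuples in beats, ready to convert to MIDI events."""
    rng = random.Random(seed)
    idx = rng.randrange(len(SCALE))
    events = []
    for step in range(n_notes):
        idx = max(0, min(len(SCALE) - 1, idx + rng.choice([-2, -1, 0, 1, 2])))
        events.append((SCALE[idx], step * 0.25, 0.25))
    return events
```

In practice you'd feed the tuples to something like mido to write an actual .mid file, or pipe them straight into Live via a virtual MIDI port.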
I’ve had some fun building simple instruments in the browser using AI and piping MIDI to Live, then munging from there [0]. The whole principle of fully AI-generated music leaves me cold, but AI as a sort of sidechain to the creative process seems potentially interesting.
[0] https://variousbits.net/2026/02/22/building-generative-music...
> Generating MIDI sequences
Same here! I tried all of those and have 1, 2, and 5 working. So far it doesn't seem like Ableton's stem splitter or semantic search is programmatically accessible, but I didn't try very hard. I do have Serum, so maybe I'll look into its file format; that does seem doable. The MCP already lets the agent make patches for built-in Ableton devices.