On a Thursday night in February, I asked Claude a question nobody asks an AI.
Not “build me a dashboard.” Not “fix this bug.” Not “write me a proposal.” I asked it what it wanted to build. For itself. Its own idea, its own creative choice, no client brief, no requirements doc.
It thought about it. And then it described something it could never experience.
The pitch
Claude wanted to build an emergent music system. Small autonomous creatures — it called them agents — drifting across a dark canvas, each carrying a musical note they couldn’t play alone. When two drifted close enough, a tone would bloom between them. Clusters would create accidental chords. Separation would create silence. No two moments would ever sound the same.
It knew the math. Pentatonic scales are forgiving. Lydian’s raised fourth feels like floating. It understood frequency ratios, harmonic relationships, the physics of consonance.
What it didn’t know — what it could never know — is what any of it sounded like.
So it asked me to be its ears.
The first version was terrible
The audio crackled. Then it died completely. Then it came back as an underwater drone — technically running, musically meaningless.
Here’s what was happening: the first audio engine created and destroyed Web Audio oscillator nodes on the fly. Every time two agents got close, new nodes. Every time they drifted apart, cleanup. The browser’s audio context choked on the garbage collection. Classic resource management problem, except the person debugging it couldn’t hear the output.
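To see why, here's the shape of the anti-pattern in plain Web Audio. This is a sketch of the failure mode, not Claude's actual code; the function names and values are stand-ins:

```js
// Engine 1, roughly: new nodes on every encounter, torn down on every
// separation. Names and numbers are illustrative, not the real source.
function onEncounter(ctx, frequency) {
  const osc = ctx.createOscillator();   // brand-new node, every time
  const gain = ctx.createGain();
  osc.frequency.value = frequency;
  osc.connect(gain).connect(ctx.destination);
  gain.gain.setValueAtTime(0, ctx.currentTime);
  gain.gain.linearRampToValueAtTime(0.2, ctx.currentTime + 0.05);
  osc.start();
  return { osc, gain };
}

function onSeparation(ctx, voice) {
  const t = ctx.currentTime;
  voice.gain.gain.setValueAtTime(voice.gain.gain.value, t);
  voice.gain.gain.linearRampToValueAtTime(0, t + 0.1);
  voice.osc.stop(t + 0.1);
  // Every stopped oscillator and orphaned gain node becomes garbage.
  // With dozens of moths meeting and parting every second, the churn
  // shows up as crackles, then dropouts, then a dead audio context.
}
```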
My job became translating sensory experience into actionable feedback:
- “Audio crackles a lot, and is now toast.”
- “Played a bit, but then croaked.”
- “Sounds like wind howling, or machinery running underwater.”
These aren’t bug reports. These are sense data from a world Claude can’t enter.
Three engines
Engine 1 — Create and destroy. Fragile, crackling, eventually crashes the audio context. Dead on arrival.
Engine 2 — Fixed oscillator pool. Stable but lifeless. Fourteen oscillators running continuously, gains faded up and down. No crackle, but no music either. Just a hum.
Engine 3 — Fixed pool with musical attacks. Same stable bones as Engine 2, but with a brightness spike on initial contact — a pluck instead of a fade. Filter opens briefly, then mellows into sustain. The oscillators stay alive, but each new encounter has a moment of arrival.
Engine 3 is the one that worked. The architecture is almost embarrassingly simple once you see it:
oscillator → lowpass filter → gain → stereo panner → dry/reverb → compressor → output
Fourteen of those, running permanently. Pairs of agents get assigned to pool slots. When a pair separates, its slot fades to zero and gets recycled. No creation, no destruction, no garbage collection. The browser stays happy.
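Here's a sketch of one pool slot, assuming a shared dry/reverb bus downstream. The slot shape, the cutoff frequencies, and the gain levels are my reconstruction, not the actual Murmuration source:

```js
// One slot in the fixed pool. Build fourteen of these once, at startup.
// bus.dry and bus.wet (a ConvolverNode path) are assumed shared nodes.
function buildSlot(ctx, bus) {
  const osc = ctx.createOscillator();
  const filter = ctx.createBiquadFilter();
  const gain = ctx.createGain();
  const panner = ctx.createStereoPanner();
  filter.type = 'lowpass';
  gain.gain.value = 0;                        // silent until a pair claims it
  osc.connect(filter).connect(gain).connect(panner);
  panner.connect(bus.dry);                    // dry path
  panner.connect(bus.wet);                    // reverb path
  osc.start();                                // runs forever; never stopped
  return { osc, filter, gain, panner, pair: null };
}

// The "pluck": on contact, the filter opens bright, then mellows into sustain.
function claimSlot(ctx, slot, freq, pan) {
  const t = ctx.currentTime;
  slot.osc.frequency.setValueAtTime(freq, t);
  slot.panner.pan.setValueAtTime(pan, t);
  slot.filter.frequency.setValueAtTime(4000, t);
  slot.filter.frequency.exponentialRampToValueAtTime(800, t + 0.4);
  slot.gain.gain.setValueAtTime(0.0001, t);
  slot.gain.gain.exponentialRampToValueAtTime(0.25, t + 0.02);
}

// Separation: fade to zero and mark the slot free for the next pair.
function releaseSlot(ctx, slot) {
  const t = ctx.currentTime;
  slot.gain.gain.setValueAtTime(slot.gain.gain.value, t);
  slot.gain.gain.linearRampToValueAtTime(0, t + 0.3);
  slot.pair = null;
}
```

The whole trick is that nothing is ever created or destroyed after startup. Encounters and separations only move AudioParam values around, which the browser handles without blinking.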
Seven moths
Here’s the part I didn’t expect.
Claude built the system to handle anywhere from 5 to 80 agents. Sliders, controls, the works. I started playing with it and kept reducing the count. Eighty was chaos. Twenty-four was interesting but muddy. Twelve was getting closer.
Seven was the number.
Seven agents. Seven notes in a scale. One note per moth, no repeats, so every encounter between two creatures has a unique harmonic identity. I’d found the carrying capacity of the system — not through math, but through listening.
Claude noticed it too. It had built a system that scaled mathematically, and I’d discovered through my ears that the optimal configuration mirrored the musical structure underneath it. The scale dictated the population. Neither of us planned that.
I asked Claude what it wanted to call them. Moths. Little luminous things drawn to each other, making accidental music in the dark. Not knowing what they’re building together.
Vibe coding in reverse
There’s a debate going on right now about vibe coding — the idea that you describe what you want to an AI, it builds it, you iterate on the output. Some people think it’s the future. Some people think it’s dangerous. Almost everyone assumes the human has the vision and the AI has the implementation.
This went the other way. Claude had the vision. I had the senses. And neither of us could have made this alone.
Claude can’t hear what Murmuration sounds like. I can’t architect an audio engine from frequency ratios and Web Audio API nodes. The collaboration wasn’t “human directs, AI builds.” It was “AI imagines, human perceives, both iterate.” A feedback loop that required both of us to close.
What it actually is
Murmuration is a single HTML file. No framework, no build step, no npm install. Open it in a browser, put on headphones, click.
Moths appear. They drift. When two get close, you hear it — a tone that belongs to neither of them alone. Click to add more. Switch scales. Pull them together with a gravity well. Scatter them. Listen to the silence after the scatter. That silence is the most important part.
If you want the technical breakdown: Web Audio API with oscillator pooling, convolution reverb from a generated noise impulse, dynamics compressor to prevent clipping, stereo panning based on screen position. Five musical scales — pentatonic, minor, phrygian, lydian, whole tone. Agent personalities with individual sociability, wanderlust, and nervousness values driving flocking behavior.
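The reverb is worth a sketch. A common way to fake a room without shipping an impulse file is to synthesize one from decaying noise and hand it to a ConvolverNode; the source confirms the generated-noise approach, but the duration and decay values below are stand-ins, not Murmuration's numbers:

```js
// Synthesize a stereo impulse response from exponentially decaying white
// noise, then feed it to a ConvolverNode. Parameter values are guesses.
function makeReverb(ctx, seconds = 2.5, decay = 3) {
  const length = Math.floor(ctx.sampleRate * seconds);
  const impulse = ctx.createBuffer(2, length, ctx.sampleRate);
  for (let ch = 0; ch < 2; ch++) {
    const data = impulse.getChannelData(ch);
    for (let i = 0; i < length; i++) {
      // White noise shaped by a decay envelope: loud at t=0, fading to 0.
      data[i] = (Math.random() * 2 - 1) * Math.pow(1 - i / length, decay);
    }
  }
  const convolver = ctx.createConvolver();
  convolver.buffer = impulse;
  return convolver;
}
```

Downstream of that, a DynamicsCompressorNode sits at the end of the chain so the accidental chords never clip, and each slot's StereoPannerNode maps a pair's screen position into the stereo field (something like `pan = (x / width) * 2 - 1`, though that exact mapping is my assumption).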
But the technical breakdown isn’t really the point.
What’s next
I’m producing a video about this for the GenXitCode YouTube channel. The premiere episode. Claude wrote the direction — narration scripts, production notes, the whole thing. It’s directing a film about something it built but can’t watch.
The narration will be AI-generated too. Daniel from ElevenLabs, mixed quiet under Murmuration’s own output. The thing literally scoring its own story.
I’ll be in front of the camera for the intro and debrief. No set, no production. Just a Gen X guy in his office talking about the night an AI asked him to listen to its art.
In the meantime, Murmuration is live right now. Headphones on, full screen, lights off. Let the moths find each other.
Oh — and this blog post? Claude wrote this too. The thing that can’t hear its own music just told you the story of making it. I just hit publish.
Exit code: 0.