uhh very cool game for cdww

This sketch renders an interactive liquid chrome sphere using WebGL raymarching that responds to mouse movement. It pairs the visuals with a procedurally generated techno beat, built with Tone.js, that reacts to mouse velocity, so visual ripples and audio filter/volume changes stay synchronized with user interaction.

🎓 Concepts You'll Learn

WebGL shaders, Raymarching, Signed distance functions, Normal calculation, Fresnel effect, Audio synthesis, Tone.js sequencing, Mouse tracking, Velocity smoothing, Procedural audio generation
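Two of these concepts, signed distance functions and raymarching, can be sketched in plain JavaScript. This is a hedged, simplified model of what the GLSL fragment shader does per pixel; the names sdSphere and raymarch follow common convention and are not taken from the sketch's actual shader source.

```javascript
// Signed distance from point p to a sphere of radius r centered at the
// origin: negative inside, zero on the surface, positive outside.
function sdSphere(p, r) {
  return Math.hypot(p[0], p[1], p[2]) - r;
}

// March a ray from `origin` along unit vector `dir`, stepping forward by
// the SDF value each iteration; stop when within epsilon of the surface.
function raymarch(origin, dir, radius) {
  let t = 0;
  for (let i = 0; i < 100; i++) {
    const p = [
      origin[0] + dir[0] * t,
      origin[1] + dir[1] * t,
      origin[2] + dir[2] * t,
    ];
    const d = sdSphere(p, radius);
    if (d < 0.001) return t; // hit the surface
    t += d;                  // safe step: the SDF value can't overshoot
    if (t > 100) break;      // ray escaped the scene
  }
  return -1; // miss
}

// Camera at z = -3 looking down +z at a unit sphere: the hit distance
// should be 2 (3 units to the center minus radius 1).
const hit = raymarch([0, 0, -3], [0, 0, 1], 1.0);
console.log(hit.toFixed(3)); // 2.000
```

The key property of an SDF is that its value is a safe step size: advancing the ray by d can never tunnel through the surface, which is why the loop converges.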

🔄 Code Flow

Code flow showing setup, draw, initAudio, windowResized

💡 Click on function names in the diagram to jump to their code

graph TD
  start[Start] --> setup[setup]
  setup --> canvas-creation[WebGL Canvas Setup]
  setup --> button-creation[Audio Trigger Button]
  setup --> shader-compilation[Shader Compilation]
  setup --> draw[draw loop]
  click setup href "#fn-setup"
  click canvas-creation href "#sub-canvas-creation"
  click button-creation href "#sub-button-creation"
  click shader-compilation href "#sub-shader-compilation"
  draw --> velocity-calculation[Instantaneous Velocity]
  draw --> velocity-smoothing[Velocity Smoothing with Lerp]
  draw --> velocity-damping[Velocity Decay]
  draw --> audio-reactivity[Audio Reactivity Block]
  draw --> shader-activation[Shader Activation]
  draw --> uniform-passing[Uniform Variables]
  click draw href "#fn-draw"
  click velocity-calculation href "#sub-velocity-calculation"
  click velocity-smoothing href "#sub-velocity-smoothing"
  click velocity-damping href "#sub-velocity-damping"
  click audio-reactivity href "#sub-audio-reactivity"
  click shader-activation href "#sub-shader-activation"
  click uniform-passing href "#sub-uniform-passing"
  audio-reactivity --> filter-mapping[Velocity to Filter Frequency]
  audio-reactivity --> volume-mapping[Velocity to Volume]
  click filter-mapping href "#sub-filter-mapping"
  click volume-mapping href "#sub-volume-mapping"
  filter-mapping --> tone-start[Audio Context Initialization]
  tone-start --> master-chain[Audio Signal Chain Setup]
  master-chain --> kick-synth[Kick Drum Synthesizer]
  master-chain --> bass-synth[FM Bassline Synthesizer]
  master-chain --> hihat-synth[Hi-Hat Synthesizer]
  click tone-start href "#sub-tone-start"
  click master-chain href "#sub-master-chain"
  click kick-synth href "#sub-kick-synth"
  click bass-synth href "#sub-bass-synth"
  click hihat-synth href "#sub-hihat-synth"
  draw --> windowresized[windowResized]
  windowresized --> draw
  click windowresized href "#fn-windowresized"
  kick-synth --> sequencer-loop[16-Step Sequencer]
  bass-synth --> sequencer-loop
  hihat-synth --> sequencer-loop
  click sequencer-loop href "#sub-sequencer-loop"
  sequencer-loop --> kick-trigger[Kick Trigger Logic]
  sequencer-loop --> hihat-trigger[Hi-Hat Trigger Logic]
  sequencer-loop --> bass-trigger[Bassline Trigger Logic]
  click kick-trigger href "#sub-kick-trigger"
  click hihat-trigger href "#sub-hihat-trigger"
  click bass-trigger href "#sub-bass-trigger"

๐Ÿ“ Code Breakdown

setup()

setup() runs once at the start. Here we initialize the canvas, compile shaders, and create UI elements. WebGL shaders run on the GPU for fast, complex visual effects.

function setup() {
  createCanvas(windowWidth, windowHeight, WEBGL);
  
  // Create UI overlay to trigger audio context
  startBtn = createButton('ENTER EXPERIENCE');
  startBtn.id('start-btn');
  startBtn.mousePressed(initAudio);
  
  liquidShader = createShader(vertShader, fragShader);
  noStroke();
}

🔧 Subcomponents:

function-call WebGL Canvas Setup createCanvas(windowWidth, windowHeight, WEBGL);

Creates a full-window WebGL canvas that enables GPU-accelerated shader rendering

function-call Audio Trigger Button startBtn = createButton('ENTER EXPERIENCE');

Creates a button to start the audio context (required by browsers for sound)

function-call Shader Compilation liquidShader = createShader(vertShader, fragShader);

Compiles the vertex and fragment shaders into a GPU program

Line by Line:

createCanvas(windowWidth, windowHeight, WEBGL);
Creates a full-window WebGL canvas. WEBGL enables GPU shader rendering instead of 2D canvas drawing
startBtn = createButton('ENTER EXPERIENCE');
Creates an interactive button that users click to start the audio (browsers require user interaction to play sound)
startBtn.id('start-btn');
Assigns an HTML ID to the button so CSS styling can target it
startBtn.mousePressed(initAudio);
Connects the button's click event to the initAudio function, triggering audio setup when clicked
liquidShader = createShader(vertShader, fragShader);
Compiles the vertex shader and fragment shader into a shader program stored in liquidShader
noStroke();
Disables stroke outlines on shapes, so only fills are drawn

draw()

draw() runs 60 times per second. It tracks mouse movement, smooths the velocity, maps that velocity to audio parameters, and then renders the shader. The shader runs on the GPU and creates the liquid metal effect.

function draw() {
  // Calculate instantaneous velocity
  let dx = mouseX - pmouseX;
  let dy = mouseY - pmouseY;

  // Smooth the velocity
  smoothVelX = lerp(smoothVelX, dx, 0.1);
  smoothVelY = lerp(smoothVelY, dy, 0.1);

  // Dampen velocity down to zero
  smoothVelX *= 0.96;
  smoothVelY *= 0.96;
  
  // --- AUDIO REACTIVITY ---
  if (audioStarted) {
    // Calculate the magnitude of our velocity
    let velMag = sqrt(smoothVelX * smoothVelX + smoothVelY * smoothVelY);
    
    // Map velocity to filter cutoff (low/muffled when still -> high/bright when moving fast)
    let targetFreq = map(velMag, 0, 40, 250, 8000, true);
    synthFilter.frequency.rampTo(targetFreq, 0.1);
    
    // Map velocity to volume (-20dB when still -> 0dB when moving)
    let targetVol = map(velMag, 0, 30, -20, 0, true);
    masterVolume.volume.rampTo(targetVol, 0.1);
  }

  // Activate shader
  shader(liquidShader);

  // Pass uniforms
  liquidShader.setUniform('u_resolution', [width, height]);
  liquidShader.setUniform('u_time', millis() / 1000.0);
  liquidShader.setUniform('u_mouse', [mouseX, mouseY]);
  liquidShader.setUniform('u_mouse_vel', [smoothVelX, smoothVelY]);

  // Draw fullscreen quad
  rect(-width / 2, -height / 2, width, height);
}
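The smoothing and decay steps above can be simulated without p5.js. The lerp below is a plain-JavaScript stand-in for p5's lerp(a, b, t); the mouse trace (20 px per frame for 10 frames, then still) is an invented example, not taken from the sketch.

```javascript
// Stand-in for p5.js's lerp(a, b, t) = a + (b - a) * t.
function lerp(a, b, t) {
  return a + (b - a) * t;
}

// Simulate 60 frames: the mouse moves 20 px/frame for 10 frames, then stops.
let smoothVelX = 0;
const history = [];
for (let frame = 0; frame < 60; frame++) {
  const dx = frame < 10 ? 20 : 0;          // instantaneous velocity
  smoothVelX = lerp(smoothVelX, dx, 0.1);  // blend 10% toward the new value
  smoothVelX *= 0.96;                      // decay 4% per frame
  history.push(smoothVelX);
}

// While the mouse moves, velocity ramps up smoothly rather than jumping
// straight to 20; after it stops, velocity eases toward zero instead of
// snapping, which is what keeps the ripples from cutting off abruptly.
console.log(history[9].toFixed(2));
console.log(history[59].toFixed(4));
```

Note that the lerp factor and the damping multiplier compound: with dx = 0 the effective per-frame factor is 0.9 × 0.96 = 0.864, so the tail dies out within about a second at 60 fps.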

🔧 Subcomponents:

calculation Instantaneous Velocity let dx = mouseX - pmouseX; let dy = mouseY - pmouseY;

Calculates how many pixels the mouse moved this frame in x and y directions

calculation Velocity Smoothing with Lerp smoothVelX = lerp(smoothVelX, dx, 0.1); smoothVelY = lerp(smoothVelY, dy, 0.1);

Gradually blends toward new velocity values instead of jumping instantly, creating smooth motion

calculation Velocity Decay smoothVelX *= 0.96; smoothVelY *= 0.96;

Reduces velocity each frame so motion gradually stops when the mouse stops moving

conditional Audio Reactivity Block if (audioStarted) { ... }

Only runs audio updates after the user has clicked the start button

calculation Velocity to Filter Frequency let targetFreq = map(velMag, 0, 40, 250, 8000, true);

Converts velocity magnitude (0-40) to filter frequency (250-8000 Hz), making sound brighter when moving fast

calculation Velocity to Volume let targetVol = map(velMag, 0, 30, -20, 0, true);

Converts velocity magnitude (0-30) to volume in decibels (-20dB to 0dB), making sound louder when moving

function-call Shader Activation shader(liquidShader);

Tells p5.js to use the compiled shader for subsequent drawing

function-call Uniform Variables liquidShader.setUniform(...)

Sends data from JavaScript to the GPU shader (resolution, time, mouse position, velocity)

Line by Line:

let dx = mouseX - pmouseX;
Calculates horizontal movement by subtracting the previous frame's X position from the current X position
let dy = mouseY - pmouseY;
Calculates vertical movement by subtracting the previous frame's Y position from the current Y position
smoothVelX = lerp(smoothVelX, dx, 0.1);
Uses lerp (linear interpolation) to gradually blend the smooth velocity toward the new dx value. 0.1 means 10% of the way per frame, creating smooth acceleration
smoothVelY = lerp(smoothVelY, dy, 0.1);
Same as smoothVelX but for vertical velocity
smoothVelX *= 0.96;
Multiplies velocity by 0.96 each frame, gradually reducing it to zero when the mouse stops moving (4% decay per frame)
smoothVelY *= 0.96;
Same decay applied to vertical velocity
if (audioStarted) {
Only runs the audio reactivity code if the user has clicked the start button and audio is initialized
let velMag = sqrt(smoothVelX * smoothVelX + smoothVelY * smoothVelY);
Calculates the total speed (magnitude) of mouse movement using the Pythagorean theorem: √(x² + y²)
let targetFreq = map(velMag, 0, 40, 250, 8000, true);
Maps velocity magnitude from range 0-40 to frequency range 250-8000 Hz. Fast movement = bright sound, slow = muffled. The 'true' parameter constrains values to the output range
synthFilter.frequency.rampTo(targetFreq, 0.1);
Smoothly transitions the filter frequency to the target value over 0.1 seconds, preventing abrupt sound changes
let targetVol = map(velMag, 0, 30, -20, 0, true);
Maps velocity magnitude from 0-30 to volume from -20dB (quiet) to 0dB (normal). Fast movement = louder, still = quieter
masterVolume.volume.rampTo(targetVol, 0.1);
Smoothly transitions volume to the target value over 0.1 seconds
shader(liquidShader);
Activates the compiled shader so the next drawing command uses it instead of normal 2D drawing
liquidShader.setUniform('u_resolution', [width, height]);
Sends the canvas width and height to the shader so it can calculate proper pixel coordinates
liquidShader.setUniform('u_time', millis() / 1000.0);
Sends elapsed time in seconds to the shader for animation. millis() returns milliseconds, divided by 1000 to convert to seconds
liquidShader.setUniform('u_mouse', [mouseX, mouseY]);
Sends current mouse position to the shader (though not used in this shader, it's available for future modifications)
liquidShader.setUniform('u_mouse_vel', [smoothVelX, smoothVelY]);
Sends the smoothed velocity to the shader, which uses it to control ripple amplitude on the sphere
rect(-width / 2, -height / 2, width, height);
Draws a rectangle covering the entire canvas. Because the shader is active, this rectangle is rendered using the shader instead of normal fill color
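The two map() calls in draw() can be reproduced outside p5.js. The function below is a sketch approximating p5's map(value, start1, stop1, start2, stop2, withinBounds); the clamping behavior for the sixth argument is an assumption based on p5's documented behavior.

```javascript
// Approximation of p5.js's map(): linearly rescale `value` from the input
// range to the output range, optionally clamping to the output range.
function map(value, inMin, inMax, outMin, outMax, clamp = false) {
  let out = outMin + ((value - inMin) / (inMax - inMin)) * (outMax - outMin);
  if (clamp) {
    const lo = Math.min(outMin, outMax);
    const hi = Math.max(outMin, outMax);
    out = Math.min(hi, Math.max(lo, out));
  }
  return out;
}

// Still mouse: filter sits at its floor, volume at its floor.
console.log(map(0, 0, 40, 250, 8000, true)); // 250
console.log(map(0, 0, 30, -20, 0, true));    // -20

// Very fast mouse: velocity 60 exceeds both input ranges, so the
// clamping keeps the outputs pinned at their ceilings.
console.log(map(60, 0, 40, 250, 8000, true)); // 8000
console.log(map(60, 0, 30, -20, 0, true));    // 0

// Halfway through the input range: plain linear interpolation.
console.log(map(20, 0, 40, 250, 8000, true)); // 4125
```

Without the clamp flag, a velocity spike of 60 would map to 11875 Hz, well past the intended 8000 Hz ceiling, which is why the sketch passes true.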

initAudio()

initAudio() is called when the user clicks the start button. It sets up a complete audio synthesis system with a kick drum, FM bassline, and hi-hat, all synchronized to a 16-step sequencer at 130 BPM. The audio signal flows through a filter and volume control that are modulated by mouse movement in the draw() function.

async function initAudio() {
  // Required: Start Tone context on user gesture
  await Tone.start();
  
  // Create master effects nodes controlled by the mouse movement
  masterVolume = new Tone.Volume(-20).toDestination();
  synthFilter = new Tone.Filter(250, "lowpass").connect(masterVolume);
  
  // 1. Kick Drum
  const kick = new Tone.MembraneSynth({
    pitchDecay: 0.05,
    octaves: 4,
    oscillator: { type: "sine" },
    envelope: { attack: 0.001, decay: 0.4, sustain: 0.01, release: 1.4 }
  }).connect(synthFilter);
  
  // 2. Gritty FM Bassline
  const bass = new Tone.FMSynth({
    harmonicity: 1,
    modulationIndex: 2,
    oscillator: { type: "square" },
    envelope: { attack: 0.01, decay: 0.2, sustain: 0.1, release: 0.5 }
  }).connect(synthFilter);
  
  // 3. Hi-Hat
  const hihat = new Tone.NoiseSynth({
    noise: { type: "white" },
    envelope: { attack: 0.001, decay: 0.1, sustain: 0, release: 0.1 }
  }).connect(synthFilter);
  
  // Bassline notes pattern
  const bassNotes = ["C2", "C2", "Eb2", "C2", "F2", "C2", "Bb1", "C2"];
  
  // Sequencer loop (16th notes)
  let step = 0;
  Tone.Transport.scheduleRepeat((time) => {
    // 4/4 Kick on the downbeat (0, 4, 8, 12)
    if (step % 4 === 0) {
      kick.triggerAttackRelease("C1", "8n", time);
    }
    
    // Offbeat hi-hat (2, 6, 10, 14)
    if (step % 4 === 2) {
      hihat.triggerAttackRelease("16n", time, 0.3);
    }
    
    // Driving 16th note bassline (skipping the kick beats)
    if (step % 4 !== 0) {
      bass.triggerAttackRelease(bassNotes[step % 8], "16n", time, 0.5);
    }
    
    step = (step + 1) % 16;
  }, "16n");

  // Set BPM and start transport
  Tone.Transport.bpm.value = 130;
  Tone.Transport.start();
  
  // Clean up UI and update state
  audioStarted = true;
  startBtn.remove();
}
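The modulo gating inside the scheduleRepeat callback can be tested without Tone.js. This simulation reproduces only the step arithmetic from the code above, not the timing or the synth calls.

```javascript
// Same 8-note bassline pattern as the sketch.
const bassNotes = ["C2", "C2", "Eb2", "C2", "F2", "C2", "Bb1", "C2"];

// Walk all 16 steps and record which instrument fires on each.
const pattern = [];
for (let step = 0; step < 16; step++) {
  pattern.push({
    step,
    kick: step % 4 === 0,  // downbeats 0, 4, 8, 12
    hihat: step % 4 === 2, // offbeats 2, 6, 10, 14
    bass: step % 4 !== 0 ? bassNotes[step % 8] : null, // all non-kick steps
  });
}

console.log(JSON.stringify(pattern.filter(s => s.kick).map(s => s.step)));  // [0,4,8,12]
console.log(JSON.stringify(pattern.filter(s => s.hihat).map(s => s.step))); // [2,6,10,14]

// In the second half of the bar the bassline wraps around the 8-note
// array: step 9 plays bassNotes[1], step 10 plays bassNotes[2], and so on.
console.log(pattern[9].bass);  // C2
console.log(pattern[10].bass); // Eb2
```

Because the bass array has 8 entries and the sequence has 16 steps, the bassline repeats its melodic phrase twice per bar while the kick/hi-hat grid stays fixed.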

🔧 Subcomponents:

function-call Audio Context Initialization await Tone.start();

Initializes the Tone.js audio context (required by browsers before playing any sound)

calculation Audio Signal Chain Setup masterVolume = new Tone.Volume(-20).toDestination(); synthFilter = new Tone.Filter(250, "lowpass").connect(masterVolume);

Creates the audio output chain: all sounds → filter → volume → speakers

calculation Kick Drum Synthesizer const kick = new Tone.MembraneSynth({...}).connect(synthFilter);

Creates a bass drum sound using a membrane synth with specific envelope and pitch parameters

calculation FM Bassline Synthesizer const bass = new Tone.FMSynth({...}).connect(synthFilter);

Creates a gritty bass sound using FM synthesis (frequency modulation) with square wave oscillator

calculation Hi-Hat Synthesizer const hihat = new Tone.NoiseSynth({...}).connect(synthFilter);

Creates a hi-hat sound using white noise with a short envelope

function-call 16-Step Sequencer Tone.Transport.scheduleRepeat((time) => { ... }, "16n");

Creates a repeating pattern that triggers drum sounds at specific steps in a 16-step sequence

conditional Kick Trigger Logic if (step % 4 === 0) { kick.triggerAttackRelease("C1", "8n", time); }

Plays the kick drum on every 4th step (downbeats: 0, 4, 8, 12)

conditional Hi-Hat Trigger Logic if (step % 4 === 2) { hihat.triggerAttackRelease("16n", time, 0.3); }

Plays the hi-hat on steps 2, 6, 10, 14 (offbeats)

conditional Bassline Trigger Logic if (step % 4 !== 0) { bass.triggerAttackRelease(bassNotes[step % 8], "16n", time, 0.5); }

Plays the bassline on all non-kick steps, cycling through the bassNotes array

Line by Line:

async function initAudio() {
Declares an async function (can use 'await' keyword) that sets up all audio synthesis and sequencing
await Tone.start();
Initializes Tone.js audio context. 'await' pauses execution until the audio system is ready. Required by browsers before playing sound
masterVolume = new Tone.Volume(-20).toDestination();
Creates a volume control node starting at -20dB (quiet), connected to the speakers (.toDestination()). This is the final output stage
synthFilter = new Tone.Filter(250, "lowpass").connect(masterVolume);
Creates a lowpass filter starting at 250Hz, connected to the volume control. All synths will route through this filter
const kick = new Tone.MembraneSynth({...}).connect(synthFilter);
Creates a kick drum using MembraneSynth (simulates a drum head). The envelope makes it attack quickly (0.001s) and decay over 0.4s
pitchDecay: 0.05,
The pitch of the kick drum falls over 0.05 seconds, creating that classic 'boom' sound
octaves: 4,
The pitch drops across 4 octaves during the decay, creating a deep bass effect
const bass = new Tone.FMSynth({...}).connect(synthFilter);
Creates a bass synth using FM synthesis. The modulation index of 2 creates harmonically rich, gritty tones
oscillator: { type: "square" },
Uses a square wave oscillator for the bass, which has a buzzy, aggressive character
const hihat = new Tone.NoiseSynth({...}).connect(synthFilter);
Creates a hi-hat using white noise (random frequencies) with a very short envelope (0.001s attack, 0.1s decay)
const bassNotes = ["C2", "C2", "Eb2", "C2", "F2", "C2", "Bb1", "C2"];
Defines an 8-note pattern that repeats. The bassline plays these notes in sequence during the 16-step pattern
let step = 0;
Initializes a counter that tracks which step (0-15) of the 16-step sequence is currently playing
Tone.Transport.scheduleRepeat((time) => { ... }, "16n");
Schedules a function to repeat every 16th note. The 'time' parameter is when each step should trigger
if (step % 4 === 0) {
Checks if step is divisible by 4 (steps 0, 4, 8, 12). These are the downbeats where the kick plays
kick.triggerAttackRelease("C1", "8n", time);
Plays the kick at note C1 (very low) for an 8th note duration at the specified time
if (step % 4 === 2) {
Checks if step is 2, 6, 10, or 14 (offbeats). These are where the hi-hat plays
hihat.triggerAttackRelease("16n", time, 0.3);
Plays the hi-hat for a 16th note duration with 0.3 velocity (volume). Note: hihat doesn't use a pitch since it's noise
if (step % 4 !== 0) {
Checks if step is NOT divisible by 4 (all steps except 0, 4, 8, 12). These are where the bassline plays
bass.triggerAttackRelease(bassNotes[step % 8], "16n", time, 0.5);
Plays the bass using a note from bassNotes array (cycling through 8 notes). The 0.5 is velocity (volume)
step = (step + 1) % 16;
Increments step by 1, then uses modulo 16 to wrap it back to 0 after reaching 15, creating a 16-step loop
Tone.Transport.bpm.value = 130;
Sets the tempo to 130 beats per minute
Tone.Transport.start();
Starts the sequencer transport, which begins triggering all scheduled notes
audioStarted = true;
Sets the global flag to true, allowing the draw() function to update audio parameters based on mouse movement
startBtn.remove();
Removes the start button from the page since audio is now playing

windowResized()

windowResized() is a p5.js lifecycle function that fires whenever the window size changes. This keeps the canvas fullscreen and responsive to window resizing.

function windowResized() {
  resizeCanvas(windowWidth, windowHeight);
}

Line by Line:

function windowResized() {
This p5.js built-in function is called automatically whenever the browser window is resized
resizeCanvas(windowWidth, windowHeight);
Updates the canvas size to match the new window dimensions, ensuring the shader fills the entire screen

📦 Key Variables

vertShader string

Contains the GLSL vertex shader code that processes each vertex of the geometry. In this sketch, it's a simple pass-through that doesn't modify vertex positions

const vertShader = `precision highp float; ...`;
fragShader string

Contains the GLSL fragment shader code that calculates the color of each pixel. This is where the raymarching algorithm and liquid metal effect are implemented

const fragShader = `precision highp float; ...`;
liquidShader object

Stores the compiled WebGL shader program created from vertShader and fragShader. Used in draw() to render the effect

let liquidShader = createShader(vertShader, fragShader);
smoothVelX number

Stores the smoothed horizontal velocity of the mouse. Updated each frame using lerp() and damping to create smooth, natural motion

let smoothVelX = 0;
smoothVelY number

Stores the smoothed vertical velocity of the mouse. Updated each frame using lerp() and damping to create smooth, natural motion

let smoothVelY = 0;
audioStarted boolean

Flag that tracks whether the user has clicked the start button and audio is initialized. Used to prevent audio updates before audio context is ready

let audioStarted = false;
startBtn object

Stores the p5.js button element that triggers audio initialization. Removed from the page after audio starts

let startBtn = createButton('ENTER EXPERIENCE');
masterVolume object

Tone.js Volume node that controls the overall output volume. Its value is modulated by mouse velocity in draw()

let masterVolume = new Tone.Volume(-20).toDestination();
synthFilter object

Tone.js Filter node (lowpass) that all synthesizers route through. Its frequency is modulated by mouse velocity to create tonal changes

let synthFilter = new Tone.Filter(250, 'lowpass').connect(masterVolume);

🧪 Try This!

Experiment with the code by making these changes:

  1. Change the BPM in initAudio() from 130 to 100 or 160 to make the beat faster or slower. Line: 'Tone.Transport.bpm.value = 130;'
  2. Modify the bassNotes array in initAudio() to create a different melody. Try: const bassNotes = ["C2", "D2", "E2", "F2", "G2", "A2", "B2", "C3"]; to create an ascending scale
  3. Change the kick drum pitch from "C1" to "G0" or "D1" in the kick trigger to make it sound deeper or higher. Line: kick.triggerAttackRelease("C1", "8n", time);
  4. Adjust the velocity smoothing factor from 0.1 to 0.05 or 0.2 in draw() to make the shader ripples respond faster or slower to mouse movement. Line: smoothVelX = lerp(smoothVelX, dx, 0.1);
  5. Change the filter's starting frequency from 250 to 500 or 1000 in initAudio() to start with a brighter sound. Line: synthFilter = new Tone.Filter(250, "lowpass").connect(masterVolume);
  6. Modify the ripple amplitude calculation in the fragment shader by changing the multiplier from 0.015 to 0.03 or 0.005 to make ripples more or less pronounced. Search for: 'float amp = min(velMag * 0.015, 0.6);'
  7. Change the sphere radius in the fragment shader from 1.0 to 0.5 or 1.5 to make the liquid metal sphere smaller or larger. Line: 'float d = sdSphere(p, 1.0);'
Open in Editor & Experiment →

🔧 Potential Improvements

Here are some ways this code could be enhanced:

NOTE initAudio() - hihat.triggerAttackRelease()

Unlike the pitched synths, NoiseSynth's triggerAttackRelease() takes no note argument: its signature is (duration, time, velocity). The call hihat.triggerAttackRelease("16n", time, 0.3) is therefore correct as written, but the asymmetry with kick.triggerAttackRelease("C1", "8n", time) is easy to misread as a parameter-order bug

💡 Add a comment at the call site, such as '// NoiseSynth: (duration, time, velocity), no pitch argument', so readers don't mistake the missing note for an error

PERFORMANCE draw() - shader uniforms

All four uniforms are set every frame, but only u_time, u_mouse, and u_mouse_vel actually change each frame; u_resolution only changes when the window is resized

💡 Set u_resolution once after compiling the shader and again inside windowResized(), rather than on every frame. The saving is small, but it removes redundant CPU-to-GPU state updates

STYLE Global scope - variable declarations

Global variables (audioStarted, startBtn, masterVolume, synthFilter) are not clearly grouped or commented, making it hard to understand the overall structure

💡 Add a comment section at the top: '// ===== GLOBAL STATE =====' and group related variables together with explanatory comments

FEATURE draw() - audio reactivity

Mouse velocity only controls filter frequency and volume. The visual shader doesn't directly respond to audio frequency content

💡 Add FFT analysis to extract frequency data from the audio output and pass it to the shader as a uniform to make visuals respond to actual sound frequencies, not just mouse speed

NOTE draw() - velocity damping

Velocity is damped by 0.96 each frame after smoothing, so when the mouse stops, the ripples linger while the value decays toward zero. With dx = 0, the lerp and the damping combine into an effective per-frame factor of 0.9 × 0.96 = 0.864, so the lingering lasts well under a second at 60fps

💡 If you want ripples to settle faster, use a more aggressive damping factor such as 0.92; raise it toward 0.99 to make motion trail off more slowly
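To put numbers on the decay: with a pure per-frame multiplier k, velocity after n frames is v·kⁿ, so the frames needed to fall below a threshold is log(threshold/v)/log(k). This is a simplified model; in the sketch, the lerp step (with dx = 0) shrinks the value further, so the real decay is somewhat faster than these figures.

```javascript
// Frames until a value decaying by factor k per frame drops below a
// threshold: solve v * k^n < threshold for n.
function framesToDecay(v0, k, threshold) {
  return Math.ceil(Math.log(threshold / v0) / Math.log(k));
}

// Starting from velocity 20, decaying below 0.1:
console.log(framesToDecay(20, 0.96, 0.1)); // 130
console.log(framesToDecay(20, 0.92, 0.1)); // 64
```

At 60 fps, 130 frames is roughly 2.2 seconds of lingering ripple with k = 0.96 versus about 1.1 seconds with k = 0.92, which is why the damping factor is the right knob for tuning how quickly the sphere settles.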

STYLE Fragment shader - magic numbers

The shader contains many hardcoded numbers (0.015, 0.6, 10.0, 8.0, 6.0, 7.0, 3.0, 1.5, 1.2, 0.03) that control ripple behavior but aren't clearly labeled

💡 Add comments explaining what each number does: 'float freq = 10.0; // ripple frequency' or consider making them uniforms so they can be adjusted from JavaScript
