uhh very cool game for cdww but tiny diffrnet

This sketch creates an interactive, shimmering chrome sphere rendered with WebGL raymarching that responds to mouse movement. The metallic surface ripples and distorts with deep blues and purples, while a procedural techno beat generated with Tone.js plays in the background, with audio parameters modulated by the user's mouse velocity.

πŸŽ“ Concepts You'll Learn

WebGL shaders · Raymarching · Signed distance functions · Normal mapping · Fresnel effect · Procedural audio generation · Audio reactivity · Mouse interaction · Velocity smoothing · GLSL fragment shaders

πŸ”„ Code Flow

Code flow showing setup(), draw(), initAudio(), and windowResized()

πŸ’‘ Click on function names in the diagram to jump to their code

graph TD
  start[Start] --> setup[setup]
  setup --> canvas-creation[Canvas Setup]
  setup --> button-creation[Start Button Creation]
  setup --> shader-creation[Shader Compilation]
  setup --> initaudio[initaudio]
  setup --> windowresized[windowResized]
  setup --> ui-cleanup[UI Cleanup]
  setup --> canvas-resize[Canvas Resize]
  setup --> draw[draw loop]
  click setup href "#fn-setup"
  click initaudio href "#fn-initaudio"
  click windowresized href "#fn-windowresized"
  click canvas-creation href "#sub-canvas-creation"
  click button-creation href "#sub-button-creation"
  click shader-creation href "#sub-shader-creation"
  click ui-cleanup href "#sub-ui-cleanup"
  click canvas-resize href "#sub-canvas-resize"
  draw --> velocity-calculation[Instantaneous Velocity]
  draw --> velocity-smoothing[Velocity Smoothing]
  draw --> velocity-damping[Velocity Damping]
  draw --> audio-reactivity-block[Audio Reactivity]
  draw --> filter-modulation[Filter Frequency Modulation]
  draw --> volume-modulation[Volume Modulation]
  draw --> shader-activation[Shader Activation]
  draw --> uniform-passing[Uniform Passing]
  click draw href "#fn-draw"
  click velocity-calculation href "#sub-velocity-calculation"
  click velocity-smoothing href "#sub-velocity-smoothing"
  click velocity-damping href "#sub-velocity-damping"
  click audio-reactivity-block href "#sub-audio-reactivity-block"
  click filter-modulation href "#sub-filter-modulation"
  click volume-modulation href "#sub-volume-modulation"
  click shader-activation href "#sub-shader-activation"
  click uniform-passing href "#sub-uniform-passing"
  initaudio --> tone-start[Audio Context Initialization]
  initaudio --> master-volume-creation[Master Volume Node]
  initaudio --> filter-creation[Lowpass Filter]
  initaudio --> kick-creation[Kick Drum Synthesizer]
  initaudio --> bass-creation[Bass Synthesizer]
  initaudio --> hihat-creation[Hi-Hat Synthesizer]
  initaudio --> sequencer-loop[Sequencer Loop]
  click tone-start href "#sub-tone-start"
  click master-volume-creation href "#sub-master-volume-creation"
  click filter-creation href "#sub-filter-creation"
  click kick-creation href "#sub-kick-creation"
  click bass-creation href "#sub-bass-creation"
  click hihat-creation href "#sub-hihat-creation"
  click sequencer-loop href "#sub-sequencer-loop"
  sequencer-loop --> kick-trigger[Kick Trigger]
  sequencer-loop --> hihat-trigger[Hi-Hat Trigger]
  sequencer-loop --> bass-trigger[Bass Trigger]
  click kick-trigger href "#sub-kick-trigger"
  click hihat-trigger href "#sub-hihat-trigger"
  click bass-trigger href "#sub-bass-trigger"
  transport-start[Transport Start] --> sequencer-loop
  click transport-start href "#sub-transport-start"

πŸ“ Code Breakdown

setup()

setup() runs once when the sketch starts. In this sketch, it initializes the WebGL canvas, creates UI elements, and compiles the shaders that will render the chrome sphere. The button requires a user gesture to start audio due to browser security policies.

function setup() {
  createCanvas(windowWidth, windowHeight, WEBGL);
  
  // Create UI overlay to trigger audio context
  startBtn = createButton('ENTER EXPERIENCE');
  startBtn.id('start-btn');
  startBtn.mousePressed(initAudio);
  
  // Create instructions text (hidden initially)
  instructions = createDiv('MOVE MOUSE / TOUCH TO SHAPE THE METAL & MODULATE THE BEAT');
  instructions.id('instructions');
  
  // Create creator credit text
  creditText = createDiv('MADE BY CORBUN');
  creditText.id('credit');
  
  liquidShader = createShader(vertShader, fragShader);
  noStroke();
}

πŸ”§ Subcomponents:

function-call Canvas Setup createCanvas(windowWidth, windowHeight, WEBGL)

Creates a fullscreen WebGL canvas that fills the entire window

function-call Start Button Creation startBtn = createButton('ENTER EXPERIENCE')

Creates an interactive button to trigger audio initialization

function-call Shader Compilation liquidShader = createShader(vertShader, fragShader)

Compiles the vertex and fragment shaders into a WebGL shader program

Line by Line:

createCanvas(windowWidth, windowHeight, WEBGL)
Creates a WebGL-enabled canvas that fills the entire window. WEBGL enables 3D graphics and shader support.
startBtn = createButton('ENTER EXPERIENCE')
Creates a clickable button that will trigger the audio initialization when pressed.
startBtn.id('start-btn')
Assigns a CSS ID to the button so it can be styled with the rules in style.css.
startBtn.mousePressed(initAudio)
Connects the button click event to the initAudio() function, which starts the audio context and music.
instructions = createDiv('MOVE MOUSE / TOUCH TO SHAPE THE METAL & MODULATE THE BEAT')
Creates a text element that displays instructions to the user.
creditText = createDiv('MADE BY CORBUN')
Creates a text element that credits the original creator.
liquidShader = createShader(vertShader, fragShader)
Compiles the vertex shader and fragment shader strings into a working WebGL shader program stored in liquidShader.
noStroke()
Disables stroke outlines for all shapes drawn in this sketch.

draw()

draw() runs 60 times per second (by default) and is where all animation happens. This sketch uses draw() to calculate mouse velocity, modulate audio based on that velocity, and pass data to the shader. The shader then renders the chrome sphere for each pixel on screen using raymarchingβ€”a technique that traces rays from the camera through each pixel to find where they intersect the sphere surface.

function draw() {
  // Calculate instantaneous velocity
  let dx = mouseX - pmouseX;
  let dy = mouseY - pmouseY;

  // Smooth the velocity
  smoothVelX = lerp(smoothVelX, dx, 0.1);
  smoothVelY = lerp(smoothVelY, dy, 0.1);

  // Dampen velocity down to zero
  smoothVelX *= 0.96;
  smoothVelY *= 0.96;
  
  // --- AUDIO REACTIVITY ---
  if (audioStarted) {
    // Calculate the magnitude of our velocity
    let velMag = sqrt(smoothVelX * smoothVelX + smoothVelY * smoothVelY);
    
    // Map velocity to filter cutoff (low/muffled when still -> high/bright when moving fast)
    let targetFreq = map(velMag, 0, 40, 250, 8000, true);
    synthFilter.frequency.rampTo(targetFreq, 0.1);
    
    // Map velocity to volume (-20dB when still -> 0dB when moving)
    let targetVol = map(velMag, 0, 30, -20, 0, true);
    masterVolume.volume.rampTo(targetVol, 0.1);
  }

  // Activate shader
  shader(liquidShader);

  // Pass uniforms
  liquidShader.setUniform('u_resolution', [width, height]);
  liquidShader.setUniform('u_time', millis() / 1000.0);
  liquidShader.setUniform('u_mouse', [mouseX, mouseY]);
  liquidShader.setUniform('u_mouse_vel', [smoothVelX, smoothVelY]);

  // Draw fullscreen quad
  rect(-width / 2, -height / 2, width, height);
}
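The raymarching technique described above can be sketched outside the shader as well. The following is a minimal JavaScript model (plain math, not the sketch's actual GLSL): `sphereSDF` and `raymarch` are illustrative names, and the camera setup is an assumption, but the core sphere-tracing loop mirrors what the fragment shader does per pixel.

```javascript
// Signed distance from point p to a sphere of radius r centered at the origin:
// negative inside, zero on the surface, positive outside.
function sphereSDF(p, r) {
  return Math.hypot(p[0], p[1], p[2]) - r;
}

// March a ray (origin ro, normalized direction rd) until the SDF reports we are
// within eps of a surface, or the ray escapes. Returns the travel distance, or
// null on a miss.
function raymarch(ro, rd, { maxSteps = 80, maxDist = 20, eps = 1e-4 } = {}) {
  let t = 0;
  for (let i = 0; i < maxSteps; i++) {
    const p = [ro[0] + rd[0] * t, ro[1] + rd[1] * t, ro[2] + rd[2] * t];
    const d = sphereSDF(p, 1.0);
    if (d < eps) return t;  // hit: close enough to the surface
    t += d;                 // sphere tracing: the SDF value is a safe step size
    if (t > maxDist) break; // ray left the scene
  }
  return null;              // miss
}

// A ray starting at z = -5 aimed down the z-axis hits the unit sphere after
// travelling about 4 units (5 minus the radius 1).
const hit = raymarch([0, 0, -5], [0, 0, 1]);
```

The shader runs this loop once per pixel, which is why the iteration count (80 here, matching the sketch) dominates rendering cost.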

πŸ”§ Subcomponents:

calculation Instantaneous Velocity let dx = mouseX - pmouseX; let dy = mouseY - pmouseY;

Calculates how far the mouse moved since the last frame

calculation Velocity Smoothing smoothVelX = lerp(smoothVelX, dx, 0.1); smoothVelY = lerp(smoothVelY, dy, 0.1);

Smooths velocity changes using linear interpolation to prevent jittery values

calculation Velocity Damping smoothVelX *= 0.96; smoothVelY *= 0.96;

Gradually reduces velocity to zero when the mouse stops moving

conditional Audio Reactivity if (audioStarted) { ... }

Only modulates audio when the audio context has been started

calculation Filter Frequency Modulation let targetFreq = map(velMag, 0, 40, 250, 8000, true); synthFilter.frequency.rampTo(targetFreq, 0.1);

Maps mouse velocity to the filter cutoff frequency, making the sound brighter when moving faster

calculation Volume Modulation let targetVol = map(velMag, 0, 30, -20, 0, true); masterVolume.volume.rampTo(targetVol, 0.1);

Maps mouse velocity to volume, making the music louder when the user moves the mouse

function-call Shader Activation shader(liquidShader);

Activates the WebGL shader program for subsequent drawing

function-call Uniform Passing liquidShader.setUniform(...)

Sends data from JavaScript to the shader (time, mouse position, resolution, velocity)

Line by Line:

let dx = mouseX - pmouseX;
Calculates horizontal mouse movement by subtracting the previous frame's X position from the current X position.
let dy = mouseY - pmouseY;
Calculates vertical mouse movement by subtracting the previous frame's Y position from the current Y position.
smoothVelX = lerp(smoothVelX, dx, 0.1);
Smooths the X velocity using linear interpolation (lerp). The 0.1 factor means it moves 10% of the way toward the new value each frame, creating smooth transitions.
smoothVelY = lerp(smoothVelY, dy, 0.1);
Smooths the Y velocity using the same interpolation technique as X velocity.
smoothVelX *= 0.96;
Multiplies velocity by 0.96 each frame, gradually reducing it to zero. This creates a 'friction' effect where movement naturally decays.
smoothVelY *= 0.96;
Applies the same damping to Y velocity as X velocity.
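The smoothing and damping steps can be simulated in isolation. This is a hypothetical snippet (the `lerp` here models p5's, and `stepVelocity` is an illustrative name): one large mouse jump is fed in, then stillness, showing the velocity rise partway toward the input and decay back to zero.

```javascript
// p5-style linear interpolation: move amt of the way from a to b.
const lerp = (a, b, amt) => a + (b - a) * amt;

// One frame of the update from draw(): smooth toward the raw mouse delta,
// then apply friction-like damping.
function stepVelocity(smoothVel, rawDelta, smoothing = 0.1, damping = 0.96) {
  return lerp(smoothVel, rawDelta, smoothing) * damping;
}

// A single big mouse jump (dx = 40) followed by a second of stillness:
let v = 0;
v = stepVelocity(v, 40);  // only 10% of the way toward 40, then damped
const afterJump = v;      // 40 * 0.1 * 0.96 = 3.84

for (let frame = 0; frame < 60; frame++) v = stepVelocity(v, 0);
const afterStillness = v; // decays toward zero (v *= 0.9 * 0.96 per frame)
```

This is why the audio swells with movement but never cuts off abruptly: the damped velocity approaches zero asymptotically rather than snapping to it.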
if (audioStarted) {
Only runs the audio reactivity code if the user has clicked the start button and audio has been initialized.
let velMag = sqrt(smoothVelX * smoothVelX + smoothVelY * smoothVelY);
Calculates the magnitude (total speed) of the velocity vector using the Pythagorean theorem. This gives a single number representing how fast the mouse is moving.
let targetFreq = map(velMag, 0, 40, 250, 8000, true);
Maps velocity magnitude (0-40) to filter frequency (250-8000 Hz). Slow movement = muffled sound (250 Hz), fast movement = bright sound (8000 Hz). The 'true' parameter constrains the value to the output range.
synthFilter.frequency.rampTo(targetFreq, 0.1);
Smoothly transitions the filter frequency to the target value over 0.1 seconds, preventing abrupt audio changes.
let targetVol = map(velMag, 0, 30, -20, 0, true);
Maps velocity magnitude (0-30) to volume in decibels (-20dB to 0dB). Slow = quiet, fast = loud.
masterVolume.volume.rampTo(targetVol, 0.1);
Smoothly transitions the volume to the target value over 0.1 seconds.
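The two map() calls are pure arithmetic and can be reproduced without p5. The function below is a model of p5's map() behavior (not p5's actual source), including the final boolean that clamps the result to the output range:

```javascript
// A model of p5's map(): linearly rescale value from [inLo, inHi] to
// [outLo, outHi]; when clamp is true, constrain the result to the output range.
function map(value, inLo, inHi, outLo, outHi, clamp = false) {
  let out = outLo + ((value - inLo) / (inHi - inLo)) * (outHi - outLo);
  if (clamp) {
    out = Math.min(Math.max(out, Math.min(outLo, outHi)), Math.max(outLo, outHi));
  }
  return out;
}

const stillFreq = map(0, 0, 40, 250, 8000, true);  // mouse still -> 250 Hz
const midFreq   = map(20, 0, 40, 250, 8000, true); // halfway -> 4125 Hz
const fastFreq  = map(99, 0, 40, 250, 8000, true); // beyond range -> clamped to 8000 Hz
```

Without the clamp, a violent mouse flick (velMag well above 40) would push the cutoff far past 8000 Hz; the `true` argument is what keeps the mapping bounded.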
shader(liquidShader);
Activates the WebGL shader program so that subsequent drawing commands use this shader.
liquidShader.setUniform('u_resolution', [width, height]);
Passes the canvas resolution to the shader so it can calculate proper aspect ratio and pixel coordinates.
liquidShader.setUniform('u_time', millis() / 1000.0);
Passes elapsed time in seconds to the shader. The shader uses this to animate the ripples and color shifts over time.
liquidShader.setUniform('u_mouse', [mouseX, mouseY]);
Passes the current mouse position to the shader (though this particular shader doesn't use it in the current implementation).
liquidShader.setUniform('u_mouse_vel', [smoothVelX, smoothVelY]);
Passes the smoothed mouse velocity to the shader, which uses it to scale the amplitude of the ripples.
rect(-width / 2, -height / 2, width, height);
Draws a rectangle covering the entire canvas. In WebGL mode, this rectangle is rendered using the active shader, creating the fullscreen effect.

initAudio()

initAudio() is called when the user clicks the start button. It initializes the Tone.js audio library, creates three synthesizers (kick, bass, hi-hat), and sets up a 16-step sequencer that plays a procedural techno beat at 130 BPM. The function is async because starting the Web Audio API context is asynchronous and requires a user gesture for security reasons. The sequencer uses modulo arithmetic to trigger different instruments on different steps, creating an interlocking kick/hi-hat/bass groove.

async function initAudio() {
  // Required: Start Tone context on user gesture
  await Tone.start();
  
  // Create master effects nodes controlled by the mouse movement
  masterVolume = new Tone.Volume(-20).toDestination();
  synthFilter = new Tone.Filter(250, "lowpass").connect(masterVolume);
  
  // 1. Kick Drum
  const kick = new Tone.MembraneSynth({
    pitchDecay: 0.05,
    octaves: 4,
    oscillator: { type: "sine" },
    envelope: { attack: 0.001, decay: 0.4, sustain: 0.01, release: 1.4 }
  }).connect(synthFilter);
  
  // 2. Gritty FM Bassline
  const bass = new Tone.FMSynth({
    harmonicity: 1,
    modulationIndex: 2,
    oscillator: { type: "square" },
    envelope: { attack: 0.01, decay: 0.2, sustain: 0.1, release: 0.5 }
  }).connect(synthFilter);
  
  // 3. Hi-Hat
  const hihat = new Tone.NoiseSynth({
    noise: { type: "white" },
    envelope: { attack: 0.001, decay: 0.1, sustain: 0, release: 0.1 }
  }).connect(synthFilter);
  
  // Bassline notes pattern
  const bassNotes = ["C2", "C2", "Eb2", "C2", "F2", "C2", "Bb1", "C2"];
  
  // Sequencer loop (16th notes)
  let step = 0;
  Tone.Transport.scheduleRepeat((time) => {
    // 4/4 Kick on the downbeat (0, 4, 8, 12)
    if (step % 4 === 0) {
      kick.triggerAttackRelease("C1", "8n", time);
    }
    
    // Offbeat hi-hat (2, 6, 10, 14)
    if (step % 4 === 2) {
      hihat.triggerAttackRelease("16n", time, 0.3);
    }
    
    // Driving 16th note bassline (skipping the kick beats)
    if (step % 4 !== 0) {
      bass.triggerAttackRelease(bassNotes[step % 8], "16n", time, 0.5);
    }
    
    step = (step + 1) % 16;
  }, "16n");

  // Set BPM and start transport
  Tone.Transport.bpm.value = 130;
  Tone.Transport.start();
  
  // Clean up UI and update state
  audioStarted = true;
  startBtn.remove();
  
  // Fade in texts
  instructions.style('opacity', '1');
  creditText.style('opacity', '1');
}

πŸ”§ Subcomponents:

function-call Audio Context Initialization await Tone.start();

Initializes the Web Audio API context, required by browser security policies

object-creation Master Volume Node masterVolume = new Tone.Volume(-20).toDestination();

Creates the main volume control that connects to the speakers

object-creation Lowpass Filter synthFilter = new Tone.Filter(250, "lowpass").connect(masterVolume);

Creates a filter that removes high frequencies, controlled by mouse movement

object-creation Kick Drum Synthesizer const kick = new Tone.MembraneSynth({...}).connect(synthFilter);

Creates a drum-like synth that produces the bass kick sound

object-creation FM Bass Synthesizer const bass = new Tone.FMSynth({...}).connect(synthFilter);

Creates a frequency-modulated synth for the bassline melody

object-creation Hi-Hat Noise Synthesizer const hihat = new Tone.NoiseSynth({...}).connect(synthFilter);

Creates a noise-based synth for hi-hat percussion sounds

function-call Sequencer Loop Tone.Transport.scheduleRepeat((time) => {...}, "16n");

Creates a repeating pattern that triggers drum and bass notes at 16th note intervals

conditional Kick Trigger if (step % 4 === 0) { kick.triggerAttackRelease(...) }

Triggers the kick drum on beats 0, 4, 8, and 12 of a 16-step pattern

conditional Hi-Hat Trigger if (step % 4 === 2) { hihat.triggerAttackRelease(...) }

Triggers the hi-hat on the offbeats (steps 2, 6, 10, 14)

conditional Bass Trigger if (step % 4 !== 0) { bass.triggerAttackRelease(...) }

Triggers the bassline on every 16th note except the kick beats

function-call Transport Start Tone.Transport.start();

Starts the timing system that drives all scheduled events

function-call UI Cleanup startBtn.remove(); instructions.style('opacity', '1');

Removes the start button and fades in the instructions and credit text

Line by Line:

async function initAudio() {
Declares an async function, which allows the use of 'await' for asynchronous operations like starting the audio context.
await Tone.start();
Waits for the Web Audio API context to start. This is required by browsers before any sound can play. The 'await' keyword pauses execution until Tone.start() completes.
masterVolume = new Tone.Volume(-20).toDestination();
Creates a Volume node starting at -20dB (quiet) and connects it to the destination (speakers). This is the final output point for all audio.
synthFilter = new Tone.Filter(250, "lowpass").connect(masterVolume);
Creates a lowpass filter starting at 250 Hz and connects it to the master volume. All synths will connect to this filter, so their output passes through it.
const kick = new Tone.MembraneSynth({...}).connect(synthFilter);
Creates a MembraneSynth (drum-like sound) with specific envelope settings and connects it to the filter. The envelope controls how the sound attacks, decays, sustains, and releases.
pitchDecay: 0.05,
The pitch decays (drops in frequency) over 50 milliseconds, creating a realistic drum sound that starts high and quickly drops.
octaves: 4,
The pitch drops across 4 octaves, creating a wide pitch sweep that sounds like a deep kick drum.
const bass = new Tone.FMSynth({...}).connect(synthFilter);
Creates an FM (frequency modulation) synthesizer for the bassline. FM synthesis creates complex timbres by modulating one oscillator with another.
harmonicity: 1,
The modulating oscillator is at the same frequency as the carrier, creating a gritty, metallic sound.
modulationIndex: 2,
Controls how much the modulator affects the carrier. Higher values create more complex, noisier timbres.
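Basic FM synthesis has a compact formula: the carrier's phase is pushed around by the modulator, carrier(t) = sin(2π·fc·t + I·sin(2π·fm·t)), where harmonicity = fm/fc and I is the modulation index. The snippet below is a simplified model for illustration only (a sine carrier with no envelopes, unlike Tone.FMSynth's square-wave patch above); `fmSample` is an illustrative name.

```javascript
// One sample of basic FM synthesis: the modulator's output is added to the
// carrier's phase. harmonicity = fm / fc; modulationIndex scales the phase
// deviation. Simplified model: sine carrier, no envelopes.
function fmSample(t, fc, harmonicity = 1, modulationIndex = 2) {
  const fm = fc * harmonicity; // modulator frequency
  const phaseMod = modulationIndex * Math.sin(2 * Math.PI * fm * t);
  return Math.sin(2 * Math.PI * fc * t + phaseMod);
}

// With modulationIndex = 0 the phase deviation vanishes and this degenerates
// to a plain sine wave.
const plain = (t, fc) => Math.sin(2 * Math.PI * fc * t);
```

Raising modulationIndex adds sideband partials around the carrier, which is why higher values sound noisier and more complex.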
const hihat = new Tone.NoiseSynth({...}).connect(synthFilter);
Creates a NoiseSynth that generates white noise shaped by an envelope, creating a hi-hat percussion sound.
const bassNotes = ["C2", "C2", "Eb2", "C2", "F2", "C2", "Bb1", "C2"];
Defines an 8-note pattern for the bassline. This pattern repeats throughout the song, creating a memorable melodic motif.
let step = 0;
Initializes a step counter that tracks which position in the 16-step sequencer pattern we're at.
Tone.Transport.scheduleRepeat((time) => {...}, "16n");
Schedules a repeating function to run every 16th note. The 'time' parameter is the exact audio-thread time when this should trigger.
if (step % 4 === 0) {
Checks if step is divisible by 4 (steps 0, 4, 8, 12). These are the downbeats where the kick drum plays.
kick.triggerAttackRelease("C1", "8n", time);
Triggers the kick drum at note C1 (very low frequency) for an 8th note duration at the specified audio time.
if (step % 4 === 2) {
Checks if step modulo 4 equals 2 (steps 2, 6, 10, 14). These are the offbeats where the hi-hat plays.
hihat.triggerAttackRelease("16n", time, 0.3);
Triggers the hi-hat for a 16th note duration with 0.3 velocity (volume relative to max).
if (step % 4 !== 0) {
Checks if step is NOT divisible by 4, meaning every 16th note except the kick beats.
bass.triggerAttackRelease(bassNotes[step % 8], "16n", time, 0.5);
Triggers the bass synth with a note from the bassNotes array (cycling through 8 notes), for a 16th note duration at 0.5 velocity.
step = (step + 1) % 16;
Increments the step counter and wraps it back to 0 after reaching 15, creating a 16-step loop.
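The three modulo tests above can be tabulated without any audio. This hypothetical snippet (`patternForStep` is an illustrative name) replays one bar of the 16-step logic and records which voices fire on each step:

```javascript
// Replay one bar of the 16-step sequencer logic from initAudio(),
// recording which voices fire on each step, without Tone.js.
const bassNotes = ["C2", "C2", "Eb2", "C2", "F2", "C2", "Bb1", "C2"];

function patternForStep(step) {
  return {
    kick:  step % 4 === 0,                              // downbeats 0, 4, 8, 12
    hihat: step % 4 === 2,                              // offbeats 2, 6, 10, 14
    bass:  step % 4 !== 0 ? bassNotes[step % 8] : null, // every 16th except kicks
  };
}

const bar = Array.from({ length: 16 }, (_, step) => patternForStep(step));
```

Reading `bar` out loud gives the groove: kick alone on 0, bass notes filling steps 1 and 3 of every beat, and the hi-hat doubling the bass on step 2.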
Tone.Transport.bpm.value = 130;
Sets the tempo to 130 beats per minute, controlling how fast all scheduled events play.
Tone.Transport.start();
Starts the transport (timing system), which begins executing all scheduled events.
audioStarted = true;
Sets the global audioStarted flag to true, which enables audio reactivity in the draw() function.
startBtn.remove();
Removes the start button from the DOM since audio has been initialized.
instructions.style('opacity', '1');
Fades in the instructions text by setting its opacity to 1 (the CSS transition property animates this over 2 seconds).
creditText.style('opacity', '1');
Fades in the credit text in the same way.

windowResized()

windowResized() is a built-in p5.js function that gets called automatically whenever the browser window is resized. By calling resizeCanvas() inside it, we ensure the WebGL canvas always fills the entire window, maintaining the fullscreen experience even when the user resizes their browser.

function windowResized() {
  resizeCanvas(windowWidth, windowHeight);
}

πŸ”§ Subcomponents:

function-call Canvas Resize resizeCanvas(windowWidth, windowHeight)

Resizes the canvas to match the current window dimensions

Line by Line:

function windowResized() {
This is a special p5.js function that automatically runs whenever the window is resized.
resizeCanvas(windowWidth, windowHeight);
Resizes the canvas to match the new window width and height, ensuring the sketch remains fullscreen.

πŸ“¦ Key Variables

vertShader string

Contains the GLSL vertex shader code. In this sketch, it's a simple pass-through shader that doesn't modify vertex positions.

const vertShader = `precision highp float; ...`;
fragShader string

Contains the GLSL fragment shader code that performs raymarching to render the liquid chrome sphere with ripples and reflections.

const fragShader = `precision highp float; ...`;
liquidShader object

Stores the compiled WebGL shader program created from vertShader and fragShader. Used in draw() to render the scene.

let liquidShader = createShader(vertShader, fragShader);
smoothVelX number

Stores the smoothed horizontal mouse velocity. Updated each frame using lerp() and damping to create smooth animation.

let smoothVelX = 0;
smoothVelY number

Stores the smoothed vertical mouse velocity. Updated each frame using lerp() and damping to create smooth animation.

let smoothVelY = 0;
audioStarted boolean

Tracks whether the audio context has been initialized. Controls whether audio reactivity is active in draw().

let audioStarted = false;
startBtn object

Stores a reference to the 'ENTER EXPERIENCE' button element. Used to remove it after audio starts.

let startBtn = createButton('ENTER EXPERIENCE');
instructions object

Stores a reference to the instructions text div that appears after audio starts.

let instructions = createDiv('MOVE MOUSE / TOUCH TO SHAPE THE METAL & MODULATE THE BEAT');
creditText object

Stores a reference to the creator credit text div that appears after audio starts.

let creditText = createDiv('MADE BY CORBUN');
masterVolume object

A Tone.js Volume node that controls the overall output volume. Connected to speakers and modulated by mouse velocity.

let masterVolume = new Tone.Volume(-20).toDestination();
synthFilter object

A Tone.js lowpass Filter node that all synthesizers connect through. Cutoff frequency is modulated by mouse velocity.

let synthFilter = new Tone.Filter(250, 'lowpass').connect(masterVolume);

πŸ§ͺ Try This!

Experiment with the code by making these changes:

  1. Change the BPM from 130 to 90 or 160 in the initAudio() function (line with 'Tone.Transport.bpm.value = 130') to speed up or slow down the techno beat.
  2. Modify the bassNotes array in initAudio() to create a different melody. Try changing "C2", "Eb2", "F2", "Bb1" to other notes like "D2", "G2", "A2" to hear how the bassline changes.
  3. In the draw() function, change the velocity mapping ranges (currently 0-40 for frequency and 0-30 for volume) to make the audio reactivity more or less sensitive to mouse movement.
  4. Modify the smoothing factor in draw() from 0.1 to 0.05 or 0.2 to make velocity changes more or less responsive.
  5. Change the damping factor from 0.96 to 0.90 or 0.99 to make the velocity decay faster or slower after you stop moving the mouse.
  6. In the fragment shader, change the ripple frequency from 10.0 to 5.0 or 20.0 to make the surface waves larger or smaller.
  7. Modify the color values in the fragment shader (darkBlue, purple, silver) to create different metallic color schemes.
  8. Change the raymarching loop iteration count from 80 to 40 or 120 to see how it affects rendering quality and performance.

πŸ”§ Potential Improvements

Here are some ways this code could be enhanced:

STYLE initAudio() - hihat.triggerAttackRelease()

The hi-hat call looks like its parameters are in the wrong order, but they are not: Tone.NoiseSynth has no pitch, so its signature is triggerAttackRelease(duration, time, velocity), unlike pitched synths, which take (note, duration, time, velocity). hihat.triggerAttackRelease("16n", time, 0.3) therefore passes duration, time, and velocity correctly.

💡 Add a short comment above the call noting the NoiseSynth signature, so a future reader doesn't "fix" it into a real bug by inserting a note name.

PERFORMANCE draw() - Raymarching loop

The fragment shader performs 80 iterations of raymarching per pixel, which is computationally expensive on high-resolution displays. This can cause performance drops on slower devices.

πŸ’‘ Reduce the loop iteration count from 80 to 40-60 for better performance on mobile devices, or make it adaptive based on device capabilities using a uniform.

STYLE Global variable declarations

Global variables (audioStarted, startBtn, instructions, etc.) are declared without 'let', 'const', or 'var', making them implicit globals which is poor practice.

πŸ’‘ Add proper declarations at the top of the file: 'let audioStarted = false; let startBtn; let instructions; let creditText; let masterVolume; let synthFilter; let liquidShader; let smoothVelX = 0; let smoothVelY = 0;'

FEATURE Audio reactivity

The audio reactivity only responds to mouse movement velocity, not to the actual audio spectrum or beat. The music could be more tightly synchronized with visual changes.

πŸ’‘ Add Tone.js FFT analysis to detect frequency content in the audio and use that to modulate shader parameters like ripple amplitude or color based on bass/mid/treble content.

BUG draw() - Velocity calculation

pmouseX and pmouseY default to 0 until p5.js records a mouse position, so if the pointer first enters the window far from the top-left corner, the first few frames can report a large velocity spike that briefly opens the filter and raises the volume.

💡 Ignore velocity until real mouse movement has been observed (e.g. skip frames until mouseX or mouseY changes from its default), or clamp dx and dy to a sensible maximum before smoothing.

STYLE Fragment shader - Magic numbers

The shader contains many hardcoded values (10.0 for frequency, 0.015 for amplitude scaling, 0.6 for max amplitude) that are difficult to tune without editing shader code.

πŸ’‘ Convert these to uniforms that can be controlled from JavaScript, allowing real-time tweaking without recompiling the shader.
