AI Emotion Particles - xelsed.ai

This sketch creates an interactive particle system that responds to user emotions detected by OpenAI's API. Users type text expressing their feelings, and the sketch analyzes the emotion in real-time, transforming particle colors, speeds, and movement patterns to visually represent emotions like happy (yellow, fast), sad (blue, slow), angry (red, chaotic), calm (green, gentle), and excited (orange, energetic).

🎓 Concepts You'll Learn

Particle systems, Perlin noise, API integration, async/await, object-oriented programming, color manipulation, animation loops, event handling, canvas resizing, encryption/decryption

🔄 Code Flow

Code flow showing getApiKey, setup, draw, analyzeEmotion, windowResized, and the Particle class

💡 Click on function names in the diagram to jump to their code

graph TD
  start[Start] --> setup[setup]
  setup --> draw[draw loop]
  setup --> canvas-creation[Canvas Setup]
  setup --> input-setup[Input Field Creation]
  setup --> particle-initialization[Particle Array Population]
  draw --> trail-effect[Fading Trail Background]
  draw --> particle-loop[Particle Update and Render Loop]
  particle-loop --> particle-lifecycle[Particle Lifecycle Management]
  particle-lifecycle --> particle-initialization
  particle-loop --> particle-update[Particle Update]
  particle-update --> update-method[Update Method]
  update-method --> gravity-application[Gravity Application]
  update-method --> perlin-forces[Perlin Noise Forces]
  update-method --> friction[Friction/Drag]
  particle-update --> display-method[Display Method]
  particle-update --> isdead-method[isDead Method]
  click setup href "#fn-setup"
  click draw href "#fn-draw"
  click canvas-creation href "#sub-canvas-creation"
  click input-setup href "#sub-input-setup"
  click particle-initialization href "#sub-particle-initialization"
  click trail-effect href "#sub-trail-effect"
  click particle-loop href "#sub-particle-loop"
  click particle-lifecycle href "#sub-particle-lifecycle"
  click particle-update href "#sub-particle-update"
  click update-method href "#sub-update-method"
  click gravity-application href "#sub-gravity-application"
  click perlin-forces href "#sub-perlin-forces"
  click friction href "#sub-friction"
  click display-method href "#sub-display-method"
  click isdead-method href "#sub-isdead-method"

๐Ÿ“ Code Breakdown

getApiKey()

This function demonstrates basic encryption/decryption using XOR operations. While not cryptographically secure, it obfuscates the API key from casual inspection. The atob() function is JavaScript's built-in Base64 decoder.

function getApiKey(){
  return atob(encoded).split('').map(c=>String.fromCharCode(c.charCodeAt(0)^key)).join('');
}

🔧 Subcomponents:

calculation Base64 Decoding atob(encoded)

Converts the base64-encoded string back to its original encrypted form

calculation XOR Decryption map(c=>String.fromCharCode(c.charCodeAt(0)^key))

Applies XOR operation with the key to decrypt each character

Line by Line:

atob(encoded)
Decodes the base64-encoded string into its encrypted binary form
split('')
Splits the decoded string into an array of individual characters
map(c=>String.fromCharCode(c.charCodeAt(0)^key))
For each character, gets its character code, XORs it with the key (0x5A), and converts back to a character
join('')
Combines all decrypted characters back into a single string (the API key)
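For context, the encoding side of this scheme is the same operations in the opposite order. The helper below is hypothetical (it is not part of the original sketch) and shows how such an `encoded` string could be produced; `'sk-example-123'` is a placeholder, not a real key.

```javascript
// Hypothetical encoder for the same XOR + Base64 scheme (not in the original sketch).
// XOR with a fixed key is its own inverse, so encode and decode share one transformation.
function encodeApiKey(plainKey, key = 0x5A) {
  const xored = plainKey
    .split('')
    .map(c => String.fromCharCode(c.charCodeAt(0) ^ key)) // XOR each character code
    .join('');
  return btoa(xored); // Base64-encode the XORed string
}

// Mirror of the sketch's getApiKey(), parameterized for the round-trip check below.
function decodeApiKey(encoded, key = 0x5A) {
  return atob(encoded)
    .split('')
    .map(c => String.fromCharCode(c.charCodeAt(0) ^ key))
    .join('');
}
```

Because XOR with the same key undoes itself, `decodeApiKey(encodeApiKey(k))` returns `k` unchanged; as the section notes, this is obfuscation, not real encryption.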

setup()

setup() runs once when the sketch starts. It initializes the canvas, creates UI elements, and populates the particle array. The input event listener means analyzeEmotion will be called on every keystroke, allowing real-time emotion detection.

function setup() {
  createCanvas(windowWidth, windowHeight);
  API_KEY = getApiKey(); // Decrypt the API key

  // Create text input field
  emotionInput = createInput('');
  emotionInput.attribute('placeholder', 'Type something to express your emotion...');
  emotionInput.input(analyzeEmotion); // Call analyzeEmotion on every keystroke

  // Create emotion display element (div)
  emotionDisplay = createDiv(currentEmotion);
  emotionDisplay.id('emotion-display');

  // Initialize 200 particles with the default emotion's configuration
  for (let i = 0; i < 200; i++) {
    particles.push(new Particle(random(width), random(height), emotionConfigs[currentEmotion]));
  }
}

🔧 Subcomponents:

calculation Canvas Setup createCanvas(windowWidth, windowHeight)

Creates a full-screen canvas that fills the entire browser window

calculation Input Field Creation emotionInput = createInput('')

Creates an HTML text input field for users to type emotions

for-loop Particle Array Population for (let i = 0; i < 200; i++) { particles.push(...) }

Creates 200 initial particles distributed randomly across the canvas

Line by Line:

createCanvas(windowWidth, windowHeight)
Creates a p5.js canvas that fills the entire browser window using dynamic width and height values
API_KEY = getApiKey()
Decrypts the OpenAI API key from the encoded string and stores it in the global API_KEY variable
emotionInput = createInput('')
Creates a text input HTML element and stores a reference to it for later use
emotionInput.attribute('placeholder', 'Type something to express your emotion...')
Sets the placeholder text that appears in the input field to guide users
emotionInput.input(analyzeEmotion)
Registers the analyzeEmotion function to run every time the user types in the input field
emotionDisplay = createDiv(currentEmotion)
Creates an HTML div element to display the currently detected emotion
emotionDisplay.id('emotion-display')
Assigns an ID to the emotion display div so CSS styling can be applied to it
for (let i = 0; i < 200; i++) { particles.push(new Particle(...)) }
Creates 200 particle objects with random positions and the default calm emotion configuration

draw()

draw() runs 60 times per second by default. The semi-transparent background creates a motion blur effect, and particles are continuously updated and redrawn. Dead particles are immediately replaced to maintain visual continuity and constant particle count.

function draw() {
  // Semi-transparent background for a fading trail effect
  background(0, 0, 0, 20);

  // Update and display particles
  for (let i = particles.length - 1; i >= 0; i--) {
    particles[i].update();
    particles[i].display();
    if (particles[i].isDead()) {
      // Reinitialize dead particles based on the current emotion
      particles[i] = new Particle(random(width), random(height), emotionConfigs[currentEmotion]);
    }
  }
}

🔧 Subcomponents:

calculation Fading Trail Background background(0, 0, 0, 20)

Creates a semi-transparent black overlay that slowly fades previous frames, creating motion trails

for-loop Particle Update and Render Loop for (let i = particles.length - 1; i >= 0; i--)

Iterates through all particles in reverse order, a pattern that stays safe when removing array elements (here, dead particles are replaced in place, so either direction would work)

conditional Particle Lifecycle Management if (particles[i].isDead()) { particles[i] = new Particle(...) }

Replaces dead particles with new ones to maintain constant particle count

Line by Line:

background(0, 0, 0, 20)
Fills the canvas with semi-transparent black (alpha=20). This creates a fading trail effect instead of clearing the canvas completely
for (let i = particles.length - 1; i >= 0; i--)
Loops through particles in reverse order (from last to first). Reverse iteration is the safe habit when removing array elements; in this loop particles are replaced in place, so the direction is a matter of convention
particles[i].update()
Calls the update method on the current particle, which applies forces, gravity, and updates its position
particles[i].display()
Calls the display method to draw the particle at its current position with appropriate color and transparency
if (particles[i].isDead())
Checks if the particle's age exceeds its lifespan, indicating it should be replaced
particles[i] = new Particle(random(width), random(height), emotionConfigs[currentEmotion])
Creates a new particle at a random position with the current emotion's configuration, replacing the dead particle

analyzeEmotion()

This async function demonstrates modern JavaScript patterns: async/await for handling asynchronous API calls, try/catch/finally for error handling, and guard clauses to prevent invalid states. The function is called on every keystroke via the input event listener, making emotion detection real-time. The loading flag prevents race conditions where multiple API calls could overlap.

async function analyzeEmotion() {
  const text = emotionInput.value();

  // If input is empty, default to calm
  if (text.trim() === '') {
    currentEmotion = "calm";
    emotionDisplay.html(currentEmotion);
    return;
  }

  // Prevent multiple API calls if one is already in progress
  if (loading) {
    console.log("Still loading, please wait...");
    return;
  }

  loading = true;
  emotionDisplay.html(`Loading...`); // Indicate loading state to the user

  try {
    const response = await fetch('https://api.openai.com/v1/chat/completions', {
      method: 'POST',
      headers: {
        'Content-Type': 'application/json',
        'Authorization': 'Bearer ' + API_KEY // Use the decrypted API key
      },
      body: JSON.stringify({
        model: 'gpt-4o-mini', // Cost-effective and capable model
        messages: [{
          role: 'user',
          content: 'Return ONLY one word: happy, sad, angry, calm, or excited based on this text: ' + text
        }],
        max_tokens: 10 // Strongly encourages a single-word response
      })
    });

    // Check if the API request was successful
    if (!response.ok) {
      const errorData = await response.json();
      throw new Error(`OpenAI API error: ${response.status} - ${errorData.error ? errorData.error.message : 'Unknown error'}`);
    }

    const data = await response.json();
    let detectedEmotion = data.choices[0].message.content.toLowerCase().trim();

    // Validate and normalize the detected emotion
    const validEmotions = Object.keys(emotionConfigs);
    if (validEmotions.includes(detectedEmotion)) {
      currentEmotion = detectedEmotion;
    } else {
      console.warn(`Detected emotion "${detectedEmotion}" is not one of the expected values. Defaulting to calm.`);
      currentEmotion = "calm";
    }

    emotionDisplay.html(currentEmotion); // Update the displayed emotion

  } catch (error) {
    console.error("Error analyzing emotion:", error);
    currentEmotion = "calm"; // Default to calm on API error
    emotionDisplay.html(`Error: ${error.message || 'Could not analyze'}`);
  } finally {
    loading = false; // Reset loading flag
  }
}

🔧 Subcomponents:

conditional Empty Input Check if (text.trim() === '') { currentEmotion = "calm"; ... return; }

Returns early if the input is empty, defaulting to calm emotion

conditional Loading State Guard if (loading) { console.log(...); return; }

Prevents multiple concurrent API calls by checking the loading flag

calculation OpenAI API Call const response = await fetch('https://api.openai.com/v1/chat/completions', {...})

Sends a POST request to OpenAI's API with the user's text to analyze emotion

conditional Response Status Check if (!response.ok) { ... throw new Error(...) }

Checks if the API request was successful and throws an error if not

calculation Emotion Extraction let detectedEmotion = data.choices[0].message.content.toLowerCase().trim()

Extracts the emotion word from the API response and normalizes it

conditional Emotion Validation if (validEmotions.includes(detectedEmotion)) { ... } else { ... }

Ensures the detected emotion is one of the five supported emotions

Line by Line:

const text = emotionInput.value()
Gets the current text from the input field that the user has typed
if (text.trim() === '') { currentEmotion = "calm"; emotionDisplay.html(currentEmotion); return; }
If the input is empty or only whitespace, sets emotion to calm and exits early without calling the API
if (loading) { console.log("Still loading, please wait..."); return; }
If an API call is already in progress, logs a message and exits to prevent duplicate requests
loading = true
Sets the loading flag to true, preventing additional API calls until this one completes
emotionDisplay.html(`Loading...`)
Updates the emotion display to show 'Loading...' while waiting for the API response
const response = await fetch('https://api.openai.com/v1/chat/completions', {...})
Sends a POST request to OpenAI's API with the user's text and waits for the response
'Authorization': 'Bearer ' + API_KEY
Includes the decrypted API key in the Authorization header for authentication
'Return ONLY one word: happy, sad, angry, calm, or excited based on this text: ' + text
The prompt sent to GPT that instructs it to return only one emotion word from the specified list
max_tokens: 10
Limits the API response to 10 tokens maximum, strongly encouraging a single-word response
if (!response.ok) { ... throw new Error(...) }
Checks if the HTTP response status indicates success; if not, throws an error with details
const data = await response.json()
Parses the API response JSON to access the emotion data
let detectedEmotion = data.choices[0].message.content.toLowerCase().trim()
Extracts the emotion word from the nested response structure and normalizes it to lowercase without whitespace
const validEmotions = Object.keys(emotionConfigs)
Gets an array of valid emotion names from the emotionConfigs object keys: ['happy', 'sad', 'angry', 'calm', 'excited']
if (validEmotions.includes(detectedEmotion)) { currentEmotion = detectedEmotion } else { currentEmotion = "calm" }
Updates currentEmotion if it's valid; otherwise defaults to calm to prevent undefined emotion configurations
emotionDisplay.html(currentEmotion)
Updates the displayed emotion text to show the newly detected emotion
catch (error) { ... currentEmotion = "calm"; emotionDisplay.html(...) }
If any error occurs during the API call, logs it, defaults to calm, and displays the error message
finally { loading = false }
Always resets the loading flag to false when the function completes, whether successful or not

windowResized()

windowResized() is a special p5.js function that automatically runs whenever the browser window is resized. This ensures the canvas and UI elements remain responsive and properly sized on different screen dimensions.

function windowResized() {
  resizeCanvas(windowWidth, windowHeight);
  emotionInput.size(width - 20, 30);
}

🔧 Subcomponents:

calculation Canvas Resize resizeCanvas(windowWidth, windowHeight)

Adjusts the p5.js canvas to match the new window dimensions

calculation Input Field Resize emotionInput.size(width - 20, 30)

Resizes the input field to match the new canvas width

Line by Line:

resizeCanvas(windowWidth, windowHeight)
Automatically called by p5.js when the window is resized; updates the canvas to fill the new window dimensions
emotionInput.size(width - 20, 30)
Resizes the input field to be almost as wide as the canvas (width - 20 for padding) and 30 pixels tall

Particle class

The Particle class encapsulates all behavior for individual particles. Key features include: (1) Perlin noise for organic movement, (2) Dynamic property updates based on the current emotion, (3) Gravity and friction for realistic physics, (4) Off-screen reinitialization (particles that leave the canvas are reset to a new random position), and (5) Age-based alpha fading for smooth visual transitions. The update() method is called every frame, and the display() method renders the particle.

class Particle {
  constructor(x, y, emotionConfig) {
    this.x = x;
    this.y = y;
    this.baseSpeed = emotionConfig.speedFactor;
    this.directionNoiseScale = emotionConfig.directionNoiseScale;
    this.color = emotionConfig.color;
    this.size = emotionConfig.size;
    this.lifespan = emotionConfig.life;
    this.age = 0;
    this.gravity = emotionConfig.gravity;

    // Initial velocity based on Perlin noise for organic starting movement
    this.vx = (noise(x * this.directionNoiseScale, y * this.directionNoiseScale) - 0.5) * this.baseSpeed * 2;
    this.vy = (noise(y * this.directionNoiseScale, x * this.directionNoiseScale) - 0.5) * this.baseSpeed * 2;
  }

  update() {
    // Dynamically adjust particle properties based on the current emotion
    const config = emotionConfigs[currentEmotion];
    if (config) {
      this.baseSpeed = config.speedFactor;
      this.directionNoiseScale = config.directionNoiseScale;
      this.color = config.color;
      this.size = config.size;
      this.gravity = config.gravity;
    }

    // Apply gravity
    this.vy += this.gravity;

    // Apply Perlin noise-based forces for organic, swirling movement
    const noiseForceX = (noise(this.x * this.directionNoiseScale, this.y * this.directionNoiseScale, frameCount * 0.01) - 0.5) * this.baseSpeed;
    const noiseForceY = (noise(this.y * this.directionNoiseScale, this.x * this.directionNoiseScale, frameCount * 0.01) - 0.5) * this.baseSpeed;

    this.vx += noiseForceX * 0.1; // Small influence to avoid overly wild movement
    this.vy += noiseForceY * 0.1;

    // Add some drag/friction
    this.vx *= 0.98;
    this.vy *= 0.98;

    // Update position
    this.x += this.vx;
    this.y += this.vy;

    // Reinitialize particle if it goes off-screen
    if (this.x > width || this.x < 0 || this.y > height || this.y < 0) {
      this.x = random(width);
      this.y = random(height);
      this.age = 0;
      // Re-calculate initial velocity with current emotion config
      const currentConfig = emotionConfigs[currentEmotion];
      this.vx = (noise(this.x * currentConfig.directionNoiseScale, this.y * currentConfig.directionNoiseScale) - 0.5) * currentConfig.speedFactor * 2;
      this.vy = (noise(this.y * currentConfig.directionNoiseScale, this.x * currentConfig.directionNoiseScale) - 0.5) * currentConfig.speedFactor * 2;
    }

    this.age++;
  }

  display() {
    // Fade out particles as they age
    const alpha = map(this.age, 0, this.lifespan, 255, 0);
    const col = color(this.color);
    col.setAlpha(alpha); // Set alpha for fading effect
    fill(col);
    noStroke();
    circle(this.x, this.y, this.size);
  }

  isDead() {
    return this.age > this.lifespan;
  }
}

🔧 Subcomponents:

calculation Constructor Method constructor(x, y, emotionConfig) { ... }

Initializes a new particle with position, emotion-based properties, and initial velocity

calculation Update Method update() { ... }

Updates particle position, velocity, and properties each frame based on physics and current emotion

calculation Gravity Application this.vy += this.gravity

Applies emotion-specific gravity to create upward/downward motion

calculation Perlin Noise Forces const noiseForceX = (noise(...) - 0.5) * this.baseSpeed

Uses Perlin noise to create organic, flowing movement patterns

calculation Friction/Drag this.vx *= 0.98; this.vy *= 0.98

Applies friction to gradually slow particles and create natural deceleration

conditional Off-Screen Boundary Check if (this.x > width || this.x < 0 || this.y > height || this.y < 0) { ... }

Resets particles that leave the canvas to a new random position

calculation Display Method display() { ... }

Draws the particle with color and alpha fade based on age

conditional isDead Method isDead() { return this.age > this.lifespan }

Returns true if the particle has exceeded its lifespan

Line by Line:

constructor(x, y, emotionConfig) {
Defines the constructor that runs when a new Particle is created with position and emotion configuration
this.x = x; this.y = y;
Stores the particle's starting x and y coordinates
this.baseSpeed = emotionConfig.speedFactor;
Sets the particle's speed multiplier from the emotion configuration
this.directionNoiseScale = emotionConfig.directionNoiseScale;
Stores the noise scale factor that controls how much Perlin noise influences movement
this.color = emotionConfig.color; this.size = emotionConfig.size;
Sets the particle's color and size based on the emotion configuration
this.lifespan = emotionConfig.life; this.age = 0;
Sets the maximum lifespan and initializes age to 0
this.gravity = emotionConfig.gravity;
Sets the gravity value that will pull particles up or down each frame
this.vx = (noise(x * this.directionNoiseScale, y * this.directionNoiseScale) - 0.5) * this.baseSpeed * 2;
Uses Perlin noise based on position to create organic initial horizontal velocity; subtracts 0.5 to center around 0
this.vy = (noise(y * this.directionNoiseScale, x * this.directionNoiseScale) - 0.5) * this.baseSpeed * 2;
Uses Perlin noise with swapped coordinates to create organic initial vertical velocity
const config = emotionConfigs[currentEmotion];
Gets the configuration object for the currently detected emotion
if (config) { this.baseSpeed = config.speedFactor; ... }
Updates all particle properties to match the current emotion, allowing real-time visual changes
this.vy += this.gravity;
Applies gravity by adding it to vertical velocity each frame (positive = downward, negative = upward)
const noiseForceX = (noise(this.x * this.directionNoiseScale, this.y * this.directionNoiseScale, frameCount * 0.01) - 0.5) * this.baseSpeed;
Calculates a Perlin noise-based force that changes over time (using frameCount) to create flowing, swirling motion
this.vx += noiseForceX * 0.1; this.vy += noiseForceY * 0.1;
Adds the noise forces to velocity with a 0.1 multiplier to keep movement natural and not too chaotic
this.vx *= 0.98; this.vy *= 0.98;
Applies friction by multiplying velocity by 0.98 each frame, gradually slowing particles
this.x += this.vx; this.y += this.vy;
Updates position by adding velocity, moving the particle each frame
if (this.x > width || this.x < 0 || this.y > height || this.y < 0) {
Checks if the particle has moved outside the canvas boundaries
this.x = random(width); this.y = random(height); this.age = 0;
Resets the particle to a new random position and resets its age to 0
const alpha = map(this.age, 0, this.lifespan, 255, 0);
Maps the particle's age to an alpha value: young particles are opaque (255), old particles are transparent (0)
const col = color(this.color); col.setAlpha(alpha);
Creates a color object from the hex string and sets its alpha channel for transparency
fill(col); noStroke(); circle(this.x, this.y, this.size);
Sets the fill color, removes the stroke, and draws a circle at the particle's position with its configured size
return this.age > this.lifespan;
Returns true if the particle is older than its lifespan, indicating it should be replaced
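The age-to-alpha fade relies on p5's map(), which is plain linear interpolation. A standalone equivalent makes the relationship explicit; `mapRange` is an illustrative name (not a p5 function), and the lifespan of 300 frames below is an assumed example value.

```javascript
// Linear interpolation equivalent to p5's map() (illustrative helper).
function mapRange(value, inMin, inMax, outMin, outMax) {
  return outMin + ((value - inMin) / (inMax - inMin)) * (outMax - outMin);
}

// Assuming an example lifespan of 300 frames: a newborn particle (age 0)
// is fully opaque, and alpha falls linearly to 0 at end of life.
mapRange(0, 0, 300, 255, 0);   // 255
mapRange(150, 0, 300, 255, 0); // 127.5
mapRange(300, 0, 300, 255, 0); // 0
```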

📦 Key Variables

encoded string

Stores the base64-encoded and XOR-encrypted OpenAI API key to prevent casual inspection of the key in source code

const encoded='KTF3Kig1MHcIaDhqCms+OA0AYhEADSo9NDEbLDxjKwMTPW0fdwgLMQ8wahcvFy4UIDAxYiI4CWgwdy0fLGgsDgwcLCkDAgsuNh40OxwcN24LbA5pGDY4MRwQKR0+AigiLDATP2piFwwqLzk3agorOBM2ExkvLTMeDmg8OwoTGCMpLhQXOWMuNBIPDRc2bRlpFA0THT0RIh8NETsRKyADAykUKRs=';
key number

XOR decryption key used to decrypt the encoded API key

const key=0x5A;
particles array

Stores all active Particle objects that are updated and displayed each frame

let particles = [];
emotionInput object

p5.Element reference to the HTML text input field where users type emotions

let emotionInput;
emotionDisplay object

p5.Element reference to the HTML div that displays the currently detected emotion

let emotionDisplay;
currentEmotion string

Stores the name of the currently detected emotion, used to configure particle behavior

let currentEmotion = 'calm';
API_KEY string

Stores the decrypted OpenAI API key used for authentication in API requests

let API_KEY;
loading boolean

Flag to prevent multiple concurrent API calls; set to true while waiting for API response

let loading = false;
emotionConfigs object

Configuration object containing particle behavior parameters for each of the five emotions (happy, sad, angry, calm, excited)

const emotionConfigs = { 'happy': { color: '#FFFF00', speedFactor: 2.5, ... }, ... };
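The entry above is elided in this listing. The block below sketches the shape of one entry using the property names the Particle constructor actually reads (color, speedFactor, directionNoiseScale, size, life, gravity); the happy color and speedFactor match values quoted elsewhere in this breakdown, while the remaining numbers are illustrative assumptions, not the sketch's real values.

```javascript
// Illustrative emotionConfigs entry. Property names match what the Particle
// constructor reads; numbers other than color/speedFactor are assumed.
const happyExample = {
  color: '#FFFF00',          // particle fill color (happy = yellow)
  speedFactor: 2.5,          // scales the Perlin-noise-driven velocity
  directionNoiseScale: 0.1,  // how tightly noise swirls the motion (assumed)
  size: 8,                   // circle diameter in pixels (assumed)
  life: 300,                 // frames before the particle is replaced (assumed)
  gravity: 0.02,             // per-frame vertical acceleration (assumed)
};
```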

🧪 Try This!

Experiment with the code by making these changes:

  1. Change the speedFactor values in emotionConfigs (lines 24-45) to make particles move faster or slower for each emotion. Try setting happy's speedFactor to 5 instead of 2.5 to see how much faster it becomes.
  2. Modify the color values in emotionConfigs to create your own emotion color scheme. For example, change sad from '#0000FF' (blue) to '#FF1493' (deep pink) and see how the visual representation changes.
  3. Increase the initial particle count from 200 to 500 in the setup() function (line 143) to create a denser, more visually intense particle field.
  4. Experiment with the gravity values in emotionConfigs. Try setting calm's gravity to -0.2 to make calm particles float upward, or set happy's gravity to 0.3 to make happy particles fall.
  5. Change the directionNoiseScale values to control how 'swirly' particles move. Try setting angry's directionNoiseScale to 0.5 (higher = more chaotic) to make angry particles move more erratically.
  6. Modify the background alpha in draw() from 20 to 50 or 100 to create longer or shorter motion trails. Higher values clear the screen faster.
  7. Add a new emotion to emotionConfigs (like 'confused' or 'energetic') with its own configuration, then update the API prompt in analyzeEmotion() to include it in the list of valid emotions.
  8. Change the particle size values in emotionConfigs to make certain emotions more visually prominent. Try making sad particles much larger (size: 30) to emphasize sadness.
  9. Experiment with the lifespan values to control how long particles exist before being replaced. Try setting excited's life to 100 (very short) to create rapid particle cycling.
  10. Modify the friction value in the update() method (line 104: this.vx *= 0.98) to 0.95 or 0.99 to see how it affects particle deceleration and smoothness.

🔧 Potential Improvements

Here are some ways this code could be enhanced:

BUG analyzeEmotion() function, line 173

The API response parsing assumes data.choices[0].message.content exists without checking if the response structure is valid. If OpenAI changes response format or returns unexpected data, this will cause a runtime error.

💡 Add defensive checks: const content = data?.choices?.[0]?.message?.content; if (!content) throw new Error('Invalid API response structure');
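A fuller version of that defensive check could look like the following; `extractEmotion` is an illustrative helper name, not a function in the original sketch.

```javascript
// Defensive parsing of the chat-completions response (illustrative helper).
// Optional chaining returns undefined instead of throwing on missing fields,
// so a malformed response produces a clear, catchable error.
function extractEmotion(data) {
  const content = data?.choices?.[0]?.message?.content;
  if (typeof content !== 'string' || content.trim() === '') {
    throw new Error('Invalid API response structure');
  }
  return content.toLowerCase().trim();
}
```

In analyzeEmotion(), the existing try/catch would then surface this error the same way it surfaces HTTP errors.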

PERFORMANCE Particle.update() method, lines 91-108

Every particle recalculates its properties from emotionConfigs every frame, even though all particles share the same currentEmotion. This is redundant computation.

💡 Move the emotion config lookup to the draw() function and pass it as a parameter, or cache the current config as a global variable that updates only when emotion changes.

BUG Particle boundary reset, lines 109-117

When particles reset at boundaries, they're placed at completely random positions which can create visual discontinuities. Particles might teleport across the screen abruptly.

💡 Reset particles at the edge they exited from: if (this.x > width) this.x = 0; if (this.x < 0) this.x = width; etc. This creates smoother wrapping.
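That edge-wrapping suggestion can be factored into a small helper shared by both axes; `wrapCoordinate` is an assumed name, not part of the original code.

```javascript
// Wrap a coordinate to the opposite edge instead of teleporting to a random
// position (illustrative helper implementing the suggestion above).
function wrapCoordinate(value, max) {
  if (value > max) return 0;   // left the far edge: re-enter at the near edge
  if (value < 0) return max;   // left the near edge: re-enter at the far edge
  return value;                // still on-screen: unchanged
}

// In Particle.update(), replacing the random reset:
// this.x = wrapCoordinate(this.x, width);
// this.y = wrapCoordinate(this.y, height);
```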

STYLE emotionConfigs object, lines 24-45

The configuration values lack documentation and are somewhat arbitrary (e.g., why 0.1 for calm's directionNoiseScale?). It's difficult to understand the relationship between values.

💡 Add comments explaining the valid ranges and effects: // directionNoiseScale: 0.05-0.2 controls swirl intensity; lower = smooth, higher = chaotic

FEATURE analyzeEmotion() function

The sketch calls the API on every keystroke, which is wasteful and expensive. Typing 'happy' triggers 5 API calls.

💡 Add debouncing: use setTimeout to delay the API call until the user stops typing for 500ms. This reduces API usage by ~80% and improves performance.
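A minimal debounce helper along those lines might look like this; `debounce` is an assumed utility, not part of the original sketch.

```javascript
// Generic debounce: the wrapped function runs only after calls stop arriving
// for delayMs milliseconds, so a burst of keystrokes collapses into one call.
function debounce(fn, delayMs) {
  let timer;
  return function (...args) {
    clearTimeout(timer); // cancel the previously scheduled call, if any
    timer = setTimeout(() => fn.apply(this, args), delayMs);
  };
}

// In setup(), register the debounced wrapper instead of the raw handler:
// emotionInput.input(debounce(analyzeEmotion, 500));
```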

BUG setup() function, line 144

The emotionInput field is created but never positioned or sized initially. It relies on CSS positioning, which could fail if CSS doesn't load or is overridden.

💡 Add explicit positioning in setup(): emotionInput.position(10, 10); emotionInput.size(width - 20, 30);

PERFORMANCE draw() function, line 153

The background() function with alpha=20 creates a semi-transparent overlay every frame, which can be slow on low-end devices. The motion trail effect accumulates 60 times per second.

💡 Consider using a lower alpha value (10-15), or cap the frame rate with frameRate(30), to reduce computational load while maintaining visual quality.

SECURITY getApiKey() function, lines 6-8

The API key is obfuscated but not truly secure. Anyone can run getApiKey() in the browser console to reveal it. The key is also exposed in network requests.

💡 Move API calls to a backend server (Node.js/Express) that handles authentication. The frontend should send text to your server, which calls OpenAI privately.
