Visualizers

Introduction

Visualizers analyze audio but typically do not modify it.

The goal of this guide is to demonstrate simple and reliable patterns for building visualizers using the standard Web Audio API. These examples intentionally avoid complex frameworks and focus on the core audio building blocks that work well inside WAX.

Audio Input

WAX uses the microphone input interface to stream audio from the DAW host into the web app. By using the standard getUserMedia() microphone pattern, the same codebase can run both inside WAX and in external browsers without modification.

// Create the audio context
const ctx = new AudioContext()

// Request microphone / host audio input
navigator.mediaDevices.getUserMedia({ audio: true })
.then(stream => {

  const input = ctx.createMediaStreamSource(stream)

})
.catch(err => console.error("Audio input unavailable:", err))

Analyser Node

The AnalyserNode provides real-time access to an audio signal’s waveform or frequency data without altering the sound. It is commonly used for visualizers, VU meters, and spectrum displays.

// Create an analyser node for visualization or metering
const analyser = ctx.createAnalyser()

// Route audio input into the analyser
input.connect(analyser)
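The analyser exposes snapshot methods such as getByteFrequencyData() (spectrum) and getByteTimeDomainData() (waveform) that can be polled once per animation frame. A minimal spectrum-drawing sketch, assuming the analyser from the snippet above and a canvas element (the drawSpectrum name and the canvas are illustrative, not part of WAX):

```javascript
// Draw the analyser's frequency bins as vertical bars on a canvas
function drawSpectrum(analyser, canvas) {
  const ctx2d = canvas.getContext("2d")
  const bins = new Uint8Array(analyser.frequencyBinCount)

  function frame() {
    // Fill `bins` with the current spectrum (0–255 per bin)
    analyser.getByteFrequencyData(bins)

    ctx2d.clearRect(0, 0, canvas.width, canvas.height)
    const barWidth = canvas.width / bins.length
    for (let i = 0; i < bins.length; i++) {
      const h = (bins[i] / 255) * canvas.height
      ctx2d.fillRect(i * barWidth, canvas.height - h, barWidth, h)
    }
    requestAnimationFrame(frame)
  }
  requestAnimationFrame(frame)
}
```

Swapping getByteFrequencyData() for getByteTimeDomainData() turns the same loop into an oscilloscope-style waveform view.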

Basic VU Meter

// Audio context
const ctx = new AudioContext()

// Capture audio from host / microphone
navigator.mediaDevices.getUserMedia({ audio: true })
.then(async stream => {

  const input = ctx.createMediaStreamSource(stream)

  // Load an AudioWorklet that measures signal level on the audio thread
  await ctx.audioWorklet.addModule("vu-meter-processor.js")

  const vuNode = new AudioWorkletNode(ctx, "vu-meter")

  input.connect(vuNode)

  // Receive RMS values from the audio thread
  vuNode.port.onmessage = e => {
    const rms = e.data
    console.log("VU", rms)
  }

})
.catch(err => console.error("Audio input unavailable:", err))
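The example above loads vu-meter-processor.js, which is not shown. A minimal sketch of what that worklet file might contain, matching the "vu-meter" name and the single-number message format the onmessage handler above expects (the RMS calculation and per-block posting are assumptions):

```javascript
// vu-meter-processor.js — runs inside the AudioWorkletGlobalScope

// Root-mean-square level of one block of samples
function computeRms(samples) {
  let sum = 0
  for (let i = 0; i < samples.length; i++) sum += samples[i] * samples[i]
  return Math.sqrt(sum / samples.length)
}

// registerProcessor() only exists on the audio thread
if (typeof registerProcessor !== "undefined") {
  registerProcessor("vu-meter", class extends AudioWorkletProcessor {
    process(inputs) {
      // First channel of the first input, if connected
      const channel = inputs[0] && inputs[0][0]
      if (channel) this.port.postMessage(computeRms(channel))
      return true // keep the processor alive
    }
  })
}
```

Note that process() runs once per 128-sample render quantum, so posting on every call sends several hundred messages per second; a real meter would typically accumulate a few blocks before posting to reduce main-thread traffic.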