Documentation

Overview

Create and integrate web apps within the DAW via the WAX plugin. Communicate with the host transport, receive and send MIDI, read and write automation, and save and restore state.

All integration is done through normal web APIs plus a few global WAX functions. No special SDK or plugin framework is required. No prior DAW or plugin experience required.

Audio

Use the Web Audio API as usual. In a normal browser, audio often must start after a user gesture because of autoplay policies. Inside WAX, the host has already “activated” audio.

You do not need to hide your app or audio behind a “Click to start” button.

Output (instruments and effects) 
Create an AudioContext, build your graph, connect to context.destination. WAX runs your page tied to the DAW’s sample rate and block size, so your project stays in sync.

const ctx = new AudioContext()

const osc = ctx.createOscillator()
osc.connect(ctx.destination)

osc.start()

// simple example: create an oscillator and send it to the output

Input (effects)
Use navigator.mediaDevices.getUserMedia({ audio: true }) (or the legacy navigator.getUserMedia) to stream audio into the web app, just as you would in a normal web app; no special handling is needed.

navigator.mediaDevices.getUserMedia({ audio: true }).then(function(stream) {

  // create audio context
  const ctx = new AudioContext()

  // convert mic stream into an audio node
  const mic = ctx.createMediaStreamSource(stream)

  // connect mic to output (for testing)
  mic.connect(ctx.destination)

}).catch(function(err) {

  console.error("Mic access failed", err)

});

// simple example: capture microphone audio and route it directly to the output (essentially a bypass)

Transport

Respond to DAW playback. Define these global functions and WAX will call them automatically when the DAW transport starts or stops and when the BPM changes.

window.WAX_Play = function () {
  console.log("DAW started");
  // Start your sequencer, animations, etc.
}

window.WAX_Stop = function () {
  console.log("DAW stopped");
  // Stop your sequencer, reset UI, etc.
}

window.WAX_BPM = function (bpm) {
  console.log("Host BPM:", bpm);
  // Called when the host starts playing and when the BPM changes.
  // bpm is a number (e.g. 120).
}

Define these early in your page (head or start of body) so they are available when WAX initializes.

Transport Info

Read the current playhead position, tempo, time signature, and whether the DAW is playing or recording via the global object window.PlayheadInfo (and optionally a continuous stream).

Single Request

Call window.WAX_RequestPlayheadInfo(). Shortly after, window.PlayheadInfo will be updated. Read it when you need it (e.g. after a short timeout).

window.WAX_RequestPlayheadInfo();

setTimeout(function () {
  var info = window.PlayheadInfo;

  if (info) {
    console.log("Playing?", info.state.isPlaying);
    console.log("BPM", info.tempo.bpm);
    console.log("Time (s)", info.timing.timeInSeconds);
    console.log("PPQ", info.timing.ppqPosition);
  }
}, 50);

Continuous Request

When you need frequent updates (e.g. for a moving playhead), call window.Request_PlayheadTimerStart(speed) to start a timer, and stop it when you are done:

window.Request_PlayheadTimerStart(clockSpeed);
// clockSpeed = interval in ms (e.g. 16 for ~60 fps). Clamped 4–2000.

// Read playhead info while timer is running
var info = window.PlayheadInfo;

if (info) {
  console.log("Playing?", info.state.isPlaying);
  console.log("BPM", info.tempo.bpm);
  console.log("Time (s)", info.timing.timeInSeconds);
  console.log("PPQ", info.timing.ppqPosition);
}

window.Request_PlayheadTimerStop();

PlayheadInfo Object

Below are more values available on PlayheadInfo.

PlayheadInfo.state.isPlaying
// boolean, is the transport playing?

PlayheadInfo.state.isRecording
// boolean, is the transport recording?

PlayheadInfo.state.isLooping
// boolean, is loop on?


PlayheadInfo.tempo.bpm
// number, host tempo (e.g. 120)

PlayheadInfo.tempo.timeSigNumerator
PlayheadInfo.tempo.timeSigDenominator
// time signature (e.g. 4 / 4)


PlayheadInfo.timing.timeInSamples
// current position in samples

PlayheadInfo.timing.timeInSeconds
// current position in seconds

PlayheadInfo.timing.ppqPosition
// position in quarter notes (ppq). Example: 0 = start, 1 = one quarter in, 2.5 = halfway through beat 3
// useful for tempo-based scheduling (e.g. step = floor(ppq * 4) for 16th notes)

PlayheadInfo.timing.ppqPositionOfLastBarStart
// ppq position of the start of the current bar


PlayheadInfo.loop.ppqLoopStart
PlayheadInfo.loop.ppqLoopEnd
// loop boundaries when looping is enabled

MIDI

Use the standard Web MIDI API. WAX already has MIDI wired to the DAW: input comes from the host (the track), and output from your page is sent to the host. WAX provides virtual MIDI input and output; you do not need to configure ports.

There is no separate WAX MIDI API — only the normal Web MIDI API.

Request MIDI access

Use navigator.requestMIDIAccess() as you would on the web, and call it as soon as the page loads. WAX does not require audio or MIDI to be gated behind a mouse click or dropdown. Even if you don't need MIDI right away, requesting access early means the connection is already established when you do.

navigator.requestMIDIAccess().then(function (access) {

  access.inputs.forEach(function (input) {
    input.onmidimessage = function (e) {

      // e.data is a Uint8Array: [status, data1, data2]
      console.log("MIDI IN", e.data[0], e.data[1], e.data[2]);

    };
  });

  access.outputs.forEach(function (output) {

    // Example: send MIDI note
    output.send([0x90, 60, 127]);

  });

}).catch(function (err) {
  console.error("MIDI failed", err);
});

Message Basics

Every incoming MIDI message looks like this:

e.data = [status, data1, data2]
  • status → what type of message it is (note, CC, etc.)
  • data1 → usually the note number or controller number
  • data2 → usually the velocity or value

The message is stored in a Uint8Array (numbers from 0–255).
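As a sketch, the status byte can be split into a message type (high nibble) and a 1-based channel (low nibble). The decodeMIDI helper below is purely illustrative, not part of the WAX API:

```javascript
// Decode a raw MIDI message into type, channel, and data bytes.
// Plain JavaScript; nothing here is WAX-specific.
function decodeMIDI(data) {
  var status = data[0];
  return {
    type: status & 0xF0,          // 0x90 = note on, 0x80 = note off, 0xB0 = CC
    channel: (status & 0x0F) + 1, // channels are usually displayed as 1-16
    data1: data[1],               // note number or CC number
    data2: data[2]                // velocity or CC value
  };
}

var msg = decodeMIDI([0x91, 60, 100]);
// msg.type === 0x90, msg.channel === 2, msg.data1 === 60, msg.data2 === 100
```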


MIDI Input (from DAW)

Note On → [0x90, note, velocity]
Note Off → [0x80, note, velocity]
CC → [0xB0, ccNumber, value]

input.onmidimessage = function (e) {

  var st = e.data[0];
  var d1 = e.data[1];
  var d2 = e.data[2];

  var type = st & 0xF0;

  if (type === 0x90 && d2 > 0) {
    // Note On
  }

  else if (type === 0x80 || (type === 0x90 && d2 === 0)) {
    // Note Off
  }

  else if (type === 0xB0) {
    // Control Change
    // d1 = CC number
    // d2 = value
  }

};

MIDI Output (to DAW)

Note On → output.send([0x90, note, velocity])
Note Off → output.send([0x80, note, velocity])
CC → output.send([0xB0, ccNumber, value])

navigator.requestMIDIAccess().then(function(access) {

  access.outputs.forEach(function(output) {

    // Note On (Middle C)
    output.send([0x90, 60, 127]);

    // Note Off
    output.send([0x80, 60, 0]);

    // Control Change (CC1)
    output.send([0xB0, 1, 64]);

  });

});

Automation

How to support DAW automation (two-way sync)

  1. Send CC when the user moves a control — So the host can record it.
  2. Update UI and audio when you receive CC — In your onmidimessage handler (status 0xB0), update both the on-screen control and the audio parameters so playback drives your synth correctly.

Avoiding input vs. output conflicts

You have two sources of change: (1) the user moving a control, and (2) incoming MIDI (automation). If you send CC for every change, you can get feedback. Rule: only send CC when the change came from the user. When the change came from incoming MIDI, update your UI and audio only; do not call output.send(). If setting a control’s value in code (e.g. cutoff.value = x) fires the same handler as a user move, use a fromMIDI flag and skip sendCC() when that flag is true.
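One way to apply that rule is a fromMIDI flag, sketched below with a plain object standing in for a real slider and a stub standing in for a Web MIDI output. CC number 74 and the function names setCutoff/onIncomingCC are illustrative assumptions:

```javascript
// Hypothetical control; in a real app this would be a DOM slider plus a filter node.
var cutoff = { value: 64 };
var fromMIDI = false;
var sentMessages = [];

// Stub output for illustration; replace with a real Web MIDI output.
var output = { send: function (msg) { sentMessages.push(msg); } };

function setCutoff(value) {
  cutoff.value = value;             // update UI and audio here
  if (!fromMIDI) {
    output.send([0xB0, 74, value]); // echo to the host only for user moves
  }
}

// Call this from input.onmidimessage when (status & 0xF0) === 0xB0.
function onIncomingCC(ccNumber, value) {
  if (ccNumber !== 74) return;
  fromMIDI = true;
  try {
    setCutoff(value);               // updates UI/audio, skips output.send
  } finally {
    fromMIDI = false;
  }
}

setCutoff(100);       // user move: one CC goes to the host
onIncomingCC(74, 30); // automation: UI updates, nothing is sent back
```

The try/finally ensures the flag is cleared even if an update throws, so one bad message cannot leave the control permanently muted.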

DataTree

DataTree lets you save and load a snapshot of your app’s state (knobs, presets, project data) so it can be restored when the user reopens the project or loads a preset.

Push (save)

window.WAX_DataTree.push(data, appName)
data = any JSON-serializable object. appName = string id for your app (required; use the same id when you pull).

// save simple state
window.WAX_DataTree.push({
  volume: 0.8
}, "my-app")

// save synth parameters
window.WAX_DataTree.push({
  cutoff: 1200,
  resonance: 0.6,
  waveform: "saw"
}, "my-app")

// save sequencer state
window.WAX_DataTree.push({
  tempo: 120,
  pattern: [1,0,1,0,1,0,0,1]
}, "my-app")

// save full app state
window.WAX_DataTree.push({
  preset: "Bass 01",
  volume: 0.7,
  cutoff: 900,
  resonance: 0.4,
  steps: [1,0,1,1,0,0,1,0]
}, "my-app")

Pull (load)

window.WAX_DataTree.pull(appName, timeoutMs) returns a Promise that resolves with the last pushed data (or rejects on error/timeout). timeoutMs is optional (default 3000).

// load previously saved state
window.WAX_DataTree.pull("my-app").then(data => {

  console.log(data)

  // apply restored values to your UI / audio

})

Helpers

  • window.WAX_DataTree.getCached() — returns cached data after a pull.
  • window.WAX_DataTree.onHydrated(fn) — callback when data is available.
  • window.WAX_DataTree.setProvider(fn) — provide state when the host requests a pull.

// get cached data after a pull
const cached = window.WAX_DataTree.getCached()

if (cached) {
  console.log("Cached data", cached)
}


// run when data is ready
window.WAX_DataTree.onHydrated(function(data) {
  console.log("Hydrated data", data)
})


// provide current app state when the host requests a pull
window.WAX_DataTree.setProvider(function() {
  return {
    volume: 0.8,
    preset: "Bass 01",
    cutoff: 1200
  }
})

Best practices

  1. Attempt pull on startup. As soon as your app is ready, call WAX_DataTree.pull(appName). If data comes back, apply it to your UI and audio. If it rejects or times out, use defaults.
  2. Only after that attempt, send to DataTree. Do not push before you have tried to pull. Pushing default state on load can overwrite the user’s saved state. Pull first, then push only when the user changes something (slider, save button, etc.).
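The pull-first order above can be sketched like this. The names initState, applyState, and onUserChange are illustrative, and the tree parameter stands in for window.WAX_DataTree so the logic can be shown outside the host:

```javascript
// Sketch of the recommended startup order (pull first, push only on user changes).
var defaults = { volume: 0.8, cutoff: 1200 };
var state = Object.assign({}, defaults);

function applyState(data) {
  // apply restored values to your UI / audio
  Object.assign(state, data);
}

function initState(tree) {
  // 1) Try to pull saved state first; 2) fall back to defaults on error/timeout.
  return tree.pull("my-app", 3000)
    .then(applyState)
    .catch(function () { /* no saved state yet; keep defaults */ });
}

function onUserChange(tree, key, value) {
  // 3) Push only after a real user change, never on load.
  state[key] = value;
  tree.push(state, "my-app");
}
```

With the real API you would call initState(window.WAX_DataTree) once your UI is ready, then wire onUserChange to your controls.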

Scheduling

Timed events should always be scheduled on the Web Audio timeline rather than fired immediately from JavaScript timers.

JavaScript timers such as setInterval or requestAnimationFrame can stall if the UI thread slows down (for example, when the plugin editor is closed). The audio engine runs independently and guarantees accurate playback.

// Instead of this:
setInterval(() => {
  note.start()
}, 125)


// Do this:
const t = audioContext.currentTime + 0.05
note.start(t)

Tempo-Based Scheduling

For a beat or step sequencer that fires events on a tempo grid, use transport (playhead) when the DAW is running and a local clock when it is not. Always schedule sound on the audio thread so playback does not depend on JavaScript timers (which can stall when the tab is in the background or the editor is closed).

How to use the playhead to schedule an event in sync with the DAW

The idea: the DAW tells you “where I am in the song.” You turn that into “which step I’m on,” and you tell the Web Audio API to play the sound for that step at a specific time. The audio engine then plays it at the right moment, even if your JavaScript runs a bit late.

Step 1 — Get regular position updates

When your sequencer starts, call window.Request_PlayheadTimerStart(8); (8 = update interval in ms). WAX will keep updating window.PlayheadInfo at that rate.

Step 2 — Figure out “which step” you’re on

PlayheadInfo has timing.ppqPosition (position in quarter notes). For a 16-step pattern (one step per 16th note), one quarter note = 4 steps:

// convert playhead position to a step index
step = Math.floor(ppq * 4)   // one step per 16th note; wrap with % 16 for a 16-step pattern

var info = window.PlayheadInfo;
if (!info || !info.state.isPlaying) return;
var ppq = info.timing.ppqPosition;
var step = Math.floor(ppq * 4) % 16;
// 0–15, wraps for long songs

Step 3 — Schedule the sound at a specific time (don’t play “right now”)

If you play inside the callback, it can drift. Instead, pick a time slightly in the future and tell Web Audio “play at this time”:

var whenToPlay = audioCtx.currentTime + 0.01;   
// schedule playback slightly in the future (10 ms)

bufferSource.start(whenToPlay);

// Web Audio will play it at the exact scheduled time

When you see the step change, schedule the sound for that step at whenToPlay. The audio thread plays at that exact time, so you stay in sync.

Minimal Example

var lastStep = -1;

function onPlayheadUpdate() {

  var info = window.PlayheadInfo;
  if (!info || !info.state || !info.state.isPlaying) return;

  var ppq = info.timing && info.timing.ppqPosition;
  if (typeof ppq !== "number") return;

  var step = Math.floor(ppq * 4) % 16;

  if (step === lastStep) return;
  lastStep = step;

  var whenSec = audioCtx.currentTime + 0.01;

  playStepAt(step, whenSec);
  // your function: play sound for step at whenSec
}

window.Request_PlayheadTimerStart(8);
setInterval(onPlayheadUpdate, 20);

playStepAt(step, whenSec) should use bufferSource.start(whenSec), not audioCtx.currentTime, so the DAW’s position and your sound stay in sync.

Schedule on the audio thread, not on JavaScript timers

setInterval and requestAnimationFrame are throttled when the tab is hidden. Use AudioContext.currentTime and schedule with bufferSource.start(whenSec) or gain.gain.setValueAtTime(..., whenSec). Use a timer only to “refill” a lookahead window: every 20–50 ms, schedule the next 100–200 ms of steps.

When there is no DAW playback (local-only)

Keep BPM from WAX_BPM() or your UI. Step duration = 60 / bpm / 4 (for 16th notes). Maintain “next step time” and step index; in your refill tick, schedule steps at nextStepTime and advance. Same “schedule at exact time” logic; the source of time is local instead of PlayheadInfo.
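The local-clock pattern above can be sketched as a small refill function. makeLocalScheduler and the 150 ms lookahead are illustrative choices; the now parameter abstracts the clock, and in a real app would be function () { return audioCtx.currentTime; }:

```javascript
// Local-clock 16-step scheduler sketch for when the host transport is stopped.
function makeLocalScheduler(bpm, now) {
  var stepDur = 60 / bpm / 4;   // seconds per 16th note
  var nextStepTime = now();     // when the next step should sound
  var step = 0;
  var lookahead = 0.15;         // schedule 150 ms ahead of the clock

  // Call refill from a ~25 ms setInterval; it schedules every step that
  // falls inside the lookahead window, then advances time and index.
  return function refill(scheduleStep) {
    while (nextStepTime < now() + lookahead) {
      scheduleStep(step, nextStepTime); // e.g. bufferSource.start(nextStepTime)
      nextStepTime += stepDur;
      step = (step + 1) % 16;
    }
  };
}
```

Even if a timer tick arrives late, every step inside the window has already been handed to the audio thread with an exact start time, so playback does not drift.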

Quick Reference

Transport (define these globals)
window.WAX_Play, window.WAX_Stop, window.WAX_BPM(bpm)
Playhead
window.WAX_RequestPlayheadInfo() → then read window.PlayheadInfo
window.Request_PlayheadTimerStart(intervalMs), window.Request_PlayheadTimerStop()
MIDI & automation
navigator.requestMIDIAccess() → use inputs/outputs as Web MIDI. Message types: 0x80 note off, 0x90 note on, 0xB0 CC; channel = (status & 0x0F)+1. input.onmidimessage / output.send([status, data1, data2]). Automation: send CC on control change; in onmidimessage (0xB0), update UI + audio. Avoid conflicts: only send CC when user moved the control.
Audio
new AudioContext() — use as usual; no need to gate behind a button. navigator.mediaDevices.getUserMedia({ audio: true }) for mic / live input.
DataTree
window.WAX_DataTree.push(data, appName), window.WAX_DataTree.pull(appName, timeoutMs), getCached(), onHydrated(fn), setProvider(fn). Best practice: pull on startup first; only then push (on user changes).
Sequencer / tempo
Request_PlayheadTimerStart(ms) → ingest PlayheadInfo; compute step from ppq/bpm. Schedule sounds with bufferSource.start(whenSec) on AudioContext; use timers only to refill lookahead. When host not playing: local clock (nextStepTime += stepDuration), same refill pattern.