# Tone.js
Wire AudioUI controls to Tone.js — a higher-level audio framework with synths, effects, and transport. Covers PolySynth, filter routing, and MIDI-to-note-name bridging.
Tone.js is a higher-level abstraction over the Web Audio API with built-in synths, effects, transports, and musical scheduling. This guide shows how to bind AudioUI controls to a Tone.js polysynth.
If you need lower-level control, the Web Audio API integration covers the same patterns against the raw browser API.
## Install

```bash
npm install tone
```

Tone.js ships its own TypeScript types.
## What you'll build
A polysynth driven by AudioUI controls:
- `CycleButton` → oscillator waveform
- `Knob` → filter cutoff
- `Slider` → master volume
- `Keys` → polyphonic note-on / note-off
## The synth
Build the Tone graph once and keep it in a ref:
```tsx
import { useEffect, useRef, useState } from "react";
import * as Tone from "tone";
import { Knob, Slider, CycleButton, Keys } from "@cutoff/audio-ui-react";
import "@cutoff/audio-ui-react/style.css";

type Waveform = "sawtooth" | "square" | "sine" | "triangle";

export default function Synth() {
  const synthRef = useRef<Tone.PolySynth | null>(null);
  const filterRef = useRef<Tone.Filter | null>(null);
  const masterRef = useRef<Tone.Gain | null>(null);
  const [ready, setReady] = useState(false);
  const [cutoff, setCutoff] = useState(60);
  const [volume, setVolume] = useState(60);
  const [waveform, setWaveform] = useState<Waveform>("sawtooth");

  useEffect(() => {
    const filter = new Tone.Filter(1200, "lowpass");
    const master = new Tone.Gain(0.6);
    const synth = new Tone.PolySynth(Tone.Synth, {
      oscillator: { type: "sawtooth" },
      envelope: { attack: 0.01, decay: 0.1, sustain: 0.6, release: 0.3 },
    });
    synth.chain(filter, master, Tone.getDestination());
    synthRef.current = synth;
    filterRef.current = filter;
    masterRef.current = master;
    return () => {
      synth.dispose();
      filter.dispose();
      master.dispose();
    };
  }, []);

  // Tone.start() requires a user gesture — browsers suspend AudioContext otherwise
  const startAudio = async () => {
    await Tone.start();
    setReady(true);
  };

  // Filter cutoff — map Knob's linear 0..100 to 80 Hz .. 10 kHz exponentially
  const handleCutoff = (e: { value: number }) => {
    setCutoff(e.value);
    const min = 80, max = 10000;
    const hz = min * Math.pow(max / min, e.value / 100);
    filterRef.current?.frequency.rampTo(hz, 0.02);
  };

  const handleVolume = (e: { value: number }) => {
    setVolume(e.value);
    masterRef.current?.gain.rampTo(e.value / 100, 0.02);
  };

  const handleWaveform = (e: { value: Waveform }) => {
    setWaveform(e.value);
    synthRef.current?.set({ oscillator: { type: e.value } });
  };

  // Keys emits { note, active } — convert MIDI to note name for Tone
  const handleKeys = (e: { value: { note: number; active: boolean } }) => {
    const { note, active } = e.value;
    const pitch = Tone.Frequency(note, "midi").toNote();
    if (active) synthRef.current?.triggerAttack(pitch);
    else synthRef.current?.triggerRelease(pitch);
  };

  return (
    <div className="dark">
      {!ready && <button onClick={startAudio}>Start Audio</button>}
      <CycleButton
        value={waveform}
        onChange={handleWaveform}
        options={[
          { value: "sawtooth", label: "Saw" },
          { value: "square", label: "Square" },
          { value: "sine", label: "Sine" },
          { value: "triangle", label: "Tri" },
        ]}
        label="Wave"
      />
      <Knob value={cutoff} min={0} max={100} onChange={handleCutoff} label="Cutoff" />
      <Slider
        value={volume}
        min={0}
        max={100}
        orientation="vertical"
        onChange={handleVolume}
        label="Vol"
      />
      <Keys nbKeys={25} startKey="C" octaveShift={0} onChange={handleKeys} />
    </div>
  );
}
```

## Key patterns
### Tone.start() gates audio on a user gesture

Tone.js manages the AudioContext internally, but browsers still create it in a suspended state. Call `Tone.start()` from a user click or tap before triggering sound; otherwise notes are silent. See Tone's AudioContext docs.
### MIDI to note name
`Tone.Frequency(midi, "midi").toNote()` converts a MIDI integer (what AudioUI's Keys emits) to a Tone-compatible pitch string (`"C4"`, `"A#3"`). Use this every time you bridge Keys to Tone.

Alternative: pass the MIDI number directly to `triggerAttack` as a frequency; Tone accepts both.
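If you ever need the conversion outside of Tone, the mapping is simple to sketch. This hypothetical helper (the name `midiToNoteName` is mine, not part of Tone or AudioUI) mirrors Tone's convention: sharps rather than flats, middle C (MIDI 60) at C4.

```typescript
// Hypothetical stand-in for Tone.Frequency(midi, "midi").toNote().
// MIDI note 0 is C-1 in this convention, so MIDI 60 lands in octave 4.
const NOTE_NAMES = ["C", "C#", "D", "D#", "E", "F", "F#", "G", "G#", "A", "A#", "B"];

function midiToNoteName(midi: number): string {
  const octave = Math.floor(midi / 12) - 1; // 60 -> 4
  return NOTE_NAMES[midi % 12] + octave;
}

midiToNoteName(60); // "C4"
midiToNoteName(58); // "A#3"
```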
### PolySynth for polyphonic notes

A bare `Tone.Synth` is monophonic. For the keyboard to hold multiple notes, wrap it in `Tone.PolySynth(Tone.Synth, {...})`. Triggering the same pitch twice is idempotent.
### Smooth parameter changes via rampTo

Tone's `Signal.rampTo(value, time)` plays the same role as the Web Audio `setTargetAtTime` pattern: use it for knob-driven params to avoid zipper noise.
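`rampTo` smooths the transition; the target value itself comes from the exponential mapping in `handleCutoff` above. Factored into a standalone helper (the name `knobToHz` is mine), the math looks like this:

```typescript
// Map a linear 0..100 control value onto an exponential frequency range.
// Equal knob increments then correspond to roughly equal pitch intervals,
// which is how a filter cutoff is usually expected to feel.
function knobToHz(value: number, minHz = 80, maxHz = 10000): number {
  return minHz * Math.pow(maxHz / minHz, value / 100);
}

knobToHz(0);   // 80
knobToHz(100); // 10000
```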
### Effects chain

Add effects inline via `synth.chain(...)`. For example, to add reverb and delay:

```ts
const reverb = new Tone.Reverb({ decay: 2, wet: 0.3 });
const delay = new Tone.FeedbackDelay({ delayTime: "8n", feedback: 0.3 });
synth.chain(filter, delay, reverb, master, Tone.getDestination());
```

Bind AudioUI knobs to `reverb.wet.value`, `delay.feedback.value`, and so on.
### Tone.Transport vs direct triggers

For free-form playing (a keyboard), trigger directly. For scheduled sequences (step sequencer, arpeggiator), use `Tone.Transport` with `Tone.Pattern` or `Tone.Sequence`. The keyboard example above uses direct triggers.
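Notation like `"8n"` in the Transport world is musical time relative to the tempo. As a rough sketch of the arithmetic such a value resolves to (the helper is mine and handles only plain `"Nn"` values, not dotted or triplet notation):

```typescript
// Convert a plain note-value string ("4n", "8n", "16n") to seconds at a BPM.
// A quarter note lasts 60/bpm seconds; other values scale from there.
function noteToSeconds(note: string, bpm: number): number {
  const denominator = parseInt(note, 10); // "8n" -> 8
  const quarterNote = 60 / bpm;
  return quarterNote * (4 / denominator);
}

noteToSeconds("8n", 120); // 0.25
```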
## Related
- Web Audio API integration — lower-level approach
- Audio Parameters — AudioUI's parameter model
- Keys component — keyboard API reference