// overview
WyseDSP is a suite of audio plugins for guitarists, producers, and musicians — targeting the VST3 format and built entirely in C++ using the JUCE framework. The project spans guitar amp simulations, a full synthesiser, effects chains, a bass amp, and a drum machine.
The entire suite was developed with Claude (Anthropic) as a pair-programming partner — not just for boilerplate or autocomplete, but for real technical problem-solving: DSP algorithm design, UI architecture, debugging, feature planning, and code review. The full codebase is version-controlled on GitHub: months of active development, hundreds of commits, and a product that's genuinely ready to sell.
// the plugins — in action
Each plugin has its own visual language — dark/gold for the guitar chain, teal for the drum machine, deep purple for the synthesiser, and a clean dark theme for the bass amp.
// the plugins
Each plugin was built to solve a real musical need, with genuine amp modelling, MIDI support, user presets, and a consistent design language throughout.
Multi-channel guitar amp sim with a full pedal chain — Green Drive, Gold Drive, Blue Drive, Yellow Drive, Compressor. MIDI Learn, user presets, arc-ring knob UI.
A focused version of GREC with a leaner pedal selection and streamlined UI — aimed at quick workflow integrations without the full amp chain overhead.
Inspired by classic British clean/crunch tone. Output-compensated, EQ-tuned, with noise gate defaults refined across development. Two-channel design.
Modelled on a high-gain valve character — tight low-end, aggressive mids. Channel switching, presence control, cabinet simulation.
Full synthesiser with 189 presets across 12 categories including Synth Revival. MIDI Learn, mod wheel, warble fix (filter state no longer hard-resets on voice retrigger), chord guide.
A faithful recreation of LinnDrum-era drum machine character using high-quality samples. Pattern-based, MIDI-triggered, with individual channel outputs.
// how it was built
The project went through many meaningful iterations: DSP algorithm tuning for the amp models, a custom knob renderer with arc-style value rings, an XML-based global preset system with user save/load, MIDI Learn functionality implemented across all plugins, and extensive preset creation.
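The arc-ring geometry behind such a knob renderer can be sketched in a few lines. This is an illustrative sketch only — the `ArcRing` struct, its angle constants, and the 270-degree sweep are assumptions for the example, not WyseDSP's actual renderer code:

```cpp
#include <cassert>
#include <cmath>

// Hypothetical sketch of arc-ring knob geometry: map a normalised
// parameter value in [0, 1] onto a sweep angle between fixed start
// and end angles. The ~270-degree sweep here is illustrative.
struct ArcRing
{
    float startAngleRad = -2.356f; // ~ -135 degrees (7 o'clock)
    float endAngleRad   =  2.356f; // ~ +135 degrees (5 o'clock)

    // Angle at which the value ring should stop for a given value.
    float angleForValue (float normalisedValue) const
    {
        return startAngleRad
             + normalisedValue * (endAngleRad - startAngleRad);
    }
};
```

A paint routine would then stroke the ring from `startAngleRad` to `angleForValue(v)`, so the arc length itself communicates the parameter value.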
Working with AI on a project this long and complex is genuinely different to using it for one-off tasks. Claude held context across sessions, caught bugs in my reasoning before they became compiler errors, suggested design patterns I hadn't considered, and was available at any hour. It didn't replace engineering judgement — it accelerated it significantly.
Some specific examples of where AI collaboration made a real difference:
The synth had a warble artefact where the filter state was being hard-reset on every voice retrigger. Tracking this down involved identifying the exact point in the signal chain where the reset was happening, understanding why it caused the artefact perceptually, and designing a fix that didn't introduce latency. Having a second perspective in real-time made this significantly faster.
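The shape of that fix can be illustrated with a minimal sketch — a one-pole filter whose state must survive note retriggers. The `OnePoleLowpass` and `Voice` types and their values are invented for this example and are not the actual synth code:

```cpp
#include <cassert>
#include <cmath>

// Illustrative sketch of the retrigger fix (not the actual WyseDSP code):
// a one-pole lowpass whose internal state z1 must survive voice retriggers.
// Hard-resetting z1 to zero on every note-on forces the filter to re-slew
// from silence, which is audible as a warble on fast retriggers.
struct OnePoleLowpass
{
    float coeff = 0.99f; // smoothing coefficient, illustrative value
    float z1    = 0.0f;  // filter state — the thing that must NOT be reset

    float process (float in)
    {
        z1 = in + coeff * (z1 - in);
        return z1;
    }
};

struct Voice
{
    OnePoleLowpass filter;
    float envLevel = 0.0f;

    // The fix: note-on restarts the envelope but leaves filter.z1 alone.
    void startNote()
    {
        envLevel = 1.0f;
        // The warble bug was the equivalent of: filter.z1 = 0.0f;
    }
};
```

Because the fix only removes a reset, it adds no latency — the filter simply continues from wherever the previous note left it.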
Adding a right-click MIDI Learn system to all parameters across multiple plugins — with global XML persistence — was a substantial design problem. The implementation involved listener patterns, thread safety considerations in the audio thread, and a consistent UI affordance across all controls. This was co-designed with AI assistance from architecture through to implementation.
The synth alone ships 189 presets, organised across 12 categories including a dedicated "Synth Revival" category with content inspired by classic synth tones. The preset naming, organisation, and parameter values were developed collaboratively.
// version control
The entire WyseDSP codebase is version-controlled on GitHub. For a project of this scale — multiple plugins, months of iterative development, and constant collaboration with an AI pair-programmer — a solid Git workflow wasn't optional; it was foundational.
When AI is helping you write code quickly, it's easy to accumulate large blocks of untested changes. Frequent, well-described commits act as checkpoints — if a feature goes sideways or an AI suggestion introduces a regression, rolling back is clean and surgical rather than a painful manual undo. Every meaningful feature, fix, or refactor got its own commit with a clear message.
Some of the more experimental work — trying different DSP approaches, testing alternative UI architectures, exploring amp model variations — was done on feature branches. This meant the main branch always stayed in a working, playable state. It also made it straightforward to compare two approaches side by side before deciding which to keep.
The commit history is effectively a log of the entire development arc — which features landed when, what was fixed and how, how the plugin evolved from a rough prototype to a finished product. For a project built openly and documented here, that history has value beyond just source control.
Working with AI on code that you'll actually ship requires more discipline, not less. GitHub kept the project grounded — it's easy to drift when a collaborator never gets tired and always has a suggestion. The commit history is proof that every change was deliberate.