- Run all audio in a single process.
  - Instead of a main audio process connected to the backend and one process per opened project.
  - The audio engine is composed of a tree of 'realms', with the top-level realm hooked up to the
    backend and each project creating a sub-realm (see the realm sketch after this list).
- Run plugins in a separate process, isolated from the main audio engine.
  - Turns out I really want to support the LV2 instance access feature, so UIs need to run in the
    same process as the plugin itself (see the instance-access sketch below).
  - Plugins therefore need to be isolated from each other, so that they can use different UI
    toolkits without linker symbol clashes; in particular, they can't all run in the main audio
    process.
- Separate audio processing from the project process.
  - Instead of generating MIDI events in the project process and routing them to the audio
    pipeline, have processors in the pipeline which capture the full state of the project and emit
    the events themselves during playback (see the processor sketch below). Changes to the project
    update those processors to keep them in sync.
- Use protocol buffers for a lot of internal messages in non-realtime contexts.
- Very basic support for LV2 plugin UIs.
  - Trying to get that working triggered all those changes above...
- And many, many more changes...
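
To make the realm tree a bit more concrete, here is a minimal C++ sketch: the top-level realm mixes
the output of its child realms (one per opened project) into the buffer that goes to the backend.
All names here (`Realm`, `Process`, the plain float buffers) are illustrative assumptions, not the
actual engine code.

```cpp
#include <memory>
#include <string>
#include <vector>

class Realm {
 public:
  explicit Realm(std::string name) : name_(std::move(name)) {}

  Realm* AddChild(std::unique_ptr<Realm> child) {
    children_.push_back(std::move(child));
    return children_.back().get();
  }

  // Fill 'out' with one block of audio for this realm, recursing into the
  // child realms and mixing their output on top of what is already there.
  void Process(std::vector<float>* out) {
    for (auto& child : children_) {
      std::vector<float> child_buf(out->size(), 0.0f);
      child->Process(&child_buf);
      for (size_t i = 0; i < out->size(); ++i) {
        (*out)[i] += child_buf[i];
      }
    }
    // ... this realm's own processors (instruments, mixers, ...) would run here ...
  }

 private:
  std::string name_;
  std::vector<std::unique_ptr<Realm>> children_;
};

int main() {
  // The top-level realm is hooked up to the backend; each opened project
  // contributes a sub-realm.
  Realm root("root");
  root.AddChild(std::make_unique<Realm>("project-1"));
  root.AddChild(std::make_unique<Realm>("project-2"));

  std::vector<float> block(128, 0.0f);
  root.Process(&block);  // the backend would now hand 'block' to the sound card
}
```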
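The instance-access constraint boils down to passing a raw pointer: the host hands the UI the
plugin's `LV2_Handle` through the `http://lv2plug.in/ns/ext/instance-access` feature, and a raw
pointer is only meaningful inside a single address space. Here is a rough sketch of that hand-off,
with the LV2 core structs inlined and the handle faked instead of coming from a real plugin
instance.

```cpp
#include <cstdio>

// Inlined stand-ins for the LV2 core types (normally pulled in from the LV2 headers).
typedef void* LV2_Handle;
typedef struct {
  const char* URI;
  void* data;
} LV2_Feature;

#define LV2_INSTANCE_ACCESS_URI "http://lv2plug.in/ns/ext/instance-access"

int main() {
  // Stand-in for the handle a host would get after instantiating the plugin
  // (e.g. via lilv_instance_get_handle()) in the plugin host process.
  int fake_dsp_state = 42;
  LV2_Handle dsp_handle = &fake_dsp_state;

  // The host exposes the DSP instance to the UI through the instance-access
  // feature when it instantiates the UI...
  LV2_Feature instance_access = { LV2_INSTANCE_ACCESS_URI, dsp_handle };
  const LV2_Feature* features[] = { &instance_access, nullptr };

  // ...and a UI that requests the feature reads the raw pointer back out of
  // the feature list it is handed. A pointer like this is useless across a
  // process boundary, hence UI and plugin have to share a process.
  for (const LV2_Feature** f = features; *f != nullptr; ++f) {
    std::printf("feature %s -> %p\n", (*f)->URI, (*f)->data);
  }
  return 0;
}
```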
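And a sketch of the event-emitting processors: instead of the project process streaming MIDI into
the pipeline, each processor keeps its own copy of the project's events (updated from a
non-realtime thread whenever the project changes) and emits the ones falling into the current
block during playback. Again, all names are made up for illustration.

```cpp
#include <algorithm>
#include <cstdint>
#include <cstdio>
#include <vector>

struct MidiEvent {
  int64_t sample_pos;  // position on the project timeline, in samples
  uint8_t data[3];     // raw MIDI message
};

class EventSourceProcessor {
 public:
  // Called from a non-realtime thread whenever the project changes, to keep
  // the processor's copy of the events in sync with the project.
  void SetEvents(std::vector<MidiEvent> events) {
    std::sort(events.begin(), events.end(),
              [](const MidiEvent& a, const MidiEvent& b) {
                return a.sample_pos < b.sample_pos;
              });
    events_ = std::move(events);
  }

  // Called once per audio block during playback; emits every event whose
  // position falls inside [block_start, block_start + block_size).
  void Process(int64_t block_start, int64_t block_size,
               std::vector<MidiEvent>* out) {
    for (const MidiEvent& ev : events_) {
      if (ev.sample_pos >= block_start &&
          ev.sample_pos < block_start + block_size) {
        out->push_back(ev);
      }
    }
  }

 private:
  std::vector<MidiEvent> events_;
};

int main() {
  EventSourceProcessor proc;
  proc.SetEvents({{0, {0x90, 60, 100}}, {4410, {0x80, 60, 0}}});

  std::vector<MidiEvent> block_events;
  proc.Process(0, 4096, &block_events);  // emits the note-on at sample 0
  std::printf("events in first block: %zu\n", block_events.size());
}
```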