an architecture for neural network inference in real-time audio applications
Anira is a high-performance library designed to enable easy, real-time-safe integration of neural network inference into audio applications. It is compatible with multiple inference backends (LibTorch, ONNX Runtime, and TensorFlow Lite) and bridges the gap between advanced neural network architectures and real-time audio processing. The accompanying paper provides more information on anira's architecture and design decisions, as well as extensive performance evaluations produced with the built-in benchmarking capabilities.
The basic usage of anira is as follows:
#include <anira/anira.h>
anira::InferenceConfig inference_config(
{{"path/to/your/model.onnx", anira::InferenceBackend::ONNX}}, // Model path
{{{256, 1, 1}}, {{256, 1}}}, // Input, Output shape
5.33f // Maximum inference time in ms
);
// Create a pre- and post-processor instance
anira::PrePostProcessor pp_processor(inference_config);
// Create an InferenceHandler instance
anira::InferenceHandler inference_handler(pp_processor, inference_config);
// Pass the host configuration and allocate memory for audio processing
inference_handler.prepare({buffer_size, sample_rate});
// Select the inference backend
inference_handler.set_inference_backend(anira::InferenceBackend::ONNX);
// Optionally get the latency of the inference process in samples
unsigned int latency_in_samples = inference_handler.get_latency();
// Real-time safe audio processing in process callback of your application
void process(float** audio_data, int num_samples) {
inference_handler.process(audio_data, num_samples);
}
// audio_data now contains the processed audio samples
Anira can be easily integrated into your CMake project. You can either add anira as a submodule, download the pre-built binaries from the releases page, or build from source.
# Add anira repo as a submodule
git submodule add https://github.com/anira-project/anira.git modules/anira
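When someone later clones a project that vendors anira this way, the submodule contents must be fetched as well. A sketch of the standard git workflow (the parent repository URL is a placeholder):

```shell
# Clone the parent project and all of its submodules in one step
git clone --recurse-submodules https://github.com/you/your_project.git

# Or, in an already-cloned checkout, fetch the submodule contents
git submodule update --init --recursive
```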
In your CMakeLists.txt:
# Setup your project and target
project(your_project)
add_executable(your_target main.cpp ...)
# Add anira as a subdirectory
add_subdirectory(modules/anira)
# Link your target to the anira library
target_link_libraries(your_target anira::anira)
Download pre-built binaries from the releases page.
In your CMakeLists.txt:
# Setup your project and target
project(your_project)
add_executable(your_target main.cpp ...)
# Add the path to the anira library as cmake prefix path and find the package
list(APPEND CMAKE_PREFIX_PATH "path/to/anira")
find_package(anira REQUIRED)
# Link your target to the anira library
target_link_libraries(your_target anira::anira)
# Clone the repository to build and install anira from source
git clone https://github.com/anira-project/anira.git
cd anira
cmake . -B build -DCMAKE_BUILD_TYPE=Release
cmake --build build --config Release --target anira
cmake --install build --prefix /path/to/install/directory
By default, all three inference engines are installed. You can disable specific backends as needed:
-DANIRA_WITH_LIBTORCH=OFF
-DANIRA_WITH_ONNXRUNTIME=OFF
-DANIRA_WITH_TFLITE=OFF

Moreover, the following options are available:

-DANIRA_WITH_BENCHMARK=ON
-DANIRA_WITH_EXAMPLES=ON
-DANIRA_WITH_TESTS=ON
-DANIRA_WITH_DOCS=ON
-DANIRA_WITH_LOGGING=OFF

anira's real-time safety is checked in this repository with the RTSan sanitizer.
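For example, a configure step that keeps only the ONNX Runtime backend and additionally builds the examples might look like this (a sketch combining the flags listed above with the build commands shown earlier):

```shell
cmake . -B build -DCMAKE_BUILD_TYPE=Release \
    -DANIRA_WITH_LIBTORCH=OFF \
    -DANIRA_WITH_TFLITE=OFF \
    -DANIRA_WITH_EXAMPLES=ON
```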