This release provides LightGlue ONNX models that support dynamic batch sizes. Depending on where you are running inference, use the corresponding models and signatures (a usage sketch follows the list below):

- ONNX-only - ...
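
Below is a minimal sketch of running one of the dynamic-batch models with `onnxruntime`. The model filename (`lightglue.onnx`), the input names (`kpts0`, `kpts1`, `desc0`, `desc1`), and the descriptor dimension are assumptions for illustration; inspect `session.get_inputs()` / `session.get_outputs()` to see the actual signature of the export you downloaded.

```python
# Sketch: batched inference with a dynamic-batch LightGlue ONNX export.
# Input/output names and shapes below are assumptions -- check the real
# signature with session.get_inputs() / session.get_outputs().
import numpy as np
import onnxruntime as ort

session = ort.InferenceSession(
    "lightglue.onnx",                      # hypothetical model filename
    providers=["CPUExecutionProvider"],
)

# Print the exported model's actual input signature.
for inp in session.get_inputs():
    print(inp.name, inp.shape, inp.type)

batch = 4        # any value works when the batch axis is dynamic
num_kpts = 512   # keypoints per image
desc_dim = 256   # descriptor dimension (assumed)

# Dummy batched inputs; replace with real keypoints and descriptors.
feeds = {
    "kpts0": np.random.rand(batch, num_kpts, 2).astype(np.float32),
    "kpts1": np.random.rand(batch, num_kpts, 2).astype(np.float32),
    "desc0": np.random.rand(batch, num_kpts, desc_dim).astype(np.float32),
    "desc1": np.random.rand(batch, num_kpts, desc_dim).astype(np.float32),
}

outputs = session.run(None, feeds)         # None = fetch all outputs
print([o.shape for o in outputs])
```

The same pattern applies to other runtimes; only the session setup and the expected signature change, so always verify the input and output names against the specific model file you are using.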