This release provides LightGlue ONNX models that support dynamic batch sizes. Depending on where you run inference, use the corresponding models and signatures (a minimal inference sketch follows below): ONNX-only - ...
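
For illustration only, here is a minimal onnxruntime sketch of running one of the dynamic-batch models. The file name `lightglue.onnx`, the tensor names (`kpts0`, `desc0`, ...), and the shapes are assumptions, not part of this release; query `session.get_inputs()` to see the actual signature of the model you download.

```python
# Hypothetical sketch: dynamic-batch inference with onnxruntime.
# Model path and input/output names below are assumptions -- inspect the
# downloaded model (e.g. with Netron or get_inputs()) before relying on them.
import numpy as np
import onnxruntime as ort

session = ort.InferenceSession("lightglue.onnx", providers=["CPUExecutionProvider"])

# Inspect the model's real signature; dynamic axes appear as None or symbolic names.
for inp in session.get_inputs():
    print(inp.name, inp.shape)

batch = 4          # dynamic batch dimension: any size should be accepted
num_kpts = 512     # assumed number of keypoints per image
desc_dim = 256     # assumed descriptor dimension (e.g. SuperPoint-style)

# Assumed input names: kpts0/kpts1 (normalized keypoints), desc0/desc1 (descriptors).
inputs = {
    "kpts0": np.random.rand(batch, num_kpts, 2).astype(np.float32),
    "kpts1": np.random.rand(batch, num_kpts, 2).astype(np.float32),
    "desc0": np.random.rand(batch, num_kpts, desc_dim).astype(np.float32),
    "desc1": np.random.rand(batch, num_kpts, desc_dim).astype(np.float32),
}

outputs = session.run(None, inputs)
for meta, out in zip(session.get_outputs(), outputs):
    print(meta.name, out.shape)
```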