Publication

Place and Object Recognition by CNN-Based COSFIRE Filters

Lopez-Antequera, M., Vallina, M. L., Strisciuglio, N. & Petkov, N., 2019, In: IEEE Access. 7, p. 66157-66166 10 p.

Research output: Contribution to journal › Article › Academic › peer-review

APA

Lopez-Antequera, M., Vallina, M. L., Strisciuglio, N., & Petkov, N. (2019). Place and Object Recognition by CNN-Based COSFIRE Filters. IEEE Access, 7, 66157-66166. https://doi.org/10.1109/ACCESS.2019.2918267

Author

Lopez-Antequera, Manuel ; Vallina, Maria Leyva ; Strisciuglio, Nicola ; Petkov, Nicolai. / Place and Object Recognition by CNN-Based COSFIRE Filters. In: IEEE Access. 2019 ; Vol. 7. pp. 66157-66166.

Harvard

Lopez-Antequera, M, Vallina, ML, Strisciuglio, N & Petkov, N 2019, 'Place and Object Recognition by CNN-Based COSFIRE Filters', IEEE Access, vol. 7, pp. 66157-66166. https://doi.org/10.1109/ACCESS.2019.2918267

Standard

Place and Object Recognition by CNN-Based COSFIRE Filters. / Lopez-Antequera, Manuel; Vallina, Maria Leyva; Strisciuglio, Nicola; Petkov, Nicolai.

In: IEEE Access, Vol. 7, 2019, p. 66157-66166.

Research output: Contribution to journal › Article › Academic › peer-review

Vancouver

Lopez-Antequera M, Vallina ML, Strisciuglio N, Petkov N. Place and Object Recognition by CNN-Based COSFIRE Filters. IEEE Access. 2019;7:66157-66166. https://doi.org/10.1109/ACCESS.2019.2918267

BibTeX

@article{2cdba6abf29a4603a1e79421cf95a89b,
title = "Place and Object Recognition by CNN-Based COSFIRE Filters",
abstract = "COSFIRE filters are an effective means for detecting and localizing visual patterns. In contrast to a convolutional neural network (CNN), such a filter can be configured by presenting a single training example and it can be applied on images of any size. The main limitation of COSFIRE filters so far was the use of only Gabor and DoGs contributing filters for the configuration of a COSFIRE filter. In this paper, we propose to use a much broader class of contributing filters, namely filters defined by intermediate CNN representations. We apply our proposed method on the MNIST data set, on the butterfly data set, and on a garden data set for place recognition, obtaining accuracies of 99.49%, 96.57%, and 89.84%, respectively. Our method outperforms a CNN-baseline method in which the full CNN representation at a certain layer is used as input to an SVM classifier. It also outperforms traditional non-CNN methods for the studied applications. In the case of place recognition, our method outperforms NetVLAD when only one reference image is used per scene and the two methods perform similarly when many reference images are used.",
keywords = "COSFIRE filter, CNN, object recognition, place recognition, VESSEL DELINEATION",
author = "Manuel Lopez-Antequera and Vallina, {Maria Leyva} and Nicola Strisciuglio and Nicolai Petkov",
year = "2019",
doi = "10.1109/ACCESS.2019.2918267",
language = "English",
volume = "7",
pages = "66157--66166",
journal = "IEEE Access",
issn = "2169-3536",
publisher = "Institute of Electrical and Electronics Engineers (IEEE)",
}

RIS

TY - JOUR

T1 - Place and Object Recognition by CNN-Based COSFIRE Filters

AU - Lopez-Antequera, Manuel

AU - Vallina, Maria Leyva

AU - Strisciuglio, Nicola

AU - Petkov, Nicolai

PY - 2019

Y1 - 2019

N2 - COSFIRE filters are an effective means for detecting and localizing visual patterns. In contrast to a convolutional neural network (CNN), such a filter can be configured by presenting a single training example and it can be applied on images of any size. The main limitation of COSFIRE filters so far was the use of only Gabor and DoGs contributing filters for the configuration of a COSFIRE filter. In this paper, we propose to use a much broader class of contributing filters, namely filters defined by intermediate CNN representations. We apply our proposed method on the MNIST data set, on the butterfly data set, and on a garden data set for place recognition, obtaining accuracies of 99.49%, 96.57%, and 89.84%, respectively. Our method outperforms a CNN-baseline method in which the full CNN representation at a certain layer is used as input to an SVM classifier. It also outperforms traditional non-CNN methods for the studied applications. In the case of place recognition, our method outperforms NetVLAD when only one reference image is used per scene and the two methods perform similarly when many reference images are used.

AB - COSFIRE filters are an effective means for detecting and localizing visual patterns. In contrast to a convolutional neural network (CNN), such a filter can be configured by presenting a single training example and it can be applied on images of any size. The main limitation of COSFIRE filters so far was the use of only Gabor and DoGs contributing filters for the configuration of a COSFIRE filter. In this paper, we propose to use a much broader class of contributing filters, namely filters defined by intermediate CNN representations. We apply our proposed method on the MNIST data set, on the butterfly data set, and on a garden data set for place recognition, obtaining accuracies of 99.49%, 96.57%, and 89.84%, respectively. Our method outperforms a CNN-baseline method in which the full CNN representation at a certain layer is used as input to an SVM classifier. It also outperforms traditional non-CNN methods for the studied applications. In the case of place recognition, our method outperforms NetVLAD when only one reference image is used per scene and the two methods perform similarly when many reference images are used.

KW - COSFIRE filter

KW - CNN

KW - object recognition

KW - place recognition

KW - VESSEL DELINEATION

U2 - 10.1109/ACCESS.2019.2918267

DO - 10.1109/ACCESS.2019.2918267

M3 - Article

VL - 7

SP - 66157

EP - 66166

JO - IEEE Access

JF - IEEE Access

SN - 2169-3536

ER -