Performance of neural networks for localizing moving objects with an artificial lateral line

Luuk H Boulogne, Ben J Wolf, Marco A Wiering, Sietse M van Netten

Research output: Contribution to journal › Article › Academic › peer-review

Abstract

Fish are able to sense water flow velocities relative to their body with their mechanoreceptive lateral line organ. This organ consists of an array of flow detectors distributed along the fish body. Using the excitation of these individual detectors, fish can determine the location of nearby moving objects. Inspired by this sensory modality, it is shown here how neural networks can be used to extract an object's location from simulated excitation patterns, as can be measured along arrays of stationary artificial flow velocity sensors. The applicability, performance and robustness with respect to input noise of different neural network architectures are compared. When trained and tested under high signal-to-noise conditions (46 dB), the Extreme Learning Machine architecture performs best, with a mean Euclidean error of 0.4% of the maximum depth of the field D, which is taken as half the length of the sensor array. Under lower signal-to-noise conditions, Echo State Networks, which have recurrent connections, enhance performance, while the Multilayer Perceptron is shown to be the most noise-robust architecture. Neural network performance decreases when the source moves close to the sensor array or to the sides of the array. For all considered architectures, increasing the number of detectors per array improves localization performance and robustness.
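As a rough illustration of the best-performing architecture named in the abstract, the sketch below implements a generic Extreme Learning Machine regressor in Python with NumPy: a fixed random hidden layer projects a sensor excitation pattern, and only a linear readout mapping to 2D source coordinates is fitted by least squares. The sensor count, hidden-layer size, and placeholder training data are assumptions for illustration only; the paper's actual simulated excitation patterns and network configurations are not reproduced here.

# Minimal ELM regression sketch (assumed setup, not the paper's code).
import numpy as np

rng = np.random.default_rng(0)

n_sensors = 8        # flow detectors along the array (assumed)
n_hidden = 200       # random hidden units (assumed)
n_samples = 5000     # simulated training patterns (assumed)

# Placeholder training data: real inputs would be simulated excitation
# patterns of a moving source; random stand-ins are used here.
X = rng.standard_normal((n_samples, n_sensors))
Y = rng.uniform(0.0, 1.0, size=(n_samples, 2))  # (x, depth) targets

# Random, untrained input weights and biases define the hidden layer.
W_in = rng.standard_normal((n_sensors, n_hidden))
b = rng.standard_normal(n_hidden)

def hidden(patterns):
    # Nonlinear random projection of sensor excitation patterns.
    return np.tanh(patterns @ W_in + b)

# Fit only the linear readout with least squares (the ELM training step).
H = hidden(X)
W_out, *_ = np.linalg.lstsq(H, Y, rcond=None)

# Predict source locations for new excitation patterns.
X_test = rng.standard_normal((10, n_sensors))
Y_pred = hidden(X_test) @ W_out
print(Y_pred.shape)  # (10, 2) predicted (x, depth) coordinates

Because the hidden weights stay fixed and only the readout is trained, fitting reduces to a single least-squares solve, which is the defining property of the ELM approach.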

Original language: English
Article number: 056009
Number of pages: 12
Journal: Bioinspiration & Biomimetics
Volume: 12
Issue number: 5
Early online date: 14-Jul-2017
DOIs
Publication status: Published - Oct-2017

Keywords

  • Journal Article
