News Release

Low-quality video target detection based on EEG signal using eye movement alignment

Peer-Reviewed Publication

Beijing Institute of Technology Press Co., Ltd

The experimental paradigm and setup. (A) An example screen from the experimental video. The aircraft and carrier in the red box are the targets, which are hidden by the cloud layer, waves, and islands. (B) Experimental paradigm. (C) Experimental setup.

Credit: Weijie Fei, School of Mechanical Engineering, Beijing Institute of Technology.

In a research paper, scientists from the Beijing Institute of Technology proposed an event-related potential (ERP) extraction method to address the asynchrony problem in low-quality video target detection, designed time-frequency features based on the continuous wavelet transform, and built an EEG decoding model grounded in neural characterization. An average decoding accuracy of 84.56% was achieved in a pseudo-online test.

The new research paper, published July 4 in the journal Cyborg and Bionic Systems, introduces a low-quality video target detection technique based on EEG signals, together with an ERP alignment method based on eye movement signals, and demonstrates their effectiveness and feasibility. The technology is expected to find wide use in military, civil, and medical fields.

According to Fei, "Machine vision technology has developed rapidly in recent years. Image processing and recognition are very efficient. However, identifying low-quality targets remains a challenge for machine vision." To address these problems, Fei and co-authors proposed a solution that (a) designs a new low-quality video target detection experimental paradigm to simulate UAV reconnaissance video in complex environments; (b) introduces a synchronization method based on eye movement signals that determines the moment of target recognition by analyzing different eye movement types, so that ERP segments can be extracted accurately; (c) analyzes the neural representations involved in target recognition in the time domain, frequency domain, and source space; and (d) designs time-frequency features based on the continuous wavelet transform and builds an EEG decoding model for low-quality video targets.
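For readers curious how wavelet-based time-frequency features of this kind can be computed, the following Python sketch shows one plausible approach using the PyWavelets library. The sampling rate, channel count, Morlet wavelet, 1-30 Hz range, and averaging scheme are illustrative assumptions, not details taken from the paper.

# Hypothetical sketch: time-frequency features from an ERP segment via a
# continuous wavelet transform (PyWavelets). All parameters are assumptions.
import numpy as np
import pywt

fs = 250                          # assumed EEG sampling rate in Hz
epoch = np.random.randn(8, fs)    # placeholder ERP segment: 8 channels x 1 s

# Map the desired frequencies (1-30 Hz) to CWT scales for the Morlet wavelet
freqs = np.arange(1, 31)
scales = pywt.central_frequency('morl') * fs / freqs

features = []
for ch in epoch:
    coeffs, _ = pywt.cwt(ch, scales, 'morl', sampling_period=1.0 / fs)
    power = np.abs(coeffs) ** 2           # time-frequency power map
    features.append(power.mean(axis=1))   # mean power at each frequency
feature_vector = np.concatenate(features) # input to a downstream classifier
print(feature_vector.shape)               # (8 channels * 30 frequencies,) = (240,)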

The authors say this work is the first to explore EEG-based low-quality video target detection, moving beyond the limitation of using only clear, eye-catching video targets or the RSVP paradigm. In addition, to address the asynchrony problem in video target detection, an ERP alignment method based on eye movement signals is proposed and a low-quality video target detection method based on EEG is developed, bringing this kind of brain-computer interface closer to practical application.
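As an illustration of the alignment idea, the sketch below cuts EEG epochs around eye-movement events (for example, fixation onsets reported by an eye tracker) instead of around known stimulus onsets, which are unavailable in free-viewing video. The variable names, window lengths, and baseline correction are assumptions for demonstration only, not the authors' exact procedure.

# Hypothetical sketch of eye-movement-aligned ERP extraction: epochs are
# time-locked to eye-tracker events rather than to stimulus triggers.
import numpy as np

fs = 250                                   # assumed EEG sampling rate in Hz
eeg = np.random.randn(8, 60 * fs)          # placeholder recording: 8 ch x 60 s
fixation_onsets_s = [12.4, 23.1, 41.7]     # example eye-tracker event times (s)

def extract_epochs(eeg, events_s, fs, tmin=-0.2, tmax=0.8):
    """Cut event-locked EEG segments, e.g. -200 ms to +800 ms per event."""
    n_pre, n_post = int(-tmin * fs), int(tmax * fs)
    epochs = []
    for t in events_s:
        center = int(round(t * fs))
        if center - n_pre >= 0 and center + n_post <= eeg.shape[1]:
            seg = eeg[:, center - n_pre:center + n_post].copy()
            seg -= seg[:, :n_pre].mean(axis=1, keepdims=True)  # baseline correction
            epochs.append(seg)
    return np.stack(epochs)                # (n_events, n_channels, n_samples)

epochs = extract_epochs(eeg, fixation_onsets_s, fs)
print(epochs.shape)                        # (3, 8, 250)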

Fei said, “We simulated, in a simplified way, the low quality of targets caused by factors such as weather and environment, or by the target being partially obscured by clouds, waves, and islands. Although the simulation reflects the challenge of low-quality video target detection to a certain extent, it is still relatively simple compared with the complex and changeable low-quality conditions that may be encountered in real scenes. To better apply EEG-based target detection technology in human-computer interaction systems, it is necessary to further study the influence of different video quality parameters (video size, definition, and screen complexity) on target detection.”

In conclusion, the proposed method based on eye movement signals performs ERP alignment more efficiently and achieves higher target recognition accuracy (84.56%). The technology can be applied to military reconnaissance, disaster relief, surveillance, medical care, and other fields to help quickly identify key targets.

Authors of the paper include Jianting Shi, Luzheng Bi, Xinbo Xu, Aberham Genetu Feleke, and Weijie Fei.

This work was supported by Basic Research Plan under Grant JCKY2022602C024.

The paper, “Low-Quality Video Target Detection Based on EEG Signal Using Eye Movement Alignment,” was published in the journal Cyborg and Bionic Systems on July 4, 2024, at DOI: 10.34133/cbsystems.0121.

