
Emotiv Insight with Convolutional Neural Network: Visual Attention Test Classification

  • Conference paper
  • First Online:
Advances in Computational Collective Intelligence (ICCCI 2021)

Abstract

The purpose of this paper is to use a low-cost EEG device to collect brain signals and a neural network algorithm to classify attention level from the recorded EEG data. Fifteen volunteers participated in the experiment. The Emotiv Insight headset was used to record brain signals while participants performed the Visual Attention Colour Pattern Recognition (VACPR) test. The test was divided into two tasks: task A, which stimulated participants to be attentive, and task B, which stimulated them to be inattentive. The recorded raw EEG signal was then passed through a notch filter and Independent Component Analysis (ICA) to filter out noise. Next, Power Spectral Density (PSD) was used to compute the power of the pre-processed EEG signal, verifying that the recordings were consistent with the mental states stimulated in task A and task B before classification. Since EEG signals exhibit highly complex, dynamic, and non-linear behaviour, a Convolutional Neural Network (CNN) shows great promise for classifying them owing to its capacity to learn good feature representations from the signals. An accuracy of 76% was achieved, indicating the feasibility of using the Emotiv Insight with a CNN for attention level classification.
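The pre-processing steps described above (notch filtering of the raw signal, followed by PSD-based band-power checks) can be sketched as follows. This is a minimal illustration, not the authors' code: the 128 Hz sampling rate matches the Emotiv Insight's published specification, but the 50 Hz mains frequency, the filter Q factor, and the synthetic single-channel test signal are assumptions made for demonstration; the ICA and CNN stages are omitted.

```python
import numpy as np
from scipy.signal import iirnotch, filtfilt, welch

fs = 128.0  # Emotiv Insight sampling rate (Hz)
rng = np.random.default_rng(0)

# Synthetic 10 s single-channel "EEG": a 20 Hz beta-band component,
# 50 Hz power-line interference, and low-amplitude white noise.
t = np.arange(0, 10, 1 / fs)
eeg = (np.sin(2 * np.pi * 20 * t)
       + 0.5 * np.sin(2 * np.pi * 50 * t)
       + 0.1 * rng.standard_normal(t.size))

# Notch filter centred on 50 Hz to suppress mains interference;
# filtfilt applies it forward and backward for zero phase distortion.
b, a = iirnotch(w0=50.0, Q=30.0, fs=fs)
clean = filtfilt(b, a, eeg)

# Welch PSD estimate, then mean band power in the beta band (13-30 Hz),
# the kind of power value used to verify the stimulated mental state.
f, psd = welch(clean, fs=fs, nperseg=256)
beta = psd[(f >= 13) & (f <= 30)].mean()
mains = psd[np.argmin(np.abs(f - 50))]
```

After filtering, the residual 50 Hz power (`mains`) should be far below the beta-band power (`beta`), confirming the notch stage worked before any band-power comparison between tasks is attempted.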




Copyright information

© 2021 Springer Nature Switzerland AG

About this paper


Cite this paper

Toa, C.K., Sim, K.S., Tan, S.C. (2021). Emotiv Insight with Convolutional Neural Network: Visual Attention Test Classification. In: Wojtkiewicz, K., Treur, J., Pimenidis, E., Maleszka, M. (eds) Advances in Computational Collective Intelligence. ICCCI 2021. Communications in Computer and Information Science, vol 1463. Springer, Cham. https://doi.org/10.1007/978-3-030-88113-9_28


  • DOI: https://doi.org/10.1007/978-3-030-88113-9_28

  • Publisher Name: Springer, Cham

  • Print ISBN: 978-3-030-88112-2

  • Online ISBN: 978-3-030-88113-9

  • eBook Packages: Computer Science, Computer Science (R0)
