M.Sc. Thesis View

Student: Shahin POURZARE
Supervisor: Prof. Dr. Temel KAYIKÇIOĞLU
Department: Electrical and Electronics Engineering
Institution: Graduate School of Natural and Applied Sciences
University: Karadeniz Technical University, Turkey
Title of the Thesis: Classifying Eye and Chin Movement Artifacts in EEG Signals
Level: M.Sc.
Acceptance Date: 9/11/2012
Number of Pages: 75
Registration Number: i2563

      Today, many important studies draw on internet and computer technology to improve the lives of people with disabilities. One such technology is the Brain-Computer Interface (BCI). A BCI is a computer system that enables people with motor nervous system impairments to operate devices such as neuroprostheses or electromechanical arms. Current BCI systems work by processing deep and superficial electroencephalography (EEG) recordings of brain signals. EEG signals are low-amplitude bioelectrical signals recorded from the scalp; their peak-to-peak amplitude is 2-100 μV and their frequency band lies between 0.1 and 60 Hz. Because of their irregular structure, EEG signals can carry artifacts, which are generally unwanted signal components. These artifacts are bioelectric potentials arising from limb movements, eye movements, or other reactions of the body. Electrooculography (EOG) and electromyography (EMG) artifacts are considered among the most important sources of physiological artifacts in BCI systems.
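The amplitude and frequency figures above suggest a common preprocessing step: restricting a recording to roughly the stated 0.1-60 Hz band before any artifact analysis. A minimal sketch in Python with NumPy/SciPy follows; the sampling rate, the synthetic signal, and the slightly raised low cutoff are illustrative assumptions, not values from the thesis.

```python
# Sketch: isolating the EEG band with a Butterworth band-pass filter.
# The sampling rate and test signal below are assumptions for illustration.
import numpy as np
from scipy.signal import butter, sosfiltfilt

fs = 256.0                          # assumed sampling rate in Hz
t = np.arange(0.0, 8.0, 1.0 / fs)   # 8 s of synthetic data

# Synthetic "EEG": a 10 Hz in-band rhythm plus 100 Hz out-of-band noise,
# both at plausible microvolt amplitudes.
eeg = 50e-6 * np.sin(2 * np.pi * 10.0 * t) \
    + 20e-6 * np.sin(2 * np.pi * 100.0 * t)

# 4th-order Butterworth band-pass over (approximately) the EEG band above.
# The low cut is raised from 0.1 Hz to 0.5 Hz so the filter behaves well
# on a short record; second-order sections keep it numerically stable.
sos = butter(4, [0.5, 60.0], btype="bandpass", fs=fs, output="sos")
filtered = sosfiltfilt(sos, eeg)    # zero-phase forward-backward filtering
```

Zero-phase filtering (`sosfiltfilt`) is chosen here because phase distortion would shift waveform features that later artifact analysis may rely on.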

In this study, a novel approach is presented for classifying facial movement artifacts, such as eye and chin movements, in recorded EEG signals. The EEG signals were acquired in our EEG laboratory with a Brain Quick EEG system (Micromed, Italy) from three healthy subjects aged between 28 and 30 years, on different days. Feature vectors based on root mean square, polynomial fitting, and Hjorth descriptors were extracted and classified with the k-nearest neighbor algorithm. The proposed method was successfully applied to our data sets and achieved classification accuracies of 99%, 94%, and 89% on the test data of the three subjects.
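The pipeline described above (RMS, polynomial-fit, and Hjorth features fed to a k-nearest-neighbor classifier) can be sketched as follows. This is not the thesis implementation: the epoch length, polynomial degree, value of k, and the synthetic "clean" versus "artifact" epochs are all illustrative assumptions.

```python
# Sketch of an RMS + polynomial-fit + Hjorth feature extractor with a
# hand-rolled k-NN classifier. All parameters are illustrative assumptions.
import numpy as np

def hjorth(x):
    """Hjorth activity, mobility, and complexity of a 1-D signal."""
    dx = np.diff(x)
    ddx = np.diff(dx)
    activity = np.var(x)
    mobility = np.sqrt(np.var(dx) / activity)
    complexity = np.sqrt(np.var(ddx) / np.var(dx)) / mobility
    return activity, mobility, complexity

def features(epoch, poly_degree=3):
    """Feature vector: RMS, polynomial-fit coefficients, Hjorth descriptors."""
    rms = np.sqrt(np.mean(epoch ** 2))
    t = np.linspace(0.0, 1.0, epoch.size)
    coeffs = np.polyfit(t, epoch, poly_degree)
    return np.concatenate(([rms], coeffs, hjorth(epoch)))

def knn_predict(train_X, train_y, x, k=3):
    """Majority vote over the k nearest training vectors (Euclidean)."""
    d = np.linalg.norm(train_X - x, axis=1)
    nearest = train_y[np.argsort(d)[:k]]
    return int(np.argmax(np.bincount(nearest)))

# Toy data: class 0 = low-amplitude "clean" epochs, class 1 = epochs with a
# large slow deflection standing in for eye/chin movement contamination.
rng = np.random.default_rng(0)
clean = [5e-6 * rng.standard_normal(128) for _ in range(20)]
artifact = [
    5e-6 * rng.standard_normal(128)
    + 80e-6 * np.sin(np.linspace(0.0, 6.0, 128))
    for _ in range(20)
]
X = np.array([features(e) for e in clean + artifact])
y = np.array([0] * 20 + [1] * 20)

pred = knn_predict(X, y, features(artifact[0]))  # classify one epoch
```

The slow deflection raises the Hjorth activity and complexity of the contaminated epochs well above those of the clean ones, so even this simple distance-based vote separates the two toy classes.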