Abstract

Computer cursor control using electroencephalogram (EEG) signals is a common and well-studied brain-computer interface (BCI). The literature has focused primarily on objective measures of assistive BCIs, such as the accuracy of the neural decoder, whereas subjective measures, such as the user's satisfaction, play an essential role in the overall success of a BCI. To the best of our knowledge, the BCI literature lacks a comprehensive evaluation of the usability of a mind-controlled computer cursor in terms of decoder efficiency (accuracy), user experience, and relevant confounding variables for a platform intended for public use. To fill this gap, we conducted a two-dimensional EEG-based cursor control experiment with 28 healthy participants. The computer cursor velocity was controlled by imagery of hand movement using a paradigm from the literature named imagined body kinematics (IBK) with a low-cost wireless EEG headset. We evaluated the usability of the platform with different objective and subjective measures and investigated the extent to which the training phase may influence the ultimate BCI outcome. We administered pre- and post-experiment interview questionnaires to evaluate usability. Analysis of the questionnaires and the testing-phase outcome shows a positive correlation between individuals' ability to visualize and their level of mental controllability of the cursor. Despite individual differences, analysis of the training data shows a significant contribution of the electrooculogram (EOG) to the predictability of the linear model. The results of this work may provide useful insights towards designing personalized, user-centered assistive BCIs.

Keywords: Brain-Computer Interface, Cursor control, EEG, Usability, Confounding variables, Imagined Body Kinematics

I. INTRODUCTION

A brain-controlled computer cursor has been utilized as a testbed for developing assistive BCIs. Many BCI systems have been designed to harness the computer cursor using noninvasive brain imaging techniques such as electroencephalography (EEG) [1, 2]. Several intention-driven (endogenous) BCI paradigms have been established based on various EEG monitoring techniques, including motor imagery (MI) [1, 2] and imagined body kinematics (IBK) [3–5]. While MI encodes brainwaves over the sensorimotor area in the alpha and beta frequency bands, IBK encodes the associated motor activity in the temporal fluctuations of EEG signals. There is a vast BCI literature revolving around sensorimotor rhythm (SMR) based motor imagery. For instance, Bai et al. [6] designed two-directional and four-directional cursor control BCI systems using motor execution and motor imagery paradigms. They achieved average accuracies of 88% and 80% in the two-directional control task and 58% and 45% in the four-directional control task using motor execution and motor imagery, respectively. Despite its successes, MI-based assistive BCIs are prone to two main shortcomings. First, user training takes days, if not weeks. Learning to modulate brain activity in the frequency bands and over the brain areas of interest is often a lengthy process; for instance, it might take days for users to learn to move a computer cursor with adequate accuracy [1, 2, 7]. Second, the MI paradigm is not considered a natural way of control [3], since the kinematics of motor execution (or imagination) do not necessarily correspond to those of the controlled object [8].
The discrepancy may add to the potential user's frustration and affect the overall acceptance of the assistive BCI. This article is one of the few studies that investigate the capability of a BCI using imagined body kinematics. Several studies have reported that the IBK paradigm may help reduce the duration of training [9]. Bradberry et al. [3] studied an EEG-based BCI paradigm that maps natural imagined movement of the dominant hand onto the velocity of a computer cursor in two-dimensional space. The results showed that users could acquire acceptable controllability over the cursor within less than an hour of training. Ofner et al. exploited the IBK paradigm in a trial-based experiment [10], and Kim et al. compared the efficacy of the paradigm for controlling the trajectory of a robotic arm using different classifiers with execution and imagination of different body parts [11]. The imagined kinematics can be extracted from low-frequency EEG signals (usually below 1 Hz) [3, 4, 10, 11]. A similar approach has been applied in invasive BCI studies, where participants with implanted electrodes were able to maintain high controllability over a computer cursor [12]. A few previous studies provided evidence that among different measures of upper limb body kinematics (position, velocity, acceleration, etc.), velocity may be easier to decode from neural activity and may yield a more robust predictive measure in both offline and real-time applications [3, 13–15]. Following the literature, we modeled the cursor velocity with the EEG signals. Thomas et al. [16] pointed out subject-specific discriminative frequency component (DFC) patterns for MI-based BCI. They also observed the variability of DFC over sessions of BCI experiments. To track the variation in DFC over time, they developed an adaptive algorithm based on the deviation of discriminative weight values of frequency components, which offers better classification performance than conventional approaches. Das et al. [17] developed a subject-specific approach to select spatio-spectral filters for classification in MI tasks. Their methodology provides improved classification performance as well as addressing the non-stationarity of neural patterns in a multi-session BCI study. Temporal characteristics of EEG signals have also been used to model imagined kinematics: a Kalman filter [18], a particle filter model [19], and kernel ridge regression [11] have been implemented and analyzed on offline data. As the most convenient method, multiple linear regression (MLR) has been employed for both offline analysis [13, 14] and real-time implementation [3] to predict imagined object velocity. Borhani et al. [15] evaluated various machine learning techniques to investigate some of the advanced decoding algorithms for an IBK paradigm using EEG signals with application to neural cursor control. They also studied a directional classifier to determine horizontal and vertical intention of movement from EEG signals. Table I summarizes previous BCI studies on EEG-based object control. For a thorough review of EEG paradigms and the associated decoding methodologies, we refer the readers to our recent review article [9].

TABLE I. Previous BCI studies on EEG-based object control.
EEG paradigm | Reference | No. of subjects | Control paradigm | Analysis | Mean success rate
SMR | Wolpaw et al. [26] | 5 | 2D: hit 4 targets on a screen | FFT | 60.4%
SMR | Wolpaw et al. [1] | 4 | 2D: hit 8 targets on a screen | Linear regression + LMS | 82.2%
SMR | McFarland et al. [2] | 4 | 3D: hit an arbitrary target | FFT | 75.8%
IBK | Bradberry et al. [3] | 5 | 2D: 4 targets on a screen | Linear regression on time-shifted EEG samples | 73%
IBK | Current study | 28 | 2D: 4 targets on a screen | Linear regression on time-shifted EEG samples | 66.5%
1D: SMR, 2D: SSVEP | Trejo et al. [27] | 6 | 1D & 2D: hit a target | KPLS (1D and 2D) | 1D: 88%, 2D: 80–100%
SMR + SSVEP | Allison et al. [28] | 10 | 2D: hit 8 targets | LDA on both EEG paradigms | 60%
SMR + P300 | Li et al. [29] | 6 | 2D: hit an arbitrary target | SVM classification | 90.3%
SMR + P300 | Long et al. [30] | 11 | 2D: hit an arbitrary target | SVM classification | 93.99%
Real hand movement or SMR | Kayagil et al. [31] | 4 | 2D: hit an arbitrary target through obstacles | Bhattacharyya distance of PSD | 86%
Motor execution | Huang et al. [32] | 5 | 2D: hit an arbitrary target among obstacles | GA-based MLD classifier on ERD/ERS features | 85%
SSVEP + P300 | Bi et al. [33] | 8 | 2D: hit 8 targets | RBF SVM for SSVEP + LDA for P300 | 85%
ERS/ERD: event-related synchronization/desynchronization. LMS: least squares method. SVM: support vector machine. PSD: power spectral density. LDA: linear discriminant analysis. KPLS: kernel partial least squares classification. MLD: Mahalanobis linear distance. RBF: radial basis function. GA: genetic algorithm.

User acceptance has always been a concern in every BCI platform. Thus, a customized questionnaire was designed based on some of the well-known usability questionnaires, such as the NASA Task Load Index (NASA-TLX), the Quebec User Evaluation of Satisfaction with assistive Technology (QUEST 2.0), and the Intrinsic Motivation Inventory (IMI) [20, 21], to evaluate the usability of the BCI platform. NASA-TLX has been used to evaluate cognitive workload, QUEST 2.0 has been designed to evaluate users' satisfaction, and IMI is customized to measure the user experience with a device. We aim to evaluate the usability of the BCI by taking into consideration the ability of visualization, the user's self-perception of controllability [22], and confounding factors pertinent to user experience. While the most important factor in BCI systems is control accuracy, other confounding factors can easily make or break the adoption of the system by an individual. The user experience and usability of cursor control for a small number of users were reported in several studies [23, 24]. Kübler et al. [21] investigated a user-centered design for BCI-controlled applications. They investigated usability in terms of effectiveness, efficiency, and satisfaction for some of the well-established BCI applications, including cursor control with motor imagery. We refer readers to that work as a comprehensive usability study in BCI. The user experience can significantly impact the accuracy of a BCI system's outcome, and the work implies an explicit interdependence between user experience and controllability of a BCI system. As an example, it has been shown that motivation has a large impact on BCI performance [25]: users can manipulate a BCI system better if they are more motivated to do so. A number of confounding variables were considered across all participants, including various measures of user experience, user performance, and their interactions. We aimed to explore the impact of those variables on the performance of the BCI.

II. Materials and methods

A. Participants

All experimental procedures were approved by the Institutional Review Board at the University of Tennessee, Knoxville.
The participants were recruited mainly from engineering departments at the University of Tennessee. The study included 28 healthy participants (7 females and 21 males; mean age 22.7 ± 3.5 years) with no prior experience of using a BCI. The participants had no recorded neurophysiological disorder. Twenty-five participants were right-handed, two were left-handed, and one was ambidextrous.

B. Experimental Procedures

Each participant followed the same procedure, consisting of 1) a pre-interview questionnaire (≈1 minute), 2) a cursor control BCI experiment (≈20 minutes), and 3) a post-interview questionnaire (≈2 minutes). The entire experimental procedure, including the time for filling out the consent form and adjusting the EEG headset, lasted about 30 minutes on average. The BCI experiment includes data collection for the training phase, a model calibration phase, and a target acquisition (testing) phase.

1) Pre-interview questionnaire on visualization and vigilance

After collecting informed consent and demographic information, the participants were asked to respond to a pre-interview questionnaire. The questions primarily aimed at evaluating self-perception of visualization and focused attention (vigilance) as confounding variables; see Table II.

TABLE II. Questions related to the selected confounding variables in the pre-interview questionnaire.
1. How would you describe your ability to visualize or imagine the movement of your hands or body? Choose one: Far below average, Moderately below average, Slightly below average, Average, Slightly above average, Moderately above average, Far above average.
2. How would you describe your ability to maintain attention over time? (Response options are the same as in question 1.)

2) Cursor control experiment

A computer workstation with dual screens was used for the experiment: one screen for the experimenter to administer the experiment and the other placed in front of the participant. The participants were asked to sit at arm's length (60 cm) from the screen. They were instructed to rest their hands on their laps while avoiding excessive eye blinks. The cursor diameter was chosen to be 1.5 cm (0.20% of the workspace), and the movement workspace was 33 cm × 33 cm in size. EEG signals were acquired with a wireless, water-hydrated, 14-channel headset (Emotiv EPOC). The headset collects EEG signals over AF3, F7, F3, FC5, T7, P7, O1, O2, P8, T8, FC6, F4, F8, and AF4 according to the international 10–20 standard with a sampling rate of 128 Hz. We ensured that the electrode-scalp impedance remained below 10 kΩ for all 14 electrodes throughout the experiment for all participants. The EPOC TestBench software was used to monitor the quality of EEG signals during the data collection sessions. The same headset has been used for different BCI applications such as motor imagery [34], motor rehabilitation [35], and P300 spelling [36]. We used BCI2000 [37, 38] combined with MATLAB for EEG data collection during the training, calibration, and testing phases. A band-pass filter with cut-off frequencies of 0.2 Hz and 30 Hz [39] was applied during the training-phase EEG recording. During all phases of the experiment, a preparation cue was used to prime participants for the task.
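As a point of reference, the snippet below is a minimal sketch of how such zero-phase band-pass filtering of the 128 Hz EPOC signals could be reproduced offline in Python with SciPy. The study itself performed acquisition and filtering through BCI2000/MATLAB; the filter order and the forward-backward (zero-phase) implementation are assumptions of this sketch, since the text only specifies the 0.2–30 Hz cut-off frequencies.

```python
import numpy as np
from scipy.signal import butter, filtfilt

FS = 128.0  # Emotiv EPOC sampling rate (Hz)

def bandpass_eeg(eeg, low=0.2, high=30.0, order=4, fs=FS):
    """Zero-phase Butterworth band-pass applied along the sample axis of a
    (channels x samples) array. Filter order is an assumption of this sketch;
    the paper only states the 0.2-30 Hz cut-off frequencies."""
    b, a = butter(order, [low / (fs / 2), high / (fs / 2)], btype="band")
    return filtfilt(b, a, eeg, axis=-1)

# Example: filter 10 s of simulated 14-channel EEG
eeg = np.random.randn(14, int(10 * FS))
filtered = bandpass_eeg(eeg)
```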
Training phase: From previous studies [4, 5, 7, 40–44], we found that 10 minutes of training suffices to acquire acceptable controllability for EEG-based computer cursor control. The participants were asked to sit comfortably in a fixed chair with their hands resting on their laps. A moving cursor was shown on the screen while they were asked to imagine moving the cursor with their dominant hand. The participants were instructed to track the cursor by mimicking the movement of an imaginary computer mouse with the palm of the dominant hand. They were also asked to maintain normal eye movement and to keep their focus only on the cursor while avoiding excessive body movement and eye blinks. The trajectory, velocity, and acceleration of the cursor were identical for all participants. The training phase included 5 trials of horizontal and 5 trials of vertical cursor movement; each lasted 60 seconds. EEG signals were recorded to capture the neural activity underlying the voluntary cursor movement during the experiment. Afterward, the EEG signals were used to train a personalized multiple linear regression (MLR) model (see Equations 1 and 2) to predict the cursor velocity (see details in subsection C below). The model was utilized to control the cursor velocity in two-dimensional (2D) space in real time during the target acquisition phase.

Calibration phase: The goal of this phase was to calibrate the MLR model by adjusting the multipliers Kx and Ky (see Equations 1 and 2). Immediately following the training phase, each participant was asked to control the computer cursor the same way they imagined moving the cursor in the training phase. The participants were instructed to imagine moving the cursor so as to sweep the entire 2D cursor workspace. Participants performed cursor control for about 2–3 minutes using the personalized MLR model with arbitrary multipliers. The multipliers were then adjusted so that the participants perceived acceptable controllability over the cursor movement. The calibrated model was used in the target acquisition phase.

Target acquisition phase: The participants were instructed to imagine moving the cursor in 2D space to hit the designated target within 15 seconds (s) in each trial. Each target occupied 2.4% of the workspace, with a width of 8% and a length of 30% of the screen dimensions. The targets appeared randomly on the four edges of the workspace, specifically at the top, bottom, left, or right side. The participants were given 15 seconds to hit a target; otherwise, a new trial would start after a 2 s inter-trial interval. In total, there were 40 consecutive trials of the cursor control task, and it took about 10 minutes for a participant to perform the task. Fig. 1 shows a schematic of the experimental setup for the target acquisition phase. A schematic of a sample target acquisition phase is shown in Fig. 2.

Fig. 1. Schematic of the EEG-based BCI platform in the cursor control task.

Fig. 2. A sample schematic of the target acquisition phase. The phase starts with a 5 s preparation cue followed by 40 trials, each of which lasts no more than 15 s. There is a 2 s interval period between the trials. A fixation cue of a blank screen was displayed during the interval period.
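The target-acquisition logic can be summarized in a short sketch: the decoded velocity is integrated into a cursor position at every sample, and a trial ends on a target hit or after the 15 s limit. The decoder call, the normalized workspace coordinates, and the center starting position below are illustrative assumptions, not details taken from the paper.

```python
import numpy as np

FS = 128                 # samples per second
TRIAL_LIMIT = 15 * FS    # 15 s per trial
WORKSPACE = 1.0          # workspace normalized to [0, 1] x [0, 1] (assumption)

def run_trial(decode_velocity, target_rect):
    """One target-acquisition trial: integrate decoded velocity into a cursor
    position, clamp it to the workspace, and stop on a hit or after 15 s.
    `decode_velocity(t)` stands in for the calibrated MLR decoder output
    (u[t], v[t]); `target_rect` is (xmin, xmax, ymin, ymax)."""
    pos = np.array([0.5, 0.5])          # assumed start at the screen centre
    for t in range(TRIAL_LIMIT):
        u, v = decode_velocity(t)       # decoded horizontal/vertical velocity
        pos += np.array([u, v]) / FS    # integrate velocity over one sample
        pos = np.clip(pos, 0.0, WORKSPACE)
        xmin, xmax, ymin, ymax = target_rect
        if xmin <= pos[0] <= xmax and ymin <= pos[1] <= ymax:
            return True, (t + 1) / FS   # hit, and movement time in seconds
    return False, TRIAL_LIMIT / FS      # timeout counts as a miss

# Example with a toy decoder that drifts toward the left edge target
hit, mt = run_trial(lambda t: (-0.05, 0.0), (0.0, 0.05, 0.0, 1.0))
```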
3) Post-interview questionnaire

After completing the cursor control experiment, each participant was asked to fill out a post-interview questionnaire. The questions aimed to evaluate different confounding variables, including the participant's level of energy, level of focus, and self-perception of control over the computer cursor movement. Table III lists the questions.

TABLE III. Questions related to selected confounding variables in the post-interview questionnaire.
1. How would you describe your energy level during this experiment? Choose one: Far below average, Moderately below average, Slightly below average, Average, Slightly above average, Moderately above average, Far above average.
2. How would you describe your focus level during this experiment? (Response options are the same as in question 1.)
3. How would you describe the length of this experiment? Choose one: Far too short, Moderately too short, Slightly too short, Neither too short nor too long, Slightly too long, Moderately too long, Far too long.
4. How would you rate the level of controllability over the cursor you had during the experiment (overall rate, individual target control)? Choose one: Extremely bad, Moderately bad, Slightly bad, Neither bad nor good, Slightly good, Moderately good, Extremely good.
5. Did you feel that eye movement had any effect on your ability to control the cursor? Choose one: Definitely yes, Probably yes, Might or might not, Probably not, Definitely not.
6. If you visualized hand movement, how difficult would you say this visualization was to maintain? Choose one: Extremely challenging, Very challenging, Moderately challenging, Slightly challenging, Not challenging at all.
7. How do you feel physically after the experiment? Choose one: Extremely uncomfortable, Moderately uncomfortable, Slightly uncomfortable, Neither uncomfortable nor comfortable, Slightly comfortable, Moderately comfortable, Extremely comfortable.
8. Describe your level of comfort during the test in relation to the lab environment and headset. (Response options are the same as in question 7.)

C. Decoding EEG signals

In this work, we adopted the MLR model in Equations (1) and (2) to translate imagined cursor kinematics while evaluating the usability of the BCI setup:

u[t] = K_x \sum_{n=1}^{N} \sum_{k=0}^{K} a_{n,k} \, e_n[t-k]   (1)

v[t] = K_y \sum_{n=1}^{N} \sum_{k=0}^{K} b_{n,k} \, e_n[t-k]   (2)

The model maps the acquired EEG signals to the horizontal (x) and vertical (y) cursor velocities. We know from the literature that natural body kinematics are encoded in the very low-frequency fluctuations of EEG signals [3]. Thus, we applied a zero-phase, fourth-order, low-pass Butterworth filter with a cutoff frequency of 1 Hz to the EEG signals. Here, the output velocities at time sample t in the x and y directions are represented by u[t] and v[t], respectively. The variables a and b are the weights obtained through multiple linear regression, K_x and K_y are the calibration multipliers, e_n[t-k] is the voltage measured at EEG electrode n at time t-k, the total number of EEG sensors is N = 14, and the number of time lags is K = 13. The time lag was chosen to be ≈100 ms (≈13 × 1/128 seconds) to include the contributing neural components in the recent past that optimally predict the upcoming cursor kinematics [3, 4, 40]. We applied leave-one-out cross-validation to avoid overfitting of the MLR model, i.e., four trials for training the model and one trial for testing. The personalized model was implemented as the average of the linear regression coefficients over all five cross-validated models.
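The core of the decoder is an ordinary least-squares fit from low-pass filtered, time-lagged EEG to one velocity component. The sketch below, assuming NumPy/SciPy rather than the MATLAB implementation used in the study, illustrates the construction of the lagged design matrix (N = 14 channels, K = 13 lags, ≈100 ms) and the fit; the intercept column and variable names are assumptions of this sketch.

```python
import numpy as np
from scipy.signal import butter, filtfilt

FS, N_CH, N_LAGS = 128, 14, 13   # sampling rate, EEG channels, time lags (~100 ms)

def lowpass(eeg, cutoff=1.0, order=4, fs=FS):
    """Zero-phase, fourth-order, low-pass Butterworth filter (<1 Hz) per channel."""
    b, a = butter(order, cutoff / (fs / 2), btype="low")
    return filtfilt(b, a, eeg, axis=-1)

def lagged_design(eeg, n_lags=N_LAGS):
    """Stack e_n[t-k] for k = 0..n_lags into one regressor row per sample t,
    plus an intercept column (the intercept is an assumption of this sketch)."""
    n_ch, n_t = eeg.shape
    rows = []
    for t in range(n_lags, n_t):
        rows.append(eeg[:, t - n_lags:t + 1][:, ::-1].ravel())
    X = np.asarray(rows)
    return np.hstack([np.ones((X.shape[0], 1)), X])

def fit_mlr(eeg, velocity, n_lags=N_LAGS):
    """Least-squares weights mapping lagged EEG to one velocity component."""
    X = lagged_design(lowpass(eeg), n_lags)
    y = velocity[n_lags:]
    w, *_ = np.linalg.lstsq(X, y, rcond=None)
    return w

# Example: one simulated 60 s training trial (horizontal direction)
eeg = np.random.randn(N_CH, 60 * FS)
vx = np.random.randn(60 * FS)         # horizontal cursor velocity (ground truth)
weights_x = fit_mlr(eeg, vx)          # an analogous fit gives the vertical weights
pred_x = lagged_design(lowpass(eeg)) @ weights_x
```

In the study, this fit is repeated in a leave-one-trial-out fashion and the five sets of coefficients are averaged to form the personalized model.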
We also introduced a custom Goodness-of-Fit (GoF) measure to evaluate the predictive capability of the personalized model; see Equation (3). We divided the data of a trial into segments of 5 seconds and computed the Pearson correlation between the observed and the corresponding predicted velocities for each segment. Then, the Pearson correlation averaged over the segments of a trial is defined as the GoF for the entire trial:

\mathrm{GoF} = \frac{1}{M} \sum_{i=1}^{M} \rho\!\left(\hat{v}_i, v_i\right)   (3)

where ρ denotes the Pearson correlation, and \hat{v}_i and v_i represent the decoded velocity and the observed velocity for the i-th segment, respectively. Since a trial is 60 seconds long, the number of segments for each trial is M = 12. GoF values are reported as percentages.
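A minimal sketch of Equation (3), assuming Python/SciPy, is given below: the trial is cut into consecutive 5 s segments, a Pearson correlation is computed per segment, and the mean is expressed as a percentage.

```python
import numpy as np
from scipy.stats import pearsonr

FS, SEG_SECONDS = 128, 5   # sampling rate and segment length used for GoF

def goodness_of_fit(v_pred, v_true, fs=FS, seg_s=SEG_SECONDS):
    """Mean Pearson correlation between predicted and observed velocity over
    consecutive 5 s segments of a trial (Equation 3), returned in percent."""
    seg = int(fs * seg_s)
    n_seg = len(v_true) // seg          # M = 12 for a 60 s trial
    rs = []
    for i in range(n_seg):
        sl = slice(i * seg, (i + 1) * seg)
        r, _ = pearsonr(v_pred[sl], v_true[sl])
        rs.append(r)
    return 100.0 * float(np.mean(rs))

# Example on simulated data: a noisy copy of the true velocity
v_true = np.sin(np.linspace(0, 12 * np.pi, 60 * FS))
v_pred = v_true + 0.5 * np.random.randn(v_true.size)
print(f"GoF = {goodness_of_fit(v_pred, v_true):.1f}%")
```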
III. Results

The results are summarized in the following sections. The first and second sections describe the outcomes of the training phase (see Figs. 2 and 3) and the target acquisition phase (see Figs. 4 and 5), respectively. The third section examines the correlation between performance measures of the training and target acquisition phases (see Fig. 6). The final section reports the pre-interview (see Figs. 7–10 and Table II) and post-interview (see Fig. 11 and Table III) outcomes and their interdependence with the training and target acquisition phases.

Fig. 3. A sample trial of actual cursor velocity (dashed curve) and predicted velocity from participant Sub1 in a) horizontal and b) vertical directions. Data are from the training phase of the experiment. Vx represents the horizontal velocity and Vy the vertical velocity, both in pixels/second.

Fig. 4. Distribution of GoF over all trials from all participants in (a) horizontal and (b) vertical directions. The total number of trials is 140 in each direction (28 participants, 5 trials each). The GoF has a mean [STD] of 68.98 [27.72] and 42.09 [26.76] in the horizontal and vertical directions, respectively.

Fig. 5. Mean cursor trajectories during the target acquisition phase for participants a) Sub1 and b) Sub2. The mean trajectory is the average of the length-normalized successful trials. Orange, blue, yellow, and purple curves are the mean trajectories corresponding to the left, right, top, and bottom targets, respectively.

Fig. 7. Boxplot of GoF with respect to self-assessed visualization and imagination ability (Question 1 in Table II). GoF has a mean [STD] of 52.13 [23.79], 57.71 [21.61], and 52.78 [31.72] for average, slightly above average, and moderately above average, respectively.

Fig. 10. Boxplot of hit rate grouped according to the level of attention span (Question 2 in Table II). The hit rate has a mean [STD] of 64.64 [17.17], 62.47 [18.55], and 73.34 [15.30] for the below average, average, and above average groups, respectively.

Fig. 11. Boxplot of hit rate with respect to the level of controllability over the cursor while hitting all four targets (Question 4 in Table III). The hit rate has a mean [STD] of 64.13 [13.59], 66.05 [18.14], and 71.18 [19.56] for poor, fair, and good, respectively. Details are summarized in Table V.

1) Training phase evaluation

Fig. 3 shows representative results from a single participant: the cursor velocity (blue dashed line) as ground truth and the predicted cursor velocity (red line) for one sample trial with a cross-validated personalized MLR model in the horizontal and vertical directions. Fig. 4 summarizes the calculated GoF for all participants in the horizontal and vertical directions. A t-test comparing the GoF measures in the two directions yields a p-value < 0.0001, which indicates a significant difference in model prediction between the horizontal and vertical directions. The distribution in the horizontal direction can be closely approximated by an exponential distribution (dashed curve in Fig. 4(a)), and the distribution in the vertical direction can be approximated by a uniform distribution (dashed line in Fig. 4(b)). A similar observation has also been reported for motor imagery cursor control [3, 40, 45] as well as for actual hand movement [46].

2) Performance during the target acquisition phase

Fig. 5 shows the mean trajectories of the brain-controlled cursor traveled by two sample participants. All traveled cursor trajectories to the four targets were linearly interpolated to a common length, and the mean trajectories were calculated by averaging the interpolated trajectories. Table IV shows the average performance measures of all participants during the target acquisition phase. The average rate of success in hitting individual targets and the corresponding standard deviation are reported. Furthermore, the mean travel time (MT) for each target was also calculated. The overall performance is comparable to previous invasive and noninvasive studies [1, 3, 8, 12].

TABLE IV. Mean [STD] of hit rate and cursor movement time for all participants.
Target | Hit Rate (%) | MT (s)
Left Target | 66.90 [23.84] | 5.91 [2.22]
Right Target | 65.59 [23.22] | 6.50 [2.02]
Top Target | 67.85 [24.39] | 6.59 [2.21]
Bottom Target | 65.68 [28.62] | 5.87 [2.38]
All Targets | 66.51 [24.78] | 6.21 [2.21]
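The length-normalized averaging behind the mean trajectories of Fig. 5 can be sketched as below. This is an illustrative NumPy reconstruction under stated assumptions: the exact resampling scheme and the number of resampled points used in the study are not specified in the text.

```python
import numpy as np

def resample_trajectory(xy, n_points=100):
    """Linearly resample a 2-D cursor trajectory (samples x 2) to a fixed
    number of points, indexed by normalized path progress from 0 to 1."""
    xy = np.asarray(xy, dtype=float)
    s_old = np.linspace(0.0, 1.0, len(xy))
    s_new = np.linspace(0.0, 1.0, n_points)
    return np.column_stack([np.interp(s_new, s_old, xy[:, d]) for d in range(2)])

def mean_trajectory(trials, n_points=100):
    """Average the length-normalized trajectories of the successful trials
    toward one target (the kind of curve plotted in Fig. 5)."""
    return np.mean([resample_trajectory(t, n_points) for t in trials], axis=0)

# Example: two noisy trajectories of different lengths toward the right target
t1 = np.column_stack([np.linspace(0.5, 1.0, 700), 0.5 + 0.01 * np.random.randn(700)])
t2 = np.column_stack([np.linspace(0.5, 1.0, 900), 0.5 + 0.01 * np.random.randn(900)])
avg = mean_trajectory([t1, t2])
```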
3) Evaluation of self-perception and user experience

Before performing the BCI experiment, participants were asked to self-assess their level of attention and aptitude for imagination using a pre-interview questionnaire (see Table II). Additionally, several other factors, such as self-perception of controllability over the cursor, were assessed in a post-interview questionnaire (see Table III). In the following sections, the information collected from the questionnaires is discussed.

3.1) Pre-interview analysis

Self-assessed visualization and imagination ability level (see question 1 in Table II): Since we used the imagination of body movement to control the cursor, this question aimed to evaluate the ability of imagination. Fig. 7 shows a boxplot of the GoF of the training phase grouped according to the self-reported capability of visualization and imagination (no one responded "far below average", "moderately below average", "slightly below average", or "far above average"). Meanwhile, the hit rate with respect to the self-assessed capability of visualization and imagination is illustrated in a boxplot for all participants in Fig. 8. No participant self-assessed below an average ability of imagination. Because of the unequal variances and sample sizes of the answer groups, we ran a Games-Howell post hoc significance test, which does not assume equal variances or group sizes. The test shows a significant difference (p-value < 0.05) between the "Average" and "Moderately above average" self-assessed ability of imagination in the target acquisition phase (Fig. 8), while the same measure in the training phase (Fig. 7) does not reveal any significant difference (p-value > 0.05) between different levels of self-assessed visualization and imagination ability. The observations in Fig. 7 and Fig. 8 indicate that even though self-assessed levels of visualization and imagination are strongly correlated with the participants' hit rate, such correlations do not appear in the training phase. In other words, BCI users (regardless of their performance in the training phase) who perceived higher confidence in their ability to imagine had better control over the cursor movement. More importantly, these results suggest that judging a user's ability to control an assistive BCI (here, cursor control) may not be possible solely from information collected in the training phase.

Fig. 8. Boxplot of hit rate with respect to self-assessed visualization and imagination ability (Question 1 in Table II). The hit rate has a mean [STD] of 59.14 [19.15], 70.00 [14.40], and 80.83 [7.64] for average, slightly above average, and moderately above average, respectively.

Level of attention span (see question 2 in Table II): As another confounding variable, one's ability to maintain attention was assessed with respect to the training and target acquisition phases. Fig. 9 and Fig. 10 illustrate the boxplots of GoF and hit rate for different levels of self-assessed attention span (maintaining attention over time), respectively. (There was no response for "far below average"; "moderately below average" and "slightly below average" were combined as "below average", and "slightly above average", "moderately above average", and "far above average" were combined as "above average" in Fig. 9 and Fig. 10.) Although the Games-Howell test among the three groups does not reveal a significant difference, the mean values of GoF and hit rate among participants with different self-assessments of attention (below average, average, and above average) suggest a moderate positive correlation between the self-perceived level of attention and the users' performance in the training and target acquisition phases.

Fig. 9. Boxplot of GoF grouped according to the level of attention span (Question 2 in Table II). GoF has a mean [STD] of 50.93 [24.76], 51.71 [22.89], and 64.23 [17.30] for the below average, average, and above average groups, respectively.

3.2) Post-interview analysis

Energy level during the experiment (see question 1 in Table III): Our previous studies [4, 47] found that some participants could not control the cursor because of fatigue on the day of the experiment. We therefore surveyed the energy level in the post-interview questionnaire. The Games-Howell test does not show a significant difference between the self-perceived energy level and either the training phase (GoF) or the target acquisition phase (hit rate) outcome measures.
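For readers wishing to reproduce this kind of group comparison, a Games-Howell post hoc test is available in the pingouin Python package; the sketch below uses simulated hit rates and illustrative group sizes, not the study's data, and the data-frame layout is an assumption.

```python
import numpy as np
import pandas as pd
import pingouin as pg

rng = np.random.default_rng(0)

# Illustrative data frame: one row per participant with the self-reported
# visualization group and the target-acquisition hit rate (simulated values).
df = pd.DataFrame({
    "visualization": (["Average"] * 12
                      + ["Slightly above average"] * 10
                      + ["Moderately above average"] * 6),
    "hit_rate": np.concatenate([
        rng.normal(59, 19, 12),
        rng.normal(70, 14, 10),
        rng.normal(81, 8, 6),
    ]),
})

# Games-Howell pairwise comparisons do not assume equal variances or equal
# group sizes; the p-value column of the result flags significant differences.
posthoc = pg.pairwise_gameshowell(data=df, dv="hit_rate", between="visualization")
print(posthoc)
```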
Focus level during the experiment (see question 2 in Table III): As another confounding variable, self-perception of the level of focus was evaluated in the post-interview questionnaire. The Games-Howell test does not show a significant difference between the self-perceived focus level and either the training phase (GoF) or the target acquisition phase (hit rate) outcome measures.

Duration of the experiment (see question 3 in Table III): The participants were asked to assess the duration of the experiment. Except for three, who reported that the experiment was too long, all other participants reported that the duration was short and acceptable.

Level of controllability over the cursor (see question 4 in Table III): As another confounding variable, the level of controllability over the cursor was evaluated in the post-interview questionnaire. Aside from some preliminary and incomplete studies [48], to the authors' best knowledge, this study is the first to evaluate self-perceived controllability of an assistive BCI in the cursor control paradigm. Fig. 11 shows the boxplot of hit rate for different levels of self-assessed controllability (no one responded "extremely bad" or "extremely good"; "moderately bad" was renamed "poor", "slightly bad", "neither bad nor good", and "slightly good" were combined as "fair", and "moderately good" was renamed "good" in Fig. 11). There was no significant difference in hit rate between different levels of self-perceived controllability. There has been a debate in the literature that a high hit rate might be associated with random hitting [3]. While we cannot fully rule out the effect of randomness, this post-interview question showed that most participants perceived a moderate to high level of controllability over the cursor.

Eye movement effect (see question 5 in Table III): The impact of various artifacts, especially eye movement, has been a valid question for all types of BCIs, particularly assistive BCIs. With current technology, real-time removal of different artifacts may not always be a viable option, which poses a challenge and may impact the ultimate outcome. We were therefore interested in evaluating the self-perceived impact of eye movement on the controllability of the cursor. The participants were allowed to have natural eye movement, as they were allowed to follow the cursor with their eyes. Two out of twenty-eight participants reported that eye movement did not contribute at all to cursor controllability (mean [STD] hit rate of 77.50 [7.07]). The other 26 participants (mean [STD] hit rate of 65.66 [17.61]) reported that eye movement played a positive role and enhanced the controllability of the cursor. While some studies [3] allowed participants to move their eyes during IBK experiments, other studies [10] asked participants to suppress eye movement during the BCI experiment. We believe that whether or not eye movement helps with cursor controllability, it is impractical to ask users to fully avoid it in real-life scenarios. Indeed, eye movement appears as a high-amplitude signal projected onto the recorded EEG and has a distinctive independent component. Similar studies showed the impact of the electrooculogram (EOG), particularly when using linear models [11, 45]. Our results also suggest a positive role of eye movement in cursor controllability, which is consistent with the findings in the literature [11, 45].

Difficulty of imaginary hand movement (see question 6 in Table III): The participants were asked to indicate the difficulty of imagining hand movement in the training and target acquisition phases.
All participants reported a moderate to high difficulty while engaged in the motor imagery task. We believe the reason may be the short training time they had to modulate their brain activities. However, the Games-Howell test does not reveal a significant difference between the self-perceived difficulty of imaginary hand movement and either the training phase (GoF) or the target acquisition phase (hit rate) outcome measures.

Physical feeling after the experiment (see question 7 in Table III): Among all 28 participants, 26 reported that the experiment did not lead to any level of discomfort.

Level of comfort with the lab environment and headset (see question 8 in Table III): User comfort has always been a contributing factor in a BCI and its acceptance by users, and it includes the comfort of the EEG headset during the BCI task. All participants reported that the lab environment was comfortable. Only 5 out of 28 participants reported that the headset was uncomfortable for them.

4) Correlation between performances in the training phase and target acquisition phase

We investigated the relationship between the outcome measures of the training phase (GoF) and the target acquisition phase (hit rate) to evaluate how the training phase may be correlated with the testing phase. Fig. 12 illustrates the correlation between hit rate and GoF in the horizontal and vertical directions, both separately and combined. The plots suggest a positive correlation between the outcome measures of the two phases. However, a few participants exhibited high accuracy in terms of GoF in the training phase but low controllability in terms of hit rate in the target acquisition phase, which may suggest the existence of a phenomenon known as BCI illiteracy [49, 50].

Fig. 12. Correlation between outcome measures of the training (GoF) and target acquisition (hit rate) phases in (a) the horizontal direction, (b) the vertical direction, and (c) both directions combined. The correlation coefficient is 0.30 in (a), 0.37 in (b), and 0.25 in (c).

IV. DISCUSSION

We observed higher accuracy in the prediction of participants' kinematics in the horizontal compared to the vertical direction (Fig. 3) during the training phase. One possible explanation is that eye blinks, as a form of vertical eye movement, may interfere with the model and be interpreted as actual cursor kinematics during the training of vertical cursor movement, while the same is not true for horizontal movement. The kinematic direction of an eye blink is almost orthogonal to horizontal eye movement, so it is less likely to interfere with the actual cursor movement in the linear model [45]. Also, the hit rate collected in the target acquisition phase (Table IV) is not closely correlated with the outcome of the training phase. We attribute this to the inherent difference in the requested task and the associated visual stimulation between the training and target acquisition phases: the participants were trained to imagine moving the cursor in the horizontal and vertical directions separately, but they were asked to imagine moving the cursor in two dimensions concurrently in the target acquisition phase. This distinction between the training and testing phases may explain the weakly correlated outcome measures of the two phases. Also, it has been suggested that training is a reciprocal and bidirectional process [51].
Although having a personalized model is pivotal for a BCI platform, user adaptation and motor learning usually take place over multiple experimental sessions [51]. Finally, using the target hit rate as a measure of controllability for testing mind-controlled cursor control may be debatable, since it is possible to attain a certain target hit rate without necessarily having adequate controllability over the cursor. We tried to address this issue by including a questionnaire item that specifically collects this information; resolving the issue remains important for future work.

Eye movement and muscle activity: One of the main questions in BCI is to what extent different independent components, including neural activity, eye movement, and muscle activity, may contribute to the experiment outcome. Eye movement artifacts can be removed using approaches such as Independent Component Analysis (ICA) [52]. However, applying ICA in a real-time cursor control experiment may be impractical because the method requires multiple iterations to converge, which is not feasible in real time. In the present study, we asked the participants to avoid overt body movement during the training and target acquisition phases to reduce the impact of non-neural activities. However, there is an undeniable contribution of different neural and non-neural components: the activity measured over each EEG channel is a linear mixture of distinct cortical and non-cortical sources. We utilized EEGLAB [53] for channel- and source-space processing. A high-pass filter with a cut-off frequency of 0.3 Hz was applied to remove the baseline drift and to reduce the ICA bias towards high amplitudes. We applied the Infomax ICA method [54] to extract independent contributing signal sources. Then, we applied the ICLabel plug-in [55] to classify the derived sources into neural activity, EOG, and other classes, including muscle activity, heartbeat, line noise, and channel noise. It is noteworthy that we cannot ensure the absolute removal of EOG using the ICA algorithm; also, the precision of ICLabel in classifying the ICs is not guaranteed. Fig. 13 illustrates the post-processing pipeline used to evaluate the impact of the neural/EOG components on the model predictability. Fig. 14 shows sample estimated heatmaps of independent components for neural activity, vertical eye movement, and horizontal eye movement. We evaluated the predictability of 1) EEG signals with the EOG components removed, 2) only the neural and EOG components, and 3) the raw EEG signals. Figs. 15 and 16 show the predictability of the MLR model with the different signal sources for horizontal and vertical cursor movements, respectively. Our results suggest that the MLR model relies heavily on the EOG-related components: removing eye movement significantly decreases the predictability of the model (see Fig. 15 and Fig. 16). Comparing the predictability obtained from the neural plus EOG-related activity with that from the EOG-excluded activity shows a substantial increase in decoding accuracy when EOG is retained, illustrating to what extent EOG may contribute to decoding imagined body kinematics when a linear model is employed. Kim and colleagues observed a similar outcome with a linear regression model [56]. Comparing the performance of the MLR model on the raw EEG signals with that on the neural plus EOG and the EOG-excluded activities further suggests that cursor kinematics (here, velocity) are not encoded solely in the neural and EOG components; other signal components may also carry information. This finding may imply the need to investigate nonlinear modeling of EEG signals in future work.
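The offline pipeline above was implemented in EEGLAB (MATLAB). A comparable sketch in Python, assuming the MNE and mne-icalabel packages, is shown below; the file name, montage, and the component labels retained or excluded are assumptions of this sketch, and mne-icalabel expects specific preprocessing (e.g., 1–100 Hz filtering and an average reference), so this is only an approximation of the EEGLAB/ICLabel procedure, not the authors' exact method.

```python
import mne
from mne.preprocessing import ICA
from mne_icalabel import label_components

# Hypothetical training-phase recording from the 14-channel Emotiv EPOC.
raw = mne.io.read_raw_fif("sub01_training_raw.fif", preload=True)
raw.set_montage("standard_1020", on_missing="ignore")

# High-pass to remove baseline drift and reduce the ICA bias toward
# low-frequency, high-amplitude activity (the paper used a 0.3 Hz cut-off).
raw.filter(l_freq=0.3, h_freq=None)

# Extended Infomax ICA, one component per channel.
ica = ICA(n_components=14, method="infomax",
          fit_params=dict(extended=True), random_state=97)
ica.fit(raw)

# ICLabel-style classification of components (brain, eye blink, muscle, ...).
labels = label_components(raw, ica, method="iclabel")["labels"]
eog_like = [i for i, lab in enumerate(labels) if lab == "eye blink"]

# Reconstruct the EEG with (a) EOG-like components removed and
# (b) only brain plus EOG-like components retained, for the GoF comparison.
raw_no_eog = ica.apply(raw.copy(), exclude=eog_like)
keep = [i for i, lab in enumerate(labels) if lab in ("brain", "eye blink")]
raw_brain_eog = ica.apply(raw.copy(), include=keep)
```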
Fig. 13. Schematic of the post-processing pipeline used to evaluate the impact of the neural/EOG components on the model predictability.

Fig. 14. Estimated independent components of a) neural activity, b) vertical eye movement, and c) horizontal eye movement from a sample participant.

Fig. 15. Violin plot of Goodness of Fit (%) for horizontal cursor velocity showing the contribution of different components.

Fig. 16. Violin plot of Goodness of Fit (%) for vertical cursor velocity showing the contribution of different components.

Comparison of the usability with previous BCI studies: The usability of a BCI setup can be measured with different objective and subjective measures. Task accuracy and information transfer rate are the most common objective measures, while satisfaction, cognitive workload, and ease of use are the most common subjective measures [20]. There are a few usability studies on BCI-controlled applications. Using an SMR-based MI experiment, Nijboer et al. [57] investigated the effect of motivational factors such as confidence and fear of incompetence on BCI performance among individuals with amyotrophic lateral sclerosis (ALS). Although they could not show that motivational factors are always correlated with BCI performance, they found that the longer training duration and lower information transfer rate of SMR-based MI compared to P300-based BCIs may adversely impact the user's motivation. With a user-centered perspective, Kübler et al. [21] conducted a comprehensive usability evaluation of various communication and assistive BCI applications. The work also includes suggestions on the type of questionnaire and the associated BCI applications. They queried user satisfaction on a Likert-type scale. The results showed that the users favored the P300-based BCI over the SMR-based MI BCI due to its learnability. However, the users reported equal satisfaction with both paradigms when using the BCI for entertainment, despite the lower effectiveness (accuracy) and efficiency (information transfer rate) of the SMR-based BCI. An inherent limitation of using a questionnaire is the susceptibility to self-perception, interviewer/observer, and recall biases. Although we attempted to minimize the impact of these biases, a case-control design may help address this limitation further. We may conduct a case-control study in the future, using random kinematics for the computer cursor in the control group, to better examine exposure to different confounding variables in tandem with the experiment outcome (controllability over the cursor).

V. CONCLUSION

Even though noninvasive assistive BCIs have recently gained momentum, the opportunities and limitations of the technology are yet to be understood. Since decoding brain activity into two concurrent dimensions using 2D training is arduous, we designed a simple, efficient training protocol with separate training in the horizontal and vertical directions to minimize the training time.
As shown in Table I, the success rate (target hit rate) in the current work (66.5%) is comparable to the previous IBK study (73%) [3]. The main difference between our work and other relevant works in terms of efficiency may be related to the EEG headset: unlike previous BCI studies with EEG headsets that typically have many sensors (32 or 64 electrodes) and a higher sampling rate, we were interested in using a convenient, wireless, portable headset with only 14 electrodes. Observations showed that participants with a higher self-perceived ability of imaginary movement could gain higher controllability over the cursor to hit the targets. Due to the limited sample size and the lack of a case-control design, the current study may not be able to explain the user-to-user variability and the possible motor learning effect, which are important topics to investigate in future research. A longitudinal case-control design in the future may better describe the potential user adaptation effect. The ICA analysis also suggests a significant contribution of eye movement to the model predictability, revealed by a significant drop in GoF from an average of 80% to 20% when the EOG components were removed. We consider such an influence from eye movement a limitation of the current study. We also conducted a preliminary study to evaluate the possibility of transfer learning among the population [41]. The findings show the potential to advance the BCI literature towards a transparent guideline for developers and to standardize prospective BCI platforms.

TABLE V. Mean [STD] of hit rate versus self-reported controllability over the left, right, top, and bottom targets, corresponding to Fig. 11.
Target | Poor | Fair | Good | Significance
Left | 50.00 [28.28] | 66.30 [23.22] | 80.00 [17.89] | Yes (p-value < 0.05)
Right | 64.44 [28.74] | 65.16 [25.13] | 68.00 [14.83] | No
Top | 70.00 [14.14] | 67.61 [27.54] | 66.67 [11.55] | No
Bottom | 66.67 [25.17] | 55.83 [28.53] | 86.25 [19.96] | Yes (p-value < 0.05)

VI. Acknowledgment

The authors would like to thank the students in the Iranian Students Association of UTK. Their participation was crucial for the testing and improvement of the BCI platform used in this work. This work was supported in part by a NeuroNET seed grant and in part by the NIH under grants AG028383 and UL1TR000117.

Contributor Information

Reza Abiri, Dept. of Neurology at University of California, San Francisco/Berkeley and Dept. of Mechanical, Aerospace, and Biomedical Engineering at the University of Tennessee, Knoxville.
Soheil Borhani, Department of Mechanical, Aerospace, and Biomedical Engineering, The University of Tennessee, Knoxville, TN 37996 USA.
Justin Kilmarx, Department of Mechanical, Aerospace, and Biomedical Engineering, The University of Tennessee, Knoxville, TN 37996 USA.
Connor Esterwood, College of Communication and Information at the University of Tennessee, Knoxville, TN, USA.
Yang Jiang, Department of Behavioral Science, College of Medicine, University of Kentucky, Lexington, KY, USA.
Xiaopeng Zhao, Department of Mechanical, Aerospace, and Biomedical Engineering, University of Tennessee, Knoxville, TN 37996 USA.

References

[1] Wolpaw JR and McFarland DJ, "Control of a two-dimensional movement signal by a noninvasive brain-computer interface in humans," Proceedings of the National Academy of Sciences of the United States of America, vol. 101, no. 51, pp. 17849–17854, 2004.
[2] McFarland DJ, Sarnacki WA, and Wolpaw JR, "Electroencephalographic (EEG) control of three-dimensional movement," Journal of Neural Engineering, vol. 7, no. 3, p. 036007, June 2010, doi: 10.1088/1741-2560/7/3/036007.
[3] Bradberry TJ, Gentili RJ, and Contreras-Vidal JL, "Fast attainment of computer cursor control with noninvasively acquired brain signals," Journal of Neural Engineering, vol. 8, no. 3, p. 036010, 2011.
[4] Abiri R, Heise G, Schwartz F, and Zhao X, "EEG-based control of a unidimensional computer cursor using imagined body kinematics," in Biomedical Engineering Society Annual Meeting (BMES 2015), 2015.
[5] Kilmarx J, Abiri R, Borhani S, Jiang Y, and Zhao X, "Sequence-based manipulation of robotic arm control in brain machine interface," International Journal of Intelligent Robotics and Applications, pp. 1–12, 2018.
[6] Bai O, Lin P, Huang D, Fei D-Y, and Floeter MK, "Towards a user-friendly brain–computer interface: initial tests in ALS and PLS patients," Clinical Neurophysiology, vol. 121, no. 8, pp. 1293–1303, 2010.
[7] Abiri R, Kilmarx J, Borhani S, Zhao X, and Jiang Y, "A Brain-Machine Interface for a Sequence Movement Control of a Robotic Arm," in Society for Neuroscience (SfN 2017), 2017.
[8] Xia B et al., "A combination strategy based brain–computer interface for two-dimensional movement control," Journal of Neural Engineering, vol. 12, no. 4, p. 046021, 2015.
[9] Abiri R, Borhani S, Sellers EW, Jiang Y, and Zhao X, "A comprehensive review of EEG-based brain-computer interface paradigms," Journal of Neural Engineering, 2018.
[10] Ofner P and Müller-Putz GR, "Using a noninvasive decoding method to classify rhythmic movement imaginations of the arm in two planes," IEEE Transactions on Biomedical Engineering, vol. 62, no. 3, pp. 972–981, 2015.
[11] Kim J-H, Biessmann F, and Lee S-W, "Decoding Three-Dimensional Trajectory of Executed and Imagined Arm Movements from Electroencephalogram Signals," 2014.
[12] Hochberg LR et al., "Neuronal ensemble control of prosthetic devices by a human with tetraplegia," Nature, vol. 442, no. 7099, pp. 164–171, 2006.
[13] Bradberry TJ, Gentili RJ, and Contreras-Vidal JL, "Decoding three-dimensional hand kinematics from electroencephalographic signals," in Annual International Conference of the IEEE Engineering in Medicine and Biology Society, 2009, pp. 5010–5013, doi: 10.1109/iembs.2009.5334606.
[14] Ofner P and Müller-Putz GR, "Decoding of velocities and positions of 3D arm movement from EEG," in Annual International Conference of the IEEE Engineering in Medicine and Biology Society, 2012, pp. 6406–6409, doi: 10.1109/embc.2012.6347460.
[15] Borhani S, Kilmarx J, Saffo D, Ng L, Abiri R, and Zhao X, "Optimizing Prediction Model for a Noninvasive Brain Computer Interface Platform using Channel Selection, Classification and Regression," IEEE Journal of Biomedical and Health Informatics, 2019.
[16] Thomas KP, Guan C, Lau CT, Vinod A, and Ang KK, "Adaptive tracking of discriminative frequency components in electroencephalograms for a robust brain–computer interface," Journal of Neural Engineering, vol. 8, no. 3, p. 036007, 2011.
[17] Das A, Suresh S, and Sundararajan N, "A discriminative subject-specific spatio-spectral filter selection approach for EEG based motor-imagery task classification," Expert Systems with Applications, vol. 64, pp. 375–384, 2016.
[18] Lv J, Li Y, and Gu Z, "Decoding hand movement velocity from electroencephalogram signals during a drawing task," Biomedical Engineering Online, vol. 9, p. 64, 2010, doi: 10.1186/1475-925x-9-64.
[19] Zhang J, Wei J, Wang B, Hong J, and Wang J, "Nonlinear EEG Decoding Based on a Particle Filter Model," BioMed Research International, vol. 2014, 2014.
[20] Choi I, Rhiu I, Lee Y, Yun MH, and Nam CS, "A systematic review of hybrid brain-computer interfaces: Taxonomy and usability perspectives," PLoS One, vol. 12, no. 4, p. e0176674, 2017.
[21] Kübler A et al., "The user-centered design as novel perspective for evaluating the usability of BCI-controlled applications," PLoS One, vol. 9, no. 12, p. e112392, 2014.
[22] Barbero Á and Grosse-Wentrup M, "Biased feedback in brain-computer interfaces," Journal of NeuroEngineering and Rehabilitation, vol. 7, no. 1, p. 34, 2010.
[23] Plass-Oude Bos D, Gürkök H, Van de Laar B, Nijboer F, and Nijholt A, "User experience evaluation in BCI: mind the gap!," 2011.
[24] Volosyak I, Valbuena D, Luth T, Malechka T, and Graser A, "BCI demographics II: How many (and what kinds of) people can use a high-frequency SSVEP BCI?," IEEE Transactions on Neural Systems and Rehabilitation Engineering, vol. 19, no. 3, pp. 232–239, 2011.
[25] Kleih S, Nijboer F, Halder S, and Kübler A, "Motivation modulates the P300 amplitude during brain–computer interface use," Clinical Neurophysiology, vol. 121, no. 7, pp. 1023–1031, 2010.
[26] Wolpaw JR and McFarland DJ, "Multichannel EEG-based brain-computer communication," Electroencephalography and Clinical Neurophysiology, vol. 90, no. 6, pp. 444–449, 1994, doi: 10.1016/0013-4694(94)90135-X.
[27] Trejo LJ, Rosipal R, and Matthews B, "Brain-computer interfaces for 1-D and 2-D cursor control: designs using volitional control of the EEG spectrum or steady-state visual evoked potentials," IEEE Transactions on Neural Systems and Rehabilitation Engineering, vol. 14, no. 2, pp. 225–229, 2006.
[28] Allison BZ, Brunner C, Altstätter C, Wagner IC, Grissmann S, and Neuper C, "A hybrid ERD/SSVEP BCI for continuous simultaneous two dimensional cursor control," Journal of Neuroscience Methods, vol. 209, no. 2, pp. 299–307, 2012.
[29] Li Y et al., "An EEG-based BCI system for 2-D cursor control by combining Mu/Beta rhythm and P300 potential," IEEE Transactions on Biomedical Engineering, vol. 57, no. 10, pp. 2495–2505, 2010.
[30] Long J, Li Y, Yu T, and Gu Z, "Target selection with hybrid feature for BCI-based 2-D cursor control," IEEE Transactions on Biomedical Engineering, vol. 59, no. 1, pp. 132–140, 2012.
[31] Kayagil TA et al., "A binary method for simple and accurate two-dimensional cursor control from EEG with minimal subject training," Journal of NeuroEngineering and Rehabilitation, vol. 6, no. 1, p. 1, 2009.
[32] Huang D, Lin P, Fei D-Y, Chen X, and Bai O, "Decoding human motor activity from EEG single trials for a discrete two-dimensional cursor control," Journal of Neural Engineering, vol. 6, no. 4, p. 046005, 2009.
[33] Bi L, Lian J, Jie K, Lai R, and Liu Y, "A speed and direction-based cursor control system with P300 and SSVEP," Biomedical Signal Processing and Control, vol. 14, pp. 126–133, 2014.
[34] Fakhruzzaman MN, Riksakomara E, and Suryotrisongko H, "EEG wave identification in human brain with Emotiv EPOC for motor imagery," Procedia Computer Science, vol. 72, pp. 269–276, 2015.
[35] Kranczioch C, Zich C, Schierholz I, and Sterr A, "Mobile EEG and its potential to promote the theory and application of imagery-based motor rehabilitation," International Journal of Psychophysiology, vol. 91, no. 1, pp. 10–15, 2014.
[36] Duvinage M et al., "A P300-based quantitative comparison between the Emotiv Epoc headset and a medical EEG device," Biomedical Engineering, vol. 765, no. 1, pp. 2012–2764, 2012.
[37] Schalk G, McFarland DJ, Hinterberger T, Birbaumer N, and Wolpaw JR, "BCI2000: a general-purpose brain-computer interface (BCI) system," IEEE Transactions on Biomedical Engineering, vol. 51, no. 6, pp. 1034–1043, 2004.
[38] Graimann B, Allison BZ, and Pfurtscheller G, Brain-Computer Interfaces: Revolutionizing Human-Computer Interaction. Springer Science & Business Media, 2010.
[39] Schalk G, McFarland DJ, Hinterberger T, Birbaumer N, and Wolpaw JR, "BCI2000: a general-purpose brain-computer interface (BCI) system," IEEE Transactions on Biomedical Engineering, vol. 51, no. 6, pp. 1034–1043, 2004.
[40] Abiri R, McBride J, Zhao X, and Jiang Y, "A real-time brainwave based neuro-feedback system for cognitive enhancement," in ASME 2015 Dynamic Systems and Control Conference (Columbus, OH), 2015.
[41] Borhani S, Abiri R, Zhao X, and Jiang Y, "A Transfer Learning Approach towards Zero-training BCI for EEG-Based Two Dimensional Cursor Control," in Society for Neuroscience (SfN 2017), 2017.
[42] Abiri R, Borhani S, Zhao X, and Jiang Y, "Real-time brain machine interaction via social robot gesture control," in ASME 2017 Dynamic Systems and Control Conference, American Society of Mechanical Engineers, 2017, pp. V001T37A002.
[43] Borhani S, Yu J, Cate J, Kilmarx J, Abiri R, and Zhao X, "Clash of Minds: A BCI Car Racing Game in Simulated Virtual Reality Environment," in 2018 Biomedical Engineering Society (BMES) Annual Meeting, 2018.
[44] Saffo D, Kilmarx J, Borhani S, Abiri R, Zhao X, and Albert M, "Convolutional Neural Networks for a Cursor Control Brain Computer Interface," in 2018 Biomedical Engineering Society (BMES) Annual Meeting, 2018.
[45] Úbeda A, Azorín JM, Chavarriaga R, and Millán J d R, "Evaluating decoding performance of upper limb imagined trajectories during center-out reaching tasks," in 2016 IEEE International Conference on Systems, Man, and Cybernetics (SMC), IEEE, 2016, pp. 000252–000257.
[46] Lv J, Li Y, and Gu Z, "Decoding hand movement velocity from electroencephalogram signals during a drawing task," Biomedical Engineering Online, vol. 9, no. 1, p. 64, 2010.
[47] Abiri R, Zhao X, Heise G, Jiang Y, and Abiri F, "Brain computer interface for gesture control of a social robot: An offline study," in 2017 Iranian Conference on Electrical Engineering (ICEE), IEEE, 2017, pp. 113–117.
[48] Evans N, Gale S, Schurger A, and Blanke O, "Visual feedback dominates the sense of agency for brain-machine actions," PLoS One, vol. 10, no. 6, p. e0130019, 2015.
[49] Kindermans P-J, Schreuder M, Schrauwen B, Müller K-R, and Tangermann M, "True zero-training brain-computer interfacing–an online study," PLoS One, vol. 9, no. 7, p. e102504, 2014.
[50] Abiri R, Kilmarx J, Raji M, and Zhao X, "Planar Control of a Quadcopter Using a Zero-Training Brain Machine Interface Platform," in Biomedical Engineering Society Annual Meeting (BMES 2016), 2016.
[51] McFarland DJ and Wolpaw JR, "Brain-computer interfaces for communication and control," Communications of the ACM, vol. 54, no. 5, pp. 60–66, 2011, doi: 10.1145/1941487.1941506.
[52] Mennes M, Wouters H, Vanrumste B, Lagae L, and Stiers P, "Validation of ICA as a tool to remove eye movement artifacts from EEG/ERP," Psychophysiology, vol. 47, no. 6, pp. 1142–1150, 2010.
[53] Delorme A and Makeig S, "EEGLAB: an open source toolbox for analysis of single-trial EEG dynamics including independent component analysis," Journal of Neuroscience Methods, vol. 134, no. 1, pp. 9–21, 2004.
[54] Langlois D, Chartier S, and Gosselin D, "An introduction to independent component analysis: InfoMax and FastICA algorithms," Tutorials in Quantitative Methods for Psychology, vol. 6, no. 1, pp. 31–38, 2010.
[55] Pion-Tonachini L, Makeig S, and Kreutz-Delgado K, "Crowd labeling latent Dirichlet allocation," Knowledge and Information Systems, vol. 53, no. 3, pp. 749–765, 2017.
[56] Kim J-H, Bießmann F, and Lee S-W, "Decoding three-dimensional trajectory of executed and imagined arm movements from electroencephalogram signals," IEEE Transactions on Neural Systems and Rehabilitation Engineering, vol. 23, no. 5, pp. 867–876, 2015.
[57] Nijboer F, Birbaumer N, and Kübler A, "The influence of psychological state and motivation on brain–computer interface performance in patients with amyotrophic lateral sclerosis–a longitudinal study," Frontiers in Neuroscience, vol. 4, 2010.
