
# Neurobehavioural signatures in race car driving: a case study

## Abstract

Recent technological developments in mobile brain and body imaging are enabling new frontiers of real-world neuroscience. Simultaneous recordings of body movement and brain activity from highly skilled individuals, as they demonstrate their exceptional skills in real-world settings, can shed new light on the neurobehavioural structure of human expertise. Driving is a real-world skill which many of us acquire to different levels of expertise. Here we ran a case study on a subject with the highest level of driving expertise: a Formula E Champion. We studied the driver’s neural and motor patterns while he drove a sports car on the “Top Gear” race track under extreme conditions (high speed, low visibility, low temperature, wet track). His brain activity, eye movements and hand/foot movements were recorded. Brain activity in the delta, alpha, and beta frequency bands showed a causal relation to hand movements. We herein demonstrate the feasibility of using mobile brain and body imaging even in very extreme conditions (race car driving) to study the sensory inputs, motor outputs, and brain states which characterise complex human skills.

## Introduction

One of the hallmarks of being human is our unique ability to develop skills and expertise. While all animals develop skills like walking, running, fruit picking or hunting, we as humans can develop a much broader and more diverse set of skills. With practice, most of us can learn to play a musical instrument, play a sport, or do arts and crafts. Nevertheless, only some of us can reach the highest level of expertise. Contrary to the widespread view that this is entirely driven by practice1, there is accumulating evidence that practice is not enough2, making individual musicians, artists, athletes, and craftspeople who take this expertise to new heights of particular research interest. Novel technology for mobile brain and body imaging now enables us to study neurobehaviour in real-world settings3,4,5. When carried out in natural environments, these measures can enable a meaningful understanding of human behaviour while performing real-life tasks. Studying the relations and inter-dependencies between brain activity and body movements of experts, while they perform their expert skills in real-world settings, can enable us to unpack this enigma.

## Methods

### Experimental setup

The experiment took place at the Dunsfold Aerodrome (Surrey, UK), commonly known as the Top Gear race track. The driver (co-author) was the Formula E champion, Lucas Di Grassi (Audi Sport ABT Schaeffler team), with over 15 years of professional racing experience, which includes karting, Formula 3, Formula One and Formula E racing. The participant, LDG, is an author on this paper and gave informed consent to participate in the experiment and to publish the information and images in an online open-access publication. While all methods used in the study were approved by the Imperial College Research Ethics Committee and performed in accordance with the Declaration of Helsinki, this study was a self-experimentation by an author17. The test drive was prescheduled for video production purposes by the racing team, who race in these conditions frequently; it gave us a unique opportunity for scientific observation of motor expertise in the wild. Although unlikely to be needed, emergency response units were present. A promo video of the film by Averner Films18 is accessible here: https://vimeo.com/248167533.

The driver was equipped with: a 32-channel wireless EEG system (LiveAmp, Brain Products GmbH, Germany) with dry electrodes (actiCAP Xpress Twist, Brain Products GmbH, Germany); binocular eye-tracking glasses (SMI ETG 2W A, SensoMotoric Instruments, Germany); and four inertial measurement units (IMUs) on his hands and feet (MTw Awinda, Xsens Technologies BV, The Netherlands); shown in Fig. 1A. The car was equipped with a GPS and a camera recording the inside of the car. The car's driver assistance systems were turned off. The full architecture of the experimental setup is presented in Fig. 1C.

The Top Gear race track has specific curve types that subject drivers to different driving challenges. In particular, in the south-west of the track (left side of the image in Fig. 1B) is the Hammerhead curve. As the name suggests, it is a hammerhead-shaped curve, designed to test cars and drivers’ skill, as it is technically challenging19. This is considered the critical curve of the track and was used in this work to assess the driver’s performance in challenging scenarios. This extreme curve was selected to highlight features of the driver’s neuromotor behaviour. The car was driven with a mean speed of 120 km/h and a top speed of 178 km/h on a track 2.82 km long with 12 curves. The experiment consisted of 6 sets of 2 to 3 laps each, stopping after every set for system check-up and recalibration.

### Data processing

All data were pre-processed using custom software written in MATLAB (R2017a, The MathWorks, Inc., MA, USA). During the recording in the car, all data streams were synchronised using the computer’s real-time clock (RTC). We verified the synchronisation offline and corrected for any drifts and offsets by computing temporal cross-correlations. Since the car acceleration had an equal effect on the accelerometers of the IMUs on the limbs, the built-in IMU of the GPS device, and the EEG headset, timestamps were synchronised across all data streams using cross-correlation between the accelerations recorded by all systems, and minor offsets between timestamps were corrected for each of the six sets. The gaze data were synced to the motion capture sensors (Xsens) using the video from the eye tracker’s egocentric camera, in which the driver’s hands are clearly visible. During the race, the driver was repeatedly in states of almost complete stillness (while driving on a straight path) which were followed by rapid hand movements (as the car started to slip on the road or drove into a sharp curve). We automatically detected these extreme peaks in hand acceleration, inspected the video frames around those peaks, and corrected for the minor offsets (of at most 33 ms, based on the 30 fps of the egocentric camera) between timestamps. The data streams were then segmented into laps, and laps with missing data in any stream were removed.
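The cross-correlation alignment step can be sketched as follows. This is an illustrative Python re-implementation (the original pipeline was custom MATLAB code); the function name and the 100 Hz default are assumptions.

```python
import numpy as np
from scipy.signal import correlate, correlation_lags

def estimate_offset(acc_a, acc_b, fs=100.0):
    """Estimate the time offset (in seconds) of stream b relative to
    stream a from their shared car-acceleration signature."""
    # Normalise so the peak is driven by shape, not amplitude.
    a = (acc_a - acc_a.mean()) / acc_a.std()
    b = (acc_b - acc_b.mean()) / acc_b.std()
    xcorr = correlate(a, b, mode="full")
    lags = correlation_lags(len(a), len(b), mode="full")
    # The lag at the correlation peak is the sample offset between streams.
    return lags[np.argmax(xcorr)] / fs
```

The returned offset can then be applied to one stream's timestamps to align it with the other.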

#### Car movement data

Car GPS data were obtained from an iPhone placed inside the car. This system’s acceleration and rotation recordings were resampled to a stable 100 Hz sampling rate, for timestamp synchronisation and for cleaning the body data. The data were filtered with a zero phase-lag fourth-order Butterworth filter with a 10 Hz cut-off frequency20,21.
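The filtering step might look like this in Python, as a sketch of the described zero phase-lag filter using SciPy's `filtfilt` (which runs the filter forward and backward to cancel phase lag); whether "fourth-order" refers to the design order, as assumed here, is not specified in the text.

```python
from scipy.signal import butter, filtfilt

def lowpass_zero_phase(x, fs=100.0, cutoff=10.0, order=4):
    """Zero phase-lag Butterworth low-pass, as used for the car and limb
    data. filtfilt applies the filter forward and backward, cancelling
    phase lag (and doubling the effective attenuation)."""
    b, a = butter(order, cutoff / (fs / 2.0), btype="low")
    return filtfilt(b, a, x)
```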

#### Body movement data

Body data were collected by 4 Xsens MTw Awinda IMUs placed on the hands and feet. These sensors recorded rotation and acceleration at a stable sampling rate of 100 Hz. Since the driver was in a moving car, the motion captured by the motion capture system was the combination of the car movement and the driver’s limb movements inside the car. Linear regression was used to remove the car movement (estimated from the GPS recordings) from the motion tracking. The regression’s residuals capture the movement of the limbs that does not fit the car movement, for rotation and acceleration data separately. The data were then filtered with a zero phase-lag fourth-order Butterworth filter with a 10 Hz cut-off frequency20,21.
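The regression-based removal of car motion can be sketched as follows. This is an illustrative Python version; the paper's MATLAB implementation and the exact regressors used are not specified, so a single car channel plus intercept is assumed here.

```python
import numpy as np

def remove_car_motion(limb, car):
    """Regress limb-sensor readings on the car's motion (plus an
    intercept) and return the residuals, i.e. the movement of the limb
    inside the car that does not fit the car movement."""
    X = np.column_stack([np.ones(len(car)), car])
    coef, *_ = np.linalg.lstsq(X, limb, rcond=None)
    return limb - X @ coef
```

By construction the residuals are uncorrelated with the car signal, which is exactly the property the cleaning step relies on.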

#### Gaze data

The eye gaze was acquired with binocular SMI ETG 2W A eye-tracking glasses, which use infrared light and a camera to track the position of the pupil. Using the pupil centre and the corneal reflection (CR) creates a vector in the image that can be mapped to coordinates22,23. The use of eye-tracking glasses simplifies the analysis of gaze targets for a freely moving head in a freely driving car, as the glasses are not calibrated to a fixed physical coordinate system but only to the device’s built-in egocentric camera, which captures the visual scene that the subject sees. The glasses and the camera move with the head, and thus no correction for head movements is needed. The egocentric camera captured the scene view and therefore enabled the reconstruction of the frame-by-frame gaze point in the real-world scene (e.g. visual relations of gaze target to the side of the road). The gaze data were collected at 120 Hz and included multiple standardised eye measurements, such as pupil size, eye position, point of regard, and gaze vector. When measuring gaze behaviour in-the-wild, fixations and saccades become under-specified concepts24. For example, a free-moving head tracking a location on a moving road involves smooth eye movements and saccades during fixation, which are considered mutually exclusive in fixed-head settings. Therefore, conventional definitions of these types of eye movements, and the conventional ways of analysing them, do not apply. Thus, our analysis focused on: the point of regard, obtained as the RMS of the binocular point of regard measured in the X and Y axes; the gaze vector, computed for the right and left eyes separately as the RMS of the X, Y, and Z components; and the change in gaze vector, calculated from two consecutive measurements. These measures were resampled to 100 Hz, and linear interpolation was applied to account for some missing data points in the gaze recording.
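A minimal sketch of these gaze measures, assuming an (n, 3) array of gaze vectors with per-sample timestamps; the function and variable names are illustrative, not from the paper.

```python
import numpy as np

def gaze_features(gaze_xyz, t, fs_out=100.0):
    """gaze_xyz: (n, 3) gaze vectors sampled at ~120 Hz with timestamps t.
    Returns the per-sample RMS over components and its frame-to-frame
    change, both linearly resampled onto a uniform fs_out grid."""
    rms = np.sqrt(np.mean(gaze_xyz ** 2, axis=1))   # RMS over X, Y, Z
    change = np.diff(rms, prepend=rms[0])           # consecutive-sample change
    t_out = np.arange(t[0], t[-1], 1.0 / fs_out)
    # Linear interpolation also bridges short gaps of missing samples.
    return np.interp(t_out, t, rms), np.interp(t_out, t, change)
```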

#### EEG data

The brain activity was recorded using a 32-channel EEG cap with dry electrodes (arranged according to the 10-20 system), sampled at a stable 250 Hz sampling rate. EEG data were analysed with the EEGLAB toolbox (https://sccn.ucsd.edu/eeglab)25. The processing steps included (i) high-pass filtering with a cutoff frequency of 1 Hz26,27,28; (ii) line noise removal at the selected frequencies of 60 Hz and 120 Hz26; (iii) removal of bad channels, defined as those with less than 60% correlation to their own reconstruction based on neighbouring channels26; (iv) re-referencing the EEG dataset to the average of all channels, to minimise the impact of a channel with bad contact on the variance of the entire dataset26; (v) Independent Component Analysis (ICA) to separate signal sources29,30,31; (vi) artefact removal using runica, the infomax ICA algorithm from EEGLAB, for identification and removal of head movement, eye movement, and blinking artefacts. The algorithm searches for maximal statistical independence between sources. Artefact sources were identified based on a predefined set of criteria for scalp topographies and spectrum analysis (e.g. a source located by the ears with a spiky, high-frequency power spectrum indicates movement artefacts; a source located between the eyes suggests eye-blink artefacts; a source located in the eye indicates lateral eye-movement artefacts)25. EEG data were then transformed into the time-frequency domain as power in decibels (dB) at 100 Hz (to match the other data streams), in the delta (δ) 0.5 to 4 Hz, theta (θ) 4 to 8 Hz, alpha (α) 8 to 12 Hz, and beta (β) 12 to 30 Hz frequency bands. The transformation was applied separately to the individual IC located over the left motor cortex and to the mean brain activity, averaged across the cleaned ICs.
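The final band-power step can be approximated as below. This Python sketch uses band-pass filtering plus the Hilbert envelope as a stand-in for the paper's (unspecified) time-frequency transform, so the exact numbers will differ; downsampling the result to 100 Hz would follow.

```python
import numpy as np
from scipy.signal import butter, filtfilt, hilbert

# Canonical band edges as given in the text.
BANDS = {"delta": (0.5, 4), "theta": (4, 8), "alpha": (8, 12), "beta": (12, 30)}

def band_power_db(eeg, fs=250.0):
    """Instantaneous band power (dB) per canonical band, via band-pass
    filtering and the Hilbert envelope (an illustrative stand-in for the
    paper's time-frequency transform)."""
    out = {}
    for name, (lo, hi) in BANDS.items():
        b, a = butter(4, [lo / (fs / 2), hi / (fs / 2)], btype="band")
        envelope = np.abs(hilbert(filtfilt(b, a, eeg)))
        # Small floor avoids log(0) on silent stretches.
        out[name] = 10.0 * np.log10(envelope ** 2 + 1e-12)
    return out
```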

## Results

This results section was written to characterise the neuromotor behaviour of a professional driver while driving in extreme conditions, which can be used as a reference point for future driving studies. The initial analysis aimed to understand the interdependencies between the different neurobehavioural data streams and their level of complexity. The analysis then focused on specific driving events, such as responses to challenging conditions (skidding, curves and straights), in order to assess whether behaviour is distinguishable at those moments. Lastly, we addressed causality across the different data streams.

### Data characteristics

The distributions of the car speed, the right-hand rotation, and the right-hand acceleration were assessed for the entire track, for the straight segments before and after the Hammerhead curve, and for the Hammerhead curve itself (Fig. 2). The average speed throughout the experiment was 120 km/h. As expected, in the curves speed was relatively low (between 54 and 82 km/h), while in straight segments speed was much higher with a broader distribution, as the car decelerated towards a curve and accelerated after it. Frames within the critical Hammerhead curve made up 11.3% of the total recording, and straight segments 25.5%.

Since the hand movements were highly correlated here, we show only the right hand. Both the gyroscope and accelerometer distributions show similar tendencies, with a narrow distribution during the straight segments and a slightly wider one in curves. The result for the whole dataset lies in between. The gyroscope values for the abrupt responses (skidding) have a mean of 3 rad/s, considerably higher than the 1 rad/s reported in the literature for normal forearm movement32, which is expected considering the intense car handling. Data considered abrupt responses correspond to 6% of the dataset.

Eye gaze data showed a strong tendency to track the tangent point of the curve, as illustrated in Fig. 3 (top), and as reported for non-expert drivers33. In the top figure, the gaze can be seen searching for the tangent of the curve (internal and external), marked in white on the road, where it remains throughout the entire curve. During straight segments, the gaze focuses straight ahead, at a stable distance on the horizon, with minor saccadic deviations, as illustrated in Fig. 3 (bottom). The heat map was built using data recorded during the critical curve (top) or the straights before and after that curve (bottom). The gaze point position from the egocentric view was annotated every ten frames in an overlapping position matrix. The heat map shows the percentage occurrence of annotations in this matrix.
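The heat-map construction can be sketched as follows (illustrative Python; the frame size, bin count, and ten-frame sampling cadence defaults are assumptions based on the description):

```python
import numpy as np

def gaze_heatmap(px, py, width=1280, height=960, bins=32, step=10):
    """Percentage-occurrence heat map of gaze points annotated every
    `step` frames on the egocentric video. Frame size and bin count are
    assumptions, not values from the paper."""
    x = np.asarray(px)[::step]
    y = np.asarray(py)[::step]
    hist, _, _ = np.histogram2d(x, y, bins=bins,
                                range=[[0, width], [0, height]])
    return 100.0 * hist / hist.sum()   # entries sum to 100%
```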

### Global dataset assessment

For an overall understanding of the interdependencies between the neurobehavioural data streams, a correlation matrix was computed (Fig. 4A). The variables considered were: the RMS of the acceleration and the rotation for the four limbs, the car, and the head; EEG band power for the mean brain activity and for the right-hand IC; and eye movement data including the point of regard, the gaze vectors, and the change in gaze vector for both eyes. The right and the left hands are strongly correlated, as expected, since the driver had both hands on the steering wheel. There are other correlations within domains (e.g. neighbouring band powers are correlated), and also weaker, but statistically significant, correlations between domains: most importantly, between the EEG band powers and the hand movements. We found a positive correlation of the hands’ acceleration and rotation with the alpha and beta power in the right-hand IC, and a negative correlation with the delta power (P < 0.001). Cross-correlation shows that while the delta band is synced with the movement, the neural signal in the alpha and beta bands precedes the movement by approximately 100 ms.

#### Granger causality

To address the sequence of causality between variables, we used Granger causality34. Granger causality is based on the idea that causes precede and help predict their effects. This technique tries to identify direct interactions between time series, in both the time and frequency domains. Granger causality was calculated using the MVGC multivariate Granger causality toolbox (mvgc_v1.0)35,36. The toolbox uses vector autoregressive modelling to find linear interdependencies between time series based on their past values. The Granger causality analysis was conducted on the hands’ acceleration and gyroscope rotation data, and on the EEG band powers from the IC of the right hand. The results are presented in Fig. 4B as a pairwise matrix of causality from causal sources to causal sinks (darker indicates more robust causality) for statistically significant causality links (p < 0.05 after Bonferroni correction). We found one-directional causality in which the EEG band powers cause the hand movements, especially delta and beta.

### Specific driving related events

Here we compared the neurobehavioural signature of the challenging Hammerhead curve with that of the straight segments leading into and out of it. Based on the GPS data and the egocentric videos, we annotated the segments of the critical Hammerhead curve in the track and the straight paths leading into and out of it. The correlation matrix of the hands’ IMU and EEG data streams for these segments shows differences in correlation structure between the two types of segments (Fig. 5A). While in the straight segments there was no correlation between the hands’ rotation and acceleration and the EEG power, in the curves there was a negative correlation between these brain and body measurements. Statistical analysis of the differences between the segments shows a decrease in delta power and an increase in alpha and beta power during the curve (Fig. 5B, C). The data from the segments before and after the curve also showed differences: delta power was lower before and higher after the curve, while alpha power showed the opposite trend. The eye gaze vector also changed more during the curves.
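A segment comparison of this kind can be sketched as below; the paper does not name its statistical test, so Welch's two-sample t-test is an assumption used here for illustration.

```python
import numpy as np
from scipy.stats import ttest_ind

def compare_segments(power_curve, power_straight):
    """Welch's t-test on band power between curve and straight segments
    (unequal variances assumed; the paper's exact test is unspecified)."""
    t, p = ttest_ind(power_curve, power_straight, equal_var=False)
    return t, p
```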

## Discussion

This work is an attempt to assess neurobehavioural signatures of real-world motor behaviour in-the-wild. We demonstrate the feasibility of simultaneously measuring brain activity, eye movements, and body movements in-the-wild under extreme conditions (race track driving), assessing neurobehavioural interdependencies and inferring causal relations. First, we have demonstrated that body movements, eye movements, and wireless EEG data can be collected in very extreme conditions in-the-wild. Second, we have demonstrated that these measurements are meaningful not merely as description, but in that they are predictive of each other and related to car performance and race track location. And third, we used these data to characterise the neuromotor behaviour of an expert while driving in extreme conditions, which can be used as a reference point for future driving studies. This proof of “possibility” will act as a foundation to enable further research in real-world settings.

Our results show changes in the EEG power and the gaze characteristics during sharp curves, where the control of the car is most challenging. While many previous studies found correlations between hand movements and brain activity in lab-based repeated-trials tasks37, here we show such correlations in continuous movement in-the-wild. Moreover, in a controlled lab experiment there is a clear trial order, where the timing of stimulus appearance, go-cue, etc. is well defined. Accordingly, the direction of the causality (if it exists) is clear: neural activity after a stimulus can be caused by it but cannot cause it, while neural activity before a movement can cause the movement but cannot be caused by it. In-the-wild, causal relationships may reverse or be bi-directional. Here we show not only the correlation but the causality from the brain activity to the body movement in an unconstrained setting. Interestingly, the EEG power changes are in line with previous results on general creative solution finding and interventions38.

Comparing the driver’s neurobehaviour between the sharp curves and the straight segments before and after them enabled us to assess world-championship-level skilled responses under extreme driving conditions. Our results suggest differences in the EEG power, point of regard, and gaze change vector between the different segments: before, during, and after sharp curves. The difference between the driving segments means that we can detect neurobehavioural differences between more and less demanding segments of the drive from ongoing in-the-wild EEG and gaze recordings. It suggests possible neurobehavioural metrics for task demand that would presumably differ with expertise. During the curves, the driver showed a power increase in the alpha and beta bands and a decrease in the delta band. The increase in alpha is potentially a signature of the increased creativity demand in these segments38,39. The alpha and beta power increases we observed are also in line with previous work showing an increase in left-hemisphere alpha and beta power of expert rifle shooters during the preparatory pre-shot period8.

Neurobehavioural data collection in-the-wild is subject to more noise sources and interference than standard data collection in-the-lab. This concern is particularly acute for the EEG signal, which is always contaminated by noise, and any EEG recording during movement is subject to movement and muscle artefacts. Thus, we find our cross-correlation and Granger causality results very encouraging, as they suggest that the EEG activity precedes and predicts the movement, and not the other way around. If the EEG results were simply movement artefacts, we would have expected to see the opposite causality: the movement would precede and predict the EEG movement artefact. Thus, since the EEG activity precedes the movement, we believe the EEG results cannot be rejected as noise artefacts.

The driver’s gaze during curves followed the tangent point of the curve, as suggested in the classic paper by Land and Lee33. During the straight segments, his gaze was entirely focused on the centre of the road, which led to greater gaze stability in straight segments relative to curves, though during both segment types the driver’s gaze was exceptionally stable, as illustrated in Fig. 3.

Being able to collect real-world data which capture a significant portion of the sensory input to the brain (visual scene and locus of attention), the motor output of the brain (hand, head and arm movements during driving), as well as the state of the brain (EEG signals), is a further realisation of our human ethomics approach. This is not only insightful for understanding the brain and its behaviour, but also for devising artificial intelligence to improve driving safety for autonomous and semi-autonomous cars. In recent work40, we demonstrated how human drivers in a virtual reality driving simulator generated gaze behaviour that we used to train a deep convolutional neural network to predict where a human driver was looking moment-by-moment. This human-like visual attention model enabled us to mask “irrelevant” visual information (where the human driver was not looking) and train a self-driving car algorithm to drive successfully. Crucially, our human-attention-based AI system learned to drive much faster and with a higher end-of-learning performance than AI systems that had no knowledge of human visual attention. The work we present here takes this en passant human annotation of skilled behaviour to the next level, by collecting real-world data of rich input and output characteristics of the brain. Similarly to the way we used drivers’ gaze in a driving simulator to train a self-driving car algorithm in that simulator, we can use rich neurobehavioural data from an expert driver in extreme conditions to train a real self-driving car algorithm to respond successfully in extreme conditions. Likewise, on the side of control systems, we have shown, for example, how ethomic data obtained from natural tasks (movement data41, electrophysiological data42, decision-making data43) can be harnessed to boost AI system performance. The neurobehavioural approach demonstrated here suggests how we may in future close the loop between person and vehicle.

In summary, we demonstrated the feasibility of studying the neurobehavioral signatures of real-world expertise in-the-wild. We showed evidence of specific brain activity and gaze patterns during driving in extreme conditions, in which, presumably, the expertise of the driver makes a crucial difference. Future work is required to generalise these findings from this single case study.

## References

1. Ericsson, K. A., Krampe, R. T. & Tesch-Römer, C. The role of deliberate practice in the acquisition of expert performance. Psychol. Rev.100, 363 (1993).

2. Hambrick, D. Z. et al. Deliberate practice: Is that all it takes to become an expert?. Intelligence45, 34–45 (2014).

3. Haar, S., van Assel, C. M. & Faisal, A. A. Kinematic signatures of learning that emerge in a real-world motor skill task. bioRxiv 612218, https://doi.org/10.1101/612218 (2019).

4. Haar, S. & Faisal, A. A. Neural biomarkers of multiple motor-learning mechanisms in a real-world task. bioRxiv 2020.03.04.976951, https://doi.org/10.1101/2020.03.04.976951 (2020).

5. Haar, S., Sundar, G. & Faisal, A. A. Embodied virtual reality for the study of real-world motor learning. bioRxiv 2020.03.19.998476, https://doi.org/10.1101/2020.03.19.998476 (2020).

6. Park, J. L., Fairweather, M. M. & Donaldson, D. I. Making the case for mobile cognition: EEG and sports performance. Neurosci. Biobehav. Rev. 52, 117–130 (2015).

7. Muraskin, J., Sherwin, J. & Sajda, P. Knowing when not to swing: EEG evidence that enhanced perception-action coupling underlies baseball batter expertise. NeuroImage 123, 1–10 (2015).

8. Janelle, C. M. et al. Expertise differences in cortical activation and gaze behavior during rifle shooting. J. Sport Exerc. Psychol.22, 167–182 (2000).

9. Cooke, A. et al. Preparation for action: Psychophysiological activity preceding a motor skill as a function of expertise, performance outcome, and psychological pressure. Psychophysiology51, 374–384 (2014).

10. Busso, C. & Jain, J. Advances in multimodal tracking of driver distraction. Digital Signal Process. In-Vehicle Syst. Saf.253–270, https://doi.org/10.1007/978-1-4419-9607-7_18 (2012).

11. Baldwin, C. L. et al. Detecting and Quantifying Mind Wandering during Simulated Driving. Front. Hum. Neurosci.11, 1–15. https://doi.org/10.3389/fnhum.2017.00406 (2017).

12. Lal, S. K., Craig, A., Boord, P., Kirkup, L. & Nguyen, H. Development of an algorithm for an EEG-based driver fatigue countermeasure. J. Saf. Res.34, 321–328. https://doi.org/10.1016/S0022-4375(03)00027-6 (2003).

13. Zhao, C., Zhao, M., Liu, J. & Zheng, C. Electroencephalogram and electrocardiograph assessment of mental fatigue in a driving simulator. Accid. Anal. Prev.45, 83–90. https://doi.org/10.1016/j.aap.2011.11.019 (2012).

14. Li, W., He, Q. C., Fan, X. M. & Fei, Z. M. Evaluation of driver fatigue on two channels of EEG data. Neurosci. Lett.506, 235–239. https://doi.org/10.1016/j.neulet.2011.11.014 (2012).

15. Li, G. & Chung, W. Y. Combined EEG-Gyroscope-tDCS brain machine interface system for early management of driver drowsiness. IEEE Trans. Hum.-Mach. Syst.48, 50–62. https://doi.org/10.1109/THMS.2017.2759808 (2017).

16. Borghini, G., Astolfi, L., Vecchiato, G., Mattia, D. & Babiloni, F. Measuring neurophysiological signals in aircraft pilots and car drivers for the assessment of mental workload, fatigue and drowsiness. Neurosci. Biobehav. Rev. 44, 58–75. https://doi.org/10.1016/j.neubiorev.2012.10.003 (2014).

17. Hanley, B. P., Bains, W. & Church, G. Review of scientific self-experimentation: Ethics history, regulation, scenarios, and views among ethics committees and prominent scientists. Rejuvenation Res.22, 31–42 (2019).

18. The Human Machine | Audi R8. An Averner Films project, featuring Audi Sport and Formula E Champion Lucas Di Grassi. https://www.averner.com/work.

19. BBC Top Gear Track Plan. https://www.bbc.co.uk/programmes/articles/1jckx859NGhPCNrL6vQD9Wl/track-plan.

20. Ostry, D. J., Cooke, J. D. & Munhall, K. G. Velocity curves of human arm and speech movements. Exp. Brain Res.68, 37–46. https://doi.org/10.1007/BF00255232 (1987).

21. Atkeson, C. G. & Hollerbach, J. M. Kinematic features of unrestrained vertical arm movements. J. Neurosci.5, 2318–2330 (1985).

22. Kogkas, A. A., Darzi, A. & Mylonas, G. P. Gaze-contingent perceptually enabled interactions in the operating theatre. Int. J. Comput. Assist. Radiol. Surg.12, 1131–1140. https://doi.org/10.1007/s11548-017-1580-y (2017).

23. Morimoto, C. H. & Mimica, M. R. Eye gaze tracking techniques for interactive applications. Comput. Vis. Image Underst.98, 4–24. https://doi.org/10.1016/j.cviu.2004.07.010 (2005).

24. Lappi, O. Eye movements in the wild: oculomotor control, gaze behavior & frames of reference. Neurosci. Biobehav. Rev.69, 49–68 (2016).

25. Delorme, A. & Makeig, S. EEGLAB: an open source toolbox for analysis of single-trial EEG dynamics including independent component analysis. J. Neurosci. Methods 134, 9–21 (2004).

26. Bigdely-Shamlo, N., Mullen, T., Kothe, C., Su, K.-M. & Robbins, K. A. The PREP pipeline: standardized preprocessing for large-scale EEG analysis. Front. Neuroinform. 9, 1–20. https://doi.org/10.3389/fninf.2015.00016 (2015).

27. Dias, N. S., Carmo, J. P., Mendes, P. M. & Correia, J. H. Wireless instrumentation system based on dry electrodes for acquiring EEG signals. Med. Eng. Phys.34, 972–981. https://doi.org/10.1016/j.medengphy.2011.11.002 (2012).

28. Gargiulo, G. et al. A new EEG recording system for passive dry electrodes. Clin. Neurophysiol. 121, 686–693 (2010).

29. Sahonero-Alvarez, G. & Calderón, H. A comparison of SOBI, FastICA, JADE and Infomax algorithms. Proceedings of the 8th International Multi-Conference on Complexity, Informatics and Cybernetics 17–22 (2017).

30. Lourenço, P. R., Abbott, W. W. & Faisal, A. A. Supervised EEG ocular artefact correction through eye-tracking. Biosyst. Biorobot.12, 99–113. https://doi.org/10.1007/978-3-319-26242-0_7 (2016).

31. Lee, T.-W., Girolami, M. & Sejnowski, T. J. Independent component analysis using an extended infomax algorithm for mixed subgaussian and supergaussian sources. Neural Comput. 11, 417–441. https://doi.org/10.1162/089976699300016719 (1999).

32. Hasan, Z. Optimized movement trajectories and joint stiffness in unperturbed, inertially loaded movements. Biol. Cybern.53, 373–382. https://doi.org/10.1007/BF00318203 (1986).

33. Land, M. F. & Lee, D. N. Where we look when we steer. Nature369, 742–744. https://doi.org/10.1038/369742a0 (1994).

34. Granger, C. W. Investigating causal relations by econometric models and cross-spectral methods. Econometrica 424–438 (1969).

35. Barnett, L. & Seth, A. K. The MVGC multivariate Granger causality toolbox: a new approach to Granger-causal inference. J. Neurosci. Methods 223, 50–68. https://doi.org/10.1016/j.jneumeth.2013.10.018 (2014).

36. Seth, A. K., Barrett, A. B. & Barnett, L. Granger causality analysis in neuroscience and neuroimaging. J. Neurosci.35, https://doi.org/10.1523/JNEUROSCI.4399-14.2015 (2015).

37. Morash, V., Bai, O., Furlani, S., Lin, P. & Hallett, M. Classifying EEG signals preceding right hand, left hand, tongue, and right foot movements and motor imageries. Clin. Neurophysiol. 119, 2570–2578 (2008).

38. Fink, A. & Benedek, M. EEG alpha power and creative ideation. Neurosci. Biobehav. Rev. 44, 111–123 (2014).

39. Fink, A., Graif, B. & Neubauer, A. C. Brain correlates underlying creative thinking: EEG alpha activity in professional vs. novice dancers. NeuroImage 46, 854–862 (2009).

40. Makrigiorgos, A., Shafti, A., Harston, A., Gerard, J. & Faisal, A. A. Human visual attention prediction boosts learning & performance of autonomous driving agents. arXiv preprint arXiv:1909.05003 (2019).

41. Belić, J. J. & Faisal, A. A. Decoding of human hand actions to handle missing limbs in neuroprosthetics. Front. Comput. Neurosci.9, 27 (2015).

42. Xiloyannis, M., Gavriel, C., Thomik, A. A. & Faisal, A. A. Gaussian process autoregression for simultaneous proportional multi-modal prosthetic control with natural hand kinematics. IEEE Trans. Neural Syst. Rehabil. Eng.25, 1785–1801 (2017).

43. Gottesman, O. et al. Guidelines for reinforcement learning in healthcare. Nat. Med.25, 16–18 (2019).

## Acknowledgements

First, we would like to acknowledge Alex Verner and Joel Verner (Averner Films) whose video production idea and work enabled this research. We thank them and the race team for inviting us to participate in their production and collect the data for this study. We also thank Pavel Orlov for his help with the gaze data. The study was enabled by financial support to a Royal Society-Kohn International Fellowship (NF170650; SH and AAF) and by eNHANCE (http://www.enhance-motion.eu) under the European Union’s Horizon 2020 research and innovation programme Grant Agreement No. 644000 (SH and AAF).

## Author information


### Contributions

A.A.F. conceived and designed the scientific study; L.D.G. performed the experiment; I.R.L. and A.A.F. acquired the data; I.R.L., A.A.F. and S.H. analyzed the data; I.R.L., S.H. and A.A.F. interpreted the data; I.R.L. and S.H. drafted the paper; I.R.L., S.H., L.D.G. and A.A.F. revised the paper.

### Corresponding author

Correspondence to A. Aldo Faisal.

## Ethics declarations

### Competing interests

The experimental data collected and analysed here were gathered as part of a promotional video production for Roborace, a racing competition for autonomously driving, electrically powered vehicles. L.D.G. is the CEO of Roborace. Averner Films was commissioned by Roborace to produce the video. L.D.G. and A.A.F. appeared in the video. I.R.L. and S.H. declare no competing financial interests. Within the domain of this paper, A.A.F. has consulted for Airbus, Averner Films and Celestial Group. A.A.F.’s lab has received, within the domain of this paper, research funding and donations from Microsoft and NVIDIA. This scientific publication was neither commissioned nor expected as part of the film production.

### Publisher's Note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.


Rito Lima, I., Haar, S., Di Grassi, L. et al. Neurobehavioural signatures in race car driving: a case study. Sci Rep 10, 11537 (2020). https://doi.org/10.1038/s41598-020-68423-2
