I am thrilled to share that our team won 3rd place in the data analysis track of the hackathon. I would like to express my gratitude to the organizers and my teammates for their support and contributions to this project. I would also like to thank all participating teams for sharing their insights and making this a great learning experience.
ECoG Video Watching Analysis: Analyze an ECoG dataset from an epilepsy patient watching a video. The ECoG was recorded from regions on the temporal base that code colors, black/white contrast, shapes, faces, and much more. Try to optimize the pre-processing, feature extraction, and classification algorithms. Compare your results with state-of-the-art algorithms.
Our winning project focused on ECoG data from an epilepsy patient watching a video. I am proud of the hard work and dedication put in by our team, and this experience has been a great learning opportunity for all of us. I look forward to sharing more details of the project with the wider community and hope that it can inspire more innovation in the field of neuro-data analysis.
The entire hackathon project was inspired by Kapeller et al., 2018, whose research builds on the knowledge that the ventral temporal cortex contains specialized regions that process visual stimuli. Different types and colors of visual stimuli were presented to four human participants, and the authors demonstrated a real-time decoder that detects and discriminates responses to untrained natural images.
We were given an ECoG dataset from an epilepsy patient watching a video in which various stimuli were presented. We manually labelled the timestamp (in ms) of the onset of each stimulus and extracted 600 ms of data per sample (equivalent to 720 frames of video/ECoG data). Based on the 160 channels of ECoG data, we applied a data-preprocessing workflow and tried different classifiers. Lastly, we used SHAP to visualize the impact of each channel and of correlations between channels.
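To make the pipeline concrete, here is a minimal sketch of the epoching, classification, and SHAP steps. The array names, the 1200 Hz sampling rate (implied by 600 ms ≈ 720 frames), the mean-power features, and the random-forest classifier are assumptions for illustration only, not the exact code we used.

```python
# Minimal sketch of the epoching -> classification -> SHAP pipeline described above.
# Assumptions (not the project's actual code): the continuous recording is a NumPy
# array `ecog` of shape (160, n_samples) at 1200 Hz, and `onsets_ms` / `labels`
# come from the manual annotation step. All names are illustrative.
import numpy as np
import shap
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split

FS = 1200                                    # assumed sampling rate: 600 ms -> 720 samples
EPOCH_MS = 600
EPOCH_SAMPLES = int(FS * EPOCH_MS / 1000)    # 720 samples per epoch

def extract_epochs(ecog, onsets_ms):
    """Cut a fixed 600 ms window after each manually labelled stimulus onset."""
    epochs = []
    for onset in onsets_ms:
        start = int(onset * FS / 1000)
        epochs.append(ecog[:, start:start + EPOCH_SAMPLES])
    return np.stack(epochs)                  # (n_trials, 160, 720)

def channel_features(epochs):
    """Simple per-channel feature (mean power) so SHAP values map back to channels."""
    return (epochs ** 2).mean(axis=2)        # (n_trials, 160)

# --- hypothetical usage --------------------------------------------------
# ecog, onsets_ms, labels = load_dataset(...)          # from the annotation step
# X = channel_features(extract_epochs(ecog, onsets_ms))
# X_train, X_test, y_train, y_test = train_test_split(X, labels, stratify=labels)
# clf = RandomForestClassifier(n_estimators=300).fit(X_train, y_train)
# explainer = shap.TreeExplainer(clf)
# shap.summary_plot(explainer.shap_values(X_test), X_test)  # per-channel impact
```

In this sketch the features are deliberately simple so that each SHAP value corresponds to a single electrode; richer features (e.g. band power per channel) would slot into `channel_features` without changing the rest of the flow.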