Description

Abstract: While continuous EEG monitoring is increasingly used to improve outcomes for critical care patients, reviewing the EEG record in a timely manner is challenging. Before automated algorithms can be developed and compared with human clinicians, accurately annotated data are required. This project implements a software system that enables a crowdsourcing solution to annotating EEG while providing quantitative estimates of the labelers' accuracy and of the overall accuracy of the consensus annotations.
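The abstract does not specify how consensus annotations or labeler-accuracy estimates are computed. As an illustration only, the sketch below shows one common baseline: majority-vote consensus over labeled EEG segments, with each annotator's agreement rate against the consensus used as a rough accuracy proxy. The function name, data layout, and labels ("seizure", "normal") are hypothetical and not taken from the project itself.

```python
from collections import Counter

def consensus_and_accuracy(annotations):
    """Hypothetical baseline: majority-vote consensus per EEG segment,
    plus each labeler's agreement rate with that consensus.

    annotations: dict mapping segment_id -> {labeler_id: label}
    """
    consensus = {}
    for seg, labels in annotations.items():
        # Most common label wins; ties are broken arbitrarily.
        consensus[seg] = Counter(labels.values()).most_common(1)[0][0]

    agree, total = Counter(), Counter()
    for seg, labels in annotations.items():
        for labeler, label in labels.items():
            total[labeler] += 1
            if label == consensus[seg]:
                agree[labeler] += 1
    accuracy = {lab: agree[lab] / total[lab] for lab in total}
    return consensus, accuracy

# Toy example with three hypothetical annotators over two segments.
annotations = {
    "seg1": {"a": "seizure", "b": "seizure", "c": "normal"},
    "seg2": {"a": "normal",  "b": "normal",  "c": "normal"},
}
cons, acc = consensus_and_accuracy(annotations)
# cons -> {"seg1": "seizure", "seg2": "normal"}; acc["c"] -> 0.5
```

Richer approaches (e.g., EM-based models such as Dawid-Skene) jointly estimate labeler reliability and true labels, which may be closer to what a production system would use.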

Learning Objective 1: The reader will be able to describe an approach to annotating EEG records through crowdsourcing, where the annotators are EEG professionals.

Authors:

Andrew Nguyen (Presenter)
University of San Francisco

William Bosl (Presenter)
University of San Francisco

Susan Herman, Beth Israel Deaconess Medical Center
Tobias Loddenkemper, Boston Children's Hospital
