The MEGAN protocol is a wearable, neuro-responsive assessment device. The project was a BIG IDEAS finalist and was awarded $5k of funding in 2024.
Project Logo
Introduction
In today's world, as researchers push the boundaries of what is medically possible, one thing remains a mystery: the brain. Many neurological diseases are still difficult to diagnose and poorly understood. Conditions ranging from Alzheimer's to the after-effects of traumatic brain injuries continue to pose challenges for too many people across the globe. If we can build better diagnostic tools, we can start to better understand how these diseases develop and how to stop them. The MEGAN Protocol is a wearable diagnostic tool that aims to track the onset and progression of such diseases.
This device was developed in partnership with my colleague Max, who studies Molecular and Cellular Neurobiology at Cal. With his expertise, we created a system that performs visual and auditory tests. The device is a Bluetooth-enabled Arduino that fits onto a patient's wrist and connects to a laptop.

Device mounted to the wrist; worn on both hands (dominant and non-dominant).
In this quick demo, the patient performs two gestures: one for a "visual" test and one for an "audio" test. Both gestures are performed correctly.
Device Demo
Device Overview
The goal of the MEGAN Protocol is to evaluate a patient's audio and visual responsiveness. It does this by running a series of tests with audio and visual cues. The test is centered around four gestures (RAISE, CROSS, TWIST, FLEX). Before the test begins, each gesture is randomly assigned to one of four colors (YELLOW, GREEN, RED, BLUE). The patient must memorize these associations and perform the correct gesture as the test proceeds.

Standard test setup. 10 trials.
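The color-to-gesture assignment step can be sketched as follows. This is a minimal illustration; the function and variable names are my own, not from the project source:

```python
import random

GESTURES = ["RAISE", "CROSS", "TWIST", "FLEX"]
COLORS = ["YELLOW", "GREEN", "RED", "BLUE"]

def assign_gestures(seed=None):
    """Randomly pair each color with a unique gesture for one session."""
    rng = random.Random(seed)
    shuffled = GESTURES[:]
    rng.shuffle(shuffled)
    return dict(zip(COLORS, shuffled))

# mapping is e.g. {'YELLOW': 'CROSS', 'GREEN': 'RAISE', ...}
mapping = assign_gestures()
```

Because the gestures are shuffled rather than sampled with replacement, every color maps to a distinct gesture.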
During the test, the Arduino records the following variables:
- Attempt #
- Color / gesture association
- Gesture requested
- Gesture performed (TinyML AI Model)
- Reaction time
- Gesture completion time
A standard output looks like this:
| Attempt | Color | Gesture Detected | Gesture Requested | Success | Reaction Time (ms) | Move Time (ms) |
|---|---|---|---|---|---|---|
| 1 | BLUE | RAISE | CROSS | False | 1036 | 1288 |
| 2 | BLUE | RAISE | CROSS | False | 580 | 873 |
| 3 | BLUE | RAISE | CROSS | True | 344 | 653 |
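Each row can be thought of as a small record, where Success presumably follows from comparing the detected and requested gestures. The field names below are illustrative, not taken from the source:

```python
from dataclasses import dataclass

@dataclass
class TrialResult:
    attempt: int
    color: str
    gesture_detected: str
    gesture_requested: str
    reaction_ms: int   # assumed: cue onset -> movement start
    move_ms: int       # assumed: movement start -> gesture complete

    @property
    def success(self) -> bool:
        # A trial succeeds when the performed gesture matches the request.
        return self.gesture_detected == self.gesture_requested

row = TrialResult(3, "BLUE", "CROSS", "CROSS", 344, 653)
```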
The test is repeated as many times as the config specifies and is built to be fully customizable: test length, colors, dominant hand, test order, and more can all be adjusted.
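A configuration along these lines might look like the sketch below. The key names are hypothetical; the actual options live in main_script.py:

```python
# Hypothetical configuration sketch; real option names are in main_script.py.
config = {
    "trials": 10,                                     # test length
    "colors": ["YELLOW", "GREEN", "RED", "BLUE"],
    "gestures": ["RAISE", "CROSS", "TWIST", "FLEX"],
    "hand": "dominant",                               # or "non-dominant"
    "test_order": ["visual", "audio"],
}
```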
Hardware
An Arduino Nano is strapped to the patient's wrist, along with a small 3.3 V battery. For tests involving both hands, the setup is duplicated for the non-dominant hand. A computer with the testing program loaded sits on an adjacent table. The user must interact with the computer to begin the test, but everything else is automatic. The accelerometer and gyroscope are both sampled at 104 Hz, giving the software a profile of the device's X, Y, and Z orientation in space.
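One standard way to turn 104 Hz accelerometer and gyroscope samples into an orientation profile is a complementary filter. The sketch below illustrates the idea for a single axis; this is an assumption about the approach, not the project's actual filter:

```python
import math

SAMPLE_RATE_HZ = 104          # IMU sampling rate used by the device
DT = 1.0 / SAMPLE_RATE_HZ
ALPHA = 0.98                  # weight on the integrated gyro angle

def update_pitch(pitch_deg, gyro_y_dps, ax_g, az_g):
    """One complementary-filter step: blend gyro integration with the
    accelerometer's gravity-based pitch estimate."""
    gyro_angle = pitch_deg + gyro_y_dps * DT          # integrate angular rate
    accel_angle = math.degrees(math.atan2(ax_g, az_g))  # pitch from gravity
    return ALPHA * gyro_angle + (1 - ALPHA) * accel_angle
```

The gyro term tracks fast motion while the accelerometer term slowly corrects drift, which is why a biased initial angle decays toward the gravity estimate over time.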


Accelerometer & gyroscope measurements.
A Python script connects to the Arduino via a BLE server. Each "characteristic" contains one byte of information: one byte indicates the start/end of the experiment, and the remaining characteristics convey information about that trial.
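Given the one-byte-per-characteristic scheme, a trial's fields can be packed and unpacked along these lines. The characteristic names and index encodings below are hypothetical, not the project's actual layout:

```python
COLORS = ["YELLOW", "GREEN", "RED", "BLUE"]
GESTURES = ["RAISE", "CROSS", "TWIST", "FLEX"]

def encode_trial(attempt, color, gesture, success):
    """Encode one trial so each (hypothetical) characteristic carries one byte."""
    return {
        "ble_attempt": bytes([attempt & 0xFF]),
        "ble_color": bytes([COLORS.index(color)]),
        "ble_gesture": bytes([GESTURES.index(gesture)]),
        "ble_success": bytes([1 if success else 0]),
    }

def decode_trial(chars):
    """Reverse the encoding on the laptop side."""
    return {
        "attempt": chars["ble_attempt"][0],
        "color": COLORS[chars["ble_color"][0]],
        "gesture": GESTURES[chars["ble_gesture"][0]],
        "success": bool(chars["ble_success"][0]),
    }
```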
Software

Communication occurs via a BLE server.
For the code, see GITHUB. The code relies on the Bluetooth Low Energy (BLE) protocol. A Python script uses OpenCV and other graphical libraries to display a GUI. Once it is ready to receive data from the Arduino, it updates the "ble_experiment_start" variable. For each trial, the software performs the following:
Setup:
- Launch the Python program on the laptop
- Scan for BLE devices and connect to Arduino (dominant / non-dominant hand)
- Display and read instructions
- Randomize colors and gestures, display them, and wait for user confirmation
Testing:
- Display audio or visual cue
- Ping BLE server; notify Arduino of test start
- Record accelerometer and gyro data
- Calculate reaction times, infer gesture
- Upload resulting data to BLE Server
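Put together, the per-trial steps above can be sketched as a loop. The callback names are hypothetical stand-ins for the real GUI and BLE logic in main_script.py:

```python
import random

def run_trials(n_trials, mapping, show_cue, ping_start, read_result):
    """Drive n_trials rounds: cue the patient, signal the Arduino, collect data."""
    results = []
    for attempt in range(1, n_trials + 1):
        color = random.choice(list(mapping))
        requested = mapping[color]
        show_cue(color)                # visual (or audio) cue on the laptop
        ping_start()                   # notify the Arduino that the trial began
        detected, reaction_ms, move_ms = read_result()  # accel/gyro -> gesture
        results.append({
            "attempt": attempt,
            "color": color,
            "requested": requested,
            "detected": detected,
            "success": detected == requested,
            "reaction_ms": reaction_ms,
            "move_ms": move_ms,
        })
    return results
```

Injecting the cue, start, and result steps as callbacks keeps the loop testable without hardware attached.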
This process is repeated for each trial. As mentioned before, the trials are fully customizable (see main_script.py in source code).
Source Code: MEGAN PROTOCOL GITHUB
