
Eye Tracker Module

Pupil Labs Neon integration with gaze data, pupil diameter, IMU data, and scene video capture.

The EyeTracker-Neon module captures gaze data and scene video from Pupil Labs Neon eye tracking glasses. It records where participants are looking in real-time, synchronized with other data streams for multi-modal research.

Getting Started

  1. Power on your Pupil Labs Neon glasses
  2. Connect via WiFi — Same network as host, or USB tethering
  3. Enable the EyeTracker-Neon module — From the Modules menu
  4. Wait for device connection — Status shows "Connected"
  5. Calibrate if needed — Using the Neon Companion app
  6. Start a session — To begin recording

User Interface

Preview Display

Shows the scene camera feed with gaze overlay:

  • Red circle indicates current gaze position
  • Scene video shows participant's view

Eye Tracker module showing scene video with gaze overlay, eye cameras, and eye event metrics

Device Status Panel

Field Description
Device Connected device name
Status Connection state (Connected/Disconnected)
Recording Current recording state (Active/Idle)

Controls

  • Configure — Open device settings dialog

Hardware Specifications

The Neon device streams at fixed specifications that cannot be changed via API:

Stream Resolution Frame Rate Notes
Scene Camera 1600x1200 px 30 Hz 103° x 77° field of view
Eye Cameras 384x192 px (192x192 per eye) 200 Hz Infrared
Gaze Data N/A Up to 200 Hz Varies by companion phone

The module receives these streams and can downsample or resize them locally for smaller file sizes.
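
As a rough illustration of this local downsampling (a sketch, not the module's actual implementation), the snippet below assumes OpenCV is available and reduces 1600x1200 @ 30 Hz scene frames to the default 1280x720 @ 10 fps by resizing and keeping every third frame:

# Sketch only: approximate the module's local downsampling with OpenCV.
import cv2

TARGET_SIZE = (1280, 720)   # default output resolution (from 1600x1200)
KEEP_EVERY = 3              # 30 Hz input -> 10 fps output

def downsample(frames):
    """Yield resized frames, keeping every KEEP_EVERY-th input frame."""
    for i, frame in enumerate(frames):
        if i % KEEP_EVERY == 0:
            yield cv2.resize(frame, TARGET_SIZE, interpolation=cv2.INTER_AREA)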

Recording Sessions

Starting Recording

When you start a recording session:

  • Gaze data recording begins
  • Scene video capture starts
  • Optional: Eye camera video, IMU, audio, and events

During Recording

Each sample captures:

  • Gaze coordinates (x, y) in scene camera view
  • Pupil diameter for each eye
  • Confidence values for gaze estimation
  • Scene video with embedded timestamps

Data Output

File Location

{session_dir}/EyeTracker-Neon/

Files Generated

File Description
{prefix}_GAZE.csv Extended gaze data with pupil diameter (36 columns)
{prefix}_WORLD_{w}x{h}_{fps}fps.mp4 World/scene video (participant's view)
{prefix}_EYES_384x192_{fps}fps.mp4 Eye camera video
{prefix}_EVENTS.csv Eye events (fixations, saccades, blinks) (24 columns)
{prefix}_IMU.csv Head motion (accelerometer, gyroscope, orientation) (19 columns)
{prefix}_AUDIO.wav Scene microphone audio (optional)
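
To locate these files from an analysis script, something like the following works (a sketch; the session directory path is a placeholder):

# Sketch: list the EyeTracker-Neon output files for a session.
from pathlib import Path

session_dir = Path("/path/to/session_dir")   # placeholder path
neon_dir = session_dir / "EyeTracker-Neon"

gaze_files  = sorted(neon_dir.glob("*_GAZE.csv"))
event_files = sorted(neon_dir.glob("*_EVENTS.csv"))
imu_files   = sorted(neon_dir.glob("*_IMU.csv"))
videos      = sorted(neon_dir.glob("*.mp4"))

print(gaze_files, event_files, imu_files, videos, sep="\n")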

Scene Video Format

Property Value
Container MP4
Codec H.264
Resolution Configurable (default 1280x720, downsampled from 1600x1200)
Frame Rate Configurable (default 10 fps, downsampled from 30 Hz)
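
To confirm the properties of a recorded scene video, OpenCV can be used as in this sketch (the filename is a placeholder following the {prefix}_WORLD_{w}x{h}_{fps}fps.mp4 pattern):

# Sketch: check a recorded scene video's resolution and frame rate with OpenCV.
import cv2

cap = cv2.VideoCapture("session_WORLD_1280x720_10fps.mp4")   # placeholder filename
width  = cap.get(cv2.CAP_PROP_FRAME_WIDTH)
height = cap.get(cv2.CAP_PROP_FRAME_HEIGHT)
fps    = cap.get(cv2.CAP_PROP_FPS)
cap.release()
print(width, height, fps)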

GAZE CSV Columns

Extended gaze data with 36 columns including eye position and eyelid metrics:

Column Description
trial Trial number (integer)
module Always "EyeTracker"
device_id Device identifier
label Optional trial label
record_time_unix System timestamp (Unix seconds, 6 decimals)
record_time_mono Monotonic time (seconds, 9 decimals)
device_time_unix Device timestamp (Unix seconds)
device_time_ns Device timestamp (nanoseconds)
stream_type Data stream classification
worn Glasses worn status
x, y Combined gaze position (normalized 0-1)
left_x, left_y Left eye gaze position
right_x, right_y Right eye gaze position
pupil_diameter_left, pupil_diameter_right Pupil diameters (mm)
eyeball_center_left_x/y/z Left eyeball 3D center position
optical_axis_left_x/y/z Left eye optical axis orientation
eyeball_center_right_x/y/z Right eyeball 3D center position
optical_axis_right_x/y/z Right eye optical axis orientation
eyelid_angle_top/bottom_left, eyelid_aperture_left Left eyelid metrics
eyelid_angle_top/bottom_right, eyelid_aperture_right Right eyelid metrics
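
As an example of working with this file, the sketch below (pandas assumed; the filename is a placeholder) keeps only samples recorded while the glasses were worn and summarizes pupil diameter per trial:

# Sketch: per-trial pupil diameter summary from the GAZE CSV.
import pandas as pd

gaze = pd.read_csv("session_GAZE.csv")   # placeholder filename

worn = gaze[gaze["worn"].astype(bool)]   # keep samples recorded while the glasses were worn
summary = worn.groupby("trial")[["pupil_diameter_left", "pupil_diameter_right"]].mean()
print(summary)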

EVENTS CSV Columns

Eye events with 24 fixed columns:

Column Description
trial Trial number
module Always "EyeTracker"
device_id Device identifier
label Optional trial label
record_time_unix System timestamp (Unix seconds)
record_time_mono Monotonic time (seconds)
device_time_unix Device timestamp
device_time_ns Device timestamp (nanoseconds)
event_type fixation, blink, or saccade
event_subtype Event subtype classification
confidence Event confidence score
duration Event duration
start_time_ns, end_time_ns Event start/end (nanoseconds)
start_gaze_x/y, end_gaze_x/y Gaze position at start/end
mean_gaze_x, mean_gaze_y Mean gaze position during event
amplitude_pixels, amplitude_angle_deg Saccade amplitude
mean_velocity, max_velocity Velocity metrics
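
For example, fixation counts and mean durations per trial can be extracted with pandas (a sketch; the filename is a placeholder):

# Sketch: summarize fixations per trial from the EVENTS CSV.
import pandas as pd

events = pd.read_csv("session_EVENTS.csv")   # placeholder filename

fixations = events[events["event_type"] == "fixation"]
per_trial = fixations.groupby("trial")["duration"].agg(["count", "mean"])
print(per_trial)   # fixation count and mean duration per trial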

IMU CSV Columns

Head motion data with 19 columns including orientation:

Column Description
trial Trial number
module Always "EyeTracker"
device_id Device identifier
label Optional trial label
record_time_unix System timestamp (Unix seconds)
record_time_mono Monotonic time (seconds)
device_time_unix Device timestamp
device_time_ns Device timestamp (nanoseconds)
gyro_x, gyro_y, gyro_z Gyroscope (rad/s)
accel_x, accel_y, accel_z Accelerometer (m/s²)
quat_w, quat_x, quat_y, quat_z Orientation quaternion
temperature Sensor temperature
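
If Euler angles are more convenient than quaternions for head orientation, a conversion sketch using SciPy (assumed available; the roll/pitch/yaw naming is illustrative and depends on the IMU axis conventions) could look like this:

# Sketch: convert IMU orientation quaternions to Euler angles with SciPy.
import pandas as pd
from scipy.spatial.transform import Rotation

imu = pd.read_csv("session_IMU.csv")   # placeholder filename

# SciPy expects quaternions in (x, y, z, w) order.
quats = imu[["quat_x", "quat_y", "quat_z", "quat_w"]].to_numpy()
euler = Rotation.from_quat(quats).as_euler("xyz", degrees=True)

# Axis naming below is illustrative; check the device's IMU axis conventions.
imu["roll_deg"]  = euler[:, 0]
imu["pitch_deg"] = euler[:, 1]
imu["yaw_deg"]   = euler[:, 2]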

Timing and Synchronization

Timestamp Types

Timestamp Source Use Case
device_time_unix, device_time_ns Pupil Labs device clock Primary gaze timing
record_time_unix Host system wall clock Cross-system time reference
record_time_mono Host monotonic clock Cross-module synchronization (recommended)

Cross-Module Synchronization

Use record_time_mono for precise cross-module sync with:

  • Camera encode_time_mono
  • Audio write_time_monotonic
  • DRT Unix time in UTC
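
One common pattern is a nearest-timestamp join on record_time_mono. The sketch below assumes pandas and uses a placeholder camera frame log with the encode_time_mono column mentioned above:

# Sketch: align gaze samples with another module's stream on the shared monotonic clock.
import pandas as pd

gaze   = pd.read_csv("session_GAZE.csv").sort_values("record_time_mono")    # placeholder
frames = pd.read_csv("camera_frames.csv").sort_values("encode_time_mono")   # placeholder

# Nearest-in-time join on the shared host monotonic clock.
aligned = pd.merge_asof(
    frames, gaze,
    left_on="encode_time_mono",
    right_on="record_time_mono",
    direction="nearest",
)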

Video-Gaze Alignment

Use the FRAME CSV to correlate video frames with gaze data (a sketch follows this list):

  1. Find the frame_index for the desired video position
  2. Match its capture_timestamp to the device timestamps (device_time_ns) in the GAZE CSV
  3. Gaze samples between consecutive frames belong to that frame's time period
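
A sketch of step 3, assuming the FRAME CSV's capture_timestamp and the GAZE CSV's device_time_ns share the same clock and units (filenames are placeholders):

# Sketch: bin gaze samples into video frames using the FRAME CSV timestamps.
import numpy as np
import pandas as pd

frames = pd.read_csv("session_FRAME.csv").sort_values("capture_timestamp")   # placeholder
gaze   = pd.read_csv("session_GAZE.csv")                                     # placeholder

# Index of the last frame captured at or before each gaze sample.
idx = np.searchsorted(frames["capture_timestamp"].to_numpy(),
                      gaze["device_time_ns"].to_numpy(),
                      side="right") - 1
# Samples before the first frame are clamped to frame 0 (approximate).
gaze["frame_index"] = frames["frame_index"].to_numpy()[idx.clip(min=0)]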

Data Interpretation

Gaze Position (x, y)

Normalized coordinates (0-1) in scene camera view:

  • (0, 0) = top-left corner
  • (1, 1) = bottom-right corner

To convert to pixel coordinates:

pixel_x = x * scene_width
pixel_y = y * scene_height
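
Applied to a whole GAZE CSV with pandas (a sketch; the filename is a placeholder and the resolution should match your configured scene output, default 1280x720):

# Sketch: convert normalized gaze (0-1) to pixel coordinates in the scene video.
import pandas as pd

SCENE_W, SCENE_H = 1280, 720             # use your configured scene resolution

gaze = pd.read_csv("session_GAZE.csv")   # placeholder filename
gaze["pixel_x"] = gaze["x"] * SCENE_W
gaze["pixel_y"] = gaze["y"] * SCENE_H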

Confidence

Quality of gaze estimate (0-1). Higher values indicate more reliable tracking. Low confidence may occur when:

  • Eyes are partially closed
  • Glasses are slipping
  • Infrared reflections interfere

Pupil Diameter

Measured in millimeters. Changes in pupil size can reflect:

  • Cognitive load (larger during mental effort)
  • Emotional response
  • Lighting conditions (smaller in bright light)

Calibration

For accurate gaze data, calibrate before each session:

  1. Open the Neon Companion app — On the connected phone
  2. Select appropriate calibration method
  3. Follow on-screen instructions
  4. Verify accuracy — With validation targets

Recalibrate if:

  • Glasses are repositioned on the participant's face
  • Significant time has passed
  • Gaze accuracy appears poor

Configuration

Click "Configure" to access device settings.

Setting Default Description
Scene Resolution 1280x720 Output video resolution (downsampled from 1600x1200)
Scene FPS 10 Output frame rate (downsampled from 30 Hz)
Eyes FPS 30 Eye camera output rate (downsampled from 200 Hz)
Preview Preset 4 (640x480) Live preview resolution (0-8 scale)
Gaze Overlay Enabled Draw gaze position on recorded video
Audio Recording Disabled Record scene microphone audio

Troubleshooting

Device not detected

  1. Check USB cable connection or WiFi network
  2. Verify Neon Companion app is running on the phone
  3. Check network settings if using WiFi (both devices on same network)
  4. Restart the module if needed

No gaze data appearing

  1. Ensure calibration was completed
  2. Check pupil detection in Neon Companion app
  3. Verify adequate lighting conditions
  4. Clean eye camera lenses (infrared cameras on inside of frame)

Scene video not recording

  1. Check scene camera connection
  2. Verify camera is not in use by another app
  3. Check available disk space
  4. Review module logs for errors

Poor gaze accuracy

  1. Recalibrate the tracker
  2. Ensure glasses fit snugly (not slipping)
  3. Check for reflections on lenses
  4. Verify pupil detection is stable in Companion app