Real Eyes Realize Faster: Gaze Stability and Pupil Novelty for Efficient Egocentric Learning
Eye-tracking signals on AR headsets can curate egocentric video without any vision model. We gate frames by gaze stability as a quality filter, then rank the survivors by pupil-dilation novelty; keeping the top 10% of frames matches full-stream performance.
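The two-stage selection described above (gaze-stability gate, then pupil-novelty ranking, keeping 10% of frames) could be sketched as follows. All names, thresholds, and the running-mean novelty measure are illustrative assumptions, not the project's actual implementation:

```python
import numpy as np

def select_frames(gaze_dispersion, pupil_diameter, keep_frac=0.10,
                  dispersion_thresh=1.0):
    """Hypothetical two-stage frame curation.

    Stage 1 (quality gate): keep frames whose gaze dispersion falls below
    a fixation threshold -- stable gaze suggests an attended, usable frame.
    Stage 2 (novelty rank): among gated frames, rank by absolute deviation
    of pupil diameter from a causal running baseline, and keep the top
    `keep_frac` of the original stream.
    """
    gaze_dispersion = np.asarray(gaze_dispersion, dtype=float)
    pupil_diameter = np.asarray(pupil_diameter, dtype=float)
    n = len(gaze_dispersion)

    # Stage 1: indices of frames passing the gaze-stability gate.
    gated = np.flatnonzero(gaze_dispersion < dispersion_thresh)

    # Stage 2: pupil novelty = deviation from the running mean so far.
    baseline = np.cumsum(pupil_diameter) / np.arange(1, n + 1)
    novelty = np.abs(pupil_diameter - baseline)

    # Keep the top keep_frac of the *original* stream, capped by the gate.
    k = min(max(1, int(keep_frac * n)), len(gated))
    order = gated[np.argsort(-novelty[gated])]
    return np.sort(order[:k])
```

The gate runs first so that novelty ranking never promotes blurry or unattended frames; the budget is defined against the full stream length, matching the 10%-of-frames framing.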
Project Aura: Intent-Aware AR Interfaces
An open-source framework for building AR systems that infer cognitive states from pupil dilation before explicit user input, enabling anticipatory, low-latency intent prediction.
SVA: Physiological Signals as Alignment Data
Introducing the Salience-Valence-Affect (SVA) framework. Continuous physiological monitoring provides a higher-fidelity training signal for value alignment than text-based RLHF.
Engagement Crystallization in Transformer Layers
Physiological engagement signals map reliably to directions in LLM embedding space, with phase transitions emerging in layers 2–3.