Real Eyes Realize Faster: Gaze Stability and Pupil Novelty for Efficient Egocentric Learning
Eye-tracking signals on AR headsets can curate egocentric video without any vision model: we gate frames by gaze stability for quality, then rank the survivors by pupil novelty. Selecting just 10% of frames matches full-stream performance.
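A minimal sketch of the two-stage idea, assuming hypothetical inputs and thresholds (gaze dispersion as the stability gate, pupil deviation from a running baseline as the novelty score; none of the names or parameter values come from the paper):

```python
import numpy as np

def select_frames(gaze_xy, pupil_diam, budget=0.10,
                  stability_win=5, stability_thresh=1.0):
    """Hypothetical sketch of gaze-gated, pupil-ranked frame curation.
    1) Gate: keep frames where gaze dispersion over a short window is low
       (a fixation proxy, suggesting a well-framed, useful view).
    2) Rank: score surviving frames by how far pupil diameter deviates
       from its running mean (a crude novelty/surprise proxy).
    Returns sorted indices of the top `budget` fraction of frames."""
    n = len(pupil_diam)
    stable = np.zeros(n, dtype=bool)
    for t in range(n):
        lo = max(0, t - stability_win)
        window = gaze_xy[lo:t + 1]
        dispersion = window.std(axis=0).sum()  # x-std + y-std of gaze
        stable[t] = dispersion < stability_thresh
    running_mean = np.cumsum(pupil_diam) / np.arange(1, n + 1)
    novelty = np.abs(pupil_diam - running_mean)
    novelty[~stable] = -np.inf  # gated-out frames can never be selected
    k = max(1, int(budget * n))
    return np.sort(np.argsort(novelty)[-k:])

rng = np.random.default_rng(0)
gaze = rng.normal(0.0, 0.1, size=(100, 2))   # mostly stable gaze
pupil = rng.normal(3.0, 0.2, size=100)       # pupil diameter in mm
keep = select_frames(gaze, pupil)
print(len(keep))  # 10 of 100 frames retained
```

Both signals come from the headset's eye tracker, so the selection runs without decoding or processing a single video frame.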
Riding Brainwaves in LLM Space: Understanding Activation Patterns Using Individual Neural Signatures
Frozen LLMs contain stable, person-specific neural directions in their deep layers. Person-specific EEG probes outperform population probes ninefold, providing a geometric foundation for EEG-driven personalization.
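A toy illustration of why per-person probes can dominate a pooled probe, on synthetic data (the dimensions, the linear-probe choice, and the noiseless setup are all assumptions for illustration, not the paper's protocol): when each subject's EEG maps to the shared embedding space through a different direction, a single population probe is forced to average those directions away.

```python
import numpy as np
from numpy.linalg import lstsq

rng = np.random.default_rng(0)
n_subj, n_trials, d_eeg, d_emb = 5, 200, 16, 32

# Each subject gets a distinct linear map into a frozen embedding space.
subj_maps = rng.normal(size=(n_subj, d_eeg, d_emb))
X = rng.normal(size=(n_subj, n_trials, d_eeg))   # per-subject EEG features
Y = np.einsum('sne,sed->snd', X, subj_maps)      # targets in embedding space

def mse(pred, target):
    return float(((pred - target) ** 2).mean())

# Population probe: one linear map fit on all subjects pooled together.
W_pop, *_ = lstsq(X.reshape(-1, d_eeg), Y.reshape(-1, d_emb), rcond=None)
pop_err = mse(X @ W_pop, Y)

# Person-specific probes: one linear map per subject.
per_err = np.mean([
    mse(X[s] @ lstsq(X[s], Y[s], rcond=None)[0], Y[s])
    for s in range(n_subj)
])
print(pop_err > per_err)  # per-subject probes fit far better
```

The pooled probe can only recover the mean of the subject maps, so its error stays high even with unlimited data, while each per-subject probe fits its own direction essentially exactly.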
Project Aura: Intent-Aware AR Interfaces
A speculative vision of AR systems that detect cognitive states from pupil dilation before conscious input, enabling zero-latency intent prediction and adaptive interfaces.