Cognixion One AR + BCI Applications
Company: Cognixion
Provided: design direction, people management, UI/UX, sound/haptic design
Date: 2022 - 2023
At Cognixion I directed the design of interactive experiences, spanning input, sensory feedback, software applications, and hardware, for Cognixion One, the world's first accessibility-focused augmented reality headset with a brain-computer interface. I built and managed the Design Team from the ground up: full-time, contract, and intern designers, prototypers, and researchers.

One of the Design Team’s first efforts was updating the design of Speakprose, an AR application for assisted communication that runs on the Cognixion One headset via head-pose interactions. The user, often a person with motor or vocal disabilities such as cerebral palsy, targets and dwells on UI elements using head pose to compose messages on a virtual keyboard. Messages can then be sonified and projected on a front-facing display for co-located conversation, or sent as commands to smart home devices (e.g. a television).
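The dwell-to-select mechanic described above can be sketched as a simple timer: a target fires once the head-pose cursor has rested on it continuously for a set duration. This is an illustrative sketch only, with an assumed dwell time; it is not Cognixion's implementation.

```python
DWELL_TIME_S = 1.0  # assumed dwell threshold in seconds


class DwellSelector:
    """Fires a selection when the cursor rests on one target long enough."""

    def __init__(self, dwell_time_s: float = DWELL_TIME_S):
        self.dwell_time_s = dwell_time_s
        self.current_target = None
        self.enter_time = None

    def update(self, target, now: float):
        """Feed the target currently under the head-pose cursor.

        Returns the target when its dwell timer completes, else None.
        """
        if target != self.current_target:
            # Cursor moved to a new target (or off all targets): restart the timer.
            self.current_target = target
            self.enter_time = now if target is not None else None
            return None
        if target is not None and now - self.enter_time >= self.dwell_time_s:
            # Dwell complete: fire, then require a fresh dwell before re-firing.
            self.enter_time = now
            return target
        return None
```

In practice a production version would also need hysteresis (a small grace period when the cursor briefly slips off a target) and visual feedback of dwell progress, both of which are omitted here for brevity.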




Pictured above: Cognixion One user Chris Benedict communicating through the device’s front-facing display using the Speakprose AR application. Below are head-pose UI screens for the Keyboard, Phrase Manager, and Settings, shown on the internal AR display.
A later effort was building AR applications that use BCI as input. The user, in this case a person with late-stage ALS (amyotrophic lateral sclerosis), focuses their attention on labeled SSVEP (Steady-State Visually Evoked Potential) stimuli to give a Yes/No/Rest response or a freeform text response to communicate with a caretaker.


Pictured above: In this application the headset wearer communicates with a caretaker by focusing on visual stimuli (SSVEP) labeled Yes and No. When selected, a Yes or No response is sonified and projected from the display as text. The Rest stimulus, accompanied by ambient sound and a particle system, is used to take a break.
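The SSVEP principle behind this interaction can be sketched briefly: each stimulus flickers at a distinct frequency, and attending to one elevates EEG power at that frequency, so a classifier can pick the label whose frequency carries the most spectral power. The sample rate, flicker frequencies, and function names below are assumptions for illustration, not Cognixion's actual pipeline.

```python
import numpy as np

FS = 250  # EEG sample rate in Hz (assumed)
STIMULI = {"yes": 10.0, "no": 12.0, "rest": 15.0}  # flicker frequencies (assumed)


def classify_ssvep(eeg: np.ndarray, fs: int = FS) -> str:
    """Return the stimulus label whose flicker frequency has the most power.

    `eeg` is a single-channel signal segment; power is estimated from the
    magnitude spectrum at the FFT bin nearest each stimulus frequency.
    """
    spectrum = np.abs(np.fft.rfft(eeg)) ** 2
    freqs = np.fft.rfftfreq(len(eeg), d=1.0 / fs)

    def power_at(f: float) -> float:
        return spectrum[np.argmin(np.abs(freqs - f))]

    return max(STIMULI, key=lambda label: power_at(STIMULI[label]))
```

Real SSVEP decoders typically use multi-channel methods such as canonical correlation analysis and include harmonics; the single-bin comparison above just shows the core idea.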
A talk from the Cognixion design team sharing their work under my direction with the XR accessibility community.
To access more content from this project, please contact