We are developing virtual environments for neuroscience applications to study neurological patients. The patients perform spatial navigation tasks using VR headsets. Our goal is to combine VR with eye-position tracking by programming the built-in eye-tracking system of the HTC Vive headset. The project involves designing and programming virtual environments (rooms, buildings, natural scenes, water, fog, etc.) using the Unity game engine and implementing the eye tracking using the provided API. The program will track the motion of the avatar and the gaze direction of the player. The project leaves a lot of room for creativity. We are looking for devoted individuals who are technically skilled in programming and may want to get involved in brain research. The first task should be completed within 6 months, but the project spans years beyond this task. Flexible work hours with pseudo-regular progress meetings. We provide the headsets for program development and testing. If you have any questions, please don't hesitate to contact us.
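As a rough illustration of the kind of logging the first task involves, here is a minimal Unity/C# sketch. It assumes HTC's SRanipal eye-tracking SDK is installed and its framework prefab is active in the scene; class and method names should be verified against the SDK version you install, and the script name `GazeLogger` is just a placeholder.

```csharp
using UnityEngine;
using ViveSR.anipal.Eye; // SRanipal eye-tracking SDK (assumed installed)

// Hypothetical sketch: each frame, log the avatar's position together
// with the player's combined (both-eyes) gaze ray.
public class GazeLogger : MonoBehaviour
{
    void Update()
    {
        // GetGazeRay returns false when no valid gaze sample is available
        // (e.g., during blinks or before calibration).
        if (SRanipal_Eye.GetGazeRay(GazeIndex.COMBINE, out Ray gazeRay))
        {
            Debug.Log($"avatar {transform.position} | " +
                      $"gaze origin {gazeRay.origin}, dir {gazeRay.direction}");
        }
    }
}
```

In a real study the data would be written to a timestamped file rather than the console, but the structure — one avatar pose plus one gaze ray per frame — would stay the same.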
Unity and C#; alternatively Matlab or Python.
One semester or longer.
Developing code and consulting with the advisor on a pseudo-regular basis. Work can be done from home; when the coding involves working closely with the HTC Vive headset, it can be done at UT. The PI provides a separate office for this work as a precaution during the Covid-19 pandemic.