BrainGenix Environment Rendering System (ERS) is a distributed multi-GPU rendering system that implements virtual body and environment simulation. ERS will perform sensory translation, converting simulated sensory data into action potentials that can be sent directly to the emulation. ERS is designed to be used in conjunction with other BrainGenix systems, but can also be used as a standalone game engine. For more information, see our Technical Specifications document.


Our primary goal with ERS is to allow emulations to be embodied in virtual, controllable environments. To become functioning individuals, emulations must be able to interact with the world; environments produced by ERS will give them the chance to be immersed in one. Emulations will be connected to virtual avatars that provide access to sensory input through neuroprosthetic-like feedback. ERS will also accept action potentials from the emulated brain and translate them into skeletal movement.
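As a minimal illustrative sketch of the translation loop described above (not the actual ERS API; every name and parameter here is hypothetical), sensory translation could rate-code a stimulus intensity as spike times over a time window, while motor decoding could invert that code, mapping a spike train back to a joint angle:

```python
def encode_intensity_to_spikes(intensity, duration_ms, max_rate_hz=100.0):
    """Rate-code a normalized stimulus intensity (0.0-1.0) as evenly
    spaced spike times (in ms) within a window. A deliberate
    simplification: real sensory coding is far richer than this."""
    rate_hz = max(0.0, min(1.0, intensity)) * max_rate_hz
    n_spikes = int(rate_hz * duration_ms / 1000.0)
    if n_spikes == 0:
        return []
    interval = duration_ms / n_spikes
    return [round(i * interval, 3) for i in range(n_spikes)]

def decode_spikes_to_angle(spike_times, duration_ms,
                           max_rate_hz=100.0, max_angle_deg=90.0):
    """Invert the rate code: spike count over the window maps
    linearly to a joint angle command in degrees."""
    rate_hz = len(spike_times) * 1000.0 / duration_ms
    return min(rate_hz / max_rate_hz, 1.0) * max_angle_deg

# Half-intensity stimulus over a 100 ms window -> 50 Hz spike train.
spikes = encode_intensity_to_spikes(0.5, duration_ms=100)
# Decoding the same train yields a half-range joint command.
angle = decode_spikes_to_angle(spikes, duration_ms=100)
```

The symmetry here is the point: encoding and decoding share one rate convention, so a signal sent into the emulation and a command read back out stay in consistent units.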


ERS is constantly evolving as new features are added. To date, ERS renders 3D models that can be scaled and flexibly adjusted. Keep up with our current progress on the ERS GitHub page and the Trello Board.