InMoov Arm: For our first project, we 3D printed and assembled an open-source robotic arm model. Fishing line runs through the fingers to a set of servos in the forearm. The servos are driven by Arduino and Raspberry Pi boards reading a MyoWare EMG sensor attached to a human forearm via electrodes, so that the fingers move in response to muscle contractions. The arm was presented at the Imagine RIT Fair in April 2022, where it experienced some technical issues; we plan to fix those issues and finish the code so that the arm can finally be controlled via EMG. (This will be the introductory project running alongside the foot drop effort.)
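The core control loop amounts to thresholding the EMG signal and commanding the servos accordingly. A minimal sketch of that idea is below; the threshold values, angles, and function names are illustrative assumptions, not the team's actual firmware.

```python
# Hypothetical EMG-to-servo mapping for the InMoov hand.
# Assumes a single MyoWare channel sampled through an ADC as an integer reading.

REST_LEVEL = 120      # typical ADC reading with the muscle relaxed (calibrate per user)
FLEX_THRESHOLD = 300  # reading that counts as an intentional contraction

OPEN_ANGLE = 0        # servo angle for an open hand
CLOSED_ANGLE = 170    # servo angle for a closed fist

def servo_angle(emg_reading: int) -> int:
    """Map a raw EMG reading to a servo angle: a flex closes the hand."""
    if emg_reading >= FLEX_THRESHOLD:
        return CLOSED_ANGLE
    return OPEN_ANGLE
```

In practice the threshold would be calibrated per wearer, since resting EMG levels vary between people and electrode placements.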
Orthotic Foot Drop: Following our success at the NeXT Pitch Competition, we would like to begin the R&D process for the foot drop orthotic proposed in our pitch. This semester we will focus primarily on researching the technology needed to make it possible, continuing the efforts of NXT's former fabrics team.
Joint Design: After several weeks of CAD projects, we will begin joint design projects. This will let new and returning members familiarize themselves with CAD and allow for more hands-on work while we research the foot drop orthotic.
Team Lead: Luca Chiossone
Team Lead: Eric Choi
Team Lead: Leanna Frasch
The NeuroTechnology Games Team is a group dedicated to combining neurotechnology with game design to create more fun, immersive, and interesting experiences. Our work has two main sides: one develops the neurotech software, and the other builds the games in the Unity game engine.
In the past, we fully built a racing game that implements electromyography: the player turns an on-screen car left and right by flexing their right and left arms, respectively. The game was showcased at the Strong Museum of Play.
Currently, we are finishing the integration of the neurotech software into Unity. Once that is done, we plan to build more games using electromyography and to research electroencephalography so we can begin work on EEG games. We see all of these games as stepping stones to more complex ideas and concepts.
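Turning two EMG channels into a steering input usually means computing a smoothed envelope of each channel and comparing them. The sketch below shows one way this could work; the envelope method, dead zone, and channel conventions are assumptions for illustration, not the team's actual pipeline.

```python
# Illustrative mapping from two raw EMG channels to a steering value in [-1, 1].

def envelope(samples):
    """Rectify and average a window of raw EMG samples (a simple envelope)."""
    return sum(abs(s) for s in samples) / len(samples)

def steering(left_window, right_window, deadzone=0.05):
    """Positive = steer one way, negative = the other, based on which arm flexes harder."""
    left = envelope(left_window)
    right = envelope(right_window)
    total = left + right
    if total == 0:
        return 0.0  # no muscle activity at all: drive straight
    value = (right - left) / total  # normalized difference, bounded in [-1, 1]
    return 0.0 if abs(value) < deadzone else value
```

Inside Unity the same computation would run per frame, with the returned value fed into the car's steering axis.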
Team Lead: John Haley
Team Lead: Mike Elia
Team Lead: Hillary Le
The Neurological Database for Machine Learning (or NDML for short) research team is working to make advancements in neurotech easier by constructing a comprehensive database of EEG scans over a consistent set of subjects performing a varied set of tasks that involve activity in many areas of the brain. We are also working to make advancements in AI/ML neurotechnology easier by ensuring that the database is optimized for AI/ML work, with clear data labeling, organization, and preprocessed versions, as well as by creating and training some AI models ourselves.
Team Lead: Bryce Gernon
NeurGear is a company developing an ultrasound wearable device to enhance focus and memory. It was founded in 2021 by Jon Hacker, then Vice President of the club, and NXT members are given the opportunity to apply for internships with the active neurotech startup. This is a real chance to put your skills to the test and build real-life experience in the field. Find more information about the company at www.neurgear.com, or at @neurgear on Instagram.
Those interested need only contact Jon Hacker at firstname.lastname@example.org to apply and join us in changing the world, one brain at a time.
Team Lead: Jon Hacker
Thought Keyboard Project
The Neurotechnology Exploration Thought Keyboard project will focus on the development of a hands-free virtual keyboard and mouse. This will be the start of a process towards making simple interfaces that bring the kind of accessibility and ease of use that only Brain Computer Interfaces can offer to all sorts of devices.
This research project has gone through many iterations, including small side projects like the Wizard101 interface. Eventually we decided that the interface shouldn't be confined to any one application and should simply act as a digital alternative to its physical counterpart. After some redesigns, we settled on a virtual overlay that appears over whatever activity is ongoing on the device to handle typing, selection, or any other input. Specifically, the overlay will first present a mouse, intended for selecting or for prompting the radial keyboard (circular and expanding).
Right now, we are using Kivy to develop an app for PC and phones for this purpose. We are focused on the software development of the UI and on making it future-proof in case its scope grows. Soon we will move on to implementing biosensing technology and SSVEP to realize the other half of the project: allowing the interface to receive input from the brain. The plan is to have the overlay follow the directional and positional intent of the user. From there, this flexible interface can be used for many things, from browsers to games.
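SSVEP works by flickering each on-screen option at a distinct frequency; the EEG over the visual cortex then shows elevated power at the frequency the user is looking at. A minimal detection sketch using the Goertzel algorithm is below; the sample rate and stimulus frequencies are illustrative, and a real system would add filtering and harmonics.

```python
import math

def goertzel_power(samples, sample_rate, target_freq):
    """Power of a single frequency bin, computed via the Goertzel algorithm."""
    n = len(samples)
    k = round(n * target_freq / sample_rate)   # nearest DFT bin
    coeff = 2 * math.cos(2 * math.pi * k / n)
    s_prev = s_prev2 = 0.0
    for x in samples:
        s = x + coeff * s_prev - s_prev2
        s_prev2, s_prev = s_prev, s
    return s_prev2 ** 2 + s_prev ** 2 - coeff * s_prev * s_prev2

def detect_target(samples, sample_rate, stim_freqs):
    """Return the stimulus frequency with the strongest response in the EEG window."""
    return max(stim_freqs, key=lambda f: goertzel_power(samples, sample_rate, f))
```

With options flickering at, say, 8, 10, 12, and 15 Hz, running `detect_target` on a one-second EEG window would pick out which option the user is attending to.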
Team Lead: Antione Abraham
The wheelchair team has a motorized wheelchair that we are converting to take input from a microcontroller. Once that works, we will proceed in stages toward making the wheelchair work with our EEG headsets. The purpose of this project is to help people with ALS better control their own wheelchairs.
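A staged approach like this typically starts by mapping a small set of discrete commands (the kind an EEG classifier would eventually emit) to wheel speeds on the microcontroller. The command names and speed values below are assumptions for the sketch, not the team's actual interface.

```python
# Hypothetical mapping from discrete commands to (left, right) wheel speeds
# for a differential-drive wheelchair, normalized to [0, 1].

COMMANDS = {
    "stop":    (0.0, 0.0),
    "forward": (0.5, 0.5),
    "left":    (0.2, 0.5),   # slow the left wheel to turn left
    "right":   (0.5, 0.2),   # slow the right wheel to turn right
}

def wheel_speeds(command: str):
    """Return (left, right) wheel speeds; any unrecognized command stops the chair."""
    return COMMANDS.get(command, (0.0, 0.0))
```

Defaulting unknown input to a full stop is the important design choice here: with a noisy EEG classifier in the loop, the safe failure mode is always no motion.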
Team Lead: Alex Burbano