What’s the future of an AR-VR headset that connects to your brain?


The fields of virtual reality (VR) and augmented reality (AR) are changing quickly, with major improvements on the horizon. Hand and eye tracking are increasingly being prioritized over physical controllers as inputs, but even more exciting possibilities lie ahead, especially in the area of neural connections. OpenBCI, a Brooklyn-based company, is at the vanguard of building tools for non-invasive brain-computer interface (BCI) technology. The company has poured its expertise in sensing systems into a mixed-reality headset called Galea, an AR-VR headset that is tied to the brain. The headset is expected to come out later this year, offering a new and immersive experience.

Brain-Linked AR-VR Headset: A New Sensor Platform for Virtual Reality

Galea, built by OpenBCI, combines many different sensors into one complete system, including electroencephalography (EEG), electromyography (EMG), electrodermal activity (EDA), photoplethysmography (PPG), and eye tracking.
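
To make the idea of a unified, multi-sensor stream concrete, here is a minimal Python sketch of how one synchronized sample from such a headset might be represented. The field names, channel counts, and units are illustrative assumptions, not OpenBCI's actual data format.

```python
from dataclasses import dataclass, field
from typing import List, Tuple

@dataclass
class SensorFrame:
    """One synchronized sample from a hypothetical multi-sensor headset stream.

    Field names, channel counts, and units are illustrative assumptions,
    not OpenBCI's published data format.
    """
    timestamp: float                                     # seconds since stream start
    eeg_uv: List[float] = field(default_factory=list)    # EEG channels, microvolts
    emg_uv: List[float] = field(default_factory=list)    # facial EMG channels, microvolts
    eda_us: float = 0.0                                   # skin conductance, microsiemens
    ppg_raw: float = 0.0                                  # raw optical PPG value
    gaze_xy: Tuple[float, float] = (0.0, 0.0)             # normalized eye-tracking coordinates

# Example: a single made-up frame
frame = SensorFrame(
    timestamp=12.504,
    eeg_uv=[4.1, -2.3, 7.8, 0.5],
    emg_uv=[15.2, 3.4, 22.7],
    eda_us=2.35,
    ppg_raw=51234.0,
    gaze_xy=(0.48, 0.52),
)
print(frame)
```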

EEG sensors measure the brain's electrical activity. OpenBCI uses rubber-tipped electrodes that sit close to the scalp and work well even when dry, although keeping hair out of the way matters for the best signal clarity.
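
As a rough illustration of what acquiring EEG looks like in practice, the sketch below uses BrainFlow, an open-source streaming library commonly used with OpenBCI hardware, running its built-in synthetic board so it works without any device attached. It shows the general pattern of pulling a few seconds of EEG, not Galea's specific setup.

```python
import time
from brainflow.board_shim import BoardShim, BoardIds, BrainFlowInputParams

# Use BrainFlow's synthetic board so the sketch runs without hardware.
board_id = BoardIds.SYNTHETIC_BOARD.value
board = BoardShim(board_id, BrainFlowInputParams())

board.prepare_session()
board.start_stream()
time.sleep(5)                      # collect roughly five seconds of data
data = board.get_board_data()      # 2-D array: rows are channels, columns are samples
board.stop_stream()
board.release_session()

eeg_channels = BoardShim.get_eeg_channels(board_id)
fs = BoardShim.get_sampling_rate(board_id)
print(f"{len(eeg_channels)} EEG channels at {fs} Hz, {data.shape[1]} samples captured")
```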

EMG sensors, by contrast, measure the electrical activity of nerves and muscles. In Galea, they are built into the headset's facemask, positioned around the forehead, eyes, and cheeks.

Even the smallest movements of the user's facial muscles are picked up by Galea's sensors and turned into measurable readings. This is different from the Quest Pro and other VR headsets, which rely on inward-facing cameras to track specific facial movements. Because Galea's readings are purely electrical, it can register movements so small they amount to little more than nerve signals.
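
Turning raw facial EMG into a "measurable reading" typically means rectifying the signal and smoothing it into an envelope that tracks muscle effort. The snippet below is a generic sketch of that step on synthetic data; it is not Galea's actual processing pipeline.

```python
import numpy as np

def emg_envelope(emg_uv: np.ndarray, fs: float, window_s: float = 0.1) -> np.ndarray:
    """Rectify an EMG trace and smooth it with a moving average.

    A common first step for turning raw muscle activity into a usable
    intensity reading; real pipelines usually band-pass filter first.
    """
    rectified = np.abs(emg_uv - np.mean(emg_uv))     # remove DC offset, then rectify
    win = max(1, int(window_s * fs))
    kernel = np.ones(win) / win
    return np.convolve(rectified, kernel, mode="same")

# Synthetic example: a brief muscle burst buried in baseline noise
fs = 1000.0
t = np.arange(0, 2, 1 / fs)
signal = np.random.normal(0, 2, t.size)              # baseline noise (microvolts)
signal[800:1000] += np.random.normal(0, 30, 200)     # simulated facial-muscle burst

env = emg_envelope(signal, fs)
print("peak envelope (uV):", round(float(env.max()), 1))
```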

Meta is also working on EMG technology, in the form of wristbands intended as input devices for its future AR glasses and headsets. That wrist-based approach, however, only picks up finger and hand movements through the wrist; OpenBCI's sensors are designed specifically to track facial movements.

Galea also has built-in EDA sensors, which record the electrical changes produced by sweat on the skin. This adds another layer of information for a fuller picture of how the user's body is reacting.

Other Important Things to Know

EDA sensors are perhaps best known from Fitbit's Sense smartwatch, where they are typically used to gauge stress levels. OpenBCI has built EDA sensors into the face-facing portion of the Galea headset, where they can serve the same purpose.
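
A common way to turn an EDA trace into something stress-related is to separate the slow "tonic" skin-conductance level from the faster "phasic" responses that ride on top of it. The sketch below does this with a simple moving average on synthetic data; it illustrates the general technique, not Fitbit's or OpenBCI's algorithm.

```python
import numpy as np

def split_eda(eda_us: np.ndarray, fs: float, tonic_window_s: float = 4.0):
    """Split an EDA trace (microsiemens) into tonic level and phasic activity."""
    win = max(1, int(tonic_window_s * fs))
    kernel = np.ones(win) / win
    tonic = np.convolve(eda_us, kernel, mode="same")   # slow baseline level
    phasic = eda_us - tonic                            # fast responses ride on top
    return tonic, phasic

# Synthetic trace: slowly rising baseline plus two sharp arousal responses
fs = 4.0                                   # EDA is typically sampled slowly
t = np.arange(0, 120, 1 / fs)
eda = 2.0 + 0.005 * t + np.random.normal(0, 0.01, t.size)
for onset_s in (30, 80):
    idx = int(onset_s * fs)
    n = int(5 * fs)
    eda[idx:idx + n] += 0.3 * np.exp(-np.linspace(0, 3, n))

tonic, phasic = split_eda(eda, fs)
responses = int(np.sum((phasic[1:] > 0.1) & (phasic[:-1] <= 0.1)))  # threshold crossings
print("tonic level (uS):", round(float(tonic.mean()), 2), "| phasic responses:", responses)
```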

PPG (photoplethysmography) measures heart rate using light, the same method most smartwatches use. In Galea's final form, PPG measurements taken at the forehead also play a part.

The Galea sensor array pairs with AR-VR hardware already on the market, such as the Varjo XR-3 and the less expensive Varjo Aero. For the system to work, it must be connected to a computer that runs the software and analyzes the data.
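
Returning to PPG for a moment: estimating heart rate from it essentially comes down to counting pulse peaks in the optical signal over a stretch of time. The snippet below shows that idea on a synthetic waveform; real devices add filtering and motion-artifact rejection that is omitted here.

```python
import numpy as np

def heart_rate_from_ppg(ppg: np.ndarray, fs: float, min_interval_s: float = 0.4) -> float:
    """Estimate beats per minute by counting upward threshold crossings."""
    centered = ppg - np.mean(ppg)
    threshold = 0.5 * np.max(centered)
    refractory = int(min_interval_s * fs)   # ignore re-crossings within 0.4 s of a beat
    beats, last_beat = 0, -refractory
    for i in range(1, len(centered)):
        crossed_up = centered[i - 1] <= threshold < centered[i]
        if crossed_up and i - last_beat >= refractory:
            beats += 1
            last_beat = i
    duration_s = len(ppg) / fs
    return 60.0 * beats / duration_s

# Synthetic PPG: roughly 72 beats per minute plus a little noise
fs = 100.0
t = np.arange(0, 30, 1 / fs)
ppg = np.sin(2 * np.pi * (72 / 60) * t) + np.random.normal(0, 0.05, t.size)

print("estimated heart rate:", round(heart_rate_from_ppg(ppg, fs), 1), "bpm")
```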

Varjo's high-resolution displays and passthrough-video mixed reality complement OpenBCI's sensor array, and they open up a wide range of software options for VR and AR scenarios. It is worth remembering, though, that OpenBCI's sensors are versatile and can also be used without a VR headset at all.

The Apple Vision Pro is interesting to OpenBCI, mostly because of its processing power and its ability to run as a standalone device. Conor Russomanno, OpenBCI's CEO and co-founder, sees the potential for working with systems like the Vision Pro or future AR and VR platforms, and he draws a parallel between Apple's recent focus on the computing side of mixed reality and how OpenBCI views the possibilities in this field.

Accessibility Goals for the Brain-Linked AR-VR Headset

The OpenBCI sensor array can be applied to many different possibilities at once. Rather than serving a single goal, its sensors have the potential to support both research and computer interaction. In one important collaboration, Christian Beyerlein, a hacker with spinal muscular atrophy, used OpenBCI's sensor array to control a drone with facial muscle impulses.

The demonstration, presented as a TED Talk, showed how brain-computer interfaces can make both virtual and real-world technologies more accessible and give people more control over them.

EMG technology picks up extremely small electrical signals, precise enough to register muscle activity you cannot even see. At that level of sensitivity, the sensors, the algorithms, and the person using them may need to be tuned together over time before everything works smoothly. OpenBCI's wide array of sensors also has the potential to collect a large amount of data that could guide future research and lead to new interfaces.
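
The tuning described above often begins with something as simple as calibrating a per-user threshold: record a few seconds of relaxed baseline, then treat envelope values well above it as an intentional gesture. The sketch below illustrates that idea mapped to a hypothetical drone command; the command names, threshold rule, and numbers are illustrative assumptions, not the interface used in the demonstration.

```python
import numpy as np

def calibrate_threshold(baseline_envelope: np.ndarray, k: float = 4.0) -> float:
    """Set a per-user trigger level from a few seconds of relaxed-face EMG envelope."""
    return float(np.mean(baseline_envelope) + k * np.std(baseline_envelope))

def envelope_to_command(envelope_value: float, threshold: float) -> str:
    """Map a smoothed facial-EMG reading to a hypothetical drone command."""
    return "ascend" if envelope_value > threshold else "hover"

# Calibration phase: the user relaxes while baseline activity is recorded (synthetic here).
baseline = np.abs(np.random.normal(0, 2, 2000))      # ~2 s of relaxed-face envelope, in uV
threshold = calibrate_threshold(baseline)

# Runtime phase: a deliberate cheek or brow contraction produces a much larger envelope.
for reading in (1.5, 2.0, 35.0, 40.0, 1.8):
    print(f"{reading:5.1f} uV -> {envelope_to_command(reading, threshold)}")
```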

A Sensor Platform With Potential Beyond Headsets

Galea, made by OpenBCI, is both a VR and an AR headset, but its sensor array can also be used independently, which is a big part of why it pairs so well with Varjo's hardware. That flexibility becomes especially interesting when you imagine a future in which wearable devices talk to each other and enhance our daily interactions.

Although that vision of advanced, sensor-laden wearables is still a long way off, the sensor technologies OpenBCI has built into Galea look like first steps toward it. For now, persuading people of the value of VR, AR, and wearable visual technology remains a challenge.

The key to the development of VR/AR into something profoundly significant, albeit potentially unsettling, may lie in enhancing our interactions with spatial computing and the physical world. As personal technology keeps getting better, it seems to be getting closer to our feelings and brains, but what we’ve seen so far is just the beginning of what’s possible.
