A new breakthrough in human brain-computer interfaces: is Apple's Vision Pro headset losing its appeal?

Mondo Technology, updated on 2024-02-01

Reporter: Light Pillar

Musk has bitterly mocked Apple's new AR headset, the Vision Pro, which costs as much as $3,500.

He posted an image with the new AR headset on one side and a plastic bag full of psychedelic mushrooms on the other, captioned "Chemically alter your carbon vibrations for instant contact with UFOs and aliens." Pitting $20 worth of "augmented reality" against Apple's $3,500 offering is undoubtedly hilarious.

In fact, he may already have new ideas of his own about AR, and he has put them into practice.

On January 30, Beijing time, Musk announced on the social platform X that the first human had received an implant from his brain-computer interface company Neuralink the day before and was recovering well. Preliminary results show promising neuron spike detection.
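For readers unfamiliar with the term, "neuron spike detection" generally means picking out the brief voltage deflections that individual neurons produce in an electrode recording. The sketch below shows the textbook thresholding approach; the sampling rate, noise level, and threshold multiplier are illustrative assumptions, and it says nothing about Neuralink's actual pipeline.

```python
# Minimal, generic spike detector: scan an electrode voltage trace for brief
# excursions past a noise-based threshold. Textbook approach, not Neuralink's.
import numpy as np

def detect_spikes(voltage: np.ndarray, k: float = 4.5) -> np.ndarray:
    """Return sample indices where the trace first crosses a robust threshold.

    voltage: extracellular recording (microvolts);
    k: threshold multiplier on the estimated noise level.
    """
    # Robust noise estimate from the median absolute deviation.
    noise = np.median(np.abs(voltage)) / 0.6745
    threshold = k * noise
    above = np.abs(voltage) > threshold
    # Keep only the first sample of each crossing (rising edge).
    return np.flatnonzero(above & ~np.roll(above, 1))

# Synthetic example: 1 second of noise at 30 kHz with three injected "spikes".
rng = np.random.default_rng(0)
fs = 30_000
trace = rng.normal(0, 5, fs)          # ~5 uV noise floor
for idx in (5_000, 12_000, 25_000):   # fake spike times
    trace[idx:idx + 30] -= 80         # 80 uV negative deflection, ~1 ms
print(detect_spikes(trace))           # roughly [5000, 12000, 25000]
```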

In recent years, a group of ambitious brain scientists, with Musk as their spiritual leader, have set out to direct a surgical robot to place a coin-sized implant into the human brain, aiming to create a class of superhumans with enhanced memory and immortal, half-human, half-machine minds. But this road has not been easy to walk.

The concept of the brain-computer interface originated with Jacques Vidal, a computer scientist at the University of California, Los Angeles, who proposed in 1973 that electrodes placed on the scalp could detect the brain's signals in real time and translate them into commands for controlling a computer.
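To make Vidal's idea concrete, here is a minimal Python sketch of the kind of pipeline he envisioned: filter a scalp (EEG) signal into a frequency band, measure its power, and map that to a simple command. The sampling rate, the alpha band, the threshold, and the "select"/"idle" commands are invented for illustration and do not correspond to any specific product.

```python
# Illustrative scalp-EEG pipeline: band-power feature -> toy command.
import numpy as np
from scipy.signal import butter, filtfilt

FS = 250  # sampling rate in Hz (typical of consumer EEG gear)

def bandpower(eeg: np.ndarray, low: float, high: float, fs: int = FS) -> float:
    """Average power of the signal inside a frequency band."""
    b, a = butter(4, [low / (fs / 2), high / (fs / 2)], btype="band")
    filtered = filtfilt(b, a, eeg)
    return float(np.mean(filtered ** 2))

def decode_command(eeg: np.ndarray, threshold: float = 2.0) -> str:
    """Map a 1-second EEG window to a toy command.

    High alpha-band (8-12 Hz) power is associated with relaxed states;
    here we arbitrarily treat it as a 'select' signal.
    """
    alpha = bandpower(eeg, 8.0, 12.0)
    return "select" if alpha > threshold else "idle"

if __name__ == "__main__":
    # Synthetic 1-second window: a 10 Hz oscillation plus noise.
    t = np.arange(0, 1, 1 / FS)
    window = 3.0 * np.sin(2 * np.pi * 10 * t) + 0.3 * np.random.randn(t.size)
    print(decode_command(window))  # likely prints "select"
```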

The year 2023 marked the 50th anniversary of the brain-computer interface concept, and it brought rapid progress in both theoretical research and device development: Musk's Neuralink was approved to run clinical trials of devices implanted into the human brain in the United States; research teams at Stanford University and the University of California, San Francisco used different types of implants to collect electrical signals from volunteers' brains and decode them with different algorithms; and BitBrain developed wearable brain-sensing devices that monitor EEG signals with the help of artificial intelligence.

The global brain-computer interface market will continue to expand in the coming years, driven mainly by growing demand for assistive technology, advances in neurotechnology and machine learning, integration with virtual and augmented reality systems, and collaboration with AI, according to a market research report released this past October.

As for market prospects, McKinsey has also offered an estimate: the potential market for medical applications of brain-computer interfaces worldwide could reach $40 billion in 2030 and $145 billion in 2040. Of that, serious-illness care focused on treating central nervous system diseases is expected to account for $15 billion in 2030 and $85 billion in 2040, while consumer health applications focused on emotion assessment and intervention are expected to account for $25 billion in 2030 and $60 billion in 2040.

In the A-share market, human brain engineering concept stocks, which had recently cooled, were buoyed by the news today and saw a long-awaited rally.

Apple's Vision Pro headset hides a brain-computer interface

When Apple's Vision Pro came out, the most surprising thing was that it actually hides a kind of brain-computer connection. After the headset's release, Sterling Crispin, a former researcher at Apple, reviewed his work there on Twitter and revealed much of the advanced technology behind Vision Pro. Most of Crispin's work at Apple is covered by non-disclosure agreements and spanned a wide range of topics and methods, but some of it has since been patented by Apple and can therefore be discussed.

According to Crispin, a great deal of trickery goes into making specific features possible. The coolest is the ability to predict that a user is about to tap on something before they actually click on it.

The main principle behind this technology is that the pupil reacts before a person acts, partly because you anticipate what will happen after you click. Apple can therefore monitor eye behavior with algorithms and redesign the UI in real time to elicit more of this anticipatory pupil response, creating a biofeedback loop with the individual's brain. This is, in effect, a rudimentary brain-computer interface implemented through the eye.
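As a rough illustration of that principle (and only that; Apple's actual algorithms are not public), a detector of this kind might watch for a quick rise in pupil diameter relative to a recent baseline and treat it as a sign that the user is about to select what they are looking at. The window length, threshold, and sample stream below are invented.

```python
# Toy anticipatory-pupil detector: flag a probable "about to select" moment
# when pupil diameter rises quickly over a rolling baseline.
from collections import deque

class PupilIntentDetector:
    def __init__(self, baseline_len: int = 30, rise_threshold: float = 0.08):
        # baseline_len: number of recent samples used as the resting baseline
        # rise_threshold: fractional dilation over baseline that counts as intent
        self.baseline = deque(maxlen=baseline_len)
        self.rise_threshold = rise_threshold

    def update(self, pupil_diameter_mm: float) -> bool:
        """Feed one gaze-tracker sample; return True if dilation suggests
        the user is about to select whatever they are looking at."""
        if len(self.baseline) < self.baseline.maxlen:
            self.baseline.append(pupil_diameter_mm)
            return False
        mean_baseline = sum(self.baseline) / len(self.baseline)
        intent = pupil_diameter_mm > mean_baseline * (1 + self.rise_threshold)
        self.baseline.append(pupil_diameter_mm)
        return intent

# Usage with a made-up stream of pupil diameters (mm):
detector = PupilIntentDetector()
samples = [3.0] * 30 + [3.1, 3.2, 3.3]
print([detector.update(s) for s in samples][-3:])  # e.g. [False, False, True]
```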

"It's a lot of work, and it's something I'm proud of," Crispin said. Invasive brain surgery at any time. As you can see, Apple's headset uses a lot of neuroscience to develop some of these key features, using different techniques to determine what a person is going to do.
