By midyear, Elon Musk’s Neuralink had raised another USD 650 million, lifting its valuation to USD 9 billion. The company’s clinical focus, once limited to treating severe motor disabilities such as amyotrophic lateral sclerosis (ALS) and spinal cord injuries, has expanded to include functional restoration for patients with aphasia and blindness.
In August, Sam Altman and his team began building a new brain-computer interface (BCI) startup, Merge Labs, with plans to raise USD 250 million from OpenAI. The company is developing ultrasound-based technology that can read and modulate brain signals.
By November, Synchron had closed a USD 200 million Series D round, while Paradromics secured approval from the US Food and Drug Administration to begin human clinical trials for speech restoration using BCIs.
Compared with the scale of investment behind these overseas ventures, most Chinese BCI startups have taken a pragmatic, low-profile approach. Operating under the country’s stringent medical device regulations, they are advancing steadily by leveraging domestic clinical resources and medical expertise to refine products and push trials forward.
Among those making progress, implantable BCI systems from companies such as Neuracle and StairMed have been fast-tracked through the National Medical Products Administration’s (NMPA) “innovative green channel,” allowing them to begin human trials to verify safety and efficacy.
As artificial intelligence continues to evolve, the idea of human-machine integration is gaining momentum. From treating complex diseases and restoring lost functions to augmenting human abilities, questions arise: how far are BCIs from these milestones? Could they become the bridge connecting carbon-based and silicon-based life? And how might single-cell data from the brain’s cortex raise the ceiling of BCI applications?
To explore these issues, 36Kr interviewed Li Xue and Zhao Zhengtou, researchers at the Center for Excellence in Brain Science and Intelligence Technology under the Chinese Academy of Sciences. Both are also co-founders of StairMed.
In March, StairMed’s invasive BCI system was used in China’s first prospective clinical trial at Fudan University’s Huashan Hospital. In November, the system entered the NMPA’s green channel, potentially shortening the time from clinical validation to market approval.
The following transcript has been edited and condensed for brevity and clarity.
36Kr: What boundaries exist for BCI applications?
Zhao Zhengtou (ZZ): There are three main categories of BCI applications.
First are systems that transmit information outward from the brain. These decode motor intentions in paralyzed patients or speech intentions in those who have lost the ability to speak, converting them into external outputs that enable movement or communication.
Second are neuromodulation interfaces, which write information into the central or peripheral nervous system to regulate abnormal neural activity. Examples include using deep brain stimulation to treat Parkinson’s disease or spinal cord stimulation to relieve chronic pain.
Third are sensory reconstruction interfaces, such as restoring hearing or vision. For patients who have lost sensory input, BCIs can convert external information into electrical signals that are fed back into the brain to recreate perception.
At StairMed, we are developing a BCI platform that combines precise neural sensing and writing, signal processing, wireless transmission, and decoding-encoding technologies. As a platform company, our goal is to maximize the value of BCIs across multiple application scenarios.
36Kr: The audience interested in BCIs has expanded from the medical community to the tech and consumer sectors. How do you see the timeline for clinical and consumer adoption?
ZZ: Within the next three to five years, BCIs will prove their clinical value by improving patients’ quality of life and functional independence.
In five to ten years, they will begin to show potential in consumer health. Because BCIs enable direct communication between humans and machines, they could redefine human-computer interaction and dramatically enhance efficiency.
Eventually, as BCIs connect with rapidly evolving AI systems and hardware, it will become possible to control intelligent devices using thought alone. At that point, BCIs will link the brain with external devices, bringing human-machine fusion into everyday life.
36Kr: What are the target indications for StairMed’s first implantable wireless BCI system? What value do you hope to deliver to patients?
ZZ: Our first implantable wireless BCI system targets patients with severe motor impairments caused by spinal cord injury, ALS, or stroke.
The implant allows them to control electronic or physical devices such as robotic arms or wheelchairs directly through brain signals.
Our goal is to improve patients’ quality of life. This includes basic activities like turning over or holding a cup, as well as connecting to the digital world to play games, send emails, and manage bank accounts.
We also want to help patients regain productivity. For instance, someone with an engineering background could use BCIs for 3D modeling, or an experienced e-commerce seller could manage an online store. Beyond restoring independence, BCIs can help patients return to the workforce and feel needed again.
36Kr: What does the system look like, and how is it implanted?
Li Xue (LX): The core component is our ultra-flexible electrode array, which we have spent ten years developing. Each electrode filament is about one-hundredth the thickness of a human hair, with a width comparable to that of a hair strand. The force generated when the electrode bends is comparable to the interaction force between two cells. This minimizes displacement as the body moves, allowing stable long-term signal collection from the same neural site.
The procedure involves thinning a small section of the skull by about five millimeters to create a bone groove where a coin-sized implant can fit. Then, through a small puncture in the skull, the flexible electrodes are inserted five to eight millimeters beneath the cortical surface. According to neurosurgeons, this is closer to a puncture procedure than to traditional open-skull surgery.
The electrodes record single-neuron activity from the cortex with extremely low latency, on the order of tens of milliseconds. Since natural brain-to-hand signaling already takes about 100 milliseconds, users barely perceive any delay. With this system, patients can use thought alone to control cross-platform devices such as smartphones, computers, or wheelchairs.
On December 4, StairMed unveiled its second-generation high-throughput wireless invasive BCI system. The new version increases the electrode channel count to 256 and expands applications beyond motor control to include speech reconstruction.
36Kr: StairMed seems to place great emphasis on single-cell brain datasets. Why? Can this be seen as a key part of a data flywheel for brain science?
ZZ: Exactly. We often describe the BCI ecosystem as an inverted pyramid. At the bottom are the neural interfaces; above them is full-system development, then clinical accessibility, and at the top is neuroscience, which represents the understanding of the brain itself. The degree to which we understand the brain determines how far we can develop BCIs.
The foundational interface system makes clinical data collection possible, and these single-cell datasets are the key to advancing cognitive science. In turn, that deeper understanding helps us design better products and discover new applications.
Among BCI platform companies, the one that accumulates the largest and richest brain datasets will be best positioned to define the next wave of applications and technologies.
For example, the long-running US BCI project BrainGate has implanted systems in about 40 to 50 patients, generating limited but highly valuable data. That data has already powered research breakthroughs in thought-based typing and speech decoding. Now imagine scaling that from dozens to thousands or even tens of thousands of implantations.
36Kr: What role do AI and algorithms play in this process?
ZZ: Once a large dataset is accumulated for a specific brain region and task, we can train a foundation model using those brain signals, similar to how large language models are trained. This model can then be applied to new patients, giving them stronger baseline decoding and control performance.
Neural network algorithms are well suited for processing complex data. At the moment, the control signals we decode are relatively simple, say, two-dimensional cursor movement or a few degrees of freedom of motion control. That means smaller models, such as recurrent neural networks with around 100,000 parameters, are sufficient.
In the future, as we scale to higher throughput and more complex controls, both model complexity and parameter counts will rise. I believe AI’s current progress is more than sufficient to meet BCI needs for now.
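To give a rough sense of the scale described here, the sketch below builds a single-layer recurrent decoder that maps binned spike counts to 2D cursor velocity and lands near the 100,000-parameter mark. It is an illustration only: the 256-channel input, 20-millisecond bins, GRU architecture, and hidden size are assumptions for the example, not StairMed’s actual decoder.

```python
# Illustrative only: a compact GRU decoder mapping binned spike counts
# to 2D cursor velocity, near the ~100,000-parameter scale mentioned
# above. Channel count, bin width, and hidden size are assumed values.
import torch
import torch.nn as nn

class CursorDecoder(nn.Module):
    def __init__(self, n_channels: int = 256, hidden: int = 100):
        super().__init__()
        self.rnn = nn.GRU(n_channels, hidden, batch_first=True)
        self.readout = nn.Linear(hidden, 2)  # (vx, vy) cursor velocity

    def forward(self, spikes: torch.Tensor) -> torch.Tensor:
        # spikes: (batch, time_bins, n_channels) of binned spike counts
        features, _ = self.rnn(spikes)
        return self.readout(features)  # velocity estimate per time bin

decoder = CursorDecoder()
print(sum(p.numel() for p in decoder.parameters()))  # ~108,000 parameters

# Decode one second of activity at 20 ms bins (50 bins, batch of 1)
dummy_spikes = torch.randn(1, 50, 256)
velocity = decoder(dummy_spikes)  # shape: (1, 50, 2)
```

A model of this size runs inference comfortably within the tens-of-milliseconds latency budget mentioned earlier, which is part of why small recurrent decoders remain the practical choice for today’s low-dimensional control tasks.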
AI is also valuable on the application side. For instance, in language decoding, we don’t need to decode entire words. If we can extract a few dozen phoneme categories from neural signals, large language models can use context to predict the intended words, greatly improving communication efficiency.
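As a toy illustration of that two-stage idea, the sketch below assumes a hypothetical decoder that outputs probabilities over a small set of phoneme-like categories, and a context prior standing in for a language model that rescores candidate words. The lexicon, category set, and prior values are made up for the example and are not from any real speech-decoding system.

```python
# Toy sketch of the two-stage idea: a neural decoder emits probabilities
# over phoneme-like categories, then a language-model-style context prior
# picks the intended word. All names and values below are placeholders.
import numpy as np

# Hypothetical lexicon mapping words to their category sequences.
LEXICON = {
    "hello":  ["HH", "EH", "L", "OW"],
    "help":   ["HH", "EH", "L", "P"],
    "yellow": ["Y", "EH", "L", "OW"],
}
CATEGORIES = ["HH", "EH", "L", "OW", "P", "Y"]
CAT_INDEX = {c: i for i, c in enumerate(CATEGORIES)}

def word_score(decoded_probs: np.ndarray, word: str, context_prior: dict) -> float:
    """Combine decoder evidence with a contextual prior.

    decoded_probs: (time_steps, n_categories) probabilities from the
    neural decoder, one row per decoded unit.
    context_prior: P(word | preceding text), e.g. from a language model.
    """
    cats = LEXICON[word]
    if len(cats) != decoded_probs.shape[0]:
        return -np.inf
    log_likelihood = sum(
        np.log(decoded_probs[t, CAT_INDEX[c]] + 1e-9) for t, c in enumerate(cats)
    )
    return log_likelihood + np.log(context_prior.get(word, 1e-6))

# Simulated noisy decoder output for four units, with a context that favors "help"
rng = np.random.default_rng(0)
probs = rng.dirichlet(np.ones(len(CATEGORIES)), size=4)
prior = {"help": 0.7, "hello": 0.2, "yellow": 0.1}  # from surrounding context
best = max(LEXICON, key=lambda w: word_score(probs, w, prior))
print("predicted word:", best)
```

The point of the toy is the division of labor: the neural decoder only has to distinguish a few dozen categories, while the language model resolves the remaining ambiguity from context.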
Ultimately, the relationship between BCIs, humans, and agentic AI will be dynamic and collaborative. Sometimes we will issue high-level commands through brain signals and let the AI agent execute them, while in other cases, we will fine-tune actions ourselves for greater precision.
36Kr: Based on the performance of existing implant recipients, do you think BCIs could eventually give humans capabilities beyond the ordinary?
LX: That’s actually what motivated us to pursue this technology in the first place. We’ve always envisioned consumer-grade applications. We started with medical use because it’s a necessary path: it meets urgent clinical needs and forces us to perfect the technology.
Our hope is not just to restore motor or sensory function, but also to push the boundaries of human control. In the past, we controlled limbs with our brains. In the future, we might control external devices via BCIs, potentially surpassing normal human limits.
For example, Neuralink once released a mini game to measure cursor control speed and accuracy. Ordinary users typically reach eight to ten bits per second (BPS), while paralyzed patients using BCIs have achieved 9.5 BPS after training, faster than most healthy users. That shows BCIs can already confer an advantage in specific tasks.
Another idea: today, all device control is still constrained by continuous movement paths. To move a cursor from one corner of the screen to another, you have to drag it across the intervening space. In the future, by decoding spatial memory signals from the hippocampus, we could “teleport” the cursor instead, something impossible with current technology.
Similarly, conventional controllers or voice commands are crude tools for manipulating robotic arms with multiple degrees of freedom. BCIs, however, can encode far more nuanced neural information, allowing users to control robotic limbs as naturally as their own. Research in leading journals has already shown that trained BCI users can operate robotic arms with smooth, coordinated motion.
36Kr: With AI evolving so rapidly, many experts worry about its impact on human education, employment, and even existence. Do you think human-machine fusion is inevitable?
LX: AI is meant to serve humans, and human intent should always remain central. But we cannot rule out the possibility that AI may one day develop self-awareness.
To prevent humans from losing agency or relevance, deep integration between humans and machines is, in my view, inevitable. The key to that integration is the bridge, and that is exactly what BCIs represent: a channel for information exchange between humans and machines.
Without that bridge, silicon-based AI and carbon-based humans could drift apart. But if AI becomes part of us, or we become part of it, we could achieve a harmonious coexistence, one without communication barriers.
Think of autonomous driving: you can tell the car to go from point A to point B, and it will plan and navigate independently, yet you can always take the wheel. That is controlled evolution. What we must avoid is a future where AI or robots operate entirely beyond human control. Deep human-machine integration may be the safer, more balanced path forward.
36Kr: If BCIs become fully mature, would you personally get implanted?
LX: We’ve talked about it before. I’d love to try it myself, just to experience what it actually feels like.
KrASIA Connection features translated and adapted content that was originally published by 36Kr. This article was written by Hai Ruojing for 36Kr.
