Dataconomy
China develops SpikingBrain1.0, a brain-inspired AI model

The new LLM mimics human localized attention, runs 25–100× faster than traditional AI, and operates without Nvidia GPUs.

by Kerem Gülen
September 10, 2025
in Artificial Intelligence

Researchers at the Chinese Academy of Sciences have unveiled SpikingBrain1.0, which they describe as the world's first "brain-like" large language model (LLM). The model is designed to consume less power and to operate independently of Nvidia GPUs, addressing two key limitations of conventional AI systems.

Existing models, including ChatGPT and Meta's Llama, rely on "attention," a mechanism that compares every token in a sequence against every other token to predict the next one. While effective, this all-to-all comparison scales quadratically with sequence length, so it consumes large amounts of energy and slows processing for long texts such as books.

Traditional models also depend heavily on Nvidia GPUs, creating hardware bottlenecks for scaling.
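The article does not include implementation details, but the quadratic cost it refers to can be seen in a minimal sketch of standard scaled dot-product attention: the score matrix below is n × n, so doubling the text length quadruples the comparisons.

```python
import numpy as np

def full_attention(q, k, v):
    """Standard scaled dot-product attention: every token attends
    to every other token, so the score matrix is n x n."""
    n, d = q.shape
    scores = q @ k.T / np.sqrt(d)   # (n, n) -- quadratic in sequence length
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)  # row-wise softmax
    return weights @ v              # (n, d)

rng = np.random.default_rng(0)
n, d = 8, 4
q = rng.standard_normal((n, d))
out = full_attention(q, q, q)
print(out.shape)  # (8, 4)
```

This is a generic textbook formulation, not the architecture of any specific model named in the article.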


Brain-inspired approach

SpikingBrain1.0 uses a localized attention mechanism, focusing on nearby words rather than analyzing entire texts. This mimics the human brain’s ability to concentrate on recent context during conversations. Researchers claim this method allows the model to function 25 to 100 times faster than conventional LLMs while maintaining comparable accuracy.
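The researchers do not publish their exact mechanism in this article; a common way to implement localized attention, however, is a sliding window that masks out tokens beyond a fixed distance, as in this illustrative sketch (the `window` parameter is an assumption, not a figure from the paper).

```python
import numpy as np

def local_attention(q, k, v, window=2):
    """Sliding-window attention: each token attends only to tokens
    within `window` positions, mimicking a focus on nearby context."""
    n, d = q.shape
    scores = q @ k.T / np.sqrt(d)
    # Mask out every position outside the local window.
    idx = np.arange(n)
    mask = np.abs(idx[:, None] - idx[None, :]) > window
    scores[mask] = -np.inf               # softmax gives these zero weight
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)
    return weights @ v
```

Because each token now touches at most `2 * window + 1` neighbors instead of all n tokens, the cost grows linearly with text length rather than quadratically.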

The model runs on China’s homegrown MetaX chip platform, eliminating reliance on Nvidia GPUs. It selectively responds to input, reducing power consumption and enabling continual pre-training with less than 2% of the data needed by mainstream open-source models. The researchers note that in specific scenarios, SpikingBrain1.0 can achieve over 100 times the speed of traditional AI models.
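The "selective response" described above is the defining trait of spiking neurons: they stay silent until accumulated input crosses a threshold, so most of the network does no work most of the time. As a hedged illustration of the general idea (not SpikingBrain1.0's actual neuron model), here is a classic leaky integrate-and-fire neuron:

```python
def lif_neuron(inputs, threshold=1.0, leak=0.9):
    """Leaky integrate-and-fire neuron: the membrane potential
    accumulates input, decays by `leak` each step, and emits a
    spike (resetting to zero) only when it crosses the threshold.
    Silent steps trigger no downstream computation, which is the
    intuition behind the power savings of spiking models."""
    potential, spikes = 0.0, []
    for current in inputs:
        potential = leak * potential + current
        if potential >= threshold:
            spikes.append(1)
            potential = 0.0
        else:
            spikes.append(0)
    return spikes

print(lif_neuron([0.5, 0.5, 0.5]))  # [0, 0, 1] -- fires only on the third step
```

The `threshold` and `leak` values here are arbitrary; real spiking hardware tunes them per neuron.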

The development of SpikingBrain1.0 comes amid U.S. technology export restrictions that limit China’s access to advanced chips required for AI and server applications. These restrictions have accelerated domestic AI innovation, with SpikingBrain1.0 representing a step toward a more self-sufficient AI ecosystem.



Tags: Chinese Academy of Sciences, SpikingBrain1.0

COPYRIGHT © DATACONOMY MEDIA GMBH, ALL RIGHTS RESERVED.
