Dataconomy
China develops SpikingBrain1.0, a brain-inspired AI model

The new LLM mimics the human brain’s localized attention, runs 25–100× faster than traditional AI, and operates without Nvidia GPUs.

By Kerem Gülen
September 10, 2025
in Artificial Intelligence

Chinese researchers from the Chinese Academy of Sciences have unveiled SpikingBrain1.0, described as the world’s first “brain-like” large language model (LLM). The model is designed to consume less power and operate independently of Nvidia GPUs, addressing the energy and hardware constraints of conventional AI models.

Existing models, including ChatGPT and Meta’s Llama, rely on “attention,” a process that compares every word in a sentence to all others to predict the next word. While effective, this approach consumes large amounts of energy and slows processing for long texts, such as books, because the number of comparisons grows with the square of the text’s length.

Traditional models also depend heavily on Nvidia GPUs, creating hardware bottlenecks for scaling.
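As a rough illustration of why that all-pairs comparison is expensive, the sketch below (plain NumPy; the tiny dimensions and the absence of masking or multi-head machinery are simplifying assumptions, not details from the paper) builds the full n × n score matrix that standard attention requires:

```python
import numpy as np

def full_attention(q, k, v):
    """Standard attention: every token attends to every other token.
    The (n, n) score matrix makes cost grow quadratically with length n."""
    scores = q @ k.T / np.sqrt(q.shape[-1])          # n x n pairwise comparisons
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)   # softmax over ALL tokens
    return weights @ v                               # one output vector per token

n, d = 8, 4
rng = np.random.default_rng(0)
q = k = v = rng.standard_normal((n, d))
out = full_attention(q, k, v)
print(out.shape)  # (8, 4)
```

The n × n score matrix is the bottleneck: doubling the text length quadruples both the memory and the compute this step needs.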


Brain-inspired approach

SpikingBrain1.0 uses a localized attention mechanism, focusing on nearby words rather than analyzing entire texts. This mimics the human brain’s ability to concentrate on recent context during conversations. Researchers claim this method allows the model to function 25 to 100 times faster than conventional LLMs while maintaining comparable accuracy.
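SpikingBrain1.0’s exact mechanism is not spelled out in the announcement, but the general idea of localized (sliding-window) attention can be sketched as follows; the window size and the NumPy loop are illustrative assumptions:

```python
import numpy as np

def local_attention(q, k, v, window=2):
    """Sliding-window attention: each token attends only to itself and the
    `window` most recent tokens, so total cost grows linearly with length."""
    n, d = q.shape
    out = np.empty_like(v)
    for i in range(n):
        lo = max(0, i - window)                        # start of local context
        scores = q[i] @ k[lo:i + 1].T / np.sqrt(d)
        w = np.exp(scores - scores.max())
        w /= w.sum()                                   # softmax over the window only
        out[i] = w @ v[lo:i + 1]
    return out
```

Because each token only looks at a fixed-size window of recent context, the work per token stays constant as texts get longer, which is the kind of property behind the claimed speedups on long inputs.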

The model runs on China’s homegrown MetaX chip platform, eliminating reliance on Nvidia GPUs. It selectively responds to input, reducing power consumption and enabling continual pre-training with less than 2% of the data needed by mainstream open-source models. The researchers note that in specific scenarios, SpikingBrain1.0 can achieve over 100 times the speed of traditional AI models.
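The announcement does not detail how the model “selectively responds to input,” but spiking networks in general save power through event-driven sparsity: a unit produces output only when its accumulated input crosses a threshold, and silent units cost nothing downstream. A loose sketch, in which the threshold, weights, and layer shape are all hypothetical:

```python
import numpy as np

def spiking_layer(x, w, threshold=1.0):
    """Event-driven sketch: a unit 'spikes' only when its accumulated input
    crosses the threshold; units that stay silent propagate no activity."""
    potentials = x @ w                             # membrane potentials
    spikes = (potentials >= threshold).astype(float)
    return spikes, potentials * spikes             # activity only where units fired
```

In hardware built for this style of computation, the silent units translate directly into energy not spent, which is the intuition behind the reduced power consumption.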

The development of SpikingBrain1.0 comes amid U.S. technology export restrictions that limit China’s access to advanced chips required for AI and server applications. These restrictions have accelerated domestic AI innovation, with SpikingBrain1.0 representing a step toward a more self-sufficient AI ecosystem.



Tags: Chinese Academy of Sciences, SpikingBrain1.0



COPYRIGHT © DATACONOMY MEDIA GMBH, ALL RIGHTS RESERVED.
