Dataconomy

Baidu is Readying the World’s Largest Deep Learning System

by Eileen McNulty
September 8, 2014
in Artificial Intelligence, News

Baidu is nearing completion of the world’s largest computer cluster for deep learning.
The system is reported to have 100 billion digitally simulated neural connections, powered by large numbers of graphics processing units (GPUs). To put that in perspective, the human brain has around a hundred trillion neural connections.
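The scale gap can be sketched as a quick back-of-the-envelope calculation (using the rounded figures reported above; both are approximations, not exact counts):

```python
baidu_connections = 100e9    # ~100 billion simulated connections (reported)
brain_connections = 100e12   # ~100 trillion connections in the human brain (rough estimate)

# How many times more connections the brain has than the simulated system
ratio = brain_connections / baidu_connections
print(f"The human brain has roughly {ratio:,.0f}x more neural connections")
```

Even at record-setting scale, the simulated network remains about three orders of magnitude smaller than its biological inspiration.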

Within five years, voice- and image-based online searches will overtake text, accounting for more than half of all queries, driven by the need for easier ways to find information on mobile devices, explained Baidu Chief Executive Officer Robin Li.

Andrew Ng, chief scientist at Baidu, China’s biggest search engine company, previously worked on the “Google Brain” project and at the Stanford Artificial Intelligence Laboratory before joining Baidu in May. He believes that the ‘Baidu system will be about 10 times more powerful than a 2013 computer cluster’ at the Stanford lab, providing more computing power and better image recognition. “The bigger you build these things, the better they perform,” said Ng. “Our initial task is to recognize images better, to create computer vision.”

To simulate brain function, these computers are structured with deep learning capabilities; Ng points out that such developments have recently reduced speech recognition errors by 25 percent. The advent of smart mobile devices is ‘shifting demand for how users query for information,’ and where that shift leads the industry may not remain a matter of conjecture for long.
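A “25 percent” improvement in speech recognition is typically a relative reduction in word error rate rather than an absolute one. As an illustration with a hypothetical baseline (the 20 percent starting figure below is an assumption for the example, not a number from the article):

```python
baseline_wer = 0.20        # hypothetical baseline word error rate (20%)
relative_reduction = 0.25  # the 25% relative improvement Ng cites

# A relative reduction scales the existing error rate down,
# rather than subtracting 25 percentage points outright.
improved_wer = baseline_wer * (1 - relative_reduction)
print(f"Improved word error rate: {improved_wer:.2%}")
```

So a system that previously got one word in five wrong would now get three words in twenty wrong, a meaningful gain for voice-driven search.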

Read more here.


(Image credit: Flickr)


Tags: Baidu, Deep Learning, Machine Learning Newsletter


COPYRIGHT © DATACONOMY MEDIA GMBH, ALL RIGHTS RESERVED.
