Dataconomy
Data-Centric Computing to Expedite US Department of Energy Operations

by Eileen McNulty
November 17, 2014
in News

The U.S. Department of Energy has awarded $425 million to IBM and NVIDIA to develop supercomputers with a data-centric computing architecture.

“Data-centric computing, or the idea of locating computing resources in all places where data exists to minimize data movement, is a necessary and critically important architectural transition that IBM is announcing with the CORAL project,” explained IBM’s David Turek in a recent video, “because not only are the government labs in the US experiencing the dramatic impact of huge amounts of data, but so are industries around the world.”

U.S. Secretary of Energy Ernest Moniz stated, “High-performance computing is an essential component of the science and technology portfolio required to maintain U.S. competitiveness and ensure our economic and national security.” The goal, in short, is to keep the United States ahead of every other country in data processing and computing technology.

About $325 million has been allocated for the two high-performance supercomputers. The remaining $100 million will go toward FastForward2, a program to develop next-generation exascale machines, capable of 10^18 floating-point operations per second (one exaflop), that are 20 to 40 times faster than current supercomputers.

The two computers — Summit, to be built at Oak Ridge National Laboratory, and Sierra, at Lawrence Livermore — will each have peak performance of around 150 petaflops when completed in 2017-2018. Summit is expected to be five times more powerful than Oak Ridge’s current system, Titan, and Sierra at least seven times more powerful than Lawrence Livermore’s current machine, Sequoia. The Collaboration of Oak Ridge, Argonne, and Lawrence Livermore (CORAL) was established in early 2014 to leverage supercomputing investments, streamline procurement processes, and reduce the cost of developing such supercomputers.
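As a quick sanity check, the reported multiples follow directly from the peak figures. Note that Titan’s roughly 27-petaflop and Sequoia’s roughly 20-petaflop peaks are assumptions based on publicly reported Top500 numbers, not figures stated in this article:

```python
# Rough sanity check of the reported speedups. The ~27 PF (Titan) and
# ~20 PF (Sequoia) peaks are assumed from public Top500 figures.
PFLOP = 1e15   # 1 petaflop = 10^15 FLOP/s
EFLOP = 1e18   # 1 exaflop  = 10^18 FLOP/s = 1000 petaflops

coral_peak   = 150 * PFLOP  # target peak for each CORAL machine
titan_peak   = 27 * PFLOP   # assumed: Oak Ridge's Titan
sequoia_peak = 20 * PFLOP   # assumed: Lawrence Livermore's Sequoia

print(f"Summit vs Titan:   ~{coral_peak / titan_peak:.1f}x")    # ~5.6x
print(f"Sierra vs Sequoia: ~{coral_peak / sequoia_peak:.1f}x")  # ~7.5x
print(f"150 PF as a fraction of an exaflop: {coral_peak / EFLOP:.2f}")
```

Under these assumed baselines, the “five times” and “at least seven times” claims line up with the 150-petaflop target, which is still only about 15 percent of the exascale goal FastForward2 is chasing.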




“DOE and its National Labs have always been at the forefront of HPC, and we expect that critical supercomputing investments like CORAL and FastForward2 will again lead to transformational advancements in basic science, national defense, and environmental and energy research that rely on simulations of complex physical systems and analysis of massive amounts of data,” Moniz stated.

The systems will combine the IBM Power Architecture, NVIDIA’s Volta GPUs, and Mellanox’s interconnect technologies to advance key research initiatives in national nuclear deterrence, technology advancement, and scientific discovery.

Jen-Hsun Huang, CEO of NVIDIA, said, “Scientists are tackling massive challenges from quantum to global to galactic scales. Their work relies on increasingly more powerful supercomputers. Through the invention of GPU acceleration, we have paved the path to exascale supercomputing, giving scientists the tool for unimaginable discoveries.”

IBM’s Turek added, “Data-centric computing has been set up as a new architectural paradigm meant to deal with the problem of big data, and no one is immune to it. The issue of big data cuts across all market segments and technologies and will eventually affect all consumers using smart devices.”



(Image credit: Thomas Hawk)


Tags: Department of Energy, IBM, NVIDIA, Supercomputer, Supercomputing


COPYRIGHT © DATACONOMY MEDIA GMBH, ALL RIGHTS RESERVED.