The collaboration, known as Deep Computing Solutions, will bring a new dimension to Lawrence Livermore National Laboratory's High Performance Computing Innovation Center, combining both entities' computational science expertise to benefit clients, according to a press release. Building on IBM and Lawrence Livermore's 20-year relationship, the new agreement between the two is planned to last five years, with the intention of making it permanent, according to IBM.
Sequoia, an IBM supercomputer at Lawrence Livermore National Laboratory in California, was named No. 1 in June on a list of the world's fastest supercomputers. The company says it runs at more than 16 petaflops, or more than 16 quadrillion calculations a second. As part of the new collaboration, Lawrence Livermore has purchased a four-petaflop version of the same computer to support unclassified, collaborative work with private industry. The smaller supercomputer, nicknamed Vulcan, is ranked No. 48 on the list of the world's 500 fastest supercomputers. The hope is that participating companies and organizations will help underwrite the cost of the new project, Don Johnston, a Lawrence Livermore spokesman, told Sunlight. He said the partners are currently in talks with potential collaborators.
The announcement came at a Tuesday briefing on Capitol Hill, at which IBM also hosted a panel titled "Big Data – The New Natural Resource."
Sen. Dianne Feinstein, D-Calif., who made the opening remarks at the briefing, spoke at length about the advantages supercomputing brings to the United States in medicine, energy applications, national security and economic competitiveness. The senator proclaimed herself an advocate of using the capabilities and expertise of the national laboratories to increase U.S. competitiveness.
David McQueeney, IBM Research's vice president of software, also addressed the audience on big data as a natural resource and the potential benefits of capturing and organizing that data. The benefits he listed included:
- Better and quicker decision making by combining instinct and intuition with analytics
- Predictive analytics and continuous optimization of systems
- Better public safety, since data makes it possible to analyze crime and pinpoint hotspots so officers know where and when to place patrols
- Disaster preparedness
- Improved health care and health care systems
These points and more were expanded upon in the panel discussion, which featured Steven Ashby, the deputy director for science and technology at the Pacific Northwest National Laboratory; Shantenu Jha, the associate director of research cyberinfrastructure at the Rutgers Discovery Informatics Institute; and Tony Elder, the deputy chief of police in Charleston, South Carolina. Moderating the panel was Steven Koonin, former under secretary of energy for science.
All the panelists discussed at length, and agreed on, the benefits the government could provide to society if it had the ability to make sense of big data.
But what about transparency? Could this big data be harnessed in order to form a more open government?
One of the last questions Koonin asked the panel was how this essentially open data will change "the game for government." The panelists agreed that big data could be used to bring about more government transparency, but the topic came up only at the end of the conversation, and participants said it fell outside their areas of expertise.
Graphics credit: IBM Corp.