Following the launch of its 2020 Call for Code Global Challenge, IBM today announced that it will help coordinate an effort to provide over 330 petaflops of computing power to scientists researching COVID-19, the disease caused by the novel coronavirus, which has sickened over 300,000 people. The company anticipates that the capacity will be used to develop algorithms that assess how COVID-19 is progressing, and to model potential therapies in pursuit of a possible vaccine.
As part of a newly launched consortium — the COVID-19 High Performance Computing (HPC) Consortium — IBM says it will assist in evaluating proposals from institutions and provide access to compute for projects that can “make the most immediate impact.” The consortium includes the White House Office of Science and Technology Policy, the U.S. Department of Energy, MIT, Rensselaer Polytechnic Institute, Lawrence Livermore National Laboratory, Argonne National Laboratory, Oak Ridge National Laboratory, Sandia National Laboratories, Los Alamos National Laboratory, NASA, the National Science Foundation, Microsoft, Google, and Amazon. Teams will have at their disposal 16 systems with a combined 775,000 processor cores and 34,000 GPUs, which together can perform around 330 quadrillion floating-point operations per second (330 petaflops).
Researchers will be able to apply through a website beginning later today. They must describe whether support from staff at the national labs or other facilities will be essential, helpful, or unnecessary for their project, and whether any restrictions apply, such as proprietary data sets or HIPAA requirements.
“These high-performance computing systems allow researchers to run very large numbers of calculations in epidemiology, bioinformatics, and molecular modeling. These experiments would take years to complete if worked by hand, or months if handled on slower, traditional computing platforms,” wrote IBM Research director Dario Gil in a blog post. “Since the start of the COVID-19 pandemic we have been working closely with governments in the U.S. and worldwide to find all available options to put our technology and expertise to work to help organizations be resilient and adapt to the consequences of the pandemic.”
Amazon said it’s offering research institutions and companies technical support and promotional credits for the use of Amazon Web Services (AWS) programs to advance COVID-19 research on diagnosis, treatment, and vaccine studies. “[We hope to] accelerate our collective understanding of the novel coronavirus,” said Amazon in a statement. “Researchers and scientists working on time-critical projects can use AWS to instantly access virtually unlimited infrastructure capacity, and the latest technologies in compute, storage and networking to accelerate time to results.”
Rensselaer Polytechnic Institute, for its part, will enlist its Artificial Intelligence Multiprocessing Optimized System (AiMOS) at the Rensselaer Center for Computational Innovations — the 24th-most powerful supercomputer in the world — in the battle against the pandemic. Rensselaer says it is reaching out to the research community, including government entities, universities, and industry, to offer access to AiMOS in support of research related to COVID-19.
“In order to combat the devastating effects of this pandemic, we must be able to fully grasp the complexities and interconnectedness of biological systems and epidemiological data, as researchers work to develop therapeutic interventions and address gaps in our knowledge,” said Rensselaer president Shirley Ann Jackson. “This effort requires expertise, collaboration, and the ability to process incredible amounts of data, and Rensselaer is offering all three at this critical time. In particular, the ability to model at very large scales requires the unique capabilities of AiMOS.”
The announcement follows news that scientists tapped IBM’s Summit at Oak Ridge National Laboratory, the world’s fastest supercomputer, to simulate how 8,000 different molecules would interact with the coronavirus, resulting in the isolation of 77 compounds likely to render the virus unable to infect host cells. Elsewhere, the Tianhe-1 supercomputer at the National Supercomputer Center in Tianjin was recently used to process hundreds of images generated by computed tomography and deliver diagnoses in seconds. And the Gauss Centre for Supercomputing, an alliance of Germany’s three national supercomputing centers, said it would help those working on COVID-19 gain expedited access to computing resources.
More recently, Folding@home, one of the largest crowdsourced supercomputing programs in the world, kickstarted an initiative to uncover the mysteries behind COVID-19’s spike protein, which the virus uses to infect cells. Since the project announced its new focus on the coronavirus in late February, some 400,000 new volunteers have joined the effort, according to project organizer Greg Bowman, an associate professor of biochemistry and molecular biophysics at the Washington University School of Medicine.
Supercomputers have long been used to identify and test potential treatments for complex and chronic diseases. Researchers tapped the Texas Advanced Computing Center’s Lonestar5 cluster to simulate over 1,400 FDA-approved drugs to see if they could be used to treat cancer. Last June, eight supercomputing centers were selected across the E.U. to host applications in personalized medicine and drug design. And pharmaceutical company TwoXAR recently teamed up with the Asian Liver Center at Stanford to screen 25,000 drug candidates for adult liver cancer.
The hope is that supercomputers can reduce the amount of time it takes to bring novel drugs to market. Fewer than 12% of all drugs entering clinical trials end up in pharmacies, and it takes at least 10 years for medicines to complete the journey from discovery to the marketplace. Clinical trials alone take six to seven years on average, inflating the cost of R&D to roughly $2.6 billion, according to the Pharmaceutical Research and Manufacturers of America.
The White House previously partnered with Verily, a sister company of Google under parent Alphabet, to build a triage tool that helps people find COVID-19 testing sites in the U.S.; it is currently live for select locations in the San Francisco Bay Area. (Google is also working with the U.S. government to create self-screening tools for people wondering whether they should seek medical attention.) And last week, at the request of the White House Office of Science and Technology Policy, researchers and leaders from the Allen Institute for AI, the Chan Zuckerberg Initiative, Microsoft, the National Library of Medicine at the National Institutes of Health, and others released a data set of over 29,000 articles about COVID-19, SARS-CoV-2, and the coronavirus family.
This afternoon, U.S. President Trump gave Ford, GM, and Tesla the “go ahead” to make ventilators to help alleviate a shortage amid the pandemic, only days after issuing an executive order invoking the Defense Production Act. COVID-19 is a respiratory disease, and ventilators are a critical piece of medical equipment used to treat hospitalized patients. The Society of Critical Care Medicine projects that 960,000 coronavirus patients in the U.S. may need to be put on ventilators, but the nation has only about 200,000 of the machines, and around half are older models that might not be ideal for critically ill patients.