Supercomputers Can Save U.S. Manufacturing
The key to reviving manufacturing in the U.S. may lie in the nation’s supercomputers
By Donald Q. Lamb
March 9, 2012
The U.S. used to be a powerhouse in manufacturing. Over the past quarter-century we have relinquished that leadership position, in large part because we made a decision—consciously or unconsciously—that the service and financial sectors are sufficient to sustain our economy. But they are not. Service jobs pay little. The financial industry makes nothing of value and therefore cannot maintain, let alone raise, the nation’s standard of living.
The fate of manufacturing is in some ways linked to our prowess in the physical sciences. In the 1960s and 1970s high-performance computing (HPC) developed at the national labs made its way to the manufacturing sector, where it now powers much of the innovation behind our most successful commercial firms. Yet we are ceding leadership in the physical sciences, too. Canceling the Superconducting Super Collider in the 1990s ended U.S. dominance in particle physics. NASA’s decision to delay, and possibly eventually abandon, the Wide-Field Infrared Survey Telescope could do the same for cosmology.
Fortunately, the nation’s lead in high-performance computing still stands. HPC is the advanced computing physicists use to model the dynamics of black holes, meteorologists use to model weather and engineers use to simulate combustion. This expertise may also be our best chance to rescue U.S. manufacturing. If we can successfully deliver it to engineers at small firms, it might give the sector enough of a boost to compete with lower labor costs overseas.
We already know how useful HPC is for big firms. When Boeing made the 767 in the 1980s, it tested 77 wing prototypes in the wind tunnel. When it made the 787 in 2005, it tested only 11. In the future, Boeing plans to bring that number down to three. Instead of physical wind tunnels, it uses virtual ones—simulations run on supercomputers—saving time and money and quickening the pace of new product development. HPC modeling and simulation has become an equally powerful tool for designing assembly lines and manufacturing processes in a broad range of fields—big manufacturers such as Caterpillar, General Electric, Goodyear and Procter & Gamble use it routinely. Small manufacturers could reap similar benefits from these tools, if only they had access to them.
I first came to appreciate the potential of HPC to help small manufacturers in 2009 as part of the Obama transition team. Working with the Council on Competitiveness, we identified lack of software, cost of entry and shortages of expertise as the main obstacles to the use of HPC by small manufacturers and proposed a partnership among government, manufacturers and universities to help. The result is the National Digital Engineering and Manufacturing Consortium, or NDEMC, a pilot program created by the council and the federal government.
Recently NDEMC made HPC resources available to a handful of firms, including Jeco Plastic Products. This 25-employee firm in Plainfield, Ind., makes plastic pallets for packaging auto parts. The plastic pallets are a less expensive alternative to steel pallets, which are heavier and prone to rusting. When Jeco makes a new product, its engineers build a prototype, test it in the lab to see how it bears up under the stresses it is likely to encounter in the field and repeat the process until they arrive at the best design. Last December, however, Jeco engineers tapped expertise at Purdue University to develop simulations of a pallet designed for a German automotive company and ran them on hardware at the Ohio Supercomputer Center in Columbus. As a result, Jeco bypassed that trial-and-error process completely, arriving at a design in only a few hours of computer time.
Many other small firms could reap similar benefits. NDEMC’s goal is to find the best business models for getting HPC to these firms and eventually take the effort nationwide. Small manufacturers today are in some ways like farmers at the beginning of the 20th century, most of whom did not know what contour farming, crop rotation and fertilizers could do for productivity. When the U.S. agricultural extension service, in conjunction with land-grant universities, made the requisite expertise available, it triggered a revolution in agricultural productivity. A similar revolution could be in the cards for small manufacturers if we can get supercomputing technology into the hands of their engineers.
ABOUT THE AUTHOR
Donald Q. Lamb is Robert A. Millikan Distinguished Service Professor in the astrophysics and astronomy department at the University of Chicago and director of the Flash Center for Computational Science there.