Simulating turbulent combustion speeds design

This story is from the category Pure Research

Date posted: 22/09/2011

Air and fuel mix violently during turbulent combustion. The ferocious mixing needed to ignite fuel and sustain its burning is governed by the same fluid dynamics equations that describe smoke swirling lazily from a chimney. Large swirls spin off smaller swirls, and so on down to ever finer scales. This cascade of scales poses a challenge to the supercomputers that solve those equations to simulate turbulent combustion. Researchers rely on these simulations to develop clean-energy technologies for power and propulsion.
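The fluid dynamics equations in question are the Navier–Stokes equations. Combustion codes solve them in a compressible, chemically reacting form coupled to species transport, but the incompressible form below, the one that also governs chimney smoke, shows the core dynamics (a minimal statement for reference, not the full reacting-flow system):

```latex
% Incompressible Navier-Stokes equations: momentum balance and continuity.
% u: velocity field, p: pressure, rho: density, nu: kinematic viscosity, f: body force.
\frac{\partial \mathbf{u}}{\partial t} + (\mathbf{u}\cdot\nabla)\mathbf{u}
    = -\frac{1}{\rho}\,\nabla p + \nu\,\nabla^{2}\mathbf{u} + \mathbf{f},
\qquad
\nabla\cdot\mathbf{u} = 0 .
```

The nonlinear advection term (u · ∇)u is what transfers energy from large swirls to smaller ones, producing the cascade of scales described above.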

A team led by mechanical engineers Joseph Oefelein and Jacqueline Chen of Sandia National Laboratories (Sandia) simulates turbulent combustion at different scales. A burning flame exhibits chemical behavior at small scales, from billionths of a meter up to thousandths of a meter, whereas the motion of an engine valve exerts effects at large scales, from hundredths of a meter down to millionths of a meter. This multiscale complexity is common to all combustion applications: internal combustion engines, rockets, turbines for airplanes and power plants, and industrial boilers and furnaces.
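A back-of-the-envelope estimate (ours, not from the article) shows why no single simulation can resolve that whole range at once. Using the extreme scales quoted above, a uniform 3-D grid would need on the order of 10^21 cells:

```python
# Rough scale-separation estimate based on the figures quoted in the article.
smallest_scale = 1e-9   # m: finest chemical scale mentioned (billionths of a meter)
largest_scale = 1e-2    # m: coarsest device scale mentioned (hundredths of a meter)

cells_per_dim = largest_scale / smallest_scale  # ~1e7 grid cells per dimension
cells_3d = cells_per_dim ** 3                   # ~1e21 cells for a uniform 3-D grid

print(f"~{cells_per_dim:.0e} cells per dimension, ~{cells_3d:.0e} cells in 3-D")
```

That is far beyond the memory of any supercomputer, Jaguar included, which is why the team simulates different scale ranges separately and bridges them with models.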

Chen and Oefelein were allocated 113 million processor hours on the Oak Ridge Leadership Computing Facility's Jaguar supercomputer in 2008, 2009, and 2010 to simulate autoignition and injection processes with alternative fuels. For 2011 they received 60 million processor hours for high-fidelity simulations of combustion in advanced engines. Their team uses these simulations to develop predictive models that are validated against benchmark experiments. The models then feed engineering-grade simulations, which run on desktops and clusters to optimize designs of combustion devices burning diverse fuels. Because industrial researchers must run thousands of calculations around a single parameter to optimize a part design, each calculation needs to be inexpensive.
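A hypothetical sketch of the workflow this paragraph describes: expensive benchmark runs calibrate a cheap engineering-grade model, which is then evaluated thousands of times in a desktop design sweep. Every name and number below is illustrative; none of it comes from the Sandia codes:

```python
import numpy as np

def engineering_model(injection_angle_deg: float) -> float:
    """Hypothetical stand-in for an engineering-grade combustion model.
    In practice such a model would be calibrated against benchmark
    supercomputer simulations; here a toy quadratic plays that role."""
    return -(injection_angle_deg - 22.5) ** 2  # toy objective, peaks at 22.5 deg

# Thousands of cheap evaluations around a single design parameter are
# feasible on a desktop only because each call costs microseconds rather
# than millions of processor hours.
angles = np.linspace(10.0, 40.0, 3000)
scores = np.array([engineering_model(a) for a in angles])
best_angle = angles[scores.argmax()]
print(f"Best injection angle (toy model): {best_angle:.2f} degrees")
```

The division of labor is the point: the supercomputer buys trust in the cheap model once, and the cheap model then makes the thousand-run sweep affordable.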

"Supercomputers are used for expensive benchmark calculations that are important to the research community," Oefelein said. "We [researchers at national labs] use the Oak Ridge Leadership Computing Facility to do calculations that industry and academia don't have the time or resources to do."

The goal is a shorter, cheaper design cycle for U.S. industry. The work addresses Department of Energy (DOE) mission objectives: to maintain a vibrant science and engineering effort as a cornerstone of American economic prosperity, and to lead the research, development, demonstration, and deployment of technologies that improve energy security and efficiency. The research was funded by DOE through the Office of Science's Advanced Scientific Computing Research and Basic Energy Sciences programs, the Office of Energy Efficiency and Renewable Energy's Vehicle Technologies program, and the American Recovery and Reinvestment Act-funded Combustion Energy Frontier Research Center.

See the full story via external site: www.physorg.com


