Meet an RC Researcher: Benjamin Brown
Benjamin Brown attended Harvey Mudd College as an undergraduate student before coming to the University of Colorado, Boulder for graduate school. He says, “I knew from my undergrad that I was interested in astrophysics and in fluid dynamics, and I didn’t know yet that I was interested in computation.” Brown began working with Juri Toomre, who introduced him to supercomputing, fluid dynamics and stellar interiors. He accepted postdoc positions in Wisconsin and then Santa Barbara before returning to CU Boulder as a faculty member in 2014. Explaining his choice of university, Brown says, “Boulder’s a really neat center for computation and for fundamental work on fluid dynamics problems. It’s one of the strongest spots for it in the United States.”
Brown researches fluid dynamics inside stars, including the sun. The interiors of these stars are composed of a plasma of hydrogen so hot that the electrons have separated from their nuclei. On very small scales, this plasma behaves as a collection of discrete particles; on the macroscopic scale, which Brown studies, it behaves as a continuous fluid. “We want to understand what happens below stars’ surfaces and how their interiors move,” Brown explains. “We can then understand the way those motions create large-scale structures on the sun, including magnetic activity and sunspots.” He and his team use their knowledge of stellar structure to construct the supercomputer simulations with which they study stars’ interiors. After determining the characteristics of the simulation, including its resolution, the number of grid points, and the size of the region to be simulated, they run the simulation and analyze the data it produces. They compare this data to observations of stars’ large-scale properties, including brightness and rotation speed. Although observations of stars are somewhat “crude” due to their distance from Earth, comparing them to averaged measurements from the simulation gives an idea of the simulation’s accuracy.
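The comparison step described above — averaging a simulated quantity over the grid and checking it against a crude observation with wide error bars — can be sketched roughly as follows. All names, values, and error bars here are hypothetical illustrations, not Brown's actual code or data:

```python
from statistics import mean

# Hypothetical simulation output: a handful of grid samples stands in for
# the full simulated brightness field (values are made up for illustration).
sim_brightness = [0.98, 1.01, 1.03, 0.99, 1.02]

# A "crude" observed large-scale property, with a wide observational error bar.
observed_brightness, obs_error = 1.02, 0.10

def consistent(sim_samples, obs_value, obs_error):
    """Count the simulation as consistent with the observation if its
    global average lies within the observational error bar."""
    return abs(mean(sim_samples) - obs_value) <= obs_error

print(consistent(sim_brightness, observed_brightness, obs_error))  # True
```

A point-by-point comparison, as described below for the sun, would compare the fields location by location rather than collapsing them to a single average first.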
Measurements of the sun are more detailed than those of any other star because of its proximity to Earth. A point-by-point comparison of properties between the sun and the simulation enables Brown to analyze differences between the two. In the numerical simulations Brown uses, the numerics themselves can influence the results at smaller scales, giving researchers “the right answer for the wrong reasons.” Solar simulations have recently reached scales at which the numerics no longer constrain the results. “We now understand why those answers were coming out for the wrong reasons and how to fix that,” he says. “That’s very exciting.” When he branched out into modeling other types of stars, Brown’s results indicated that all types of stars share the same internal characteristics. He says, “It’s not clear to us yet whether all stars behave fundamentally very similarly in their insides or if our simulations of these stars all behave similarly while the stars themselves are very different. That’s a great puzzle we’re trying to crack right now.”
Brown’s simpler simulations require a few million CPU hours on the Janus supercomputer. He tries to limit the length of each simulation to about a week so that simulations can be re-run if code-level problems or incorrect assumptions are identified. Although he would like to answer questions that would require hundreds of millions of CPU hours, he is limited by current computational power. For example, an exact simulation of the interior of the sun will not be possible for 60-80 years, assuming that computational speed continues to increase in accordance with Moore’s Law. Obtaining the allocations he needs on Janus has been fairly easy; applications are simpler and less competitive than those required by national supercomputing facilities, and the probability of receiving an allocation is high. The lower level of competition enables Brown to be more innovative in his research. He explains, “Having our own resources here lowers the pressure, which lets us do more interesting, more speculative research. It also means I can afford to turn students loose on them, which allows them to make mistakes and to discover really exciting things that we hadn’t thought to look for.” At more risk-averse national computing centers, researchers have to use an extremely well-calibrated code that is highly accepted in the community; on Janus, Brown has been able to use a new code that he developed as part of a small team. “It would be much more challenging to get our foot in the door with this new tool on some of the national machines,” he says. “With continuing local resources, we’ll be able to continue asking questions that no one has thought of and keep pushing out the frontier, rather than justifying ourselves to a large committee sorting through a large number of applications.”
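The 60-80 year horizon follows from simple doubling arithmetic. Assuming, purely for illustration, that Moore's Law means capability doubles roughly every two years, a 60-80 year wait corresponds to needing on the order of a billion to a trillion times (2^30 to 2^40) today's compute; the factors below are back-of-envelope assumptions, not figures from Brown:

```python
import math

def years_until_feasible(speedup_needed, doubling_period_years=2.0):
    """Years until compute grows by `speedup_needed`, assuming capability
    doubles every `doubling_period_years` (a Moore's-Law-style projection)."""
    return doubling_period_years * math.log2(speedup_needed)

# A required speedup of 2**30 to 2**40 maps onto the article's 60-80 year range.
print(years_until_feasible(2**30))  # 60.0
print(years_until_feasible(2**40))  # 80.0
```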
Having access to the local Research Computing team has been highly advantageous for Brown in his work on Janus. “You know the staff, and you can get some things answered pretty quickly. You can work with them to try to figure out new ways of solving problems, and you get a faster interaction,” he says. “They have really good attitudes and they’re fun to work with.” He plans to take advantage of the new Summit supercomputer that Research Computing will deploy in the summer, which will immensely speed up the communication between the processors on which his simulations run. “It’s possible that this will cause a huge breakthrough for us,” Brown says. “We can ask the same question ten times, which gives us a bigger parameter space, or we can start asking different questions.” He will rely on Research Computing’s support as he explores these possibilities. “I’m very excited to see what Summit is like when we actually have it here,” he continues. “The real strength is going to be being able to visit with the Research Computing staff and work with them to figure out how to use this new resource.”