Categories
Research Uncategorized

Enter the parallel workflow

Experimentalists from all over the world visit Argonne each year to use the ultra-bright X-ray photon beams produced here to peer inside materials. Research teams with beamline reservations, or ‘beamtime,’ are expected to set up, calibrate the detector, and staff their experiments around the clock for days at a stretch. Any problem with the experimental setup is revealed only during the analysis phase, often rendering the entire dataset useless. Even something as seemingly minor as a bad cable can spoil terabytes of data. That’s a problem.
A recent collaborative effort involving high-performance computing is giving beamline users the ability to conduct fast ‘in-beam’ analysis of their initial data so that they can find and correct problems early on. Here’s a recent story about how computational methods and infrastructure at Argonne are boosting beamline performance and accelerating discoveries in materials science.
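As a rough illustration of what such an in-beam sanity check might look like, here is a minimal sketch that fans detector frames out to worker processes and flags suspicious ones while the experiment is still running. The file layout, quality check, and threshold are hypothetical assumptions for this sketch, not the actual Argonne pipeline.

```python
# Hypothetical sketch: check detector frames in parallel as they arrive and warn
# about anomalies early. Frame format, check, and threshold are illustrative only.
import glob
from multiprocessing import Pool

import numpy as np


def check_frame(path):
    """Load one detector frame and apply a crude mean-intensity test."""
    frame = np.load(path)              # assumes frames are saved as .npy arrays
    mean_counts = frame.mean()
    ok = mean_counts > 10.0            # hypothetical "dead channel / bad cable" threshold
    return path, ok, mean_counts


if __name__ == "__main__":
    frames = sorted(glob.glob("scan_0001/frame_*.npy"))   # hypothetical layout
    with Pool(processes=8) as pool:
        for path, ok, mean_counts in pool.imap_unordered(check_frame, frames):
            if not ok:
                print(f"WARNING: {path} looks bad (mean counts = {mean_counts:.1f})")
```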

Categories
Argonne Leadership Computing Facility Publications Research

CiSE publishes first issue dedicated to Leadership Computing

Advances in Leadership Computing, the first of a two-part CiSE Special Issue on Leadership Computing, is now available online. In two consecutive publications, this special issue will explore nine projects that are using leadership systems to expand the frontiers of their fields.
The September/October issue features five articles on topics that include simulating the Universe, enhancing the understanding of wall-bounded turbulence, devising an approach that computes the energy dispatch of electrical power grid systems under uncertainty, gleaning new insights into fusion plasma turbulence, and a recent advance in quantum-mechanical computational methods that can be used to search for optimal materials such as batteries and photo-electrochemical cells.

Categories
Research

Bug data is big data

Digitized museum collections are the next ‘big data’ dataset
The Field Museum of Natural History in Chicago holds a massive pinned insect collection of roughly 4.5 million specimens, dating back at least a century and contributed by entomologists and private collectors from all over the world, including scientists at the Field. The collection currently occupies over ten thousand drawers of new high-density storage that can be opened as needed for study.
The collection, which continues to grow, represents a wealth of data for studies of taxonomy, biodiversity, invasive species, and more, as well as a significant public investment in research and applied environmental science. Digitizing the collection would not only preserve it, but also make it accessible to researchers without requiring a visit to the Field. However, no systems exist for digitizing large collections of objects, making large-scale analysis of such collections, or subsets of them, extremely difficult.
Argonne scientists Mark Hereld and Nicola Ferrier are collaborating with Field Museum associate curator Petra Sierwald to devise an advanced pipeline for high-throughput digitization of the collection using a software-based approach. The bar is high: to digitize a collection this big in one year of 24/7 operations, the average time available to capture a single specimen is 7 seconds. Another challenge is deciphering the labels beneath each pinned specimen: labels that are closely packed, often partially obscured, and in many cases handwritten. Moreover, the specimens are fragile and can’t be manipulated.
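The 7-second figure follows directly from the size of the collection; a quick back-of-the-envelope check:

```python
# Back-of-the-envelope check of the digitization budget quoted above.
specimens = 4.5e6                    # pinned insects in the Field Museum collection
seconds_per_year = 365 * 24 * 3600   # one year of round-the-clock operation

print(seconds_per_year / specimens)  # ~7.0 seconds available per specimen
```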
This summer, two students working with Hereld and Ferrier experimented with image-capture techniques to accommodate the different angles needed to quickly sample everything on the labels and extract information from the target label. The image pre-processing methods the team eventually develops will enable automatic reconstruction of label data. The team is also working on methods to identify and track the drawers, unit boxes, and individual specimens through the digitization pipeline.
This is a difficult problem. Different approaches are more or less robust at finding correct solutions, depending on lighting, geometry, and details of the object being measured. The team is exploring a range of image analysis and 3D capture techniques, expecting the best solution to be a combination of existing and new algorithms.
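As a rough illustration of one ingredient such a pipeline might combine (not the team’s actual method), a cropped label image could be binarized and handed to an off-the-shelf OCR engine; closely packed, partially obscured, and handwritten labels are exactly where this simple approach breaks down, which is why a combination of algorithms is expected.

```python
# Hypothetical sketch: binarize a cropped label image and run off-the-shelf OCR.
# Real specimen labels are densely packed, partly hidden by the pin and specimen,
# and often handwritten, so a production pipeline would need far more than this.
import cv2              # OpenCV
import pytesseract      # wrapper around the Tesseract OCR engine

image = cv2.imread("label_crop.png")                 # hypothetical cropped label image
gray = cv2.cvtColor(image, cv2.COLOR_BGR2GRAY)
_, binary = cv2.threshold(gray, 0, 255, cv2.THRESH_BINARY + cv2.THRESH_OTSU)

text = pytesseract.image_to_string(binary)
print(text)
```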

Categories
Argonne Leadership Computing Facility Research

Summer start to new simulation science projects

Each year, the DOE’s Advanced Scientific Computing Research program, or ASCR, dedicates roughly 30% of the computing resources at its three supercomputing facilities to projects pursuing DOE mission research. The yearlong ASCR Leadership Computing Challenge (ALCC) awards, which begin July 1, aim to advance clean energy technologies, to better understand climate and environmental systems, and to respond to potential disasters.
ASCR recently awarded 19 new ALCC projects a total of 1.64 billion core-hours at the Argonne Leadership Computing Facility, expanding both the scope of scientific simulation research happening at ALCF and the community of researchers that will be capable of using a leadership-class system. Read more about the individual projects here.
Image: Christopher Knight, Argonne National Laboratory

Categories
Argonne Leadership Computing Facility Research

Mira provides new insight into subatomic particles

A team of scientists has, for the first time, calculated several fundamental properties of the carbon-12 nucleus using one of the world’s fastest supercomputers, setting the stage for more reliable neutrino detector calibrations and better simulations of supernova explosions.
The work, published last summer in Physical Review Letters, involved researchers from Argonne, Los Alamos, and Jefferson national laboratories, Middle Tennessee State University, and Old Dominion University. The team, led by Argonne Senior Physicist Steven Pieper, was one of 16 that were granted early access to Mira last year, and used their core-hour allocation to prepare the Green’s function Monte Carlo (GFMC) code for the new machine’s scale and architecture in order to run the carbon-12 simulations.
Over the past 15 years, researchers have developed the GFMC algorithm into a powerful and accurate method for computing properties of light nuclei. Accurately describing the many-body interactions within the nucleus is critical to understanding the physics of nucleonic matter. Electron scattering experiments in the quasi-elastic regime, where the dominant process knocks a single nucleon out of the nucleus, are underway at Jefferson Lab for a range of nuclei. Using Mira, the team has included new, complex interactions within the nucleus and predicted the results of a Jefferson Lab experiment, bringing theoretical prediction closer to experimental data in the high-momentum-transfer tail. Read more about the work here.
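GFMC itself is far beyond a blog snippet, but a toy variational Monte Carlo calculation for a single particle in a harmonic well gives a feel for the Monte Carlo flavor of these methods: sample configurations from a trial wavefunction with the Metropolis algorithm and average a “local energy.” Everything below (the potential, trial function, and parameters) is illustrative, not the team’s nuclear-physics calculation.

```python
# Toy variational Monte Carlo for a 1D harmonic oscillator (hbar = m = omega = 1).
# Trial wavefunction psi(x) = exp(-alpha * x**2 / 2); at alpha = 1 the exact
# ground-state energy 0.5 is recovered. Purely illustrative -- GFMC for nuclei
# handles many nucleons, spin/isospin degrees of freedom, and realistic forces.
import numpy as np

rng = np.random.default_rng(0)
alpha = 0.8          # variational parameter (deliberately not the optimal value)
x = 0.0
energies = []

for step in range(200_000):
    x_new = x + rng.normal(scale=0.5)                      # Metropolis proposal
    # Acceptance ratio |psi(x_new)|^2 / |psi(x)|^2
    if rng.random() < np.exp(-alpha * (x_new**2 - x**2)):
        x = x_new
    if step > 10_000:                                      # discard burn-in
        # Local energy E_L(x) = alpha/2 + x^2 (1 - alpha^2) / 2
        energies.append(alpha / 2 + 0.5 * x**2 * (1 - alpha**2))

print(f"estimated energy: {np.mean(energies):.4f}  (exact minimum is 0.5 at alpha = 1)")
```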

Categories
Argonne Leadership Computing Facility Research

Beyond the Standard Model

The discovery last year at CERN of the Higgs boson, a particle thought to endow elementary particles with mass, was momentous for physicists everywhere. The discovery is critical to validating a nearly five-decade-old fundamental physics theory, known as the Standard Model, which accounts for all known subatomic particles and their interactions. Scientists, meanwhile, continue their search for answers to weighty unexplained phenomena such as the existence of dark matter and the fate of the antimatter produced in the Big Bang.
Fermilab theoretical physicist Paul Mackenzie is leading a multiyear project at the ALCF to shed light on the mysterious particles and forces associated with “physics beyond the Standard Model.” According to Mackenzie, the Standard Model has many complex and peculiar features that have led to the nearly universal belief that there is new, as yet undiscovered physics which will explain these features.
Mackenzie heads a national effort to leverage HPC resources to advance quantum chromodynamics (QCD), the study of how quarks and gluons interact. Supercomputers like Mira enable scientists to study quarks and gluons in situations that are not possible in accelerator and cosmic-ray experiments, and have the computational power needed to give quark-antiquark pairs their proper, very light masses for the first time, removing one of the largest remaining uncertainties in QCD calculations. Read more about Mackenzie’s research at the ALCF here.

Categories
Argonne Leadership Computing Facility Research

Accelerating the discovery of alternative fuel sources

In many ways, biofuel research is like modern-day alchemy. The transmutation of biomass materials (anything from kitchen and latrine waste to stalky, non-edible plants) into a sustainable and renewable energy source involves catalysts and chemical reactions. The process promises to help meet the world’s critical energy challenges.
Biofuel research can also be thought of as the ultimate multi-scale, multi-physics research problem. It represents several interesting biological supply-chain management problems. Not surprisingly, biofuel research spans several domains here at Argonne, and takes place in wet labs and joint institutes across the lab campus. There is also an exciting INCITE research project underway at the ALCF aimed at finding a more effective way to convert cellulose-containing plant materials, such as wood chips and switchgrass, into sugars that can then be converted into biofuels.
A science team from the National Renewable Energy Laboratory is using Mira to conduct large-scale simulations of the complex cellulose-to-sugar conversion process. The simulations yield quantities, such as an enzyme’s binding free energy, that are difficult to obtain through conventional experimental approaches, helping to accelerate the screening and testing of new enzymes. With such information, researchers will be able to identify potential enzyme modifications and then feed their discoveries into experiments aimed at developing and validating improved catalysts. Read the full research highlight here.
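The article doesn’t spell out how the binding free energies are computed, but one standard ingredient in simulation campaigns of this kind is an exponential-averaging (Zwanzig) estimate of a free-energy difference from sampled energy differences. The sketch below just shows that formula in action on synthetic numbers; it is not the NREL team’s workflow.

```python
# Zwanzig (free-energy perturbation) estimator: dF = -kT * ln< exp(-dU / kT) >.
# The energy differences here are synthetic; in a real study they would come
# from molecular dynamics snapshots of the enzyme-substrate system.
import numpy as np

kT = 0.593                                          # kcal/mol at ~298 K
rng = np.random.default_rng(1)
dU = rng.normal(loc=1.5, scale=0.8, size=50_000)    # hypothetical energy differences

dF = -kT * np.log(np.mean(np.exp(-dU / kT)))
print(f"estimated free-energy difference: {dF:.2f} kcal/mol")
```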

Categories
Research

Parallel GPGPU application takes aim at tumors

In order to precisely reconstruct images of a tumor, a proton CT scan must pinpoint the exact locations where an individual proton enters and exits the body. A new generation of particle detectors uses tiny scintillating fibers (photo) in combination with ultrasensitive silicon photomultipliers to detect a proton's path. Photo: Reidar Hahn

Protons, specifically proton beams, are increasingly being used to treat cancer with more precision. To plan for proton treatment, X-ray computed tomography (X-ray CT) is typically used to produce an image of the tumor site — a process that involves bombarding the target with photon particles, measuring their energy loss and position, and then using projection methods to establish the 3D shape of the target.
A new imaging method, which employs protons instead of photons, promises to deliver more accurate images while subjecting the patient to a lower dose of radiation. Proton computed tomography (pCT) employs billions of protons and multiple computationally intensive processes to reconstruct a 3D image of the tumor site. To achieve the required accuracy would take a long time on a single computer, and it’s not clinically feasible to require a patient to sit still for a long period to be imaged, or to wait a day for the images to be produced.
A group of computer scientists at Argonne and at Northern Illinois University has been working to accelerate pCT imaging using parallel and heterogeneous high performance computing techniques. The team so far has developed the code and tested it on GPU clusters at NIU and at Argonne with astounding results — producing highly accurate 3D reconstructions from proton CT data in less than ten minutes.
Image credit: Fermilab
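A heavily simplified sketch of the parallel flavor of the reconstruction problem follows, assuming straight-line proton paths between measured entry and exit points; real pCT codes estimate a curved most-likely path, use far more sophisticated solvers, and run on GPU clusters. Every array here is synthetic.

```python
# Toy proton-CT-style accumulation: march each proton along a straight line
# between its measured entry and exit points and deposit its energy-loss signal
# into the voxels it crosses. Illustrative only; not the Argonne/NIU code.
import numpy as np

grid = np.zeros((64, 64, 64))            # reconstruction volume (accumulated signal)
hits = np.zeros_like(grid)               # number of protons contributing per voxel

rng = np.random.default_rng(2)
n_protons = 100_000
entry = rng.uniform(0, 63, size=(n_protons, 3))          # synthetic entry points
exit_ = rng.uniform(0, 63, size=(n_protons, 3))          # synthetic exit points
energy_loss = rng.uniform(5.0, 20.0, size=n_protons)     # synthetic MeV values

steps = 128
for t in np.linspace(0.0, 1.0, steps):
    points = np.rint(entry + t * (exit_ - entry)).astype(int)   # voxels along each path
    ix, iy, iz = points[:, 0], points[:, 1], points[:, 2]
    np.add.at(grid, (ix, iy, iz), energy_loss / steps)
    np.add.at(hits, (ix, iy, iz), 1)

mean_loss = np.divide(grid, hits, out=np.zeros_like(grid), where=hits > 0)
print("toy mean energy-loss per voxel:", mean_loss.mean())
```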

Categories
Argonne Leadership Computing Facility Research

Cracking the source of crackle in supersonic jet noise

The ASCR Leadership Computing Challenge projects that make up roughly 30% of the time awarded on ALCF supercomputers each year go to support “high-risk, high-payoff” simulations of interest to the DOE. Stanford’s Parviz Moin used his 60-million-hour award to make a new and potentially industry-changing discovery about the source of supersonic jet noise. No ear protection needed.
Moin and his team ran large eddy simulations on Intrepid to determine the source of crackle in hot supersonic jet engines. Crackle is a major source of engine noise that causes hearing damage and impacts fuel efficiency. This particularly irritating phenomenon is associated with shock-like “N-shaped” acoustic waveforms consisting of sudden strong compressions followed by more gradual expansions. Because crackle occurs in the direction of peak jet noise, its elimination has the potential to help meet the U.S. Navy’s near-term jet noise reduction goal of 3 dB in the peak noise.
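For scale, a 3 dB reduction corresponds to roughly halving the radiated acoustic power, since decibel differences are ten times the base-10 logarithm of a power ratio:

```python
# What a 3 dB reduction in peak noise means in terms of acoustic power.
power_ratio = 10 ** (-3 / 10)   # convert a -3 dB change to a power ratio
print(f"{power_ratio:.2f}")     # ~0.50, i.e. about half the acoustic power
```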
One way to make jets less noisy is to modify the shapes of the engine exhaust nozzles using pointed cutouts, called chevrons. Past ALCF allocations to Moin enabled a comprehensive study of the physical mechanisms by which chevrons affect the jet mixing and shock-associated noise. His current allocation was used to complete the simulations capturing crackle events and to develop new methods to identify and save such events for further study. Furthermore, with the source of the crackle noise now identified, new nozzle designs can be simulated.

Categories
Research

Mira science run probes turbulence physics

Earlier this month, the University of Texas’s Robert Moser initiated the first full-scale production run on Mira, the ALCF’s new 10-petaflops system. Moser is examining the complex physics of a specific region of wall-bounded turbulence, which is central to understanding the energy losses inherent in transportation. His work addresses energy loss at many scales, from vehicles moving through air or water to fluids transported through the pipes and ducts that comprise urban infrastructure. The team developed its code specifically to exploit Mira’s capabilities, and this recently launched investigation aims to develop a nearly complete understanding of the phenomena dominating this type of turbulence. One week and 47 million core-hours into the campaign, the team is closer to reaching that goal. The remaining jobs of Moser’s campaign, to be executed in the coming days and totaling more than 100 million core-hours, should yield even deeper insights.
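To put those core-hour figures in context, here is a rough comparison against a full machine-week on Mira; the 786,432-core count is Mira’s full size, and whether the campaign ran on all of it continuously is an assumption made only for this back-of-the-envelope estimate.

```python
# Rough context for the core-hour figures quoted above (assumptions noted in text).
mira_cores = 786_432
hours_in_week = 7 * 24

machine_week = mira_cores * hours_in_week     # ~132 million core-hours
print(f"full-machine week: {machine_week / 1e6:.0f} M core-hours")
print(f"47 M core-hours is about {47e6 / machine_week:.0%} of that")
```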