Statistical Methods for Exascale Performance Modeling

From Modelado Foundation

Principal Investigator

* Todd Gamblin (http://people.llnl.gov/gamblin2), Lawrence Livermore National Laboratory

Summary

Large computer simulations are critical to a broad range of scientific disciplines. Despite this need, adapting a scientific simulation code to run efficiently on a new supercomputer is tedious and time-consuming; for the most complex applications, the process can take six months or more. Predictive mathematical models of performance and power consumption could accelerate this process, but the behavior of modern adaptive codes changes depending on the input data, which makes existing modeling techniques difficult to apply. This project will develop statistical models of applications that represent adaptive, data-dependent behavior in a scalable manner. The project will also develop techniques to reduce the complexity of application models so that application developers can easily understand them. These models will give simulation developers insights that allow them to quickly optimize their code, ensuring that applications take full advantage of the performance of future exascale machines.
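To make the idea of a predictive performance model concrete, the sketch below fits a simple empirical power-law model T = c * n^b to hypothetical timing measurements via least squares on log-transformed data. This is only an illustrative example of statistical performance modeling in general, not the project's actual method; the sample data, problem sizes, and the power-law form are all assumptions made for the sketch.

```python
import math

# Hypothetical timing measurements: (problem size n, runtime in seconds).
# Synthetic data that roughly follows T = c * n^1.5 (an assumed scaling law).
samples = [(1000, 0.32), (2000, 0.90), (4000, 2.55), (8000, 7.20)]

# Fit log T = log c + b * log n with closed-form simple linear regression,
# which recovers the scaling exponent b and the constant factor c.
xs = [math.log(n) for n, _ in samples]
ys = [math.log(t) for _, t in samples]
k = len(samples)
xbar = sum(xs) / k
ybar = sum(ys) / k
b = (sum((x - xbar) * (y - ybar) for x, y in zip(xs, ys))
     / sum((x - xbar) ** 2 for x in xs))
log_c = ybar - b * xbar

def predict(n):
    """Predicted runtime for problem size n under the fitted power law."""
    return math.exp(log_c) * n ** b

print(f"fitted exponent b = {b:.2f}")
print(f"predicted runtime at n=16000: {predict(16000):.1f} s")
```

A developer could use such a fit to extrapolate runtime to problem sizes not yet measured; the project's goal of handling adaptive, data-dependent behavior would require far richer models than this single-variable power law.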