About

A facility for online data analysis to support ongoing experiments and other time-critical activities has long been on the wishlist of many sciences: large experimental instruments, equipped with millions of sensors and producing hundreds of terabytes of data per experiment, will be used more efficiently if extended with a computational facility that gives scientists ongoing insight into their data. This need has grown stronger as sensors have recently left the lab and begun multiplying at large: inexpensive and increasingly sophisticated sensor devices now allow scientists to instrument forests, oceans, or cities, turning our planet into an “instrument at large” and providing unprecedented insight into geophysical, environmental, and social phenomena. Finally, many scientific activities, such as “thought experiments”, brainstorming sessions, and critical thinking, have always required online data analysis support.
Recent technology trends, such as the increasing focus on data management technologies and the emergence of sustainable on-demand computing and commercial cloud facilities, provide initial steps and potential building blocks for creating such a facility. How do we fill the gap between what we have now and the capabilities we need to make the vision of an online data facility a reality? What research challenges need to be addressed – and can be addressed – in the coming 5–10 years? How should we adjust our vision if these challenges remain unsolved? Is such a facility compatible with existing and evolving exascale resources, and how should the two evolve to become compatible? How do “beyond Moore’s law” technologies enable advanced data analysis? This workshop is organized to address these and related questions.