Specialty Grand Challenge Article
Grand challenges in computational physics
Computational Biology and Simulation Group, Department of Biology, Department of Computer Science, and Department of Physics, Technische Universität Darmstadt, Germany
1. Introduction: Computational Physics
The results of physics research have had, and continue to have, a tremendous influence on society. Physics is regarded as the most advanced quantitative science due to its rigorous approach to mathematical modeling. The systems investigated were traditionally characterized by well-defined scales, such as energy or time scales. Over the recent decade, however, such conceptual barriers have been challenged more and more.
The investigation of mixed time, length, and energy scales has eventually turned virtually all physics research into research on complex systems. Such complex systems need special treatment (Kwapień and Drożdż, 2012) and new methodologies. Consequently, computational methodology has become the third mode of the natural sciences (Valdés-Pérez, 1993). The demand for and the development of new computational tools have put computational physics at the forefront of their adoption.
The results obtained are too numerous, and the fields in which insight has been gained too broad, to attempt even a brief overview. The physical questions that can be addressed by computational means are likewise too numerous. This short article is therefore not meant to give such an overview; instead, I would like to focus on an orthogonal question: what are the challenges in computational physics method-wise?
2. Computational Physics & Its First-Mover Role
The fact that computational physics is an early adopter of computational infrastructure, and often even takes the first-mover role, cannot be better illustrated than by the “big data” hype of recent years. In fact, “big data” has, unknown to the general public, been in existence for quite some time: in several branches of physics, large databases, servers, and related IT infrastructures have been built, e.g., in high-energy physics (Brumfiel, 2011), in astrophysics, in geophysics, or in structural biophysics (Berman et al., 2000).
This important example alone is evidence of the mind-set and skills of the computational physics community. Individual researchers seem quite able to identify new and helpful computational tools and ideas.
3. Computational Physics & Its Challenges
Despite the above-mentioned early-adopter role of computational physics and the ability of individuals to translate computer science into physics research, the community needs a (new) place to exchange ideas and insight. In the past, some attempts have been successful, e.g., the Program Library of Computer Physics Communications (cpc). But such repositories need to be extended in scope and in the communities involved, and to grow beyond mere code libraries.
In addition, the gap between the available, well-substantiated computational methodologies and their usage in computational physics widens naturally: computer scientists and applied mathematicians have never been more productive than they are now. How can the (computational) physics community at large cope with this accelerated progress in the fields that serve as its methodological basis?
Furthermore, as my co-editor, C. Klingenberg, noted in his “grand challenges in computational physics” article (Klingenberg, 2013), the applicability of computational methods has changed: from mere validation to an all-encompassing role.
And, finally, while (computational) physicists have succeeded in integrating, and sometimes even driving, newly developed computer science questions in their own research, this was almost always restricted to immediately pressing needs. It seems to me, however, that the greatest challenge is to establish a platform for interdisciplinary exchange, to increase the flow of information on new demands as well as on new ideas and approaches. Therefore, I would like to see Frontiers in Computational Physics not merely as another physics journal, but – despite the different publication culture – also as a place for computer scientists to publish their research.
To exemplify this, I would like to mention some particularly interesting developments – already widely accepted tools (e.g., parallelization), but also new ones:
Parallelization Beyond MPI Parallelization is a well-established method and idea in computational science. Indeed, the development of large computers is driven directly by the needs of parallelized codes. This was largely facilitated by the introduction of programming frameworks such as the Message Passing Interface (MPI) (Snir et al., 1998). At present, Graphics Processing Units (GPUs) are comparatively inexpensive hardware components that can be used in almost every project. Therefore, CUDA, OpenMP, and OpenCL need to find their way not only into large-scale simulation packages such as the ones in molecular dynamics (Berendsen et al., 1995; Kale et al., 1999; Lindahl et al., 2001) or quantum chemistry (Ufimtsev and Martinez, 2008; Yasuda, 2008), but into almost all simulation and computation projects. At times, this can imply interesting computer science questions of their own (Waechter et al., 2012).
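To make the underlying pattern concrete, here is a minimal sketch of a data-parallel work distribution. It uses Python's standard multiprocessing module purely as a stand-in for MPI- or GPU-style parallelism; the function and variable names are invented for this illustration, and a production code would of course use MPI, CUDA, or OpenCL.

```python
# Data-parallel sketch: independent work packages are farmed out to a
# pool of worker processes, then the partial results are combined.
from multiprocessing import Pool
import random

def estimate_pi(seed: int, n: int = 100_000) -> float:
    """Monte Carlo estimate of pi from n random points in the unit square."""
    rng = random.Random(seed)  # per-task seed keeps the runs independent
    hits = sum(1 for _ in range(n)
               if rng.random() ** 2 + rng.random() ** 2 <= 1.0)
    return 4.0 * hits / n

if __name__ == "__main__":
    with Pool(processes=4) as pool:                  # four concurrent workers
        estimates = pool.map(estimate_pi, range(4))  # one seed per task
    print(sum(estimates) / len(estimates))           # averaged estimate of pi
```

The same decomposition – independent tasks, scatter, gather – carries over directly to MPI ranks or GPU thread blocks.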
Programming Paradigms The procedural and object-oriented programming models dominate the field of computational physics. Among others, functional programming (Hudak, 1989) offers interesting possibilities, for example for code testing and correctness proofs (King and Launchbury, 1995), as well as for automated, rigorous proofs (The Coq development team, 2004).
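As a small illustration of the appeal (rendered here in Python rather than Haskell), pure functions depend only on their inputs, so algebraic properties can be stated and checked directly; all names below are invented for this sketch.

```python
# Two functional-style building blocks and an equational property check.
from functools import reduce

def compose(f, g):
    """Function composition: compose(f, g)(x) == f(g(x))."""
    return lambda x: f(g(x))

def running_sums(xs):
    """Prefix sums of xs, written as a fold instead of a mutating loop."""
    return reduce(lambda acc, x: acc + [acc[-1] + x], xs, [0])[1:]

# Because running_sums is pure, this property must hold for every input:
# the last prefix sum equals the total sum.
for xs in ([1], [1, 2, 3], [-5, 5, 7]):
    assert running_sums(xs)[-1] == sum(xs)

double_after_inc = compose(lambda x: 2 * x, lambda x: x + 1)
print(double_after_inc(3))  # → 8
```

Such property checks are exactly the kind of reasoning that functional languages elevate to full correctness proofs.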
Existing Frameworks Statistical testing, in particular, has developed an advanced infrastructure in the form of the R framework and ecosystem (R Development Core Team, 2008). There would be tremendous synergy if the physics community as a whole adopted such platforms and contributed more code and insight.
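R provides such tests out of the box; purely to illustrate the kind of routine meant here, the following Python sketch implements a one-sample z-test using only the standard library (the function name and the data are invented for this example).

```python
# A one-sample, two-sided z-test with known standard deviation sigma.
from math import sqrt
from statistics import NormalDist, mean

def z_test(sample, mu0, sigma):
    """Two-sided p-value for H0: the population mean equals mu0."""
    z = (mean(sample) - mu0) / (sigma / sqrt(len(sample)))
    return 2.0 * (1.0 - NormalDist().cdf(abs(z)))

# A sample consistent with mu0 = 0 yields a large p-value ...
print(z_test([0.1, -0.2, 0.05, 0.0], mu0=0.0, sigma=1.0) > 0.05)  # True
# ... while a clearly shifted sample rejects H0.
print(z_test([5.0] * 30, mu0=0.0, sigma=1.0) < 0.01)  # True
```

In practice, one would of course reach for the tested, peer-maintained implementations in R rather than re-implement them; the point is the shared, reusable infrastructure.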
Software Engineering and Professional Code Maintenance Source code quality can be a very important factor in the re-use of a code base, in its validation, and – most importantly – for the reproducibility of computational results by others, and thus for the scientific quality assurance process. Unit tests (Committee et al., 1986) have become a standard practice to this end. Experience with and implementations of such approaches need to be more widely discussed. Furthermore, software design patterns (Smith, 1987) can improve efficiency in implementing scientific computing ideas.
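As a concrete sketch of what unit testing looks like for numerical code, the following hypothetical example tests a toy trapezoidal-rule integrator with Python's standard unittest module, checking one exactness property and one convergence tolerance.

```python
import unittest

def trapezoid(f, a, b, n=100):
    """Composite trapezoidal rule for the integral of f over [a, b]."""
    h = (b - a) / n
    interior = sum(f(a + i * h) for i in range(1, n))
    return h * (0.5 * (f(a) + f(b)) + interior)

class TrapezoidTest(unittest.TestCase):
    def test_exact_for_linear_integrands(self):
        # The rule integrates linear functions exactly, even with n = 1.
        self.assertAlmostEqual(trapezoid(lambda x: 2.0 * x, 0.0, 1.0, n=1), 1.0)

    def test_quadratic_within_tolerance(self):
        # The integral of x^2 over [0, 1] is 1/3; the error is O(h^2).
        self.assertAlmostEqual(trapezoid(lambda x: x * x, 0.0, 1.0, n=1000),
                               1.0 / 3.0, places=5)

suite = unittest.defaultTestLoader.loadTestsFromTestCase(TrapezoidTest)
result = unittest.TextTestRunner(verbosity=0).run(suite)
print(result.wasSuccessful())  # True when all tests pass
```

Tests of this kind document the guaranteed behavior of a routine and make later refactoring, and independent reproduction, far safer.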
Experimental Validation and Computational Protocols The discussion and development of metrics and transferable benchmarks are the last challenge I would like to point to. Cross-comparison in accuracy and efficiency is often hindered by the fact that researchers adopt unique measures for such quantitative qualities of their simulation or numerical procedures.
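A transferable benchmark can be as simple as fixing a task, an accuracy metric, and a timing protocol, so that two methods become directly comparable. The following sketch, with invented method names and a toy task, records relative error and wall-clock time for two summation schemes.

```python
import math
import time

def benchmark(method, reference, task_size):
    """Return (relative error, wall-clock seconds) of method on a fixed task."""
    t0 = time.perf_counter()
    value = method(task_size)
    seconds = time.perf_counter() - t0
    return abs(value - reference) / abs(reference), seconds

# Two toy "methods" for the same task: partial sums of 1/k^2,
# whose exact limit is pi^2 / 6.
coarse = lambda n: sum(1.0 / k ** 2 for k in range(1, n // 10 + 1))
fine = lambda n: sum(1.0 / k ** 2 for k in range(1, n + 1))

reference = math.pi ** 2 / 6.0
for name, method in (("coarse", coarse), ("fine", fine)):
    rel_err, seconds = benchmark(method, reference, 100_000)
    print(f"{name}: relative error {rel_err:.2e} in {seconds:.4f} s")
```

Agreeing on such shared tasks and metrics, rather than on any particular implementation, is what makes results from different groups comparable.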
Computer Physics Communications. Available online at: http://www.journals.elsevier.com/computer-physics-communications/editorial-board
King, D. J., and Launchbury, J. (1995). “Structuring depth-first search algorithms in Haskell,” in Proceedings of the 22nd ACM SIGPLAN-SIGACT Symposium on Principles of Programming Languages, POPL '95 (New York, NY: ACM), 344–354. doi: 10.1145/199448.199530
Waechter, M., Hamacher, K., Hoffgaard, F., Widmer, S., and Goesele, M. (2012). “Is your permutation algorithm unbiased for n ≠ 2^m?,” in Proceedings of the 9th International Conference on Parallel Processing and Applied Mathematics - Volume Part I, PPAM'11 (Berlin, Heidelberg: Springer-Verlag), 297–306.
Citation: Hamacher K (2013) Grand challenges in computational physics. Front. Physics 1:4. doi: 10.3389/fphy.2013.00004
Received: 28 May 2013; Accepted: 30 May 2013;
Published online: 09 July 2013.
Edited by: Alex Hansen, Norwegian University of Science and Technology, Norway
Copyright © 2013 Hamacher. This is an open-access article distributed under the terms of the Creative Commons Attribution License, which permits use, distribution and reproduction in other forums, provided the original authors and source are credited and subject to any copyright notices concerning any third-party graphics etc.