Mini Review Volume 3 Issue 2
1National Research Council, Department of Physics, Università La Sapienza, Italy
2Laboratory of Theoretical Biochemistry, France
3Istituto per le Applicazioni del Calcolo "M. Picone" (IAC), National Research Council, Italy
4J Paulson School of Engineering and Applied Sciences, Harvard University, USA
Correspondence: Sauro Succi, CNR-IAC, Istituto per le Applicazioni del Calcolo "M. Picone", National Research Council, 00185 Rome, Italy; J Paulson School of Engineering and Applied Sciences, Harvard University, Cambridge, USA
Received: June 29, 2017 | Published: April 23, 2018
Citation: Melchionna S, Sterpone F, Succi S. Computational explorations at the physics-chemistry-biology interface. Int J Mol Biol Open Access. 2018;3(2):75-76. DOI: 10.15406/ijmboa.2018.03.00054
Molecular biology is paramount to biological and medical research, and it always will be. However, it is nowadays increasingly acknowledged that, as biology moves from in-vitro lab experiments to in-vivo physiological conditions, molecular biology needs to interface with other disciplines, such as chemistry, fluid dynamics and solid mechanics, which describe the interactions between biological molecules and their physiological environment. In particular, the function delivered by a given biological molecule, be it a protein or a drug, is highly sensitive to the way such molecules move and diffuse across their physiological environment.1,2 For instance, the diffusion of proteins within the crowded milieu of the cell is vastly slower than their motion in isolation. Because of these transport phenomena, biochemical optimization does not necessarily translate into the best physiological in-vivo performance.3 This presents a multi-disciplinary scenario that we have previously dubbed the PCB (Physics-Chemistry-Biology) interface.4 Owing to the major complexity of the phenomena taking place at the PCB interface, which entail the nonlinear cooperation/competition of different mechanisms (chemical reactions, diffusion across membranes...) acting on a broad spectrum of scales and often in complex geometries, the exploration of the PCB interface poses a major challenge to both theory and experiment. Analytical methods are clearly of limited use for nonlinear phenomena in complex geometries, while experimental approaches such as in-cell NMR, single-molecule techniques, and even neutron scattering, notwithstanding tremendous progress in the last decades, may still suffer from accessibility constraints. Under this state of affairs, computer simulations offer a powerful and flexible alternative/complement to the canonical routes above.5
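To make the crowding effect above concrete, a back-of-the-envelope sketch (ours, not taken from the cited works) based on the Stokes-Einstein relation can be written in a few lines; the protein radius and the roughly tenfold effective viscosity assigned to the cytoplasm are illustrative assumptions only.

```python
# Illustrative estimate of crowding-induced slow-down via Stokes-Einstein:
# D = kT / (6*pi*eta*r). The ~10x cytoplasmic viscosity factor is an assumed,
# order-of-magnitude value chosen only to illustrate the trend.
import math

k_B = 1.380649e-23           # Boltzmann constant, J/K
T = 310.0                    # physiological temperature, K
r = 2.5e-9                   # hydrodynamic radius of a small protein, m (assumed)
eta_water = 0.7e-3           # viscosity of water at 310 K, Pa*s

def stokes_einstein(eta):
    """Diffusion coefficient of a sphere of radius r in a fluid of viscosity eta."""
    return k_B * T / (6.0 * math.pi * eta * r)

D_dilute = stokes_einstein(eta_water)          # dilute (in vitro) limit
D_crowded = stokes_einstein(10.0 * eta_water)  # crowded cytoplasm, assumed ~10x viscosity

print(f"D (dilute):  {D_dilute:.2e} m^2/s")    # ~1e-10 m^2/s
print(f"D (crowded): {D_crowded:.2e} m^2/s")   # an order of magnitude slower
```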
Multiscale simulation
Computer simulation has witnessed an incredible boost over the last five decades, fueled on the one side by the relentless growth of compute power, doubling every two years (Moore's law), and on the other by the no less amazing progress in computer modelling techniques. As of today, the most powerful computer simulations can reach up to tens of Petaflops, which means of the order of ten million billion floating-point operations per second. Just to place this number in perspective, that means 10 floating-point operations per femtosecond, the time it takes an electron to make or break a chemical bond. This power is now capable of disclosing a microscopic visualization of the protein folding process up to milliseconds,6 and of resolving the motion of proteins in cellular conditions.7 Or, perhaps more tellingly, with the prospect of EXASCALE computing on the horizon, of simulating the operation of the full Golgi apparatus over a time span of a few milliseconds. It should be noted that such amazing performance does not come out of the blue; besides fast clocks, it requires clever combinations of different methodologies, each dealing with its own phenomena and corresponding spectrum of scales: typically, continuum methods for the fluid motion combined with particle methods for the biological agents, a paradigm known as multiscale computing. In addition, in order to attain the top performances mentioned above, such multiscale techniques must be organized in terms of efficient parallel implementations, i.e., the concurrent solution of multiple independent sub-problems, to be glued together at the end of the simulation to recover the full solution of the global problem.
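The arithmetic behind the femtosecond comparison is easy to verify in a couple of lines (Python used here purely for illustration):

```python
# Sanity-check of the numbers quoted above: tens of petaflops expressed
# as floating-point operations per femtosecond.
PFLOPS = 1e15                 # one petaflop/s = 10^15 operations per second
peak = 10 * PFLOPS            # "tens of petaflops": take 10 PFlop/s
femtosecond = 1e-15           # seconds

ops_per_fs = peak * femtosecond
print(ops_per_fs)             # -> 10.0 operations per femtosecond
```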
Many such techniques are nowadays available,8,9 typically merging Navier-Stokes solvers for the fluid with Brownian or molecular dynamics for the molecules, depending on the required degree of chemical specificity.
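As a flavor of how such fluid-particle coupling works in practice, the toy sketch below advances tracer particles by Brownian dynamics (overdamped Langevin, Euler-Maruyama) in a prescribed background flow. In a genuine multiscale code the flow field would be supplied by a Navier-Stokes solver rather than a closed-form expression; here a fixed shear flow stands in for the solver, and all parameter values are assumed for illustration.

```python
# Toy sketch of fluid-particle coupling: Brownian dynamics of tracers
# advected by a prescribed flow u(x). A real multiscale code would obtain
# u from a continuum (Navier-Stokes or Lattice Boltzmann) solver.
import numpy as np

rng = np.random.default_rng(0)

def shear_flow(pos):
    """Stand-in for the fluid solver: simple shear u = (gamma_dot * y, 0)."""
    gamma_dot = 1.0                       # shear rate, 1/s (assumed)
    u = np.zeros_like(pos)
    u[:, 0] = gamma_dot * pos[:, 1]
    return u

def brownian_step(pos, D, dt):
    """One Euler-Maruyama step: advection by the flow plus thermal noise."""
    drift = shear_flow(pos) * dt
    noise = np.sqrt(2.0 * D * dt) * rng.standard_normal(pos.shape)
    return pos + drift + noise

# 100 tracer "molecules" in 2D; diffusivity and time step purely illustrative.
pos = rng.standard_normal((100, 2))
D, dt = 0.1, 1e-3
for _ in range(1000):
    pos = brownian_step(pos, D, dt)
print("mean displacement:", pos.mean(axis=0))
```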
For the last decade, several groups, including ours, have been developing an alternative path based on the combination of mesoscale kinetic methods (Lattice Boltzmann) for the aqueous solvent with a variety of deterministic and stochastic particle methods for the biological molecules, the so-called LBPD (Lattice-Boltzmann-Particle-Dynamics) multiscale method. LBPD has now been used for more than a decade to investigate a number of PCB problems, such as multiscale hemodynamics at red-blood-cell resolution,10‒12 biopolymer translocation across biological membranes,13 protein folding, polypeptidic aggregation and the formation of membranes within the cell.14‒17 Such multiscale strategies follow in the glorious tradition opened up in the late 1960s by heroic methods such as QM/MM, whereby researchers set out to simulate cartoon-proteins using molecular dynamics18‒21 for the backbone, leaving only selected portions for a full quantum mechanical treatment. Originally limited to a few thousand particles, the above methods have flourished into a full computational endeavor, which nowadays allows one to glean quantitative information on real biological molecules, a pioneering effort which culminated in the 2013 Nobel recognition.14 We hope and wish that computer explorations at the PCB interface may one day replicate a similar success story within the physiological context.
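For readers unfamiliar with the fluid side of LBPD, the following minimal sketch shows the basic Lattice Boltzmann building block, a BGK collide-and-stream cycle on a D2Q9 lattice with periodic boundaries. It is a generic textbook toy under our own assumed parameters, not the authors' LBPD implementation; the particle-dynamics coupling (momentum exchange with suspended molecules) is omitted for brevity.

```python
# Minimal D2Q9 Lattice Boltzmann (BGK) collide-and-stream loop on a periodic box.
import numpy as np

# D2Q9 lattice: discrete velocities c_i and weights w_i
c = np.array([[0,0],[1,0],[0,1],[-1,0],[0,-1],[1,1],[-1,1],[-1,-1],[1,-1]])
w = np.array([4/9] + [1/9]*4 + [1/36]*4)

nx, ny, tau = 64, 64, 0.8          # grid size and BGK relaxation time (assumed)

def equilibrium(rho, u):
    """Second-order truncated Maxwell-Boltzmann equilibrium, 9 directions."""
    cu = np.einsum('id,xyd->ixy', c, u)       # c_i . u
    usq = np.einsum('xyd,xyd->xy', u, u)      # |u|^2
    return w[:, None, None] * rho * (1 + 3*cu + 4.5*cu**2 - 1.5*usq)

# Start at rest with a small density perturbation in the center
rho = np.ones((nx, ny)); rho[nx//2, ny//2] += 0.01
u = np.zeros((nx, ny, 2))
f = equilibrium(rho, u)

for step in range(100):
    # Collision: relax the populations toward the local equilibrium (BGK)
    rho = f.sum(axis=0)
    u = np.einsum('id,ixy->xyd', c, f) / rho[..., None]
    f += (equilibrium(rho, u) - f) / tau
    # Streaming: shift each population along its lattice velocity (periodic)
    for i in range(9):
        f[i] = np.roll(f[i], shift=tuple(c[i]), axis=(0, 1))

print("total mass conserved:", np.isclose(f.sum(), nx*ny + 0.01))
```

In an LBPD-style code, the suspended molecules would exert local forces on the fluid populations and feel the hydrodynamic drag in return at every such cycle.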
Why simulations? The chaperone story
In a recent Solvay Colloquium on these very matters, one of the authors (SS) was questioned by a structural biologist as follows: "What's the point of these highly sophisticated simulations for, say, protein folding, given that we know that nature has been endowed with chaperones?" This question seems to cast a shadow over the large number of molecular biologists, chemists and biophysicists who joined forces to unveil, from the experimental point of view, the details (kinetics, thermodynamics, and molecular mechanism) of the spontaneous process of protein folding, the one occurring without the correctional support of chaperones. More directly related to our discussion, the question follows in the footsteps of a more-or-less healthy tradition of skepticism towards computer simulation among hard-core molecular biologists. The answers are many, depending on the degree of "polemic" one is willing to engage in. From our side, perhaps the simplest answer goes as follows: computer simulations are meant to help asking questions and, based on the answers, hopefully to develop some understanding. The chaperone is there to help proteins find their way to a native state they would not be able to reach without assistance, fair enough. But how does that work? To what extent do protein-specific and universal principles cooperate to achieve the desired task? And how is the success rate affected by environmental conditions, such as temperature and hydrodynamic correlations? These are just some of the fundamental questions that computer simulation may help raise and answer, possibly even illuminating the basic mechanisms by which the chaperone manages to materialize in the first place. Empirical observation is paramount, but it surely does not dispense with the need for understanding. This is the very point of computer simulation.
Acknowledgments
None.
Conflict of interest
The authors declare no conflict of interest.
©2018 Melchionna, et al. This is an open access article distributed under the terms of the Creative Commons Attribution License, which permits unrestricted use, distribution, and building upon the work non-commercially.