Coarse-grained modeling

http://dbpedia.org/resource/Coarse-grained_modeling

rdf:langString Coarse-grained modeling
rdf:langString Modelowanie gruboziarniste
xsd:integer 50956705
xsd:integer 1073929317
rdf:langString Coarse-grained modeling aims at simulating the behaviour of complex systems using coarse-grained (simplified) representations. Coarse-grained models are widely used for molecular modeling of biomolecules at various granularity levels. A wide range of coarse-grained models have been proposed. They are usually dedicated to computational modeling of specific molecules: proteins, nucleic acids, lipid membranes, carbohydrates or water. In these models, molecules are represented not by individual atoms but by "pseudo-atoms" approximating groups of atoms, such as a whole amino-acid residue. By decreasing the number of degrees of freedom, much longer simulation times can be studied at the expense of molecular detail. Coarse-grained models have found practical applications in molecular dynamics simulations. Another case of interest is the simplification of a given discrete-state system, as very often descriptions of the same system at different levels of detail are possible. An example is given by the chemomechanical dynamics of a molecular machine such as kinesin. Coarse-grained modeling originates from the work of Michael Levitt and Arieh Warshel in the 1970s. Coarse-grained models are presently often used as components of multiscale modeling protocols, in combination with reconstruction tools (from coarse-grained to atomistic representation) and atomistic-resolution models. Atomistic-resolution models alone are presently not efficient enough to handle large system sizes and simulation timescales. Coarse graining and fine graining in statistical mechanics address the subject of entropy $S$, and thus the second law of thermodynamics. One has to realise that the concept of temperature cannot be attributed to an arbitrarily small microscopic particle, since such a particle does not radiate thermally like a macroscopic or "black body". However, one can attribute a nonzero entropy to an object with as few as two states, like a "bit" (and nothing else).
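The pseudo-atom mapping described above can be sketched in a few lines. The following is a minimal, self-contained illustration (the function names and toy coordinates are assumptions for this sketch, not part of any specific coarse-grained force field): each group of atoms is lumped into a single bead carrying the group's total mass at its centre of mass.

```python
# Sketch: coarse-graining an all-atom structure into pseudo-atoms.
# Each atom is (mass, (x, y, z)); each group becomes one bead.

def com_bead(atoms):
    """Collapse a list of (mass, (x, y, z)) atoms into one pseudo-atom:
    the total mass placed at the group's centre of mass."""
    total = sum(m for m, _ in atoms)
    com = tuple(sum(m * r[i] for m, r in atoms) / total for i in range(3))
    return total, com

def coarse_grain(groups):
    """One bead per atom group, e.g. one bead per amino-acid residue."""
    return [com_bead(g) for g in groups]

# Two toy "residues" of two atoms each (illustrative coordinates)
groups = [
    [(12.0, (0.0, 0.0, 0.0)), (12.0, (2.0, 0.0, 0.0))],
    [(1.0, (0.0, 4.0, 0.0)), (3.0, (0.0, 0.0, 0.0))],
]
beads = coarse_grain(groups)
print(beads)  # [(24.0, (1.0, 0.0, 0.0)), (4.0, (0.0, 1.0, 0.0))]
```

Real coarse-grained models assign interaction potentials between such beads; the mapping step itself is this simple reduction of degrees of freedom.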
The entropies of the two cases are called thermal entropy and von Neumann entropy respectively. They are also distinguished by the terms coarse grained and fine grained respectively. This latter distinction is related to the aspect spelled out above and is elaborated on below.

The Liouville theorem (sometimes also called the Liouville equation) states that a phase-space volume $\Delta q\,\Delta p$ (spanned by $\Delta q$ and $\Delta p$, here in one spatial dimension) remains constant in the course of time, no matter where the point contained in $\Delta q\,\Delta p$ moves. This is a consideration in classical mechanics. In order to relate this view to macroscopic physics one surrounds each point, e.g., with a sphere of some fixed volume, a procedure called coarse graining which lumps together points or states of similar behaviour. The trajectory of this sphere in phase space then also covers other points, and hence its volume in phase space grows. The entropy associated with this consideration, whether zero or not, is called coarse-grained entropy or thermal entropy.

A large number of such systems, i.e. the one under consideration together with many copies, is called an ensemble. If these systems do not interact with each other or anything else, and each has the same energy $E$, the ensemble is called a microcanonical ensemble. Each replica system appears with the same probability, and temperature does not enter.

Now suppose we define a probability density $\rho$ describing the motion of the point with phase-space element $\Delta q\,\Delta p$. In the case of equilibrium or steady motion the equation of continuity implies that the probability density $\rho$ is independent of time $t$. We take $\rho$ as nonzero only inside the phase-space volume $V_\Gamma$. One then defines the entropy $S$ by the relation

$$S = -k_B \sum \rho \ln \rho, \qquad \sum \rho = 1,$$

where $k_B$ is the Boltzmann constant. Then, by maximisation of $S$ for a given energy $E$, i.e. linking $\delta S = 0$ with the variation $\delta \sum \rho = 0$ of the other sum, set equal to zero via a Lagrange multiplier $\lambda$, one obtains (as in the case of a lattice of spins or with a bit at each lattice point)

$$\rho = \mathrm{const.} \qquad \text{and} \qquad S = k_B \ln V_\Gamma,$$

the volume of $V_\Gamma$ being proportional to the exponential of $S$. This is again a consideration in classical mechanics. In quantum mechanics the phase space becomes a space of states, and the probability density an operator, with a subspace of states of dimension or number of states $N_\Gamma$ specified by a projection operator $P_\Gamma$. Then the entropy is (obtained as above)

$$S = k_B \ln N_\Gamma$$

and is described as fine-grained or von Neumann entropy. If $N_\Gamma = 1$, the entropy vanishes and the system is said to be in a pure state. Here the exponential of $S$ is proportional to the number of states. The microcanonical ensemble is again a large number of noninteracting copies of the given system, and $S$, energy $E$, etc. become ensemble averages.

Now consider the interaction of a given system with another one, or, in ensemble terminology, the given system and the large number of replicas all immersed in a big one called a heat bath, characterised by a probability density $\rho$. Since the systems interact only via the heat bath, the individual systems of the ensemble can have different energies, depending on which energy state they are in. This interaction is described as entanglement, and the ensemble as a canonical ensemble (the macrocanonical ensemble also permits exchange of particles). The interaction of the ensemble elements via the heat bath leads to temperature $T$, as we now show. Considering two elements with energies $E_1, E_2$, the probability of finding these in the heat bath is proportional to $\rho(E_1)\,\rho(E_2)$, and this is proportional to $\rho(E_1 + E_2)$ if we consider the binary system as a system in the same heat bath defined by the function $\rho$. It follows that

$$\rho(E) \propto e^{-E/k_B T}$$

(the only way to satisfy the proportionality), where $T$ is a constant. Normalisation then implies

$$\rho = \frac{e^{-E/k_B T}}{\operatorname{Tr} e^{-E/k_B T}}, \qquad \operatorname{Tr} \rho = 1.$$

Then, in terms of ensemble averages,

$$\langle E \rangle = \operatorname{Tr}(\rho E), \qquad S = -k_B \operatorname{Tr}(\rho \ln \rho),$$

and $T = \delta\langle E \rangle / \delta S$ by comparison with the second law of thermodynamics. $S$ is now the entanglement entropy or fine-grained von Neumann entropy.
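The two entropy results discussed above can be checked numerically. The sketch below (with $k_B$ set to 1; the function names are illustrative assumptions) confirms the standard facts that equal energies yield the uniform, microcanonical distribution with maximised entropy $\ln W$ for $W$ states, and that canonical Boltzmann weights give a two-level "bit" in a heat bath an entropy strictly between 0 and $\ln 2$.

```python
import math

# Numeric check of the microcanonical and canonical entropy results
# (k_B = 1 for simplicity; rho_i ∝ exp(-E_i / T)).

def canonical_rho(energies, T):
    weights = [math.exp(-E / T) for E in energies]
    Z = sum(weights)                      # partition function
    return [w / Z for w in weights]

def entropy(rho):
    """Gibbs entropy S = -sum(rho_i ln rho_i), with k_B = 1."""
    return -sum(p * math.log(p) for p in rho if p > 0.0)

# Equal energies -> uniform rho -> microcanonical result S = ln W
W = 8
rho_uniform = canonical_rho([1.0] * W, T=1.0)
print(abs(entropy(rho_uniform) - math.log(W)) < 1e-12)   # True

# A two-level "bit" at finite temperature: 0 < S < ln 2
rho_bit = canonical_rho([0.0, 1.0], T=1.0)
print(0.0 < entropy(rho_bit) < math.log(2))              # True
```

The uniform case corresponds to the maximisation with a Lagrange multiplier described above: with no energy differences, every state is equally probable.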
This is zero if the system is in a pure state, and is nonzero when it is in a mixed (entangled) state. Above we considered a system immersed in another huge one called a heat bath, with the possibility of allowing heat exchange between them. Frequently one considers a different situation, i.e. two systems A and B with a small hole in the partition between them. Suppose B is originally empty but A contains an explosive device which fills A instantaneously with photons. Originally A and B have energies $E_A$ and $E_B$ respectively, and there is no interaction. Hence originally both are in pure quantum states and have zero fine-grained entropies. Immediately after the explosion A is filled with photons, its energy still being $E_A$ and that of B still $E_B$ (no photon has yet escaped). Since A is filled with photons, these obey a Planck distribution law, and hence the coarse-grained thermal entropy of A is nonzero (recall: many configurations of the photons in A, many states, with one maximal), although the fine-grained quantum-mechanical entropy is still zero (same energy state), as is that of B. Now allow photons to leak slowly (i.e. with no disturbance of the equilibrium) from A to B. With fewer photons in A, its coarse-grained entropy diminishes, but that of B increases. This entanglement of A and B implies they are now quantum mechanically in mixed states, and so their fine-grained entropies are no longer zero. Finally, when all photons are in B, the coarse-grained entropy of A as well as its fine-grained entropy vanish, and A is again in a pure state but with new energy. On the other hand, B now has an increased thermal entropy, but since the entanglement is over it is quantum mechanically again in a pure state, its ground state, and that has zero fine-grained von Neumann entropy. Consider B: in the course of the entanglement with A, its fine-grained or entanglement entropy started and ended in pure states (thus with zero entropy).
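The pure/mixed distinction used in this argument can be made concrete. The following sketch (an illustration with $k_B = 1$; names are assumptions) evaluates the fine-grained von Neumann entropy $S = -\operatorname{Tr}(\rho \ln \rho)$ from the eigenvalues of a density matrix: a pure state has a single eigenvalue 1 and zero entropy, while a mixed (entangled) subsystem has entropy up to $\ln 2$ for a single qubit.

```python
import math

# Von Neumann entropy S = -Tr(rho ln rho), computed from the eigenvalues
# of the density matrix rho (k_B = 1).  Pure state -> S = 0;
# maximally mixed qubit -> S = ln 2.

def von_neumann_entropy(eigenvalues):
    return -sum(p * math.log(p) for p in eigenvalues if p > 0.0)

pure = [1.0, 0.0]       # pure state: zero fine-grained entropy
mixed = [0.5, 0.5]      # maximally mixed qubit
partial = [0.9, 0.1]    # partially mixed: 0 < S < ln 2

print(von_neumann_entropy(pure) == 0.0)                        # True
print(abs(von_neumann_entropy(mixed) - math.log(2)) < 1e-12)   # True
print(0.0 < von_neumann_entropy(partial) < math.log(2))        # True
```

In the photon thought experiment, A and B start and end with eigenvalue lists like `pure` and pass through mixed spectra like `partial` while entangled.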
Its coarse-grained entropy, however, rose from zero to its final nonzero value. Roughly halfway through the procedure the entanglement entropy of B reaches a maximum and then decreases to zero at the end. The classical coarse-grained thermal entropy of the second law of thermodynamics is not the same as the (mostly smaller) quantum-mechanical fine-grained entropy. The difference is called information. As may be deduced from the foregoing arguments, this difference is roughly zero before the entanglement entropy (which is the same for A and B) attains its maximum. An example of coarse graining is provided by Brownian motion.
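The rise and fall of B's entanglement entropy can be illustrated with a deliberately crude toy model (an assumption for illustration only, not derived from the photon argument): treat the leak as $N$ maximally entangled qubit pairs transferred one at a time, so that B's entanglement entropy is roughly $\min(k, N-k)\ln 2$ after $k$ transfers, zero at both ends and maximal about halfway through.

```python
import math

# Toy model of the entanglement entropy of subsystem B during the leak:
# S_B(k) ≈ min(k, N - k) ln 2 after k of N entangled "photon" transfers.
# This is an illustrative assumption, not a derivation.

def entanglement_entropy(k, N):
    return min(k, N - k) * math.log(2)

N = 10
curve = [entanglement_entropy(k, N) for k in range(N + 1)]
print(curve[0], curve[N])                              # 0.0 0.0
print(max(curve) == entanglement_entropy(N // 2, N))   # True: peak at midpoint
```

The qualitative shape, zero at the start and end with a single maximum near the middle, is the behaviour the text describes for B.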
rdf:langString Coarse-grained modeling is a molecular modeling method that uses coarse-grained models, in which a group of atoms is represented by a single interaction centre called a pseudo-atom or united atom. The method simplifies many of the relationships present in biological and chemical systems and reduces the number of degrees of freedom of the system, which decreases both the amount and the duration of the computation needed in a simulation. Such methods are particularly useful for modeling highly complex systems. They are used in simulations of biomolecules such as proteins, nucleic acids, lipid membranes, carbohydrates or water. Coarse-grained modeling is often used as a stage of multiscale modeling: the coarse-grained simulation is followed by an all-atom simulation. This shortens the simulation time and makes it possible to model large systems.
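The reconstruction step of such multiscale protocols (from coarse-grained back to atomistic representation) can also be sketched. The routine below is hypothetical: it simply places a fixed template of atom offsets around each pseudo-atom position, a crude first guess that real reconstruction tools would then relax with an all-atom force field.

```python
# Hypothetical backmapping sketch: rebuild approximate atomistic coordinates
# by placing a template of atom offsets around each pseudo-atom position.

def backmap(bead_positions, template_offsets):
    atoms = []
    for bx, by, bz in bead_positions:
        for ox, oy, oz in template_offsets:
            atoms.append((bx + ox, by + oy, bz + oz))
    return atoms

beads = [(0.0, 0.0, 0.0), (3.0, 0.0, 0.0)]
offsets = [(-0.5, 0.0, 0.0), (0.5, 0.0, 0.0)]   # two atoms per bead (toy template)
atoms = backmap(beads, offsets)
print(atoms)  # [(-0.5, 0.0, 0.0), (0.5, 0.0, 0.0), (2.5, 0.0, 0.0), (3.5, 0.0, 0.0)]
```

Coarse graining discards detail irreversibly, so any backmapping is an approximation; the all-atom simulation that follows corrects the guessed geometry.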
xsd:nonNegativeInteger 16106
