How is diffusion like entropy?

Since diffusing particles are forced to move along the fingers of the pattern, the system is locally highly anisotropic. On large time and length scales, however, the different local anisotropies are expected to cancel and produce approximately isotropic behavior. Furthermore, the solution assumes a single globally well-defined path dimension d_m, while in noisy real systems this dimension is known to vary locally.

In previous work, the predicted second moment of the solution Equation 3 was tested against the mean-square displacement of random-walk simulations in the pattern with reflecting boundary conditions, and good agreement was found, supporting its validity as an effective model [16].

Given the above solution Equation 3, the entropy of the diffusion process can be calculated analytically. The precise type of entropy we consider is not of great importance here, as long as it is the same entropy that is calculated later in the numerical methods of section 3.

This is because we are ultimately interested in using the numerical measurements of the entropy as an indirect measurement of the path dimension of the frictional fingers. From an information-theoretic perspective there are many entropies that could be considered, most of which can be thought of as analytic continuations of the Shannon-Gibbs entropy, which is recovered when some entropic parameter is tuned appropriately [21].

Here we consider the Shannon-Gibbs formula, as it is not only readily calculated but also most closely connected to the entropy familiar from extensive thermodynamics. The Shannon-Gibbs entropy for the particle density takes the standard form [22]. According to Equation 4, we then have the entropy in terms of a non-integer moment of the distribution.
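For reference, the Shannon-Gibbs entropy of a continuous particle density P(r, t) is conventionally written as the functional below; this is the standard expression [22], and the normalization may differ slightly from that of Equation 4.

\[
S(t) = -\int \mathrm{d}^{d}r \; P(\mathbf{r}, t) \ln P(\mathbf{r}, t).
\]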

Since our distribution is a simple shifted Gaussian, a change of variables readily allows us to find this moment using a standard Gaussian integral.
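The integral in question is presumably of the standard Gaussian-moment type, for example

\[
\int_{0}^{\infty} x^{\mu} e^{-a x^{2}} \,\mathrm{d}x
  = \frac{\Gamma\!\left(\tfrac{\mu+1}{2}\right)}{2\, a^{(\mu+1)/2}},
  \qquad a > 0,\; \mu > -1,
\]

which gives non-integer moments of a Gaussian directly in terms of the Gamma function.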

As expected, a higher path dimension, meaning a more disordered system geometry, gives a lower entropy production, since the diffusion process is more hindered. Using the same integral formula as above, it is also easily shown from the solution Equation 3 that the mean-square displacement takes a simple closed form.

To calculate the entropy numerically, we construct a simplified discrete random-walk model of the diffusion process.

To make these simulations more efficient, we make some simplifying assumptions. The biggest simplification comes from applying a topological contraction to the pattern, setting the finger widths to zero and effectively turning the problem into a one-dimensional one. The resulting skeletonized version of the pattern, shown in Figure 1B, is what we release random walkers on.

This topological simplification does not change the main geometric features of the pattern, since the folding and connectedness of every branch are conserved. For the numerical simulations, the one-dimensional skeletonized pattern is discretized before the discrete random-walk process is run on it.

In the resulting discrete "morphological graph" of the pattern there are no additional inhomogeneities associated with transition probabilities over links, as all the inhomogeneity we are interested in stems from the pattern itself. In practice, the discretization is obtained by pixelating the skeletonized pattern and treating the pixels as sites for the random walker.
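As a rough sketch of this preprocessing step (the paper does not give its implementation, so the use of scikit-image and the file name below are assumptions), the skeleton pixels could be extracted as follows:

```python
import numpy as np
from skimage.io import imread
from skimage.morphology import skeletonize

# Load a binary image of the finger pattern (hypothetical file name).
pattern = imread("finger_pattern.png", as_gray=True) > 0.5

# Topological contraction: the skeleton sets the finger widths to zero
# while conserving the folding and connectedness of every branch.
skeleton = skeletonize(pattern)

# The "on" pixels of the skeleton are the sites available to the walkers.
sites = np.argwhere(skeleton)                      # (row, col) of each skeleton pixel
site_set = {(int(r), int(c)) for r, c in sites}    # fast membership test for neighbor lookups
```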

A cartoon of the pixels is shown in Figure 1B. A random walker jumps to one of its neighboring pixels, including diagonal neighbors, with equal probability. Since the code is run with a very large number of particles, we draw the number of particles that move to a given neighboring pixel from a binomial distribution.

Hence, at every time step we only need as many random numbers as there are neighbors of a given pixel, rather than one number per particle as in traditional random-walk methods. To calculate the entropy numerically we use the Gibbs-Shannon formula for the discrete random walk, S(t) = −Σ_i p_i(t) ln p_i(t), where p_i(t) is the probability of finding a particle at pixel i at time t.
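A minimal sketch of this update rule is given below, using the site_set built above, an assumed 8-connected pixel neighborhood, and function and variable names of our own choosing. The paper describes per-neighbor binomial draws; the multinomial used here is the equivalent joint draw over all neighbors of a pixel.

```python
import numpy as np

# 8-connected neighborhood: axial and diagonal jumps are equally likely.
OFFSETS = [(-1, -1), (-1, 0), (-1, 1),
           ( 0, -1),          ( 0, 1),
           ( 1, -1), ( 1, 0), ( 1, 1)]

def step(counts, site_set, rng):
    """Advance every particle by one time step.

    counts   : dict mapping (row, col) -> number of particles at that pixel
    site_set : set of (row, col) pixels belonging to the skeleton
    rng      : numpy random Generator
    """
    new_counts = {}
    for (r, c), n in counts.items():
        # Neighboring skeleton pixels of this site.
        nbrs = [(r + dr, c + dc) for dr, dc in OFFSETS
                if (r + dr, c + dc) in site_set]
        if not nbrs:                      # isolated pixel: particles stay put
            new_counts[(r, c)] = new_counts.get((r, c), 0) + n
            continue
        # Split the n particles among the neighbors with equal probability:
        # one multinomial draw instead of one random number per particle.
        split = rng.multinomial(n, [1.0 / len(nbrs)] * len(nbrs))
        for nbr, k in zip(nbrs, split):
            if k:
                new_counts[nbr] = new_counts.get(nbr, 0) + int(k)
    return new_counts
```

Drawing the whole split at once keeps the cost per time step proportional to the number of occupied pixels rather than the number of particles.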

This probability is straightforwardly calculated as the ratio of the number of particles at pixel i at time t to the total number of particles in the system. The system is initialized with all particles released at the same position, as the analytical solution assumes a Dirac delta-like initial condition. Figure 2 shows the entropy of the simulation for three different randomly chosen initial positions close to the center of the pattern.
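Continuing the sketch above (again with assumed names and parameters), the discrete entropy can be computed from the occupation numbers, with all particles initially released at a single skeleton pixel:

```python
import numpy as np

def shannon_entropy(counts):
    """Gibbs-Shannon entropy of the particle distribution.

    p_i is the number of particles at pixel i divided by the total
    number of particles; empty pixels contribute nothing to the sum.
    """
    n = np.array([v for v in counts.values() if v > 0], dtype=float)
    p = n / n.sum()
    return -np.sum(p * np.log(p))

# Dirac-delta-like initial condition: all particles at one site.
# Here an arbitrary skeleton pixel is used; the paper releases the
# walkers at positions close to the centre of the pattern.
rng = np.random.default_rng(0)
start_site = min(site_set)
n_steps = 10_000
counts = {start_site: 10**6}
entropies = []
for t in range(n_steps):
    counts = step(counts, site_set, rng)
    entropies.append(shannon_entropy(counts))
```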

We see that while the temporal scaling agrees, they have different zero-point entropies. By inspection of Equation 7 we see that it is possible to have the same temporal scaling but a different zero-point entropy if the diffusion constant D_0 is allowed to vary throughout the system.
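As a simple illustration of this mechanism (for ordinary diffusion from a point source in d dimensions, not the paper's Equation 7 itself), the entropy of a spreading Gaussian is

\[
S(t) = \frac{d}{2}\ln\!\left(4\pi e D_{0} t\right)
     = \frac{d}{2}\ln t + \frac{d}{2}\ln\!\left(4\pi e D_{0}\right),
\]

so a locally varying D_0 shifts the constant offset while the slope with respect to ln t is unchanged.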

Figure 2. The entropy associated with three randomly chosen initial positions inside the finger pattern. The dotted line is a reference line 0. The entropy shows a very convincing growth proportional to log t over several decades.

This value for the path dimension is consistent with the ones obtained in earlier work, although the value obtained through the entropy is much closer to the minimum-spanning-tree (MST) value [16]. This significantly strengthens our belief that the frictional finger pattern lies in the MST universality class and can be seen as a continuum analog of the lattice MST.

Figure 3.

In this paper we have studied the entropy of diffusion in frictional finger patterns. In addition to being a measure of how fast the non-equilibrium process is evolving, the entropy is also considered as a tool for studying the system's coarser geometry, since the diffusing particles explore the large-scale structure at late times.

This strengthens the current hypothesis that the frictional fingers belong to this class.

The datasets generated for this study are available on request to the corresponding author.

KO performed analytical calculations and wrote the paper. JC developed numerical code crucial for the paper, analyzed the pattern, and aided in the writing process.
