Independent Sources of Information-Theoretic Descriptors of Electronic States

The need for resultant measures of the Information-Theoretic (IT) content of molecular electronic wavefunctions, combining the information contributions due to the probability and phase/current distributions, is reemphasized. Complementary measures of the state entropy (disorder) and information (order) contents are reexamined, the continuities of wavefunction components are summarized, and the probability acceleration concept is used to determine the current and information sources. The experimental elimination of the state uncertainties is discussed and limitations in this information-acquirement process imposed by the Heisenberg indeterminacy principle are commented upon.

In this work we shall focus on the modulus and phase components of electronic wavefunctions, and on the probability/current distributions they generate. We shall argue that these two degrees-of-freedom of molecular states represent independent sources of the overall information content in the state position representation, which can be combined into a single (resultant) entropy/information descriptor. The logarithmic separation of the state components will be emphasized and their dynamics will be formulated in terms of the associated continuity relations. The information gained in an experimental removal of the state uncertainties will be discussed, and the limitations imposed on such a process by the Heisenberg indeterminacy principle will be commented upon.

Independent Origins of Information Content in Quantum States
In molecular quantum mechanics (QM) electronic states are described by complex wavefunctions exhibiting independent modulus and phase components. Consider, for simplicity, the quantum state |ψ(t)⟩ of a single electron at time t, and the associated wavefunction in the position representation,

ψ(r, t) = ⟨r|ψ(t)⟩ = R(r, t) exp[iφ(r, t)],   (1)

defined by its (real) modulus R(r, t) and phase φ(r, t) ≥ 0 parts. The state logarithm separates (additively) these independent degrees-of-freedom of electronic states:

2 ln ψ(r, t) = 2 ln R(r, t) + 2iφ(r, t) = ln p(r, t) + 2iφ(r, t),   (2)

where

p[ψ(r, t)] = ⟨ψ(t)|r⟩⟨r|ψ(t)⟩ = |ψ(r, t)|² = R(r, t)² = p(r, t)   (3)

denotes the (classical) position-probability density. In Eq. (2) the classical (probability) and nonclassical (phase) descriptors thus determine the real and imaginary components of the wavefunction logarithm.

* Professor Emeritus.
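The logarithmic separation of Eq. (2) is easy to verify numerically. The following sketch (Python, assuming an illustrative Gaussian modulus and a small linear phase so that the principal branch of the complex logarithm applies; none of these choices comes from the text) checks that 2 ln ψ = ln p + 2iφ on a grid:

```python
import numpy as np

# Numerical check of the logarithmic separation 2 ln psi = ln p + 2i*phi.
# The Gaussian modulus and linear phase below are illustrative assumptions.
x = np.linspace(-5.0, 5.0, 2001)
R = np.exp(-x**2 / 2.0) / np.pi**0.25          # modulus component R(x)
phi = 0.3 * x                                  # phase component phi(x), |phi| < pi here
psi = R * np.exp(1j * phi)                     # psi = R exp(i phi)
p = np.abs(psi)**2                             # probability density p = R^2

# Principal branch of the complex logarithm is valid while |phi| < pi.
lhs = 2.0 * np.log(psi)
rhs = np.log(p) + 2j * phi
assert np.allclose(lhs, rhs)
```

The real part of 2 ln ψ carries only the probability density, the imaginary part only the phase, in accord with Eq. (2).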
This electronic state also defines the current distribution j(r, t), the expectation value of the local current operator ĵ(r),

j(r, t) = ⟨ψ(t)|ĵ(r)|ψ(t)⟩,   ĵ(r) = [p̂ |r⟩⟨r| + |r⟩⟨r| p̂]/(2m),   (4)

where p̂ stands for the particle momentum operator. In terms of the state components,

j(r, t) = (ħ/m) p(r, t) ∇φ(r, t) ≡ p(r, t) V(r, t).   (5)

The effective velocity V(r, t) of the spatial probability distribution introduced in the preceding equation measures the local current-per-particle, reflecting the density of the average current momentum p(r, t) ≡ ħ k(r, t), or the associated wave-vector distribution

k(r, t) = ∇φ(r, t) = (m/ħ) V(r, t).   (6)

It should be emphasized that these local (dynamic) distributions of the probability velocity or the current momentum are available together with the spatial probability distribution in a common position representation of QM, i.e., for the parametrically (sharply) specified electronic location r(t) in the physical space.
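The two equivalent forms of the current, the quantum-mechanical expectation and the phase-gradient expression j = pV, can be compared numerically. A minimal sketch, assuming units ħ = m = 1 and a Gaussian wavepacket with a constant wave vector k (illustrative choices, not taken from the text):

```python
import numpy as np

hbar, m = 1.0, 1.0                              # atomic-style units (assumption)
x = np.linspace(-10.0, 10.0, 4001)
k = 0.7                                         # constant wave vector (assumption)
R = np.exp(-x**2 / 4.0) / (2.0 * np.pi)**0.25   # normalized Gaussian modulus
psi = R * np.exp(1j * k * x)                    # psi with linear phase phi = k x
p = np.abs(psi)**2

# Quantum-mechanical current: j = (hbar/m) Im(psi* d(psi)/dx)
j_qm = (hbar / m) * np.imag(np.conj(psi) * np.gradient(psi, x))

# Phase-gradient form: j = p V, with V = (hbar/m) d(phi)/dx = (hbar/m) k
V = (hbar / m) * k
j_phase = p * V

assert np.allclose(j_qm, j_phase, atol=1e-4)
```

For a linear phase the velocity field is uniform, so the current simply rides on the probability density, as the phase-gradient form makes explicit.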
In both the classical probability theory and in the position representation of QM the admissible locations {r} of an electron represent the complete set of (exclusive) elementary spatial events, which exhaust the whole physical space. The classical (infinite, continuous) probability scheme {r → p(r)} of Fig. 1 describes a state of position uncertainty (indeterminacy) measured by Shannon's global entropy, since we know only the probabilities p(r) = |ψ(r)|² of alternative definite outcomes in the underlying particle-localization experiment. This average entropy descriptor measures a "spread" (uncertainty, indeterminacy) of the position probability distribution. Another suitable classical measure of the average information content in p(r) is provided by Fisher's probability functional. This gradient measure of the "compactness" (height, determinacy) of the probability distribution thus complements the Shannon "width" descriptor. The (classical) entropic measures thus reflect the uncertainty contained in the state ("static") probability distribution at the specified time t = t0, p(r, t0), missing the (nonclassical) order/disorder content of the ("dynamical") distribution k(r, t0) originating from φ(r, t0). The consistent IT treatment of molecular states calls for simultaneously accounting for both these sources of quantum entropy/information. This suggests the need for an appropriate generalization of the complementary probability descriptors of classical IT [18-25], e.g., Shannon's global entropy S[p] [24,25] and Fisher's gradient information I[p] [19,20], into the corresponding wavefunction functionals S[ψ] and I[ψ] applicable to reflect the overall entropy and information content of generally complex states in molecular QM [4,60-69].
To summarize, the independent modulus and phase components of the electronic wavefunction in the position representation determine the physical descriptors of the (spatial) probability density and of the associated distributions of probability velocity or current, respectively. They account for the "static" and "dynamic" (convection) aspects of the state probability distribution, i.e., its structures of "being" and "becoming" [70]. The resultant measures combine the classical and nonclassical contributions,

S[ψ] = S[p] + S[φ],   I[ψ] = I[p] + I[φ],   (7)

where the classical functionals are Shannon's global entropy and Fisher's gradient information:

S[p] = −∫ p(r) ln p(r) dr,   (8)

I[p] = ∫ [∇p(r)]²/p(r) dr.   (9)

These complementary descriptors measure the uncertainty S[p] in localizing the particle in physical space and reflect the information I[p] received by removing this source of spatial indeterminacy [71]. In the resultant measures of Eq. (7) these classical IT functionals of the spatial probability density are supplemented by the corresponding nonclassical complements S[φ] and I[φ] = I[j], due to the wavefunction phase or the electronic current (velocity) distribution it generates. It should be observed that a finite local current diminishes the degree of spatial uncertainty by distinguishing this direction in space, thus increasing the amount of "order", i.e., diminishing "disorder" in the physical space. The minimum of this nonclassical lowering of the spatial indeterminacy should be observed in stationary states, when φ(r, t) = φs(t), j(r, t) = js = 0, and hence S[φs] = I[φs] = 0. This indicates that the nonclassical contributions to the global and gradient measures of resultant entropy must be negative [4,60-69].
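The classical Shannon and Fisher functionals have closed forms for a Gaussian density, S = ½ ln(2πeσ²) and I = 1/σ², which a direct quadrature reproduces. A sketch assuming a one-dimensional Gaussian density (an illustrative choice, not from the text):

```python
import numpy as np

s = 1.5                                          # Gaussian width (assumption)
x = np.linspace(-12.0, 12.0, 8001)
dx = x[1] - x[0]
p = np.exp(-x**2 / (2 * s**2)) / np.sqrt(2 * np.pi * s**2)

S = -np.sum(p * np.log(p)) * dx                  # Shannon global entropy S[p]
dp = np.gradient(p, x)
I = np.sum(dp**2 / p) * dx                       # Fisher gradient information I[p]

# Analytic values for a Gaussian: S = 0.5 ln(2 pi e s^2), I = 1/s^2.
assert abs(S - 0.5 * np.log(2 * np.pi * np.e * s**2)) < 1e-3
assert abs(I - 1.0 / s**2) < 1e-3
```

Widening the Gaussian (larger s) raises the Shannon "spread" measure and lowers the Fisher "compactness" measure, in line with their complementary roles.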

Complementary Entropy/Information Measures
Consider now the suitable candidates for the phase/current complements of these classical entropy/information measures. The prudent way to proceed in this search is to recognize the separation of Eq. (2) of the state logarithm into the additive real (probability) and imaginary (phase) components, and to observe the mutual relation between the densities-per-electron of the classical Shannon entropy, s(r) = −ln p(r), and of the Fisher information:

I(r) = [∇ ln p(r)]² = [∇s(r)]²,   I[p] = ∫ p(r) I(r) dr.   (10)

Thus, the squared gradient of the Shannon ("disorder") density determines the associated Fisher ("order") density.
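The density-level link of Eq. (10) can be checked directly. A sketch, assuming a unit-width Gaussian density (an illustrative choice), verifying that the squared gradient of the Shannon density −ln p reproduces the per-electron Fisher density [∇ ln p]², and that its p-weighted average recovers I[p]:

```python
import numpy as np

x = np.linspace(-8.0, 8.0, 4001)
dx = x[1] - x[0]
p = np.exp(-x**2 / 2.0) / np.sqrt(2.0 * np.pi)   # unit-width Gaussian (assumption)

s_density = -np.log(p)                           # Shannon density per electron, s = -ln p
fisher_density = np.gradient(np.log(p), x)**2    # Fisher density per electron, [d ln p/dx]^2

# Squared gradient of the Shannon density recovers the Fisher density ...
assert np.allclose(np.gradient(s_density, x)**2, fisher_density)

# ... and its p-weighted average recovers I[p] = 1/sigma^2 = 1 for this Gaussian.
I_p = np.sum(p * fisher_density) * dx
assert abs(I_p - 1.0) < 1e-3
```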
One also realizes that the phase-dependent entropy descriptor has to be complex in character [62], with the real part dependent on the modulus component of the wavefunction and recovering the classical Shannon measure. This observation excludes a Hermitian operator Ŝ defining the resultant average entropy, since such an operator would imply real eigenvalues and hence also real expectation values.
In the two-fold quantum mapping scheme of Fig. 1 [71] the probability density of the classical scheme, defined by the wavefunction modulus component, is replaced by the wavefunction itself. A reference to Eq. (2) suggests the quantum generalization of the classical entropy density in the form of the (multiplicative) operator

Ŝ[ψ(r, t)] = −2 ln ψ(r, t) = −[ln p(r, t) + 2iφ(r, t)].   (13)

This generalization thus consists in the replacement R(r, t) → ψ(r, t). The non-Hermitian multiplicative operator Ŝ[ψ(r, t)] generates the complex ("vector") average descriptor of the global entropy measure of Eq. (11),

S[ψ] = ⟨ψ|Ŝ[ψ]|ψ⟩ = S[p] + i S[φ],   S[φ] = −2∫ p(r, t) φ(r, t) dr = −2⟨φ⟩,   (14)

with the negative nonclassical (imaginary) contribution proportional to the state average phase ⟨φ⟩. Indeed, the presence of a finite current pattern implies more "order" (predictability) in the state electronic structure, and hence less "disorder" (uncertainty) in the quantum state of a molecule. Of interest also is the corresponding overall ("scalar") entropy descriptor additively combining the classical and nonclassical uncertainty contributions:

S̄[ψ] = S[p] + S[φ] = −∫ p(r, t) [ln p(r, t) + 2φ(r, t)] dr.   (15)

The same replacement R(r, t) → ψ(r, t) suggests the quantum generalization of the gradient information density into its quantum analog:

I(r, t) = 4|∇ψ(r, t)|² = [∇p(r, t)]²/p(r, t) + 4 p(r, t) [∇φ(r, t)]².   (17)

This overall information density generates the associated average descriptor:

I[ψ] = 4∫ |∇ψ(r, t)|² dr = I[p] + 4∫ p(r, t) [∇φ(r, t)]² dr ≡ I[p] + I[φ].   (18)

The positive nonclassical information I[φ] is in accord with the negative entropy contribution S[φ] in Eq. (15), since they consistently imply more predictability (less uncertainty) due to the nonvanishing local phase/current distributions.
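The split of the resultant measures into classical and nonclassical parts can be illustrated for a model state ψ = R exp(iφ) with Gaussian modulus and linear phase φ = kx. In this sketch (units ħ = m = 1; the values of x0, s, k are arbitrary assumptions) the nonclassical terms are taken as S[φ] = −2⟨φ⟩ and I[φ] = 4⟨(∇φ)²⟩, and the resultant Fisher measure 4∫|∇ψ|² dx splits into I[p] + I[φ]:

```python
import numpy as np

x = np.linspace(-12.0, 20.0, 8001)
dx = x[1] - x[0]
x0, s, k = 2.0, 1.0, 0.5                        # center, width, wave vector (assumptions)
p = np.exp(-(x - x0)**2 / (2 * s**2)) / np.sqrt(2 * np.pi * s**2)
phi = k * x                                     # linear phase
psi = np.sqrt(p) * np.exp(1j * phi)

S_p = -np.sum(p * np.log(p)) * dx               # classical Shannon entropy S[p]
S_phi = -2.0 * np.sum(p * phi) * dx             # nonclassical (phase) entropy S[phi]
I_p = np.sum(np.gradient(p, x)**2 / p) * dx     # classical Fisher information I[p]
I_phi = 4.0 * np.sum(p * k**2) * dx             # nonclassical (current) information I[phi]

I_psi = 4.0 * np.sum(np.abs(np.gradient(psi, x))**2) * dx   # resultant I[psi]

assert S_phi < 0.0                              # finite current lowers the entropy
assert abs(S_phi + 2.0 * k * x0) < 1e-3         # S[phi] = -2<phi> = -2 k x0 here
assert abs(I_phi - 4.0 * k**2) < 1e-6           # I[phi] = 4 k^2 for a linear phase
assert abs(I_psi - (I_p + I_phi)) < 1e-2        # I[psi] = I[p] + I[phi]
```

The sketch confirms the signs stated in the text: the phase contribution to the entropy is negative, while its contribution to the gradient information is positive.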
Finally, using the quantum-generalized relation between the densities of the global and gradient IT measures [Eq. (13)] one predicts the following gradient (information) analog of the resultant entropy density,

M(r, t) = p(r, t) {[∇ ln p(r, t)]² − 4[∇φ(r, t)]²},   (19)

and the associated average descriptor

M[ψ] = ∫ M(r, t) dr = I[p] − I[φ].   (20)

Its negative nonclassical contribution is again in accord with elementary intuition, since a nonvanishing current pattern in the specified quantum state implies less uncertainty (entropy) content in the system wavefunction.
One further observes that the (real) resultant information of Eq. (18) corresponds to a Hermitian operator Î: I[ψ] = ⟨ψ|Î|ψ⟩. As already observed elsewhere [4,8,28,66-69], this operator is related to the operator defining the dimensionless measure K[ψ] = (8m/ħ²) T[ψ] of the average kinetic energy T[ψ] of an electron, and the information operator in the position representation reads:

Î = −4∇² = (8m/ħ²) T̂,   T̂ = −(ħ²/2m) ∇².   (22)

Therefore, the non-Hermitian operator of Eq. (13) generates the complex average resultant entropy (indeterminacy) descriptor of Eq. (14), while the Hermitian gradient information operator of Eq. (22) gives rise to the real information (determinacy) descriptor of Eq. (18) and the associated entropy (uncertainty) measure of Eq. (20). The relation between the state resultant gradient information and the electronic kinetic energy allows one to use the molecular virial theorem [72] in exploring information redistributions in chemical processes.
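The operator link between the gradient information and the kinetic energy can likewise be checked: the expectation value of −4∇² equals 4∫|∇ψ|² dr (by integration by parts), which is (8m/ħ²) times the kinetic energy. A finite-difference sketch, assuming an illustrative Gaussian-times-plane-wave state and units ħ = m = 1:

```python
import numpy as np

hbar, m = 1.0, 1.0                               # atomic-style units (assumption)
x = np.linspace(-15.0, 15.0, 16001)
dx = x[1] - x[0]
p = np.exp(-x**2 / 2.0) / np.sqrt(2.0 * np.pi)
psi = np.sqrt(p) * np.exp(1j * 0.4 * x)          # illustrative complex state

dpsi = np.gradient(psi, x)
d2psi = np.gradient(dpsi, x)

I_op = np.real(np.sum(np.conj(psi) * (-4.0 * d2psi)) * dx)  # <psi| -4 d^2/dx^2 |psi>
I_grad = 4.0 * np.sum(np.abs(dpsi)**2) * dx                 # 4 int |grad psi|^2
T = (hbar**2 / (2.0 * m)) * np.sum(np.abs(dpsi)**2) * dx    # kinetic energy T[psi]

assert abs(I_op - I_grad) < 1e-2                 # integration-by-parts identity
assert abs(I_grad - (8.0 * m / hbar**2) * T) < 1e-12        # I = (8m/hbar^2) T
```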

Summary of Continuity Relations
It is of crucial importance in the continuity laws of QM [73] to distinguish between the reference frame moving with the fluid particles (Lagrangian frame) and the reference frame fixed to the prescribed coordinate system (Eulerian frame). The total derivative d/dt is the time change appearing to an observer who moves with the fluid particle, while the partial derivative ∂/∂t is the local time rate of change observed from a fixed point in the Eulerian reference frame. These derivatives are related to each other through d/dt = ∂/∂t + V(r, t)·∇, where the velocity-dependent part V(r, t)·∇ represents the probability "convection" (conv.) term. In Schrödinger's dynamical picture the state vector |ψ(t)⟩ introduces an explicit time-dependence of the system wavefunction, while the dynamics of the basis vector |r(t)⟩ of the position representation is the source of an additional, implicit time-dependence of the electronic wavefunction ψ(r, t) = ψ[r(t), t] due to the moving reference (monitoring) point. This separation applies to wavefunctions, their components, and expectation values of physical observables. For example, applying the differential operator of Eq. (22) to the wavefunction ψ(r, t) gives:

dψ[r(t), t]/dt = ∂ψ(r, t)/∂t + V(r, t)·∇ψ(r, t).

The total derivative of the average value of a physical or information property, determined by the derivative of its local density-per-electron f(r, t), determines the property source, the p-weighted average of the local net production of f(r, t). Above we have observed in the definition of the property current density J_f(r, t) = f(r, t) V(r, t) that it is carried with the velocity V(r, t) of the probability flow, j(r, t) = p(r, t) V(r, t), and recognized the familiar probability continuity. Another direct implication of the probability continuity is the vanishing Laplacian of the wavefunction phase, which also determines the divergence of the velocity field of the probability "fluid". It indeed follows from Eqs. (22) and (27) that

dp[r(t), t]/dt = ∂p(r, t)/∂t + [∂p(r, t)/∂r(t)]·[dr(t)/dt]
             = ∂p(r, t)/∂t + ∇p(r, t)·V(r, t) = ∂p(r, t)/∂t + ∇·j(r, t) = 0,

or

∂p(r, t)/∂t = −∇·j(r, t) = −[∇p(r, t)·V(r, t) + p(r, t) ∇·V(r, t)] = −∇p(r, t)·V(r, t).   (29)

Therefore, the divergence of the effective velocity field V(r, t), determined by the state phase-Laplacian ∇²φ(r, t), must identically vanish:

∇·V(r, t) = (ħ/m) ∇²φ(r, t) = 0   or   ∇²φ(r, t) = 0.   (30)

Similar balance equations can be derived for the wavefunction components (see Table 1). The SE expressed in terms of the state modulus and phase degrees-of-freedom reads:

iħ ∂ψ(r, t)/∂t = Ĥ(r) ψ(r, t),   ψ(r, t) = R(r, t) exp[iφ(r, t)].
In Table 1 we have summarized the dynamic equations for the wavefunction modulus and phase components, as well as the continuity relations of the state probability, current and information densities, which directly follow from the molecular SE. The time evolutions of the scalar fields

f(r, t) ∈ {p(r, t), R(r, t), φ(r, t), I(r, t)},   (39)

with the associated current definitions J_f(r, t) = f(r, t) V(r, t), can be cast in the form of the continuity equation for the distribution in question:

∂f(r, t)/∂t = −∇·J_f(r, t) + σ_f(r, t).   (40)

They involve the negative divergence of the property current density, J_f(r, t) = f(r, t) V(r, t), and the property source σ_f(r, t) = df[r(t), t]/dt, reflecting the local net outflow and production terms, respectively.
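The probability continuity underlying these balance equations can be verified without time propagation: for a free particle the SE gives ∂ψ/∂t = (iħ/2m)∇²ψ, so ∂p/∂t = 2Re(ψ* ∂ψ/∂t) should equal −∇·j pointwise. A sketch under these free-particle assumptions (units ħ = m = 1, illustrative Gaussian-times-plane-wave state):

```python
import numpy as np

hbar, m = 1.0, 1.0                               # atomic-style units (assumption)
x = np.linspace(-15.0, 15.0, 16001)
p0, s = 0.8, 1.0                                 # mean momentum and width (assumptions)
R = np.exp(-x**2 / (4 * s**2)) / (2 * np.pi * s**2)**0.25
psi = R * np.exp(1j * p0 * x / hbar)

dpsi = np.gradient(psi, x)
d2psi = np.gradient(dpsi, x)

# Free-particle SE: i hbar dpsi/dt = -(hbar^2/2m) psi''  =>  dpsi/dt:
dpsi_dt = (1j * hbar / (2.0 * m)) * d2psi

# Local rate of change of the probability density: dp/dt = psi* dpsi/dt + c.c.
dp_dt = 2.0 * np.real(np.conj(psi) * dpsi_dt)

# Divergence of the probability current j = (hbar/m) Im(psi* psi').
j = (hbar / m) * np.imag(np.conj(psi) * dpsi)
div_j = np.gradient(j, x)

# Continuity: dp/dt = -div j, with vanishing probability source.
assert np.allclose(dp_dt, -div_j, atol=1e-3)
```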
To summarize, the probability velocity also determines the fluxes of the physical and information properties in the specified electronic state of molecules. The source (net production) of the classical, probability variable of electronic states identically vanishes, while that of their nonclassical, phase part remains finite. It ultimately generates finite resultant entropy/information productions in quantum states. For example, the derivative of the nonclassical information functional I[φ] generates a nonvanishing (integral) source of the average resultant gradient information I[ψ]:

σ_I[ψ] = dI[ψ]/dt = (8m/ħ) ∫ j(r, t)·∇σ_φ(r, t) dr.   (41)

Its density σ_I(r, t) is seen to be determined by the product of the local probability "flux" j(r, t) and the "affinity" factor proportional to the gradient of the phase source σ_φ(r, t) of Eq. (38) (see also Table 1).

Probability Acceleration and Current Source
Of interest also is the local production of the probability current j(r, t),

σ_j(r, t) = dj[r(t), t]/dt = p(r, t) a(r, t).   (43)

Above, the total time-derivative of V(r, t) provides a natural measure of the local "acceleration" of the probability fluid [73,74]:

a(r, t) = dV(r, t)/dt = (ħ/m) d[∇φ(r, t)]/dt = (ħ/m) ∇[dφ(r, t)/dt] = (ħ/m) ∇σ_φ(r, t),   (44)

since differentiations with respect to the different variables r and t commute. This descriptor thus reflects the gradient of the phase source of Eq. (38).
It also generates the associated probability "force",

F(r, t) = m a(r, t) ≡ −∇W(r, t),   (45)

the negative gradient of the underlying probability potential

W(r, t) = −ħ σ_φ(r, t) + C(t).   (46)

Therefore, the phase source also reflects the potential generating the forces acting on the probability distribution in electronic states of molecular systems.
As an illustration, consider the stationary electronic state,

ψ_s(r, t) = R_s(r) exp(−iE_s t/ħ) ≡ R_s(r) exp[iφ_s(t)],   φ_s(t) = −ω_s t,   ω_s = E_s/ħ,   (48)

corresponding to the sharply specified energy E_s determining the time-dependent phase φ_s(t) and exhibiting the time-independent probability distribution p_s(r, t) = p_s(r). It satisfies the stationary SE determining the time-independent probability amplitude R_s(r), and generates a constant phase source σ_φ,s(r, t) = −ω_s and the equalized probability potential reflecting the state local energy:

W_s(r, t) = −ħ σ_φ,s = ħω_s = E_s.   (50)

Therefore, the stationary probability distribution is indeed characterized by vanishing probability acceleration and force descriptors:

a_s(r, t) = (ħ/m) ∇σ_φ,s(r, t) = 0,   F_s(r, t) = m a_s(r, t) = 0.   (51)

This confirms the equilibrium character of p_s(r), since the vanishing force does not create any perturbation for a change in this stationary distribution. Due to a common velocity component in the current of the property density f(r, t), J_f(r, t) = f(r, t) V(r, t), the above probability acceleration also enters the expressions for the local sources of such property currents:

σ_{J_f}(r, t) = dJ_f(r, t)/dt = σ_f(r, t) V(r, t) + f(r, t) a(r, t).   (52)

For example, the probability acceleration determines the familiar local sources of the probability and phase currents:

σ_j(r, t) = σ_p(r, t) V(r, t) + p(r, t) a(r, t) = p(r, t) a(r, t),   (53)

σ_{J_φ}(r, t) = σ_φ(r, t) V(r, t) + φ(r, t) a(r, t),   (54)

where we have recognized the zero local-probability production, σ_p(r, t) = 0, and the nonvanishing phase source σ_φ(r, t). The net production of the resultant measure of the state gradient information content has been shown to have a purely nonclassical origin [see Eq. (41)]. Its local density per electron also involves the probability acceleration:

σ_I(r, t) = dI(r, t)/dt = ∂I(r, t)/∂t + ∇·J_I(r, t) = 8 p(r, t) ∇φ(r, t)·∇σ_φ(r, t) = (8m/ħ) p(r, t) ∇φ(r, t)·a(r, t),   (55)

where the resultant-information current J_I(r, t) = I(r, t) V(r, t) determines its divergence:

∇·J_I(r, t) = (ħ/m) ∇φ(r, t)·∇I(r, t).   (56)

Experimental Elimination of Electronic Uncertainties
The information given to us by carrying out an experiment consists in removing the uncertainty existing before the experiment [71]. The classical (infinite, continuous) position scheme {r → p(r)} of Fig. 1 describes a state of uncertainty in the particle location, since we know only the probabilities p(r, t) = |ψ(r, t)|² of the alternative, definite outcomes of the particle-position probes. The amount of information given to us by the experimental realization of the particle-position scheme equals the classical entropy content of the spatial probability density [71].
In QM one deals with the wavefunction scheme {r → ψ(r)} of Fig. 1, in which the classical map {r → p(r)} constitutes only a part of the overall (complex) mapping. The wavefunction mapping indeed implies a simultaneous ascribing to an electron position of the local modulus and phase arguments of the state wavefunction, or of the related local probability and current/velocity quantities. This two-fold scheme in QM ultimately calls for the resultant entropy/information measures, which reflect the overall uncertainty/information content in (complex) electronic states, due to both the classical (probability) and nonclassical (phase/current) distributions [4,60-69].
In QM the localization experiment cannot remove all the uncertainty contained in a general electronic state, exhibiting a nonvanishing local phase component, φ(r, t) > 0, and hence also a finite current density, j(r, t) ≠ 0. The latter vanishes identically only in the stationary states, for the sharply specified energy E_s, when φ_s(r, t) = −ω_s t = φ_s(t). In such zero-current states the probability distribution is time-independent,

p_s(r, t) = |ψ_s(r, t)|² = R_s(r)² = p_s(r),   (58)

and the experimental determination of the electron position removes all the uncertainty contained in ψ_s(r, t); the quantum scheme of Fig. 1 then effectively reduces to its classical part. One observes that the current operator in Eq. (4) includes the particle momentum operator p̂, which does not commute with the position operator r̂: [r̂, p̂] = iħ. Therefore, in accordance with the Heisenberg uncertainty principle of QM, the incompatible observables r̂ and ĵ(r) do not have common eigenstates, and hence cannot be simultaneously sharply defined. The position dispersion Δ_r thus cannot be eliminated simultaneously with the current dispersion Δ_j(r). Indeed, a removal of the current uncertainty Δ_j(r) calls for the momentum experimental setup, which is incompatible with that required for determining the definite electron position, which eliminates the position uncertainty Δ_r. Only the separate localization and momentum experiments can simultaneously eliminate the position and current uncertainties contained in general electronic states.
The limiting value of the product of the squared dispersions Δ_j(r)² and Δ_r², where

Δ_X = ⟨(X̂ − ⟨X⟩)²⟩^(1/2)   and   ⟨X⟩ = ⟨ψ|X̂|ψ⟩,   (59)

is determined by the expectation value of the commutator of ĵ(r) and r̂:

Δ_j(r)² Δ_r² ≥ ¼ |⟨[ĵ(r), r̂]⟩|² = [ħ p(r)/(2m)]².   (62)

This lower bound is seen to be proportional to the spatial probability distribution itself. Consequently, in the wavefunction mapping {r → ψ(r, t)} = {r → [p(r, t), j(r, t)]} the state overall quantum uncertainty cannot be completely eliminated by a single type of experiment, due to the known determinacy limitations imposed by the Heisenberg principle. However, these two sources of information (the removed uncertainty) are in principle accessible experimentally by performing separate position and momentum probes.
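The Heisenberg bound discussed above is saturated by a Gaussian (minimum-uncertainty) state, for which Δx Δp = ħ/2. A sketch computing both dispersions from position-space integrals (a real Gaussian ψ is assumed, so ⟨p⟩ = 0; units ħ = 1):

```python
import numpy as np

hbar = 1.0                                        # atomic-style units (assumption)
s = 1.2                                           # Gaussian width (assumption)
x = np.linspace(-15.0, 15.0, 16001)
dx = x[1] - x[0]
psi = np.exp(-x**2 / (4 * s**2)) / (2 * np.pi * s**2)**0.25   # real Gaussian state
p = np.abs(psi)**2

x_mean = np.sum(x * p) * dx
dx_disp = np.sqrt(np.sum((x - x_mean)**2 * p) * dx)           # position dispersion

# <p^2> = hbar^2 int |psi'|^2 dx; for a real psi, <p> = 0.
dpsi = np.gradient(psi, x)
dp_disp = np.sqrt(hbar**2 * np.sum(np.abs(dpsi)**2) * dx)     # momentum dispersion

# A Gaussian state saturates Heisenberg's bound hbar/2.
assert dx_disp * dp_disp >= hbar / 2.0 - 1e-3
assert abs(dx_disp * dp_disp - hbar / 2.0) < 1e-3
```

The simultaneous removal of both dispersions by one measurement setup is excluded, but each can be determined separately, in line with the discussion above.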
Therefore, accounting for the position and current indeterminacies requires carrying out the incompatible experiments for determining the particle position and its momentum. The simultaneous elimination of these uncertainties in a single type of experiment is thus impossible in principle, but successive position and current (momentum) experiments can recover both the classical and nonclassical contributions to the overall (resultant) information contained in the state wavefunction. Eliminating the ("static") probability uncertainty, by performing the localization-type experiment, still leaves the ("dynamic") current uncertainty, which can be accounted for in the overall (resultant) IT descriptor only by carrying out a separate momentum-type experiment.

Conclusion
Entropic theories of molecular electronic structure ultimately require the resultant generalization of the classical (probability) measures of the information/entropy content in molecular states. The resultant IT descriptors combine contributions due to the wavefunction modulus and phase components. These overall concepts unite the classical (probability) and nonclassical (current) contributions in the single (position) representation of QM. The resultant approach provides tools for exploring the interplay between the "static" (probability) and "dynamic" (current) entropic factors in molecular electronic structure. The association of the overall gradient information with the (dimensionless) kinetic-energy descriptor also allows one to use the molecular virial theorem in general reactivity considerations. The information distinction between the bonded (entangled) and nonbonded (disentangled) states of molecular subsystems, e.g., substrates in the reaction transition-state complex, also calls for such a resultant IT description of complex electronic states.
The classical and nonclassical information contributions can be regarded as reflecting the complementary structures of "being" and "becoming" [70] contained in quantum molecular states. The electron density alone reflects the "static" structure of "being", missing the "dynamic" structure of "becoming" contained in the state phase/current pattern. Both these manifestations of the electronic "organization" in molecular states ultimately contribute to the overall structural entropy/information content in generally complex electronic wavefunctions. In the quantum IT treatment classical terms probe the entropic content of the incoherent (disentangled) local electronic events, while their nonclassical supplements provide the information complement due to the coherence (entanglement) of such elementary local events.
In the present analysis we have summarized the independent sources of the information content in quantum electronic states, examined the relevant dynamical equations for the wavefunction components, and explored the continuity relations for the probability and current degrees-of-freedom of electronic states. Using the intuitive link between the "uncertainty removed" and the "information gained" [71] we have also examined the limitations imposed by Heisenberg's indeterminacy principle on the simultaneous elimination, in a single type of experiment, of the position and current uncertainties. Although the classical and nonclassical information contributions are not both available from a position-only experiment, they are in principle accessible from separate position and momentum experiments, which require different, incompatible experimental setups.