Article

A Note on Causation versus Correlation in an Extreme Situation

X. San Liang and Xiu-Qun Yang
1 Nanjing Institute of Meteorology, 219 Ningliu Blvd, Nanjing 210044, China
2 School of Atmospheric Sciences, Nanjing University, 163 Xianlin Avenue, Nanjing 210023, China
* Author to whom correspondence should be addressed.
Entropy 2021, 23(3), 316; https://doi.org/10.3390/e23030316
Submission received: 10 February 2021 / Revised: 2 March 2021 / Accepted: 4 March 2021 / Published: 7 March 2021

Abstract

Recently, it has been shown that the information flow and causality between two time series can be inferred in a rigorous and quantitative sense, and, moreover, the resulting causality can be normalized. A corollary that follows is that, in the linear limit, causation implies correlation, while correlation does not imply causation. Now suppose there is an event A taking a harmonic form (sine/cosine), and it generates through some process another event B so that B always lags A by a phase of π/2. Here the causality is obvious, yet by computation the correlation is zero. This apparent contradiction is rooted in the fact that a harmonic system always leaves a single point on the Poincaré section; it does not add information. That is to say, though the absolute information flow from A to B is zero, i.e., T_{A→B} = 0, the total information increase of B is also zero, so the normalized T_{A→B}, denoted as τ_{A→B}, takes the indeterminate form 0/0. By slightly perturbing the system with some noise, solving a stochastic differential equation, and letting the perturbation go to zero, it can be shown that τ_{A→B} approaches 100%, just as one would expect.

1. A Review of the Rigorous Information Flow-Based Causality Analysis

Causal inference is a fundamental problem in scientific research. Recently it has been shown that the problem can be recast within the framework of information flow, another fundamental notion in general physics that has wide applications across disciplines (see [1]), and hence can be put on a rigorous footing. The causality between two time series can then be analyzed quantitatively, and the resulting formula is very concise in form. In the linear limit, it involves only the usual statistics, namely sample covariances [2], making this important and otherwise difficult problem an easy task.
To briefly review the theory, consider a two-dimensional continuous-time stochastic system for the state variables x = (x_1, x_2):
\frac{d\mathbf{x}}{dt} = \mathbf{F}(\mathbf{x}, t) + \mathsf{B}(\mathbf{x}, t)\, \dot{\mathbf{w}},
where F = (F_1, F_2) may be arbitrary nonlinear functions of x and t, \dot{\mathbf{w}} is a vector of white noise, and B = (b_{ij}) is the matrix of perturbation amplitudes, which may also be functions of x and t. Here we adopt the convention in physics and do not distinguish deterministic and random variables; in probability theory, they are usually distinguished with capital and lower-case symbols. Assume that F and B are both differentiable with respect to x and t. Then the information flow from x_2 to x_1 (in nats per unit time) can be explicitly found in a closed form [3] (for the multidimensional case, see [1]):
T_{2 \to 1} = -E\left[ \frac{1}{\rho_1} \frac{\partial (F_1 \rho_1)}{\partial x_1} \right] + \frac{1}{2}\, E\left[ \frac{1}{\rho_1} \frac{\partial^2 (g_{11} \rho_1)}{\partial x_1^2} \right],
where E stands for mathematical expectation, g_{ii} = \sum_{k=1}^{n} b_{ik} b_{ik}, and ρ_i = ρ_i(x_i) is the marginal probability density function (pdf) of x_i. The rate of information flowing from x_1 to x_2 can be obtained by switching the indices. If T_{j→i} = 0, then x_j is not causal to x_i; otherwise it is causal, and the absolute value measures the magnitude of the causality from x_j to x_i. For discrete-time mappings, the information flow takes a much more complicated form; see [1].
In the case where only two time series, X_1 and X_2, are given (no dynamical system is known), under the assumption of a linear model with additive noise, the maximum likelihood estimator (MLE) of the rate of information flowing from X_2 to X_1 is [2]
\hat{T}_{2 \to 1} = \frac{C_{11} C_{12} C_{2,d1} - C_{12}^2 C_{1,d1}}{C_{11}^2 C_{22} - C_{11} C_{12}^2},
where C_{ij} is the sample covariance between X_i and X_j, and C_{i,dj} the sample covariance between X_i and a series derived from X_j using the Euler forward differencing scheme (also see the Euler–Maruyama scheme in [4]): \dot{X}_{j,n} = (X_{j,n+k} - X_{j,n}) / (k \Delta t), with k \ge 1 some integer. Note that Equation (3) is rather concise in form; it involves only the common statistics, i.e., sample covariances. In other words, a combination of a few sample covariances gives a quantitative measure of the causality between the time series. This makes causality analysis, which would otherwise be complicated with the classical empirical or half-empirical methods, very easy. Nonetheless, note that Equation (3) cannot replace (1); it is just the MLE of the latter. A statistical significance test must be performed before a causal inference is made based on the computed T_{2→1}. For details, refer to [2].
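As a concrete illustration, the following is a minimal Python sketch of how Equation (3) can be assembled from sample covariances (our own illustration; the function name and default arguments are assumptions, not from the original paper):

    import numpy as np

    def information_flow_mle(x1, x2, dt=1.0, k=1):
        """Liang (2014) MLE of the information flow rate T_{2->1} (nats per unit time).

        x1, x2 : 1-D arrays holding the two time series
        dt     : sampling interval
        k      : differencing step of the Euler forward scheme
        """
        x1 = np.asarray(x1, float)
        x2 = np.asarray(x2, float)
        dx1 = (x1[k:] - x1[:-k]) / (k * dt)          # derived series dX1/dt
        # 3x3 sample covariance matrix of (X1, X2, dX1/dt), all aligned in time
        C = np.cov(np.vstack([x1[:-k], x2[:-k], dx1]))
        C11, C22, C12 = C[0, 0], C[1, 1], C[0, 1]
        C1d1, C2d1 = C[0, 2], C[1, 2]
        return (C11 * C12 * C2d1 - C12**2 * C1d1) / (C11**2 * C22 - C11 * C12**2)

As noted above, a statistical significance test should still be applied to the resulting estimate before any causal claim is made [2].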
Considering the long-standing debate ever since Berkeley (1710) [5] over correlation versus causation, we may rewrite (3) in terms of linear correlation coefficients, which immediately implies [2]:
Causation implies correlation, but correlation does not imply causation.
The above formalism has been validated with many benchmark systems (e.g., [1]), such as the baker transformation, the Hénon map, the Kaplan–Yorke map, and the Rössler system. It has also been successfully applied to the study of many real-world problems, such as those in financial economics (e.g., the “Seven Dwarfs vs. a Giant” problem [6]), earth system science (e.g., the Antarctic mass balance problem [7] and the global warming problem [8]), and neuroscience (e.g., the concussion problem [9]), to name but a few.

2. The Question

Now suppose we have a dynamic event A which drives another event B. The former has a harmonic form, leading the latter by a phase of π/2. That is to say, the resulting time series, say {x_A(t)} and {x_B(t)}, are in quadrature. Then the correlation between the two is zero. Here by zero correlation we mean a zero integral
\int_{\Omega} \big( x_A(t) - \bar{x}_A \big) \big( x_B(t) - \bar{x}_B \big)\, dt,
with the integration domain Ω being one or more periods, and the overbar denoting the mean over the domain. However, since A causes B, this result apparently contradicts the above corollary that “causation implies correlation”.
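For instance, with x_A(t) = cos t and x_B(t) = sin t (both of zero mean over a period), the integral over Ω = [0, 2π] vanishes identically:
\int_0^{2\pi} \cos t \, \sin t \, dt = \frac{1}{2} \int_0^{2\pi} \sin 2t \, dt = 0 .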

3. The Solution

The problem can be more formally stated with the harmonic system:
\frac{d\mathbf{x}}{dt} = \mathbf{F}(\mathbf{x}, t) = \mathsf{A}\mathbf{x} = \begin{pmatrix} a_{11} & a_{12} \\ a_{21} & a_{22} \end{pmatrix} \begin{pmatrix} x_1 \\ x_2 \end{pmatrix} = \begin{pmatrix} 0 & -1 \\ 1 & 0 \end{pmatrix} \begin{pmatrix} x_1 \\ x_2 \end{pmatrix}.
If the system is initialized with x_1(0) = 1 and \dot{x}_1(0) = 0 (i.e., x_2(0) = 0), the solution is x_1 = cos t, x_2 = sin t. Thus, the population covariance σ_{12} = \int_{\Omega} \cos t \, \sin t \, dt = 0 (Ω being one or many periods). This yields an information flow from x_2 to x_1:
T_{2 \to 1} = \frac{a_{12}\, \sigma_{12}}{\sigma_{11}} = 0.
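To see how this follows from the general formula of Section 1, note that for a linear system with a Gaussian marginal ρ_1 (a verification sketch of ours, made under this Gaussian assumption), ∂ln ρ_1/∂x_1 = −(x_1 − μ_1)/σ_{11}, so the first term of the flow formula reduces to the covariance expression:
-E\left[ \frac{1}{\rho_1} \frac{\partial (F_1 \rho_1)}{\partial x_1} \right] = -E\left[ \frac{\partial F_1}{\partial x_1} \right] - E\left[ F_1 \frac{\partial \ln \rho_1}{\partial x_1} \right] = -a_{11} + \frac{a_{11}\sigma_{11} + a_{12}\sigma_{12}}{\sigma_{11}} = \frac{a_{12}\sigma_{12}}{\sigma_{11}},
while the noise term is absent here, since the original system carries no noise.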
Fundamentally, the above problem arises from the fact that this is a deterministic system. In the Granger causality test [10] (also see a recent reference [11]), this case has been explicitly excluded, since in such a case the trajectories do not form appropriate ensembles in the sample space. A harmonic series shows up on a Poincaré section as only a single point, so the total information does not accrue. If the total information does not change, the information flow to x_1 must also vanish. However, the vanishing information flow does not mean that there is no influence of x_2 on x_1. As we argued in Liang (2015) [6], the so-obtained information flow must be normalized, just as covariance needs to be normalized into correlation, for one to assess the causal influence. Here, if the normalizer is zero, the normalized T_{2→1} involves the indeterminate form 0/0. We may then approach it by taking a limit. Specifically, we enlarge the sample space slightly by adding some stochasticity to the system, and then take the limit as the stochastic perturbation amplitude goes to zero.
By Liang (2015) [6], the normalizer for T_{2→1} is
Z_{2 \to 1} = |T_{2 \to 1}| + \left| \frac{dH_1^*}{dt} \right| + \left| \frac{dH_1^{\mathrm{noise}}}{dt} \right|,
where, on the right-hand side, the second term is the contribution from x_1 itself, and the third term the contribution from noise. In Liang (2015) [6], it has been shown that dH_1^*/dt is a Lyapunov exponent-like, phase-space stretching rate, and dH_1^{noise}/dt a noise-to-signal ratio. In the original problem, noise is not taken into account. In reality, however, noise is ubiquitous. We may hence view a deterministic system as a limit or extreme case as the amplitude of the stochastic perturbation goes to zero. For this purpose, we add to (4) a stochastic term:
\frac{d\mathbf{x}}{dt} = \mathsf{A}\mathbf{x} + \mathsf{B}\dot{\mathbf{w}},
where w is a vector of standard Wiener processes. For simplicity, let the perturbation amplitude B be a constant matrix. Further, let G = B Bᵀ, with elements
g_{ij} = \sum_{k=1}^{2} b_{ik} b_{jk}.
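As an aside, a minimal Euler–Maruyama simulation of this perturbed system can be written as follows (a sketch of our own, not part of the original analysis; the function name, step size, and noise amplitudes are illustrative assumptions):

    import numpy as np

    def simulate_perturbed_harmonic(g11=1e-3, g22=1e-3, T=100.0, dt=1e-3, seed=0):
        """Euler-Maruyama integration of dx/dt = A x + B w_dot, with
        A = [[0, -1], [1, 0]] and a diagonal B so that G = B B^T = diag(g11, g22)."""
        rng = np.random.default_rng(seed)
        A = np.array([[0.0, -1.0], [1.0, 0.0]])
        B = np.diag([np.sqrt(g11), np.sqrt(g22)])
        n = int(T / dt)
        x = np.empty((n + 1, 2))
        x[0] = [1.0, 0.0]                              # x1(0) = 1, x2(0) = 0
        for i in range(n):
            dw = rng.standard_normal(2) * np.sqrt(dt)  # Wiener increments
            x[i + 1] = x[i] + (A @ x[i]) * dt + B @ dw
        return x

The information flow rates between the two simulated components can then be estimated, e.g., with the MLE sketch given in Section 1.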
Liang (2008) [3] established that
\frac{dH_1^*}{dt} = a_{11} = 0,
\frac{dH_1^{\mathrm{noise}}}{dt} = \frac{g_{11}}{2 \sigma_{11}}.
So in this case, the normalized flow from x_2 to x_1 is
\tau_{2 \to 1} = \frac{a_{12}\sigma_{12}/\sigma_{11}}{\left| a_{12}\sigma_{12}/\sigma_{11} \right| + 0 + \left| \frac{g_{11}}{2\sigma_{11}} \right|} = \frac{-\sigma_{12}}{|\sigma_{12}| + \frac{g_{11}}{2}}.
Likewise,
\tau_{1 \to 2} = \frac{a_{21}\sigma_{12}/\sigma_{22}}{\left| a_{21}\sigma_{12}/\sigma_{22} \right| + |a_{22}| + \left| \frac{g_{22}}{2\sigma_{22}} \right|} = \frac{+\sigma_{12}}{|\sigma_{12}| + \frac{g_{22}}{2}}.
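In code, the two normalized rates can be evaluated directly from the entries of A, Σ, and G (a small helper of ours that mirrors the two expressions above; it presumes the two-dimensional linear setting of this section):

    def normalized_flows(A, S, G):
        """Normalized information flow rates (tau_{2->1}, tau_{1->2}) for the
        two-dimensional linear system dx/dt = A x + B w_dot, where S is the
        covariance matrix and G = B B^T; A, S, G are 2x2 (nested lists or arrays)."""
        t21 = A[0][1] * S[0][1] / S[0][0]          # T_{2->1} = a12 * sigma12 / sigma11
        z21 = abs(t21) + abs(A[0][0]) + abs(G[0][0] / (2.0 * S[0][0]))
        t12 = A[1][0] * S[0][1] / S[1][1]          # T_{1->2} = a21 * sigma12 / sigma22
        z12 = abs(t12) + abs(A[1][1]) + abs(G[1][1] / (2.0 * S[1][1]))
        return t21 / z21, t12 / z12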
Note that τ_{2→1} (or τ_{1→2}) may be positive or negative. In causal inference, this does not matter; we need only consider the absolute value, although the sign does carry a meaning according to the original formulation. (A positive τ_{2→1} means x_2 causes the marginal entropy of x_1 to grow, and vice versa; see [1,2].)
Now for the stochastic equation, the covariance matrix Σ evolves as
\frac{d\mathbf{\Sigma}}{dt} = \mathsf{A}\mathbf{\Sigma} + \mathbf{\Sigma}\mathsf{A}^T + \mathsf{B}\mathsf{B}^T = \mathsf{A}\mathbf{\Sigma} + \mathbf{\Sigma}\mathsf{A}^T + \mathsf{G}.
Expanding, this is
\frac{d}{dt} \begin{pmatrix} \sigma_{11} & \sigma_{12} \\ \sigma_{12} & \sigma_{22} \end{pmatrix} = \begin{pmatrix} -\sigma_{12} & -\sigma_{22} \\ \sigma_{11} & \sigma_{12} \end{pmatrix} + \begin{pmatrix} -\sigma_{12} & \sigma_{11} \\ -\sigma_{22} & \sigma_{12} \end{pmatrix} + \begin{pmatrix} g_{11} & g_{12} \\ g_{12} & g_{22} \end{pmatrix}.
We hence obtain the following equation set:
\frac{d\sigma_{11}}{dt} = -2\sigma_{12} + g_{11}, \qquad \frac{d\sigma_{12}}{dt} = \sigma_{11} - \sigma_{22} + g_{12}, \qquad \frac{d\sigma_{22}}{dt} = 2\sigma_{12} + g_{22}.
Solving, we get
\frac{d^2 \sigma_{12}}{dt^2} = -4\sigma_{12} + (g_{11} - g_{22} + g_{12}).
So the solution is
\sigma_{12} = C_1 \cos 2t + C_2 \sin 2t + \frac{1}{2} (g_{11} - g_{22} + g_{12})\, t^2.
If σ_{12}(0) = 0 and \dot{σ}_{12}(0) = 0, then the integration constants C_1 = C_2 = 0. So
\tau_{2 \to 1} = \frac{-\sigma_{12}}{|\sigma_{12}| + \frac{1}{2} g_{11}} = \frac{-1}{1 + \dfrac{g_{11}}{(g_{11} - g_{22} + g_{12})\, t^2}}.
Two cases are distinguished:
Case I:
g_{12} - g_{22} = \mathrm{const} \neq 0. In this case,
\lim_{g_{11} \to 0} \tau_{2 \to 1} = -1.
Case II:
g_{12} - g_{22} = 0. In this case,
\lim_{g_{11} \to 0} \tau_{2 \to 1} = \frac{-1}{1 + 1/t^2}.
As t goes to infinity, τ_{2→1} also approaches −1.
If initially there exists some covariance, say, σ_{12}(0) = c, then C_1 = c, and hence
\tau_{2 \to 1} = \frac{-1}{1 + \dfrac{g_{11}}{2c \cos 2t + (g_{11} - g_{22} + g_{12})\, t^2}}.
In this case, as g_{11} → 0, we always have τ_{2→1} → −1. Either way, the normalized information flow τ_{2→1} approaches −1, i.e., 100% in magnitude, in the deterministic limit.
In the other direction, we now need to consider the uncertainty growth of x_2 and hence perturb g_{22}. Repeating the above procedure, when σ_{12}(0) = \dot{σ}_{12}(0) = 0, the normalized information flow is
\tau_{1 \to 2} = \frac{+\sigma_{12}}{|\sigma_{12}| + \frac{1}{2} g_{22}} = \frac{+1}{1 + \dfrac{g_{22}}{(g_{11} - g_{22} + g_{12})\, t^2}}.
If g_{11} + g_{12} = \mathrm{const} \neq 0, then
\lim_{g_{22} \to 0} \tau_{1 \to 2} = +1;
else (g_{11} + g_{12} = 0),
\lim_{g_{22} \to 0} \tau_{1 \to 2} = \frac{-1}{1 + 1/t^2},
which approaches −1 for a sufficiently long time (t → ∞). On the other hand, if initially there exists some covariance such that σ_{12}(0) = c, then
\tau_{1 \to 2} = \frac{1}{1 + \dfrac{g_{22}}{2c \cos 2t + (g_{11} - g_{22} + g_{12})\, t^2}},
which implies
\lim_{g_{22} \to 0} \tau_{1 \to 2} = 1.
This is indeed what we expect. So even in this extreme case, there is no contradiction at all in causal inference with information flow.

4. Discussion

To summarize, a recent, rigorously formulated causality analysis asserts that, in the linear limit, causation implies correlation, while correlation does not necessarily mean causation. In this short note, an extreme case which apparently contradicts this assertion is examined. In this case, an event x_1 takes a harmonic form (sine/cosine) and generates, through some process, another event x_2 so that x_2 is always out of phase with x_1, i.e., lags x_1 by π/2. Obviously x_1 causes x_2, but by computation the correlation between x_1 and x_2 is zero. In this study we show that this is an extreme case, leaving only one point on the Poincaré section, and hence the problem becomes singular. We re-examine the problem by enlarging the ensemble space slightly through adding some noise. A stochastic differential equation is then solved for the corresponding covariances, which allows us to obtain the information flows for the perturbed system. As the noisy perturbation goes to zero, the normalized information flow rate from x_1 to x_2 is established to be 100%, just as one would have expected. So actually no contradiction exists. (See [12] for a related discussion of the influence of noise on a deterministic system.)
One thing that merits mentioning is that, although here it seems that x_1 causes x_2, the normalized information flow rate from x_2 to x_1 is actually also 100%. That is to say, for such a harmonic system with a circular cause–effect relation, it is impossible to differentiate causality by simply assessing which event takes place first; after all, leading by π/2 is equivalent to lagging by 3π/2. The moral is that, for a process that is nonsequential (e.g., in nonsequential stochastic control systems), where circular cause and consequence coexist, it is essentially impossible to distinguish a delay from an advance.
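Indeed, for example,
\sin t = \cos\!\left(t - \frac{\pi}{2}\right) = \cos\!\left(t + \frac{3\pi}{2}\right),
so the series sin t, which lags cos t by π/2, equally well leads it by 3π/2.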

Author Contributions

Conceptualization, X.S.L. and X.-Q.Y.; methodology, X.S.L.; formal analysis, X.S.L.; investigation, X.S.L. and X.-Q.Y.; validation, X.S.L. and X.-Q.Y.; writing, X.S.L. All authors have read and agreed to the published version of the manuscript.

Funding

This study was partially supported by the National Science Foundation of China (NSFC) under Grant No. 41975064, and the 2015 Jiangsu Program for Innovation Research and Entrepreneurship Groups.

Conflicts of Interest

The authors declare no conflict of interest.

References

1. Liang, X.S. Information flow and causality as rigorous notions ab initio. Phys. Rev. E 2016, 94, 052201.
2. Liang, X.S. Unraveling the cause-effect relation between time series. Phys. Rev. E 2014, 90, 052150.
3. Liang, X.S. Information flow within stochastic dynamical systems. Phys. Rev. E 2008, 78, 031113.
4. Iacus, S.M. Simulation and Inference for Stochastic Differential Equations: With R Examples; Springer: New York, NY, USA, 2008.
5. Berkeley, G. A Treatise Concerning the Principles of Human Knowledge; originally published in 1710; Hackett Publishing Company, Inc.: Indianapolis, IN, USA, 1982.
6. Liang, X.S. Normalizing the causality between time series. Phys. Rev. E 2015, 92, 022126.
7. Vannitsem, S.; Dalaiden, Q.; Goosse, H. Testing for dynamical dependence—Application to the surface mass balance over Antarctica. Geophys. Res. Lett. 2019.
8. Stips, A.; Macias, D.; Coughlan, C.; Garcia-Gorriz, E.; Liang, X.S. On the causal structure between CO2 and global temperature. Sci. Rep. 2016, 6, 21691.
9. Hristopulos, D.T.; Babul, A.; Babul, S.; Brucar, L.R.; Virji-Babul, N. Disrupted information flow in resting-state in adolescents with sports-related concussion. Front. Hum. Neurosci. 2019, 13, 419.
10. Granger, C.W. Investigating causal relations by econometric models and cross-spectral methods. Econometrica 1969, 37, 424–438.
11. Contreras-Reyes, J.E.; Hernández-Santoro, C. Assessing Granger-causality in the southern Humboldt current ecosystem using cross-spectral methods. Entropy 2020, 22, 1071.
12. Argyris, J.; Andreadis, I.; Pavlos, G.; Athanasiou, M. The influence of noise on the correlation dimension of chaotic attractors. Chaos Solit. Fract. 1998, 9, 343–361.
Publisher’s Note: MDPI stays neutral with regard to jurisdictional claims in published maps and institutional affiliations.

