Information-theoretic notions of causality provide a model-free approach to identifying the magnitude and direction of influence among sub-components of a stochastic dynamical system. In addition to detecting causal influences, any effective test should also report the statistical significance of its findings. Here, we focus on transfer entropy, which has recently been applied to causality detection in a variety of fields using statistical significance tests that are valid only in the asymptotic regime, that is, with enormous amounts of data. In the interest of applications with limited available data, we develop a non-asymptotic theory for the probability distribution of the difference between the empirically estimated transfer entropy and the true transfer entropy. Based on this result, we additionally demonstrate an approach to statistical hypothesis testing for directed information flow in dynamical systems with a given number of observed time steps.
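The paper's non-asymptotic distribution is not reproduced here; purely as an illustration of the quantities the abstract refers to, the following is a minimal sketch of a plug-in transfer entropy estimator (history length 1, discrete alphabets) paired with a standard permutation-based surrogate test for significance. All function names and the binary driving example are illustrative assumptions, not the authors' method.

```python
from collections import Counter
import math
import random

def transfer_entropy(x, y):
    """Plug-in estimate of TE_{Y -> X} in bits, history length 1:
    TE = sum_{x1,x0,y0} p(x1, x0, y0) * log2[ p(x1 | x0, y0) / p(x1 | x0) ]."""
    triples = list(zip(x[1:], x[:-1], y[:-1]))   # (x_{t+1}, x_t, y_t)
    n = len(triples)
    p_xxy = Counter(triples)
    p_xx = Counter((a, b) for a, b, _ in triples)  # (x_{t+1}, x_t)
    p_x = Counter(b for _, b, _ in triples)        # x_t
    p_xy = Counter((b, c) for _, b, c in triples)  # (x_t, y_t)
    te = 0.0
    for (a, b, c), cnt in p_xxy.items():
        p_joint = cnt / n                          # p(x_{t+1}, x_t, y_t)
        p_cond_full = cnt / p_xy[(b, c)]           # p(x_{t+1} | x_t, y_t)
        p_cond_marg = p_xx[(a, b)] / p_x[b]        # p(x_{t+1} | x_t)
        te += p_joint * math.log2(p_cond_full / p_cond_marg)
    return te

def permutation_test(x, y, n_surrogates=200, seed=0):
    """Surrogate-data p-value: fraction of shuffled-source surrogates whose
    estimated TE meets or exceeds the observed value (shuffling y destroys
    any directed Y -> X coupling while preserving its marginal)."""
    rng = random.Random(seed)
    observed = transfer_entropy(x, y)
    exceed = 0
    for _ in range(n_surrogates):
        y_shuffled = y[:]
        rng.shuffle(y_shuffled)
        if transfer_entropy(x, y_shuffled) >= observed:
            exceed += 1
    return observed, (exceed + 1) / (n_surrogates + 1)
```

Because the plug-in estimator is positively biased at finite sample sizes, the raw TE value alone cannot be read as evidence of coupling; the surrogate test (or, as in the paper, a finite-sample distributional result) is what supplies the significance level.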
