The differential entropy of a continuous waveform is a single quantity defined over the entire time period spanned by the recording; the time-dependent information dynamics of the observed process are therefore not accessible to this measure. We review here the construction and validation of approximate time-dependent measures of information dynamics.
The local entropy rate (Lizier et al., Physical Review E, 77, 026110) at time t quantifies the information generated at that time. A complementary measure, the specific entropy rate (Darmon, Entropy, 18, 190), quantifies the statistical uncertainty in an as-yet-unobserved future at time t. The specific entropy rate has been used to construct the specific transfer entropy (Darmon and Rapp, Physical Review E, 96, 022121), which gives a time-dependent measure of information movement in multichannel dynamical systems. The specific transfer entropy can in turn be used in a network analysis that yields an asymmetric, time-dependent adjacency matrix. Hierarchical transition chronometries in the network can be identified by applying quadrant scans of recurrence diagrams (Rapp et al., NOLTA Conference, 2012) to measures derived from the adjacency matrix.
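To make the notion of a local entropy rate concrete, the sketch below estimates h(t) = -log2 p(x_t | x_{t-k}, ..., x_{t-1}) for a discrete time series using plug-in probabilities from empirical counts. This is a minimal illustration of the idea on a symbolic series, not the estimator used in the works cited above (which address continuous-valued data); the function name and the binary Markov-chain test signal are assumptions for the example.

```python
import numpy as np
from collections import defaultdict

def local_entropy_rate(x, k=1):
    """Plug-in local entropy rate h(t) = -log2 p(x[t] | x[t-k:t]) for a
    discrete series x. Returns one value per predictable time step.
    Hypothetical helper for illustration, not the cited estimator."""
    joint = defaultdict(int)   # counts of (history, next-symbol) pairs
    hist = defaultdict(int)    # counts of histories alone
    for t in range(k, len(x)):
        h = tuple(x[t - k:t])
        joint[(h, x[t])] += 1
        hist[h] += 1
    out = np.empty(len(x) - k)
    for i, t in enumerate(range(k, len(x))):
        h = tuple(x[t - k:t])
        p = joint[(h, x[t])] / hist[h]   # empirical conditional probability
        out[i] = -np.log2(p)             # local (pointwise) entropy rate
    return out

# Binary Markov chain that stays in its current state with probability 0.9
rng = np.random.default_rng(0)
x = [0]
for _ in range(9999):
    x.append(x[-1] if rng.random() < 0.9 else 1 - x[-1])

h = local_entropy_rate(x, k=1)
print(h.mean())  # the time average approximates the entropy rate, ~0.469 bits
```

Note that averaging the local values over time recovers the usual (global) entropy rate; the local series itself is large at surprising transitions (here, state switches) and small during predictable stretches, which is exactly the time-resolved information the global quantity hides.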
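A quadrant scan of a recurrence diagram can be sketched as follows: build a recurrence matrix for the series and, for each candidate split time t, compute the fraction of recurrences that fall in the two diagonal (within-segment) quadrants; a peak in that fraction suggests a transition at t. This is a simplified variant for illustration, using a threshold recurrence rule on a scalar series; it is not necessarily the exact statistic of Rapp et al., and the function name, threshold eps, and two-regime test signal are assumptions.

```python
import numpy as np

def quadrant_scan(x, eps):
    """For each candidate split t, the fraction of recurrences lying in the
    diagonal quadrants R[:t,:t] and R[t:,t:]. Values near 1 indicate that
    almost all recurrences are within-segment, i.e. a clean transition at t.
    Simplified illustration of a recurrence-diagram quadrant scan."""
    x = np.asarray(x, dtype=float)
    R = (np.abs(x[:, None] - x[None, :]) < eps).astype(float)  # recurrence matrix
    total = R.sum()
    n = len(x)
    q = np.full(n, np.nan)          # endpoints left undefined
    for t in range(1, n - 1):
        diag = R[:t, :t].sum() + R[t:, t:].sum()
        q[t] = diag / total
    return q

# Two regimes: noise around 0, then noise around 5; the scan should
# peak near the true change point at index 100
rng = np.random.default_rng(1)
x = np.concatenate([rng.normal(0, 0.5, 100), rng.normal(5, 0.5, 100)])
q = quadrant_scan(x, eps=1.0)
print(np.nanargmax(q))
```

In a network setting, the same scan would be applied not to a raw waveform but to a time series derived from the adjacency matrix (for example, a node's total outgoing specific transfer entropy), so that the detected transitions reflect reorganizations of information flow.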