Local entropy statistics for point processes
Clark, Daniel E. (2020) Local entropy statistics for point processes. IEEE Transactions on Information Theory, 66 (2), 1155-1163. (doi:10.1109/TIT.2019.2941213)
Abstract
Point processes are often described with functionals, such as the probability generating functional, the Laplace functional, and the factorial cumulant generating functional. These are used to facilitate modelling of different processes and to determine important statistics via functional differentiation. In information theory, generating functions have also been defined for probability densities to determine information quantities such as the Shannon information and Kullback-Leibler divergence, though as yet there are no such analogues for point processes. The purpose of this article is to exploit the advantages of both types of generating function to facilitate the derivation of information statistics for point processes. In particular, a generating functional for point processes is introduced for determining statistics related to entropy and relative entropy based on Golomb's information function and Moyal's probability generating functional. It is shown that the information generating functional permits the derivation of a suite of statistics, including localised Shannon entropy and Kullback-Leibler divergence calculations.
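As a brief reminder of the two classical ingredients the abstract names, the following are the standard definitions due to Golomb and Moyal; they are not quoted from the article, and the article's combined information generating functional is not reproduced in this record. Golomb's information generating function of a discrete distribution $p = (p_1, p_2, \ldots)$ is
\[
T_p(u) = \sum_i p_i^{\,u},
\]
and differentiating at $u = 1$ recovers the Shannon entropy, $-T_p'(1) = -\sum_i p_i \ln p_i = H(p)$. Moyal's probability generating functional of a point process $\Phi$, evaluated at a test function $h$ with $0 \le h \le 1$, is
\[
G_\Phi[h] = \mathbb{E}\!\left[\,\prod_{x \in \Phi} h(x)\right],
\]
from which moment and cumulant statistics follow by functional differentiation. The article combines these two constructions so that entropy and relative-entropy (Kullback-Leibler) statistics can likewise be obtained by differentiation.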
More information
e-pub ahead of print date: 13 September 2019
Published date: 1 February 2020
Additional Information:
Funding Information:
Manuscript received October 10, 2018; revised September 9, 2019; accepted September 9, 2019. Date of publication September 13, 2019; date of current version January 20, 2020. This work was supported in part by the Joint AFRL-DSTL Basic Research in Autonomous Signal Processing (AFOSR) under Grant FA9550-19-1-7008 and in part by the DSTL Task under Grant 1000133068.
Publisher Copyright:
© 1963-2012 IEEE.
Keywords:
Information entropy
Identifiers
Local EPrints ID: 475498
URI: http://eprints.soton.ac.uk/id/eprint/475498
ISSN: 0018-9448
PURE UUID: a27d8f01-43db-4028-95c7-d97e922591aa
Catalogue record
Date deposited: 20 Mar 2023 17:45
Last modified: 17 Mar 2024 13:11
Contributors
Author:
Daniel E. Clark