Critical values of a kernel density-based mutual information estimator

Date

2006

Authors

May, R.
Dandy, G.
Maier, H.
Fernando, T.

Editors

Yen, G.

Type

Conference paper

Citation

International Joint Conference on Neural Networks, 16-21 July 2006, pp. 4898-4903

Conference Name

International Joint Conference on Neural Networks (2006 : Vancouver, Canada)

Abstract

Recently, mutual information (MI) has become widely recognized as a statistical measure of dependence that is suitable for applications where data are non-Gaussian, or where the dependency between variables is non-linear. However, a significant disadvantage of this measure is that no analytical expression is available for the distribution of MI estimators computed from a finite dataset. This paper deals specifically with a popular kernel density-based estimator, for which the distribution is determined empirically using Monte Carlo simulation. Critical values of MI derived from this distribution are then applied in a test for independence, demonstrated within the context of a benchmark input variable selection problem.
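
The following is a minimal sketch of the general idea described in the abstract, not the authors' code: MI is estimated with a Gaussian kernel density estimator, and a Monte Carlo scheme (here, random permutation of one variable to enforce independence) supplies an empirical null distribution from which a critical value is taken. The authors' exact estimator and simulation design may differ; all function names and parameters below are illustrative assumptions.

    # Sketch: kernel density-based MI estimate and a Monte Carlo critical value
    # for a test of independence (illustrative only; not the paper's implementation).
    import numpy as np
    from scipy.stats import gaussian_kde

    def kde_mutual_information(x, y):
        """Estimate I(X;Y) from samples using Gaussian KDEs of the joint and marginals."""
        xy = np.vstack([x, y])
        p_xy = gaussian_kde(xy)(xy)          # joint density at the sample points
        p_x = gaussian_kde(x)(x)             # marginal density of X
        p_y = gaussian_kde(y)(y)             # marginal density of Y
        return np.mean(np.log(p_xy / (p_x * p_y)))

    def mi_critical_value(x, y, n_sim=1000, alpha=0.05, seed=None):
        """Empirical (1 - alpha) critical value of the MI estimator under the null
        of independence, obtained by Monte Carlo simulation (permuting y)."""
        rng = np.random.default_rng(seed)
        null_mi = np.empty(n_sim)
        for i in range(n_sim):
            null_mi[i] = kde_mutual_information(x, rng.permutation(y))
        return np.quantile(null_mi, 1.0 - alpha)

    if __name__ == "__main__":
        rng = np.random.default_rng(0)
        x = rng.normal(size=500)
        y = np.sin(x) + 0.3 * rng.normal(size=500)   # non-linear dependence on x
        mi_obs = kde_mutual_information(x, y)
        mi_crit = mi_critical_value(x, y, n_sim=200, alpha=0.05, seed=1)
        print(f"MI = {mi_obs:.3f}, critical value = {mi_crit:.3f}")
        print("Reject independence" if mi_obs > mi_crit else "Cannot reject independence")

In an input variable selection setting, a candidate input would be retained only if its estimated MI with the output exceeds the critical value, i.e. if the hypothesis of independence is rejected at the chosen significance level.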

Description

Copyright © 2006 IEEE
