
Talk:Interaction information


Sign convention

The sign convention used for interaction information as described in the article is not the same as that used in the measure-theoretic generalization of mutual information to the multivariate case. (See Information theory and measure theory.) For an even number of random variables the signs are the same, whereas for an odd number of random variables the signs are opposite. This would seem to mean that the interaction information of a single random variable, if it were defined, would be negative (or at least non-positive). Or do I have it all wrong? 198.145.196.71 (talk) 06:20, 20 December 2007 (UTC)
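To make the single-variable point concrete (my own working, not from either article, with I_∪ as an ad-hoc label for the inclusion-exclusion quantity): applied to a single variable, the inclusion-exclusion sum has only one term, its entropy, so
\[
I_{\cup}(X) = H(X) \ge 0, \qquad -I_{\cup}(X) = -H(X) \le 0,
\]
i.e. the opposite-sign convention would indeed be non-positive for one variable, consistent with the guess above.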

You may have it right. I would email one of the authors of the Interaction papers (Jakulin). They can tell you definitively. —Dfass (talk) 14:33, 23 December 2007 (UTC)
I looked very carefully for the second time at the general formula given in their paper (which you have duplicated in this article) and I do believe I am right. They mention (but do not describe) towards the end of their paper a slightly different formulation of interaction information that can be derived from the set-theoretic principle of inclusion-exclusion, and also admit that it is this formulation which is the more popular in recent literature (Yeung, for example).
Consider the case where all the variables but one are independent unbiased bits and the last is a parity bit. Here Jakulin and Bratko's formulation gives an interaction information of +1 in all cases, but the formulation based on measure theory, or rather on the set-theoretic principle of inclusion-exclusion, gives -1 for an odd number of random variables and +1 for an even number. 198.145.196.71 (talk) 19:34, 24 December 2007 (UTC)
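For anyone who wants to check the parity-bit numbers above, here is a minimal sketch (my own, not taken from Jakulin and Bratko's paper; the function names are just illustrative). It builds the joint distribution of two independent fair bits plus their parity bit, evaluates the inclusion-exclusion (measure-theoretic) quantity, and then flips the sign by (-1)^n to get the convention used in the article:

```python
from itertools import combinations, product
from math import log2

def entropy(joint, idx):
    """Entropy (in bits) of the marginal over the variable positions in idx."""
    marg = {}
    for outcome, p in joint.items():
        key = tuple(outcome[i] for i in idx)
        marg[key] = marg.get(key, 0.0) + p
    return -sum(p * log2(p) for p in marg.values() if p > 0)

def co_information(joint, n):
    """Inclusion-exclusion (measure-theoretic) quantity over n variables."""
    return sum((-1) ** (len(idx) + 1) * entropy(joint, idx)
               for k in range(1, n + 1)
               for idx in combinations(range(n), k))

# Joint distribution of (X, Y, Z): X, Y independent fair bits, Z = X XOR Y.
joint = {(x, y, x ^ y): 0.25 for x, y in product((0, 1), repeat=2)}

n = 3
mmi = co_information(joint, n)   # inclusion-exclusion sign: -1.0 here
ii = (-1) ** n * mmi             # sign described for Jakulin and Bratko: +1.0 here
print(mmi, ii)
```

This prints -1.0 for the inclusion-exclusion form and 1.0 for the interaction-information form; with a fourth independent bit and Z taken as the parity of the other three, both forms come out to 1.0, matching the odd/even pattern described above.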

Merge discussion

The Interaction information (this article) is the negative of the Multivariate mutual information (MMI) for an odd number of variables. Both articles contain valuable information not reproduced in the other. They should be merged in one way or another. I favor the MMI as the main article because, in measure-theoretic terms, it is simply a signed measure, while the interaction information is the negative of a signed measure for an odd number of variables, which introduces a needless complication. Also, both are represented by I(X;Y;Z), a notational inconsistency which needs to be resolved. PAR (talk) 05:45, 7 June 2015 (UTC)
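For concreteness, this is how I read the sign relationship being described (the subscripts are my own labels, not notation from either article): writing the inclusion-exclusion (measure-theoretic) quantity as
\[
I_{\mathrm{MMI}}(X_1;\dots;X_n) \;=\; \sum_{\varnothing \ne T \subseteq \{1,\dots,n\}} (-1)^{|T|+1}\, H(X_T),
\]
the quantity in this article satisfies \(I_{\mathrm{II}} = (-1)^{n}\, I_{\mathrm{MMI}}\), so the two agree for an even number of variables and are negatives of each other for an odd number.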

I would favor the Interaction Information as the main article, as I believe this is the most commonly used term and was the first coined (McGill 1954). I don't know of any articles which use the term "multivariate mutual information". Other terms for the same (or oppositely signed) quantity include co-information (Bell 2003), multiple mutual information (Han 1980), or, separately, redundancy or synergy (Gawne and Richmond 1993; Schneidman et al. 2003). The only place I have seen the term "multivariate mutual information" is here on Wikipedia, which I and others have found confusing. I definitely think all these articles should be linked to a single place and the relationship between the quantities made clear, but I don't think "Multivariate mutual information" is the correct umbrella term, as it is not used outside Wikipedia as far as I can tell. 130.209.89.69 (talk) 11:11, 13 January 2016 (UTC)

I think that before deciding this issue, Krippendorff [1] must be read. He provides an extensive, referenced discussion of these measures and also approaches the question of what they actually mean.
The multivariate mutual information (MMI) article states that the interaction information (II) is the negative of the MMI only for an odd number of variables, yet the definitions for the 3- and 4-variable cases indicate that the two are simply opposite in sign. Krippendorff discusses all three versions, and the indications are that the two are in fact simply opposite in sign.
It should be noted that the MMI is a (signed) measure. In the three-variable case it corresponds directly to the gray region in the information diagram, the intersection of all three entropies. There is a certain simplicity to this. Krippendorff says, mistakenly I think, that the interaction information (as the negative of the MMI) is the intersection of all three. Also, both are represented by I(X;Y;Z), a notational inconsistency which needs to be resolved. Note that the notation of information-theoretic measures is very systematic, as described in Information theory and measure theory, and according to this system I(X;Y;Z) corresponds to the MMI and to the negative of the II. Defining I(X;Y;Z) as anything else would mess up a very consistent notational system. (A worked version of the three-variable identity is sketched below, after this post.)
Regarding the naming convention, there seems to be no clear consensus other than "interaction information" as defined in II and the negative of MMI, but my reading of the literature is hardly complete. PAR (talk) 04:43, 14 January 2016 (UTC)
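To spell out the three-variable case discussed above (my own restatement, in the measure-theoretic sign convention): the quantity corresponding to the central region of the information diagram is
\[
I(X;Y;Z) \;=\; I(X;Y) - I(X;Y\mid Z) \;=\; H(X) + H(Y) + H(Z) - H(X,Y) - H(X,Z) - H(Y,Z) + H(X,Y,Z),
\]
and the interaction information in the sign convention discussed above is its negative, \(I(X;Y\mid Z) - I(X;Y)\).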

I don't see the terminology "Multivariate mutual information" anywhere in the Krippendorff paper you refer to. Can you point to any published work that defines or uses the "Multivariate mutual information" term other than the Wikipedia page? It seems to me that, given the already confusing nature of the field, with multiple terms for the same quantity, it would be better for Wikipedia to clarify these different terminologies in a single place (I suggest "Interaction information", as it is the first published term for the quantity in question, and arguably the most intuitive and widely used). But I think it is definitely not helpful for Wikipedia to define a new term (Multivariate mutual information) which is quite generic (all Google results for this term are generally discussing bivariate mutual information on multivariate spaces). This has already resulted in some confusion... None of the papers listed on the "Multivariate mutual information" page actually uses that term to describe the quantity. So I think that article should be removed, merged, or renamed. 130.209.89.69 (talk) 12:25, 30 March 2016 (UTC)

Closing, given no consensus on any particular action over more than 2 years and discussion stale (no contribution for more than 18 months). Klbrain (talk) 23:23, 18 November 2017 (UTC)

Wolf

It looks like D. Wolf also independently introduced the interaction information in 1996. (He doesn't cite McGill, Ting, or Bell.) See eq. (13) to confirm this is the same quantity. "The generalization of mutual information as the information between a set of variables: The information correlation function hierarchy and the information structure of multi-agent systems." [2] Jess (talk) 19:29, 5 April 2017 (UTC)

New merger proposal

In the previous merger discussion one of the issues was the sign difference between the two articles. Well, as I'm writing this, the two articles use the same sign! How did this happen? Apparently, just over two years ago someone changed it and nobody noticed. So right now the two articles are about the same thing.

I propose to merge Multivariate mutual information into Interaction information. 66.17.111.151 (talk) 23:09, 9 April 2021 (UTC)

Difficulty of interpretation - Independent Binary Variables Example

This example is easy, not difficult, to interpret. B52481 (talk) 17:08, 14 June 2023 (UTC)

I agree. In addition, there is no negativity of interaction information coming into play here. Pleuvirion (talk) 22:57, 7 July 2023 (UTC)