Noise-independent invariants of communication channels
This is an essentially interdisciplinary talk: it describes how ideas and techniques were transplanted en masse from one field into another. The original work was a (published) series of papers in cognitive science and psychology; the (unpublished) applications lie in an entirely different area: the theory of communication channels.

The first half of the talk describes how information theory found applications within the cognitive science topic of classification: how and why people decide to group certain objects together. The underlying claim is that objects are grouped together simply because there is utility in doing so: we expect to be able to make predictions about objects depending on the group to which they are assigned. It is relatively straightforward to formalise this notion of predictive power using basic information theory, and classifications may then be compared according to their utility. This was implemented first as a C++ program and subsequently as a MATLAB package. Using information theory allows us to observe and quantify flaws in data-analysis techniques; in particular, we are able to show that some techniques commonly used in cognitive science throw away almost exactly half of the available information. We demonstrate how this problem can be fixed, and that there is practical utility in doing so!

The link with the theory of communication came from the observation that a classic cognitive science experiment is precisely an analysis of a communication channel, so the techniques used to analyse it -- including those presented in the first half of the talk -- may be used to analyse communication channels more generally. We give theoretical and practical results, analyse a significantly larger test case, and discuss potential applications.
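The predictive-power idea mentioned above can be sketched with basic information theory: the utility of a classification is how much knowing an object's group reduces uncertainty about its features, i.e. the mutual information between group label and feature. A minimal sketch follows; the group labels, the feature, and the counts are all invented for illustration, and the talk's actual formalisation may differ:

```python
import math
from collections import Counter

def mutual_information(pairs):
    """Estimate I(X; Y) in bits from a list of (x, y) samples."""
    n = len(pairs)
    joint = Counter(pairs)                 # empirical joint distribution
    px = Counter(x for x, _ in pairs)      # marginal over x
    py = Counter(y for _, y in pairs)      # marginal over y
    return sum(
        (c / n) * math.log2((c / n) / ((px[x] / n) * (py[y] / n)))
        for (x, y), c in joint.items()
    )

# Hypothetical data: group label vs. a binary feature ("has wings").
# A grouping with predictive power yields high mutual information.
samples = ([("bird", True)] * 40 + [("bird", False)] * 10
           + [("mammal", True)] * 5 + [("mammal", False)] * 45)

print(round(mutual_information(samples), 3))  # → 0.397 bits
```

A grouping that tells us nothing about the feature (e.g. labels assigned independently of it) would score zero bits, so classifications can be ranked by this quantity, as the abstract describes.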