Abstract:
We have recently proposed that consciousness can be conceptualized as integrated information. Phenomenologically: (i) there is a large repertoire of conscious experiences, so that when one particular experience occurs it generates a large amount of information; (ii) each experience is integrated, i.e., it appears as a whole that cannot be decomposed into independent parts. To substantiate this notion we have introduced a measure of integrated information, called phi, which captures the repertoire of causal states available to a system as a whole. In essence, phi quantifies how much effective information is generated through causal interactions within the system, above and beyond the effective information generated by its parts independently. This paper extends previous work on stationary systems by introducing a time-dependent measure, so that integrated information can be evaluated as a function of the dynamics and the causal architecture of a network. An analysis of basic examples, such as Hopfield networks, suggests that integrated information is high when the architecture of a system is balanced between functional specialization and functional integration. Networks that are too integrated or too specialized (homogeneous, hierarchical feed-forward, and modular networks) generate little integrated information. The analysis also suggests that the dynamics of a network should be balanced: if the network is close to a hyperactive, inactive, or attractor state, it generates little integrated information. Metastable dynamics generated by loosely coupled, mutually antagonistic populations of elements are suggested as an architecture that can sustain high values of integrated information. These basic examples appear to match well with available neurobiological evidence concerning the neural substrates of consciousness. Since the notion of integrated information is fundamental and general, it can potentially inform the design and evaluation of conscious artifacts.
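To make the core quantity concrete, the following is a minimal toy sketch in Python of the idea that phi measures effective information generated by the whole above and beyond its parts. It is not the measure defined in the paper: the two-node copy-loop system, the fixed-input convention used to cut connections between parts, and the helper names (effective_information, preimage_count) are all illustrative assumptions. The sketch relies on the fact that, for a deterministic binary system perturbed with maximum entropy, effective information reduces to n minus log2 of the number of prior states consistent with the observed state.

    import itertools
    import math

    def preimage_count(mechanism, n, target):
        """Count prior states (under a maximum-entropy perturbation over all
        2**n states) that the mechanism maps onto the target state."""
        return sum(1 for bits in itertools.product((0, 1), repeat=n)
                   if mechanism(bits) == target)

    def effective_information(mechanism, n, target):
        """For a deterministic binary system, effective information reduces to
        n - log2(#preimages): how much the observed state narrows down the
        maximum-entropy repertoire of prior states."""
        k = preimage_count(mechanism, n, target)
        return float('inf') if k == 0 else n - math.log2(k)

    # Toy two-node system: each node copies the other's previous state,
    # forming a causal loop.
    def whole(bits):
        a, b = bits
        return (b, a)

    # A single node considered in isolation: its only input (from the other
    # node) is cut and replaced by a fixed value of 1, one crude convention
    # for severing a connection.
    def part(bits):
        return (1,)

    x1 = (1, 1)                    # observed current state of the system
    ei_whole = effective_information(whole, 2, x1)
    ei_parts = sum(effective_information(part, 1, (s,)) for s in x1)
    phi_toy = ei_whole - ei_parts  # toy surrogate for integrated information
    print(f"EI(whole)={ei_whole}, sum EI(parts)={ei_parts}, toy phi={phi_toy}")

On this toy system each isolated node generates zero effective information, since its output is fixed once its input is cut, while the coupled pair generates two bits; the surplus is therefore due entirely to the causal loop between the parts, which is the intuition the abstract attributes to phi.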