Concepedia

Publication | Open Access

Efficiency of cellular information processing

Citations: 161
References: 46
Year: 2014

Abstract

We show that a rate of conditional Shannon entropy reduction, characterizing the learning of an internal process about an external process, is bounded by the thermodynamic entropy production. This approach allows for the definition of an informational efficiency that can be used to study cellular information processing. We analyze three models of increasing complexity inspired by the E. coli sensory network, where the external process is an external ligand concentration jumping between two values. We start with a simple model for which ATP must be consumed so that a protein inside the cell can learn about the external concentration. With a second model for a single receptor we show that the rate at which the receptor learns about the external environment can be nonzero even without any dissipation inside the cell since chemical work done by the external process compensates for this learning rate. The third model is more complete, also containing adaptation. For this model we show inter alia that a bacterium in an environment that changes at a very slow time-scale is quite inefficient, dissipating much more than it learns. Using the concept of a coarse-grained learning rate, we show for the model with adaptation that while the activity learns about the external signal the option of changing the methylation level increases the concentration range for which the learning rate is substantial.
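The central bound described in the abstract can be sketched as follows; the symbols below ($l$ for the learning rate, $\sigma$ for the entropy production rate, $\eta$ for the efficiency) are assumed notation for illustration, not taken verbatim from the paper.

```latex
% Learning rate: the rate at which the conditional Shannon entropy of
% the external process X, given the internal process Y, decreases,
%   l = -\frac{\mathrm{d}}{\mathrm{d}t} H(X \mid Y) \ge 0 ,
% i.e., how fast the internal process acquires information about the
% external one.
%
% Second-law-like bound: the learning rate cannot exceed the
% thermodynamic entropy production rate,
%   l \le \sigma .
%
% This motivates an informational efficiency
%   \eta \equiv \frac{l}{\sigma} \le 1 ,
% which the abstract's third model shows becomes small (\eta \ll 1)
% when the environment changes on a very slow time-scale: the cell
% then dissipates much more than it learns.
```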
