
More about the book
It is commonly assumed that computers process information, but what exactly is information? Shannon's information theory offers a first answer by measuring the information content of a message, defined as the reduction of uncertainty gained from receiving it. That uncertainty is quantified by entropy. The theory has profoundly shaped information storage, data compression, transmission, and coding, and it remains a vibrant area of research. While Shannon's theory has attracted philosophical interest, it is often criticized for being purely "syntactic" and neglecting "semantic" aspects. Various philosophical proposals attempt to give information theory a semantic dimension, frequently building on Shannon's work. Semantic information theory also often draws on formal logic, linking information to reasoning, deduction, inference, and decision-making. Entropy and related measures have further proven vital in statistical inference, since data and observations convey information about unknown parameters; a distinct branch of statistics has consequently grown out of concepts from Shannon's theory, including measures such as Fisher information.
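For reference, the measures named above are standard textbook quantities; the following definitions sketch them and are not quoted from the book itself. The entropy of a discrete random variable X with distribution p(x) is

  H(X) = -\sum_x p(x) \log_2 p(x),

and the information content (surprisal) of receiving a particular message x is I(x) = -\log_2 p(x), so less probable messages carry more information. Fisher information, arising in the statistical setting, measures how much an observation X drawn from a density f(x; \theta) tells us about the unknown parameter \theta:

  \mathcal{I}(\theta) = \mathbb{E}\!\left[ \left( \frac{\partial}{\partial\theta} \log f(X; \theta) \right)^{2} \right].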
Formal Theories of Information, Giovanni Sommaruga (ed.)
- Language
- Year of publication: 2009
- Binding: paperback