What is Information Theory?
Information theory is a branch of mathematics concerned with the process of making choices. Although related ideas go back centuries, it was the work of Claude Shannon, published in 1948 and later, that founded the field. The theory is powerful and has led to remarkable achievements: the clear sound we enjoy from compact discs (CDs), for example, became possible only because of Shannon's work. The bionet.info-theory newsgroup was formed to discuss the many applications of information theory to biology. (It is not a general information newsgroup, as some might be misled to think.) It is worth at least some of your time to see why we are so excited about this application, as it could turn your research around by sharpening your experimental approaches.
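Shannon's central measure, entropy, quantifies the information produced when one choice is made among several alternatives: the more evenly spread the alternatives, the more bits each choice carries. As a minimal sketch (this function and its use on symbol frequencies are illustrative, not drawn from the text above):

```python
from collections import Counter
from math import log2

def entropy(text: str) -> float:
    """Shannon entropy of the symbol distribution in `text`, in bits per symbol."""
    counts = Counter(text)
    total = len(text)
    return -sum((n / total) * log2(n / total) for n in counts.values())

# Two equally likely symbols (like a fair coin flip) carry exactly 1 bit each.
print(entropy("HT"))    # 1.0
# Four equally likely symbols carry 2 bits each.
print(entropy("ABCD"))  # 2.0
```

A skewed source, such as "AAAB", scores below 1 bit per symbol, which is precisely why predictable data compresses well.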
Information theory focuses on the task of quantifying information. This quantification is achieved by identifying methods of compressing and communicating data without degrading its integrity. Information theory is applied in a number of different fields, including quantum computing, data analysis, and cryptography. The origin of modern information theory is usually attributed to Claude E. Shannon. His work A Mathematical Theory of Communication, first published in 1948, laid the foundation for quantifying and compressing data into units that can be stored for easy retrieval later. His basic approach provided the tools necessary to enhance the efficiency of early mainframe computer systems, and it carried over naturally to the desktop computers that emerged during the 1970s. As a branch of both electrical engineering and applied mathematics, information theory seeks to uncover the most efficient ways to compress and transmit data.