Information theory
Information theory is a branch of applied mathematics, electrical engineering, and computer science involving the quantification of information. Information theory was developed by Claude E. Shannon to find fundamental limits on signal processing operations such as compressing data and on reliably storing and communicating data. Since its inception it has broadened to find applications in many other areas, including statistical inference, natural language processing, cryptography, neurobiology,[1] the evolution[2] and function[3] of molecular codes, model selection in ecology,[4] thermal physics,[5] quantum computing, linguistics, plagiarism detection,[6] pattern recognition, anomaly detection, and other forms of data analysis.
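The "quantification of information" mentioned above rests on Shannon's entropy, which measures the average information content of a random source in bits. A minimal sketch (the function name and distributions are illustrative, not from the source):

```python
import math

def entropy(probs):
    """Shannon entropy, in bits, of a discrete probability distribution."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# A fair coin is maximally unpredictable: 1 bit per toss.
print(entropy([0.5, 0.5]))   # 1.0

# A biased coin is more predictable, so each toss carries less information.
print(entropy([0.9, 0.1]))   # ≈ 0.47 bits
```

This quantity underlies the fundamental limits Shannon identified: for example, a source cannot be losslessly compressed below its entropy on average.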