Generally, entropy refers to disorder or uncertainty, and the definition of entropy used in information theory is directly analogous to the definition used in statistical thermodynamics. The concept of information entropy was introduced by Claude Shannon in his 1948 paper “A Mathematical Theory of Communication”.
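Shannon defined the entropy of a discrete random variable with outcome probabilities p₁, …, pₙ as H = −Σ pᵢ log₂ pᵢ, measured in bits. As a minimal sketch (the function name `shannon_entropy` is our own, not from the paper), this can be computed as:

```python
import math

def shannon_entropy(probs):
    """Shannon entropy, in bits, of a discrete probability distribution.

    Terms with zero probability contribute nothing, by the convention
    0 * log(0) = 0.
    """
    return -sum(p * math.log2(p) for p in probs if p > 0)

# A fair coin is maximally uncertain among two-outcome distributions: 1 bit.
print(shannon_entropy([0.5, 0.5]))   # → 1.0
# A heavily biased coin is more predictable, so its entropy is lower.
print(shannon_entropy([0.9, 0.1]))
```

Intuitively, higher entropy means more uncertainty about the outcome, matching the informal notion of disorder above.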