## Dr. Ahmed G. Abo-Khalil

Electrical Engineering Department

# Entropy (information theory)

In information theory, entropy is the average amount of information contained in each message received. Here, "message" stands for an event, sample, or character drawn from a distribution or data stream. Entropy thus characterizes our uncertainty about our source of information. (Entropy is best understood as a measure of uncertainty rather than certainty, since entropy is larger for more random sources.) The source is also characterized by the probability distribution of the samples drawn from it. The idea is that the less likely an event is, the more information it provides when it occurs. For reasons explained below, it makes sense to define the information of an event as the negative logarithm of its probability. The probability distribution of the events, coupled with the information amount of each event, forms a random variable whose average (i.e., expected) value is the average amount of information, a.k.a. entropy, generated by this distribution. The units of entropy are commonly referred to as bits, but entropy is also measured in shannons, nats, or hartleys, depending on the base of the logarithm used to define it.
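The definition above can be sketched in a few lines of Python: a minimal `entropy` helper (an illustrative name, not from any particular library) that averages the self-information `-log p` over a distribution, with the logarithm base selecting the unit (bits, nats, or hartleys).

```python
import math

def entropy(probs, base=2):
    """Shannon entropy: the expected self-information -log(p) over the
    distribution. base=2 gives bits, base=e gives nats, base=10 hartleys."""
    # Events with probability 0 contribute nothing (p * log p -> 0).
    return -sum(p * math.log(p, base) for p in probs if p > 0)

# A fair coin is maximally uncertain for two outcomes: 1 bit per toss.
print(entropy([0.5, 0.5]))           # 1.0 bit
# A biased coin is more predictable, so its entropy is lower (~0.469 bits).
print(entropy([0.9, 0.1]))
# The same distribution measured in nats instead of bits.
print(entropy([0.5, 0.5], base=math.e))
```

Note how the fair coin maximizes entropy: when all outcomes are equally likely, no outcome is more predictable than another, matching the paragraph's point that entropy measures uncertainty.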

### Office Hours

Monday 10-2

Tuesday 10-12

Thursday 11-1

### Contacts

email: [email protected]

[email protected]

Phone: 2570

### Welcome

Welcome to the Faculty of Engineering

# Institute of Electrical and Electronics Engineers

http://www.ieee.org/

http://ieeexplore.ieee.org/Xplore/guesthome.jsp

http://ieee-ies.org/

http://www.ieee-pes.org/

http://www.pels.org/

### Links of Interest

http://www.utk.edu/research/

http://science.doe.gov/grants/index.asp

http://www1.eere.energy.gov/vehiclesandfuels/

http://www.eere.energy.gov/

### The Holy Qur'an (القرآن الكريم)

http://quran.muslim-web.com/

### Travel Web Sites

http://www.hotels.com/

http://www.orbitz.com/

http://www.hotwire.com/us/index.jsp

http://www.kayak.com/