Dr. Ahmed G. Abo-Khalil

Electrical Engineering Department


Entropy (information theory)

In information theory, entropy is the average amount of information contained in each message received. Here, a message stands for an event, sample, or character drawn from a distribution or data stream. Entropy thus characterizes our uncertainty about the source of information. (Entropy is best understood as a measure of uncertainty rather than certainty, since entropy is larger for more random sources.) The source is also characterized by the probability distribution of the samples drawn from it. The idea is that the less likely an event is, the more information it provides when it occurs. For reasons explained below, it makes sense to define the information of an event as the negative of the logarithm of its probability. The probability distribution of the events, coupled with the information content of each event, forms a random variable whose average (a.k.a. expected) value is the average amount of information, a.k.a. entropy, generated by this distribution. The units of entropy are commonly referred to as bits, but entropy is also measured in shannons, nats, or hartleys, depending on the base of the logarithm used to define it.
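The definition above can be sketched in a few lines of code. This is a minimal illustration (not from the original page): it sums the negative log-probability of each outcome, weighted by its probability, and the log base selects the unit (base 2 gives bits/shannons, base e gives nats, base 10 gives hartleys).

```python
import math

def shannon_entropy(probs, base=2):
    """Average information content (entropy) of a discrete distribution.

    The unit depends on the log base: 2 -> bits (shannons),
    e -> nats, 10 -> hartleys.
    """
    # Each outcome contributes -p * log(p); outcomes with p = 0
    # contribute nothing (the limit of -p log p as p -> 0 is 0).
    return -sum(p * math.log(p, base) for p in probs if p > 0)

# A fair coin (two equally likely outcomes) carries 1 bit of entropy,
# while a biased coin is more predictable and so carries less.
print(shannon_entropy([0.5, 0.5]))   # 1.0 bit
print(shannon_entropy([0.9, 0.1]))   # about 0.469 bits
```

Note how the fair coin maximizes entropy for two outcomes: when all outcomes are equally likely, uncertainty about the next sample is greatest.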

Office Hours

Monday 10-2

Tuesday 10-12

Thursday 11-1

My Timetable


email: [email protected]

[email protected]

Phone: 2570


Welcome to the Faculty of Engineering

Almajmaah University




Links of Interest





Travel Web Sites






The midterm exams will be held on Tuesday 26-6-1440

according to the schedule posted on the announcement boards.

Summer training

Registration for summer training will begin in the 5th week of the second semester.

Academic advising

Class registration: week 1

Bridging Program

Site Statistics

Number of pages: 2879

Research and lectures: 1280

Visits: 61359