Entropy measures the uncertainty and dispersion of an unknown or random
quantity. The concept, first introduced by Shannon (1948), is important in many
fields: in information theory, entropy measures the amount of information
carried by each received message; in physics, it is the basic quantity measuring the
disorder of a thermodynamic system; and so on.
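As a point of reference, the classical Shannon entropy of a discrete random variable $X$ taking values $x_1, \dots, x_n$ with probabilities $p(x_i)$ is
$$H(X) = -\sum_{i=1}^{n} p(x_i) \log p(x_i).$$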
In this paper, we introduce an alternative measure of entropy, called the
HN-entropy. Unlike Shannon entropy, the proposed measure, of order α and β, offers
greater flexibility. We then introduce the cumulative residual HN-entropy, the
cumulative HN-entropy, and their weighted versions. Finally, we present a comparison
between Shannon entropy and HN-entropy, together with numerical results.