Entropy: Difference between revisions

From Rosetta Code
Revision as of 19:29, 21 February 2013

Task
Entropy
You are encouraged to solve this task according to the task description, using any language you may know.

Calculate the [http://en.wikipedia.org/wiki/Entropy_(information_theory) entropy] (Shannon entropy) of a given input sequence. Use "1223334444" as an example sequence. The result should be around 1.84644.
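For a sequence whose distinct symbols occur with probabilities <math>p_i</math>, the Shannon entropy in bits per symbol is

<math>H = -\sum_i p_i \log_2 p_i</math>

In "1223334444" the symbols 1, 2, 3 and 4 occur 1, 2, 3 and 4 times out of 10, so the probabilities are 0.1, 0.2, 0.3 and 0.4, giving <math>H = -(0.1\log_2 0.1 + 0.2\log_2 0.2 + 0.3\log_2 0.3 + 0.4\log_2 0.4) \approx 1.84644</math>.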


Burlesque

<lang burlesque>blsq ) "1223334444"F:u[vv^^{1\/?/2\/LG}m[?*++
1.8464393446710157</lang>

Haskell

<lang haskell>import Data.List (group, sort)

main = print $ entropy "1223334444"

-- Shannon entropy in bits per symbol: sort and group the input,
-- turn group lengths into probabilities, then sum -p * log2 p.
entropy s = sum . map lg' . fq' . map (fromIntegral . length) . group . sort $ s
  where lg' c = (c *) . logBase 2 $ 1.0 / c
        fq' c = map (\x -> x / sum c) c</lang>
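Python

For comparison, a minimal Python sketch of the same computation, using only the standard library (the function name <code>entropy</code> is our own choice, not part of any library):

<lang python>from collections import Counter
from math import log2

def entropy(s):
    """Shannon entropy of a sequence, in bits per symbol."""
    n = len(s)
    # Count each distinct symbol, convert counts to probabilities,
    # and accumulate -p * log2(p) over all symbols.
    return -sum((c / n) * log2(c / n) for c in Counter(s).values())

print(entropy("1223334444"))  # about 1.84644</lang>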