Entropy

Task
You are encouraged to solve this task according to the task description, using any language you may know.

Calculate the entropy (Shannon entropy) of a given input sequence. Use "1223334444" as an example sequence. The result should be around 1.84644 bits.
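
For reference, the quantity to compute is the Shannon entropy of the relative symbol frequencies <math>p_i</math>, taken with the base-2 logarithm so the result is measured in bits:

<math>H = -\sum_i p_i \log_2 p_i</math>

In "1223334444" the symbols 1, 2, 3 and 4 occur with frequencies 0.1, 0.2, 0.3 and 0.4, so <math>H = -(0.1\log_2 0.1 + 0.2\log_2 0.2 + 0.3\log_2 0.3 + 0.4\log_2 0.4) \approx 1.84644</math>.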


Burlesque

<lang burlesque>blsq ) "1223334444"F:u[vv^^{1\/?/2\/LG}m[?*++
1.8464393446710157</lang>

Haskell

<lang haskell>import Data.List (group, sort)

main :: IO ()
main = print $ entropy "1223334444"

-- Shannon entropy (in bits) of the distribution of symbols in a list
entropy :: (Ord a, Floating b) => [a] -> b
entropy s = sum . map lg' . fq' . map (fromIntegral . length) . group . sort $ s
  where lg' c = (c *) . logBase 2 $ 1.0 / c   -- p * log2 (1/p)
        fq' c = map (\x -> x / sum c) c       -- counts -> relative frequencies</lang>

Perl 6

<lang Perl 6>sub entropy(@a) {
    my %count; %count{$_}++ for @a;          # tally each distinct symbol
    my @p = %count.values »/» @a.elems;      # relative frequencies
    -log(2) R/ [+] map { $_ * log $_ }, @p;  # Σ p·ln p divided by -ln 2, i.e. -Σ p·log2 p
}

say entropy "1223334444".comb;</lang>