
Task
Entropy
You are encouraged to solve this task according to the task description, using any language you may know.

Calculate the entropy (shannon entropy) of a given input sequence. Use "1223334444" as an example sequence. The result should be around 1.84644 bits.
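This value follows from the definition of Shannon entropy, H = -Σ p(x) log2 p(x): the symbols 1, 2, 3 and 4 occur in "1223334444" with probabilities 0.1, 0.2, 0.3 and 0.4, so H = -(0.1 log2 0.1 + 0.2 log2 0.2 + 0.3 log2 0.3 + 0.4 log2 0.4) ≈ 1.84644 bits.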


Burlesque

<lang burlesque>blsq ) "1223334444"F:u[vv^^{1\/?/2\/LG}m[?*++
1.8464393446710157</lang>

Haskell

<lang haskell>import Data.List

main = print $ entropy "1223334444"

-- Sort and group equal characters, turn the group sizes into relative
-- frequencies, then sum c * logBase 2 (1/c) (i.e. -c * log2 c) over them.
entropy s = sum . map lg' . fq' . map (fromIntegral . length) . group . sort $ s
  where lg' c = (c *) . logBase 2 $ 1.0 / c
        fq' c = map (\x -> x / sum c) c</lang>

Perl 6

<lang Perl 6>sub entropy(@a) {
    my %count; %count{$_}++ for @a;          # tally each distinct element
    my @p = %count.values »/» @a.elems;      # relative frequencies
    -log(2) R/ [+] map { $_ * log $_ }, @p;  # sum of p*log p, divided by -log 2 to give bits
}

say entropy "1223334444".comb;</lang>