Entropy

Task
You are encouraged to solve this task according to the task description, using any language you may know.

Calculate the Shannon entropy of a given input string. Use "1223334444" as an example. The result should be around 1.84644 bits.
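For reference, the Shannon entropy of a string is

<math>H = -\sum_i p_i \log_2 p_i</math>

where <math>p_i</math> is the relative frequency of the <math>i</math>-th distinct character. The ten characters of "1223334444" occur with frequencies 0.1, 0.2, 0.3 and 0.4, so

<math>H = -(0.1 \log_2 0.1 + 0.2 \log_2 0.2 + 0.3 \log_2 0.3 + 0.4 \log_2 0.4) \approx 1.8464393</math>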


Burlesque

<lang burlesque>blsq ) "1223334444"F:u[vv^^{1\/?/2\/LG}m[?*++
1.8464393446710157</lang>

D

<lang d>import std.stdio, std.algorithm, std.math;

double entropy(T)(T[] s) /*pure nothrow*/ if (__traits(compiles, sort(s))) {
    return s
           .sort()
           .group                                   // (symbol, count) pairs
           .map!(g => g[1] / cast(double)s.length)  // relative frequencies
           .map!(p => -p * log2(p))                 // -p log2(p) terms
           .reduce!q{a + b};                        // sum them
}

void main() {
    "1223334444"d.dup.entropy.writeln;
}</lang>

Output:
1.84644

Haskell

<lang haskell>import Data.List

main = print $ entropy "1223334444"

entropy s = sum . map lg' . fq' . map (fromIntegral . length) . group . sort $ s
  where lg' c = (c *) . logBase 2 $ 1.0 / c  -- c * log2 (1/c) = -c * log2 c
        fq' c = map (\x -> x / sum c) c      -- counts to relative frequencies</lang>

J

Solution:
<lang j>entropy=: +/@:-@(* 2&^.)@(#/.~ % #)</lang>

Example:
<lang j>   entropy '1223334444'
1.84644</lang>
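Reading the tacit definition right to left: <code>#/.~ % #</code> gives the relative frequency of each distinct character, the hook <code>* 2&^.</code> multiplies each frequency by its base-2 logarithm, and <code>+/@:-</code> negates and sums the terms.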


Perl

Translation of: Perl 6

<lang Perl>sub entropy {
    my %count; $count{$_}++ for @_;
    my @p = map $_/@_, values %count;  # relative frequency of each symbol
    my $entropy = 0;
    $entropy += - $_ * log $_ for @p;  # entropy in nats
    $entropy / log 2                   # convert to bits
}

print entropy split //, "1223334444";</lang>

Perl 6

<lang perl6>sub entropy(@a) {
    - [+] map -> $p { $p * log $p },
        @a.classify({$_}).map: *.value.elems / @a.elems;
}

say log(2) R/ entropy '1223334444'.comb;</lang>
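The subroutine computes the entropy in nats, since <code>log</code> is the natural logarithm; <code>R/</code> is the reversed-division metaoperator, so <code>log(2) R/ …</code> divides the result by log 2 to convert it to bits.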