Revision as of 20:39, 21 February 2013
Entropy
You are encouraged to solve this task according to the task description, using any language you may know.
Calculate the Shannon entropy of a given input sequence. Use "1223334444" as an example sequence; the result should be around 1.84644.
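The calculation can be sketched in Python (an illustrative helper, not one of the page's entries; the function name `entropy` is arbitrary): count each distinct symbol, turn the counts into probabilities, and sum -p·log2(p).

```python
from collections import Counter
from math import log2

def entropy(s):
    """Shannon entropy of sequence s, in bits per symbol."""
    counts = Counter(s)          # occurrences of each distinct symbol
    n = len(s)
    # H = -sum over symbols of p * log2(p), with p = count / n
    return -sum((c / n) * log2(c / n) for c in counts.values())

print(entropy("1223334444"))  # about 1.84644
```

For "1223334444" the symbol probabilities are 0.1, 0.2, 0.3, and 0.4, giving the expected value of roughly 1.84644 bits per symbol.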
Burlesque
<lang burlesque>
blsq ) "1223334444"F:u[vv^^{1\/?/2\/LG}m[?*++
1.8464393446710157
</lang>
Haskell
<lang haskell>
import Data.List

main = print $ entropy "1223334444"

entropy s =
  sum . map lg' . fq' . map (fromIntegral . length) . group . sort $ s
  where lg' c = (c *) . logBase 2 $ 1.0 / c
        fq' c = map (\x -> x / sum c) c
</lang>
Perl 6
<lang Perl 6>sub entropy(@a) {
    my %count; %count{$_}++ for @a;
    my @p = %count.values »/» @a.elems;
    -log(2) R/ [+] map { $_ * log $_ }, @p;
}
say entropy "1223334444".comb;</lang>