Entropy
Revision as of 21:50, 22 February 2013
You are encouraged to solve this task according to the task description, using any language you may know.
Calculate the entropy (Shannon entropy) of a given input string. Use "1223334444" as an example. The result should be around 1.84644 bits.
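For reference, the calculation that all of the entries below perform can be sketched in Python (Python is not itself one of this page's entries; this is only an illustrative sketch of the formula H = -Σ p·log2(p)):

```python
from collections import Counter
from math import log2

def entropy(s):
    # Probability of each distinct symbol: its count divided by the length.
    probs = [count / len(s) for count in Counter(s).values()]
    # Shannon entropy: H = -sum(p * log2(p)) over all symbols.
    return -sum(p * log2(p) for p in probs)

print(round(entropy("1223334444"), 5))  # → 1.84644
```

For "1223334444" the symbol probabilities are 0.1, 0.2, 0.3, and 0.4, giving H ≈ 1.84644 bits.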
Burlesque
<lang burlesque>blsq ) "1223334444"F:u[vv^^{1\/?/2\/LG}m[?*++
1.8464393446710157</lang>
D
<lang d>import std.stdio, std.algorithm, std.math;

double entropy(T)(T[] s) /*pure nothrow*/
if (__traits(compiles, sort(s))) {
    return s
           .sort()
           .group
           .map!(g => g[1] / cast(double)s.length)
           .map!(p => -p * log2(p))
           .reduce!q{a + b};
}

void main() {
    "1223334444"d.dup.entropy.writeln;
}</lang>
Output:
1.84644
Haskell
<lang haskell>import Data.List

main = print $ entropy "1223334444"

entropy s =
 sum . map lg' . fq' . map (fromIntegral.length) . group . sort $ s
  where lg' c = (c * ) . logBase 2 $ 1.0 / c
        fq' c = map (\x -> x / (sum c)) c</lang>
J
Solution:<lang j> entropy=: +/@:-@(* 2&^.)@(#/.~ % #)</lang>
Example:
<lang j>   entropy '1223334444'
1.84644</lang>
Perl
<lang Perl>sub entropy {
    my %count; $count{$_}++ for @_;
    my @p = map $_/@_, values %count;
    my $entropy = 0;
    $entropy += - $_ * log $_ for @p;
    $entropy / log 2
}
print entropy split //, "1223334444";</lang>
Perl 6
<lang Perl 6>sub entropy(@a) {
- [+] map -> \p { p * log p }, @a.bag.values »/» +@a;
}
say log(2) R/ entropy '1223334444'.comb;</lang>
Ruby
<lang ruby>def entropy(s)
  counts = Hash.new(0)
  s.each_char { |c| counts[c] += 1 }
  counts.values.reduce(0) do |entropy, count|
    freq = count / s.length.to_f
    entropy - freq * Math.log2(freq)
  end
end</lang>

One-liner, same performance:

<lang ruby>def entropy2(s)
  s.each_char.group_by(&:to_s).values.map { |x| x.length / s.length.to_f }.reduce(0) { |e, x| e - x*Math.log2(x) }
end</lang>