Talk:Entropy: Difference between revisions
→Description
:: [[User:CRGreathouse|CRGreathouse]] ([[User talk:CRGreathouse|talk]]) 00:55, 18 April 2013 (UTC)
: Yes, it was supposed to be bits/character and I've corrected it. Shannon called N the number of "symbols" and I call the number of different
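:A minimal sketch of the quantity under discussion (entropy in bits per character/symbol), using illustrative names not taken from the article itself: Shannon's H = -&sum; p<sub>i</sub> log<sub>2</sub> p<sub>i</sub>, computed over the distinct symbols of a text.

```python
# Sketch (assumed implementation, not from the discussion above):
# Shannon entropy in bits per symbol, H = -sum(p_i * log2(p_i)),
# where p_i is the relative frequency of each distinct symbol.
from collections import Counter
from math import log2

def entropy_bits_per_symbol(text: str) -> float:
    """Shannon entropy of the symbol distribution, in bits/symbol."""
    counts = Counter(text)
    total = len(text)
    return -sum((c / total) * log2(c / total) for c in counts.values())

# A uniform distribution over 4 symbols gives log2(4) = 2 bits/symbol.
print(entropy_bits_per_symbol("abcd"))  # → 2.0
```

:For a text drawn uniformly from N distinct symbols this reduces to log<sub>2</sub> N bits per symbol, which is the case the units dispute above was about.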
:Physics is our best-known (most efficient) set of lossless compression functions for the data we observe in the world, which is another way of saying that Occam's razor has been applied. Other sets of compression functions, such as religion, democracy, and free markets, are more intelligent than the functions of physics if they yield more profit for the computing device implementing those functions (algorithms), in keeping with Marcus Hutter's definition of intelligence. Deciding the most efficient way to move symbols around (with electrons in CPUs, ions in brains, votes in government, reputation in open source, or money in markets) lets the computing device (CPUs implementing real A.I., brains, governments, open-source communities, markets) most efficiently move the much larger real-world objects that the symbols represent, so that the device can more efficiently make copies of itself (seek lower entropy).
:This is how entropy relates to information, and how information relates to intelligence and evolution, which result from least-action dynamics seeking lower physical entropy in the distribution of N atoms on Earth. The excess entropy is sloughed off to space, with 17 low-energy undirected photons emitted per incoming sun-directed photon, so that the Universe can expand (entropy and energy per comoving volume of the universe are constant). The result is higher-energy bonds (lower entropy due to smaller volume) among the N atoms of n types on Earth. This is why machines built from metal and metalloid (e.g. silicon) atoms with their previously attached oxygen atoms removed (yielding carbon-carbon, metal-metal, and silicon-silicon bonds) are replacing biology: they are more efficient due to the higher-energy, lower-entropy bonds. This is why entropy is important. [[User:Zawy|Zawy]] ([[User talk:Zawy|talk]]) 10:33, 23 January 2016 (UTC)