Friday, March 27, 2015

Group B, ID 891434, Entry No. 4

Entropy

Entropy, in information theory, is a quantity that measures the information provided by a data source, that is, what a message tells us about a specific fact. For example, being told that the streets are wet when we know it has just rained gives us little information, because that is the usual situation. But if we are told that the streets are wet and we know it has not rained, the message conveys a lot of information (since the streets are not watered every day). Note that in this example the amount of information differs even though the message is the same: the streets are wet. Data compression technologies are based on this idea, since they allow the same information to be packed into shorter messages. The measure of entropy can be applied to information sources of any nature, and it lets us encode them appropriately, telling us how many code elements are needed to transmit them while eliminating all redundancy. (To report the result of a horse race it is enough to transmit the code associated with the winning horse; there is no need to say that it is a horse race or to describe how it unfolded.)
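As a rough illustration of this idea (not part of the original post), here is a small Python sketch that computes the Shannon entropy, H = -Σ p·log2(p), for a hypothetical horse race; the race and its probabilities are made-up assumptions, chosen only to show how fewer bits are needed on average when the outcome is predictable.

import math

def entropy(probabilities):
    # Shannon entropy in bits: the average information per symbol.
    return -sum(p * math.log2(p) for p in probabilities if p > 0)

# Hypothetical race with 8 equally likely horses: naming the winner
# takes log2(8) = 3 bits, and nothing about the race itself is needed.
uniform_race = [1 / 8] * 8
print(entropy(uniform_race))   # 3.0

# If one horse is a strong favourite, the average information drops,
# so a good code can use shorter messages on average.
skewed_race = [1/2, 1/4, 1/8, 1/16, 1/64, 1/64, 1/64, 1/64]
print(entropy(skewed_race))    # 2.0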

Entropy can also be seen as the average amount of information carried by the symbols in use. The least probable symbols are the ones that contribute the most information; for example, if we take the words of a text as the symbols, frequent words such as "that", "the", or "a" contribute little information, whereas less frequent words such as "run", "child", or "dog" contribute more. If we delete a "that" from a given text, comprehension will most likely not suffer and the word will be understood from context; the same is not true if we delete the word "child" from that original text. When all symbols are equally probable (a flat, or uniform, probability distribution), every symbol contributes the same relevant information and the entropy is at its maximum.
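To make the word example concrete, the following Python sketch (my own illustration, with invented word probabilities) computes the self-information -log2(p) of a frequent word versus a rare one, and checks that a flat distribution over N symbols gives the maximum entropy, log2(N).

import math

def self_information(p):
    # Bits of information carried by a symbol that occurs with probability p.
    return -math.log2(p)

# Made-up probabilities, purely for illustration: a very common word
# such as "that" versus a much rarer word such as "child".
print(self_information(0.05))    # about 4.3 bits: frequent, little surprise
print(self_information(0.001))   # about 10 bits: rare, much more information

# With a flat (uniform) distribution over N symbols the entropy reaches
# its maximum, log2(N), because every symbol is equally surprising.
N = 4
print(-sum((1 / N) * math.log2(1 / N) for _ in range(N)))   # 2.0 = log2(4)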
