Image:Binary entropy plot.svg

From Wikipedia, the free encyclopedia

Binary_entropy_plot.svg (SVG file, nominally 169 × 163 pixels, file size: 40 KB)

This is a file from the Wikimedia Commons. The description on its description page there is shown below.
Commons is a freely licensed media file repository. You can help.

Summary

Description

en:Information entropy of a en:Bernoulli trial X. If X can assume the values 0 and 1, the entropy of X is defined as H(X) = -Pr(X=0) log2 Pr(X=0) - Pr(X=1) log2 Pr(X=1). It has the value 0 if Pr(X=0)=1 or Pr(X=1)=1, and it reaches its maximum of 1 when Pr(X=0)=Pr(X=1)=1/2. The image was created in the following steps. First, I created a DVI version from a LaTeX/PSTricks source. Here is the code:

%Plot of information entropy of bernoulli variable
%
%latex binary_entropy_plot; dvips binary_entropy_plot
%open .ps file in gimp, choose strong antialias in both text and graphics,
%resolution 500, color mode, crop, scale to 45%, save as .png
\documentclass[12pt]{article}
\usepackage{pst-plot}
\begin{document}
\psset{unit=4cm}        
\begin{pspicture}(0,0)(1.01,1)
\psgrid[gridlabels=0pt,gridcolor=lightgray,subgriddiv=10,subgridcolor=lightgray](0,0)(0,0)(1,1)
\newrgbcolor{myblue}{0 0 0.7}
\psaxes[arrows=->,arrowsize=2pt 4,Dx=0.5,Dy=0.5](0,0)(0,0)(1.1,1.1)
\psplot[plotstyle=curve,plotpoints=100,linewidth=1.8pt,linecolor=myblue]{0.0001}{0.9999}{-1 x x log 2 log div mul 1 x sub 1 x sub log 2 log div mul add mul}
\rput(0.5,-0.22){$\Pr(X=1)$}
\rput{90}(-0.28,0.5){$H(X)$}
\end{pspicture}
\end{document}

Compile it with latex to get the DVI. The DVI was then converted to PS with dvips, and finally to SVG using ps2svg.sh; some post-processing with Inkscape was needed.
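The PostScript expression passed to \psplot above evaluates the binary entropy function described in the summary. As a sketch only (not part of the original workflow; the function name is my own), the same curve can be computed in a few lines of Python:

```python
import math

def binary_entropy(p: float) -> float:
    """H(X) = -p*log2(p) - (1-p)*log2(1-p), with H = 0 at p = 0 or 1.

    The endpoints are handled explicitly, which is why the PSTricks
    plot above only samples the open interval (0.0001, 0.9999).
    """
    if p in (0.0, 1.0):
        return 0.0
    return -p * math.log2(p) - (1 - p) * math.log2(1 - p)

# The curve is symmetric about p = 1/2, where it attains its maximum of 1.
print(binary_entropy(0.5))  # 1.0
```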

Source

Original work by Brona, published on Commons as Image:Binary entropy plot.png; converted to SVG by Alessio Damato.

Date

April 2007

Author

Brona and Alessio Damato

Permission
(Reusing this image)

see below

Other versions: Image:Binary entropy plot.png

Licensing

I, the copyright holder of this work, hereby publish it under the following license:
Permission is granted to copy, distribute and/or modify this document under the terms of the GNU Free Documentation License, Version 1.2 or any later version published by the Free Software Foundation; with no Invariant Sections, no Front-Cover Texts, and no Back-Cover Texts. A copy of the license is included in the section entitled "GNU Free Documentation License".


File history

Click on a date/time to view the file as it appeared at that time.

Date/Time | Dimensions | User | Comment
current: 15:19, 22 April 2007 | 169×163 (40 KB) | Alejo2083 | ({{Information |Description=Information entropy of a Bernoulli trial ''X''. If ''X'' can assume values 0 and 1, entropy of ''X'' is defined as ''H''(''X'') = -Pr(''X''=0) log<sub>2</sub> Pr(''X''=0) - Pr(''X''=1) log<sub>2</sub> Pr(''X''=1). It has)