Template:Information theory
From Wikipedia, the free encyclopedia
Information theory
Entropy
Differential entropy
Conditional entropy
Joint entropy
Mutual information
Directed information
Conditional mutual information
Relative entropy
Entropy rate
Limiting density of discrete points
Asymptotic equipartition property
Rate–distortion theory
Shannon's source coding theorem
Channel capacity
Noisy-channel coding theorem
Shannon–Hartley theorem
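The sidebar above lists the core quantities of information theory. As a rough illustration of how several of them relate, the identity I(X;Y) = H(X) + H(Y) - H(X,Y) connects entropy, joint entropy, and mutual information, and H(X|Y) = H(X,Y) - H(Y) gives conditional entropy. The following is a minimal Python sketch computing these quantities for a small, made-up binary joint distribution; the distribution and variable names are illustrative assumptions, not part of the template.

import math

# Illustrative only: a small hypothetical joint distribution p(x, y)
# over two binary variables X and Y (values chosen for the example).
joint = {
    (0, 0): 0.4,
    (0, 1): 0.1,
    (1, 0): 0.1,
    (1, 1): 0.4,
}

def entropy(dist):
    """Shannon entropy in bits: H = -sum p * log2(p)."""
    return -sum(p * math.log2(p) for p in dist.values() if p > 0)

# Marginal distributions p(x) and p(y).
px, py = {}, {}
for (x, y), p in joint.items():
    px[x] = px.get(x, 0.0) + p
    py[y] = py.get(y, 0.0) + p

h_x = entropy(px)      # H(X)
h_y = entropy(py)      # H(Y)
h_xy = entropy(joint)  # joint entropy H(X, Y)

# Standard identities relating the listed quantities:
mutual_information = h_x + h_y - h_xy  # I(X; Y) = H(X) + H(Y) - H(X, Y)
h_x_given_y = h_xy - h_y               # H(X | Y) = H(X, Y) - H(Y)

print(f"H(X)   = {h_x:.4f} bits")
print(f"H(Y)   = {h_y:.4f} bits")
print(f"H(X,Y) = {h_xy:.4f} bits")
print(f"I(X;Y) = {mutual_information:.4f} bits")
print(f"H(X|Y) = {h_x_given_y:.4f} bits")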
Template documentation
Editors can experiment in this template's sandbox and testcases pages.
Add categories to the /doc subpage.
Subpages of this template.
Category: Computer science sidebar templates