Huffman Coding Entropy

Huffman Coding

Entropy: H = ∑ pᵢ · log₂(1/pᵢ)

Examples:
• i = 2, p1 = 1, p2 = 0  ⇒  H = 0 bits/symbol
• i = 2, p1 = 1/2, p2 = 1/2  ⇒  H = 1 bit/symbol
• i = 4, p1 = p2 = p3 = p4 = 1/4  ⇒  H = 2 bits/symbol

• From information theory, the average number of bits per symbol needed to encode a source S is bounded below by the entropy of S.
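As a quick check of the figures above, here is a minimal Python sketch (the `entropy` helper is ours, not from the slides) that evaluates H for the three example distributions:

```python
import math

def entropy(probs):
    """H = sum over i of p_i * log2(1/p_i); zero-probability terms contribute 0."""
    return sum(p * math.log2(1 / p) for p in probs if p > 0)

print(entropy([1.0, 0.0]))                 # 0.0 bits/symbol
print(entropy([0.5, 0.5]))                 # 1.0 bit/symbol
print(entropy([0.25, 0.25, 0.25, 0.25]))   # 2.0 bits/symbol
```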

Huffman Coding
• Given the statistical distribution of the gray levels
• Generates a code that is as close as possible to the minimum bound, the entropy
• A variable-length code

Huffman Coding
Five steps (a code sketch follows the list):
1. Find the gray-level probabilities for the image by computing the histogram
2. Order the input probabilities (histogram magnitudes) from smallest to largest
3. Combine the smallest two by addition
4. Go to step 2 until only two probabilities are left
5. Working backward along the resulting tree, generate the code by alternately assigning 0 and 1
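A minimal Python sketch of these five steps, using the standard-library heapq for the repeated smallest-two selection (the function name and data layout are ours):

```python
import heapq
import itertools

def huffman_code(probs):
    """Build a Huffman code table from {symbol: probability}.

    Steps 2-4: repeatedly pop and merge the two smallest probabilities.
    Step 5: codes are built by prepending 0/1 as partial trees are merged,
    which is equivalent to reading the tree backward from the root.
    """
    tie = itertools.count()  # tie-breaker so the heap never compares dicts
    heap = [(p, next(tie), {sym: ""}) for sym, p in probs.items()]
    heapq.heapify(heap)
    while len(heap) > 1:
        p1, _, t1 = heapq.heappop(heap)   # smallest probability
        p2, _, t2 = heapq.heappop(heap)   # next smallest
        merged = {s: "0" + c for s, c in t1.items()}
        merged.update({s: "1" + c for s, c in t2.items()})
        heapq.heappush(heap, (p1 + p2, next(tie), merged))
    return heap[0][2]

# Gray-level example from the next slide:
print(huffman_code({"g0": 0.2, "g1": 0.3, "g2": 0.1, "g3": 0.4}))
# Codeword lengths match the slide's table (g3: 1, g1: 2, g0: 3, g2: 3);
# the exact bit patterns depend on tie-breaking and 0/1 orientation.
```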

Example

Step 1: Histogram. With 100 pixels total, the gray-level probabilities are:

Gray level   Pixels   Probability
g0           20       20/100 = 0.2
g1           30       30/100 = 0.3
g2           10       10/100 = 0.1
g3           40       40/100 = 0.4

Original Gray Level (Natural Code)   Probability   Huffman Code
g0 (00)                              0.2           010
g1 (01)                              0.3           00
g2 (10)                              0.1           011
g3 (11)                              0.4           1

Entropy = −∑_{i=0}^{3} pᵢ · log₂(pᵢ)
        = −[(0.2)·log₂(0.2) + (0.3)·log₂(0.3) + (0.1)·log₂(0.1) + (0.4)·log₂(0.4)]
        ≈ 1.846 bits/pixel

L_ave = ∑_{i=0}^{L−1} lᵢ · pᵢ
      = 3(0.2) + 2(0.3) + 3(0.1) + 1(0.4)
      = 1.9 bits/pixel

where lᵢ is the length of the Huffman codeword for gray level i.
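Both numbers can be reproduced mechanically; a small sketch using the table above:

```python
import math

# (probability, Huffman codeword) per gray level, from the table above
table = {"g0": (0.2, "010"), "g1": (0.3, "00"),
         "g2": (0.1, "011"), "g3": (0.4, "1")}

H = -sum(p * math.log2(p) for p, _ in table.values())
L_ave = sum(p * len(code) for p, code in table.values())
print(f"Entropy = {H:.3f} bits/pixel")     # ~1.846
print(f"L_ave   = {L_ave:.1f} bits/pixel") # 1.9
```

The average code length (1.9) sits just above the entropy bound (1.846), as the theory requires.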

Example

Relative frequency of characters in a message text: a larger number indicates more frequent use. These frequent characters are the ones for which we want the shortest codes.

Building the tree (shown in the slides as a figure sequence):
• Group the two lowest frequencies
• Repeatedly group the next two lowest
• Finally, all characters are leaf nodes of a single binary tree

• Assign 0 to the left branch and 1 to the right
• The code for a character is the path to that character from the root

Encode The Following Message

How to Decode a Coded Message
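The slides' worked message did not survive extraction, so here is a hedged substitute: encoding and decoding a short symbol sequence with the gray-level code table from the earlier example (the message itself is our invention). Because the code is prefix-free, decoding is a single left-to-right pass:

```python
code = {"g0": "010", "g1": "00", "g2": "011", "g3": "1"}
decode_map = {c: s for s, c in code.items()}

message = ["g3", "g1", "g0", "g3", "g2"]   # example input (ours)
bits = "".join(code[s] for s in message)   # encode by concatenation
print(bits)                                # 1000101011

# Decode: grow a prefix one bit at a time; emit a symbol as soon as the
# prefix matches a codeword (prefix-freeness makes this unambiguous).
decoded, prefix = [], ""
for b in bits:
    prefix += b
    if prefix in decode_map:
        decoded.append(decode_map[prefix])
        prefix = ""
print(decoded)                             # ['g3', 'g1', 'g0', 'g3', 'g2']
```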

Example: Huffman Coding Carphone motion vectors

• Information content: log₂(1/P)
• To achieve optimum compression, each value should be represented with exactly log₂(1/P) bits

Huffman Code Tree

Huffman Codes

Huffman Code
• High-probability data items are assigned short codes
• No code contains any other code as a prefix
• Reading from the left-hand bit, each code is therefore uniquely decodable (see the check below)
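The prefix property can be verified mechanically; a minimal sketch (the checker is ours), run against the gray-level code from the earlier example:

```python
from itertools import permutations

def is_prefix_free(codes):
    """True if no codeword is a prefix of a different codeword."""
    return not any(b.startswith(a) for a, b in permutations(codes, 2))

print(is_prefix_free(["010", "00", "011", "1"]))  # True: valid Huffman code
print(is_prefix_free(["0", "01", "11"]))          # False: "0" prefixes "01"
```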

Example: Huffman Coding Claire motion vectors

A much higher proportion of the motion vectors are zero for Claire.

Huffman Code Tree

Huffman Codes

• To achieve optimum compression, a separate code table is required for each of the two sequences, Carphone and Claire
• The loss of potential compression efficiency due to the requirement for integral-length codes is very obvious for vector 0 in the Claire sequence

Modified Huffman Coding
• The image and video coding standards define sets of codewords based on the probability distributions of a large range of video material: a generic table
• Compression efficiency is lower than that obtained by pre-analysing the data to be encoded
• Advantage: there is no need to calculate and transmit individual probability tables

Unconstrained vs. Constrained Length
• Shortened Huffman code: the least-probable symbols all receive codewords of the maximum permitted length (the run of 7-bit entries in the slide's table)
• Escape code + fixed-length code (a sketch follows)
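One way to read the escape mechanism, as a sketch: frequent symbols keep short Huffman codewords, while all rare symbols share one escape codeword followed by a fixed-length index. Every name, the codewords, and the 3-bit index width below are our assumptions for illustration:

```python
ESCAPE = "0000"     # assumed escape codeword (kept prefix-free w.r.t. the table)
INDEX_BITS = 3      # assumed fixed-length index width for the rare symbols

regular = {"a": "1", "b": "01", "c": "001"}   # frequent symbols (hypothetical)
rare = ["d", "e", "f", "g"]                   # rare symbols, indexed 0..3

def encode_symbol(sym):
    """Huffman codeword if frequent, else escape + fixed-length index."""
    if sym in regular:
        return regular[sym]
    return ESCAPE + format(rare.index(sym), f"0{INDEX_BITS}b")

print(encode_symbol("b"))   # '01'
print(encode_symbol("f"))   # '0000' + '010' -> '0000010'
```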

H.26L Universal VLC
• Uniform
• Regular structure
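The "uniform, regular structure" is that of exponential-Golomb codes, which the H.26L UVLC closely resembles (the actual UVLC interleaves its information bits differently, but the codeword lengths 1, 3, 3, 5, 5, ... are the same). A sketch of order-0 exp-Golomb encoding (the function name is ours):

```python
def exp_golomb(n):
    """Order-0 exp-Golomb codeword for a non-negative integer n:
    floor(log2(n+1)) zeros, then the binary representation of n+1."""
    b = bin(n + 1)[2:]              # binary of n+1, MSB first
    return "0" * (len(b) - 1) + b   # zero prefix encodes the length

for n in range(5):
    print(n, exp_golomb(n))
# 0 1
# 1 010
# 2 011
# 3 00100
# 4 00101
```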
