huffman encoding for image
8 views (last 30 days)
For the code:
A=imread('xyz.jpg');
A1=rgb2gray(A);
A = A1(:);
[symbols,p]=hist(A,double(unique(A)))
p=p/sum(p)
symbols=symbols/sum(symbols)
[dict,avglen]=huffmandict(symbols,p)
comp=huffmanenco(A,dict)
I am getting this error:
Error using huffmandict (line 164)
Source symbols repeat
Error in new (line 7)
[dict,avglen]=huffmandict(symbols,p)
Please suggest the necessary changes.
3 Comments
KALYAN ACHARJYA
on 10 Oct 2018
Hello @Nidhi
Walter sir already answered this question, am I right? I also pointed the way.
Have you looked at that? p must have the same dimensions as symbols.
Nidhi Kumari
on 10 Oct 2018
But how do I make them the same?
KALYAN ACHARJYA
on 10 Oct 2018
@Nidhi
Accepted Answer
OCDER
on 10 Oct 2018
I haven't used huffmandict before, but based on the documentation and the error message you have, I suspect your inputs are wrong.
Double check your inputs to huffmandict. Make sure symbols are unique values and p is the probability of each value. That means:
[symbols, p] = hist(A, double(unique(A))) should be
[p, symbols] = hist(A, double(unique(A))) because the 1st output of hist is the bin count (which you then normalize into a probability)
and the second output is the bin center (the unique symbol).
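Putting that together, a minimal sketch of the corrected question code (an untested sketch; note that hist requires single/double input, so A is cast first):

```matlab
A = imread('xyz.jpg');
A1 = rgb2gray(A);
A = double(A1(:));                   % hist needs 'single' or 'double' input
[p, symbols] = hist(A, unique(A));   % counts first, bin centers (unique symbols) second
p = p / sum(p);                      % normalize counts into probabilities
% do NOT divide symbols by their sum -- that can create repeated values,
% which is exactly what triggers "Source symbols repeat"
[dict, avglen] = huffmandict(symbols, p);
comp = huffmanenco(A, dict);
```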
30 Comments
Nidhi Kumari
on 10 Oct 2018
In that case I am getting a different error:
The Huffman dictionary provided does not have the codes for all the input signals.
OCDER
on 10 Oct 2018
Hm, try this:
A = uint8(randi(255, 100, 100, 3)); %imread('xyz.jpg');
A1 = rgb2gray(A);
A = A1(:);
[symbols, ~, idx] = unique(A);
p = histcounts(idx, 1:max(idx)+1);
p = p/sum(p);
[dict, avglen] = huffmandict(symbols, p);
comp = huffmanenco(A, dict)
Nidhi Kumari
on 11 Oct 2018
I am using R2014a, so histcounts() is not present. Can you suggest an alternative?
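For releases before histcounts (which was introduced in R2014b), one possible alternative is accumarray over the index output of unique — a sketch, not tested on R2014a:

```matlab
A = imread('xyz.jpg');
A = rgb2gray(A);
A = A(:);
[symbols, ~, idx] = unique(A);   % idx maps each pixel to its row in symbols
p = accumarray(idx, 1);          % per-symbol pixel counts, same length as symbols
p = p / sum(p);                  % normalize counts to probabilities
[dict, avglen] = huffmandict(double(symbols), p);
comp = huffmanenco(A, dict);
```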
Nidhi Kumari
on 11 Oct 2018
Still no luck:
Error using eps
Class must be 'single' or 'double'.
Error in hist (line 116)
edgesc = edges + eps(edges);
Nidhi Kumari
on 11 Oct 2018
Probability of an input symbol cannot be greater than 1
Nidhi Kumari
on 11 Oct 2018
OK, now I got something, but the image is a thin vertical line with the message:
Image is too big to fit on screen; displaying at 0%
Also, 'p' is 256x1 double while 'symbols' is 256x1 uint8.
OCDER
on 11 Oct 2018
It's a 256x1 double because uint8 numeric format ranges from 0 to 255 (256 values). p should be a double because it's a probability value from 0 to 1. symbols should be whatever unique value, which is 0 to 255 integer number. You could do double(symbols) to make it double.
Your image is a vertical line because of this code you used:
A = A1(:); %make the image a vertical vector
To fix it, you'll have to reshape A back into an MxN matrix via
A = reshape(A, size(A1, 1), size(A1, 2))
Nidhi Kumari
on 11 Oct 2018
Where do I have to put this line?
Nidhi Kumari
on 14 Oct 2018
The current code is:
A=imread('xyz.jpg');
A1=rgb2gray(A);
A = A1(:);
[symbols, ~, idx] = unique(A);
counts = accumarray(idx, 1); % per-symbol pixel counts (no for-loop needed)
p = counts/sum(counts) % normalize to probabilities
[dict,avglen]=huffmandict(symbols,p);
comp=huffmanenco(A,dict);
imshow(comp);
Walter Roberson
on 14 Oct 2018
The output of huffmanenco is not an image: it is a double vector with values 0 and 1. You should not be using imshow() on it unless you are prepared for exactly what you got -- a single pixel wide and probably extending to the limits of the screen.
The output of huffmanenco is encoding data, not an image.
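In other words, to see the image again you have to decode first and then undo the A1(:) flattening — a sketch, assuming comp, dict, and the grayscale matrix A1 from the code above:

```matlab
decoded = huffmandeco(comp, dict);            % double vector of symbol values, not bits you can display
decomp  = reshape(uint8(decoded), size(A1));  % restore the original M-by-N shape
imshow(decomp);                               % now a proper grayscale image
```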
Nidhi Kumari
on 20 Oct 2018
huffmandeco(comp,dict) is taking a lot of time. What to do?
Walter Roberson
on 20 Oct 2018
It should be fairly fast for dictionary of reasonable size. The worst case would probably be where the symbols were all close to the same probability as that would produce a tree of maximum size.
Nidhi Kumari
on 21 Oct 2018
Dictionary size is 256x2 cell in my case. And it is taking more than 20 minutes for a 40KB image. So is there any chance that my program is wrong?
Walter Roberson
on 21 Oct 2018
I tested on random data that was 199 x 201 . The huffmandeco took less than 1 second.
Nidhi Kumari
on 26 Oct 2018
a = imread('xyz.jpg');
imshow(a);
A1=rgb2gray(a);
imhist(A1);
[M, N]=size(A1);
A = A1(:);
count = 0:255;
p = imhist(A1) / numel(A1)
[dict,avglen]=huffmandict(count,p) % build the Huffman dictionary
comp= huffmanenco(A,dict); %encode your original image with the dictionary you just built
compression_ratio= (M*N*8)/length(comp) %computing the compression ratio from the actual image size
%%DECODING
Im = huffmandeco(comp,dict); % Decode the code
I11=uint8(Im);
decomp=reshape(I11,M,N);
imshow(decomp);
This is my whole code and I am not able to find the cause of the execution delay. Can you please provide your code for reference?
Walter Roberson
on 26 Oct 2018
Using your code on examples/deeplearning_shared/test.jpg (480 x 640, about 300 KB), the decoding takes about 7 seconds.
Nidhi Kumari
on 26 Oct 2018
I am sorry, but I don't understand what the 'examples/deeplearning_shared/test.jpg' image is or how to access it.
Walter Roberson
on 26 Oct 2018
Edited: Walter Roberson
on 26 Oct 2018
Do you have the Image Processing Toolbox?
If you use
imshow('peppers.png')
then does an image show up?
As of R2018b that particular image was moved to the Deep Learning Toolbox (formerly known as Neural Network Toolbox), and a new image test.jpg was added to that directory.
But since peppers.png was present for quite a number of releases, use that. In my test on peppers.png, the decoding took about 4.3 seconds.
function huff
a = imread('xyz.jpg');
imshow(a);
A1=rgb2gray(a);
imhist(A1);
[M, N]=size(A1);
A = A1(:);
count = 0:255;
p = imhist(A1) / numel(A1);
tic; [dict,avglen]=huffmandict(count,p); toc % build the Huffman dictionary
tic; comp= huffmanenco(A,dict); toc %encode your original image with the dictionary you just built
compression_ratio= (M*N*8)/length(comp); %computing the compression ratio from the actual image size
display(compression_ratio)
%%DECODING
tic; Im = huffmandeco(comp,dict); toc % Decode the code
I11=uint8(Im);
decomp=reshape(I11,M,N);
imshow(decomp);
Nidhi Kumari
on 26 Oct 2018
But when I used peppers.png, it took 6 minutes in total to produce the 'decomp' image, and it is grayscale. What should I do?
OCDER
on 26 Oct 2018
What are your computer's RAM and CPU specs? It could be that there isn't enough physical RAM for processing, so you're swapping to disk, which is much slower. If that's not the issue, use profile to see which step is taking the most time.
profile on
runCode %run your code here
profview
Walter Roberson
on 26 Oct 2018
You need to expect grayscale, since you are doing A1=rgb2gray() and encoding A1.
Is your computer especially slow, or does it have a very small amount of memory? If you are using MS Windows, what does
memory
show?
If you invoke
bench
then where does your system show up in comparison to other systems?
Nidhi Kumari
on 26 Oct 2018
@Walter Roberson
Walter Roberson
on 26 Oct 2018
You are using a version of MATLAB no later than R2017b: I can tell because is_a_valid_code was removed from huffmandeco as of R2018a.
Looking at the amount of memory you have, I suspect you are running R2015b or earlier with a 32 bit MATLAB.
When I test with peppers.png on R2017b on my system, the decoding took about 53 seconds.
Walter Roberson
on 26 Oct 2018
I have a memory of having reported an inefficiency to MATLAB having to do with is_a_valid_code, that it was being called far too often. However, I do not seem to find that in my case list, so perhaps I reported it years ago on a different license.
Anyhow: upgrading to R2018a or later would probably speed up quite a bit.