How could I correct for glare when processing image?
I'm working on code to track the position of debris. The code first converts the image to grayscale and then to binary. It then takes the median of the x-coordinates of all black pixels, which gives the approximate x position of the debris (since the debris is darker than the water). However, when light reflects off the block, that portion does not get picked up, shifting the calculated position. Is there a way to adjust for this?
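For reference, a minimal sketch of the pipeline described above (the filename, and the use of imbinarize for the threshold, are assumptions):

```matlab
% Sketch of the described pipeline: grayscale -> binary -> median x of dark pixels.
rgbImage = imread('frame.png');        % 'frame.png' is a hypothetical frame
grayImage = rgb2gray(rgbImage);
binaryImage = imbinarize(grayImage);   % dark debris pixels become false
[~, debrisCols] = find(~binaryImage);  % column indices of the dark pixels
xPosition = median(debrisCols);        % approximate x position of the debris
```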

Answers (1)
Image Analyst
on 9 Dec 2018
It's so much easier to prevent it in the first place than to correct it, which can't be done perfectly and without artifacts. Why don't you just use crossed polarizers, or HDR photography?
Ephraim Bryski
on 9 Dec 2018
Image Analyst
on 9 Dec 2018
Well then let's see if we can develop an algorithm that doesn't care about the specular reflection. What do you need to know about that falling blue bar?
By the way, it's very cheap to put a polarizer in front of your flash, hold the blue bar at that location, and put another rotatable polarizer in front of your lens, and then spin the lens polarizer until the specular reflection on the blue bar disappears. Maybe $100 or so. How much were you thinking it would cost? Tens of thousands? No, it's a method for people with a very low budget like you.
Ephraim Bryski
on 9 Dec 2018
Image Analyst
on 9 Dec 2018
Maybe, but have you considered taking the convex hull with bwconvhull() and then finding the centroid with regionprops?
mask = bwconvhull(mask);
% Extract the largest only
mask = bwareafilt(mask, 1);
% Find centroid
props = regionprops(mask, 'Centroid');
xCentroid = props.Centroid(1);
yCentroid = props.Centroid(2);
% Plot red crosshairs at centroid location
hold on
plot(xCentroid, yCentroid, 'r+', 'MarkerSize', 30);
Ephraim Bryski
on 10 Dec 2018
Image Analyst
on 10 Dec 2018
It looks like your segmentation is not good. Are you able to take a blank shot of just the background with no objects or debris in there? That would help immensely because you could subtract the background to see what pixels are non-background pixels.
Ephraim Bryski
on 11 Dec 2018
Edited: Ephraim Bryski
on 12 Dec 2018
Image Analyst
on 12 Dec 2018
Just detect the difference
diffImage = abs(double(testImage) - double(backgroundImage));
mask = diffImage > 5; % Get pixels where there is more than 5 gray level difference between the two.
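Putting this together with the convex-hull idea from earlier, a hedged sketch (the 5-gray-level threshold, the blob cleanup order, and the variable names are assumptions):

```matlab
% Background subtraction, blob cleanup, and centroid extraction.
% testImage and backgroundImage are grayscale frames of the same scene.
diffImage = abs(double(testImage) - double(backgroundImage));
mask = diffImage > 5;            % pixels differing by more than 5 gray levels
mask = bwareafilt(mask, 1);      % keep only the largest blob (assumed to be the debris)
mask = bwconvhull(mask);         % fill in any glare holes with the convex hull
props = regionprops(mask, 'Centroid');
xCentroid = props.Centroid(1);   % approximate x position of the debris
```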
Ephraim Bryski
on 12 Dec 2018
Image Analyst
on 12 Dec 2018
Are they still frames or do you have a video? Can you use optical flow?
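If video is available, one possible optical-flow sketch using the Computer Vision Toolbox (the Farneback method, the filename, and the flow-magnitude threshold are all assumptions, not something from this thread):

```matlab
% Estimate motion between consecutive frames with Farneback optical flow.
% Requires the Computer Vision Toolbox; 'debris.avi' is a hypothetical file.
reader = VideoReader('debris.avi');
opticFlow = opticalFlowFarneback;
while hasFrame(reader)
    frameGray = rgb2gray(readFrame(reader));   % frames assumed RGB
    flow = estimateFlow(opticFlow, frameGray);
    % Pixels with large flow magnitude are likely the moving debris.
    movingMask = flow.Magnitude > 1;           % threshold is an assumption
end
```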
Ephraim Bryski
on 12 Dec 2018
Image Analyst
on 12 Dec 2018
It seems the problem is that the debris changes the background, like casting shadows or reflections. Or else your lighting in the test image is not the same as when you took the background image. Is that correct? Can you simply change the background to something contrasting and uniform, like a solid black velvet wall, or a green or red wall?
Ephraim Bryski
on 12 Dec 2018
Image Analyst
on 12 Dec 2018
Then you may just have to fall back on manual tracing since you have so little time - not enough to develop a robust algorithm. See my two attached demos for imfreehand.
If you just want to count things, you could use impoint() or ginput() and count the number of times the user clicked on debris objects. Use imfreehand, or its newer replacement drawfreehand(), if you need areas of objects.
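A minimal sketch of click-counting with ginput (the image variable is an assumption):

```matlab
% Let the user click once on each piece of debris, then count the clicks.
imshow(rgbImage);                  % rgbImage is a hypothetical frame
title('Click each debris object, then press Enter');
[xClicks, yClicks] = ginput;       % pressing Enter ends the selection
numDebris = numel(xClicks);
fprintf('Counted %d debris objects.\n', numDebris);
```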