UniqueTol Exclusion of Similar Points
I am trying to remove detected circles that are too close to one another. My original plan was to loop over all of the centers (X and Y stored in separate columns), compute the distances, and then use those to remove points within a tolerance.
for nn = 1:height(location_table)
    distances = sqrt((location_table(nn,1)-location_table(:,1)).^2 + (location_table(nn,2)-location_table(:,2)).^2);
    too_close = (0 < distances & distances < tolerance);
    location_table(too_close,:) = [];
end
I was never able to get the removal to work. It would end up breaking whenever spots were removed, because the index nn would eventually exceed the current table height.
I discovered the uniquetol command when searching for answers. However, in my use of it, it filters out spots that are not unique (i.e., within the tolerance) but leaves one representative behind. I would want it to remove all spots that are too similar, rather than keeping a representative one.
location_table=uniquetol(location_table, closeness_tolerance, 'ByRows', true, 'OutputAllIndices', true, 'DataScale',1);
I plan to do radial intensity line scans starting at the center of every circle (and proceeding past its perimeter by a variable amount) that makes it through this filtering process. If spots are too close, then the scan (improfile) will cross both the border of the intended spot and a neighboring spot, which will skew the data. For example, circle 4 should be excluded because it is too close to other circles and its scans could be skewed. uniquetol allowed me to exclude the circles near 4, but ideally 4 would not be included either.
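To illustrate the behavior, here is a minimal made-up example (a sketch with invented coordinates, not my real data): two points within the tolerance of each other plus one isolated point.

```matlab
xy = [0 0; 0.05 0; 5 5];   % points 1 and 2 are within tolerance of each other
kept = uniquetol(xy, 0.1, 'ByRows', true, 'DataScale', 1)
% uniquetol keeps one representative of the close pair plus the isolated
% point; what I want is for BOTH close points to be dropped, leaving only [5 5]
```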
Accepted Answer
dpb on 24 Sep 2025 (Edited: 24 Sep 2025)
"...end up breaking if spots were removed because the index nn would eventually get larger than the current table height."
for nn = 1:height(location_table)
    distances = sqrt((location_table(nn,1)-location_table(:,1)).^2 + (location_table(nn,2)-location_table(:,2)).^2);
    too_close = (0 < distances & distances < tolerance);
    location_table(too_close,:) = [];
end
The problem, as you've discovered, is that when you remove one or more rows, the height is reduced, so you can no longer iterate over the original height. The trick in these cases is not to actually delete the row(s) inside the loop, but to mark them as deleted in an auxiliary variable; then, when done, remove the marked rows all at once.
N = height(location_table);
marktodelete = false(N,1);           % preallocate logical flag array
for nn = 1:N
    distances = pdist2(location_table, location_table(nn,:)); % pairwise distances to point nn
    too_close = (distances > 0 & distances < tolerance);
    marktodelete = marktodelete | too_close;                  % accumulate the flags
end
location_table(marktodelete,:) = []; % or: location_table = location_table(~marktodelete,:);
However, wouldn't you want to delete the zero-distance ones as well, based on your description, if there is more than one element in the too_close vector on a given pass?
With uniquetol, you've changed the search criterion to each x and y being within a tolerance separately, rather than the overall Euclidean distance, if that matters to you.
Without a representative dataset to play with, and being too lazy to try to create one <vbg>, I think you would need two steps -- keep the list of unique points as you have, and then locate any remaining points that are within that same tolerance. To do that, you've got to generate a new locations list for the remaining points and then compare those to the pared list.
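One way to sketch that two-step idea in a single call (untested "air code"; it leans on the 'OutputAllIndices' option already used in the question, which returns a cell array of the original indices that fell into each tolerance group): any representative whose group had more than one member was within tolerance of something, so those representatives can be dropped as well.

```matlab
% Step 1: pare the list, keeping the indices of every group member
[pared, ia] = uniquetol(location_table, closeness_tolerance, ...
                        'ByRows', true, 'DataScale', 1, 'OutputAllIndices', true);
% Step 2: drop representatives whose group had more than one original point
% (note uniquetol's tolerance is per-coordinate, not Euclidean distance)
hadNeighbor = cellfun(@numel, ia) > 1;
pared(hadNeighbor, :) = [];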
4 Comments
dpb on 24 Sep 2025 (Edited: 24 Sep 2025)
Ah, yes! Again "air code" -- I forgot the second dimension... I should have recognized that.
One improvement would be to check if the element
location_table(nn,:)
in the loop has already been marked for deletion on a prior pass, and if so, continue, because it is in a group already identified. Oh! Not having that check is what removes the zero-distance one -- when you look at the group from one of its other members, the point that was first in the group is no longer the zero-distance one, so it gets marked then.
Unless the list is exceedingly long, the extra work probably isn't noticeable.
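Putting that refinement together, a sketch (untested "air code" in the same spirit as the answer, assuming location_table is a numeric N-by-2 array as pdist2 expects): once a pass is skipped for already-marked points, the current point must be marked explicitly when it has neighbors, since no later pass will reach it.

```matlab
N = height(location_table);
marktodelete = false(N,1);
for nn = 1:N
    if marktodelete(nn)
        continue    % already part of an identified close group; skip the work
    end
    distances = pdist2(location_table, location_table(nn,:));
    too_close = (distances > 0 & distances < tolerance);
    if any(too_close)
        marktodelete = marktodelete | too_close;
        marktodelete(nn) = true;   % mark nn itself, too -- with the skip in
                                   % place, no later pass will mark it for us
    end
end
location_table(marktodelete,:) = [];
```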