How can I avoid memory problems while processing a huge table?
I have a huge observation table with around 30 lakh (3 million) rows and 12 columns. While training a KNN classifier in R2016a, I am getting out-of-memory errors. Is there any way to avoid this? I have tried reducing the number of rows, but that degrades the output quality.
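For reference, the training call looks roughly like this; the table and variable names, response name, and k value are only illustrative, not my actual code:

% Rough sketch of the training step (names are placeholders).
% obsTable: ~30 lakh rows x 12 columns, one row per pixel.
mdl = fitcknn(obsTable, 'Label', 'NumNeighbors', 5);   % fails with an out-of-memory error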
Each row in the table is a pixel, and the remaining columns hold that pixel's feature values. One MRI scan set contains around 20 images of 512x512, and I load one set to create the observation table. Is there another way to pass a large amount of data to the KNN classifier?
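For context, the table is assembled roughly as sketched below; the file names and the feature-extraction helper extractPixelFeatures are placeholders, not my real code:

% Illustrative sketch of building the observation table from one MRI set.
nSlices = 20;                                    % images per set, each 512x512
rowsPerSlice = 512*512;
X = zeros(nSlices*rowsPerSlice, 12);             % 12 features per pixel
for k = 1:nSlices
    I = imread(sprintf('slice_%02d.png', k));    % placeholder file names
    F = extractPixelFeatures(I);                 % hypothetical helper, returns (512*512)-by-12
    X((k-1)*rowsPerSlice + (1:rowsPerSlice), :) = F;
end
obsTable = array2table(X);                       % per-pixel class labels are added as a Label column before training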