Plenty of memory, but hard faults seem to bring my code to a standstill

I have a sizeable simulation that includes interpolation of a 3D data set. It is expected to take a long time (10 - 20 min), but lately it seems to be coming to a standstill!
In Resource Monitor, I can see my CPU and disk get hit hard at first, but things really stop once I see a large number of Hard Faults. Then my computer starts to crawl, even with less than 50% of my 16 GB in use.
Are there system settings I can change to avoid this behavior? It is clearly not just a question of "buy more memory or a faster CPU".
My system:
  • Win 7 Pro / 64-bit
  • i7-4720HQ CPU @ 2.60 GHz (8 logical cores)
  • 16.0 GB RAM
  • SSD (main drive, but only 15 GB left)
  • HD (secondary drive, tons of space remaining)
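(A generic sanity check, in case it helps anyone reproduce this: on Windows, MATLAB's memory function reports how much memory MATLAB itself can actually use. This is just a boilerplate snippet, not output from my run.)

  % Windows-only: report how much memory MATLAB itself can get at.
  [userview, systemview] = memory;
  fprintf('Largest possible array: %.1f GB\n', userview.MaxPossibleArrayBytes / 2^30);
  fprintf('Memory used by MATLAB:  %.1f GB\n', userview.MemUsedMATLAB / 2^30);
  fprintf('Physical memory free:   %.1f GB\n', systemview.PhysicalMemory.Available / 2^30);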

Answers (2)

Walter Roberson on 1 Feb 2017
This may sound strange, but you just might get more performance if you add one more row. You might be encountering cache resonance.
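Roughly what I mean, as a contrived sketch (the sizes are invented and the effect is very hardware-dependent): walking across a row of a column-major array strides through memory by the number of rows, and a power-of-two stride can keep landing in the same cache sets.

  % Contrived illustration only -- sizes are made up, results vary by CPU.
  nRows = 4096;                 % power-of-two leading dimension (the stride)
  nCols = 256;
  A = rand(nRows, nCols);       % "resonant" stride of 4096 doubles
  B = rand(nRows + 1, nCols);   % padded by one row: stride of 4097 doubles

  readRow = @(M) sum(M(1, :));  % strided access: one element per column
  fprintf('stride %d doubles: %.6f s\n', size(A, 1), timeit(@() readRow(A)));
  fprintf('stride %d doubles: %.6f s\n', size(B, 1), timeit(@() readRow(B)));

If the padded version is measurably faster, that points to the same kind of conflict-miss behavior.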
  2 Comments
Brian on 1 Feb 2017
I think I understand. Your hypothesis is that there is something idiosyncratic in my simulation that causes it to page through memory in a way that prevents it from making progress.
The simulation has a scaling parameter, so I can increase (or decrease) the number of meta-elements. Running it on two different computers with similar processor / memory, one never completes, while the other completes more than an hour after starting. Both appear to be limited by the hard faults.
Seems like a little more than just the size of my simulation...
Walter Roberson on 2 Feb 2017
Hard faults are at the operating system level.
You mention that you still have a lot of memory left. Thinking about the situation, the one thing that comes to mind at the moment is that Windows generates hard faults when the virtual memory a process needs exceeds the physical memory available to it. That could potentially happen if your work involves large arrays with bad locality (low re-use). I could also imagine problems if your code dynamically resizes arrays (failure to preallocate).
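As a contrived example of the resizing point (the variable names and sizes are made up, not taken from your code):

  % Growing an array element-by-element forces repeated reallocation and
  % copying; preallocating touches the memory once.
  n = 5e6;

  tic
  grown = [];
  for k = 1:n
      grown(k) = sqrt(k);   %#ok<SAGROW>  reallocated as it grows
  end
  toc

  tic
  prealloc = zeros(1, n);   % allocate the full array up front
  for k = 1:n
      prealloc(k) = sqrt(k);
  end
  toc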
But as to why it would happen on one system but not the other: I would have to think about that more.



Image Analyst on 2 Feb 2017
16 GB of RAM is not that much. How big are your 3-D arrays? We sometimes work with 3-D CT data that is like 20 GB.
By the way, you'll see hard faults ( https://en.wikipedia.org/wiki/Page_fault ) even if your code is not causing them. You know that other things are going on in the background on your computer, don't you? They could be causing hard faults too. If you monitor hard faults, do they suddenly and dramatically increase when you run your code? Maybe the computer needs to swap your stuff in and out of RAM to handle other tasks it's doing in the background (backups, virus scans, whatever).
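For a rough sense of scale (the dimensions below are just an example, not your data), a 3-D array of doubles costs 8 bytes per element:

  % Back-of-the-envelope footprint of a 3-D double array.
  sz = [1000 1000 500];             % example dimensions only
  gb = prod(sz) * 8 / 2^30;         % 8 bytes per double
  fprintf('%d x %d x %d doubles is about %.1f GB\n', sz(1), sz(2), sz(3), gb);
  % For an array V already in the workspace:  s = whos('V');  s.bytes / 2^30

And interpolation generally needs room for the query grid and the output on top of the input volume, so peak usage can be a few times the raw array size.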
