[Suggestion] - Aggregating geographical data over a map

Sara Egidi on 3 Oct 2016
Commented: KSSV on 3 Oct 2016
Hi everyone, I would like to understand what the best approach for my problem is; as far as I understand, there are plenty of options! I have a dataset of geolocated data over time. It is a humongous dataset: in a single minute I can get over 90 records, each consisting of latitude, longitude, and intensity.
Since I wanted to build a prototype first, I sampled the data and constructed an n×n matrix (400×400, for efficiency reasons), into which I would accumulate each record's lat-lon pair (transformed into the grid's coordinates) and intensity. I wanted to produce a plot over time, so I preferred to plot the matrix as a surface every, say, 10 minutes. To do so, I would read the data file in batches and plot the records.
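The batch-binning step could look roughly like this (a minimal NumPy sketch of the idea — grid size, city bounds, and array layout are all placeholder assumptions; in MATLAB the same accumulation is typically done with `accumarray`):

```python
import numpy as np

def bin_records(lat, lon, intensity, n=400,
                lat_range=(45.0, 45.5), lon_range=(9.0, 9.5)):
    """Accumulate point intensities onto an n-by-n grid.

    lat_range / lon_range are hypothetical bounds for the city area.
    """
    # Map coordinates to integer grid indices in [0, n-1]
    r = ((lat - lat_range[0]) / (lat_range[1] - lat_range[0]) * (n - 1)).astype(int)
    c = ((lon - lon_range[0]) / (lon_range[1] - lon_range[0]) * (n - 1)).astype(int)
    grid = np.zeros((n, n))
    # np.add.at sums duplicate hits landing in the same cell
    # (the NumPy counterpart of MATLAB's accumarray)
    np.add.at(grid, (r, c), intensity)
    return grid
```

Two records falling into the same cell simply sum, which matches the "double the height" behaviour described below.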
Now, this is obviously not scalable. I'm trying to move from the prototype to the real scenario, but there is simply too much data and it is too dense, and I understand there are other options, like the Mapping Toolbox, that could suit this problem better. The real issue is that since the data is spread over a big city, the matrix should ideally be a lot larger than 400×400. I tried to preprocess the data and discretize time and location in advance, but keeping that matrix in memory seems to slow the process down a lot. So maybe the matrix isn't the ideal solution!
The records, however, are not "simple" marks left on the surface: each record is represented as a Gaussian over the matrix, and two marks at the same position correspond to a Gaussian of double the height. As time advances, those marks kind of evaporate. This is why I have kept a matrix for the Z component until now: I simply sum the Gaussians' intensities and subtract a certain constant to simulate the evaporation of the marks.
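The "stamp a Gaussian, let it evaporate" scheme can be sketched like this (again a NumPy illustration, not MATLAB code; kernel size, sigma, and the decay rate are made-up placeholders — in MATLAB the stamping step is usually a `conv2` or direct submatrix addition):

```python
import numpy as np

def gaussian_kernel(size=9, sigma=1.5):
    """Small 2-D Gaussian bump with peak value 1."""
    ax = np.arange(size) - size // 2
    xx, yy = np.meshgrid(ax, ax)
    return np.exp(-(xx**2 + yy**2) / (2 * sigma**2))

def stamp(grid, row, col, intensity, kernel):
    """Add a scaled Gaussian centred at (row, col), clipped at the
    grid borders; overlapping stamps sum, so two marks at the same
    spot give a Gaussian of double the height."""
    k = kernel.shape[0] // 2
    r0, r1 = max(row - k, 0), min(row + k + 1, grid.shape[0])
    c0, c1 = max(col - k, 0), min(col + k + 1, grid.shape[1])
    grid[r0:r1, c0:c1] += intensity * kernel[r0 - (row - k):r1 - (row - k),
                                             c0 - (col - k):c1 - (col - k)]

def evaporate(grid, rate=0.05):
    """Subtract a constant per time step, clamping at zero."""
    np.subtract(grid, rate, out=grid)
    np.clip(grid, 0.0, None, out=grid)
```

Clamping at zero keeps fully evaporated cells from going negative, which a plain constant subtraction would otherwise do.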
Would you recommend any function/tool for this problem? Any kind of advice is kindly appreciated!

Sara Egidi on 3 Oct 2016
The data all gets added up into the same matrix, so I only keep that last one. The real problem is just that the matrix should actually be bigger, and maybe instead of a 10000×10000 matrix it would be better to use a tool such as the Mapping Toolbox?
KSSV on 3 Oct 2016
I had to plot a 6666×4710 matrix for 1500 time steps, and MATLAB was very slow at that. So I shifted to GMT for plotting such huge data.
KSSV on 3 Oct 2016
And if you feel you are not losing your data/information, you can always downsample your data and plot in MATLAB.
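Downsampling before plotting can be as simple as averaging non-overlapping blocks (a NumPy sketch under the assumption that the factor divides the grid size evenly; MATLAB's `imresize` with an appropriate method does the same job):

```python
import numpy as np

def downsample(grid, factor):
    """Reduce an (n, n) grid by averaging non-overlapping
    factor-by-factor blocks; n must be divisible by factor."""
    n = grid.shape[0]
    assert n % factor == 0, "factor must divide the grid size"
    return grid.reshape(n // factor, factor,
                        n // factor, factor).mean(axis=(1, 3))
```

A 10000×10000 grid downsampled by a factor of 25 becomes a 400×400 surface, which is back in the range the prototype already handled.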