Hi everyone, I would like to understand what the best approach for my problem is — as far as I understood there are plenty of options! I have a dataset of geolocated data over time. This is a humongous dataset: in a single minute I can get over 90 records, each consisting of latitude, longitude and intensity.
Since I wanted to build a prototype first, I sampled the data and constructed an n-by-n matrix (400x400 for efficiency reasons), where I plot each record's lat-lon pair (transformed into the grid's coordinates) and intensity. I wanted to produce a plot over time, so I preferred to plot the matrix as a surface every, say, 10 minutes. To do so, I read the data file in batches and plot the records.
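To make the prototype concrete, this is roughly the lat-lon-to-grid transformation I'm doing, sketched here in Python/NumPy for clarity (the bounding box values and the `to_grid` name are just placeholders, not my real data):

```python
# Hypothetical bounding box of the city (placeholder values).
LAT_MIN, LAT_MAX = 45.40, 45.55
LON_MIN, LON_MAX = 9.10, 9.30
N = 400  # grid resolution used in the prototype

def to_grid(lat, lon, n=N):
    """Map a (lat, lon) pair to (row, col) indices in an n x n grid."""
    row = int((lat - LAT_MIN) / (LAT_MAX - LAT_MIN) * (n - 1))
    col = int((lon - LON_MIN) / (LON_MAX - LON_MIN) * (n - 1))
    # Clamp records that fall slightly outside the bounding box.
    return min(max(row, 0), n - 1), min(max(col, 0), n - 1)
```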
Now, this is obviously not scalable. I'm trying to move from the prototype to the real-case scenario, but there is just too much data and it is too dense, and I understand there are other tools, like the Mapping Toolbox, that could suit this problem better. The real issue is that since the data is spread over a big city, the matrix should ideally be a lot bigger than 400x400. I tried to preprocess the data and discretize time and location in advance, but keeping that matrix in memory seems to slow the process down a lot. So maybe the matrix isn't the ideal solution!
The records, however, are not "simple" marks left on the surface: each record is represented as a Gaussian over the matrix, and two marks in the same position correspond to a Gaussian of double the height. As time advances, those marks kind of evaporate. This is why I have kept a matrix for the Z component until now: I simply sum the Gaussians' intensities and subtract a certain constant to simulate the evaporation of the marks.
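In case it helps clarify what I mean, here is a minimal sketch of that update step (again in Python/NumPy; the kernel width `SIGMA` and the decay constant `EVAPORATION` are made-up parameters):

```python
import numpy as np

N = 400
SIGMA = 3.0        # Gaussian spread in grid cells (assumption)
EVAPORATION = 0.1  # constant subtracted per time step (assumption)

# Precompute a small Gaussian kernel to "stamp" onto the grid.
R = int(3 * SIGMA)  # kernel radius
y, x = np.mgrid[-R:R + 1, -R:R + 1]
KERNEL = np.exp(-(x**2 + y**2) / (2 * SIGMA**2))

def add_record(grid, row, col, intensity):
    """Add one record's Gaussian (scaled by intensity) onto the grid.
    Overlapping records simply sum, so two marks in the same place
    produce a peak of double the height."""
    n = grid.shape[0]
    r0, r1 = max(row - R, 0), min(row + R + 1, n)
    c0, c1 = max(col - R, 0), min(col + R + 1, n)
    # Take the matching slice of the kernel for records near the border.
    grid[r0:r1, c0:c1] += intensity * KERNEL[r0 - row + R:r1 - row + R,
                                             c0 - col + R:c1 - col + R]

def evaporate(grid):
    """Decay all marks by a constant per time step, clipping at zero."""
    np.maximum(grid - EVAPORATION, 0.0, out=grid)
```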
Would you recommend any function/tool for this problem? Any kind of advice is greatly appreciated!