Dear all,
I have rectangular 2D gridded data that I want to interpolate/evaluate at a list of scattered (custom) query points.
The original (sample) grid is 2D (longitude-latitude), and the variable to interpolate (precip) carries several extra dimensions besides the two spatial ones (date, member, horizon).
As MathWorks suggests, it is faster to "create interpolant F and evaluate multiple times" than to "compute interpolation separately". Although preparing the 2D gridded interpolant is pretty straightforward, I'm at a loss as to the best approach when the variable to interpolate is multidimensional.
So far, I can do this:
lon = [-75 -73.5 -72 -70.5 -69];
lat = [46.5 48 49.5 51 52.5 54];
[Sg1, Sg2] = ndgrid(lon, lat);
Query_points = [-74.1  47;      % rows are [lon lat] pairs
                -74.12 46.99;
                ...
for horizon = 1:40
    for member = 1:4
        for date = 1:8000
            % 2D slice (lon x lat) for this date/member/horizon
            temp_precip = squeeze(precip(:,:,date,member,horizon));
            F = griddedInterpolant(Sg1, Sg2, temp_precip);
            Q_precip = F(Query_points);      % evaluate at the scattered query points
            Final_data(:,date) = Q_precip;
        end
    end
end
This approach works, but it feels inefficient (and it takes a few seconds per innermost date loop, so the whole operation runs for several minutes). Is there a way to vectorize or unroll the date loop for faster execution? (The outermost and middle loops have to stay because of other operations I've omitted for the sake of simplicity.)
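For illustration, here is the kind of vectorization I have in mind, although I'm not sure it's the right approach (untested sketch; W, nQ and P are names I made up): since linear interpolation is linear in the gridded values and my query points never change, the interpolation weights could be precomputed once and then applied to all 8000 dates of a member/horizon in a single matrix product.

% Untested sketch: precompute the interpolation weight of every grid node
% at every query point, then interpolate all dates at once.
nLon = numel(lon);  nLat = numel(lat);
nQ = size(Query_points, 1);
W = zeros(nQ, nLon*nLat);
F = griddedInterpolant(Sg1, Sg2, zeros(nLon, nLat));   % default 'linear' method
for k = 1:nLon*nLat
    v = zeros(nLon, nLat);
    v(k) = 1;                     % indicator on grid node k
    F.Values = v;                 % reuse the interpolant, only swap the values
    W(:,k) = F(Query_points);     % weight of node k at every query point
end
for horizon = 1:40
    for member = 1:4
        P = reshape(precip(:,:,:,member,horizon), nLon*nLat, []);   % (nLon*nLat) x 8000
        Final_data = W * P;       % nQ x 8000, all dates in one product
    end
end

But I don't know whether this is idiomatic, or whether there is a built-in way to do the same thing.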
I've also tried building a 3D interpolant by treating the date dimension as a "virtual" third spatial dimension, such as:
lon = [-75 -73.5 -72 -70.5 -69];
lat = [46.5 48 49.5 51 52.5 54];
date = 720000:1:727999;   % one serial date number per time step (8000 values)
[Sg1, Sg2, Sg3] = ndgrid(lon,lat,date);
Query_points = ...   % every 2D point repeated for every date? (see the sketch below)
But this feels unnecessarily bloated: I never evaluate at "in-between" dates (the sample and query dates are identical), and the resulting query array can get very large.
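Concretely, building on the 2D Query_points from above, I think the 3D query would have to look something like this (sketch; qi, dj, Q3 and F3 are just placeholder names, and this would sit inside the member/horizon loops):

% Sketch of the 3D query: every 2D query point paired with every date,
% which is nQ*8000 rows and gets huge quickly.
nQ = size(Query_points, 1);
[qi, dj] = ndgrid(1:nQ, date);                  % all point/date combinations
Q3 = [Query_points(qi(:), :), dj(:)];           % (nQ*8000) x 3
F3 = griddedInterpolant(Sg1, Sg2, Sg3, squeeze(precip(:,:,:,member,horizon)));
Q_precip = reshape(F3(Q3), nQ, []);             % nQ x 8000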
TL;DR: what is the best approach to evaluate, at a fixed set of scattered points, a multidimensional array whose values live on a 2D spatial grid, if not looping over every extra dimension of the data?
Thank you, I've searched around but can't wrap my head around this problem,
Simon.