Hi,
I wrote some code to implement a rudimentary lookup table for a voltage/time dataset, but because of the size of my dataset it is freezing up.
Quick background: there are around 1,000 rows (time intervals) per run and around 1,500 runs, so about 1.5M rows in total. In a nutshell, this is what my code does before I start using the lookup table:
- Process dataset
- Calculate dV/dt from data
- Use Tabulate to find the peak dV/dt for each individual run and save the result in a new data table called dt2 (see the sketch after this list).
- Create a new column in dt1 to hold the peak dV/dt value next to each dV/dt value in each run.
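For reference, here is roughly what the dV/dt and peak-finding steps look like in script form. I actually did the peak step interactively with Tabulate, but I think a Summary call is about equivalent, and :time and :voltage are stand-ins for my real column names:

    // Rough sketch of the setup steps; :time and :voltage are placeholder names
    dt1 << New Column( "dvdt_abs",
        Numeric, Continuous,
        Formula( Abs( Dif( :voltage ) / Dif( :time ) ) ) // row-to-row slope; first row of each run is not meaningful
    );
    // One row per run holding the peak |dV/dt| (I renamed the stat column to dvdt_abs_max)
    dt2 = dt1 << Summary(
        Group( :sample, :temp, :cell, :vicl, :v_bg ),
        Max( :dvdt_abs )
    );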
At this point my code goes row by row through dt1, cross-checking the test parameters that define each run against dt2, finding that run's peak dV/dt, and writing it back to the current row in dt1.
My code for searching the lookup table looks like this:
Current Data Table( dt1 );
For Each Row(
    v = :sample;
    w = :temp;
    x = :cell;
    y = :vicl;
    z = :v_bg;
    // Find the row in dt2 that matches v/w/x/y/z
    row = dt2 << Get Rows Where( :sample == v & :temp == w & :cell == x & :vicl == y & :v_bg == z );
    // Get the values from the dV/dt peak column in dt2
    col = Column( dt2, "dvdt_abs_max" ) << Get Values;
    // Pick out the value at row[n] from that column and save it into the active row in dt1
    :dVdt_peak = col[row];
);
This code worked when I tested it on a small set, but once I ran it on my complete dataset, the infinite spinning wheel appeared. I'm guessing the problem is that the loop searches dt2 and pulls the whole dvdt_abs_max column again for every one of the 1.5M rows in dt1. I'm pretty new to this and am sure there must be a better, more efficient way to do this; I just haven't figured it out.
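The only alternative I've come up with so far is replacing the whole loop with a matched table update, something like the sketch below, but I haven't confirmed it does what I want on the real data:

    // Untested idea: pull dvdt_abs_max from dt2 into dt1 in one matched update
    // instead of searching dt2 once per row of dt1
    dt1 << Update(
        With( dt2 ),
        Match Columns(
            :sample = :sample,
            :temp = :temp,
            :cell = :cell,
            :vicl = :vicl,
            :v_bg = :v_bg
        )
    );

Would that be the right direction, or is there a better pattern for this kind of lookup?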
Thanks