Hi Evan,
There are a number of ways you could go about this. The specific challenge with your duplicate removal is that you want to drop 'duplicates' of the same alarm that occur within an hour of each other. That calls for lag-style logic (each row is compared to the previous one), which in turn requires the data to be sorted appropriately before the comparison runs. In addition, if you delete a row while the loop is still running, later iterations can hit out-of-bounds row references.

One workaround is to flag rows for removal rather than deleting them immediately; the flag can be a new boolean column or one of the built-in row states. Below I've modified your code to use row selection as that flag: the loop only marks each duplicate row as selected, and all of the selected rows are deleted in one step after the loop finishes. A further speed-up comes from opening the table invisibly, which avoids the screen I/O of redrawing the table while the script runs. (A sketch of the boolean-column variant is at the end of this reply.)
/* data table made invisible to speed up execution */
dt = Open( "SR_Loop_LAM_002_Alarm_Log.csv" , invisible );
/* updated for a multiple-column sort: entity first, then alarm date, so potential duplicates
   sit on adjacent rows; Replace Table( 1 ) sorts in place so dt keeps pointing at the sorted table */
dt << Sort( By( :ENTITY, :ALARM_DATE ), Order( Ascending, Ascending ), Replace Table( 1 ) );
/* clear any existing row and column selection first (just a good coding habit) */
dt << Clear Select;
dt << Clear Column Selection;
/* select the rows that meet the duplicate conditions;
   the loop runs from the bottom row up and stops after i = 2, because each pass
   compares row i with row i - 1 (there is no row 0 to compare against)
*/
For( i = N Rows( dt ), i > 1, i--,
    If(
        :ENTITY[i - 1] == :ENTITY[i] &
        :ALARM_ID[i - 1] == :ALARM_ID[i] &
        Date Difference( :ALARM_DATE[i - 1], :ALARM_DATE[i], "hour", "fractional" ) < 1,
        dt << Select Rows( i )
    )
);
/* delete the selected rows */
dt << Delete Rows;
/* final sort */
dt << Sort( By( :ALARM_DATE ), Order( Descending ), Replace Table( 1 ) );
dt << Save( "SR_Loop_LAM_002_Alarm_Log_cleansed.csv" );
Close( dt, No Save );
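As an aside, here is roughly what the boolean-column version of the same flag idea could look like. This is only a minimal sketch under the same assumptions as the script above (the same ENTITY, ALARM_ID and ALARM_DATE columns and file name); the helper column DupFlag is just a name I made up for illustration, so test it against your own data before relying on it.

/* sketch: flag duplicates in a helper column, then delete all flagged rows at once */
dt = Open( "SR_Loop_LAM_002_Alarm_Log.csv", invisible );
dt << Sort( By( :ENTITY, :ALARM_DATE ), Order( Ascending, Ascending ), Replace Table( 1 ) );
/* "DupFlag" is a hypothetical helper column used only as the duplicate flag */
dt << New Column( "DupFlag", Numeric, "Nominal", Set Each Value( 0 ) );
For( i = N Rows( dt ), i > 1, i--,
    If(
        :ENTITY[i - 1] == :ENTITY[i] &
        :ALARM_ID[i - 1] == :ALARM_ID[i] &
        Date Difference( :ALARM_DATE[i - 1], :ALARM_DATE[i], "hour", "fractional" ) < 1,
        :DupFlag[i] = 1
    )
);
/* gather the flagged row numbers and delete them in a single pass */
dupRows = dt << Get Rows Where( :DupFlag == 1 );
If( N Rows( dupRows ) > 0,
    dt << Delete Rows( dupRows )
);
dt << Delete Columns( "DupFlag" );

Either way the idea is the same: mark first, delete once, so the row numbers stay valid for the whole loop.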