I have 605 data files that I need to concatenate after filtering out the data I don't need from each one. My loop opens each file, deletes the unwanted rows, concatenates the current file onto the first file, then closes the current file. This works fine for the first ten or so files but grows dramatically slower with each additional file. Any suggestions?
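In case it helps, here is a minimal sketch of the loop. I've written it in Python/pandas purely for illustration (my actual environment may differ), and swapped real file reads for small in-memory tables so it runs standalone; the `keep` column is a placeholder for whatever filter I actually apply.

```python
import pandas as pd

# Placeholder for opening one of the 605 files: returns a small
# table with a "keep" flag standing in for my real filter.
def fake_read(i):
    return pd.DataFrame({"file": i, "keep": [True, False, True], "value": [1, 2, 3]})

result = None
for i in range(50):                        # 605 in the real job
    df = fake_read(i)                      # open the current file
    df = df[df["keep"]]                    # delete the rows I don't need
    # Appending one file at a time re-copies everything accumulated
    # so far, which is where I suspect the slowdown comes from.
    result = df if result is None else pd.concat([result, df], ignore_index=True)
```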
I've also noticed that each time a file is concatenated, its source script gets added to the data table. Could this be what's slowing things down? Is there a way to prevent these from accumulating, or to delete them?