Hi @RedMonster796,
Do you actually have to review every single data point? Is each 0.1 s reading really that important, or could you get by with cutting back to one reading every 10 s or so? That certainly is a lot of data to load into memory, but I'm honestly not sure that spending $17k on a processor is going to be worth it.
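If coarser sampling turns out to be good enough, a quick JSL sketch like this could thin the table before you analyze it. It assumes the table of interest is the current data table and that the raw readings really are 0.1 s apart, so keeping every 100th row gives you a 10 s step:

```jsl
// Minimal downsampling sketch -- assumes the current data table holds
// the raw 0.1 s readings, so every 100th row ~ one reading per 10 s.
dt = Current Data Table();

dt << Clear Select;
dt << Select Where( Mod( Row(), 100 ) == 1 );  // keep rows 1, 101, 201, ...

dtSmall = dt << Subset( Selected Rows( 1 ) );  // new table built from the selection
dtSmall << Set Name( "Downsampled 10s" );
```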
Definitely go for a fast, multi-core processor, but the processor alone isn't the whole story. You should also consider how much RAM your machine will need. I have 16GB in my work notebook and haven't had any problems with files that are a million rows tall. However, with data sets of 72 million rows and about 3GB on disk, I'd go to at least 32GB of RAM -- in which case you'll want to make sure that both your motherboard and processor can actually address all 32GB, otherwise you're paying for resources your computer can't use.
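For a rough sense of scale, here's a back-of-the-envelope estimate you can run in JSL -- the 10-column count is purely a guess on my part, so swap in your real number of columns:

```jsl
// Back-of-the-envelope memory estimate -- the column count is a guess.
rows  = 72e6;            // ~72 million rows
cols  = 10;              // hypothetical number of numeric columns
bytes = rows * cols * 8; // 8 bytes per numeric value
Show( bytes / (1024 ^ 3) );  // ~5.4 GB before JMP overhead and analysis copies
```

That's just the raw values; the working set during analysis will be noticeably larger, which is why 16GB starts to feel tight.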
Another thing to keep in mind is your GPU. Since you're wanting to visualize up to maybe 72 million rows, you should probably get a better GPU than the standard default the system would come with. That way the graphical display of the data isn't competing with the CPU for the statistical number-crunching.
Again, you'll want to make sure the motherboard can handle a higher-end GPU, just as you'd verify compatibility for the CPU and RAM.
I'd still come back to the question of whether every 0.1 s, 24/7, is really necessary for your analysis. Alternatively, if you can identify any kind of trigger points in your data, you could run a JSL script that creates a smaller data table saving -- I don't know -- 1000 data points before and 1000 data points after each trigger event. It seems to me that the majority of the time you probably don't need to evaluate things every 0.1 s and can get away with a larger time step. But that's just a guess, as I don't know what kind of system you're dealing with or how crucial each data point might be.
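Something along these lines might be a starting point for the trigger idea. It's only a sketch: the column name :Sensor, the threshold of 100, and the 1000-row window are all placeholders you'd swap for whatever actually defines a trigger in your data.

```jsl
// Sketch: pull 1000 rows before/after each trigger event into a smaller table.
// :Sensor and the threshold are placeholders for your own trigger logic.
dt  = Current Data Table();
win = 1000;

trigRows = dt << Get Rows Where( :Sensor > 100 );  // row numbers of trigger events

dt << Clear Select;
For( i = 1, i <= N Rows( trigRows ), i++,
	r = trigRows[i];
	For( j = Max( 1, r - win ), j <= Min( N Rows( dt ), r + win ), j++,
		Selected( Row State( j ) ) = 1  // flag every row in the window
	)
);

dtTrig = dt << Subset( Selected Rows( 1 ) );  // smaller table with just the windows
dtTrig << Set Name( "Trigger Windows" );
```

From there you could save the smaller table and do your plotting and statistics on it instead of the full 72-million-row file.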
Hope this helps!
DS