In addition to the other answers (JSL and dashboard size play no role compared to the data size, and there are several ways to perform your task), I highly recommend providing the data as a compressed .jmp file instead of a csv, if the data can be prepared once and used by many. Make sure the preferences are set accordingly, or send the table a "Compress File When Saved( 1 )" message to make it effective. The file size shrinks to roughly 20 % to 60 % of the csv size, depending on the data types (see the script below). Pulling several GB over the network is no fun.
Names Default To Here( 1 );
// Script generates random data, saves it as a compressed jmp file and as csv, and compares the file sizes
// Random Normal()     = high complexity, jmp file shrinks to ~60 % of the csv size
// Random Integer( 3 ) = low complexity, jmp file shrinks to ~20 % of the csv size
form_expr = Expr( Random Normal() );
// form_expr = Expr( Random Integer( 3 ) );
dt = New Table( "RandomData", Add Rows( 1e5 ), New Column( "col", Set Each Value( form_expr ) ) );
// add 99 more columns filled by the same random expression (100 columns total)
For( i = 1, i < 100, i++,
	dt << New Column( "col", Set Each Value( form_expr ) )
);
// make sure this table is compressed on save, independent of the preference setting
dt << Compress File When Saved( 1 );
jmp_path = "$TEMP/delete_me_test.jmp";
csv_path = "$TEMP/delete_me_test.csv";
dt << Save As( jmp_path );
dt << Save As( csv_path );
Print( "file size ratio JMP vs CSV: " || Char( Round( File Size( jmp_path ) / File Size( csv_path ) * 100 ) ) || "%" );
Close( dt, NoSave );
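
On the receiving side nothing special is needed; Open() reads a compressed .jmp transparently. A minimal sketch, assuming the temp path from the script above:

Names Default To Here( 1 );
// open the compressed jmp file (same path the script above saved to)
dt_received = Open( "$TEMP/delete_me_test.jmp" );
Show( N Rows( dt_received ), N Cols( dt_received ) );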
Georg