I have thousands of formula columns to create, and then I need to delete the formulas from those columns. To speed up the process, I add the formulas in the following way:
dt << Suppress Formula Eval( 1 );
For( ik = 1, ik <= N Items( Newcolname ), ik++, // first loop: add the formula columns
	dt << New Column( Newcolname[ik], Formula( ... ) )
);
dt << Suppress Formula Eval( 0 );
// second loop: delete the formulas one column at a time
For( ik = 1, ik <= N Items( Newcolname ), ik++,
	Try( Column( dt, Newcolname[ik] ) << Delete Formula )
);
Is there a way to send a Delete Formula message to the data table instead of to each column individually? The for loop that deletes the formulas is taking more than 5 additional minutes. Any workaround is highly appreciated; a supercomputer is not an option.
A workaround that comes to mind is to make use of the formula-deleting power of Concatenate(). The code below creates a copy of the data table – including any table variables or scripts – but without formulas.
dt = Current Data Table(); // table with formulas to delete
new_dt = dt << Concatenate( dt_temp = New Table( "temp", New Column( "deleteme" ) ) );
new_dt << Delete Column( "deleteme" );
Close( dt_temp, No Save ); // clean up the temporary helper table
Thank you, MS. Your approach of creating a new table via Concatenate is fabulous: it keeps all table variables, row order, etc., but without the formulas. It helped me eliminate the large amount of extra time that was being taken by deleting formulas. Thanks for the great suggestion.
Hi! Maybe going back a step and finding out why the thousands of columns with formulae are needed might be a better starting point.
I know that calculating values in memory as a matrix is much faster than doing it with data tables. This would also remove the need for the columns in the first place. There was actually a pretty healthy discussion around this point at Discovery this year, with the best practice being to avoid the use of column formulae in scripts wherever possible.
Can you help us understand the use case a little bit more?
I am sure a matrix is much faster than using data tables, but I need the data in a table because I need to use a local data filter to identify very small improvements within splits.
Here are the steps involved:
1. A formula is needed to normalize raw electrical parameters by W and L, e.g., Ion*W/Wa.
2. Spec limits are attached, and the data is converted to appropriate units like pA, uA, nA, mV, mW by multiplying by the appropriate factor.
3. Step 2 is also applied to some of the raw parameters that are needed. This is where the formulas need to be deleted, so that the new columns can be converted independently.
There are also two constraints:
1. We cannot merge the formula and spec-limit files, since the spec limits keep updating, so they must be kept separate.
2. Merging the unit conversion is also not an option.
Understood... all you are asking for is still possible working with the data as a matrix. What I propose is that you get the data as a matrix, do the calculations, and then write the results to a data table. You can then add the spec limits, units, etc. as you were doing, but it takes out the need to add and then remove the formulae from the data table. In general, your slowdown is coming from the modification of the data table itself, which is (I think) basically being rebuilt each time you change a formula.
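A minimal sketch of that flow, using the Ion*W/Wa normalization from the earlier steps. The column names "W" and "Ion", the output column name, and the Wa constant are all placeholders, not names from your actual table:

```jsl
// Sketch: compute a normalized parameter in memory, then write it back
dt = Current Data Table();
w = Column( dt, "W" ) << Get As Matrix;   // assumed column name
ion = Column( dt, "Ion" ) << Get As Matrix; // assumed column name
Wa = 1; // placeholder for the actual normalization constant
norm = ion :* w / Wa; // element-wise multiply, then scale
dt << New Column( "Ion_norm", Numeric, Continuous, Set Values( norm ) );
```

Because Set Values writes plain values, there is no formula on the new column to delete later.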
If you really want to keep it all in the data table, then setting the column formula to Empty() might be faster than deleting it... but you'd have to try that out to know for sure.
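An untested sketch of that alternative, assuming the column names are still collected in the Newcolname list from the original loop:

```jsl
// Sketch: clear each formula in place instead of deleting it
For( ik = 1, ik <= N Items( Newcolname ), ik++,
	Try( Column( dt, Newcolname[ik] ) << Set Formula( Empty() ) )
);
```

As noted above, whether this is actually faster than Delete Formula would need to be timed on your table.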
Hi Michael, your idea of creating formulas using matrices seems very interesting, but I am not sure how much we can leverage this method.
Here is the situation: I have organized my new column names and the corresponding formulas as two columns in an Excel sheet; the left column has the names of the columns to be added to the table, and the right column has many complex formulas.
Now I have to add 100 or 1000 of these formulas using a for loop (as far as I know). The for loop does not know what is a constant and what is a column name, so I use Name Expr to achieve this: Column( dttable, Newcolname[ij] ) << Set Formula( Name Expr( newcol1 ) )
So my question would be: is there a similar way (using Expr) to do the same with matrices, without getting the column values into a matrix via << Get As Matrix?
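For reference, the Name Expr loop described above might look like the sketch below. Newcolname and FormulaText are assumed to be lists already read in from the two Excel columns (names on the left, formula strings on the right):

```jsl
// Sketch: turn each formula string into an expression and attach it unevaluated
For( ij = 1, ij <= N Items( Newcolname ), ij++,
	newcol1 = Parse( FormulaText[ij] );      // formula string -> expression
	dttable << New Column( Newcolname[ij] );
	Column( dttable, Newcolname[ij] ) << Set Formula( Name Expr( newcol1 ) );
);
```

Name Expr passes the expression itself rather than its evaluated value, which is why column references inside the formula survive the loop intact.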