KST-CPT
Level II

End Of Field(other)

I have a file (essentially a text file, but with .log as the extension). It is comma separated and contains both character and numeric data. Every now and then I get an ID number with a comma in it, which throws the columns off. Looking at the data, I see that all the values I actually want to separate have a space after the comma, so I want to separate the columns with something like this:

 

For( i = 1, i <= nf, i++,
    filenow = filelist[i];
    fileopen = filepath || filenow;
    dt = Open( fileopen, "text",
        End Of Field( Other ), EOF Other( ", " ),
        Table Contains Column Headers( 0 )
    );
);

 

However, the data table is still comma delimited, so the values I do not want split are separated into two columns. Is there a way to ignore the comma in a value such as 12345,6 but keep only the delimiters that have a space after them (", ")?

 

Thanks

 

 

2 REPLIES
vince_faller
Super User (Alumni)

Re: End Of Field(other)

You could preprocess the text file itself to change all of the ", " delimiters to tabs, then just import it like a TSV:

 

Names Default To Here( 1 );
file = Pick File();
txt = Load Text File( file );

// Replace the real delimiters (comma followed by a space) with tabs
Substitute Into( txt, ", ", "\!t" );

// Write the cleaned text to a temporary file
newfile = Convert File Path( "$TEMP/tempfile.log" );
Save Text File( newfile, txt );

// Import the preprocessed file as tab-delimited text
dt = Open( newfile, "text",
    End Of Field( "Tab" ),
    Table Contains Column Headers( 0 )
);
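If it helps, here is a rough, untested sketch of how that preprocessing step could be folded back into the loop from the original question; it assumes filepath, filelist, and nf are defined as they are there:

Names Default To Here( 1 );
For( i = 1, i <= nf, i++,
    filenow = filelist[i];
    fileopen = filepath || filenow;

    // Turn the ", " delimiters into tabs before importing
    txt = Load Text File( fileopen );
    Substitute Into( txt, ", ", "\!t" );
    newfile = Convert File Path( "$TEMP/tempfile.log" );
    Save Text File( newfile, txt );

    // Import each preprocessed file as tab-delimited
    dt = Open( newfile, "text",
        End Of Field( "Tab" ),
        Table Contains Column Headers( 0 )
    );
);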
Vince Faller - Predictum
KST-CPT
Level II

Re: End Of Field(other)

That worked very nicely, thanks!