Reading a large number of files


TheKKK (Mechanical), Mar 22, 2009
I have to read and extract data from a large number of files:

file1.dat
file2.dat
file3.dat
.........
file100.dat

Is there a fast way of reading all these files one at a time?

My simple idea is to write all the filenames in a text file and then read one filename at a time from it. If anyone knows a faster solution, I'm glad to hear it.
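
Roughly, I mean something like this (just a sketch; files.txt is a placeholder name for the list, with one filename per line):

program read_list
  implicit none
  character(len=256) :: fname
  integer :: ios

  open(unit=10, file='files.txt', status='old')
  do
     read(10, '(a)', iostat=ios) fname      ! one filename per line
     if (ios /= 0) exit                     ! stop at the end of the list
     open(unit=20, file=trim(fname), status='old')
     ! ... read and extract the data from this file ...
     close(20)
  end do
  close(10)
end program read_list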

Thanks
 
The speed of reading files usually depends on the I/O method, not on how you obtain the filenames. That said, since your filenames are sequential, you can just store the common root and append the number to build each new filename.

As for reading the files themselves, unformatted (binary) files are preferable, since they read much faster than formatted text.
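
Something along these lines might do it (just a sketch; the root 'file', the range 1 to 100, and the I0 edit descriptor, which needs a Fortran 95 or later compiler, are taken from your example and may need adjusting):

program sequential_names
  implicit none
  character(len=64) :: fname
  integer :: i, ios

  do i = 1, 100
     write(fname, '(a,i0,a)') 'file', i, '.dat'   ! builds file1.dat ... file100.dat
     open(unit=20, file=trim(fname), status='old', iostat=ios)
     if (ios /= 0) cycle                          ! skip any missing file
     ! ... read and extract the data ...
     close(20)
  end do
end program sequential_names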

TTFN

FAQ731-376
 
What if my filenames don't have a common root?

If I have:

GR01.dat
GR07.dat
FR01.dat
FR02.dat
IT01.dat
IT06.dat
........

Is there a way Fortran could read all of these one by one if I don't want to write all the filenames into a .txt file myself?

Thanks!
 
Presumably your data files have some unique defining characteristic, such as all having a .dat filename extension and being the only .dat files in the directory.

If so, have your program run a directory listing that writes the file names into a new file (with a different extension). Once that file has been created, your program can open it, read the .dat filenames one by one, and open each of them to extract the stuff you want.

In good ol' DOS, the directory operation would have been
DIR *.DAT /B > FILES.LST
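
With a reasonably recent compiler the program can issue that command itself through the EXECUTE_COMMAND_LINE intrinsic (Fortran 2008; many older compilers offer a non-standard SYSTEM call instead). A rough sketch of the whole sequence, assuming the DOS command above:

program list_then_read
  implicit none
  character(len=256) :: fname
  integer :: ios

  ! write a bare listing of the .dat files into FILES.LST
  call execute_command_line('DIR *.DAT /B > FILES.LST')

  open(unit=10, file='FILES.LST', status='old')
  do
     read(10, '(a)', iostat=ios) fname
     if (ios /= 0) exit
     open(unit=20, file=trim(fname), status='old')
     ! ... extract whatever you need from this file ...
     close(20)
  end do
  close(10)
end program list_then_read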
 