A very interesting discussion among experts about optimizing string operations, avoiding memory fragmentation, etc. Though that topic interests me too, it is not what I am addressing at the moment.
The discussion appears to have started with Mr. Paquito's requirement to reduce the time taken to export a DBF to a delimited text file.
The text file, as I understand it, is currently about 130 MB and takes 4 hours to produce.
Do you really mean 130 MB ( = 130,000,000 bytes ), not 130 GB?
If it is only 130 MB, it should not take more than 2 or 3 minutes without any optimizations, even with the standard COPY TO ... DELIMITED WITH ... command.
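For reference, a minimal sketch of the standard-command approach (as I recall the Clipper-compatible syntax, fields are separated by commas and character fields are enclosed in the delimiter character, double quotes by default):

   USE CUSTOMER NEW
   // Export all records of the current work area to a
   // comma-separated text file with quoted character fields
   COPY TO customer.txt DELIMITED
   CLOSE DATABASES

Scopes ( FOR / WHILE / NEXT ) can be added to the command if only a subset of records is needed.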
Instead of the COPY TO command, we can even try the (x)Harbour function
DBF2TEXT( bWhile, bFor, aFields, cDelim, hFile, cSep, nCount, cdp )
The above example is a very simple case where all the fields are character fields. There can be cases where some fields are numeric, dates, etc., requiring conversion to strings.
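If one prefers to handle such conversions manually instead of relying on DBF2TEXT(), a hand-written export loop might look like the sketch below. This is only an illustration, assuming standard (x)Harbour conversion functions ( Str(), DToC(), ValType(), hb_eol() ):

   LOCAL hFile, n, uVal, cLine

   USE CUSTOMER NEW
   hFile := FCreate( "customer.txt" )
   do while ! Eof()
      cLine := ""
      for n := 1 to FCount()
         uVal := FieldGet( n )
         // convert each field value to text according to its type
         do case
         case ValType( uVal ) == "C" ; cLine += RTrim( uVal )
         case ValType( uVal ) == "N" ; cLine += LTrim( Str( uVal ) )
         case ValType( uVal ) == "D" ; cLine += DToC( uVal )
         case ValType( uVal ) == "L" ; cLine += iif( uVal, "T", "F" )
         endcase
         if n < FCount()
            cLine += ";"   // field separator
         endif
      next
      FWrite( hFile, cLine + hb_eol() )
      DBSKIP()
   enddo
   FClose( hFile )

Such a loop is flexible but, record by record, is unlikely to beat the compiled C implementation behind DBF2TEXT().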
For testing, I exported \fwh\samples\customer.dbf 2200 times to customer.txt. This took 2 minutes and produced a text file of 142,527,000 bytes, i.e., about 142.5 MB.
04-05-2018 18:46 142,527,000 customer.txt
1 File(s) 142,527,000 bytes
My test code:
   USE CUSTOMER
   hFile := FCreate( "customer.txt" )
   ? "start"
   nSecs := SECONDS()
   for n := 1 to 2200
      CUSTOMER->( DBGOTOP() )
      CUSTOMER->( DBF2TEXT( nil, nil, nil, nil, hFile, ";", -1 ) )
   next
   FClose( hFile )
   ? SECONDS() - nSecs
Did I wrongly understand the requirement of Mr. Paquito?