Chris Pyrgas replied the topic: Converting VO memo field data to a SQL Db

Guys,

I think I have found a solution for this!

I can't believe I didn't think of this years ago, but it turns out it is possible to 
pre-process binary data in such a way that the DBF encoder translates it to 
the correct 0..255 value range on saving, and likewise to post-process data read 
from the dbf, so that it gets translated back to the original bytes again.
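The attached class itself is not shown here, but the core idea can be sketched roughly like this (hypothetical class and method names, ignoring the handful of byte values that a codepage may not round-trip cleanly, which is what the real class handles with its lookup table):

USING System.Text

// Minimal sketch of the round-trip idea (NOT the attached class):
// decode the raw bytes with the machine's ANSI codepage, so that
// when the RDD re-encodes the string while writing the memo field,
// it produces the original byte values again.
STATIC CLASS BinaryRoundTripSketch
	STATIC METHOD BytesToString(aBytes AS BYTE[]) AS STRING
		// Encoding.Default is the ANSI codepage of the machine
		RETURN Encoding.Default:GetString(aBytes)
	STATIC METHOD StringToBytes(c AS STRING) AS BYTE[]
		RETURN Encoding.Default:GetBytes(c)
END CLASS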

Please try the attached small class that does this job. You can test it with 
this sample code (of course you will need to adjust the filenames):

USING System.Text
USING System.IO

FUNCTION Start() AS VOID
	LOCAL cDbf AS STRING
	LOCAL cBinary AS STRING
	LOCAL aBytes AS BYTE[]
	LOCAL c AS STRING

	cDbf := "C:\Test\TestBin.dbf"
	cBinary := "C:\Test\adv.png"

	IF .not. File.Exists(cDbf)
		// create a dbf with a single memo field
		DBCreate(cDbf , {{"FLD1" , "M" , 10 , 0}} , "DBFCDX")
		DBUseArea(,"DBFCDX" , cDbf)
		DBAppend()
	ELSE
		DBUseArea(,"DBFCDX" , cDbf)
	ENDIF

	// convert the binary data to a string and save it to the dbf
	aBytes := File.ReadAllBytes(cBinary)
	c := AdjustBinaryData.BeforeSaveBytes(aBytes)
	FieldPut(1,c)
	DBCloseArea()

	// load the string back from the dbf and convert it to binary again
	DBUseArea(,"DBFCDX" , cDbf)
	c := AllTrim(FieldGet(1))
	aBytes := AdjustBinaryData.AfterReadToBytes(c)
	File.WriteAllBytes(cBinary + "_new" , aBytes)
	DBCloseArea()

RETURN

This code writes some binary data to a dbf, then reads it back from the dbf and 
saves it again to an external file. At least on my machine, which uses a Greek 
codepage, it works well: the source and output files are identical. 
Furthermore, the data saved in the dbf can also be read (and written) fine by 
VO apps, so it is compatible.

I hope it works OK on your machines as well. If it does, then we finally have 
a solution for this, and the same trick can also be used for writing/reading 
binary data in regular files with the F*() functions. I will also optimize the 
code a bit to use a hash table instead of the lookup array it uses now.
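For the hash-table part, the reverse lookup could be sketched like this (hypothetical names, assuming a 256-character mapping string already built from the current lookup array):

USING System.Collections.Generic

// Hypothetical sketch: build a char -> byte dictionary once, so the
// read path can translate each char back with an O(1) lookup instead
// of scanning the lookup array for every character.
FUNCTION BuildReverseMap(cLookup AS STRING) AS Dictionary<CHAR,BYTE>
	LOCAL oMap AS Dictionary<CHAR,BYTE>
	LOCAL nByte AS INT
	oMap := Dictionary<CHAR,BYTE>{}
	nByte := 0
	FOREACH ch AS CHAR IN cLookup
		oMap:Add(ch, (BYTE) nByte)  // char at position n maps back to byte n
		nByte++
	NEXT
	RETURN oMap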

Chris 
