Hi, once I reach a file size of around 60MB, BinaryRead(TotalBytes) crashes my web site. I'm using IIS 6 on a Windows 2003 server. I updated AspMaxRequestEntityAllowed in the metabase.xml file to 100MB, but the same error happens. There is lots of information on the web about this but no answers... Is it related to a VFP memory limit? Thanks in advance, Linny

FoxPro's string size limit is 16mb. It's possible to get more than that in a single chunk as long as you don't run functions on the string, but if you manipulate it, FoxPro will throw an error. I believe this also applies to binary blocks, but I'm not sure how you are using the binary read and what you're doing with the data.
My suggestion would be to capture the data in chunks and write it to disk immediately to avoid creating large memory overhead. You can decide what to do with the data once you have it saved in a file - like read it out in smaller chunks.
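Something along these lines (a rough sketch only - it assumes objRequest is the ASP Request object, whose BinaryRead() takes a byte count and can be called repeatedly until TotalBytes is consumed; the spool file name is illustrative):
*// Spool the raw POST data to disk in ~1mb chunks instead of one big string
LOCAL lnLeft, lnChunk, lnHandle, lcBuffer
lnLeft   = objRequest.TotalBytes
lnChunk  = 1048576                          && ~1mb - well under the 16mb string limit
lnHandle = FCREATE("c:\temp\upload.tmp")    && illustrative spool file
DO WHILE lnLeft > 0
   lcBuffer = objRequest.BinaryRead(MIN(lnChunk,lnLeft))
   FWRITE(lnHandle,lcBuffer)                && write immediately - don't concatenate
   lnLeft = lnLeft - LEN(lcBuffer)
ENDDO
FCLOSE(lnHandle)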
+++ Rick ---
Thanks for the reply, Rick. We are using it as a file upload function on the web site: Fox throws the error at the biData line if the size of the user's selected file is greater than 60MB. Thanks, Linwood. This is the procedure:
PUBLIC web_med, FormElm
web_med = CREATEOBJECT("Scripting.Dictionary")
FormElm = CREATEOBJECT("Scripting.Dictionary")
LOCAL biData, biTmp, sInputName, lnBnd, lnLastBnd
LOCAL nPosBegin, nPosEnd, nPos, vDataBounds, lnBndLen, nDataBoundPos
LOCAL nPosFile, nPosBound
biData = objRequest.BinaryRead(objRequest.TotalBytes)   && All data from the form
nPosBegin = 1
lnBnd = 1          && Count of delimiters
lnBndLen = 0
nPosEnd = ATCC(CHR(13), biData)
IF (nPosEnd - nPosBegin) <= 0 THEN   && no upload element
   RETURN
ENDIF
*// vDataBounds is the delimiter of the sections in the binary string
*// e.g. vDataBounds = "-----------------------------7d437719c30242"; length: "-": 29, "7d437719c30242": 14
*// 7d437719c30242 ends with CHR(13)
vDataBounds = SUBSTR(biData, nPosBegin, nPosEnd - nPosBegin)
bnd = SUBSTR(biData, nPosBegin, nPosEnd - nPosBegin)
lnBndLen = LEN(bnd)
nDataBoundPos = ATCC(vDataBounds, biData)
LOCAL oUploadFile
LOCAL lnB, lnE, lnEnd, lnCnt
LOCAL lcName, lcFileName, lcContentType, lcElmData, lcTmpData
LOCAL lnElmB, lnElmE
lcName = ""
lcFileName = ""
lcElmData = ""
lcTmpData = ""
lnCnt = 1
lnB = ATCC(bnd, biData, lnCnt)
lnEnd = 0
DO WHILE (lnB > 0 AND lnEnd < 1)
   IF lnB > 0 THEN
      IF SUBSTR(biData, lnB + 44, 2) <> "--" THEN   && not the closing boundary
         lnE = ATCC(bnd, biData, lnCnt + 1)
         lcTmpData = SUBSTR(biData, lnB + lnBndLen + 2, lnE - 2 - lnB - lnBndLen - 2)
         *// get element name
         lnElmB = ATCC("name=", lcTmpData)
         lcElmData = SUBSTR(lcTmpData, lnElmB + 6, LEN(lcTmpData) - lnElmB - 6)
         lnElmE = ATCC(["], lcElmData)
         lcName = SUBSTR(lcElmData, 1, lnElmE - 1)
         *// get filename
         lnElmB = ATCC("filename=", lcTmpData)
         IF lnElmB > 0 THEN
            lcElmData = SUBSTR(lcTmpData, lnElmB + 10, LEN(lcTmpData) - lnElmB - 10)
            lnElmE = ATCC(["], lcElmData)
            lcFileName = SUBSTR(lcElmData, 1, lnElmE - 1)
            lnSlash = 0
            lnSlash = RAT("\", lcFileName)
            IF lnSlash > 0 THEN
               lcFileName = SUBSTR(lcFileName, lnSlash + 1, LEN(lcFileName) - lnSlash)
            ENDIF
         ELSE
            lcFileName = ""
         ENDIF
         *// get element content-type
         lnElmB = ATCC("Content-Type:", lcTmpData)
         IF lnElmB > 0 THEN
            lcElmData = SUBSTR(lcTmpData, lnElmB + 13, LEN(lcTmpData) - lnElmB - 13)
            lnElmE = ATCC(CHR(13), lcElmData)
            lcContentType = SUBSTR(lcElmData, 1, lnElmE - 1)
         ELSE
            lcContentType = ""
         ENDIF
         *// get element data
         IF LEN(lcFileName) > 0 THEN
            lnElmB = ATCC(lcContentType, lcTmpData) + LEN(lcContentType) + 4
         ELSE
            lnElmB = ATCC(lcName, lcTmpData) + LEN(lcName) + 1
         ENDIF
         lcElmData = SUBSTR(lcTmpData, lnElmB)
         *// push into dictionary
         IF LEN(lcFileName) > 0 THEN
            oUploadFile = CREATEOBJECT("webUploadedFile")
            oUploadFile.FileName = "" + lcFileName
            oUploadFile.ContentType = "" + lcContentType
            oUploadFile.FileData = lcElmData
            web_med.Add("" + LOWER(lcName), oUploadFile)
            oUploadFile = NULL
         ELSE
            IF NOT FormElm.Exists("" + LOWER(lcName) + "") THEN
               FormElm.Add("" + lcName, lcElmData)
            ENDIF
         ENDIF
         *// go to next element section
         lnCnt = lnCnt + 1
         lnB = ATCC(bnd, biData, lnCnt)
      ELSE
         lnEnd = 1
      ENDIF
   ENDIF
ENDDO
RETURN
That code will fail long before you get to 60mb.
You're bumping into the 16mb string limit, most likely on one of the MIME parts, and you're then manipulating the received string with string functions.
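To illustrate the failure mode (hypothetical lines, not your exact crash site):
*// The assignment itself may go through...
lcData = objRequest.BinaryRead(objRequest.TotalBytes)
*// ...but any string function on a >16mb value throws VFP error 1903
? SUBSTR(lcData,1,100)     && "String is too long to fit" once lcData exceeds 16mb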
+++ Rick ---
Hi Rick, well, it's good to know the problem has been identified... sometimes that is the hardest part. Can you fix it for us? We can fund the effort. Thanks much.

As I mentioned, the way to fix this is to read the BinaryRead data in chunks and immediately write each captured chunk to a file on disk. You can then re-open the file and read it in < 16mb chunks to parse out the individual pieces.
The code currently most likely blows up at that first SUBSTR() that tries to split the string - if the string is > 16mb that code will fail.
The file solution is only slightly more complicated than the parsing you are already doing, the difference being that you have to track the file chunks as you read them in.
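In outline, the re-read side might look like this (a sketch only - your boundary scanning still has to handle MIME parts that span two chunks; file name and chunk size are illustrative):
*// Re-open the spooled file and walk it in sub-16mb chunks
LOCAL lnHandle, lcChunk
lnHandle = FOPEN("c:\temp\upload.tmp")
DO WHILE !FEOF(lnHandle)
   lcChunk = FREAD(lnHandle,1048576)   && 1mb - safe for SUBSTR()/ATCC()
   *// scan lcChunk for the boundary here, carrying any partial
   *// boundary at the end of the chunk over into the next pass
ENDDO
FCLOSE(lnHandle)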
I won't help you fix this because this would be a pain in the ass to set up a test environment for.
Hi Rick, I get the idea, but the code fails on this line:
biData = objRequest.BinaryRead(objRequest.TotalBytes)
It seems like the BinaryRead call is failing once TotalBytes hits some size threshold.
Yes, it would be a pain in the ass to set this platform up from scratch, and it's not necessary. If you wanted to look at it, we could easily give you access to one of our VM boxes that is already running the test environment.
Thanks, Merry Christmas.
Right, so read smaller chunks in a loop - you get to specify how much data to read. You'll want to read < 16mb; in fact, I'd recommend reading ~1mb chunks and processing those or writing them to file.
FWIW - if a contained file is larger than 16mb you may still have a problem; your only option then is to write that data directly to file without any manipulation.
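Roughly, inside the chunked parse loop (illustrative names - llInsideFilePart and lnOutHandle would be state your own parser maintains):
*// while positioned within a file part:
IF llInsideFilePart
   *// append the chunk to the uploaded file's own output file -
   *// never concatenate into a memory variable; that is what hits the 16mb wall
   FWRITE(lnOutHandle,lcChunk)
ENDIF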
+++ Rick ---
Good information, but I think it is another problem, as we can load files up to 45MB with this code. The website is a construction management platform, and architectural files of a gig or more are not uncommon. We have had to stick with the financial billing and accounting side because we can't offer the file management feature due to this error. Most files, with the exception of architectural plans, are well within 45MB. We stream the files into a SQL Server image field; it works well. Do you have a drag-and-drop interface in your platform that we could bolt onto ours to stream large files into a SQL Server image field? We would really like to offer architectural plan management.
Thanks Linny lcox@coxinfotech.com