#1 Re: mORMot 2 » Load a really big JSON » 2024-03-07 20:43:26

Thanks for your answers, Ab and Chaa!

After a whole day of trying and getting errors, I realized that there were 11 mistakes in the JSON from MongoDB, and even their own Python lib crashed on their export files...
So I switched the exports to CSV, only to realize that even TStringList.LoadFromFile doesn't like files > 2GB.
The solution I used in the end was TFileReader.ReadLine to split the file, then load the pieces into a TStringList.
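For reference, the same line-by-line idea can be done with classic Pascal Text I/O, which has no 2GB limit because only one line is held in memory at a time. This is just a minimal sketch (the file name and per-line processing are illustrative):

```pascal
program ReadBigFile;

{$mode objfpc}{$H+} // AnsiString lines, so line length is not capped at 255

var
  f: TextFile;
  line: AnsiString;
  count: Int64;
begin
  AssignFile(f, 'export.csv'); // illustrative file name
  Reset(f);
  try
    count := 0;
    while not Eof(f) do
    begin
      ReadLn(f, line); // one record per line; the file itself can be >2GB
      Inc(count);
      // process `line` here, e.g. split on the CSV separator
    end;
    WriteLn('lines read: ', count);
  finally
    CloseFile(f);
  end;
end.
```

A larger text buffer via SetTextBuf can speed this up considerably on multi-GB files.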

I'm still very intrigued by mORMot 2 :-)

#2 Re: mORMot 2 » Load a really big JSON » 2024-03-06 23:53:24

(I tried TDocVariantData.InitJsonFromFile, but with no success: AnyTextFileToRawUtf8 calls StringFromFile, which is hard-limited to 2GB even though it returns an AnsiString.)
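If the whole file really must end up in one string, a hedged sketch like the following can sidestep that cap on 64-bit FPC, where string lengths are SizeInt and may exceed 2GB. It reads in chunks because some TStream read overloads take 32-bit counts; the obvious caveat is that the entire file must fit in RAM:

```pascal
uses
  SysUtils, Classes;

// Sketch: read a file of any size into a single RawByteString on 64-bit
// FPC, bypassing StringFromFile's 2GB cap. Assumes enough free RAM.
function HugeFileToString(const FileName: string): RawByteString;
const
  CHUNK = 64 shl 20; // 64MB per read, to stay within 32-bit stream counts
var
  fs: TFileStream;
  remaining, len: Int64;
  p: PAnsiChar;
begin
  fs := TFileStream.Create(FileName, fmOpenRead or fmShareDenyWrite);
  try
    SetLength(Result, fs.Size); // SizeInt length on 64-bit targets
    p := PAnsiChar(Result);
    remaining := fs.Size;
    while remaining > 0 do
    begin
      if remaining > CHUNK then
        len := CHUNK
      else
        len := remaining;
      fs.ReadBuffer(p^, len);
      Inc(p, len);
      Dec(remaining, len);
    end;
  finally
    fs.Free;
  end;
end;
```

For an 8GB export, though, streaming line by line is usually the saner option.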

#3 mORMot 2 » Load a really big JSON » 2024-03-06 22:52:53

HelloMog
Replies: 5

Hi,

I'm trying to write a simple loader for big JSON exports coming from a database (MongoDB). The files are 8GB each, which rules out the JSON tools from FPC, as they all seem to be string-based, and strings are limited to 2GB.

I used a few mORMot units for text parsing a long time ago and remember them being much faster than anything else, so I'm quite happy to see a v2 with some JSON support.
Is there an easy way to open a big file and parse it so I can transfer the data into an array of record? (It's easier for me to load it that way; I'm from the Turbo Pascal days and not really comfortable with generics and other new features.)
After a quick look at DynArrayLoadJson and JsonRetrieveStringField, I'm not really sure how to use them, nor whether they're suitable for "general" JSON rather than only output from TJsonWriter.
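For later readers: a minimal sketch of DynArrayLoadJson filling an array of record is below. The record type and JSON are illustrative, the exact signature should be checked in mormot.core.json, and note that the PUtf8Char form parses the buffer in place:

```pascal
uses
  mormot.core.base,
  mormot.core.rtti,
  mormot.core.json;

type
  TUser = packed record
    Name: RawUtf8;
    Age: Integer;
  end;
  TUserArray = array of TUser;

var
  users: TUserArray;
  json: RawUtf8;
begin
  // If the compiler lacks extended record RTTI, the field layout may need
  // to be registered first, e.g.:
  // Rtti.RegisterFromText(TypeInfo(TUser), 'Name:RawUtf8 Age:integer');
  json := '[{"Name":"alice","Age":30},{"Name":"bob","Age":42}]';
  // parsing happens in place, so `json` must be a writable private copy
  if DynArrayLoadJson(users, pointer(json), TypeInfo(TUserArray)) <> nil then
    WriteLn(Length(users), ' records, first = ', users[0].Name);
end.
```

It accepts plain JSON arrays, not only TJsonWriter output, as long as the object field names match the record fields.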

Any help or documentation would be greatly appreciated.

Best regards,
