#1 2017-07-06 08:23:43

enrcas
Member
Registered: 2013-06-16
Posts: 20

Memory usage in Mormot ORM Batch insert

Following a tip from master ab, I started trying to improve my insertion speed in a test I am making here.

First I started with the typical INSERTs inside a transaction, but ended up using the ORM with batched inserts.

So I got a 20-million-line text file and made a simple test: read a line, insert it into a simple test table. But I started getting out-of-memory exceptions.

I've used this little function to measure the memory in use:

function MemoryUsed: cardinal;
var
    st: TMemoryManagerState;
    sb: TSmallBlockTypeState;
begin
    GetMemoryManagerState(st);
    // medium and large blocks are reported as global totals
    result := st.TotalAllocatedMediumBlockSize + st.TotalAllocatedLargeBlockSize;
    // small blocks are reported per block type
    for sb in st.SmallBlockTypeStates do begin
        result := result + sb.UseableBlockSize * sb.AllocatedBlockCount;
    end;
end;

I ran it again and watched the memory go up until it hit 2 GB, then the out-of-memory exception was raised and it stopped working.

So I thought that I needed to call db.BatchSend(batch) every X lines to free the memory used by the batch process. I am building a 32-bit executable,
but I think I am doing something wrong, because 2 GB should be enough for this simple test.

I know about AutomaticTransactionPerRow, so is this committing of mine superfluous?

I am following the ORM batch example (... = omitted uninteresting code). All strings are RawUTF8.

db := TSQLRestServerDB.CreateWithOwnModel([TSQLstest],'c:\test.db3');
...
db.DB.LockingMode := lmExclusive;
db.DB.Synchronous := smOff;
db.CreateMissingTables;
...
batch := TSQLRestBatch.Create(db,TSQLstest,1000);
stest := TSQLstest.Create;  //class(TSQLRecord)
...
try
  repeat
    Readln(tfile, s1);
    inc(cline);
    stest.st := s1;
    batch.Add(stest, true);
    if (cline mod COMMIT_COUNT = 0) then
    begin
      db.BatchSend(batch);
      Writeln(cline);
      Writeln('Memory used intracommits = ', IntToThousandString(MemoryUsed));
    end;
  until eof(tfile);
finally
  stest.Free;
end;

db.BatchSend(batch); // send the rest of the pending batch adds
db.Free;

Very simple stuff, but the memory is not freed, and keeps growing until it breaks.
Am I doing something wrong? Do I need to free the batch after sending it and re-create it? Or call a Reset after each send?

Edit: I tested resetting the batch after a send, but memory keeps growing. So Reset didn't solve this.

First I thought I was leaking memory, so I used madExcept to check for memory leaks and other errors. But this test is not leaking memory, everything is green with madExcept,
except one (minor) case in mORMot initialization. So no leak on my part.

If needed, I can post the whole test here (a little console program).

Thanks !!!

Edit: has anyone run the Synopse test cases with something like EurekaLog / madExcept / FastMM4 with all error detection on?

Last edited by enrcas (2017-07-06 08:26:28)

#2 2017-07-06 08:46:59

ab
Administrator
From: France
Registered: 2010-06-21
Posts: 14,183

Re: Memory usage in Mormot ORM Batch insert

The test cases are run regularly with FastMM4 memory leak reporting enabled, in FullDebugMode.
And mORMot is used in production on 24/7 servers writing hundreds of GB of data, running without a restart for months: we always monitor performance, CPU and RAM to ensure that no leak occurs.
I can tell you that memory consumption is very low - nothing comparable to a garbage-collected server written in .Net or Java.

AFAIR TSQLRestBatch.Reset should be called after BatchSend, to reuse the batch.

I've tried to reproduce the issue with sample ThirdPartyDemos\ORMFastMultiInsertWithIndex.dpr:

        batch.Add(rec,true);
        if i and $2ffff = $2ffff then begin
          write(' send');
          db.batchSend(batch);
          batch.Reset;
          write(' prepare');
        end;

But I found no problem with COUNT = 10,000,000 on a local file.
The memory grows to 50/60 MB, then sticks to it. Such values make sense, since by default the SQLite3 engine defines a 40 MB RAM buffer per opened file, for performance.
See https://paste.ee/p/7iTbZ for the console output.
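
If you want to trade some speed for RAM, this buffer can be reduced - a sketch only, assuming the TSQLDataBase.CacheSize property (expressed in pages, next to the LockingMode/Synchronous properties used in your code) is available in your revision:

// sketch: shrink the SQLite3 page cache of the opened database
// CacheSize is a page count, so the RAM used is roughly CacheSize * page size
db.DB.CacheSize := 2000; // e.g. about 8 MB with 4 KB pages, instead of the 40 MB default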

I guess there is something else wrong.
You can post some sample code - but not here directly: please use https://gist.github.com/ or https://paste.ee/

#3 2017-07-06 11:28:24

enrcas
Member
Registered: 2013-06-16
Posts: 20

Re: Memory usage in Mormot ORM Batch insert

You were right (and my test was bad).

batch.Reset is necessary to free the memory after a BatchSend. If I do the combo, memory is stable at around 40/50 MB regardless of the number of lines inserted (I tried with an 8 GB text file); if I just use BatchSend alone, memory grows until it overflows.
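
For reference, here is a sketch of the working combo, based on the loop from my first post (same variables and COMMIT_COUNT as there):

try
  repeat
    Readln(tfile, s1);
    inc(cline);
    stest.st := s1;
    batch.Add(stest, true);
    if (cline mod COMMIT_COUNT = 0) then
    begin
      db.BatchSend(batch); // write the pending rows to the database
      batch.Reset;         // release the data buffered by the batch -> stable memory
    end;
  until eof(tfile);
  db.BatchSend(batch);     // send the remaining rows
finally
  stest.Free;
  batch.Free;
end;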

I am curious: in the line batch := TSQLRestBatch.Create(db,TSQLstest,1000), what is the use of AutomaticTransactionPerRow (1000 here)? Is the data committed? If so, is it safe on disk, and could the memory then be freed?

#4 2017-07-06 13:21:40

ab
Administrator
From: France
Registered: 2010-06-21
Posts: 14,183

Re: Memory usage in Mormot ORM Batch insert

batch.Reset is mandatory.
And it is clearly documented as such.

AutomaticTransactionPerRow encapsulates the writes into a SQL transaction, for a given number of data rows.
In practice, it speeds up insertion a lot (because committing each row individually is expensive for SQL engines), and the modified data rows are committed to disk at the end of each transaction.
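
So with AutomaticTransactionPerRow = 1000 as in your code, the writes during BatchSend follow roughly this pattern (illustrative sketch of the commit pattern only, not the exact SQL emitted by the engine):

batch := TSQLRestBatch.Create(db, TSQLstest, 1000); // 1000 rows per transaction
// during db.BatchSend(batch), the inserts are grouped as:
//   BEGIN TRANSACTION;
//     INSERT ...        // rows 1..1000
//   COMMIT;             // these rows are now safe on disk
//   BEGIN TRANSACTION;
//     INSERT ...        // rows 1001..2000
//   COMMIT;
//   and so on, one transaction per 1000 rows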

#5 2017-07-07 07:07:27

enrcas
Member
Registered: 2013-06-16
Posts: 20

Re: Memory usage in Mormot ORM Batch insert

Thanks for the answer, and for answering newbie questions! Sometimes it is hard to learn new things.
