#1 2015-01-11 18:03:47

Sabbiolina
Member
Registered: 2014-05-20
Posts: 120

[SESSIONS] proposal for improvement

Hello AB,
  I'm curious to know why you chose to manage the session list by scanning it linearly rather than through a hash list.

I also noticed that you test the timeout on each new call.
I propose storing the minimum value (the closest upcoming timeout) during the first scan of the whole list. Then, on the next call, if that minimum has not been reached yet, the test on all the elements can be skipped.
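Something like this sketch of the idea (TSessionSlot, NextTimeoutTix and CheckOutdatedSessions are made-up names for illustration, not actual framework identifiers):

  type
    TSessionSlot = record
      ID: cardinal;
      TimeoutTix: Int64;   // absolute expiration time, e.g. in GetTickCount64 ms
    end;

  var
    Sessions: array of TSessionSlot; // hypothetical in-memory session list
    SessionCount: integer;
    NextTimeoutTix: Int64;           // cached closest deadline

  procedure CheckOutdatedSessions(Tix: Int64);
  var
    i: integer;
  begin
    if Tix < NextTimeoutTix then
      exit; // no session can have expired yet -> skip the O(n) scan
    NextTimeoutTix := High(Int64);
    i := 0;
    while i < SessionCount do
      if Sessions[i].TimeoutTix <= Tix then
      begin
        Sessions[i] := Sessions[SessionCount - 1]; // expired -> remove by swap
        dec(SessionCount);
      end
      else
      begin
        if Sessions[i].TimeoutTix < NextTimeoutTix then
          NextTimeoutTix := Sessions[i].TimeoutTix; // remember the next deadline
        inc(i);
      end;
  end;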


#2 2015-01-12 08:04:46

ab
Administrator
From: France
Registered: 2010-06-21
Posts: 14,240

Re: [SESSIONS] proposal for improvement

This is indeed an identified potential bottleneck: especially in a multi-threaded process, the current implementation may well flush the CPU L1/L2 caches.

In fact, we did implement "brute force" O(n) algorithms for the in-memory sessions.
In practice, TSQLRestServer.SessionAccess() works very well.
Of course, since it is an O(n)*2 algorithm, it may cost some CPU if n is huge (I guess more than 100,000 sessions).
But I guess this implementation would still be more efficient than using a database for storing the sessions.
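To make the discussion concrete, here is a deliberately simplified sketch of the shape of such a brute-force lookup, with made-up types - not the actual TSQLRestServer.SessionAccess() code:

  type
    TAuthSessionSketch = record
      IDCardinal: cardinal; // numeric session identifier
      TimeOutTix: Int64;    // expiration deadline, checked/refreshed by the caller
    end;

  function FindSession(const Sessions: array of TAuthSessionSketch;
    aSessionID: cardinal): integer;
  var
    i: integer;
  begin
    for i := 0 to high(Sessions) do
      if Sessions[i].IDCardinal = aSessionID then
      begin
        result := i; // found after an O(n) scan
        exit;
      end;
    result := -1;    // not found
  end;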

Did you really identify it as a bottleneck, using a profiling tool, in production?
"Premature optimization is the root of all evil", as Knuth wisely wrote.
:)

I've added a feature request to track this.
http://synopse.info/fossil/tktview/0b661c4eec9b


#3 2015-01-12 16:00:37

Sabbiolina
Member
Registered: 2014-05-20
Posts: 120

Re: [SESSIONS] proposal for improvement

I was looking at the various hashing routines in mORMot, but I cannot find a procedure for single-item insertion or deletion.
It seems to me that you make extensive use of rehashing - am I wrong?


#4 2015-01-12 16:44:10

ab
Administrator
From: France
Registered: 2010-06-21
Posts: 14,240

Re: [SESSIONS] proposal for improvement

Yes, it is meant for adding rather than for deleting, which is the case 95% of the time with lists.


#5 2015-01-12 17:37:33

Sabbiolina
Member
Registered: 2014-05-20
Posts: 120

Re: [SESSIONS] proposal for improvement

Certainly, if you only add values that are known to be unique, you gain on searches over a data set that does not change.

It is a bit like adding a million records to a database without an index, and then creating the index at the end.

But in the case of sessions, or other data sets that change frequently, rehashing all the elements again and again loses the advantage of instant lookup. For example:

  HL := TRawUTF8ListHashed.Create(true);

  for t := 1 to 50000 do
  begin
    Rec := TmyOBJ.Create;
    Rec.id := 'id_' + IntToStr(t);
    Rec.msg := 'Msg';
    HL.AddObjectIfNotExisting(Rec.id, Rec);
  end;

41 seconds on an i7-3820

Last edited by Sabbiolina (2015-01-12 17:46:55)


#6 2015-01-12 18:51:31

ab
Administrator
From: France
Registered: 2010-06-21
Posts: 14,240

Re: [SESSIONS] proposal for improvement

This was indeed far from efficient, since TRawUTF8ListHashed.AddObjectIfNotExisting() relies on TRawUTF8ListHashed.IndexOf(), which is not meant to be run on every call.
But it is documented as such, in so many words: a "rough, but working implementation".

In fact, TRawUTF8ListHashed/TRawUTF8List is never used with such a pattern in mORMot.
When this is needed, the framework does not use TRawUTF8ListHashed but TDynArrayHashed - e.g. TDynArrayHashed.AddAndMakeUniqueName() or FindHashedForAdding().
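Something like the following sketch, written against what I believe the TDynArrayHashed API looks like (InitSpecific, FindHashedForAdding, FormatUTF8) - exact signatures to be checked against SynCommons.pas:

  uses
    SynCommons;

  type
    TSessionRec = record
      ID: RawUTF8;   // first field = hashed key
      Msg: RawUTF8;
    end;
    TSessionRecDynArray = array of TSessionRec;

  var
    Values: TSessionRecDynArray;
    Count, i, t: integer;
    Hash: TDynArrayHashed;
    key: RawUTF8;
    added: boolean;
  begin
    // hash on the first RawUTF8 field of the record, with an external count
    Hash.InitSpecific(TypeInfo(TSessionRecDynArray), Values, djRawUTF8, @Count);
    for t := 1 to 50000 do
    begin
      key := FormatUTF8('id_%', [t]);
      i := Hash.FindHashedForAdding(key, added); // hashed lookup, no full rehash per call
      if added then
      begin
        Values[i].ID := key; // fill the newly allocated slot
        Values[i].Msg := 'Msg';
      end;
    end;
  end;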

I've added a dedicated TRawUTF8ListHashed.AddObjectIfNotExisting() method... and you can check the time difference.
See http://synopse.info/fossil/info/799e4dee78
So it was not an issue in itself.
And this class is NOT used for the in-memory sessions.
Thanks for worrying about the framework performance!
:)


#7 2015-01-12 19:14:24

Sabbiolina
Member
Registered: 2014-05-20
Posts: 120

Re: [SESSIONS] proposal for improvement

So I like it!

You propose adding new methods that keep the hash table consistent, while leaving the generic ones for the use you have always (rightly) made of them, i.e. building a hash table only at search time.

Or the behaviour could be decided when the hashed object is created:
LiveHash / onDemandHash

I think the benefit to the whole system would be great, and it would also help keep the rest of the code performing well.

For searches done on an in-memory list, I think it is always better to have the help of a hash table, so results always come back quickly. Too many times, bottlenecks are created by lists that grow dramatically, far beyond what seemed reasonable during development of the software.
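A hypothetical sketch of what such a construction-time choice could look like, using a plain Delphi TDictionary as the index just to show the two modes - this is not mORMot code, only an illustration of the proposed API:

  uses
    Generics.Collections;

  type
    THashMode = (onDemandHash, liveHash);

    TNameList = class
    private
      fMode: THashMode;
      fItems: TList<string>;
      fIndex: TDictionary<string, integer>; // value -> position in fItems
      fIndexValid: boolean;
      procedure RebuildIndex;
    public
      constructor Create(aMode: THashMode);
      destructor Destroy; override;
      function Add(const aValue: string): integer;
      function IndexOf(const aValue: string): integer;
    end;

  constructor TNameList.Create(aMode: THashMode);
  begin
    inherited Create;
    fMode := aMode;
    fItems := TList<string>.Create;
    fIndex := TDictionary<string, integer>.Create;
    fIndexValid := true;
  end;

  destructor TNameList.Destroy;
  begin
    fIndex.Free;
    fItems.Free;
    inherited;
  end;

  procedure TNameList.RebuildIndex;
  var
    i: integer;
  begin
    fIndex.Clear;
    for i := 0 to fItems.Count - 1 do
      fIndex.AddOrSetValue(fItems[i], i);
    fIndexValid := true;
  end;

  function TNameList.Add(const aValue: string): integer;
  begin
    result := fItems.Add(aValue);
    if fMode = liveHash then
      fIndex.AddOrSetValue(aValue, result) // liveHash: keep the index up to date on each Add
    else
      fIndexValid := false;                // onDemandHash: rebuild lazily at search time
  end;

  function TNameList.IndexOf(const aValue: string): integer;
  begin
    if not fIndexValid then
      RebuildIndex;
    if not fIndex.TryGetValue(aValue, result) then
      result := -1;
  end;

With liveHash, each Add pays a small constant cost to keep the index consistent; with onDemandHash, the first IndexOf() after a batch of Adds pays a single rebuild.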

