Thanks!
How would I represent this in MORMot (SQLite3)?
UPDATE info SET b=true WHERE a=1;
Right now I'm doing a CreateAndFillPrepare and then looping through the result set and updating each record. I'm just checking to see if there is a method I am missing.
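If your mORMot revision includes TSQLRest.UpdateField, a direct field update could replace the FillPrepare loop entirely. Treat this as a sketch: tinfo stands for a hypothetical TSQLRecord class mapped to the info table with published a and b properties, which is not shown in this thread.

// hedged sketch: update b for every row where a=1, without fetching the rows
// (SQLite3 has no real boolean type, so 1 stands for true)
database.UpdateField(tinfo, 'a', [1], 'b', [1]);

// or send plain SQL to the server-side engine:
database.EngineExecuteAll('UPDATE info SET b=1 WHERE a=1');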
That still adds 10MB to the file
ok, I used database.db.executeall('VACUUM') but it didn't have any effect. I have two tables defined but I don't think they are virtual.
It's TSQLRestServerDB
I have a database with ~1mil records. If I delete 5000 entries and then 'VACUUM' it grows by 10MB. Am I doing something wrong?
After the deletions are committed I am issuing:
database.EngineExecuteAll('VACUUM');
Thanks!
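In case it helps, a sketch that is not from this thread: SQLite3 can reclaim freed pages on its own with incremental auto-vacuum, so a full VACUUM after each batch of deletions becomes unnecessary. The PRAGMA names are standard SQLite3; the mode only takes effect on a fresh file or right after a full VACUUM.

// sketch: switch the file to incremental auto-vacuum (assumes database: TSQLRestServerDB)
database.DB.ExecuteAll('PRAGMA auto_vacuum=INCREMENTAL');
database.DB.ExecuteAll('VACUUM'); // rebuild once so the new mode applies
// later, after deletions have been committed, hand freed pages back to the OS:
database.DB.ExecuteAll('PRAGMA incremental_vacuum');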
Thank you, I have much to learn. The documentation helps but sometimes it is good to have a person fluent in it to give direction. Using the multiple indexes fixed the slowness and I am now using two int64 as you suggested. Thank you again.
My database is about 7 million records using SQLite
I am storing 4 longints in each record.
tinfo = class(TSQLRecord)
private
  fs1, fs2, fs3, fs4: longint;
published
  property sig1: longint read fs1 write fs1;
  property sig2: longint read fs2 write fs2;
  property sig3: longint read fs3 write fs3;
  property sig4: longint read fs4 write fs4;
end;
procedure TForm4.FormCreate(Sender: TObject);
begin
  model := tsqlmodel.Create([tinfo]);
  database := tsqlrestserverdb.Create(model, extractfilepath(paramstr(0))+'data.dat');
  TSQLRestServerDB(database).CreateMissingTables(0);
  database.CreateSQLIndex(tinfo, 'info', true, 'index');
end;
This query takes so long it almost seems frozen. I am not sure what I am doing wrong.
function sigfound(p1, p2, p3, p4: longint): boolean;
var
  info: tinfo;
begin
  info := tinfo.Create(Database, 'sig1=? AND sig2=? AND sig3=? AND sig4=?', [p1, p2, p3, p4]);
  result := (info.ID <> 0);
  info.Free;
end;
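Picking up the follow-up above (a multi-column index, plus packing the four longints into two Int64 columns), here is a hedged sketch; tinfo2, siglow, sighigh and PackSig are hypothetical names, and CreateSQLMultiIndex depends on your mORMot revision. As far as I can tell, the CreateSQLIndex call in FormCreate passes 'info' as the field name, which does not match any of the published sig1..sig4 properties, so the lookup above probably runs without any usable index.

type
  // hypothetical packed layout: four 32-bit values stored as two Int64 columns
  tinfo2 = class(TSQLRecord)
  private
    fsiglow, fsighigh: Int64;
  published
    property siglow: Int64 read fsiglow write fsiglow;
    property sighigh: Int64 read fsighigh write fsighigh;
  end;

function PackSig(a, b: longint): Int64;
begin // combine two 32-bit values into one 64-bit value; the cardinal cast avoids sign extension
  result := (Int64(a) shl 32) or Int64(cardinal(b));
end;

function sigfound(p1, p2, p3, p4: longint): boolean;
var
  info: tinfo2;
begin
  info := tinfo2.Create(Database, 'siglow=? AND sighigh=?',
    [PackSig(p1, p2), PackSig(p3, p4)]);
  try
    result := info.ID <> 0;
  finally
    info.Free;
  end;
end;

procedure CreateIndexes; // call after CreateMissingTables
begin
  // one index covering both columns, so the combined lookup never scans the table
  database.CreateSQLMultiIndex(tinfo2, ['siglow', 'sighigh'], false);
end;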
Thanks! Something isn't right here that maybe you can see:
The idea is to take a full list of files, extract a sub-list of files with the same file size, do some stuff with that sub-list, remove those entries from the full list, and repeat. The error I am getting is on the second "CreateAndFillPrepare", which says
"There is no overloaded version of 'CreateAndFillPrepare' that can be called with these arguments"
tfilelist = class(TSQLRecord)
private
  ftype, fpform, fname, ffilename, fsig: RawUTF8;
  fsize: Int64;
  foffsets: RawUTF8;
published
  property _type: RawUTF8 read ftype write ftype;
  property pform: RawUTF8 read fpform write fpform;
  property name: RawUTF8 read fname write fname;
  property filename: RawUTF8 read ffilename write ffilename;
  property size: Int64 read fsize write fsize;
  property offsets: RawUTF8 read foffsets write foffsets;
  property sig: RawUTF8 read fsig write fsig;
end;
var
  full_list, sub_list: tfilelist;
begin
  full_list := tfilelist.CreateAndFillPrepare(filelist, '*');
  while full_list.FillOne do
  begin
    sub_list := tfilelist.CreateAndFillPrepare(full_list,
      '_type = ? AND pform = ? AND name = ? AND size = ?',
      [full_list._type, full_list.pform, full_list.name, full_list.size]);
    if (sub_list.FillTable.RowCount > 6) then
    begin
      aprogress.Lines.Insert(0, format('Processing %s.%s.%s',
        [sub_list._type, sub_list.pform, sub_list.name]));
      while sub_list.FillOne do
      begin
        // do some stuff
        filelist.Delete(full_list, sub_list.ID);
      end;
      sub_list.Free;
    end;
  end;
  full_list.Free;
end;
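A hedged guess at that compiler error: the first argument of CreateAndFillPrepare has to be the TSQLRest connection (filelist here), not the other TSQLRecord instance, and Delete expects the class rather than an instance. A sketch of the inner part only:

// hedged sketch of the inner query: pass the REST instance, not full_list,
// and free sub_list on every pass (above it is only freed when RowCount > 6)
sub_list := tfilelist.CreateAndFillPrepare(filelist,
  '_type = ? AND pform = ? AND name = ? AND size = ?',
  [full_list._type, full_list.pform, full_list.name, full_list.size]);
try
  if sub_list.FillTable.RowCount > 6 then
    while sub_list.FillOne do
    begin
      // do some stuff
      filelist.Delete(tfilelist, sub_list.ID); // Delete() takes the class, not an instance
    end;
finally
  sub_list.Free;
end;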
My brain just isn't translating this well even with searches, examples and documentation.
This is where I am coming from:
$result = mysql_query($query);
if ($result) {
    $nr = mysql_num_rows($result);
    if ($nr > 0) {
        while ($row = mysql_fetch_array($result, MYSQL_NUM)) {
            // do stuff
        }
    }
}
How do I translate that pattern to the ORM? Perform a query and then enumerate the result set?
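A rough ORM equivalent of that PHP loop, as a sketch; tinfo and database are the names from the declarations shown elsewhere in this thread, while the 'sig=?' filter and SomeValue are placeholders.

// sketch: query then enumerate the result set through the ORM
procedure DoStuff(const SomeValue: RawUTF8);
var
  rec: tinfo;
begin
  rec := tinfo.CreateAndFillPrepare(database, 'sig=?', [SomeValue]); // ~ mysql_query
  try
    if rec.FillTable.RowCount > 0 then   // ~ mysql_num_rows
      while rec.FillOne do               // ~ mysql_fetch_array
      begin
        // do stuff with rec.sig, rec.ID, ...
      end;
  finally
    rec.Free;
  end;
end;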
I can just do a static row counter that gets updated when I update the database
I should note, this delay only seems to happen after a reboot when the disk cache is fresh.
Upon further research and instrumentation it appears to be the database.TableRowCount(tinfo) function that is causing the delay:
Got row count in 22.18200 seconds.
Am I doing that wrong?
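If it helps: COUNT(*) has to walk the whole table in SQLite3, which is what hurts with a cold disk cache. A hedged sketch of a much cheaper value, usable only when gaps left by deleted rows do not matter:

// sketch: max(RowID) comes straight from the end of the ID B-tree, so it is nearly
// instant, but it still counts deleted IDs; only a rough substitute for COUNT(*)
rowCount := GetInt64(pointer(database.OneFieldValue(tinfo, 'max(RowID)', '')));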
I just noticed that in some cases it is taking more than 5 minutes. And sometimes it is nearly instant.
This is where it hits the delay:
database:=tsqlrestserverdb.Create(model, extractfilepath(paramstr(0))+'database.dat', true, 'password');
The database is 556MB
I am doing:
type
  tinfo = class(TSQLRecord)
  private
    fsig: RawUTF8;
  published
    property sig: RawUTF8 read fsig write fsig;
  end;

  tver = class(TSQLRecord)
  private
    fver: integer;
    ftimestamp: tdatetime;
  published
    property version: integer read fver write fver;
    property timestamp: tdatetime read ftimestamp write ftimestamp;
  end;

var
  database: TSQLRestServerDB;
  model: TSQLModel;

model := tsqlmodel.Create([tver, tinfo]);
database := tsqlrestserverdb.Create(model, extractfilepath(paramstr(0))+'database.dat', true, 'password');
My database with around 7 million records seems to take a very long time to load when my program first starts up. It has taken as long as a minute. Is it loading the index or something? Can I speed it up somehow?
The indexed column contains a list of 32 character strings.
Thank you for your patient help ab. I am now successfully using SynSQLite3 for my project and it is working very well.
How do I find the total number of records stored?
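For reference, the ORM call used elsewhere in this thread, which on a big table may boil down to a COUNT(*) scan:

// total number of rows currently stored in the tinfo table
total := database.TableRowCount(tinfo);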
I just realized I screwed up the question. It's 500MB, not 500K.
500K is a pretty small amount of data.
How much will it grow?
I'll give that a try and see if it impacts performance too much.
I already do incremental updates but the initial distribution needs to be the full database.
I am already using SynSQLite3 and it works well. The only issue I have is the file size which I would reduce if I could.
I can't have it all in memory because it is too large.
Typically I am adding about 40K entries a day.
So I am just asking to make sure I have made the best choice and that there isn't another option that might be better.
SynBigTable also worked well but I couldn't encrypt it.
I have a database of ~7 million strings. There's no other data, just a huge list of strings.
I need fast (indexed) access to the list and it must be encrypted. I am currently using SQLite3.
Problem is, the database needs to be distributed and the database size is about 500MB which puts a burden on the distribution system and end users.
The database will also be growing fairly quickly.
So, what are my options? Thanks!
EDIT: Fixed my mistake of putting K instead of MB