Should I use a transaction for every table, passing the table as a parameter each time, or can I just do something like:

  rest.TransactionBegin(TSQLRecord, 1);
  try
    // update table1
    ...
    // update tableN
    rest.Commit(1);
  except
    rest.RollBack(1);
  end;

What is the proper way?
Thanks!
Unless TransactionBegin is really required, I would use a batch for such operations: https://synopse.info/files/html/Synopse … l#TITL_100
Just don't point to a particular TSQLRecord class at TSQLRestBatch creation (pass nil instead), so you will be able to update different tables in one batch.
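A minimal sketch of what that could look like, using mORMot 1.x class names; fRest, the records and the row count are placeholders, and exact signatures may differ in your mORMot version:

  var
    batch: TSQLRestBatch;
    results: TIDDynArray;
  begin
    // nil = the batch is not tied to one table, so records of
    // different TSQLRecord classes can be mixed in it
    batch := TSQLRestBatch.Create(fRest, nil, {AutomaticTransactionPerRow=}10000);
    try
      batch.Update(aRecordOfTable1);
      // ... more Add/Update/Delete calls on any table ...
      batch.Update(aRecordOfTableN);
      fRest.BatchSend(batch, results); // sent and executed as one unit
    finally
      batch.Free;
    end;
  end;

All writes are then transmitted in a single request and executed server-side within one transaction per AutomaticTransactionPerRow block.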
The project is already too big (1200+ tables), and sometimes I have to run SQL directly too.
I see. I try to follow the ORM approach as much as I can, but I understand that not everything can be handled easily, especially with such big legacy DBs.
I hope somebody else can advise you for your case.
You are not using SQL but the ORM in your code snippet.
So the proper way to handle transactions in this context is to use an ORM batch.
If you are really using only SQL, don't mix in the ORM/REST transaction methods.
Use standard SQL transactions, within an ORM WriteLock/WriteUnlock if you prefer.
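For the pure-SQL case, a hedged sketch using SynDB directly; fProps is assumed to be your TSQLDBConnectionProperties and fServer your TSQLRestServer, and the exact method names may vary with your mORMot version:

  var
    conn: TSQLDBConnection;
  begin
    fServer.WriteLock; // optional: also block concurrent ORM writes meanwhile
    try
      conn := fProps.ThreadSafeConnection;
      conn.StartTransaction;   // standard SQL transaction, not the ORM one
      try
        fProps.ExecuteNoResult('update table1 set x = ?', [SomeValue]);
        // ... more direct SQL statements ...
        conn.Commit;
      except
        conn.Rollback;
        raise;
      end;
    finally
      fServer.WriteUnlock;
    end;
  end;

The point is that the SQL transaction lives entirely at the database connection level, while the WriteLock only serializes it against concurrent ORM writes.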
Sometimes I mix both: I use the ORM as much as I can, but sometimes I need to run some complex queries. It's a single user per DB, though.
I do something like:

  fRest.TransactionBegin(TXXX, 1);
  try
    fRest.ExecuteFmt('update XXX set y = z', [], []);
    xxx := TXXX.CreateAndFillPrepare(...);
    try
      // some specific updates
      // then SQL again, with some complex joins using TSQLTableJSON
      // then commit
  ...

How screwed am I?
If you don't access the DB from outside, you can just use TSqlRestServer.WriteLock/WriteUnlock, which works like a global transaction for the ORM write methods.
The trick is to do it on the server side, and keep the lock as small as possible.
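For example, as a method-based service on the server; all names here are hypothetical and the SQL is only a placeholder:

  procedure TMyServer.UpdateStuff(Ctxt: TSQLRestServerURIContext);
  begin
    // do all reads and data preparation BEFORE taking the lock ...
    WriteLock;
    try
      // ... and keep only the actual writes inside it
      Execute('update table1 set y = 1');
    finally
      WriteUnlock;
    end;
    Ctxt.Success;
  end;

Since the method runs on the server, no client can interleave ORM writes between the statements, and the lock is held only for the duration of the writes themselves.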