Perhaps you are in fact searching for the following function:
function JsonDecode(P: PUtf8Char; out Values: TNameValuePUtf8CharDynArray; HandleValuesAsObjectOrArray: boolean = false): PUtf8Char; overload;
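For reference, a minimal hedged sketch of how this overload could be called (assuming mORMot 2's mormot.core.base/mormot.core.json units; the TValuePUtf8Char helper names such as ToUtf8 are from memory and may differ between revisions):

```pascal
uses
  mormot.core.base,
  mormot.core.json;

procedure DecodeSample;
var
  json: RawUtf8;
  values: TNameValuePUtf8CharDynArray;
  i: PtrInt;
begin
  json := '{"name":"abc","count":10}';
  // JsonDecode() parses the buffer in-place, so ensure it is writable
  if JsonDecode(UniqueRawUtf8(json), values) <> nil then
    for i := 0 to high(values) do
      writeln(values[i].Name.ToUtf8, ' = ', values[i].Value.ToUtf8);
end;
```

The dynamic array grows with the number of name/value pairs found, so no upper bound has to be known in advance.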
Yes, it seems that TNameValuePUtf8CharDynArray does the trick, thank you!
But with a custom reader, you should know the field names first, so the TJsonParserContext.ParseObject method with the fixed names parameter is to be used.
Unfortunately I don't know field names in advance because I use it as a serializer/deserializer for custom models in generic way. That's why I cannot use static arrays.
Will need to do some more advanced tests, but it looks promising for now
Thanks again for the fast response, it's much appreciated.
Hi,
I'm trying to migrate my projects from a pretty old version of mORMot (from 2018) and encountered a problem with the JSONDecode method, which was changed and no longer accepts Values in dynamic form:
Values: TPUtf8CharDynArray;
While searching forum I've found this post https://synopse.info/forum/viewtopic.ph … 876#p29876 and of course there is also description in documentation (chapter 10.1.6.1. Custom class serialization).
The solution provided there, however, does not satisfy me, because I don't know in advance how many items I'll need, so I can't really use a static array.
I tried to use this:
Values: TValuePUTF8CharArray;
but it gives me an AV, and that made me come up with this ugly workaround:
Values: array[0..10000] of TValuePUTF8Char;
I'd really like to know: what is the proper way of implementing a custom reader nowadays?
Thanks in advance.
Hi, any thoughts about this? Anyway, I'm on a short vacation now with limited net access, so I won't be able to do anything more till next Monday.
OK. Here is quick and dirty prototype: https://github.com/dominikcz/custom-serialization-idea
I've tried to make it as close to my original code as possible, so there are some parts that have nothing to do with the simple types we discuss here (just ignore them). However, in a few places I had to remove some functions and substitute them with default Delphi counterparts (to reduce dependencies at the cost of performance).
Please note that only part of the writer's code is implemented via the registry. Ideally, all the code from the WriteValue() function should be implemented like that. And last but not least, there are no readers implemented.
Tell me what you think about it.
Yes, properties and fields. I have done a prototype of this idea at a higher level, but I think it would be better to make it a part of Synopse. If you want, I should be able to prepare a repository on GitHub with a sample application by tomorrow.
Yes, it would partly solve the problem, but inheritance is almost never a good choice, at least not in the long term. And you won't achieve the full DRY principle with hundreds of models; it will maybe be a little less wet.
That's why I prefer to encapsulate the serialization/deserialization of simple types and in this way aid the serialization of the models that contain them. It may be a strategy pattern or a template method, but either way it's more about composition than inheritance.
dc
As for DTOs, they don't help here because you still have to write custom serialization and/or deserialization code. The problem is the unnecessary repetition of the same code for different simple types in the DTO. As far as visibility of fields/properties is concerned, it's not a problem, because I use the {$RTTI EXPLICIT ...} directive for my models and put the custom serialization and deserialization flow in TJSONSerializer.RegisterCustomSerializer.
dc
Good but what about classes? Will it work with classes too?
I'm familiar with DTOs, and the problem is not a lack of design patterns in my application. Maybe I gave you too-trivial examples, but it's not a problem with the complexity of the models, it's more about scale: I have around 160 models with different serialization and deserialization needs. I can usually quite easily define a type that has some specific serialization, and use it across different models. Last time I wrote about decimals, but there is also a case for TDateTime, which you may need as:
iso 8601 format yyyy-mm-dd hh:mm:ss
iso 8601 format with T instead of space yyyy-mm-ddThh:mm:ss
iso 8601 format with milliseconds yyyy-mm-dd hh:mm:ss.zzz
custom integer like format yyyymmddhhmmss
...and so on
When you use custom TDateTime types most if not all problems are solved
dc
I have some "data models" in my application; some of them are records, but most are classes. From my perspective it's not that important - let's assume I can't change it and would like to be able to work with both.
I need much more control over serialization/deserialization than just looking at published properties. That's why I use RegisterCustomSerializer() to begin with, so I can alter the way serialization and deserialization work. My point is that having only RegisterCustomSerializer() leads to ugly code, which could easily be made better by using some registry for simple types. From this perspective, using TSynPersistent won't help either.
By simple data types I mean all possible native Delphi data types except records, objects and classes (i.e. integer, currency and so on). To that group I also add custom types like TDecimal2 = Currency from my example code.
Maybe a use case will show it better:
Let's say I need to serialize numbers differently based on precision. Numbers with precision up to 2 decimal places are supposed to always serialize with two decimal places, and numbers with precision up to 3 always with three decimal places.
Then in an application I would just declare two new types, TDecimal2 and TDecimal3, both as Currency, and use a custom reader and writer for serialization and deserialization. It separates my business needs from implementation details and even gives me the opportunity to reuse parts of my code in different places. It can work with JSON serialization, but with XML as well.
dc
Hi, I have following example:
type
  TDecimal2 = Currency;
  TDecimal3 = Currency;

  TTestClass1 = class
  public
    val2: TDecimal2;
    val3: TDecimal3;
  end;

  TTestClass2 = class
  public
    field: TDecimal3;
  end;
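One hedged side note on these declarations: in Delphi, TDecimal2 = Currency is a mere alias that shares Currency's RTTI, so a per-type lookup cannot tell the two apart; a genuinely distinct type requires the type keyword:

```pascal
type
  // 'type Currency' creates new, RTTI-distinct types,
  // unlike the plain aliases 'TDecimal2 = Currency'
  TDecimal2 = type Currency;
  TDecimal3 = type Currency;
```

Any registry keyed on type information would need the distinct-type form to dispatch differently for TDecimal2 and TDecimal3.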
I would like to serialize these objects with a custom serializer. I know how to do it with TJSONSerializer.RegisterCustomSerializer and it all works great. However, I'm not quite happy with that method, because you have to either:
prepare serialization individually for different classes and repeat reader and writer code for every custom type, which is not very DRY, or
prepare one big God-like class for serializing everything and make your code look more like spaghetti, with a bunch of cases and type comparisons.
Either way, the code gets hard to maintain very quickly.
What I'd like to do is rather:
register custom reader and writer for main class(es) you need or let's say TObject which covers only serialization/deserialization flow
register custom reader and writer for specific simple types like TDecimal2, TDecimal3 which would use registered custom readers/writers
I've spent some time analyzing the SynCommons.pas and mORMot.pas code but can't find a way to register readers and writers for simple types. Is that correct, or am I missing something? I think such a feature would be beneficial for the framework, and it's something I can contribute. What do you think?
dc
Thank you for merging
I have updated this fix to current code base and opened a pull request on github. Can someone please take a look at it?
dc
JWTs are just as vulnerable to MITM attacks as any other authentication scheme, and OAuth has nothing to do with it. However, solutions do exist, like using HTTPS-only cookies (HttpOnly; Secure). You should also consider prevention of CSRF and replay attacks. For CSRF you may consider using another cookie (a domain cookie, accessible from JavaScript, sent alongside the request). For replay-attack prevention you can use the exp and iat claims of the JWT. There is also the jti claim (a unique JWT ID), which can be used, for example, to blacklist used tokens.
If this information is too condensed, I can try to elaborate a little.
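To illustrate the exp/iat/jti idea above in a hedged, library-agnostic way (ParseJwtClaims and IsBlacklisted below are made-up helper names, not any real API):

```pascal
// Illustrative only: validate the time claims and refuse a blacklisted jti
function IsTokenAcceptable(const jwt: string): boolean;
var
  exp, iat: TDateTime;
  jti: string;
begin
  ParseJwtClaims(jwt, exp, iat, jti); // hypothetical payload decoder
  result := (Now < exp)                        // exp: token not expired
    and (Now - iat < EncodeTime(0, 15, 0, 0))  // iat: 15-minute replay window
    and not IsBlacklisted(jti);                // jti: not already consumed
end;
```

The replay window length and the blacklist storage are policy decisions; the sketch only shows where each claim fits into the check.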
Any comments for this patch? Is there a chance to include it in official source?
OK, I've made some tests with different ways of handling output parameters in mORMot and it looks like mORMot used to work better in 2013 :(
First of all here is my test procedure:
create procedure test_output_params @param1 varchar(30) OUTPUT, @param2 int OUTPUT, @param3 varchar(30) OUTPUT
as
begin
SET NOCOUNT ON
set @param1 = 'echo: '+@param1
set @param2 = 100 + @param2
set @param3 = 'test param3'
end
It accepts 3 parameters: @param1, @param2 and @param3. The values of @param1 and @param2 are changed depending on the input, while the value of @param3 is only set inside the procedure.
You can test it with SQL statement like this one:
declare
@p1 varchar(30),
@p3 varchar(30),
@p2 int;
set @p1 = 'test'
set @p2 = 100
exec test_output_params @p1 OUTPUT, @p2 OUTPUT, @p3 OUTPUT
select @p1 as p1, @p2 as p2, @p3 as p3
Query above returns:
p1 p2 p3
------------------------------ ----------- ------------------------------
echo: test 200 test param3
Ok, now back to mORMot. Here is complete test code:
program stored_proc_output_params;

{$APPTYPE CONSOLE}
{$R *.res}

uses
  SysUtils, SynDB, SynOleDB;

var
  stmt: TSQLDBStatement;
  param1, param3: string;
  param2: integer;
  outParam: Variant;
  vConnProp: TOleDbConnectionProperties;
  conn: TSQLDBConnection;
begin
  vConnProp := TOleDBMSSQLConnectionProperties.Create('localhost', 'master', '', '');
  conn := vConnProp.ThreadSafeConnection;
  conn.Connect;
  stmt := conn.NewStatement;
  stmt.Prepare('exec test_output_params ? output, ? output, ? output', true);
  param1 := 'test 123';
  param2 := 123;
  stmt.BindTextS(1, param1, paramOut);
  stmt.Bind(2, param2, paramOut);
  stmt.BindTextS(3, ' ', paramOut);
  stmt.ExecutePrepared;
  Writeln('expected stored procedure output:');
  Writeln(format(' param1: "echo: %s"', [param1]));
  Writeln(format(' param2: %d', [param2 + 100]));
  Writeln(' param3: "test param3"');
  Writeln('actual stored procedure output:');
  stmt.ParamToVariant(1, outParam, false);
  param1 := outParam;
  stmt.ParamToVariant(2, outParam, false);
  param2 := outParam;
  stmt.ParamToVariant(3, outParam, false);
  param3 := outParam;
  Writeln(format(' param1: "%s"', [param1]));
  Writeln(format(' param2: %d', [param2]));
  Writeln(format(' param3: "%s"', [param3]));
  stmt.Free;
  Writeln('press [ENTER] to exit');
  conn.Free;
  Readln;
end.
As in SQL example expected output would be:
expected stored procedure output:
param1: "echo: test 123"
param2: 223
param3: "test param3"
instead in current version of mORMot we get:
actual stored procedure output:
param1: "test 123"
param2: 123
param3: "t"
It looks like param3 here is the problem that and4sh wrote about earlier. If you bind ' ' (a single space) you only get the first letter as output. In this example it's only "t" instead of "test param3". If you bind a longer string you get a longer result.
In the old version of mORMot (1f0ccd0924 from 2013) there were the same problems for param1 and param2, but param3 used to work as expected. Here is the output from the old version:
expected stored procedure output:
param1: "echo: test 123"
param2: 223
param3: "test param3"
actual stored procedure output:
param1: "test 123"
param2: 123
param3: "test param3"
Uff... I hope this clarifies the topic a bit. Sorry for making it this long.
dc
I've recently upgraded to the current mORMot sources and now I wonder which of my patches should go back into your sources. For now I think you should consider the correction for SynDB.pas and SynXXXDB.pas.
The patch is about letting the user (not mORMot) decide whether to strip the last semicolon in a query. It's backward compatible, of course.
Here is the patch for SynDB.pas:
1849d1848
< fStripSemicolon: boolean;
2333d2331
< property StripSemicolon: boolean read fStripSemicolon write fStripSemicolon;
6721d6718
< fStripSemicolon := true;
7444,7449c7441,7443
< if StripSemicolon then
< begin
< if (L>5) and (aSQL[L]=';') and // avoid syntax error for some drivers
< not IdemPChar(@aSQL[L-4],' END') then
< fSQL := copy(aSQL,1,L-1)
< end else
---
> if (L>5) and (aSQL[L]=';') and // avoid syntax error for some drivers
> not IdemPChar(@aSQL[L-4],' END') then
> fSQL := copy(aSQL,1,L-1) else
and corresponding one for SynOleDB.pas:
1808,1812c1808,1809
< if StripSemicolon then
< begin
< while (L>0) and (fSQL[L] in [#1..' ',';']) do
< dec(L); // trim ' ' or ';' right (last ';' could be found incorrect)
< end;
---
> while (L>0) and (fSQL[L] in [#1..' ',';']) do
> dec(L); // trim ' ' or ';' right (last ';' could be found incorrect)
One example of a case where it's not allowed to strip the last semicolon is a MERGE statement in SQL.
Here is preparation code:
create table test(
id int,
txt varchar(100),
primary key (id)
)
delete from test;
insert into test values (1, 'item 1'), (2, 'item 2');
and actual MERGE statement:
with new(id, txt) as (select 1, 'aqq' union select 5, 'aaaa')
merge test as old
using new on old.id = new.id
when matched then update set old.txt = new.txt
when not matched then insert values(new.id, new.txt);
if you now run select * from test you should get:
id txt
----------- ----------------------------------------------------------------------------------------------------
1 aqq
2 item 2
5 aaaa
If you remove semicolon from MERGE statement you get an error:
Msg 10713, Level 15, State 1, Line 5
A MERGE statement must be terminated by a semi-colon (;).
dc
Or... maybe it's a bug/missing feature in mORMot's SynOleDB handling of output params? I've encountered something very similar in the post from 2013-03-16 18:59:40
dc
Thank you for committing this patch. There is another thing that bothers me, though. Since this patch makes the code compatible with old releases (from 2013), and the change in behavior wasn't discovered by the current tests, it clearly means there is no test for registering custom serializations. Maybe it's time to add one, just to make long-term support easier?
dc
Can I count on a response? This patch is important for me to stay in sync with current mORMot sources.
Regards,
dc
It's nice to see that something is happening with FPC to bring it closer to Delphi developers. I hope it won't end up as big as CodeTyphon - though CT is a lot better than native FPC for me, it takes "a while" to make it ready to use.
Anyway good luck with this project
dc
Hi,
I've recently upgraded to the newest Synopse (2016-06-30 e55c874993) and noticed a problem with serialization. In the old version (1f0ccd0924 from 2013) it was possible to register a serialization callback for any object like this:
TJSONSerializer.RegisterCustomSerializer(TObject, TSerializer.ObjectReader, TSerializer.ObjectWriter);
Now it won't work, because of a change to the JSONObject() function in mORMot.pas, which now works only for exact types, ignoring inheritance. I think it was an intentional change, but maybe there is still hope to change it. I have around 160 types of objects that I want to serialize, and it's troublesome to register them all by hand.
instead of:
if aClassType<>nil then begin
aCustomIndex := JSONCustomParsersIndex(aClassType,aExpectedReadWriteTypes);
if aCustomIndex>=0 then begin
result := oCustom; // found exact custom type (ignore inherited)
exit;
end;
repeat // guess class type (faster than multiple InheritsFrom calls)
i := PtrUIntScanIndex(@TYP,MAX+1,PtrUInt(aClassType));
there should be:
if aClassType<>nil then begin
repeat // guess class type (faster than multiple InheritsFrom calls)
aCustomIndex := JSONCustomParsersIndex(aClassType,aExpectedReadWriteTypes);
if aCustomIndex>=0 then begin
result := oCustom;
exit;
end;
i := PtrUIntScanIndex(@TYP,MAX+1,PtrUInt(aClassType));
If you prefer patch file format here it is:
45851,45855d45850
< aCustomIndex := JSONCustomParsersIndex(aClassType,aExpectedReadWriteTypes);
< if aCustomIndex>=0 then begin
< result := oCustom; // found exact custom type (ignore inherited)
< exit;
< end;
45856a45852,45856
> aCustomIndex := JSONCustomParsersIndex(aClassType,aExpectedReadWriteTypes);
> if aCustomIndex>=0 then begin
> result := oCustom;
> exit;
> end;
Regards,
dc
Only single statement execution is supported yet.
Multiple statement execution is a feature not implemented in all libraries yet, so we stick to the simplest model, that is one SQL statement per SynDB statement.
Hmm... maybe it's not supported, but it works, and we are still using an old version from March or April 2013. We use multi-statement batches in other places too, and they don't produce any transaction errors later. Probably mORMot just works well with MS SQL. Anyway, my post was just to let other people know what might be the reason for this error.
dc
1. Sometimes, and I honestly don't know why, MainConnection.StartTransaction raises an exception saying:
A request to establish a connection with the transaction manager was denied - A request to establish a connection with the transaction manager was denied
While using ODBC drivers you get a slightly more meaningful error:
A request to establish a connection with the transaction manager was denied - the transaction could not be started because it is used more than one ODBC connection.
....
dc
I have found a way to reproduce this error and thought I'd leave a comment here for those struggling with it. In our case, the reason for this error was using multiple statements in one, like this:
stmt := dbConn.NewStatementPrepared('set ROWCOUNT 1 '
+' select * from yourtable order by id desc '
+' set ROWCOUNT 0 ', false, true);
stmt.ExecutePrepared;
The first call to BeginTransaction after this query always generates the above error. In this case we could easily change the query to use "select top 1 * from yourtable order by id desc" instead.
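For completeness, a hedged sketch of that single-statement replacement (reusing dbConn and yourtable from the snippet above; the SynDB signatures are quoted from memory):

```pascal
// One plain SELECT instead of the SET ROWCOUNT batch, so no extra
// result sets are left pending on the connection
stmt := dbConn.NewStatementPrepared(
  'select top 1 * from yourtable order by id desc', true, true);
stmt.ExecutePrepared;
while stmt.Step do
  writeln(stmt.ColumnInt(0)); // consume the single returned row
```

Reading the result set to the end (or releasing the statement) is what keeps the session clean for the following BeginTransaction.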
Hope this saves someone's time.
dc
Your JSON input is pretty weird.
Using an evolving integer value, changed into a string, as a key, is not common! Are you sure you would not be able to use an array instead?
In all cases, with custom serialization, every layout is possible.
To fetch a property name, you can just call GetJSONPropName().
Then JsonToObject() for the value.
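Following that advice, a hedged sketch of such a reader loop (the mORMot 1.x helper signatures - GetJSONPropName(), JSONToObject(), GotoNextNotSpace() - are quoted from memory and may differ between revisions):

```pascal
// Illustrative reader for the {"24203":{...},"24202":{...}} layout
procedure LoadDict(P: PUTF8Char; dict: TObjectDictionary<string, TMyClass>);
var
  key: PUTF8Char;
  item: TMyClass;
  valid: boolean;
begin
  if P = nil then
    exit;
  P := GotoNextNotSpace(P);
  if P^ = '{' then
    inc(P); // enter the outer JSON object
  repeat
    key := GetJSONPropName(P); // e.g. '24203'
    if key = nil then
      break;
    item := TMyClass.Create;
    P := JSONToObject(item, P, valid); // fill the fields of item
    if valid then
      dict.Add(UTF8ToString(key), item)
    else
      item.Free;
  until (P = nil) or (P^ = '}');
end;
```

The loop alternates between a property name (the dictionary key) and an object value, which matches the layout shown in the JSON sample below.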
It's not possible to change the JSON structure. An integer key as a string was used in this case because of JSON limitations (or features) - you cannot have a numerical property name. It's just one of many examples; other dictionaries have real string keys. Anyway, I've managed to write a generic parser and it works fast! Thank you very much for your help.
dc
Hi, I would like to know the best (most efficient) way to parse JSON containing a dictionary structure like this one:
{
  "24203": {
    "prop1": 0,
    "prop2": "test 24203"
  },
  "24202": {
    "prop1": 1,
    "prop2": "test 24202"
  },
  "24201": {
    "prop1": 2,
    "prop2": "test 24201"
  }
}
What I'd like to achieve is deserializing it directly into this structure:
type
  TMyClass = class
    prop1: integer;
    prop2: string;
  end;
  TMyDictionary = TObjectDictionary<string, TMyClass>;

var
  myDict: TMyDictionary;

// and then use sth. like this
JSONToObject(myDict, jsonStr, valid, TMyDictionary);
I know I have to use TJSONSerializer.RegisterCustomSerializer, but I'd like to know the best way to work with the PUTF8Char type. I've seen some functions in SynCommons like GetJSON* but don't have a clear view of the parsing process yet. I would like to have a generic way to load a TObjectDictionary<TKey,TValue> from JSON where TValue is a simple class. Could you help me get started?
dc
Glad I could contribute to your great project
Hi, there is no overridden UpdateCount method in TOleDBStatement.
I've made a quick hack and it seems to work properly with SQL Server; I'm not sure, though, whether it is consistent with your vision. Here is the patch made with WinMerge (file SynOleDB.pas, based on leaf 672e31903b):
999a1000
> function UpdateCount: integer; override;
1592c1593
< OleDBConnection.OleDBCheck(fCommand.Execute(nil,DB_NULLGUID,fDBParams,nil,nil));
---
> OleDBConnection.OleDBCheck(fCommand.Execute(nil,DB_NULLGUID,fDBParams,@fTotalRowsRetrieved,nil));
1675a1677,1683
> end;
>
> function TOleDBStatement.UpdateCount: integer;
> begin
> if not fExpectResults then
> result := TotalRowsRetrieved
> else result := 0;
regards
dc
I don't really care how it is done at the SQLite3 or mORMot internal level, as long as I can accomplish my goal, which is correctly storing Delphi's TDateTime values into appropriate fields in the DB. There are many databases and many datetime-like types around. In SQL Server alone you have:
- date - only date part, so no time issues here and ISO is fine
- datetime - old type rounded to increments of .000, .003, or .007 seconds (3.33 milliseconds)
- datetime2 - new type since SQL 2008 - accuracy of 100 nanoseconds.
- datetimeoffset - same as datetime2 but with additional time zone offset.
- smalldatetime - accuracy of 1 minute (without seconds part)
- time - accuracy of 100 nanoseconds.
Apart from the datetime type, all the other types fully cover Delphi's TDateTime or TTime types.
There is, however, a problem with using string literals as date and time values (http://msdn.microsoft.com/en-us/library … 05%29.aspx):
SQL Server might interpret a date value in a string literal format, input by an application or user for storage or to a date function, as different dates. The interpretation depends on the combination of string literal format, data type, and runtime SET DATEFORMAT, SET LANGUAGE and default language option settings.
...
The ISO 8601 formats, '1998-02-23T14:23:05' and '1998-02-23T14:23:05-08:00' are the only formats that are an international standard. They are not DATEFORMAT or default login language dependent and are multilanguage.
The ISO 8601 format with a milliseconds part (which I assume you want to use) is language-dependent, so it will lead to problems when using the old datetime type instead of the new datetime2:
DECLARE
@today2 datetime2,
@today datetime
set LANGUAGE French -- you can try Italian or Polish too
set @today2 = '2013-03-22 11:12:13.456' -- this is OK
begin try
set @today = '2013-03-22 11:12:13.456' -- this will raise error
end try
begin catch
set @today = @today2 -- this is OK
end catch
select @today2 as 'asDateTime2', @today as 'asDateTime', DATENAME(month, @Today) AS 'Month Name'
SET LANGUAGE us_english
set @today2 = '2013-03-22 11:12:13.456' -- this is OK
set @today = '2013-03-22 11:12:13.456' -- this is OK
select @today2 as 'asDateTime2', @today as 'asDateTime', DATENAME(month, @Today) AS 'Month Name'
I would not use string literals for query parameters. Wouldn't binding the native TDateTime type, or maybe a special type (like DateTimeToSQLTimeStamp from Data.SqlTimSt.pas), be better?
I hope you find the above info helpful.
dc
Thank you, it's really good news for me
About DateTime you have to use DateToSQL() / DateTimeToSQL() functions when transmitting such a parameter, as stated by the documentation. It should work on all targets.
About BigInt, what is the exact problem? Stored procedures are a bit out of the scope of the ORM, IMHO.
They are available for convenience, not fully tested.
So any patch to fix is welcome.
As for bigint, it seems there is no problem; there was a problem with the Devart drivers and SQL Server 2005 (you had to cast it to string for values > maxint). With mORMot and SQL 2008 it's working correctly.
There is, however, a problem with the DateTimeToSQL() function - it converts to ISO 8601 and therefore loses the millisecond part of the time. When you pass a TDateTime directly, it behaves differently depending on which Bind*() method is used. Take a look at this example:
dt := EncodeDateTime(2013, 03, 21, 13, 20, 21, 234);
stmt := dbConn.NewStatementPrepared('insert into dummy_table (col_datetime) values(?)', false, true);
stmt.BindDateTime(1, dt);
stmt.ExecutePrepared;
stmt := dbConn.NewStatementPrepared('insert into dummy_table (col_datetime) values(?)', false, true);
stmt.Bind([dt]);
stmt.ExecutePrepared;
you will have two wrong dates in the database:
from the first insert: 2013-03-21 13:20:21.000 (because of the DateTime-to-ISO conversion)
and from the second: 2013-03-23 13:20:21.233 (1 ms difference)
Of course, when using ODBC you never get the milliseconds part...
As for stored procedures I'll gladly share my fixes when I have them
regards,
dc
I understand your points about the KISS strategy and agree with them, so maybe it's better to implement nested transactions in user code. On the other hand, naming transactions doesn't seem to conflict with KISS principles, and it could easily be implemented in TSQLDBConnection.StartTransaction.
As you said
But since OleDB is deprecated, and ODBC does not support all features, won't it be a dead-end?
As Microsoft states in their roadmap from January 2013 (http://msdn.microsoft.com/en-us/library/ms810810.aspx):
- OleDB will be supported only up to SQL Server 2012 (but corporations usually adopt slowly, so I suspect that version 2012 will still be in use for the next 5-10 years, as I know some big companies that are only now moving from version 2005 to 2008)
- ODBC will be supported "better" from now on - that's good, because right now it's not as good as OleDB
From the user's point of view it would probably be better to switch to ODBC for future versions of the code; however, I would expect OleDB and ODBC in mORMot to behave identically, so that when I change the connection from OleDB to ODBC for the same database it would just work. What I mean here is the inconsistent behavior of:
- datetime
- bigint
- stored procedure parameters and result codes http://synopse.info/forum/viewtopic.php?pid=7056#p7056
and so on...
dc
What you are naming a dbConnection is not... a connection I suppose.
This is a TSQLDBConnectionProperties instance.
I am aware of that.
So you are mixing some commands on the MainConnection and some on the thread safe pool.
1. Use a single TSQLDBConnection instance for all your requests.
2. Be aware that calling Execute() as you do returns an ISQLDBRows instance, which should be released on purpose.
Hmmm...
OK, I was just ctrl+clicking through the code, and it made me think that dbConnProp.MainConnection is handled by the TSQLDBConnectionProperties.GetMainConnection method, while in fact it's TSQLDBConnectionPropertiesThreadSafe.GetMainConnection. So basically I was wrong in assuming that only a single connection is used. Now it has all become clear, and I feel quite embarrassed...
However, when I use dbConnProp.ForceOnlyOneSharedConnection := true it should work properly (with one connection), shouldn't it?
And two more things:
1. Do you plan to add support for named transactions (using ITransactionOptions.SetOptions)? It would be really helpful for logging/debugging purposes
2. Is it possible to add support for nested transactions for SQL Server? I know the OleDB drivers for SQL Server don't support them natively, but SQL Server itself does. Currently, when you try to call conn.StartTransaction twice, you get a Run-time error '-2147168237 (8004d013)': Only one transaction can be active on this session.
However I think you can "unlock" nested transactions by doing this:
- use "SET XACT_ABORT ON" as soon as possible (for example just after creating connection and connecting do SQL Server)
- when calling StartTransaction you have to check whether fTransactionCount > 0. If it is, instead of the usual call:
fTransaction.StartTransaction(ISOLATIONLEVEL_READCOMMITTED,0,nil,nil)
you should just execute a "begin transaction" statement and increase fTransactionCount.
A similar approach should be used for commit and rollback. Of course, this should not be the default functionality, and it should be accessible only via some setting like UseNestedTransactions := true;
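The proposal could be sketched like this (purely hypothetical code, not existing mORMot API - the descendant class and the UseNestedTransactions field are made up, and field/property visibility is glossed over):

```pascal
type
  // Hypothetical descendant illustrating the nested-transaction idea
  TNestedTransOleDBConnection = class(TOleDBConnection)
  public
    UseNestedTransactions: boolean; // assumed new setting
    procedure StartTransaction; override;
  end;

procedure TNestedTransOleDBConnection.StartTransaction;
begin
  if UseNestedTransactions and (TransactionCount > 0) then
    // SQL Server itself accepts nested BEGIN TRANSACTION, even though
    // the OleDB provider allows only one session-level transaction
    with NewStatement do
    try
      Prepare('begin transaction', false);
      ExecutePrepared;
      // the internal transaction counter would also need incrementing here
    finally
      Free;
    end
  else
    inherited StartTransaction; // first level: real OleDB transaction
end;
```

Commit and Rollback would mirror this: emit "commit"/"rollback" statements while the counter stays above one, and fall back to the real OleDB call at the outermost level.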
What do you think?
dc
Btw, your code does not release the statements.
:-)
Yeah, I know... in my defense, I don't release the statements in either approach, and it gives errors only when using dbConnection.Execute (or, I suppose, NewStatementPrepared).
dc
Thank you for the reply. I can't check your fixes right now, but I will in the morning. As far as threads are concerned, that's not the case here, since I've tested transactions in a simple application in the main thread.
dc
Hi,
there were some issues regarding transactions with SQL Server and OleDB drivers raised in this topic by p.s.t.: http://synopse.info/forum/viewtopic.php?id=1131 - but since there are no responses, I would like to summarize the problems with transactions in SQL Server.
1. Sometimes, and I honestly don't know why, MainConnection.StartTransaction raises an exception saying:
A request to establish a connection with the transaction manager was denied - A request to establish a connection with the transaction manager was denied
While using ODBC drivers you get a slightly more meaningful error:
A request to establish a connection with the transaction manager was denied - the transaction could not be started because it is used more than one ODBC connection.
2. When you use at least two Execute() statements inside a transaction you get an error:
Cannot create new connection because in manual or distributed transaction mode.
Since I know this error from working with the Devart drivers, I was able to fix it quickly. Instead of calling:
dbConnection.MainConnection.StartTransaction;
dbConnection.Execute('select 1', []);
dbConnection.Execute('select 2', []); // - here you get error
I used this:
function Exec(query: string; params: array of const; const expectResults: Boolean = true): TSQLDBStatement;
begin
result := dbConnection.NewThreadSafeStatement;
result.Prepare(query, expectResults);
result.Bind(params);
result.ExecutePrepared;
end;
...
dbConnection.MainConnection.StartTransaction;
Exec('select 1', []);
Exec('select 2', []);
It looks like there is a bug in NewStatementPrepared: it doesn't close statements which return results when the user code doesn't explicitly read/close them. Maybe it should not be in the base class, and should be overridden by descendants just as NewStatement is?
3. Sometimes, and again I don't know why, the StartTransaction call passes OK, but the call to commit gives an error
Invalid TOleDBConnection.Commit call.
And this (I suppose) is the case of p.s.t.'s post.
dc
Hi, here is another difference between ODBC and OleDB in mORMot. When you have tables containing dates declared as the datetime type, there are incorrect results when querying the tables by date values. Here is a sample table with 1000 rows of dates with millisecond precision:
create table test_dates
(
id int not null identity,
dt datetime not null
)
delete from test_dates
declare
@i int,
@dt datetime;
set @i = 1
set @dt = getdate();
while @i <= 1000
begin
set @dt = DATEADD(millisecond, 5 * rand(@i)*10000, @dt)
set @i = @i + 1
insert into test_dates(dt) values (@dt)
end
pick any date from this table and try to select it like this:
dt := EncodeDateTime(2013, 3, 16, 23, 18, 21, 293);
printResultSet(dbConnection.Execute('select top 3 id, dt from test_dates where dt >= ?', [dt]));
printResultSet(dbConnection.Execute('select top 3 id, dt from test_dates where dt >= ?', [DateTimeToStr(dt)]));
For ODBC both queries work correctly; for OleDB, however, only the second (?!) one does. This is very strange, as I would expect passing a TDateTime to a query to behave better than passing a string with a date, which is very vulnerable to locale-settings differences between client and server.
dc
Sounds like they are using the IMultipleResults interface in such cases, which is not the case for SynOleDB.
Did you try to access the data via ODBC instead of OleDB?
Maybe ODBC does not suffer from this issue.
Hi, I've finally found some time to play with mORMot and stored procedures in SQL Server. I can now confirm that the ODBC drivers work correctly and are not affected by SET NOCOUNT. Unfortunately, there is another issue I found: I have no idea how accessing the output result from a stored procedure should be done. Here is the example procedure I used for my tests:
create procedure test_params
@param1 smallint,
@param2 int OUTPUT
as
begin
set @param2 = @param1 + 1000;
return(18)
end
As you can see, it takes 2 parameters: param1 as input and param2 as output. It also returns the value 18 as its result. You can check it by running this script in SQL Server Management Studio:
declare
@rc int,
@param2 int;
exec @rc = test_params 123, @param2 output
select @rc as result, @param2 as param2
when using dbexpress one would use something like this code:
ds.CommandType:= ctStoredProc;
ds.CommandText:= 'test_params';
ds.Params[1].AsInteger:= 123;
ds.ExecSQL;
rc := ds.Params[0].AsInteger;
param2 := ds.Params[2].AsInteger;
to access both params and the return value, but how do you do it using mORMot?
I've tried this way:
stmt.Prepare('exec ? = test_params ?, ?', false);
stmt.Bind(1, 0, paramOut);
stmt.Bind(2, 123, paramIn);
stmt.Bind(3, 0, paramOut);
stmt.ExecutePrepared;
stmt.ParamToVariant(1, rc, false);
stmt.ParamToVariant(2, param1, false);
stmt.ParamToVariant(3, param2, true);
It works when I use TOleDBODBCSQLConnectionProperties (rc = 18, param1 = 123, param2 = 1123). However, when I switch to TOleDBMSSQLConnectionProperties I get rc = 18 and param1 = 123, but param2 = 0.
Is it a bug in OleDB drivers or am I doing it wrong?
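As a side note, ODBC defines a canonical escape sequence for procedure calls which could be tried instead of the raw EXEC syntax; this is only a sketch under the assumption that the driver honors the {call} escape with the same parameter binding as above:

```pascal
// Sketch: the ODBC canonical call escape instead of 'exec ? = ...' syntax
// (assumes the driver supports the {? = call ...} escape sequence).
stmt.Prepare('{? = call test_params(?, ?)}', false);
stmt.Bind(1, 0, paramOut);   // return code
stmt.Bind(2, 123, paramIn);  // @param1
stmt.Bind(3, 0, paramOut);   // @param2
stmt.ExecutePrepared;
```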
And another question: do you have any plans to use IMultipleResults with mORMot?
regards
dc
As stated by http://technet.microsoft.com/en-us/libr … 17719.aspx it sounds as if it is pretty common to force SET NOCOUNT OFF on the client side.
What is the problem with using it as a workaround?
Sometimes it can be troublesome. Using SET NOCOUNT ON has both good and bad effects: the good ones are a speed increase and, in mORMot's case, correct result-set handling. The bad one is that it breaks the return-code value from stored procedures (at least that was the case with SQL Server 2005 and the Devart drivers). If you only want to get a result set or parameters from a stored procedure it's OK, but I have to check whether it behaves properly when you need the return code too.
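The interaction can be checked in isolation with a small T-SQL script; this is just a hypothetical test harness for the behavior described above, not output verified against any particular driver:

```sql
-- Sketch: check whether the procedure's return code survives SET NOCOUNT ON.
create procedure test_nocount_rc
as
begin
  set nocount on;          -- suppresses DONE_IN_PROC rows-affected messages
  select 1 as dummy;
  return(18);              -- does the client driver still see this value?
end
go
declare @rc int;
exec @rc = test_nocount_rc;
select @rc as result;      -- 18 in Management Studio; drivers may differ
```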
It sounds as if it is used explicitly when working with OleDB clients.
I was not able to find out how DONE_IN_PROC messages are translated at the OleDB level.
In fact, DONE_IN_PROC is part of the TDS protocol.
Isn't it a problem of OleDB itself?
I think it's not a problem at the OleDB level, because the Devart drivers for SQL Server handle it properly, and since they've managed to do it I assume there is a way.
Regards,
dc
Hi,
I did some tests and it looks like there really is a problem with mORMot transactions in MS SQL Server.
As far as stored procedures are concerned there is a problem too; however, it's not the temporary-table problem.
The thing is that SQL Server sends (as part of the result stream) messages with the number of rows affected by each statement executed in a stored procedure. This is the default behavior, and it can be changed with the "SET NOCOUNT ON" directive. In other words, if your stored procedure does anything beyond plain select statements (and it usually does much more), mORMot will not show the result set (ISQLDBRows.Step returns false). There is also another problem (missing feature?) with SynOleDB.TOleDbConnectionProperties.Execute: what if you get multiple result sets? How do you handle them?
Here is a simple script to show those problems:
create database dev;
go
use dev;
create table dummy_table(
  id integer not null identity,
  col1 integer
)
go
-- this procedure returns 3 resultsets
create procedure test_resultsets(@withNoCount smallint = 0)
as
begin
  if IsNull(@withNoCount, 0) = 1
    SET NOCOUNT ON
  insert into dummy_table(col1) values(1);
  select 'resultset1: aqq 1.1' as col1, 1100 as col2
  union
  select 'resultset1: aqq 1.2' as col1, 1200 as col2;
  select 'resultset2: aqq 2' as col1, 2345 as col2;
  select 'resultset3, from dummy_table:' as col1, id from dummy_table;
end
To test this, open SQL Server Management Studio and connect to your local server instance. Press Ctrl+N (it opens a "New Query" tab), paste the above code and click the "Execute" button or press F5. It creates a new database "dev", the table "dummy_table" and the stored procedure "test_resultsets". You can now open a new query window and type "exec test_resultsets". When you press F5 you will see 3 result sets.
When you then switch to the "Messages" tab you will see the messages SQL Server sent to the client during execution (the first message comes from the insert statement).
Of course the second and third result sets are not necessary to show that mORMot handles the messages incorrectly (executing an insert in the stored procedure followed by a select statement is enough); they are here just to show another problem: handling multiple result sets.
Here is project to test how mORMot handles it:
program mORMot_StoredProcTest;

{$APPTYPE CONSOLE}
{$R *.res}

uses
  SynOleDB,
  SynCommons,
  SynDB,
  SysUtils;

var
  dbConnection: TOleDbConnectionProperties;

procedure printResultSet(rows: ISQLDBRows);
var
  s, ss, pad: string;
  row, col: integer; // separate counters: reusing one variable for both
                     // loops would clobber the row count
begin
  pad := ' ';
  s := StringOfChar(' ', 5) + pad;
  ss := StringOfChar('-', 5) + pad;
  for col := 0 to rows.ColumnCount - 1 do
  begin
    s := s + Format('%30s', [trim(rows.ColumnName(col))]) + pad;
    ss := ss + StringOfChar('-', 30) + pad;
  end;
  writeln(s);
  writeln(ss);
  row := 0;
  while rows.Step do
  begin
    inc(row);
    s := Format('%5d ', [row]);
    for col := 0 to rows.ColumnCount - 1 do
      s := s + Format('%30s', [Copy(rows.ColumnString(col), 1, 30)]) + pad;
    writeln(s);
  end;
  writeln(ss);
  writeln(Format('%d row(s) returned.', [row]));
  writeln;
end;

begin
  // connection made using Windows authentication to database "dev";
  // if using SQL Server authentication, supply user and password too
  dbConnection := TOleDBMSSQLConnectionProperties.Create('localhost\sqlexpress', 'dev', '', '');
  writeln('default behavior ("exec test_resultsets"):');
  writeln;
  printResultSet(dbConnection.Execute('exec test_resultsets ?', [0]));
  writeln('NOCOUNT=ON ("exec test_resultsets 1"):');
  writeln;
  printResultSet(dbConnection.Execute('exec test_resultsets ?', [1]));
  readln;
end.
It looks like mORMot doesn't correctly handle the DONE_IN_PROC messages sent by SQL Server (it should ignore them and wait for the real result set, if there is any). You can check the documentation here: http://msdn.microsoft.com/en-us/library/ms189837.aspx
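In the meantime, a possible client-side workaround is to suppress the rows-affected messages for the session before calling the procedure. A sketch, assuming your SynDB version offers ExecuteNoResult and that the SET persists because the same pooled, thread-safe connection is reused:

```pascal
// Workaround sketch: suppress DONE_IN_PROC messages for this session so the
// first Execute result is the real result set (assumes ExecuteNoResult exists
// in your SynDB version and that the same pooled connection is reused).
dbConnection.ExecuteNoResult('SET NOCOUNT ON', []);
printResultSet(dbConnection.Execute('exec test_resultsets ?', [0]));
```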
Hope this post helps to clarify things a bit.
Regards,
dc
Thank you very much, works great.
Which provider do you use?
SQLNCLI10?
I tried to fix the issue in http://synopse.info/fossil/info/a58196185f
Yes, thank you, your patch fixes both issues (rowid and type). However there is one more problem; when I try to execute something like this:
DB.Delete(TSQLMyClass, 'PLU < ?', [20000]);
I get the error "OLEDB Error 80040E14 - (line 1): Incorrect syntax near ':'." and the query in the ExecuteDirect method is: "delete from dbo.Assortment where PLU < :(20000):"
dc
OK, I've stepped through the code and there is this piece of code:
function TOleDBConnectionProperties.ColumnTypeNativeToDB(
  const aNativeType: RawUTF8; aScale: integer): TSQLDBFieldType;
begin
  result := OleDBColumnToFieldType(GetInteger(pointer(aNativeType)), aScale);
end;
The problem is that GetInteger(pointer(aNativeType)) returns 0 for all the native types used in my example ('bigint', 'nvarchar', 'datetime', 'money'). In TSQLRestServerStaticExternal.ExecuteFromJSON, fFields has this value:
(
('ID', 'bigint', ftUnknown, 0, 19, 0, False),
('PLU', 'bigint', ftUnknown, 0, 19, 0, False),
('Name', 'nvarchar', ftUnknown, 250, 0, 0, False),
('ArtNr', 'nvarchar', ftUnknown, 100, 0, 0, False),
('PurchDate', 'datetime', ftUnknown, 0, 0, 0, False),
('ExpiryDate', 'datetime', ftUnknown, 0, 0, 0, False),
('Price', 'money', ftUnknown, 0, 19, 4, False)
)
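One direction for a fix (purely a hypothetical sketch, since the real mapping belongs in the framework itself) would be to translate the native type names rather than parse them as integers, for example in a ColumnTypeNativeToDB override:

```pascal
// Hypothetical sketch: map SQL Server native type names to TSQLDBFieldType
// by name instead of treating them as OleDB numeric type codes
// (TFixedOleDBMSSQLConnectionProperties is an illustrative subclass name).
function TFixedOleDBMSSQLConnectionProperties.ColumnTypeNativeToDB(
  const aNativeType: RawUTF8; aScale: integer): TSQLDBFieldType;
begin
  if IdemPropNameU(aNativeType, 'bigint') or IdemPropNameU(aNativeType, 'int') then
    result := ftInt64
  else if IdemPropNameU(aNativeType, 'nvarchar') or IdemPropNameU(aNativeType, 'varchar') then
    result := ftUTF8
  else if IdemPropNameU(aNativeType, 'datetime') then
    result := ftDate
  else if IdemPropNameU(aNativeType, 'money') then
    result := ftCurrency
  else // fall back to the inherited numeric-code interpretation
    result := inherited ColumnTypeNativeToDB(aNativeType, aScale);
end;
```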
dc
Sounds right to me.
I can't find out directly what is wrong here.
The error "Invalid Types[0]=0" sounds like a column with an invalid type.
Can you debug your code, and check which column/field is faulty, and how it is retrieved from the DB point of view, at the SELECT level?
At the select level it is adding a RowID column to the query. Here is SQLTableSimpleFields from the debugger (the table name was changed to Assortment, but that's irrelevant):
(
('PLU,Name,ArtNr,PurchDate,ExpiryDate,Price', 'Assortment.PLU,Assortment.Name,Assortment.ArtNr,Assortment.PurchDate,Assortment.ExpiryDate,Assortment.Price'),
('RowID,PLU,Name,ArtNr,PurchDate,ExpiryDate,Price', 'Assortment.RowID,Assortment.PLU,Assortment.Name,Assortment.ArtNr,Assortment.PurchDate,Assortment.ExpiryDate,Assortment.Price')
)
One more thing: there are proper values in Decoder.FieldValues; however, Types contains only ftUnknown values (and then the exception "ExecuteFromJSON: Invalid Types[%d]=%d" is raised).
Hi, here is the definition of TSQLMyClass:
TSQLMyClass = class(TSQLRecord)
private
  fPLU: Int64;
  fArtNr: RawUTF8;
  fName: RawUTF8;
  fPurchDate: TDateTime;
  fExpiryDate: TDateTime;
  fPrice: Currency;
published
  property PLU: Int64 read fPLU write fPLU;
  property Name: RawUTF8 read fName write fName;
  property ArtNr: RawUTF8 read fArtNr write fArtNr;
  property PurchDate: TDateTime read fPurchDate write fPurchDate;
  property ExpiryDate: TDateTime read fExpiryDate write fExpiryDate;
  property Price: Currency read fPrice write fPrice;
end;
Hi,
We are currently investigating the possibility of using your great framework in our projects. However, we have a problem using it with MS SQL Server. Our example project works perfectly with the default settings, but since we want to use SQL Server I had to add these two lines:
Props := TOleDBMSSQLConnectionProperties.Create('localhost\SQLEXPRESS', 'testDB', 'sa', '12Qwerty');
...
VirtualTableExternalRegister(Model, TSQLMyClass, Props, 'dbo.MyClass');
It works OK with insert, delete and update; however, when it comes to select it fails on
TSQLMyClass.CreateAndFillPrepare
with the exception Invalid column name 'RowID'. While investigating further I found that there is an inner exception ExecuteFromJSON: Invalid Types[0]=0.
Am I doing something wrong or is it a bug? Thanks in advance for any info.
Regards
dc
Pages: 1