In an interesting question on StackOverflow, I found this:
Hello, in my recent research on the Internet about object databases and/or frameworks for OO persistence over relational databases, I found an interesting paper by Mr. Cook and Rai called Native Queries (http://en.wikipedia.org/wiki/Native_Queries).
Their proposal is that if you use an object oriented language and an object oriented framework for persistence (no matter whether the database itself is OO or not), you should be able to create safe, object oriented queries when retrieving data from the database.
Instead of building queries by providing object/table column names and filter values (query.AddEquals('Name', 'John')), or by using a dedicated query language (as SQL does: "WHERE Person.Name = 'John'"), you would construct the filter as an expression in the language you are programming in, with the full power of its features, and have it checked by the compiler.
In Delphi, the proposal would result in something like:
query := db.Query(Person,      // query: TNativeQuery
  function(personObj: Person): Boolean
  begin
    Result := personObj.Name = 'John';
  end);
or, in another example:
query := db.Query(Person,
  function(personObj: Person): Boolean
  begin
    Result := (personObj.Points > 100) and (personObj.Points < 200)
      and personObj.Active;
  end);
In the authors' idea, the block of code written in the anonymous function (my adaptation of their suggestion, which is originally in Java) would be used to evaluate the rows in the database and determine which results should be put in the result set.
The first, naive solution would be to execute that anonymous function against every record in the database; in other words, select and instantiate every Person in the database and run the function against each instance to decide whether it matches or not. That, of course, is a terrible idea: it would take far too much time and too many resources.
What Mr. Cook and Rai actually suggest is to analyze what is inside the anonymous function and use it to execute the query. In our examples, we should be able to perform an indexed search using the indexes on the Name or Points fields. To do that, we would have to "read" the code of the function and "understand" that the field 'Name' (in the first example) is used to filter the result, and that 'John' is the value used for that filter. With the second example, it gets even worse.
The paper points to two solutions: read the code either before the compiler does its work, or after it. And now, after this wall of text, I finally get to my question: how can I read/parse the source code of that anonymous function?
To sum up:
a. I need to know at least which object properties/methods, boolean operations and constant values were used in that code, and
b. that code must not be a string (it has to be pure Delphi code).
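Just to make point (a) more concrete, the kind of information I would need to extract from the function body could be described by a small structure like this (these names are only illustrative, there is no such API today):
type
  TFilterOperator = (foEqual, foGreater, foLess, foAnd, foOr);

  // one extracted comparison, e.g. "Points > 100"
  TFilterCondition = record
    PropertyName: string;   // object property used in the filter, e.g. 'Points'
    Op: TFilterOperator;    // comparison operator found in the code
    Value: Variant;         // constant used for the filter, e.g. 100 or 'John'
  end;

  // the whole filter of the second example would then boil down to:
  //   (Points > 100) AND (Points < 200) AND (Active = True)
  TFilterConditions = array of TFilterCondition;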
What suggestions would you guys give me?
Should I try some post-build task which would re-read the source code, identify these functions, and keep an extra resource file with the real code of the functions (I don't like this approach)?
Do I have no option but to read the compiled code in assembly, writing a complex and laborious routine to identify the symbols and functions, and becoming a prisoner of every new Delphi compiler version?
Should I just abandon this and use some easier implementation (like a string with some pseudo-language filter)?
I should also mention that the db4o database went for the hard solution: they read and evaluate the compiled code. Admittedly, I looked at the .NET version of db4o, and the 'compiled code' of a .NET application is not that hard to understand (unlike ASM).
This question sounded rather familiar to me.
I was wondering about the same feature for our ORM myself...
Since disassembling the generated code is impossible (or not worth the effort), I see three possibilities:
1. Use a map/reduce mechanism, i.e. rely on an iteration over all data rows, then apply the reduce function, implemented as a Delphi anonymous method (see the sketch just after this list).
2. To use indexes, we would need to parse the source code of the function, build an AST from it, then convert it into the corresponding SQL statement.
3. Write the SQL WHERE clause directly inside your Delphi code.
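For point 1, here is a minimal sketch of what I mean; ForEachMatchingPerson and its callback types are hypothetical helpers, not existing methods of the framework, and note that no index is used at all (every row is retrieved and instantiated):
type
  TPersonFilter = reference to function(aPerson: TSQLPerson): Boolean;
  TPersonAction = reference to procedure(aPerson: TSQLPerson);

procedure ForEachMatchingPerson(aClient: TSQLRest;
  const aFilter: TPersonFilter; const aAction: TPersonAction);
var
  aPerson: TSQLPerson;
begin
  // brute-force iteration: retrieve and instantiate every row, then let the
  // Delphi anonymous method decide whether it belongs to the result set
  aPerson := TSQLPerson.CreateAndFillPrepare(aClient, '');
  try
    while aPerson.FillOne do
      if aFilter(aPerson) then
        aAction(aPerson);
  finally
    aPerson.Free;
  end;
end;

// usage (inside some method), matching the second example of the question:
ForEachMatchingPerson(RestClient,
  function(aPerson: TSQLPerson): Boolean
  begin
    Result := (aPerson.Points > 100) and (aPerson.Points < 200) and aPerson.Active;
  end,
  procedure(aPerson: TSQLPerson)
  begin
    DoSomeThingWith(aPerson);
  end);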
About point 3: with an ORM, it's not so difficult to have a dedicated SQL WHERE clause, written directly in SQL, but mapping the Delphi code. That was my approach to this question.
I don't think that the following code (typical for our ORM framework) is neither difficult to write nor to read:
aPerson := TSQLPerson.CreateAndFillPrepare(RestClient,
  'Points > :(%): AND Points < :(%): AND Active <> 0', [100,200]);
while aPerson.FillOne do
  DoSomeThingWith(aPerson);
aPerson.Free;
Perhaps the only non-obvious syntax is the use of :( ): to mark prepared parameters within the SQL statement.
I could have written the request like that (with no prepared statement):
aPerson := TSQLPerson.CreateAndFillPrepare(RestClient,
  'Points > 100 AND Points < 200 AND Active <> 0');
while aPerson.FillOne do
  DoSomeThingWith(aPerson);
aPerson.Free;
For more complex requests (e.g. with a JOIN clause), our ORM will use a TSQLTable instance. For many-to-many requests, there are dedicated methods.
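For instance, a joined request could be retrieved into such a TSQLTable along these lines (a rough sketch only: TSQLTeam, its fields and DoSomeThingWithRow are made up for the illustration):
var
  aList: TSQLTableJSON;
  i: integer;
begin
  // low-level JOIN request, with its result set stored in a TSQLTable
  aList := RestClient.ExecuteList([TSQLPerson, TSQLTeam],
    'SELECT Person.Name, Team.Ident FROM Person, Team ' +
    'WHERE Person.Team = Team.ID AND Person.Points > 100');
  if aList <> nil then
  try
    // row 0 contains the field names, so data rows start at 1
    for i := 1 to aList.RowCount do
      DoSomeThingWithRow(aList.GetString(i, 0), aList.GetString(i, 1));
  finally
    aList.Free;
  end;
end;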
So my answer could be: is it really necessary?
Additional note:
It sounds to me like LINQ is using point 2 of my proposal.
But as I commented below, LINQ doesn't use the standard C# native syntax, but a SQL-like syntax, known by the IDE and translated into SQL at run time. If the Microsoft experts didn't succeed in using a truly "native" syntax, I guess we won't manage it by ourselves...
In any case, my Delphi code above can be compared to this one (perhaps wrong, I'm no .NET expert), and Delphi will be faster:
var q =
    from p in db.Persons
    where p.Points > 100 && p.Points < 200
    select p;
foreach (Person p in q)
    DoSomeThingWith(p);
But one drawback of LINQ is its speed.
LINQ to SQL must translate LINQ queries to SQL every time a query executes; this involves recursing the expression tree that makes up the query in several stages. It sounds worse than it is: the computation cost is not enormous in the overall scheme of things (certainly not as big as the cost of building an AttributeMappingSource when lots of entities are involved). Nonetheless, you can avoid paying the price on each query execution by precompiling the query using the CompiledQuery class.
See http://www.albahari.com/nutshell/speedi … tosql.aspx
IMHO the claim that "LINQ code is native C#" is mostly marketing.
The Visual Studio IDE understands the LINQ syntax (and provides code completion); that's its main benefit.
But IMHO it's still an interpreted SQL-like syntax, just written in the middle of C# code.
It's definitely not a "native syntax", as stated by the Wikipedia article quoted above.
Our framework is incredibly fast when compared to this high-level syntactic sugar,
thanks to native Delphi compilation and efficient RTTI.
Additional note: it sounds funny to me that the main marketing feature of C# version 4 is late binding to COM objects.
Delphi has had this since its early versions, perhaps 10 years ago...
I used this late binding to allow easy access to field content in our TSynBigTableRecord / TSynBigTableMetaData classes.
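For the record, this is the kind of late binding Delphi has offered through Ole variants for ages, e.g. against Excel (assuming it is installed):
uses
  ComObj;

var
  Excel: Variant;
begin
  // late-bound COM automation: method and property names are resolved at
  // run time, no type library import or interface declaration is needed
  Excel := CreateOleObject('Excel.Application');
  Excel.Visible := True;
  Excel.Workbooks.Add;
  Excel.Cells[1, 1].Value := 'written via late binding';
end;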