Last post Mar 19, 2007 10:18 AM by Syndra
Aug 02, 2005 08:16 PM
As far as "objects building their own queries" -- I think you misunderstood what I was talking about. In CSLA, the business objects do not use a separate layer: the business object does its own SQL (and that of its children) internally. There is no repository
you go to to generate (or create your own) dynamic SQL -- rather, the business classes have static methods, where one "root" object gets all of its children, grandchildren, etc. I've modified the framework a bit to better support lazy loading and separating
out the SQL-specific stuff from the business objects, but they are still tightly interwoven. One of the things that attracts me to Paul's work is that you can ask the repository for the objects in a variety of different ways, rather than embedding all that
data access logic in the object hierarchy itself. The business objects seem blissfully unaware that they are persisted at all, keeping the two concerns a lot more separate. Also, as you mentioned, there is the ability to not have to touch SQL for most things.
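The contrast described above -- CSLA-style active record versus a persistence-ignorant repository -- can be sketched roughly as follows. This is an illustrative Python sketch, not CSLA's or Paul's actual API; all class and table names here are hypothetical.

```python
class OrderActiveRecord:
    """Active-record style (CSLA-like): the object embeds its own SQL."""

    def __init__(self, order_id, customer):
        self.order_id = order_id
        self.customer = customer

    def save(self, connection):
        # Data access lives inside the business object itself.
        connection.execute(
            "UPDATE Orders SET CustomerId = ? WHERE OrderId = ?",
            (self.customer, self.order_id),
        )


class Order:
    """Repository style: a plain object, unaware it is persisted at all."""

    def __init__(self, order_id, customer):
        self.order_id = order_id
        self.customer = customer


class OrderRepository:
    """All data-access logic lives here, so callers can ask for orders
    in a variety of ways without embedding SQL in the object hierarchy."""

    def __init__(self, connection):
        self.connection = connection

    def save(self, order):
        self.connection.execute(
            "UPDATE Orders SET CustomerId = ? WHERE OrderId = ?",
            (order.customer, order.order_id),
        )

    def find_by_customer(self, customer):
        rows = self.connection.execute(
            "SELECT OrderId, CustomerId FROM Orders WHERE CustomerId = ?",
            (customer,),
        )
        return [Order(oid, cust) for oid, cust in rows]
```

The design trade-off in the thread is visible here: the repository keeps `Order` free of SQL, at the cost of a second class per entity.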
Aug 16, 2005 03:40 AM
Aug 16, 2005 08:44 AM
Aug 16, 2005 08:51 AM
I'm jumping in a little late to the party, forgive me. The quality of discussion is amazingly high. It's great to see Frans weighing in (I have tremendous respect for his work and his contributions on persistence subjects).
To make that evident, let me confess that our strongly typed object query language is not as rich as Hibernate's or WORM's (I haven't looked as closely at LLBLGen yet). Customers haven't complained, but there are definitely useful queries that we cannot express
in our OQL. Our fallback is SQL PassThru, stored procedures, or (worst case) remote procedure calls.
BTW, we have recently added support for Identity columns (for SQL Server so far). The prior workarounds were not ideal. We finally found a way to cope with the problem of transaction rollbacks of new objects after their Identity column ids have already been
assigned and used as foreign keys by other objects.
Let me illustrate. Suppose you add a new Order 'A' with its new order items 'A1', 'A2', and so on. You make a small change to the Order's existing Customer 'C' and then save the entire bunch transactionally. The db gives you a new Id for A, which you stick in the ParentOrderId
columns of A1, A2, etc. Unfortunately, you get a concurrency failure because someone modified Customer C and saved it before you did. The database rolls back the transaction, but your A1, A2, etc. are sitting in cache with A's fixed-up ID - an ID that A can
never actually have. That ID is gone forever. What do you do? I wonder how other vendors manage this.
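One conceivable way to cope with this (a sketch only -- the poster doesn't reveal their actual solution, and every name here is hypothetical) is to record each provisional id assignment during the save, so the cached objects can be reverted to their unsaved state when the transaction rolls back:

```python
class PendingIdTracker:
    """Remembers which cached objects received database-assigned ids
    (and foreign-key fixups derived from them) during a save, so they
    can be undone if the transaction rolls back."""

    def __init__(self):
        self._assignments = []  # (object, attribute, previous value)

    def assign(self, obj, attr, new_id):
        # Snapshot the old value before overwriting it.
        self._assignments.append((obj, attr, getattr(obj, attr)))
        setattr(obj, attr, new_id)

    def rollback(self):
        # Undo in reverse order so later fixups are reverted first.
        for obj, attr, old in reversed(self._assignments):
            setattr(obj, attr, old)
        self._assignments.clear()

    def commit(self):
        # The ids are now real; forget the snapshots.
        self._assignments.clear()
```

In the Order example above: A's new id and the ParentOrderId fixups of A1, A2 go through `assign`; on the concurrency failure the DAL calls `rollback()` and the cached objects are back to "new, unsaved", ready for a fresh attempt with whatever id the database hands out next time.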
Moving on. Would you be willing to amplify on your comments about intuitiveness, collections, and inheritance? Happy to have this conversation elsewhere if this is not the right place.
Just to get the juices flowing, let's talk about inheritance. As Frans made clear, it's easy to change "types" in a database table, but it's - shall we say - "problematic" to have a Manager class that inherits from Employee (and the same goes for SalesRep, BoardMember,
and PartTime, to mention a few of the other potential hats an Employee may put on and take off). On the other hand, it makes good sense for a number of otherwise unrelated classes to inherit common behavior from a base class. For example, I might want some
subset of my entities to share the same logging or auditing behaviors. The ORM facility should enable the developer to insert an Audit support class into the inheritance chain. That Audit class is probably not a persistable entity in its own right.
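The audit idea can be sketched like this: a base class that is not an entity itself, slotted into the inheritance chain of otherwise unrelated entities. A minimal, hypothetical illustration (not any particular ORM's mechanism):

```python
import datetime


class Auditable:
    """Contributes audit behavior to the inheritance chain.
    It is not a persistable entity in its own right."""

    def __init__(self):
        self.audit_log = []

    def record(self, action):
        # Each entry notes when the action happened and what it was.
        self.audit_log.append((datetime.datetime.now(), action))


class Employee(Auditable):
    def __init__(self, name):
        super().__init__()
        self.name = name


class Invoice(Auditable):
    def __init__(self, number):
        super().__init__()
        self.number = number
```

`Employee` and `Invoice` are unrelated entities, yet both pick up the same auditing behavior; the mapping layer would persist only the entity classes, not `Auditable`.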
Aug 16, 2005 09:37 AM
We finally found a way to cope with the problem of transaction rollbacks of new objects after their Identity column ids have already been assigned and used as foreign keys by other objects.
Total non-issue. See, since id columns are not supposed to be end-user-known identities (like an order number), losing numbers is just a total non-issue. This is basic SQL 101 for beginners. There is no need for a solution unless you make up a problem by abusing
SQL features for things they are not meant for. It is a typical beginner mistake to abuse them as things like order numbers.
Aug 16, 2005 09:49 AM
Aug 16, 2005 12:16 PM
Interesting point. It would point, though, to a crap internal architecture.
Using a tiered approach, the data of the objects would go to a non-chatty DAL, and the return values would then be submitted to the objects only after the db operations have been performed. At least this is how we do it in the EntityBroker.
Plus, Transaction 101 and standard Windows approaches (COM+) demand that transactions that hit an exception are not resubmitted but thrown away, which means the issue is a non-issue again.
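The tiered approach described here -- batch everything to a non-chatty DAL, copy database-generated values back only after success -- could be sketched like this. This is an illustrative sketch, not EntityBroker's actual interface; `persist` is a hypothetical DAL method.

```python
def save_batch(dal, objects):
    """Send all object data to the DAL in one round trip; fix up the
    objects with generated values (e.g. identity ids) only after every
    database operation has succeeded."""
    payload = [obj.__dict__.copy() for obj in objects]  # one non-chatty call
    results = dal.persist(payload)  # may raise; objects stay untouched then
    for obj, generated in zip(objects, results):
        obj.__dict__.update(generated)  # id fixup happens after success only
```

Because the fixup happens after `persist` returns, a failed or rolled-back transaction never leaves half-assigned ids sitting in the cached objects -- which is the point being made against the earlier scenario.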
Aug 16, 2005 12:26 PM
Aug 17, 2005 12:21 AM
I'm not trying to post a commercial.
Your English is pretty bad, because you DID just post a commercial, you know.
Somehow your statement really reminds me of the "this is not spam" lines in - spam.
Aug 17, 2005 08:45 AM
That would require a serious structure to keep track of all the new IDs in a hierarchical save, while you could solve it more elegantly by leaving it to the objects.
At the price of not being able to use the DAL efficiently in a remoting scenario (not the technology - the scenario). Once your DAL has to do this anyway, the infrastructure is already there. Also, if you are throwing change events, you want to suppress or delay
them until the DAL has finished. But basically, to properly support any form of layer border between the DAL and your runtime as tiers, you need this anyway.
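The suppress/delay point about change events can be sketched as a small notifier that buffers events while the DAL is fixing up ids and flushes them once it finishes. This is an illustrative sketch only; no product in the thread exposes exactly this class.

```python
class Notifier:
    """Delivers change events immediately, unless suspended -- in which
    case events are buffered and flushed when resume() is called."""

    def __init__(self):
        self.listeners = []
        self._suspended = False
        self._pending = []

    def notify(self, event):
        if self._suspended:
            self._pending.append(event)  # delay during the DAL's fixup
        else:
            for listener in self.listeners:
                listener(event)

    def suspend(self):
        self._suspended = True

    def resume(self):
        self._suspended = False
        for event in self._pending:  # deliver everything the save produced
            for listener in self.listeners:
                listener(event)
        self._pending.clear()
```

The DAL would call `suspend()` before writing generated ids back onto the objects and `resume()` once the save completes, so listeners never see half-finished fixups.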
Not necessarily; there are scenarios where you retry a couple of times before giving up (essentially restarting the transaction, but with the same entities).
Simplistic ones only, given that the DTC / System.Transactions functionality does not support it.
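The retry idea debated in these last two posts -- restart the whole transaction with the same entities a few times before giving up -- amounts to something like the following sketch (names hypothetical; the caller supplies the function that runs the transaction from scratch):

```python
class ConcurrencyError(Exception):
    """Raised when someone else saved the same data first."""


def save_with_retry(save, entities, attempts=3):
    """Run the transactional save; on a concurrency failure, discard the
    failed transaction and restart it with the same entities, up to
    `attempts` times, then re-raise the last failure."""
    last_error = None
    for _ in range(attempts):
        try:
            return save(entities)  # each call is a fresh transaction
        except ConcurrencyError as exc:
            last_error = exc  # thrown away, not resubmitted; we restart
    raise last_error
```

Note that this does not resubmit the failed transaction -- consistent with the "thrown away" point above, each attempt is a brand-new transaction over the same in-memory entities.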