
Wednesday, 19 June 2013

Why the need for ResultTransformers?

Consider the Criteria example below, which simply loads the Entity records with ids less than or equal to 6 from the database.
public static void testSimpleFetch() {
    final Session session = sessionFactory.openSession();
    final Criteria criteria = session.createCriteria(Entity.class);
    criteria.add(Restrictions.le("id", 6));
    int i = 1;
    List<Entity> entities = criteria.list();
    for (Entity entity : entities) {
        System.out.println((i++) + " : " + entity + " - " + entity.hashCode());
        for (Child child : entity.getChildren()) {
            System.out.println(child + " - " + child.hashCode());
        }
    }
}
There is nothing special about the code, except that I have tweaked the Fetching Strategy a little:
<set name="children" cascade="all-delete-orphan" inverse="true" fetch="join" >
    <key column="ENTITY_ID" not-null="true" />
    <one-to-many class="Child" />
</set>
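For reference, the Entity and Child classes behind this mapping would look roughly like the sketch below. This is only an assumption - the field names are guessed from the mapping and from the toString() output further down, and the getters, setters and toString() implementations are omitted.
import java.util.Date;
import java.util.Set;

// assumed shape of the mapped classes, not taken from the original post
class Master {
    private Integer id; // referenced from Entity via MASTER_ID
}

class Entity {
    private Integer id;          // ID
    private String name;         // NAME (printed as "data" by toString())
    private Date date;           // DATE
    private Master master;       // MASTER_ID
    private Set<Child> children; // the <set name="children"> mapped above
}

class Child {
    private Integer id;    // ID
    private Integer key;   // KEY
    private Entity parent; // ENTITY_ID
}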
The children set has been changed to use a join fetch. On executing the above code, the generated SQL and the result are as follows:
select
    this_.ID as ID0_1_,
    this_.NAME as NAME0_1_,
    this_.DATE as DATE0_1_,
    this_.MASTER_ID as MASTER4_0_1_,
    children2_.ENTITY_ID as ENTITY3_3_,
    children2_.ID as ID3_,
    children2_.ID as ID2_0_,
    children2_.`KEY` as KEY2_2_0_,
    children2_.ENTITY_ID as ENTITY3_2_0_ 
from
    ENTITY this_ 
left outer join
    CHILD_ENTITY children2_ 
        on this_.ID=children2_.ENTITY_ID 
where
    this_.ID<=?
1 : [Entity] : ( id 2 , data : entity1 , master.Id : 1 , date : null )] - 9800632
[Child] : ( id 1 , key : 1001 , parent.Id : 2 )] - 13905160
[Child] : ( id 4 , key : 1004 , parent.Id : 2 )] - 9740137
[Child] : ( id 3 , key : 1003 , parent.Id : 2 )] - 18306082
2 : [Entity] : ( id 2 , data : entity1 , master.Id : 1 , date : null )] - 9800632
[Child] : ( id 1 , key : 1001 , parent.Id : 2 )] - 13905160
[Child] : ( id 4 , key : 1004 , parent.Id : 2 )] - 9740137
[Child] : ( id 3 , key : 1003 , parent.Id : 2 )] - 18306082
3 : [Entity] : ( id 2 , data : entity1 , master.Id : 1 , date : null )] - 9800632
[Child] : ( id 1 , key : 1001 , parent.Id : 2 )] - 13905160
[Child] : ( id 4 , key : 1004 , parent.Id : 2 )] - 9740137
[Child] : ( id 3 , key : 1003 , parent.Id : 2 )] - 18306082
4 : [Entity] : ( id 3 , data : entity2 , master.Id : 1 , date : null )] - 28890871
[Child] : ( id 2 , key : 1002 , parent.Id : 3 )] - 23965177
5 : [Entity] : ( id 4 , data : entity100 , master.Id : 2 , date : null )] - 29887233
[Child] : ( id 5 , key : 4 , parent.Id : 4 )] - 2691004
6 : [Entity] : ( id 5 , data : entity102 , master.Id : 2 , date : null )] - 25378506
[Child] : ( id 6 , key : 43 , parent.Id : 5 )] - 4890830
As can be seen, the query returned 6 entity rows. But if we look at the details, only 4 distinct Entity records were returned - those with ids 2, 3, 4 and 5. As the Entity with id 2 has three child records, it appeared thrice in the result.
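These repeated entries are not separate copies of the entity but the very same instance - the matching hash codes in the output already hint at this. A quick check, just a sketch reusing the entities list from testSimpleFetch above, makes it explicit:
// inside testSimpleFetch(), after criteria.list():
// the first three rows all refer to the Entity with id 2,
// so the list entries are the same object reference
System.out.println(entities.get(0) == entities.get(1)); // prints true
System.out.println(entities.get(1) == entities.get(2)); // prints true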
A similar query in HQL (one with an explicit left join) would return an object array for each row - a sketch of that form follows the next snippet. With Criteria, while we do get the root entity back, we also end up with a lot of duplicate references in the result. This is because Criteria, unlike HQL, respects the fetching strategy specified in the mapping. A similar result can be achieved by altering the Fetch Mode dynamically. I removed the fetch mode from the configuration and applied it via the Criteria API to get the same result:
final Criteria criteria = session.createCriteria(Entity.class);
criteria.setFetchMode("children", FetchMode.JOIN);
criteria.add(Restrictions.le("id", 6));
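For comparison, the HQL form mentioned above would look roughly like the sketch below (the parameter name maxId is made up for illustration). With an explicit join and no select clause, each row comes back as an Object[] holding the Entity and the joined Child:
// a sketch of the HQL variant with an explicit left join
List<Object[]> rows = session
        .createQuery("from Entity e left join e.children c where e.id <= :maxId")
        .setParameter("maxId", 6)
        .list();
for (Object[] row : rows) {
    Entity entity = (Entity) row[0];
    Child child = (Child) row[1]; // null when the entity has no children
    System.out.println(entity + " - " + child);
}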
Coming back to the Criteria query, the good thing is that Hibernate is smart enough not to create a different Entity instance for every duplicate row (that would also break the Session cache guarantee). Hibernate creates each entity only once and returns a list of references to it. This can be seen from the hash codes of the Entity objects (hashCode() is not overridden here, so these are the default identity hash codes). A simple solution for removing the duplicate Entity references is to add the results to a Set; as sets cannot contain duplicates, the result is then free of duplicate entities:
public static void testSimpleFetchViaSet() {
    final Session session = sessionFactory.openSession();
    //a join here causes extra records to be loaded
    Criteria criteria = session.createCriteria(Entity.class);
    criteria.setFetchMode("children", FetchMode.JOIN);
    criteria.add(Restrictions.le("id", 6));
    int i = 1;
    Set<Entity> entities = new LinkedHashSet<Entity>(criteria.list());
    for (Entity entity : entities) {
        System.out.println((i++) + " : " + entity + " - " + entity.hashCode());
        for (Child child : entity.getChildren()) {
            System.out.println(child + " - " + child.hashCode());
        }
    }
}
The output displayed is:
1 : [Entity] : ( id 2 , data : entity1 , master.Id : 1 , date : null )] - 9800632
[Child] : ( id 1 , key : 1001 , parent.Id : 2 )] - 13905160
[Child] : ( id 4 , key : 1004 , parent.Id : 2 )] - 9740137
[Child] : ( id 3 , key : 1003 , parent.Id : 2 )] - 18306082
2 : [Entity] : ( id 3 , data : entity2 , master.Id : 1 , date : null )] - 28890871
[Child] : ( id 2 , key : 1002 , parent.Id : 3 )] - 23965177
3 : [Entity] : ( id 4 , data : entity100 , master.Id : 2 , date : null )] - 29887233
[Child] : ( id 5 , key : 4 , parent.Id : 4 )] - 2691004
4 : [Entity] : ( id 5 , data : entity102 , master.Id : 2 , date : null )] - 25378506
[Child] : ( id 6 , key : 43 , parent.Id : 5 )] - 4890830
The count was now 4.
It is important to use a LinkedHashSet here if we want the order of the results to be preserved. The other option is to add a ResultTransformer:
public static void testSimpleFetchUsingResultTransformer() {
    final Session session = sessionFactory.openSession();
    Criteria criteria = session.createCriteria(Entity.class);
    criteria.setFetchMode("children", FetchMode.JOIN);
    criteria.add(Restrictions.le("id", 6));
    criteria.setResultTransformer(Criteria.DISTINCT_ROOT_ENTITY);
    int i = 1;
    List<Entity> entities = criteria.list();
    for (Entity entity : entities) {
        System.out.println((i++) + " : " + entity + " - " + entity.hashCode());
        for (Child child : entity.getChildren()) {
            System.out.println(child + " - " + child.hashCode());
        }
    }
}
The output is:
select
     this_.ID as ID0_1_,
     this_.NAME as NAME0_1_,
     this_.DATE as DATE0_1_,
     this_.MASTER_ID as MASTER4_0_1_,
     children2_.ENTITY_ID as ENTITY3_3_,
     children2_.ID as ID3_,
     children2_.ID as ID2_0_,
     children2_.`KEY` as KEY2_2_0_,
     children2_.ENTITY_ID as ENTITY3_2_0_ 
from
    ENTITY this_ 
left outer join
    CHILD_ENTITY children2_ 
        on this_.ID=children2_.ENTITY_ID 
where
    this_.ID<=?
1 : [Entity] : ( id 2 , data : entity1 , master.Id : 1 , date : null )] - 9800632
[Child] : ( id 1 , key : 1001 , parent.Id : 2 )] - 13905160
[Child] : ( id 4 , key : 1004 , parent.Id : 2 )] - 9740137
[Child] : ( id 3 , key : 1003 , parent.Id : 2 )] - 18306082
2 : [Entity] : ( id 3 , data : entity2 , master.Id : 1 , date : null )] - 28890871
[Child] : ( id 2 , key : 1002 , parent.Id : 3 )] - 23965177
3 : [Entity] : ( id 4 , data : entity100 , master.Id : 2 , date : null )] - 29887233
[Child] : ( id 5 , key : 4 , parent.Id : 4 )] - 2691004
4 : [Entity] : ( id 5 , data : entity102 , master.Id : 2 , date : null )] - 25378506
[Child] : ( id 6 , key : 43 , parent.Id : 5 )] - 4890830
The query remained unchanged. However, Hibernate now removes the duplicate references to the Entity records, thus giving us the correct result. There are more ResultTransformers, which we shall visit in the next post.
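To get a feel for the ResultTransformer contract in the meantime, here is a rough sketch of a transformer that mimics what DISTINCT_ROOT_ENTITY does. This is not Hibernate's actual implementation, just the idea expressed against the org.hibernate.transform.ResultTransformer interface:
import java.util.ArrayList;
import java.util.LinkedHashSet;
import java.util.List;

import org.hibernate.transform.ResultTransformer;

// a sketch of a distinct-root transformer (not Hibernate's own class)
public class MyDistinctRootTransformer implements ResultTransformer {

    public Object transformTuple(Object[] tuple, String[] aliases) {
        // the root entity is the last element of each row tuple
        return tuple[tuple.length - 1];
    }

    public List transformList(List list) {
        // drop duplicate references while preserving order,
        // much like the LinkedHashSet approach shown earlier
        return new ArrayList<Object>(new LinkedHashSet<Object>(list));
    }
}
It would be plugged in exactly like the built-in one, via criteria.setResultTransformer(new MyDistinctRootTransformer());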
