Hibernate Error: org.hibernate.NonUniqueObjectException: a different object with the same identifier value was already associated with the session

I have two user Objects and while I try to save the object using

session.save(userObj);

I am getting the following error:

Caused by: org.hibernate.NonUniqueObjectException: a different object with the same identifier value was already associated with the session:
[com.pojo.rtrequests.User#com.pojo.rtrequests.User@d079b40b]

I am creating the session using

BaseHibernateDAO dao = new BaseHibernateDAO();

rtsession = dao.getSession(userData.getRegion(),
        BaseHibernateDAO.RTREQUESTS_DATABASE_NAME);

rttrans = rtsession.beginTransaction();
rttrans.begin();

rtsession.save(userObj1);
rtsession.save(userObj2);

rtsession.flush();
rttrans.commit();

rtsession.close(); // in finally block

I also tried doing the session.clear() before saving, still no luck.

This is the first time I am getting the session object when a user request comes in, so I don't understand why it is saying that the object is already present in the session.

Any suggestions?


Are your Id mappings correct? If the database is responsible for creating the Id (e.g. via an identity column), you need to map your user object to use that generator.

I have had this error many times and it can be quite hard to track down...

Basically, what hibernate is saying is that you have two objects which have the same identifier (same primary key) but they are not the same object.

I would suggest you break down your code, i.e. comment out bits until the error goes away, then put the code back until it comes back; that should lead you to the error.

It most often happens via cascading saves: there is a cascade between object A and B, but the instance of B that is already associated with the session is not the same instance of B that is referenced from A.

What primary key generator are you using?

The reason I ask is that this error is related to how you're telling Hibernate to ascertain the persistent state of an object (i.e. whether an object is persistent or not). The error could be happening because Hibernate is trying to persist an object that is already persistent. In fact, if you use save, Hibernate will try to persist that object, and maybe there is already an object with that same primary key associated with the session.

Example

Assume you have a Hibernate entity mapped to a table with 10 rows, based on a composite primary key (column 1 and column 2). Now, you have removed 5 rows from the table at some point. If you try to add the same 10 rows again, then while Hibernate persists the objects to the database, the 5 rows that were already removed will be added without errors, but the remaining 5 rows, which already exist, will throw this exception.

So the easy approach would be to check whether you have updated or removed any values in the table and are later trying to insert the same objects again.

Another thing that worked for me was to make the instance variable Long in place of long


I had my primary key variable long id; changing it to Long id; worked
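
A minimal sketch of that change, assuming a JPA-annotated entity with a generated id (the User class and field names here are just illustrative):

import javax.persistence.Entity;
import javax.persistence.GeneratedValue;
import javax.persistence.Id;

@Entity
public class User {
    @Id
    @GeneratedValue
    private Long id; // wrapper type: null clearly means "no id assigned yet",
                     // whereas a primitive long defaults to 0 and can look like a real id
}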

All the best

Get the object inside the session; here is an example:

MyObject ob = null;
ob = (MyObject) session.get(MyObject.class, id);

This is only one point where Hibernate creates more problems than it solves. In my case there are many objects with the same identifier 0, because they are new and don't have one yet; the DB generates them. Somewhere I have read that 0 signals "Id not set". The intuitive way to persist them is to iterate over them and tell Hibernate to save the objects, but you can't do that: "Of course you should know that Hibernate works this and that way, therefore you have to...". So now I can try to change the Ids to Long instead of long and see whether it works then. In the end it's easier to do it with a simple mapper of your own, because Hibernate is just an additional opaque burden. Another example: trying to read parameters from one database and persist them in another forces you to do nearly all the work manually, and if you have to do that anyway, using Hibernate is just extra work.

You can always do a session flush. Flush will synchronize the state of all the objects in your session (please, someone correct me if I'm wrong), and maybe it would solve your problem in some cases.

Implementing your own equals and hashCode may help you too.
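
If you go that route, a common sketch (assuming a generated Long id as above; a natural business key is often preferred) looks like this:

@Override
public boolean equals(Object o) {
    if (this == o) return true;
    if (!(o instanceof User)) return false;
    User other = (User) o;
    // only equal when both sides have an id and the ids match
    return id != null && id.equals(other.getId());
}

@Override
public int hashCode() {
    // a constant hash keeps the contract stable before and after the id is generated
    return 31;
}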

I also ran into this problem and had a hard time to find the error.

The problem I had was the following:

The object has been read by a Dao with a different hibernate session.

To avoid this exception, simply re-read the object with the dao that is going to save/update this object later on.

so:

class A {

    void readFoo() {
        someDaoA.read(myBadAssObject); // different session than in class B
    }
}


class B {

    void saveFoo() {
        someDaoB.read(myBadAssObjectAgain); // re-read with the same session/DAO that will save it
        // [...]
        myBadAssObjectAgain.fooValue = "bar";
        persist();
    }
}

Hope that save some people a lot of time!

Check if you forgot to put @GeneratedValue on the @Id column. I had the same problem with a many-to-many relationship between Movie and Genre. The program threw org.hibernate.NonUniqueObjectException: a different object with the same identifier value was already associated with the session. I found out later that I just had to make sure @GeneratedValue was on the GenreId getter method.

You can check your cascade settings. The cascade settings on your models could be causing this. I removed the cascade settings (essentially not allowing cascade inserts/updates) and this solved my problem.

Use session.evict(object); the evict() method is used to remove an instance from the session cache. For the first save, save the object by calling session.save(object) before evicting it from the cache. In the same way, update the object by calling session.saveOrUpdate(object) or session.update(object) before calling evict().
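
A hedged sketch of that save-then-evict sequence (the session and userObj1 come from the question; the later update step is illustrative):

session.save(userObj1);   // first save: the instance enters the session cache
session.evict(userObj1);  // remove the instance from the first-level cache

// ... later, re-attach and update, then evict again if needed ...
session.saveOrUpdate(userObj1);
session.evict(userObj1);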

Just check whether the id is null or 0, like

if(offersubformtwo.getId()!=null && offersubformtwo.getId()!=0)

in the add or update method where the content is set from the form into the POJO.
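
A small sketch of that check (offersubformtwo comes from the answer above; the session calls are assumptions):

if (offersubformtwo.getId() != null && offersubformtwo.getId() != 0) {
    session.update(offersubformtwo); // existing row: update it
} else {
    session.save(offersubformtwo);   // new row: let the generator assign the id
}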

I found this error as well. What worked for me was to make sure that the auto-generated primary key is not a primitive type (i.e. long, int, etc.), but a wrapper object (i.e. Long, Integer, etc.).

When you create your object to save it, make sure you pass null and not 0.

As somebody already pointed out above, I ran into this problem when I had cascade=all on both ends of a one-to-many relationship. So assume A --> B (one-to-many from A and many-to-one from B); I was updating an instance of B in A and then calling saveOrUpdate(A), which resulted in a circular save request: the save of A triggers a save of B, which triggers a save of A... and on the third pass, as the entity (of A) was being added to the session's persistence context, the duplicate-object exception was thrown.
I could solve it by removing cascade from one end, as sketched below.
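
An illustrative mapping sketch of that fix (the class and field names are assumed), keeping the cascade on only one end of the A <-> B relationship:

import java.util.HashSet;
import java.util.Set;
import javax.persistence.CascadeType;
import javax.persistence.Entity;
import javax.persistence.GeneratedValue;
import javax.persistence.Id;
import javax.persistence.ManyToOne;
import javax.persistence.OneToMany;

@Entity
class A {
    @Id
    @GeneratedValue
    private Long id;

    @OneToMany(mappedBy = "a", cascade = CascadeType.ALL)
    private Set<B> bs = new HashSet<>();
}

@Entity
class B {
    @Id
    @GeneratedValue
    private Long id;

    @ManyToOne // no cascade on this side, so saving B no longer re-triggers a save of A
    private A a;
}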

I ran into this problem by:

  1. Deleting an object (using HQL)
  2. Immediately storing a new object with the same id

I resolved it by flushing the results after the delete, and clearing the cache before saving the new object

String delQuery = "DELETE FROM OasisNode";
session.createQuery( delQuery ).executeUpdate();
session.flush();
session.clear();

I'm new to NHibernate, and my problem was that I used a different session to query my object than I did to save it. So the saving session didn't know about the object.

It seems obvious, but from reading the previous answers I was looking everywhere for 2 objects, not 2 sessions.

Does this help?

User userObj1 = new User();
User userObj2 = userObj1;
.
.
.
rtsession.save(userObj1);
rtsession.save(userObj2);

I encountered this problem when deleting an object; neither evict nor clear helped.

/**
 * Deletes the given entity, even if hibernate has an old reference to it.
 * If the entity has already disappeared due to a db cascade then noop.
 */
public void delete(final Object entity) {
    Object merged = null;
    try {
        merged = getSession().merge(entity);
    } catch (ObjectNotFoundException e) {
        // disappeared already due to cascade
        return;
    }
    getSession().delete(merged);
}

This can happen when you have used the same session object for both reading and writing. How? Say you have created one session. You read a record from the employee table with primary key Emp_id=101. Then you modify the record in Java and are going to save the employee record to the database. We have not closed the session anywhere here, so the object that was read still persists in the session, and it conflicts with the object that we wish to write. Hence this error comes.
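
A hypothetical illustration of that scenario (the Employee entity and its setters are assumed):

// the managed copy of row 101 is now cached in the session
Employee loaded = (Employee) session.get(Employee.class, 101);

// a second, detached instance carrying the same identifier
Employee edited = new Employee();
edited.setEmpId(101);
edited.setName("New name");

// session.update(edited); // would throw NonUniqueObjectException: id 101 is already in the session
Employee managed = (Employee) session.merge(edited); // copies the state onto the managed copy instead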

It's because you have opened a session, maybe to get data, and then you forgot to close it. When you delete, you open a session again and then it becomes an error.

SOLUTION: every function should open and close session

session.getTransaction().begin(); /* your operation */ session.close();
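
A rough sketch of that per-operation pattern, assuming a HibernateUtil helper like the one used elsewhere in this thread:

Session session = HibernateUtil.getSessionFactory().openSession();
try {
    session.beginTransaction();
    // ... your operation ...
    session.getTransaction().commit();
} finally {
    session.close(); // always close, so stale instances cannot linger in the session
}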

@GeneratedValue(strategy=GenerationType.IDENTITY), adding this annotation to the primary key property in your entity bean should solve this issue.

In my model object class I had defined the annotations like this:

@Entity
@Table(name = "user_details")
public class UserDetails {
    @GeneratedValue
    private int userId;
    private String userName;

    public String getUserName() {
        return userName;
    }

    public void setUserName(String userName) {
        this.userName = userName;
    }

    @Id
    public int getUserId() {
        return userId;
    }

    public void setUserId(int userId) {
        this.userId = userId;
    }
}

The issue was resolved when I wrote both the @Id and @GeneratedValue annotations together at the variable declaration:

@Entity
@Table(name = "user_details")
public class UserDetails {
    @Id
    @GeneratedValue
    private int userId;
    private String userName;

    public String getUserName() {
        return userName;
    }

    public void setUserName(String userName) {
        this.userName = userName;
    }

    public int getUserId() {
        return userId;
    }
    ...
}

Hope this is helpful

I resolved this problem.
Actually this is happening because we forgot to specify the generator type of the PK property in the bean class. So declare it with a generator, for example:

@Id
@GeneratedValue(strategy=GenerationType.IDENTITY)
private int id;

Without a generator, when we persist the bean objects, every object acquires the same ID, so the first object is saved, and when another object is to be persisted the Hibernate framework throws this type of exception: org.hibernate.NonUniqueObjectException: a different object with the same identifier value was already associated with the session.

In my case I did the following:

1- Switched to lazy keys in the entity 2- Downloaded the most up-to-date javassist from Maven

--http://mvnrepository.com/artifact/org.javassist/javassist/3.19.0-GA

I solved a similar problem like this:

plan = (FcsRequestPlan) session.load(plan.getClass(), plan.getUUID());
while (plan instanceof HibernateProxy)
plan = (FcsRequestPlan) ((HibernateProxy) plan).getHibernateLazyInitializer().getImplementation();

Try this. The below worked for me!

In the hbm.xml file

  1. We need to set the dynamic-update attribute of class tag to true:

    <class dynamic-update="true">
    
  2. Set the class attribute of the generator tag under unique column to identity:

    <generator class="identity">
    

Note: Set the unique column to identity rather than assigned.

Before the position where the repetitive objects begin, you should close the session and then start a new session

session.close();
session = HibernateUtil.getSessionFactory().openSession();

so that in one session there is never more than one entity with the same identifier.

The problem happens because in the same Hibernate session you are trying to save two objects with the same identifier. There are two solutions:

  1. This is happening because you have not configured your mapping.xml file correctly for the id field, as below:

    <id name="id">
    <column name="id" sql-type="bigint" not-null="true"/>
    <generator class="hibernateGeneratorClass"/>
    </id>
    
  2. Overload the getSession method to accept a parameter like isSessionClear, and clear the session before returning the current session, like below:

    public static Session getSession(boolean isSessionClear) {
        if (session.isOpen() && isSessionClear) {
            session.clear();
            return session;
        } else if (session.isOpen()) {
            return session;
        } else {
            return sessionFactory.openSession();
        }
    }
    

This will cause existing session objects to be cleared, and even if Hibernate doesn't generate a unique identifier, assuming you have configured your database properly for the primary key (using something like AUTO_INCREMENT), it should work for you.

I had a similar problem. In my case I had forgotten to set the INCREMENT_BY value in the database sequence to be the same as the one used by the CACHE_SIZE and allocationSize. (The arrows point to the mentioned attributes.)

SQL:

CREATED         26.07.16
LAST_DDL_TIME   26.07.16
SEQUENCE_OWNER  MY
SEQUENCE_NAME   MY_ID_SEQ
MIN_VALUE       1
MAX_VALUE       9999999999999999999999999999
INCREMENT_BY    20 <-
CYCLE_FLAG      N
ORDER_FLAG      N
CACHE_SIZE      20 <-
LAST_NUMBER     180

Java:

@SequenceGenerator(name = "mySG", schema = "my",
sequenceName = "my_id_seq", allocationSize = 20 <-)

One workaround for this issue is to read the object back from the Hibernate cache/DB before you make any updates, and then persist it.

Example:

OrderHeader oh = orderHeaderDAO.get(orderHeaderId);
oh.setShipFrom(facilityForOrder);
orderHeaderDAO.persist(oh);

Note: Keep in mind that this does not fix the root cause but solves the issue.

In addition to what wbdarby said, it can even happen when an object is fetched by passing its identifier to an HQL query. If you then try to modify the object's fields and save it back to the DB (the modification could be an insert, delete, or update) within the same session, this error will appear. Try clearing the Hibernate session before saving your modified object, or create a brand new session.

Hope i helped ;-)

I had the same error. I was replacing my Set with a new one obtained from Jackson.

To solve this, I keep the existing set: I remove from the old set the elements not present in the new list with retainAll, then I add the new ones with addAll.

this.oldSet.retainAll(newSet);
this.oldSet.addAll(newSet);

No need to have the Session and manipulate it.

Late to the party, but may help for coming users -

I got this issue when I selected a record using getSession() and then updated another record with the same identifier using the same session. The code is below.

Customer existingCustomer = getSession().get(Customer.class, 1);
Customer customerFromUi; // this customer's details are coming from the UI with identifier 1

getSession().update(customerFromUi); // here the issue comes

This should never be done. The solution is to either evict the object from the session before the update or change the business logic, as in the sketch below.
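
A small sketch of the evict-before-update fix mentioned above, reusing the Customer example from this answer:

Customer existingCustomer = getSession().get(Customer.class, 1);

// detach the managed copy first, then the UI copy can be re-attached safely
getSession().evict(existingCustomer);
getSession().update(customerFromUi);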

You can use session.merge(obj) if you are saving a persistent object with the same identifier through different sessions.
It worked; I had the same issue before.

This problem occurs when we update an object in the same session that we used to fetch it from the database.

You can use merge method of hibernate instead of update method.

e.g. First use session.get() and then you can use session.merge (object). This method will not create any problem. We can also use merge() method to update object in database.

I just had the same problem. I solved it by adding this line:

@GeneratedValue(strategy=GenerationType.IDENTITY)


By default it is using the identity strategy, but I fixed it by adding

@Id
@GeneratedValue(strategy = GenerationType.IDENTITY)

We are using an old version of Hibernate (3.2.6), and the problem for us was that Hibernate expected the first column in the result set to be the generated primary key. It took me ages to figure that out.

Solution 1: Ensure that in the DDL the generated primary key is always the first column. Solution 2: Update Hibernate.