java.sql.SQLException: - ORA-01000: maximum open cursors exceeded

I am getting an ORA-01000 SQLException, so I have some questions related to it.

  1. Are maximum open cursors exactly related to the number of JDBC connections, or are they also related to the statement and resultset objects we have created for a single connection? (We are using a pool of connections.)
  2. Is there a way to configure the number of statement/resultset objects in the database (like connections)?
  3. Is it advisable to use an instance-variable statement/resultset object instead of a method-local statement/resultset object in a single-threaded environment?
  4. Does executing a prepared statement in a loop cause this issue? (Of course, I could have used sqlBatch.) Note: pStmt is closed once the loop is over.

    try { //method/try starts
        String sql = "INSERT into TblName (col1, col2) VALUES(?, ?)";
        pStmt = obj.getConnection().prepareStatement(sql);
        pStmt.setLong(1, subscriberID);
        for (String language : additionalLangs) {
            pStmt.setInt(2, Integer.parseInt(language));
            pStmt.execute();
        }
    } //method/try ends
    finally { //finally starts
        pStmt.close();
    } //finally ends
    
  5. What will happen if conn.createStatement() and conn.prepareStatement(sql) are called multiple times on a single connection object?

Edit1: 6. Will the use of a weak/soft reference to the statement object help in preventing the leakage?

Edit2: 7. Is there any way I can find all the missing "statement.close()"s in my project? I understand it is not a memory leak, but I need to find the statement references (where close() is not performed) that become eligible for garbage collection. Is there any tool available, or do I have to analyze it manually?

Please help me understand it.

Solution

To find the open cursors in the Oracle DB for username VELU:

Go to the Oracle machine and start sqlplus as sysdba:

[oracle@db01 ~]$ sqlplus / as sysdba

Then run

SELECT A.VALUE,
       S.USERNAME,
       S.SID,
       S.SERIAL#
  FROM V$SESSTAT A,
       V$STATNAME B,
       V$SESSION S
 WHERE A.STATISTIC# = B.STATISTIC#
   AND S.SID        = A.SID
   AND B.NAME       = 'opened cursors current'
   AND S.USERNAME   = 'VELU';
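
If you prefer to watch this from the application side, a small sketch like the following runs the same query over JDBC (the connection URL and credentials are placeholders, and the account used must be allowed to read the v$ views):

    import java.sql.Connection;
    import java.sql.DriverManager;
    import java.sql.PreparedStatement;
    import java.sql.ResultSet;

    public class OpenCursorMonitor {
        public static void main(String[] args) throws Exception {
            String sql =
                "SELECT a.value, s.username, s.sid, s.serial# " +
                "  FROM v$sesstat a, v$statname b, v$session s " +
                " WHERE a.statistic# = b.statistic# AND s.sid = a.sid " +
                "   AND b.name = 'opened cursors current' AND s.username = ?";
            try (Connection conn = DriverManager.getConnection(
                     "jdbc:oracle:thin:@//db01:1521/ORCL", "system", "password"); // placeholders
                 PreparedStatement ps = conn.prepareStatement(sql)) {
                ps.setString(1, "VELU");
                try (ResultSet rs = ps.executeQuery()) {
                    while (rs.next()) {
                        System.out.println(rs.getString("username") + " (SID " + rs.getInt("sid")
                                + ") has " + rs.getInt("value") + " open cursors");
                    }
                }
            }
        }
    }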

If possible, please read my more detailed answer below for a fuller understanding of this solution.

Did you set autocommit=true? If not, try this:

    try { //method/try starts
        String sql = "INSERT into TblName (col1, col2) VALUES(?, ?)";
        Connection conn = obj.getConnection();
        pStmt = conn.prepareStatement(sql);

        for (String language : additionalLangs) {
            pStmt.setLong(1, subscriberID);
            pStmt.setInt(2, Integer.parseInt(language));
            pStmt.execute();
            conn.commit();
        }
    } //method/try ends
    finally { //finally starts
        pStmt.close();
    } //finally ends
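
Depending on your transactional requirements, another common pattern (a sketch, not part of the suggestion above, reusing the question's variables) is to switch auto-commit off and commit once after the loop, so the inserts form a single transaction:

    Connection conn = obj.getConnection();
    conn.setAutoCommit(false);                 // group the inserts into one transaction
    try (PreparedStatement pStmt = conn.prepareStatement(
            "INSERT into TblName (col1, col2) VALUES(?, ?)")) {
        pStmt.setLong(1, subscriberID);
        for (String language : additionalLangs) {
            pStmt.setInt(2, Integer.parseInt(language));
            pStmt.execute();
        }
        conn.commit();                         // one commit for all rows
    } catch (SQLException e) {
        conn.rollback();                       // undo the partial work on failure
        throw e;
    }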
    

If you pass a ReferenceQueue into the constructor of a soft or weak Reference, the referenced object is placed in the ReferenceQueue when it is garbage-collected (if that ever happens). With this approach, you can interact with the object's finalization, and you could close or finalize the object at that moment.
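
As a rough illustration only (a hypothetical sketch, and, as explained below, not a recommended fix), watching statements through a ReferenceQueue could look like this:

    import java.lang.ref.Reference;
    import java.lang.ref.ReferenceQueue;
    import java.lang.ref.WeakReference;
    import java.sql.PreparedStatement;

    class StatementWatcher {
        private final ReferenceQueue<PreparedStatement> queue = new ReferenceQueue<>();

        // Register a statement so we get told when it is garbage-collected.
        WeakReference<PreparedStatement> watch(PreparedStatement pStmt) {
            return new WeakReference<>(pStmt, queue);
        }

        // Poll the queue, e.g. from a background thread.
        void checkForCollectedStatements() {
            Reference<? extends PreparedStatement> collected;
            while ((collected = queue.poll()) != null) {
                // By the time the reference is enqueued, the statement object itself is gone,
                // so about all we can do here is log that one slipped through without close().
                System.err.println("A PreparedStatement was GC'ed without an explicit close(): " + collected);
            }
        }
    }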

Phantom references are a bit weirder; their purpose is only to control finalization, but you can never get a reference to the original object, so it's going to be hard to call the close() method on it.

However, it is rarely a good idea to attempt to control when the GC is run (Weak, Soft and PhantomReferences let you know after the fact that the object is enqueued for GC). In fact, if the amount of memory in the JVM is large (e.g. -Xmx2000m) you might never GC the object, and you will still experience the ORA-01000. If the JVM memory is small relative to your program's requirements, you may find that the ResultSet and PreparedStatement objects are GCed immediately after creation (before you can read from them), which will likely break your program.

TL;DR: The weak reference mechanism is not a good way to manage and close Statement and ResultSet objects.
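
Instead, assuming Java 7 or later, try-with-resources is a more reliable way to make sure statements are closed even when an exception is thrown (a sketch using the variable names from the question):

    String sql = "INSERT into TblName (col1, col2) VALUES(?, ?)";
    try (PreparedStatement pStmt = obj.getConnection().prepareStatement(sql)) {
        pStmt.setLong(1, subscriberID);
        for (String language : additionalLangs) {
            pStmt.setInt(2, Integer.parseInt(language));
            pStmt.execute();
        }
    } // pStmt.close() is called here automatically, even if an exception was thrown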

Correct your code like this:

    try { //method/try starts
        String sql = "INSERT into TblName (col1, col2) VALUES(?, ?)";
        pStmt = obj.getConnection().prepareStatement(sql);
        pStmt.setLong(1, subscriberID);
        for (String language : additionalLangs) {
            pStmt.setInt(2, Integer.parseInt(language));
            pStmt.execute();
        }
    } //method/try ends
    finally { //finally starts
        pStmt.close();
    } //finally ends
    

Are you sure that you're really closing your pStatements, connections and results?

To analyze open objects you can implement a delegator pattern, which wraps code around your statement, connection and result objects. That way you will see whether an object is actually closed successfully.

An example for pStmt = obj.getConnection().prepareStatement(sql):

    class obj {
        public Connection getConnection() {
            // create your real Connection here and hand it to the delegator
            return new ConnectionDelegator(/* ...here create your connection object and put it in... */);
        }
    }

    class ConnectionDelegator implements Connection {
        Connection delegates;

        public ConnectionDelegator(Connection con) {
            this.delegates = con;
        }

        public PreparedStatement prepareStatement(String sql) throws SQLException {
            return delegates.prepareStatement(sql);
        }

        public void close() throws SQLException {
            try {
                delegates.close();
            } finally {
                log.debug(delegates.toString() + " was closed");
            }
        }

        // ...delegate the remaining Connection methods to "delegates" in the same way...
    }
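
The same wrapping idea can be extended to statements, which also helps with the question about finding missing close() calls. Below is a minimal, hypothetical sketch (the class and method names are illustrative, not from the answer above): it remembers where every tracked statement was created and can dump the creation stack traces of the ones that were never closed.

    import java.sql.PreparedStatement;
    import java.util.Map;
    import java.util.concurrent.ConcurrentHashMap;

    class StatementTracker {
        private static final Map<PreparedStatement, Throwable> OPEN = new ConcurrentHashMap<>();

        // Call this when handing out a statement: pStmt = StatementTracker.track(conn.prepareStatement(sql));
        static PreparedStatement track(PreparedStatement ps) {
            OPEN.put(ps, new Throwable("statement created here")); // remember the creation site
            return ps;
        }

        // Call this right before (or after) ps.close().
        static void closed(PreparedStatement ps) {
            OPEN.remove(ps);
        }

        // Call this periodically or at shutdown to see which statements were never closed.
        static void dumpLeaks() {
            for (Throwable creationSite : OPEN.values()) {
                creationSite.printStackTrace(); // shows where the leaked statement was created
            }
        }
    }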
    

Using batch processing will result in less overhead. See the following link for examples: http://www.tutorialspoint.com/jdbc/jdbc-batch-processing.htm
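
For reference, here is a small sketch of the loop from the question rewritten with JDBC batching (same variables as in the question):

    String sql = "INSERT into TblName (col1, col2) VALUES(?, ?)";
    try (PreparedStatement pStmt = obj.getConnection().prepareStatement(sql)) {
        pStmt.setLong(1, subscriberID);
        for (String language : additionalLangs) {
            pStmt.setInt(2, Integer.parseInt(language));
            pStmt.addBatch();      // queue the row instead of executing it immediately
        }
        pStmt.executeBatch();      // send all rows in one round trip, still one statement/cursor
    }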

I am adding a few more points for understanding:

  1. A cursor is only about a statement object; it is neither the ResultSet nor the Connection object.
  2. But we still have to close the ResultSet to free some Oracle memory. However, an unclosed ResultSet is not counted against the cursors.
  3. Closing the Statement object will automatically close the ResultSet object too.
  4. A cursor is created for every SELECT/INSERT/UPDATE/DELETE statement.
  5. Each Oracle DB instance can be identified by its Oracle SID; similarly, Oracle identifies each connection by a connection SID. The two SIDs are different things.
  6. So an Oracle session is nothing but a JDBC (TCP) connection, which is nothing but one SID.
  7. If we set maximum cursors to 500, that limit applies to a single JDBC session/connection/SID (see the sketch after this list).
  8. So we can have many JDBC connections, each with its own number of cursors (statements).
  9. Once the JVM is terminated, all the connections/cursors are closed; likewise, when a JDBC connection is closed, the cursors belonging to that connection are closed.
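
To make points 1 and 7 concrete, here is a small, hypothetical demonstration: every statement executed on the same connection holds one cursor until that statement is closed, so a loop like this will eventually fail with ORA-01000 on that one session.

    import java.sql.Connection;
    import java.sql.PreparedStatement;
    import java.sql.SQLException;
    import java.util.ArrayList;
    import java.util.List;

    class CursorLeakDemo {
        // Deliberately leaks statements -- do NOT do this in real code.
        static void leakCursors(Connection conn) throws SQLException {
            List<PreparedStatement> leaked = new ArrayList<>();
            for (int i = 0; i < 10_000; i++) {
                PreparedStatement ps = conn.prepareStatement("SELECT 1 FROM dual");
                ps.executeQuery();   // each executed, unclosed statement holds one open cursor
                leaked.add(ps);      // keep a reference so nothing is closed or collected
            }
            // Sooner or later this session hits: ORA-01000: maximum open cursors exceeded
        }
    }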

Log in as sysdba.

In PuTTY (Oracle login):

      [oracle@db01 ~]$ sqlplus / as sysdba
    

In SQL*Plus:

Username: sys as sysdba

Set the session_cached_cursors value to 0 so that closed cursors are not kept cached in the session:

    alter session set session_cached_cursors=0;
    select * from V$PARAMETER where name='session_cached_cursors';
    

Check the existing OPEN_CURSORS value set per connection in the DB:

    SELECT max(a.value) as highest_open_cur, p.value as max_open_cur
      FROM v$sesstat a, v$statname b, v$parameter p
     WHERE a.statistic# = b.statistic#
       AND b.name = 'opened cursors current'
       AND p.name = 'open_cursors'
     GROUP BY p.value;
    

Below is the query to find the SID/connections list with open cursor values.

     SELECT a.value, s.username, s.sid, s.serial#
    FROM v$sesstat a, v$statname b, v$session s
    WHERE a.statistic# = b.statistic#  AND s.sid=a.sid
    AND b.name = 'opened cursors current' AND username = 'SCHEMA_NAME_IN_CAPS'
    

Use the query below to identify the SQL in the open cursors:

     SELECT oc.sql_text, s.sid
    FROM v$open_cursor oc, v$session s
    WHERE OC.sid = S.sid
    AND s.sid=1604
    AND OC.USER_NAME ='SCHEMA_NAME_IN_CAPS'
    

Now debug the code and enjoy! :)

Query to find the SQL that opened the cursors:

    SELECT s.machine, oc.user_name, oc.sql_text, count(1)
    FROM v$open_cursor oc, v$session s
    WHERE oc.sid = s.sid
    and S.USERNAME='XXXX'
    GROUP BY user_name, sql_text, machine
    HAVING COUNT(1) > 2
    ORDER BY count(1) DESC
    

If your application is a Java EE application running on Oracle WebLogic as the application server, a possible cause for this issue is the Statement Cache Size setting in WebLogic.

If the Statement Cache Size setting for a particular data source is about equal to, or greater than, the Oracle database maximum open cursor count setting, then all of the open cursors can be consumed by cached SQL statements that are held open by WebLogic, resulting in the ORA-01000 error.

To address this, reduce the Statement Cache Size setting for each WebLogic data source that points to the Oracle database to be significantly less than the maximum cursor count setting on the database.

In the WebLogic 10 Admin Console, the Statement Cache Size setting for each data source can be found at Services (left nav) > Data Sources > (individual data source) > Connection Pool tab.

In our case, we were using Hibernate and we had many variables referencing the same Hibernate-mapped entity. We were creating and saving these references in a loop. Each reference opened a cursor and kept it open.

We discovered this by using a query to check the number of open cursors while running our code, stepping through with a debugger and selectively commenting things out.

As to why each new reference opened another cursor: the entity in question had collections of other entities mapped to it, and I think this had something to do with it (perhaps not just this alone but in combination with how we had configured the fetch mode and cache settings). Hibernate itself has had bugs around failing to close open cursors, though it looks like these have been fixed in later versions.

Since we didn't really need to have so many duplicate references to the same entity anyway, the solution was to stop creating and holding onto all those redundant references. Once we did that, the problem went away.

This problem mainly happens when you are using connection pooling, because when you close a connection it goes back to the connection pool, and the cursors associated with that connection never get closed since the connection to the database is still open.

So one alternative is to decrease the idle time of connections in the pool, so that whenever a connection sits idle for, say, 10 seconds, the connection to the database is closed and a new connection is created and put into the pool.
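
As an illustration only (a sketch assuming HikariCP as the connection pool; the URL and numbers are placeholders), the pool can be configured to shrink and recycle idle connections:

    import javax.sql.DataSource;

    import com.zaxxer.hikari.HikariConfig;
    import com.zaxxer.hikari.HikariDataSource;

    class PoolFactory {
        static DataSource createPool() {
            HikariConfig config = new HikariConfig();
            config.setJdbcUrl("jdbc:oracle:thin:@//dbhost:1521/ORCL"); // placeholder URL
            config.setMaximumPoolSize(10);
            config.setMinimumIdle(2);        // let the pool shrink so the idle timeout can apply
            config.setIdleTimeout(10_000);   // close connections that sit idle for ~10 seconds
            config.setMaxLifetime(600_000);  // recycle every connection after 10 minutes
            return new HikariDataSource(config);
        }
    }

Bear in mind this only works around the symptom; the real fix is still to close every Statement and ResultSet.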

Updated 2020 answer

I had this problem with my datasource in WildFly and Tomcat, connecting to an Oracle 10g.

I found that under certain conditions the statement wasn't closed even when statement.close() was invoked.

The problem was with the Oracle driver we were using: ojdbc7.jar. This driver is intended for Oracle 12c and 11g, and it seems to have some issues when used with Oracle 10g, so I downgraded to ojdbc5.jar and now everything is running fine.

Based on the post by Andrew Alcock above, I made changes so that inside the loop, I closed each resultset and each statement after getting the data and before looping again, and that solved the problem.

Additionally, the exact same problem occurred in another loop of insert statements, in another Oracle DB (ORA-01000), this time after 300 statements. Again it was solved in the same way, so either the PreparedStatement or the ResultSet or both count as open cursors until they are closed.
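
A sketch of that pattern (the query, connection and column names are hypothetical), closing the ResultSet and the PreparedStatement at the end of every iteration so each pass releases its cursor before the next one starts:

    import java.sql.Connection;
    import java.sql.PreparedStatement;
    import java.sql.ResultSet;
    import java.sql.SQLException;
    import java.util.List;

    class PerIterationClose {
        static void loadRows(Connection conn, List<Long> ids) throws SQLException {
            for (long id : ids) {
                try (PreparedStatement ps = conn.prepareStatement("SELECT col1 FROM TblName WHERE col2 = ?")) {
                    ps.setLong(1, id);
                    try (ResultSet rs = ps.executeQuery()) {
                        while (rs.next()) {
                            System.out.println(rs.getString(1)); // handle the row here
                        }
                    }
                } // both rs and ps are closed here, freeing the cursor before the next iteration
            }
        }
    }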

I too had faced this issue. The exception below used to come:

    java.sql.SQLException: - ORA-01000: maximum open cursors exceeded
    

I was using the Spring Framework with Spring JDBC for the DAO layer.

My application used to leak cursors somehow, and after a few minutes or so it used to give me this exception.

After a lot of thorough debugging and analysis, I found that the problem was with the indexing, primary key and unique constraints on one of the tables used in the query I was executing.

My application was trying to update columns which were mistakenly indexed. So whenever my application hit the update query on the indexed columns, the database tried to re-index based on the updated values, and that was leaking cursors.

I was able to solve the problem by properly indexing the columns that were used for searching in the query and applying appropriate constraints wherever required.

I ran into this issue after setting the prepared statement cache size to a large value. Apparently, when prepared statements are kept in the cache, the cursor stays open.
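
Whichever layer does the caching (the pool or the driver), the key is to keep the cache size well below the database's open_cursors limit, because each cached statement keeps its cursor open. A sketch assuming the Oracle JDBC driver's implicit statement cache (oracle.jdbc.OracleConnection):

    import java.sql.Connection;
    import java.sql.SQLException;

    import oracle.jdbc.OracleConnection;

    class StatementCacheConfig {
        static void configure(Connection conn) throws SQLException {
            OracleConnection oracleConn = conn.unwrap(OracleConnection.class);
            oracleConn.setImplicitCachingEnabled(true);
            // Keep this comfortably below the database's open_cursors setting.
            oracleConn.setStatementCacheSize(50);
        }
    }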