
Bulk insert in Oracle



    I need to copy data from our production system to our UAT system. I need to run a consolidation query on this data before inserting into the UAT tables. I'll then check the MIS from each system to ensure I haven't lost anything.

    These tables are reasonably large; the largest has over 2M rows and consumes about 220MB.

    One way is to spool it out to .csv using SQL*Plus. The other is to have the DBA link the databases together and pull it across using queries.

    I'm worried about blowing the redo buffers during the insert. How would I go about doing a commit every, say, 10K rows - similar to sqlldr? I don't think there is any type of "SET" command which can do this. Best I can figure is a cursor with a counter and a commit every time counter MOD 10000 = 0.
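
    Something like this, as a rough sketch (table, column and link names are made up, not the real schema):

    DECLARE
      l_rows PLS_INTEGER := 0;
    BEGIN
      -- the real consolidation query would go in this cursor
      FOR r IN (SELECT col1, col2 FROM prod_table@prod_link) LOOP
        INSERT INTO uat_table (col1, col2) VALUES (r.col1, r.col2);
        l_rows := l_rows + 1;
        IF MOD(l_rows, 10000) = 0 THEN
          COMMIT;  -- commit every 10K rows, similar to sqlldr's ROWS parameter
        END IF;
      END LOOP;
      COMMIT;  -- pick up the final partial batch
    END;
    /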

    I don't care about the contents of the UAT system - it is a UAT system, after all, easily rebuilt from backup.
    McCoy: "Medical men are trained in logic."
    Spock: "Trained? Judging from you, I would have guessed it was trial and error."

    #2
    Originally posted by lilelvis2000
    I need to copy data from our production system to our UAT system. I need to run a consolidation query on this data before inserting into the UAT tables. I'll then check the MIS from each system to ensure I haven't lost anything.

    These tables are reasonably large; the largest has over 2M rows and consumes about 220MB.

    One way is to spool it out to .csv using SQL*Plus. The other is to have the DBA link the databases together and pull it across using queries.

    I'm worried about blowing the redo buffers during the insert. How would I go about doing a commit every, say, 10K rows - similar to sqlldr? I don't think there is any type of "SET" command which can do this. Best I can figure is a cursor with a counter and a commit every time counter MOD 10000 = 0.

    I don't care about the contents of the UAT system - it is a UAT system, after all, easily rebuilt from backup.
    You have a couple of options...

    You can take an export, and import it using the direct path method.

    Or, you can force NOLOGGING on the affected objects - this will all but suppress redo and undo (the one you would actually 'blow') generation. These objects would be unrecoverable if you lost your DB and had to do a restore, but who cares...
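
    Roughly, assuming a link is in place (names are placeholders) - and note that NOLOGGING only skips redo for direct-path work, so pair it with an APPEND-hinted insert:

    ALTER TABLE uat_table NOLOGGING;

    INSERT /*+ APPEND */ INTO uat_table  -- direct-path insert: minimal redo and undo
    SELECT col1, col2
    FROM   prod_table@prod_link;         -- or the output of the consolidation query

    COMMIT;

    ALTER TABLE uat_table LOGGING;       -- switch it back afterwards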
    The squint, the cocked eye and clenched fist are the cornerstones of all Merseyside communication from birth to grave



      #3
      Hiya,

      I know it is the root of all evil - but couldn't you map the tables through ODBC in MS ACCESS and just do a straight INSERT INTO between the two DBs?

      It's not pretty - but it should work.

      ...or restore from a backup of live?

      Regards,

      Hem



        #4
        I need to alter the data in a specific way as it is being transferred. True, this could be done through Access, but Access will attempt to do it all in a single transaction, which could kill the redo buffers and archive space and bring the database to a grinding halt.

        I'd also be worried about any round-off (the data is numeric) within Access, as Access is very slow at handling numeric data.

        I have created this UAT system from a backup of the original. There isn't enough disk space to hold the additional data. The entire SAN is suffering from a lack of disk space - I had to pull a favour just to get the disk space I've got now!

        I'll be getting the necessary DBA work done next week and will be able to do some tests before I do the "live" work. Should be fun... doing it this way also cuts down on the data flying over a 6Mb VPN link (with only a 480Kb upload).
        McCoy: "Medical men are trained in logic."
        Spock: "Trained? Judging from you, I would have guessed it was trial and error."



          #5
          Originally posted by EqualOpportunities
          You can take an export, and import it using the direct path method.

          Or, you can force NOLOGGING on the affected objects - this will all but suppress redo and undo (the one you would actually 'blow') generation. These objects would be unrecoverable if you lost your DB and had to do a restore, but who cares...
          WHS.

          Originally posted by lilelvis2000
          I need to alter the data in a specific way as it is being transferred.
          You WILL regret that.

          Change it and transfer it, or transfer it and change it. How will you know where you cocked up if you made a mistake?
          My all-time favourite Dilbert cartoon, this is: BTW, a Dumpster is a brand of skip, I think.



            #6
            Originally posted by Hemingfield
            Hiya,

            I know it is the root of all evil - but couldn't you map the tables through ODBC in MS ACCESS and just do a straight INSERT INTO between the two DBs?

            It's not pretty - but it should work.

            ...or restore from a backup of live?

            Regards,

            Hem
            For suggesting such, you deserve a beating. If you don't need to preserve existing data on your test system, then backup / restore would be the quickest way of moving data around servers.



              #7
              Originally posted by lilelvis2000
              I need to alter the data in a specific way as it is being transferred. True, this could be done through Access, but Access will attempt to do it all in a single transaction, which could kill the redo buffers and archive space and bring the database to a grinding halt.
              If you have SQL Server installed somewhere, then use SSIS (or DTS).

              As RichardCranium says though, one step at a time, so use staging tables - you can always clear them down afterwards.
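
              A rough sketch of the staging-table route, assuming a dblink and with made-up table names:

              CREATE TABLE stg_sales NOLOGGING AS
              SELECT * FROM sales@prod_link;        -- 1. land the raw production data

              INSERT /*+ APPEND */ INTO uat_sales   -- 2. consolidate from staging into UAT
              SELECT account_id, SUM(amount)
              FROM   stg_sales
              GROUP  BY account_id;

              COMMIT;

              TRUNCATE TABLE stg_sales;             -- 3. clear the staging table down afterwards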
              Coffee's for closers



                #8
                I have SQL Server, but not the client - they are an Oracle-only shop.

                I agree I need a staging area. I will have to ask for more room for the UAT system. But it's my staging area and it will have all the MIS I need to compare the reports, to make sure we're financially balanced with production before we overwrite production.
                McCoy: "Medical men are trained in logic."
                Spock: "Trained? Judging from you, I would have guessed it was trial and error."



                  #9
                  Originally posted by lilelvis2000
                  I need to copy data from our production system to our UAT system. I need to run a consolidation query on this data before inserting into the UAT tables. I'll then check the MIS from each system to ensure I haven't lost anything.

                  These tables are reasonably large; the largest has over 2M rows and consumes about 220MB.

                  One way is to spool it out to .csv using SQL*Plus. The other is to have the DBA link the databases together and pull it across using queries.

                  I'm worried about blowing the redo buffers during the insert. How would I go about doing a commit every, say, 10K rows - similar to sqlldr? I don't think there is any type of "SET" command which can do this. Best I can figure is a cursor with a counter and a commit every time counter MOD 10000 = 0.

                  I don't care about the contents of the UAT system - it is a UAT system, after all, easily rebuilt from backup.
                  Surely DBLINK and straight SQL is the best way, especially if you could stand a failure. Turn logging off before inserting.
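
                  Something along these lines, assuming the DBA sets up the link (link name, TNS alias and credentials are all placeholders):

                  CREATE DATABASE LINK prod_link
                    CONNECT TO app_owner IDENTIFIED BY app_password
                    USING 'PROD';

                  -- straight pull; direct-path so the NOLOGGING setting actually takes effect
                  INSERT /*+ APPEND */ INTO uat_table
                  SELECT col1, col2
                  FROM   prod_table@prod_link;

                  COMMIT;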

                  Over 2M rows? I'm trying to tune a query that has over 100M rows and a subquery with 113M.

