

Previously on "Bulk insert in Oracle"


  • expat
    replied
    Originally posted by lilelvis2000:
    I need to copy data from our production system to our UAT system. I need to run a consolidation query on this data before inserting into the UAT tables. I'll then check the MIS from each system to ensure I haven't lost anything.

    These tables are reasonably large: the largest has over 2M rows and consumes about 220MB.

    One way is to spool it out to .csv using SQL*Plus. The other is to have the DBA link the databases together and pull across using the queries.

    I'm worried about blowing redo buffers during the insert. How would I go about doing a commit every, say, 10K rows - similar to sqlldr? I don't think there is any kind of "SET" command which can do this. The best I can figure is a cursor with a counter, committing every time counter MOD 10000 = 0.

    I don't care about the contents of the UAT system; it's a UAT system, after all, easily rebuilt from backup.
    Surely DBLINK and straight SQL is the best way, especially if you could stand a failure. Turn logging off before inserting.

    over 2M rows? I'm trying to tune a query that has over 100M and a subquery with 113M.
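    A minimal sketch of the DB-link route, assuming a hypothetical link name `prod_link` and placeholder table/credential names (run on the UAT database):

    ```sql
    -- Hypothetical link, credentials and table names.
    CREATE DATABASE LINK prod_link
      CONNECT TO app_user IDENTIFIED BY app_password
      USING 'PRODDB';

    -- NOLOGGING plus a direct-path APPEND insert keeps redo to a minimum;
    -- the loaded rows are unrecoverable from redo, which is fine for UAT.
    ALTER TABLE uat_orders NOLOGGING;

    INSERT /*+ APPEND */ INTO uat_orders
    SELECT /* consolidation query here */ *
    FROM   orders@prod_link;

    COMMIT;
    ```

    Note that APPEND/NOLOGGING suppresses redo for the table data only; index maintenance still generates redo unless the indexes are dropped and rebuilt afterwards.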



  • lilelvis2000
    replied
    I have SQL Server, but not at the client - they are an Oracle-only shop.

    I agree I need a staging area. I will have to ask for more room for the UAT system. But it's my staging area, and it will have all the MIS I need to compare the reports - to make sure we're financially balanced with production before we overwrite it.



  • Spacecadet
    replied
    Originally posted by lilelvis2000:
    I need to alter the data in a specific way as it is being transferred. True, this could be done through Access... but Access will attempt to do it all in a single transaction, which could kill the redo buffers and archive space and bring the database to a grinding halt.
    If you have SQL Server installed somewhere, then use SSIS (or DTS).

    As RichardCranium says, though: one step at a time. Use staging tables - you can always clear them down afterwards.
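    A sketch of the staging-table approach, assuming a hypothetical database link `prod_link` to production and placeholder table/column names:

    ```sql
    -- Stage the raw production rows first, without logging.
    CREATE TABLE stg_orders NOLOGGING AS
    SELECT * FROM orders@prod_link;

    -- Transform in a separate, verifiable step.
    INSERT /*+ APPEND */ INTO uat_orders
    SELECT order_id, SUM(amount) AS amount   -- consolidation logic here
    FROM   stg_orders
    GROUP  BY order_id;
    COMMIT;

    -- Compare MIS totals against production, then clear down:
    -- DROP TABLE stg_orders PURGE;
    ```

    Keeping the copy and the transform as separate steps means each one can be reconciled against production independently, which is exactly the point made above.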



  • Weltchy
    replied
    Originally posted by Hemingfield:
    Hiya,

    I know it is the root of all evil - but couldn't you map the tables through ODBC in MS ACCESS and just do a straight INSERT INTO between the two DBs?

    It's not pretty - but it should work.

    ...or restore from a backup of live?

    Regards,

    Hem
    For suggesting such, you deserve a beating. If you don't need to preserve existing data on your test system, then backup / restore would be the quickest way of moving data around servers.



  • RichardCranium
    replied
    Originally posted by EqualOpportunities:
    You can take an export, and import it using the direct path method.

    Or, you can force NOLOGGING on the affected objects - this will all but suppress redo and undo (the one you would actually 'blow') generation. These objects would be unrecoverable if you lost your DB and had to do a restore, but who cares...
    WHS.

    Originally posted by lilelvis2000:
    I need to alter the data in a specific way as it is being transferred.
    You WILL regret that.

    Change it and transfer it, or transfer it and change it. How will you know where you cocked up if you made a mistake?



  • lilelvis2000
    replied
    I need to alter the data in a specific way as it is being transferred. True, this could be done through Access... but Access will attempt to do it all in a single transaction, which could kill the redo buffers and archive space and bring the database to a grinding halt.

    I'd also be worried about any round-off (the data is numeric) within Access, as Access is very Slllloooww at handling numeric data.

    I have created this UAT system from a backup of the original. There isn't enough disk space to hold the additional data. The entire SAN is suffering from a lack of disk space - I had to pull a favour just to get the disk space I have now!

    I'll be getting the necessary DBA work done next week and will be able to do some tests before I do the "live" work. Should be fun... doing it this way also cuts down on the data flying over a 6Mb VPN link (with only a 480Kb upload).



  • Hemingfield
    replied
    Hiya,

    I know it is the root of all evil - but couldn't you map the tables through ODBC in MS ACCESS and just do a straight INSERT INTO between the two DBs?

    It's not pretty - but it should work.

    ...or restore from a backup of live?

    Regards,

    Hem



  • EqualOpportunities
    replied
    Originally posted by lilelvis2000:
    I need to copy data from our production system to our UAT system. I need to run a consolidation query on this data before inserting into the UAT tables. I'll then check the MIS from each system to ensure I haven't lost anything.

    These tables are reasonably large: the largest has over 2M rows and consumes about 220MB.

    One way is to spool it out to .csv using SQL*Plus. The other is to have the DBA link the databases together and pull across using the queries.

    I'm worried about blowing redo buffers during the insert. How would I go about doing a commit every, say, 10K rows - similar to sqlldr? I don't think there is any kind of "SET" command which can do this. The best I can figure is a cursor with a counter, committing every time counter MOD 10000 = 0.

    I don't care about the contents of the UAT system; it's a UAT system, after all, easily rebuilt from backup.
    You have a couple of options...

    You can take an export, and import it using the direct path method.

    Or, you can force NOLOGGING on the affected objects - this will all but suppress redo and undo (the one you would actually 'blow') generation. These objects would be unrecoverable if you lost your DB and had to do a restore, but who cares...
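    The export/import route above could look like this with the classic exp/imp tools (user names, file name and table list are placeholders; Data Pump's expdp/impdp would be the modern equivalents):

    ```shell
    # On production: export the tables using the direct path.
    exp app_user/app_password TABLES=(orders,invoices) DIRECT=y FILE=prod_data.dmp

    # On UAT: COMMIT=y commits after each buffer-full of rows, which
    # addresses the "commit every N rows" worry without writing any PL/SQL.
    imp app_user/app_password TABLES=(orders,invoices) FILE=prod_data.dmp \
        IGNORE=y COMMIT=y BUFFER=10485760
    ```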



  • lilelvis2000
    started a topic Bulk insert in Oracle

    Bulk insert in Oracle

    I need to copy data from our production system to our UAT system. I need to run a consolidation query on this data before inserting into the UAT tables. I'll then check the MIS from each system to ensure I haven't lost anything.

    These tables are reasonably large: the largest has over 2M rows and consumes about 220MB.

    One way is to spool it out to .csv using SQL*Plus. The other is to have the DBA link the databases together and pull across using the queries.

    I'm worried about blowing redo buffers during the insert. How would I go about doing a commit every, say, 10K rows - similar to sqlldr? I don't think there is any kind of "SET" command which can do this. The best I can figure is a cursor with a counter, committing every time counter MOD 10000 = 0.

    I don't care about the contents of the UAT system; it's a UAT system, after all, easily rebuilt from backup.
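    The cursor-with-a-counter idea from the post above can be sketched in PL/SQL like this (the link name `prod_link`, the table names and the consolidation query are placeholders). Be aware that committing inside the loop while the driving cursor is still open can trigger ORA-01555 against the source query, which is one reason the repliers favour a single direct-path insert instead:

    ```sql
    DECLARE
      v_count PLS_INTEGER := 0;
    BEGIN
      FOR r IN (SELECT /* consolidation query here */ *
                FROM   orders@prod_link) LOOP
        INSERT INTO uat_orders VALUES r;   -- whole-record insert
        v_count := v_count + 1;
        IF MOD(v_count, 10000) = 0 THEN
          COMMIT;                          -- commit every 10K rows, like sqlldr ROWS=
        END IF;
      END LOOP;
      COMMIT;                              -- pick up the final partial batch
    END;
    /
    ```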
