
Any PL/SQL experts here?


    Any PL/SQL experts here?

    I am using dynamic SQL to generate a recordset. It is dynamic because it is part of a metadata driven solution and the columns returned by the query are not known at compile time, and need to be capable of being changed without recompilation.

    This cursor is used to update/append another, which has a known, fixed structure; this second cursor will be returned to an external ETL tool to perform the actual update of a table in another database. There will always be more columns in the second cursor than the first.

    So, the requirement is to generate a cursor in the second, fixed format, with the columns in common with the first populated and the rest null. The second cursor is then returned from the package using a PIPELINED function.

    I can generate the list of columns so I thought of building some dynamic SQL like

    INSERT INTO cur_second (Field1, Field2, Field3 ....) SELECT <SQL for first cursor>

    however, EXECUTE IMMEDIATE does not like cursor names; it seems to need a table, and the table will not exist in the current database.

    I'm sure there's a solution out there using ref cursors, or temp tables or summat, but so far the most elegant route escapes me. Any PL/SQL gurus out there?
    My subconscious is annoying. It's got a mind of its own.

    #2
    Is the target table constant? If so, then you could create a database link between the two databases to be able to access that second one.
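
    For illustration, a minimal sketch of that route; the link name, credentials, TNS alias and table/column names below are all made up:

    Code:
    -- hypothetical names throughout: the link, credentials, TNS alias and columns are illustrative only
    CREATE DATABASE LINK target_db_link
       CONNECT TO target_user IDENTIFIED BY target_pwd
       USING 'TARGETDB';

    -- the generated SQL could then insert straight into the remote fixed-format table
    INSERT INTO fixed_target@target_db_link (field1, field2, field3)
    SELECT field1, field2, NULL          -- columns with no source value padded with NULL
    FROM   some_generated_pivot;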
    Originally posted by MaryPoppins
    I hadn't really understood this 'pwned' expression until I read DirtyDog's post.



      #3
      "Ugh" is my first thought. By "dynamic SQL" do you simply mean you're building query strings on the fly in the main application (or through another query)?
      Originally posted by MaryPoppins
      I'd still not breastfeed a nazi
      Originally posted by vetran
      Urine is quite nourishing



        #4
        Originally posted by pjclarke View Post
        I am using dynamic SQL to generate a recordset. It is dynamic because it is part of a metadata driven solution and the columns returned by the query are not known at compile time, and need to be capable of being changed without recompilation.
        I don't see how you are going to avoid recompiling - if the source data columns change, do the target columns not change as well? Or is the target table something like COL1 VARCHAR2(4000), COL2 VARCHAR2(4000), etc.?

        Sounds like the perfect situation for Interconnect, but that's going to be expensive for the replacement.
        Originally posted by MaryPoppins
        I hadn't really understood this 'pwned' expression until I read DirtyDog's post.



          #5
          Thanks for the replies:

          Yes, the target table is effectively constant; however, db links are banned in production (paranoid DBA).

          I'd welcome alternative, more elegant approaches; I'm a little rusty. Here's the full requirement. Note that the database design concepts are not mine, but I have to work with them.

          There is a staging database which contains (will contain) metadata and data from all feeds into and out of clientco. There's an 'attribute' table that holds metadata on every attribute (field) under control, and an attribute value history table that holds the actual values supplied, as a series of name/value pairs with date and other stuff. The values are all converted to VARCHAR.

          So if 'balance' is supplied daily by a particular input feed, there will be one row on the attrib hist table for the balance value for each day for each account. Other tables hold data on the sources, targets and mappings between them.

          I effectively need to do a pivot: retrieve the values for a set of attributes (attrib hist rows) on a particular date, each selected row becomes a column in the output recordset, and there will be one row per account (I am simplifying slightly).

          The set of attributes to return is retrieved from one of the control tables; that is not a problem. But because it is liable to change, the PIVOT clause is not available: you cannot use a SELECT or a variable in the list of columns to pivot; it requires a hard-coded list, apparently.

          So yes, I am generating some SQL on the fly, like

          Code:
          SELECT    AVH.ILIM_ID,
                    AIX.EXTERNAL_ID,
                    MAX(CASE WHEN ATTR.ATTRIB_NAME = 'COUNTRY_CODE'  THEN AVH.ATTRIB_VALUE END) AS ISSUE_CNTRY_CD,
                    MAX(CASE WHEN ATTR.ATTRIB_NAME = 'CURRENCY_CODE' THEN AVH.ATTRIB_VALUE END) AS LOC_CRRNCY_CD,
                    MAX(CASE WHEN ATTR.ATTRIB_NAME = 'TICKER'        THEN AVH.ATTRIB_VALUE END) AS TICKER,
                    [etc]
          FROM      DATA_PROVIDER DP,
                    SSDS_ATTRIB_VALUE_HISTORY AVH,
                    SSDS_ATTRIB ATTR,
                    ASSET_ID_XREF AIX,
                    EXTERNAL_ID_TYPE EIT
          WHERE     ATTR.ATTRIB_ID = AVH.ATTRIB_ID
          AND       DP.PROVIDER_ID = ATTR.PROVIDER_ID
                    [Filters on Date etc...]
          GROUP BY  AVH.ILIM_ID,
                    AIX.EXTERNAL_ID
          Running that through EXECUTE IMMEDIATE gives me the data I need; the next step is to use that data to sparsely populate the fixed-form table that I need to return.
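
          For what it's worth, a rough sketch of how that next step could be generated; TARGET_COLUMN_MAP is a hypothetical control table mapping each fixed output column to its source attribute (or to nothing):

          Code:
          DECLARE
             v_cols  VARCHAR2(32767);
             v_sql   VARCHAR2(32767);
          BEGIN
             -- walk the fixed output columns in order; unmapped ones become literal NULLs
             FOR r IN (SELECT target_column, attrib_name     -- TARGET_COLUMN_MAP is made up
                       FROM   target_column_map
                       ORDER  BY column_position)
             LOOP
                v_cols := v_cols ||
                          CASE
                             WHEN r.attrib_name IS NULL
                             THEN 'NULL AS ' || r.target_column
                             ELSE 'MAX(CASE WHEN ATTR.ATTRIB_NAME = ''' || r.attrib_name ||
                                  ''' THEN AVH.ATTRIB_VALUE END) AS ' || r.target_column
                          END || ', ';
             END LOOP;

             v_cols := RTRIM(v_cols, ', ');
             -- splice the generated column list into the rest of the query shown above
             v_sql  := 'SELECT ' || v_cols || ' FROM ... GROUP BY ...';
          END;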

          I can well see the 'Ugh' factor, but can you do better?
          Last edited by pjclarke; 7 January 2014, 12:28.
          My subconscious is annoying. It's got a mind of its own.



            #6
            A view on the history table? Hmmmm, liking it.
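
            For the record, a minimal sketch of that idea, pared down to two attributes (the view name is made up); the pivoted column list is still fixed when the view is created, so a change to the attribute set means recreating the view rather than recompiling the package:

            Code:
            CREATE OR REPLACE VIEW v_attrib_pivot AS
            SELECT   AVH.ILIM_ID,
                     MAX(CASE WHEN ATTR.ATTRIB_NAME = 'COUNTRY_CODE'  THEN AVH.ATTRIB_VALUE END) AS ISSUE_CNTRY_CD,
                     MAX(CASE WHEN ATTR.ATTRIB_NAME = 'CURRENCY_CODE' THEN AVH.ATTRIB_VALUE END) AS LOC_CRRNCY_CD
            FROM     SSDS_ATTRIB_VALUE_HISTORY AVH,
                     SSDS_ATTRIB ATTR
            WHERE    ATTR.ATTRIB_ID = AVH.ATTRIB_ID
            GROUP BY AVH.ILIM_ID;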
            My subconscious is annoying. It's got a mind of its own.



              #7
              Do it as a ref cursor - loop through the dynamic SQL and insert into the target table. You may be able to do a bulk collect, I'm not sure off the top of my head. That's only going to be good if there is a small(ish) data volume to transfer, though.

              How do you actually get the output from your system into the new one? Where does the third party tool read it from?
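
              Roughly this sort of shape, with the dynamic query cut down to two columns and the target table name (FIXED_TARGET) made up:

              Code:
              DECLARE
                 TYPE t_row IS RECORD (ilim_id VARCHAR2(100), ticker VARCHAR2(4000));
                 TYPE t_tab IS TABLE OF t_row;
                 c_src  SYS_REFCURSOR;
                 l_rows t_tab;
                 -- cut-down version of the generated pivot query
                 v_sql  VARCHAR2(32767) :=
                    'SELECT   AVH.ILIM_ID,
                              MAX(CASE WHEN ATTR.ATTRIB_NAME = ''TICKER'' THEN AVH.ATTRIB_VALUE END) AS TICKER
                     FROM     SSDS_ATTRIB_VALUE_HISTORY AVH,
                              SSDS_ATTRIB ATTR
                     WHERE    ATTR.ATTRIB_ID = AVH.ATTRIB_ID
                     GROUP BY AVH.ILIM_ID';
              BEGIN
                 OPEN c_src FOR v_sql;
                 FETCH c_src BULK COLLECT INTO l_rows;   -- fine while volumes stay small; add a LIMIT if they grow
                 CLOSE c_src;

                 FOR i IN 1 .. l_rows.COUNT LOOP
                    INSERT INTO fixed_target (ilim_id, ticker)
                    VALUES (l_rows(i).ilim_id, l_rows(i).ticker);
                 END LOOP;
              END;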
              Originally posted by MaryPoppins
              I hadn't really understood this 'pwned' expression until I read DirtyDog's post.



                #8
                Originally posted by DirtyDog View Post
                Do it as a ref cursor - loop through the dynamic SQL and insert into the target table. You may be able to do a bulk collect, I'm not sure off the top of my head. That's only going to be good if there is a small(ish) data volume to transfer, though.

                How do you actually get the output from your system into the new one? Where does the third party tool read it from?
                Yes, you can do EXECUTE IMMEDIATE ... BULK COLLECT INTO. The headache is INSERTing a variable set of columns into the target table (which isn't a physical table, though I can create a PL/SQL table of the required type).

                Data volumes are tiny and will remain so.

                Company standard is to expose data as a PIPELINED function out of a package; this is read by a tool called Data Integrator (not the Business Objects one, this is by Pervasive Software), which has a direct connection to the source and target databases....
                My subconscious is annoying. It's got a mind of its own.



                  #9
                  Originally posted by pjclarke View Post
                  Yes, you can do EXECUTE IMMEDIATE ... BULK COLLECT INTO. The headache is INSERTing a variable set of columns into the target table (which isn't a physical table, though I can create a PL/SQL table of the required type).

                  Data volumes are tiny and will remain so.
                  Assuming that the target table has a known and fixed number of columns...

                  Use the dynamic SQL to collect it into a PL/SQL table. Pass that type as a parameter into the function and have the function pipe the row out.
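
                  Something along these lines, as a rough sketch (the package, type and column names are illustrative, and for brevity the collection is built inside the pipelined function rather than passed in as a parameter):

                  Code:
                  CREATE OR REPLACE PACKAGE stage_out_pkg AS
                     -- fixed output format expected by the ETL tool (columns illustrative)
                     TYPE t_out_row IS RECORD (
                        ilim_id         VARCHAR2(100),
                        issue_cntry_cd  VARCHAR2(4000),
                        loc_crrncy_cd   VARCHAR2(4000),
                        ticker          VARCHAR2(4000));
                     TYPE t_out_tab IS TABLE OF t_out_row;

                     FUNCTION get_rows RETURN t_out_tab PIPELINED;
                  END stage_out_pkg;
                  /
                  CREATE OR REPLACE PACKAGE BODY stage_out_pkg AS
                     FUNCTION get_rows RETURN t_out_tab PIPELINED IS
                        l_rows t_out_tab;
                        v_sql  VARCHAR2(32767);
                     BEGIN
                        -- the generated pivot query: one select-list item per field of t_out_row, in order,
                        -- with CAST(NULL ...) for any fixed column that has no source attribute
                        v_sql := 'SELECT   AVH.ILIM_ID,
                                           MAX(CASE WHEN ATTR.ATTRIB_NAME = ''COUNTRY_CODE''  THEN AVH.ATTRIB_VALUE END),
                                           MAX(CASE WHEN ATTR.ATTRIB_NAME = ''CURRENCY_CODE'' THEN AVH.ATTRIB_VALUE END),
                                           CAST(NULL AS VARCHAR2(4000))
                                  FROM     SSDS_ATTRIB_VALUE_HISTORY AVH,
                                           SSDS_ATTRIB ATTR
                                  WHERE    ATTR.ATTRIB_ID = AVH.ATTRIB_ID
                                  GROUP BY AVH.ILIM_ID';

                        EXECUTE IMMEDIATE v_sql BULK COLLECT INTO l_rows;

                        FOR i IN 1 .. l_rows.COUNT LOOP
                           PIPE ROW (l_rows(i));
                        END LOOP;
                        RETURN;
                     END get_rows;
                  END stage_out_pkg;
                  /
                  The ETL tool would then read it with SELECT * FROM TABLE(stage_out_pkg.get_rows).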

                  Originally posted by pjclarke View Post
                  Company standard is to expose data as a PIPELINED function out of a package; this is read by a tool called Data Integrator (not the Business Objects one, this is by Pervasive Software), which has a direct connection to the source and target databases....
                  And yet they won't have a database link. I'm sure there's some logic there somewhere.
                  Originally posted by MaryPoppins
                  I hadn't really understood this 'pwned' expression until I read DirtyDog's post.



                    #10
                    Originally posted by DirtyDog View Post
                    Use the dynamic SQL to collect it into a PL/SQL table. Pass that type as a parameter into the function and have the function pipe the row out.
                    Uh-huh, that's where my head's at, and pretty much where we came in. It's just that niggling step of INSERTing a variable number of columns into a target 'table', even when the target table is of a known and fixed format....

                    Originally posted by DirtyDog View Post
                    And yet they won't have a database link. I'm sure there's some logic there somewhere.
                    You'd have thought being a DBA required a certain capacity for logical thought, wouldn't you? Experience indicates otherwise.
                    My subconscious is annoying. It's got a mind of its own.
