
Parsing data from legacy system



    I have to get data from a legacy system. I can run a report that outputs the data we need as a text file with 69 lines per record. I need a quick and efficient way to read the 69 lines for each record, find the bits of data we want (each one is in a fixed position), and write it all to a CSV or similar file so I can grab it and import it.

    I reckon I could cobble something up in PeopleCode (which is similar to VB) but it would take several days (I am not a programmer by any stretch of the imagination) and it probably would not work. Any pointers to tools or example scripts anyone has would be gratefully received. I know it's a cheek, but if you don't ask...

    BTW I have access to the MS Office apps so could use VB in those - I also have a Linux box running Ubuntu, so could use any handy utilities that might be lying around. I have tried Googling (a lot) but must be typing in the wrong questions...
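
    For reference, a minimal Python sketch of the sort of script being asked for - the file names and field positions below are made-up placeholders, and the real line/column offsets would come from the report layout. It should run as-is under the stock Python on the Ubuntu box mentioned above:

    import csv

    LINES_PER_RECORD = 69

    # (line_within_record, start_column, length, csv_header) - placeholder offsets;
    # substitute the actual fixed positions from the report layout
    FIELDS = [
        (0, 0, 10, "emp_id"),
        (3, 15, 30, "name"),
        (12, 5, 8, "start_date"),
    ]

    with open("report.txt") as src, open("output.csv", "w", newline="") as dst:
        writer = csv.writer(dst)
        writer.writerow([name for *_, name in FIELDS])  # header row
        lines = src.read().splitlines()
        # walk the file one 69-line record at a time
        for i in range(0, len(lines), LINES_PER_RECORD):
            record = lines[i:i + LINES_PER_RECORD]
            if len(record) < LINES_PER_RECORD:
                break  # trailing partial record - ignore it
            row = [record[ln][start:start + length].strip()
                   for ln, start, length, _ in FIELDS]
            writer.writerow(row)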

    #2
    SQL Server Integration Services (SSIS) was born for this



      #3
      Perhaps write a Perl or Python script to do that?



        #4
        Offshore it and throw resource at the problem



          #5
          Assuming an Oracle 9i or above database, can you not pull the report extract in as an external table, and just select against it in SQL to get the data that you want?



            #6
            Originally posted by TheFaQQer View Post
            Assuming an Oracle 9i or above database, can you not pull the report extract in as an external table, and just select against it in SQL to get the data that you want?

            Brill idea - many thanks - sadly, they won't give me that kind of access to the Oracle DB and I don't have one of my own I can use (maybe I should though).

            I do, however, have SSIS on a VM I can use, so I'll give that a go later - last time I used SSIS it was happy with fixed-length or CSV (or similar) files but wanted a record per line. However, I haven't used it for a while, so I'll look again.

            Thanks for all the replies.



              #7
              Originally posted by Peoplesoft bloke View Post
              Brill idea - many thanks - sadly, they won't give me that kind of access to the Oracle DB and I don't have one of my own I can use (maybe I should though).

              I do, however, have SSIS on a VM I can use, so I'll give that a go later - last time I used SSIS it was happy with fixed-length or CSV (or similar) files but wanted a record per line. However, I haven't used it for a while, so I'll look again.

              Thanks for all the replies.
              Good choice.
              You may need to write a tiny bit of VB to handle the non-standard text file.
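
              A rough sketch of that pre-processing step - in Python rather than VB, purely for illustration, with placeholder file names - which collapses each 69-line record into one pipe-delimited line so SSIS can read the result as an ordinary one-record-per-line flat file:

              LINES_PER_RECORD = 69

              with open("report.txt") as src, open("flattened.txt", "w") as dst:
                  buffer = []
                  for line in src:
                      buffer.append(line.rstrip("\n"))
                      if len(buffer) == LINES_PER_RECORD:
                          # one logical record per output line, fields joined with "|"
                          dst.write("|".join(buffer) + "\n")
                          buffer = []
                  # leftover lines would mean a truncated final record - drop or log as needed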



                #8
                awk?
                SQL*Loader?

                I'm a bit of a Luddite though - I'd probably do it with a bit of C.

                Any issues with the format? Control characters? Whitespace?



                  #9
                  Originally posted by Peoplesoft bloke View Post
                  I have to get data from a legacy system. I can run a report that outputs the data we need as a text file with 69 lines per record. I need a quick and efficient way to read the 69 lines for each record, find the bits of data we want (each one is in a fixed position), and write it all to a CSV or similar file so I can grab it and import it.

                  I reckon I could cobble something up in PeopleCode (which is similar to VB) but it would take several days (I am not a programmer by any stretch of the imagination) and it probably would not work. Any pointers to tools or example scripts anyone has would be gratefully received. I know it's a cheek, but if you don't ask...

                  BTW I have access to the MS Office apps so could use VB in those - I also have a Linux box running Ubuntu, so could use any handy utilities that might be lying around. I have tried Googling (a lot) but must be typing in the wrong questions...
                  Perl is born for stuff like this. You could probably pay someone on here to do it for you in a few hours, then charge your client $$$.



                    #10
                    How automated or repeatable does it need to be?
                    If "not very", you could open it in Excel, use the Text to Columns feature, and save it as CSV. You could (probably) automate all of that with macros too.

