
Previously on "Parsing data from legacy system"


  • BrilloPad
    replied
    Originally posted by thunderlizard
    You could still use Excel: just copy-and-paste the text in starting at A1 (then it won't bother you with the import wizard thing) and use Text-to-columns to split it. Even with multiple rows per record, as long as it's consistent, you can write a macro to get the data items you want using Range.Offset, because they'll all be in fixed positions relative to the top-left cell of each record.


    Excel's great. 90% of the world's financial sector can't be wrong! (except when they are...)
    Most credit systems at IBs are written in Excel - the better ones use MySQL. I know ClientCo was threatened with closure by the IFA several times - perhaps if they had been, they might have put together proper systems and understood their risk better.



  • xoggoth
    replied
    Agree with thunderlizard - I find Excel is really good for this sort of thing, with or without using its VBA. Could post you a message with an example of read/write to text files from Excel VBA if you decide to go that route.
    Last edited by xoggoth; 16 March 2008, 22:53.



  • thunderlizard
    replied
    You could still use Excel: just copy-and-paste the text in starting at A1 (then it won't bother you with the import wizard thing) and use Text-to-columns to split it. Even with multiple rows per record, as long as it's consistent, you can write a macro to get the data items you want using Range.Offset, because they'll all be in fixed positions relative to the top-left cell of each record.


    Excel's great. 90% of the world's financial sector can't be wrong! (except when they are...)

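The Range.Offset idea above translates directly outside Excel too: once you know which line a record starts on, every field sits at a fixed (row, column) offset from it. A minimal Python sketch - the field names, offsets, and sample record below are invented for illustration, not taken from the real report:

```python
# Each field is located by (line offset within the record, start col, end col).
# These positions are made-up examples, not the actual report layout.
FIELDS = {
    "name":   (0, 0, 10),
    "amount": (2, 5, 12),
}

def extract(record_lines, fields=FIELDS):
    """Pull each field from its fixed position within one record block."""
    out = {}
    for name, (row, start, end) in fields.items():
        out[name] = record_lines[row][start:end].strip()
    return out

sample = [
    "SMITH J    ",
    "ignored line",
    "ACCT 0042.50",
]
print(extract(sample))  # {'name': 'SMITH J', 'amount': '0042.50'}
```

Exactly the same lookup the Range.Offset macro would do, just on a list of lines instead of worksheet cells.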


  • Sysman
    replied
    If it's a one-off, muck around...

    But if it's something which will occur again, do it properly.

    COBOL was designed for this job.

    Quotation on request.



  • Peoplesoft bloke
    replied
    Originally posted by thunderlizard
    How automated or repeatable does it need to be?
    If "not very", you could open it in Excel, use the Text-To-Columns function, and save it as CSV. You could (probably) automate all of that with macros too.
    I'll only need to do it "in anger" once, but I'm running it with smaller test versions of the data files to prove it before loading the full-size data files. I'm also using test versions of the data when away from site to avoid taking sensitive data off site.
    I'd discounted Excel as the wizard only offered to open text files as fixed width or delimited - and this file, although fixed width, has 70 lines per record.



  • OrangeHopper
    replied
    We do similar things here. Monthly reports, produced by Oracle's Discoverer, are imported into Excel and macros are run to extract and reformat. Unfortunately, it took a few days for the Excel/VB novice (i.e. me) to sort it out.

    <enter tongue-in-cheek, not too serious mode>
    My parse2 stuff is designed to tackle this sort of problem. I could put a page together for you where you paste the report and it outputs the data you want extracted.
    </enter tongue-in-cheek, not too serious mode>



  • thunderlizard
    replied
    How automated or repeatable does it need to be?
    If "not very", you could open it in Excel, use the Text-To-Columns function, and save it as CSV. You could (probably) automate all of that with macros too.



  • Platypus
    replied
    Originally posted by Peoplesoft bloke
    I have to get data from a legacy system - I can run a report that outputs the data we need in a report (text file) that has 69 lines per record. I need a quick and efficient way to read the 69 lines for each record and find the bits of data we want (each one is in a fixed position), write all of the data to a CSV or similar file so I can grab it and import it.

    I reckon I could cobble something up in PeopleCode (which is similar to VB) but it would take several days (I am not a programmer by any stretch of the imagination) and probably would not work. Any pointers to tools/example scripts anyone has would be gratefully received. I know it's a cheek, but if you don't ask..........

    BTW I have access to MS Office apps so could use VB in that - also have a linux box running Ubuntu so could use any handy utilities that might be lying around. I have tried Googling (a lot) but must be typing in the wrong questions......
    Perl is born for stuff like this. You could probably pay someone on here to do it for you in a few hours, then charge your client $$$.

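The Perl-style script Platypus has in mind is a short job in Python too (stdlib only, which the Ubuntu box already has): chunk the report into fixed-size blocks of lines, slice each field out of its fixed position, and write one CSV row per record. LINES_PER_RECORD, the field slices, and the sample report here are placeholders - the real report uses 69 lines per record and positions only the OP knows:

```python
import csv
import io

LINES_PER_RECORD = 3              # the real report uses 69
FIELDS = [                        # (column name, line offset, start col, end col)
    ("id",   0, 0, 6),            # illustrative positions only
    ("date", 1, 4, 14),
]

def report_to_csv(report_text, out_file):
    """Chunk the report into fixed-size records and emit one CSV row per record."""
    lines = report_text.splitlines()
    writer = csv.writer(out_file, lineterminator="\n")
    writer.writerow([name for name, *_ in FIELDS])
    for i in range(0, len(lines), LINES_PER_RECORD):
        block = lines[i:i + LINES_PER_RECORD]
        if len(block) < LINES_PER_RECORD:
            break                 # ignore a trailing partial record
        writer.writerow([block[row][start:end].strip()
                         for _, row, start, end in FIELDS])

# Two fake 3-line records standing in for the real 69-line ones.
report = (
    "REC001 header\n"
    "dt: 2008-03-16\n"
    "trailer line\n"
    "REC002 header\n"
    "dt: 2008-03-17\n"
    "trailer line\n"
)
buf = io.StringIO()
report_to_csv(report, buf)
print(buf.getvalue())
```

For the one-off run, replace the io.StringIO buffer with open() calls on the report file and the output CSV, and fill in the real field positions.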


  • where did my id go?
    replied
    awk?
    SQL*Loader?

    I'm a bit of a luddite though - I'd probably do it with a bit of C.

    any issues with the format? control characters? whitespace?



  • Spacecadet
    replied
    Originally posted by Peoplesoft bloke
    Brill idea - many thanks - sadly, they won't give me that kind of access to the Oracle DB and I don't have one of my own I can use (maybe I should though).

    I do, however, have SSIS on a VM I can use so I'll give that a go later - last time I used SSIS it was happy with fixed length or csv (or similar) but wanted a record per line - however I haven't used it for while so I'll look again.

    Thanks for all the replies.
    Good choice.
    You may need to write a tiny bit of VB to handle the non-standard text file.



  • Peoplesoft bloke
    replied
    Originally posted by TheFaQQer
    Assuming an Oracle 9i or above database, can you not pull the report extract in as an external table, and just select against it in SQL to get the data that you want?

    Brill idea - many thanks - sadly, they won't give me that kind of access to the Oracle DB and I don't have one of my own I can use (maybe I should though).

    I do, however, have SSIS on a VM I can use so I'll give that a go later - last time I used SSIS it was happy with fixed length or csv (or similar) but wanted a record per line - however I haven't used it for while so I'll look again.

    Thanks for all the replies.



  • TheFaQQer
    replied
    Assuming an Oracle 9i or above database, can you not pull the report extract in as an external table, and just select against it in SQL to get the data that you want?



  • oracleslave
    replied
    Offshore it and throw resources at the problem.



  • bored
    replied
    Perhaps, write a Perl or Python script to do that?



  • Spacecadet
    replied
    SQL Server Integration Services (SSIS) was born for this.

