I have a requirement to load a huge volume of records into multiple files, explained below:
(1) Suppose I have 1,000,000 records in my source, each with a row number.
(2) My target is a flat file holding 100,000 records per file, so I have to generate 10 target files of 100,000 records each.
(3) The number of target files will vary, and I have a script to check the count.
(4) When I run the Informatica workflow, it should run as many times as the number of target files to generate (e.g., 10 times).
(5) Each time the workflow runs, my Source Qualifier has to check the last row number processed in the previous run, and I have to write the last row number into my parameter file dynamically after each session load.
My questions are:
(1) How to update the same parameter file after each session automatically?
(2) How to implement the loop/counter while running the workflow?
Any early suggestions would be appreciated. Thanks in advance.
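One way to handle question (1) is a post-session shell script that rewrites the parameter file with the last row number processed. The sketch below is only illustrative, not a definitive solution: the file path, folder name, workflow name, session name, and the `$$LAST_ROWNUM` parameter are all placeholders you would replace with your own.

```shell
#!/bin/sh
# Hypothetical post-session helper: rewrite the Informatica parameter file
# so the next run starts after the last row processed. Folder, workflow,
# session, and parameter names below are assumptions, not from the post.

update_param_file() {
    param_file="$1"   # path to the parameter file
    last_row="$2"     # last row number processed in this run

    # Overwrite the parameter file with the session header and the
    # updated mapping parameter value.
    cat > "$param_file" <<EOF
[MyFolder.WF:wf_load_chunks.ST:s_m_load_chunks]
\$\$LAST_ROWNUM=$last_row
EOF
}

# Example: record that rows up to 200000 have been loaded.
update_param_file "/tmp/wf_load_chunks.par" 200000
```

The session can call this script as a post-session success command, passing the row count captured from the session (for example via a workflow variable or a control table).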
To load 100,000 records into each file from a source containing 1,000,000 records, you can use a Transaction Control transformation. By committing after every 100,000 rows, you can create a separate file per commit. With this approach there is no need to run the mapping 10 times; a single run can create all ten files.
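As a sketch of the idea (the port name `ROWNUM_PORT` is illustrative, not from the original post), the Transaction Control condition could commit after every 100,000th row:

```
IIF(MOD(ROWNUM_PORT, 100000) = 0, TC_COMMIT_AFTER, TC_CONTINUE_TRANSACTION)
```

With the flat-file target configured to create a new output file on each commit, each commit boundary then closes one 100,000-record file and starts the next.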
The following mapping on Informatica Marketplace Website should meet your requirement. You will also find over 100 mappings and many free tools and utilities on the same website.
Iteration can be achieved by using the pmcmd command (for the Informatica server on UNIX). We can have shell script code in the post-session command, which would check whether the required number of iterations has been performed.
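The loop from question (2) could be driven by a wrapper script like the one below. This is a minimal sketch under stated assumptions: the integration service, domain, user, folder, and workflow names in the commented pmcmd call are placeholders, and the run count is derived from the fixed record counts given in the question rather than from your file-count script.

```shell
#!/bin/sh
# Hypothetical wrapper: start the workflow once per target file.
# Assumes 1,000,000 source rows split into 100,000-row files.

TOTAL_ROWS=1000000
ROWS_PER_FILE=100000
# Ceiling division so a partial final chunk still gets a run.
RUNS=$(( (TOTAL_ROWS + ROWS_PER_FILE - 1) / ROWS_PER_FILE ))

i=1
while [ "$i" -le "$RUNS" ]; do
    echo "Starting run $i of $RUNS"
    # Placeholder pmcmd call -- fill in your own connection details:
    # pmcmd startworkflow -sv IntService -d Domain -u user -p pass \
    #     -f MyFolder -wait wf_load_chunks
    i=$(( i + 1 ))
done
```

Using `-wait` makes pmcmd block until the workflow completes, so each run sees the parameter file as updated by the previous session before it starts.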