I have created a mapping that ends with a SQL transformation, which executes a very simple SQL query (selecting only one column), and I created the target table. When I try to load the data from the SQL transformation into the target table, it takes too long. Could anybody help me with how I can overcome this problem?
Here is how I used it.
It's a SQL transformation with the Dynamic Query option (the query is built from a table, so it's a very simple query that returns about 3.1 million records). When I try to load them into the target table, it takes too much time (maybe more than 8 hrs). Please help me figure out how to run this quickly. BTW, there is no SQL override for this.
I have some questions about your case.
1.) Do you want to get one column from the SQL transformation (like the function of a lookup)?
2.) Do you mean the SQL transformation query returns about 3.1 million records?
3.) How many source rows do you have in your ETL?
4.) Regarding this sentence,
**when i try to load them into Target table is taking too much time (may be more than 8 hrs).**
Do you mean the ETL run itself takes 8 hrs? Or is it a script or something else?
Thanks for your quick answer, Felix Ho.
1. No lookups; I need to dump all the data from the SQL transformation into the target tables.
2. The SQL transformation has a query (it's table-driven: the queries are read from a table into the SQL transformation, and the parameters are replaced to build the correct SQL). The output of these queries is written to the target tables.
3. Source and target should have the same number of records; there is no filter.
4. I have multiple transformations before this SQL transformation doing various things, but those don't take much time. Once it reaches the SQL transformation, the query is built, then executed, and the target tables are populated.
I have used a SQL transformation in a mapping. I used it because I am deriving the column name dynamically.
The SQL statement is like:
Select parent_id, Child_id
where fieldid = ~variable~ (I need to use a variable because this value differs across dev/test/prod)
If I run this query directly on the database, it completes in 5 seconds. But loading the data through the ETL takes 2 hrs; the throughput on the target table is 50 rows.
I also created another mapping using the same query with a hardcoded value. It runs in 2 min with a throughput of 2500 rows on the same target table.
I am pretty sure the performance is slow because of the SQL transformation, but I'm not sure why, or how to improve it.
Can you help here?
A SQL transformation is bound to be slow for heavy loads. For each incoming row, a SQL session is initiated and data is exchanged over the network between the database and the Integration Service. This results in latency and poor throughput.
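To see why the per-row round trips hurt, here is a minimal sketch (not Informatica itself): an in-memory SQLite database stands in for the target DB, and the table and column names are invented for illustration. It times 20,000 single-row queries (what a SQL transformation effectively does per input row) against one set-based query returning the same rows.

```python
import sqlite3
import time

# In-memory SQLite stands in for the real database; table name is made up.
conn = sqlite3.connect(":memory:")
conn.execute(
    "CREATE TABLE rel (fieldid INTEGER PRIMARY KEY, parent_id INTEGER, child_id INTEGER)")
conn.executemany(
    "INSERT INTO rel VALUES (?, ?, ?)",
    [(i, i * 2, i * 2 + 1) for i in range(20000)])

# Row-by-row: one query per incoming row, as a SQL transformation does.
t0 = time.perf_counter()
per_row = [conn.execute(
    "SELECT parent_id, child_id FROM rel WHERE fieldid = ?", (i,)).fetchone()
    for i in range(20000)]
row_secs = time.perf_counter() - t0

# Set-based: a single query returns the whole result set at once,
# as a Source Qualifier with a SQL override would.
t0 = time.perf_counter()
batched = conn.execute(
    "SELECT parent_id, child_id FROM rel ORDER BY fieldid").fetchall()
set_secs = time.perf_counter() - t0

print(f"row-by-row: {row_secs:.4f}s  set-based: {set_secs:.4f}s")
assert per_row == batched  # same data, very different cost
```

Even with no network in the picture, the per-statement overhead dominates the row-by-row version; over a real network link, each of those round trips also pays latency, which is why throughput collapses to tens of rows per second.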
Why not define your variable as a mapping parameter and use it in a lookup or Source Qualifier? You can initialize its value in a parameter file and maintain a separate file for each environment.
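As a sketch of what that could look like (the folder, workflow, session, and table names here are placeholders, not from your mapping), the parameter file for one environment might be:

```ini
; one parameter file per environment (dev/test/prod)
[MyFolder.WF:wf_load_rel.ST:s_m_load_rel]
$$FIELD_ID=101
```

and the Source Qualifier's SQL override would reference the mapping parameter directly:

```sql
SELECT parent_id, child_id
FROM   my_table            -- placeholder table name
WHERE  fieldid = $$FIELD_ID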
You can increase the DTM buffer size so the SQL transformation handles more records per pass, and you can also use partitioning. We don't have partitioning, as our license comes bundled with OBIEE (kind of free software), so I bypassed a few things to make this SQL transformation run quicker.