5329 Views 5 Replies Latest reply: Nov 14, 2011 10:33 AM by Ali Syed
Kubilay Tsil Kara Novice, 27 posts since May 24, 2011

Nov 7, 2011 9:35 AM

Update a Salesforce object with data from itself using Informatica Cloud

Hi, I just wanted to write a post showing how you can simulate an 'update of a table from another table' using Informatica Cloud.


This is a solution I had to come up with when I needed to update/delete a Salesforce object with data from itself.


It is like a self-join: you set a Salesforce object as the source and the same Salesforce object as the target. It is quite cool, because this way you can correct or delete lots of bad data in your existing Salesforce objects. I have even written an entry about this on my blog, which goes something like this.



In brief, this is how a self-join DML goes using the Salesforce Account standard object:


1. Create a Source on the Account object






2. Create a Target for the same Account object in the same org. Just choose the same object as the target.





3. In the Field Mapping, set up a kind of 'self-join': see the criteria { Id = Id }




4. Use the DECODE function in the same Field Mapping to manipulate data




The Informatica Cloud string function DECODE does the trick. Observe how it changes occurrences of the string 'Direct Employer' to 'Employer' in the Type field of the standard Salesforce Account object, which is the same object it reads from. This is an update example: replacing one word with another.


decode(Type,'Direct Employer','Employer')
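The behaviour of DECODE can be sketched in Python (a rough illustrative model, not Informatica's implementation). One thing worth noting: with no trailing default argument, DECODE returns NULL for values that match none of the search strings, so if you want unmatched rows left alone you may want the field itself as the final argument, e.g. decode(Type,'Direct Employer','Employer',Type).

```python
def decode(value, *args):
    """A rough Python model of Informatica's DECODE function.

    Arguments come in (search, result) pairs; an optional trailing
    argument is the default returned when nothing matches. With no
    default, an unmatched value yields None (NULL in Informatica).
    """
    if len(args) % 2:                       # odd count: last arg is the default
        pairs, default = args[:-1], args[-1]
    else:
        pairs, default = args, None
    for search, result in zip(pairs[::2], pairs[1::2]):
        if value == search:
            return result
    return default
```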


When you save and run this task, it reads from the Account object and writes the changed data back to the same Account object, all within one Informatica Cloud Data Synchronisation 'self-join' task! No file downloads, no Excel, or whatever. Just one expression. Given the plethora of Informatica Cloud string, arithmetic, date and other functions available, just imagine what you can do to your data!
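Conceptually, the whole task boils down to the following (a Python sketch with made-up records standing in for Account rows; the real task runs inside Informatica Cloud against the same SFDC connection for source and target):

```python
def run_self_join_task(accounts):
    """Read the Account rows, apply the DECODE-style mapping, write them back.

    `accounts` stands in for the Salesforce Account object; source and
    target are the same collection, matched on Id = Id.
    """
    for acc in accounts:                      # source: Account
        if acc["Type"] == "Direct Employer":  # the DECODE mapping
            acc["Type"] = "Employer"
    return accounts                           # target: the same Account object

# Hypothetical sample rows for illustration
accounts = [{"Id": "001A", "Type": "Direct Employer"},
            {"Id": "001B", "Type": "Partner"}]
run_self_join_task(accounts)
```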


I think this is a wonderfully simple example of how Informatica Cloud can be used to cleanse data.

  • Ali Syed Novice, 35 posts since May 2, 2010

    This is a very cool concept, using the SFDC object as both Source and Target. On one of my projects I tried it and it worked fine initially, but then lately I started getting SOQL timeout errors: when Infa Cloud sends the SELECT statement to the Source object, the query sits in a wait state and times out after 30 mins. I wasn't sure if it was a contention issue or something else, because it was a nightly job that ran when things were quiet.


    Well, anyway, did you happen to run into those issues?




      • Ali Syed Novice, 35 posts since May 2, 2010

        When using IC to load data to SFDC, it uses row-level locking and can process up to 200 records at a time, so each batch of 200 records is one API call.
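        That batching can be sketched as follows (an illustrative Python model of splitting a load into 200-record API calls, not Informatica's actual code):

```python
def batches(records, size=200):
    """Yield records in batches of at most `size`; each batch is one SFDC API call."""
    for i in range(0, len(records), size):
        yield records[i:i + size]

# 450 records would cost three API calls: 200 + 200 + 50
calls = list(batches(list(range(450))))
```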


        Also, since our Source is in the cloud and our Target is in the cloud, but we are connecting through an Infa agent on our own machines, it is a slow process: we are bringing the data down from a remote server to a local server and then pushing it to another remote server, and it has to cycle through that for every 200 records.

        And if we have rules/workflows written on the SFDC fields, the load is much slower still.


        To avoid this, I've updated the task and created a cache table that is a replica of the SFDC object. Using a DRS task, it brings the data down incrementally, and the updates are then processed back to SFDC from my table. Updating the cache table is Task 1 in the TaskFlow, and updating the SFDC object is Task 2 in that TaskFlow.
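        The two-task flow can be sketched like this (a hedged Python illustration with hypothetical row and function names; the real flow uses a DRS task and a Data Synchronisation task, not this code):

```python
def task1_refresh_cache(cache, source_rows, last_sync):
    """Task 1 (DRS-style): copy only rows modified since the last sync into the cache."""
    for row in source_rows:
        if row["LastModifiedDate"] > last_sync:
            cache[row["Id"]] = dict(row)
    return cache

def task2_build_updates(cache, transform):
    """Task 2: compute the update set from the local cache, not by querying SFDC again."""
    return [transform(dict(row)) for row in cache.values()]

# Hypothetical sample data: only the recently modified row lands in the cache
cache = {}
rows = [{"Id": "001A", "Type": "Direct Employer", "LastModifiedDate": "2011-11-07"},
        {"Id": "001B", "Type": "Partner", "LastModifiedDate": "2011-10-01"}]
task1_refresh_cache(cache, rows, "2011-11-01")
updates = task2_build_updates(
    cache,
    lambda r: {**r, "Type": "Employer"} if r["Type"] == "Direct Employer" else r,
)
```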


        I've tested it and seen a huge performance improvement.

          • Ali Syed Novice, 35 posts since May 2, 2010

            Hi Kubilay,


            Thanks, I will check out CopyForce for some POCs.


            Also, about the Bulk API: I hope they come up with this in the Winter release.

            It will definitely be a big enhancement, especially when you are trying to make current data available in near real time.


            Also, if they could add another expression and a filter/router after the lookup, that would be a big improvement too, adding more flexibility and reducing the number of individual tasks needed to complete a single load based on some criteria.



