Ideas

score 10
Active 2 votes

Hi, I'm not sure if someone else has asked for something similar. Whenever I go to delete a Data Synchronization task, I would expect Informatica Cloud to show a message identifying which task is about to be deleted. This extra information would be very useful to customers, since once the task is deleted there is no way to recover it unless a copy was made. Refer to the screenshot below.   -Neeraj Vasudeva


score 0
Already Offered 0 votes
Created on Apr 26, 2015 7:10 AM by Ramesh Allada - Last Modified: Apr 27, 2015 11:47 AM

Is there a plan to introduce variables in expression transformations for Cloud mappings?

score 20
Active 4 votes
Created on Apr 6, 2015 10:09 AM by Saranjeet Singh - Last Modified: Apr 6, 2015 10:09 AM

It would be nice to be able to access the success and error row counts of a task as a parameter, just as we can access $LastRunTime. If similar parameters for success and error row counts were available, we could use them in the postprocessing session to write these counts to an audit table.
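
If such parameters existed, a postprocessing step could write them to an audit table. A minimal sketch in Python, with hypothetical $SuccessRows/$ErrorRows values (names and values are invented for illustration, and SQLite stands in for the audit database):

```python
import sqlite3

# Hypothetical values that the requested parameters (say $SuccessRows and
# $ErrorRows, by analogy with $LastRunTime) would supply at run time.
task_name = "dss_load_accounts"
success_rows = 1200
error_rows = 3

# The postprocessing step writes the counts to an audit table.
conn = sqlite3.connect(":memory:")
conn.execute(
    "CREATE TABLE IF NOT EXISTS task_audit ("
    "task_name TEXT, success_rows INTEGER, error_rows INTEGER, "
    "run_at TEXT DEFAULT CURRENT_TIMESTAMP)"
)
conn.execute(
    "INSERT INTO task_audit (task_name, success_rows, error_rows) "
    "VALUES (?, ?, ?)",
    (task_name, success_rows, error_rows),
)
conn.commit()
print(conn.execute(
    "SELECT task_name, success_rows, error_rows FROM task_audit"
).fetchone())  # ('dss_load_accounts', 1200, 3)
```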

score 20
Active 4 votes
Created on Feb 24, 2015 2:25 PM by Joshua Vaughn - Last Modified: Feb 24, 2015 2:26 PM

From a discussion started by Brian Goodyear: Looking for the ability to send an email notification when the agent turns to inactive status. In some cases we can't wait for a data sync task to fail; we need to know when the agent goes inactive. We have an alert set on our servers to notify us when the Windows service goes down, but sometimes the agent is in an inactive state while the Windows service is still running.

score 15
Active 3 votes
Created on Feb 17, 2015 7:15 AM by Paul King - Last Modified: Feb 17, 2015 7:20 AM

Hello, the following is a scenario that does not seem achievable in the current release of Mapping Designer. We maintain a job control table (a table that stores our delta ranges for different workflows) in an Oracle database. When creating a Source in the IC Mapping Designer, we would like to dynamically set the Source's filter condition based upon a record present in the job control table. For example, if the Source is related to Salesforce and we are interested in sourcing recently updated Accou …


score 35
Partially Implemented 7 votes
Created on Mar 27, 2012 2:45 PM by Joshua Vaughn - Last Modified: Jan 9, 2015 3:54 PM

From Greg Roberts: As products evolve it is necessary to update our tasks. After updating a task in the Sandbox environment I need to promote it to Production. If the task already exists in Production, an error occurs, and in order to promote the task I have to first delete the existing one. Greg would like the migrate objects functionality to give him the option to overwrite existing objects when naming conflicts arise.   See this discussion thread for more info: https://community.informati …


score 15
Active 3 votes
Created on Nov 27, 2014 4:37 AM by Pim Uijttewaal - Last Modified: Nov 27, 2014 4:37 AM

If I have different Data Synchronization apps in one task flow and one is not able to run because its Secure Agent is inactive, the following ones will also not be triggered. It should at least be possible to control this via a setting in the task flow. This is annoying when I have many tasks in one task flow that use different Secure Agents.  Thanks

score 5
Active 1 vote

A data synchronization service (DSS) is currently designed so that the preprocessing command is processed after the parameter file has been read. If I update the parameter file in a preprocessing command, the updated values are not incorporated in the filter values for the DSS. A dummy task before the DSS is currently necessary to update the parameter file.   I would like to have the preprocessing command executed before reading the parameter file, to update the filter values just in one f …
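
The workaround described above can be sketched as a small script that a dummy task (or, ideally, the preprocessing command itself) would run to refresh the parameter file before the DSS reads it. The file location and the $LastProcessedId parameter name are assumptions for illustration:

```python
import os
import tempfile

# Assumed location and parameter name -- both invented for illustration.
param_file = os.path.join(tempfile.gettempdir(), "dss_params.txt")

def update_parameter_file(path, **params):
    """Rewrite the parameter file with fresh name=value pairs."""
    with open(path, "w") as f:
        for name, value in params.items():
            f.write(f"${name}={value}\n")

# Refresh the filter value the DSS should pick up on its next read.
update_parameter_file(param_file, LastProcessedId=42817)
print(open(param_file).read())  # $LastProcessedId=42817
```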


score 30
Delivered 5 votes
Created on Jun 29, 2011 10:11 PM by Dan Byrne - Last Modified: Sep 29, 2014 11:18 AM

It would be useful to be able to specify whether or not to execute a lookup, based on an expression over the currently available data. For example, if an input field is NULL, I do not want the lookup to execute, as it will find multiple rows in the lookup (since it is a nullable field). So instead of returning an error or multiple irrelevant rows, there could be a condition/expression to determine whether to utilise the lookup (or whether or not to utilise the lookup result).
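
The requested guard can be sketched in Python, with a plain dictionary standing in for the lookup source (table contents and field names are invented):

```python
# A dictionary stands in for the lookup source; contents are invented.
region_lookup = {"EMEA": "Europe, Middle East & Africa", "APAC": "Asia Pacific"}

def conditional_lookup(key):
    """Only execute the lookup when the (nullable) input field has a value."""
    if key is None:
        return None  # guard condition: skip the lookup entirely
    return region_lookup.get(key)

print(conditional_lookup("EMEA"))  # Europe, Middle East & Africa
print(conditional_lookup(None))    # None
```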

score 70
Coming Soon 6 votes
Created on Mar 7, 2011 4:24 AM by Steven Mitchell - Last Modified: Sep 29, 2014 11:09 AM

Please could we include the option of selecting a secure agent when creating a Salesforce connection. When a connection is executed from a secure agent in a job, a Salesforce connection generally cannot be made due to firewall/proxy configuration problems. The test option is good for testing credentials, but unhelpful when users are attempting to connect to Salesforce from their secure agents locally.   This problem manifests itself in general confusion on the user's part as wh …


score 45
Coming Soon 9 votes
Created on Apr 1, 2013 7:52 PM by John Worth - Last Modified: Sep 29, 2014 10:54 AM

I would like to see an agent clustering option. It does not need to be overly slick and can run on top of the existing secure agent implementation. I am just looking to be able to define a simple cluster that would support high availability and automate the manual load balancing that I am currently forced to implement.   Step 1: Define a cluster secure agent: CLUS01. Step 2: Assign servers to the cluster secure agent: SERV01, SERV02, SERV03. Step 3: Assign connections to run on the cluster agent. Step …
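
The round-robin load balancing such a cluster agent would automate can be sketched as follows, reusing the CLUS01/SERV01-03 names from the steps above:

```python
from itertools import cycle

# A "cluster" secure agent backed by three member servers (Steps 1-2).
cluster = {"CLUS01": ["SERV01", "SERV02", "SERV03"]}
assignments = cycle(cluster["CLUS01"])

def dispatch(task):
    """Round-robin a task onto the next server in the cluster (Step 3)."""
    return (task, next(assignments))

result = [dispatch(t) for t in ("task_a", "task_b", "task_c", "task_d")]
print(result)
# [('task_a', 'SERV01'), ('task_b', 'SERV02'),
#  ('task_c', 'SERV03'), ('task_d', 'SERV01')]
```

A real implementation would also need health checks so that work skips inactive members, which is the high-availability half of the idea.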


score 10
Active 2 votes
Created on Aug 21, 2014 10:32 AM by Chris Lundy - Last Modified: Aug 21, 2014 10:32 AM

When you navigate to the connectors view, can you expose the dependencies, i.e. the jobs/entities that are using each connector? This is valuable because sometimes you set up connectors and later want to delete POC or other test connectors, but don't quite know if you can until you hit delete and the message shows the dependencies.   Can you please expose this on the default connector view?

score 20
Delivered 4 votes
Created on May 13, 2011 9:48 AM by Karl Roembke - Last Modified: Aug 13, 2014 11:23 AM

One of the things we really need is some kind of external/federated security option. Since Informatica Cloud exists outside of my company's control, it is extremely important to us that we can control who has access to all computing resources, irrespective of where they reside. The current situation of creating multiple external logins for users to gain access to Informatica cloud presents a significant risk, as the normal on-boarding and off-boarding processes can't be used. When an employee or …


score 35
Delivered 7 votes
Created on Apr 12, 2013 10:59 AM by Joshua Vaughn - Last Modified: Aug 13, 2014 11:19 AM

Currently Informatica Cloud task flows have a "Stop on Error" option for each task in the flow. If this option is checked for a specific task, the task flow will stop running if that task results in a Failed status. However, there is no way to force the task flow to stop running if a task results in a Warning status.   Suggestion: Rename the "Stop on Error" option to "Stop on Failure" and add an additional "Stop on Warning" option for each task.   For additional details, see this community dis …


score 25
Active 5 votes
Created on Jun 21, 2014 10:04 AM by Toye Begbaaji - Last Modified: Jun 21, 2014 10:04 AM

In Informatica Cloud, it's too easy to forget how many and which Tasks/Taskflows have one's email address included in the notification list. To make matters worse, the email notification list is included during cloning and/or migration. This means potentially hundreds of random notifications from various Tasks/Taskflows and various Informatica Cloud orgs bombarding one's mailbox, and in some cases the mailboxes of past employees/contractors. The only option (that I know of) is to visit each Task/Ta …


score 10
Already Offered 2 votes
Created on Feb 19, 2013 7:47 AM by Scott Chirls - Last Modified: May 27, 2014 11:11 AM

It would be useful to be able to undelete items from the Salesforce recycle bin using a task.  We would be able to undelete and reassign tasks that were incorrectly deleted by users in Salesforce, for instance.

score 15
Delivered 3 votes
Created on Feb 20, 2012 7:38 PM by Paul King - Last Modified: May 27, 2014 10:35 AM

Our integrations involve multiple, disparate data sources, and we have a small number of SFDC users/licenses at this time. As a result, we throttle our integrations to ensure we do not exceed our SFDC storage limits. For example, we need to retrieve a subset of customers from source A, and use that subset of customers to limit the data retrieved from source B. Since Informatica Cloud lacks the PowerCenter equivalent of a 'Joiner', we are forced to make use of DB links and/or replicate disparate da …


score 10
Active 2 votes
Created on May 27, 2014 7:56 AM by Shawn Clark - Last Modified: May 27, 2014 7:56 AM

Rather than referencing scripts that are stored on our local servers, it would make more sense to have a place within Informatica Cloud to store our pre/post-processing scripts. This would make migrations and maintenance much easier, and also make Informatica Cloud feel more like a complete solution.   Thanks! Shawn

score 10
Active 2 votes
Created on May 27, 2014 7:45 AM by Shawn Clark - Last Modified: May 27, 2014 7:45 AM

I often need to switch my replicated task to point to various Salesforce.com sandboxes. However, when I change the source to point to SFDC - DEV instead of SFDC - QA or any other sandbox, the referenced object(s) are dropped and I lose all my field exclusions and relationships between objects.   In addition, there is no way to quickly copy/paste all field exclusions, so I have to go through the drop-down and re-select what is sometimes a vast number of fields. This is usually done by having my Informat …


score 35
Active 7 votes
Created on Apr 28, 2014 8:28 AM by Shawn Clark - Last Modified: Apr 28, 2014 8:28 AM

Currently, the only way a user can view the pre/post-processing commands is if they have access to edit a task. We do not want to grant this permission to developers in QA and Production, but would like them to be able to view which scripts are being referenced.   Additionally, this would make it much easier on administrators as well, not having to go into the wizard and hit step 6 to see if any pre/post-processing commands are being run.

score 5
Active 1 vote
Created on Mar 20, 2014 11:13 AM by Chris Lundy - Last Modified: Mar 20, 2014 11:13 AM

When first installing the Informatica agent, the DTM/AgentConnectionTimeout is set to 5. When running SFTP jobs, the agent opens the connection and the Java class fails, because the timeout is set to expire too quickly. By comparison, the SalesforceTimeout is set to 300. This idea requests that AgentConnectionTimeout be set to 300 by default when a new agent is installed. Once extended to 300, SFTP functioned properly and didn't fail.   DTMINFOAgentConnectionTimeout3 …
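
Until the default changes, the fix can be scripted at install time. A sketch that bumps the value in a key=value properties file (the real agent configuration format and file name may differ):

```python
import os
import tempfile

# A stand-in properties file with the defaults described above.
conf = os.path.join(tempfile.gettempdir(), "agent.properties")
with open(conf, "w") as f:
    f.write("AgentConnectionTimeout=5\nSalesforceTimeout=300\n")

def set_property(path, key, value):
    """Replace key=value in a properties file, leaving other keys alone."""
    with open(path) as f:
        lines = f.readlines()
    with open(path, "w") as f:
        for line in lines:
            name = line.partition("=")[0]
            f.write(f"{key}={value}\n" if name == key else line)

set_property(conf, "AgentConnectionTimeout", 300)
print(open(conf).read())
# AgentConnectionTimeout=300
# SalesforceTimeout=300
```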


score 5
Active 1 vote
Created on Mar 18, 2014 10:10 PM by Dan Byrne - Last Modified: Mar 18, 2014 10:10 PM

It would be great to be able to bulk-load data with a MS SQL Server destination (e.g. using BULK INSERT or OPENROWSET); see the example reference from MS: http://technet.microsoft.com/en-us/library/ms175915.aspx   This would facilitate quicker loads when there are lots of records to be moved from one system to another, rather than individual inserts into SQL.
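
For illustration, the T-SQL statement from the linked Microsoft reference could be generated like this (table and file names are examples; in practice the statement would be executed against SQL Server, e.g. via an ODBC connection):

```python
def bulk_insert_sql(table, data_file, field_terminator=",", row_terminator="\\n"):
    """Build a T-SQL BULK INSERT statement (see the MS reference above)."""
    return (
        f"BULK INSERT {table} FROM '{data_file}' "
        f"WITH (FIELDTERMINATOR = '{field_terminator}', "
        f"ROWTERMINATOR = '{row_terminator}')"
    )

# Example table and file path -- invented for illustration.
stmt = bulk_insert_sql("dbo.Customers", r"C:\loads\customers.csv")
print(stmt)
```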

score 5
Active 1 vote
Created on Feb 12, 2014 9:28 AM by John Worth - Last Modified: Feb 12, 2014 9:28 AM

When I create a user group and want to provide it access to a limited number of objects, I currently have to manually assign this permission to each object. I am using naming conventions that would allow me to provide a filter on the user group object, making this a one-time assignment covering all current and future objects.   E.g. I only want my LATAM users to be able to run LATAM task flows, and read LATAM DSS and DRS. All LATAM objects start with the string LAT. So, I would like to be able …
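
The prefix-based filter amounts to a simple predicate over object names. A sketch using the LAT prefix from the example (the object names are invented):

```python
# Invented object names; only the LAT prefix comes from the idea above.
objects = ["LAT_Orders_DSS", "LAT_Daily_Flow", "EMEA_Orders_DSS", "LAT_Customers_DRS"]

def objects_for_group(prefix, all_objects):
    """One-time rule: the group sees every object whose name starts with prefix."""
    return [name for name in all_objects if name.startswith(prefix)]

print(objects_for_group("LAT", objects))
# ['LAT_Orders_DSS', 'LAT_Daily_Flow', 'LAT_Customers_DRS']
```

Because the rule is evaluated against whatever objects exist, newly created LAT-prefixed objects would be covered automatically, which is the point of the request.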


score 0
Delivered 0 votes
Created on Feb 12, 2014 2:32 AM by Pim Uijttewaal - Last Modified: Feb 12, 2014 9:01 AM

It would be great if, in Informatica Cloud, I could restrict users from creating, changing, editing, or deleting certain jobs. We are running Data Integration across different countries. I would like to authorize certain users to only edit/run certain jobs, without being able to access other jobs.

score 10
Active 2 votes
Created on Dec 9, 2013 9:00 AM by Tom Engelhardt - Last Modified: Dec 9, 2013 9:00 AM

It would be nice to see some indication of overall progress when adding objects to an object migration. Right now, when you select an object such as a large task flow with many individual data sync tasks and you click "Ok," the screen does not advance, nor is there any indication in the UI that the objects are being added to the migration queue. The screen hangs for a bit until, finally, all objects are added to the queue.   A suggestion would be that, once the user chooses objects and clicks "Ok …


score 15
Active 3 votes
Created on Nov 13, 2013 2:26 PM by Saranjeet Singh - Last Modified: Nov 13, 2013 2:26 PM

It would be great if we could right-click on hyperlinks in Informatica Cloud and open them in another tab. Right now, in the Chrome browser, if you right-click on a hyperlink and select 'Open in new tab', a blank window opens up.

score 20
Active 4 votes
Created on Oct 27, 2013 5:19 PM by Nandha Hareesh Manne - Last Modified: Oct 27, 2013 5:19 PM

Hi, it would be really useful if the Task ID were visible in the UI, rather than having to fetch it from the URL, especially for users of this tool: https://community.informatica.com/docs/DOC-2908   Thanks, Nandha Hareesh

score 35
Active 7 votes

Since Teradata is one of the major database players, why don't you create an adapter for Teradata to Cloud in the marketplace? As of now, ODBC is the only viable route to pump data out to the cloud from on-premise, and it takes too much of a performance hit.

score 25
Active 5 votes
Created on Sep 16, 2013 11:07 PM by Patrick Goessens - Last Modified: Sep 16, 2013 11:37 PM

Currently the Activity Monitor's last column is simply called "Row Count", and essentially what it shows is the number of rows read from the source. Every time I show someone the progress, they think that it equals what has already been loaded into the target... but in the case of a Cloud target (in our case SFDC) that is never true (as it is inherently slower to write than to read). Thus I would find it very useful to have an extra column there showing how many records have actually been load …


score 80
Active 16 votes
Created on Sep 16, 2013 6:21 PM by Patrick Goessens - Last Modified: Sep 16, 2013 11:09 PM

We're in the process of going live with SFDC, and we're using Informatica Cloud (IC) to load the data into SFDC. We've gone through various testing stages, and within testing stages we've gone through iterations.   Now, I was asked yesterday by the PM what the loading conditions were for a particular DSS 2½ weeks ago. The DSS itself has undergone changes, and I realised that I couldn't tell him what the config was back then.   Thus the question arose: what are the options to come up wit …

