Ideas

score 5
Active 1 vote
Created on Mar 20, 2014 11:13 AM by Chris Lundy - Last Modified: Mar 20, 2014 11:13 AM

When first installing the Informatica Agent, the DTM/AgentConnectionTimeout is set to 5.  When running SFTP jobs, the agent opens the connection and the Java class fails because the timeout expires too quickly.  By comparison, the SalesforceTimeout is set to 300.  This idea requests that AgentConnectionTimeout default to 300 when a new agent is installed.  Once extended to 300, SFTP functioned properly and didn't fail.   DTMINFOAgentConnectionTimeout3 …



score 5
Active 1 vote
Created on Mar 18, 2014 10:10 PM by Dan Byrne - Last Modified: Mar 18, 2014 10:10 PM

It would be great to be able to bulk-load data into a MS SQL Server destination (e.g. using BULK INSERT or OPENROWSET); see the example reference below from MS: http://technet.microsoft.com/en-us/library/ms175915.aspx   This would facilitate quicker loads when there are lots of records to be moved from one system to another, rather than individual inserts into SQL.
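For reference, the set-based load the poster is describing can be sketched as below; the helper, table name, and file path are hypothetical, and a connector would presumably execute the generated T-SQL over its SQL Server connection:

```python
def build_bulk_insert(table: str, data_file: str, field_terminator: str = ",") -> str:
    """Build a T-SQL BULK INSERT statement (illustrative only; names are made up)."""
    return (
        f"BULK INSERT {table} "
        f"FROM '{data_file}' "
        f"WITH (FIELDTERMINATOR = '{field_terminator}', ROWTERMINATOR = '\\n')"
    )

# One statement loads the whole staged file, instead of one INSERT per record.
print(build_bulk_insert("dbo.Accounts", r"C:\staging\accounts.csv"))
```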

score 5
Active 1 vote
Created on Feb 12, 2014 9:28 AM by John Worth - Last Modified: Feb 12, 2014 9:28 AM

When I create a user group and want to give it access to a limited number of objects, I currently have to assign the permission to each object manually.  I am using naming conventions that would allow me to put a filter on the user group object, making this a one-time assignment for all current and future objects.   E.g., I only want my LATAM users to be able to run LATAM task flows, and read LATAM DSS and DRS.  All LATAM objects start with the string LAT.  So, I would like to be able …


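The prefix-based filtering the poster describes could work roughly like this (the helper and object names are hypothetical, following the post's LAT convention):

```python
# Hypothetical sketch: grant a group access to every object matching a name
# prefix, instead of assigning permissions object by object.
def objects_for_group(all_objects: list[str], prefix: str) -> list[str]:
    return [name for name in all_objects if name.startswith(prefix)]

objects = ["LAT_Orders_DSS", "LAT_Billing_DRS", "EMEA_Orders_DSS"]
print(objects_for_group(objects, "LAT"))  # ['LAT_Orders_DSS', 'LAT_Billing_DRS']
```

Because the filter is evaluated against the current object list, newly created LAT-prefixed objects would be covered without any further assignment.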

score 0
Delivered 0 votes
Created on Feb 12, 2014 2:32 AM by Pim Uijttewaal - Last Modified: Feb 12, 2014 9:01 AM

It would be great if Informatica Cloud let me restrict users from creating, changing, or deleting certain jobs. We run Data Integration across different countries, and I would like to authorize certain users to edit/run only certain jobs, without being able to access other jobs.

score 10
Active 2 votes
Created on Dec 9, 2013 9:00 AM by Tom Engelhardt - Last Modified: Dec 9, 2013 9:00 AM

It would be nice to see some indication of overall progress when adding objects to an object migration. Right now, when you select an object such as a large task flow with many individual data sync tasks and you click "Ok," the screen does not advance, nor is there any indication in the UI that the objects are being added to the migration queue. The screen hangs for a bit until, finally, all objects get added to the queue.   A suggestion would be that, once the user chooses objects and clicks "Ok …



score 15
Active 3 votes
Created on Nov 13, 2013 2:26 PM by Saranjeet Singh - Last Modified: Nov 13, 2013 2:26 PM

It would be great if we could right-click hyperlinks in Informatica Cloud and open them in another tab. Right now, in the Chrome browser, if you right-click a hyperlink and select 'Open in new tab', a blank window opens up.

score 10
Active 2 votes
Created on Oct 27, 2013 5:19 PM by Nandha Hareesh Manne - Last Modified: Oct 27, 2013 5:19 PM

Hi, it would be really useful if the Task ID were visible in the UI, rather than having to fetch it from the URL, especially for users of this tool: https://community.informatica.com/docs/DOC-2908   Thanks, Nandha Hareesh

score 30
Active 6 votes

Since Teradata is one of the major database players, why don't you create a Teradata-to-cloud adapter in the marketplace? As of now, ODBC is the only viable route to pump data out to the cloud from on-premise, and it takes too much of a performance hit.

score 20
Active 4 votes
Created on Sep 16, 2013 11:07 PM by Patrick Goessens - Last Modified: Sep 16, 2013 11:37 PM

Currently the Activity Monitor's last column is simply called "Row Count", and essentially what it shows is the number of rows read from the source. Every time I show someone the progress, they think that it equals what is already loaded into the target... but in the case of a Cloud target (in our case SFDC) that is never true, as it is inherently slower to write than to read. Thus I would find it very useful to have an extra column in there, to show how many records have actually been load …



score 25
Active 5 votes
Created on Sep 16, 2013 6:21 PM by Patrick Goessens - Last Modified: Sep 16, 2013 11:09 PM

We're in the process of going live with SFDC, and we're using Informatica Cloud (IC) to load the data into SFDC. We've gone through various testing stages, and within testing stages we've gone through iterations.   Now, I got asked yesterday by the PM what the loading conditions were for a particular DSS 2½ weeks ago. The DSS itself has undergone changes, and then I realised that I couldn't tell him what the config was back then.   Thus the question arose, what are the options to come up wit …



score 20
Active 4 votes
Created on Sep 13, 2013 1:46 PM by Joshua Vaughn - Last Modified: Sep 13, 2013 1:46 PM

From a discussion started by Steve Ogden:   There is no way to view the log file for an actively running task without going to the agent machine and opening the file there. Are there plans to change this? If not, there should be. It can be misleading and frustrating to see a task sitting there, appearing to do nothing, when it's logging thousands of errors in the log file.

score 5
Delivered 1 vote
Created on Aug 28, 2013 1:59 PM by Steve Ogden - Last Modified: Sep 3, 2013 12:17 PM

Parameterization of task connections would be beneficial.

score 10
Active 2 votes
Created on Aug 28, 2013 1:49 PM by Steve Ogden - Last Modified: Aug 28, 2013 1:50 PM

It would be useful to have a means to run an IC task based on an event, like a watch file, or even some means to combine PowerCenter workflows with Cloud tasks or task flows run via the same enterprise scheduling tool. Maybe the best solution is some sort of pmcmd support for Cloud that integrates with PowerCenter.

score 25
Active 5 votes
Created on Jul 3, 2013 10:55 AM by David Rewerts - Last Modified: Jul 3, 2013 10:55 AM

Today we depend on the Informatica Cloud architecture being available 100% of the time for ETL processing.  This adds to the risk of a critical outage by requiring an entire infrastructure to be available for run-time needs.  In addition, this additional call for each ETL run hinders performance where it may not be necessary.   I propose caching the Informatica Cloud Workflows and Metadata in the Secure Agent.  If a Workflow or Metadata is updated in the cloud then that would automatic …



score 5
Active 1 vote
Created on Jun 17, 2013 3:00 PM by Steve Ogden - Last Modified: Jun 17, 2013 3:00 PM

There needs to be a better way to organize tasks and track who created them. Since there is no concept of a Folder, as in PowerCenter, it seems like the only way to organize tasks and restrict them to a specific set of users is by setting up a User Group, and then assigning permissions for tasks to these groups. And while I can set up a task view to see tasks created or updated by a single specific user, I cannot set up a view to see which group or groups own these tasks. So if I want to know …



score 10
Active 2 votes
Created on Jun 17, 2013 2:50 PM by Steve Ogden - Last Modified: Jun 17, 2013 2:50 PM

We have noticed an issue with SFDC replication that will hopefully be addressed in a (near) future release. Let's say today you create, and run, an incremental SFDC replication task to replicate one SFDC table. Tomorrow you decide to add another table to the replication task. The next time you run the task, in addition to replicating the new table, it has the side effect of truncating and completely reloading the other replicated table you loaded the previous day. Since replicating SFDC tables is …



score 5
Active 1 vote
Created on May 30, 2013 7:31 AM by Toye Begbaaji - Last Modified: May 30, 2013 7:31 AM

The ability to export and then reimport field mappings when working with Tasks would be huge. Here's why:   Whenever a connection or source/target is changed on a Synch Task, there is a high likelihood that the field mappings will need to be recreated. Oftentimes this includes complex mapping and/or lookup expressions. This can be a very tedious chore, especially for scenarios where a Salesforce sandbox has been used and needs to be migrated to a production Salesforce org. Or when flat file na …



score 25
Active 5 votes
Created on May 15, 2013 7:11 AM by Gary Jutras - Last Modified: May 15, 2013 7:11 AM

Can we get the organization name added to the subject or the body?  My company has six organizations, tens of thousands of tasks, thousands of taskflows, and hundreds of schedules, and it would make it easier to tell at a glance where an error is coming from.   Knowing whether it was launched manually, or from which schedule, would also be useful.

score 5
Active 1 vote
Created on May 13, 2013 1:28 PM by Joshua Vaughn - Last Modified: May 13, 2013 1:28 PM

From Greg Roberts:   I need to be able to replicate our Oracle CRM On-Demand data in a similar fashion to SalesForce Replication. Please add this feature to a near-future release. I have this thread with Informatica support on the topic: Informatica Cloud Replication of Oracle CRM On-Demand (OCOD).

score 10
Active 2 votes
Created on May 9, 2013 11:15 AM by Prasad Mani - Last Modified: May 9, 2013 11:19 AM

INFA Cloud DRS task fails when trying to replicate data to a SQL Server 2008 R2 target database for data values in a certain date range.   Example of a value that fails to replicate: "01/01/1700 00:00:00.000000000"   As per the documentation in the Infa Cloud user guide, the DRS converts the Salesforce datatype 'Date' to the SQL Server datatype 'Datetime'.   The 'Datetime' datatype does not support values older than the year 1753 in SQL Server.   For DRS, the conversion datatype has to be 'Datetim …


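The range problem is easy to reproduce; this sketch uses Python's `datetime` to show why a value from 1700 fails under SQL Server's legacy DATETIME but would fit DATETIME2:

```python
from datetime import datetime

# SQL Server's legacy DATETIME only supports 1753-01-01 through 9999-12-31;
# DATETIME2 extends the range back to 0001-01-01.
SQLSERVER_DATETIME_MIN = datetime(1753, 1, 1)

def fits_sqlserver_datetime(value: datetime) -> bool:
    return value >= SQLSERVER_DATETIME_MIN

print(fits_sqlserver_datetime(datetime(1700, 1, 1)))   # False: rejected by DATETIME
print(fits_sqlserver_datetime(datetime(1980, 6, 15)))  # True
```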

score 5
Active 1 vote
Created on Apr 24, 2013 8:11 AM by David Cheung - Last Modified: Apr 24, 2013 9:41 AM

We have an hourly scheduled ICS integration to a client's SFTP endpoint for source data. Periodically the endpoint becomes unreachable; when this occurs the task hangs waiting to connect. Any subsequent scheduled runs fail with the error "Data Synchronization task failed to run. Another instance of the task is currently executing". The hung task needs to be manually cancelled in order to resolve the issue. A timeout parameter is needed to allow self-aborting of the task so that the next …


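The self-abort behaviour being requested amounts to running the task under a hard deadline; a minimal Python sketch, where the subprocess stands in for the real task:

```python
import subprocess
import sys

# Illustrative sketch: give the task a hard deadline and abort it when the
# deadline passes, so the next scheduled run is not blocked by a hung instance.
def run_with_timeout(cmd: list[str], timeout_seconds: float) -> str:
    try:
        subprocess.run(cmd, timeout=timeout_seconds, check=True)
        return "completed"
    except subprocess.TimeoutExpired:
        return "aborted"  # subprocess.run kills the child when the timeout fires

# A task that would hang for 10 seconds is cut off after 1 second.
print(run_with_timeout([sys.executable, "-c", "import time; time.sleep(10)"], 1.0))
```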

score 10
Active 2 votes
Created on Apr 21, 2013 1:50 PM by Toye Begbaaji - Last Modified: Apr 21, 2013 1:50 PM

It would be helpful to be able to search for strings across Task metadata. Consider the following scenarios:   One might, for example, want to reuse a complex conditional statement used successfully in a Task several months ago; searching for 'IIF' expressions across all existing Tasks would make it easy to pinpoint that Task and copy the expression into the new one. Or one might need to update a hardcoded value used within one or more Tasks, for example a quota updated from …


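The metadata search being proposed could behave like this sketch (the task names and expressions are invented for illustration):

```python
import re

# Hypothetical sketch: given task definitions exported as name -> expression
# text, find every task whose metadata contains a search pattern such as "IIF".
def search_tasks(tasks: dict[str, str], pattern: str) -> list[str]:
    return [name for name, text in tasks.items() if re.search(pattern, text)]

tasks = {
    "Sync_Orders": "IIF(ISNULL(amount), 0, amount)",
    "Sync_Quotas": "quota * 1.10",
    "Sync_Leads":  "IIF(region = 'LATAM', 'LAT', region)",
}
print(search_tasks(tasks, r"IIF"))  # ['Sync_Orders', 'Sync_Leads']
```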

score 15
Active 3 votes
Created on Apr 19, 2013 8:35 AM by Toye Begbaaji - Last Modified: Apr 19, 2013 8:36 AM

When reviewing the Task components of a Taskflow, the Task names display as just text. If there is a need to review or edit configuration of one of these Tasks, one must either memorize the name or copy/paste/search for the exact Task name in the list on the Tasks (separate) page/tab. The more Tasks one has in the org, the more cumbersome this becomes (due to pagination). Making them display as hyperlinks would reduce the margin of error and awkwardness by taking the user right to the appropriat …



score 15
Active 3 votes
Created on Apr 17, 2013 6:58 AM by Gary Jutras - Last Modified: Apr 17, 2013 6:58 AM

This would allow us to use shares that have domain names in them like \\company.com\share.

score 35
Active 7 votes
Created on Apr 12, 2013 10:59 AM by Joshua Vaughn - Last Modified: Apr 12, 2013 10:59 AM

Currently Informatica Cloud task flows have a "Stop on Error" option for each task in the flow. If this option is checked for a specific task, the task flow will stop running if that task results in a Failed status. However, there is no way to force the task flow to stop running if a task results in a Warning status.   Suggestion: rename the "Stop on Error" option to "Stop on Failure" and add an additional "Stop on Warning" option for each task.   For additional details, see this community dis …


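The proposed semantics can be summed up in a few lines; this sketch assumes the status names from the post and models the two options as plain booleans:

```python
# Sketch of the suggestion: which task statuses halt a task flow under
# "Stop on Failure" versus the proposed additional "Stop on Warning" option.
def should_stop(status: str, stop_on_failure: bool, stop_on_warning: bool) -> bool:
    if status == "Failed":
        return stop_on_failure
    if status == "Warning":
        return stop_on_warning
    return False

# Today's behaviour: a Warning never halts the flow.
print(should_stop("Warning", stop_on_failure=True, stop_on_warning=False))  # False
# With the proposed option enabled, it would.
print(should_stop("Warning", stop_on_failure=True, stop_on_warning=True))   # True
```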

score 45
Active 9 votes
Created on Apr 1, 2013 7:52 PM by John Worth - Last Modified: Apr 1, 2013 7:52 PM

I would like to see an agent clustering option.  It does not need to be overly slick and can run on top of the existing secure agent implementation.  I am just looking to be able to define a simple cluster that would support high availability and automate the manual load balancing that I am forced to implement.

Step 1: Define a cluster secure agent: CLUS01
Step 2: Assign servers to the cluster secure agent: SERV01, SERV02, SERV03
Step 3: Assign connections to run on the cluster agent
Step …


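The round-robin dispatch the poster is balancing by hand could look roughly like this sketch (the class is hypothetical; the cluster and server names come from the post's example):

```python
from itertools import cycle

# Hypothetical sketch of a "cluster" secure agent that hands each task to
# its member servers in round-robin order.
class ClusterAgent:
    def __init__(self, name: str, servers: list[str]):
        self.name = name
        self._servers = cycle(servers)  # endless round-robin iterator

    def assign(self, task: str) -> str:
        return next(self._servers)

clus = ClusterAgent("CLUS01", ["SERV01", "SERV02", "SERV03"])
print([clus.assign(t) for t in ["t1", "t2", "t3", "t4"]])
# ['SERV01', 'SERV02', 'SERV03', 'SERV01']
```

High availability would additionally require skipping members that are down; this sketch covers only the load-balancing half of the request.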

score 10
Active 2 votes
Created on Mar 19, 2013 8:54 AM by Duane Krahn - Last Modified: Mar 19, 2013 8:58 AM

We have a need to run some integration jobs that consolidate, upload, and clean up Salesforce data on a quarterly, semi-yearly, or yearly basis.  The longest timeframe currently available in the Informatica Cloud job scheduler is Monthly (every month at a certain date).   I currently have to run these jobs manually, or set up a one-time schedule and then remember to reschedule it after it completes.   Adding the option to select and specify which months the job should run (similar to the Months opt …


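The requested months option reduces to a simple membership check; a sketch, with `QUARTERLY` as one example selection:

```python
from datetime import date

# Sketch of the requested "which months" schedule option: run a job only in
# the selected months (e.g. quarterly = March, June, September, December).
def due_this_month(today: date, run_months: set[int]) -> bool:
    return today.month in run_months

QUARTERLY = {3, 6, 9, 12}
print(due_this_month(date(2013, 3, 19), QUARTERLY))  # True
print(due_this_month(date(2013, 4, 19), QUARTERLY))  # False
```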

score 20
Active 4 votes
Created on Mar 5, 2013 11:19 AM by Karl Roembke - Last Modified: Mar 5, 2013 11:19 AM

I want to be able to give my individual users the ability to migrate objects between orgs without giving them administrative rights.   See the discussion in https://community.informatica.com/thread/32252   (I didn't realize that this wasn't listed under ideas until just now.)

score 30
Active 6 votes
Created on Feb 25, 2013 3:34 PM by Gary Jutras - Last Modified: Feb 25, 2013 3:34 PM

I would love to see nested workflows.   It would be incredibly useful.   Allow the stop-on-error to bubble up to the parent workflow. Then, if the parent workflow decided not to stop on error, it could continue on to the next step.   This would allow great flexibility in scheduling, and could simulate try-finally type logic.

score 5
Delivered 1 vote
Created on Jun 6, 2012 12:54 PM by Paul King - Last Modified: Feb 24, 2013 8:15 AM

In Apex Data Loader, "when you select the Hard Delete operation, the deleted records are not stored in the Recycle Bin. Instead, they become immediately eligible for deletion".   I'd like to see this as an option in Informatica Cloud.   This was also inquired about on: https://community.informatica.com/thread/31948
