Informatica Cloud Data Integration What's New
Winter 2019 April
April 2019
© Copyright Informatica LLC 2016, 2019
This software and documentation are provided only under a separate license agreement containing restrictions on use and disclosure. No part of this document may be
reproduced or transmitted in any form, by any means (electronic, photocopying, recording or otherwise) without prior consent of Informatica LLC.
U.S. GOVERNMENT RIGHTS Programs, software, databases, and related documentation and technical data delivered to U.S. Government customers are "commercial
computer software" or "commercial technical data" pursuant to the applicable Federal Acquisition Regulation and agency-specific supplemental regulations. As such,
the use, duplication, disclosure, modification, and adaptation is subject to the restrictions and license terms set forth in the applicable Government contract, and, to the
extent applicable by the terms of the Government contract, the additional rights set forth in FAR 52.227-19, Commercial Computer Software License.
Informatica, Informatica Cloud, Informatica Intelligent Cloud Services, PowerCenter, PowerExchange, and the Informatica logo are trademarks or registered trademarks
of Informatica LLC in the United States and many jurisdictions throughout the world. A current list of Informatica trademarks is available on the web at https://
www.informatica.com/trademarks.html. Other company and product names may be trade names or trademarks of their respective owners.
Portions of this software and/or documentation are subject to copyright held by third parties. Required third party notices are included with the product.
The information in this documentation is subject to change without notice. If you find any problems in this documentation, report them to us at
infa_documentation@informatica.com.
Informatica products are warranted according to the terms and conditions of the agreements under which they are provided. INFORMATICA PROVIDES THE
INFORMATION IN THIS DOCUMENT "AS IS" WITHOUT WARRANTY OF ANY KIND, EXPRESS OR IMPLIED, INCLUDING WITHOUT ANY WARRANTIES OF
MERCHANTABILITY, FITNESS FOR A PARTICULAR PURPOSE AND ANY WARRANTY OR CONDITION OF NON-INFRINGEMENT.
Table of Contents
Chapter 3: Upgrading to Winter 2019 April. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 25
Preparing for the upgrade. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 25
Microsoft Azure Data Lake Store V3 Connector pre-upgrade tasks. . . . . . . . . . . . . . . . . . . 25
MySQL Connector Pre-Upgrade Tasks. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 26
After you upgrade. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 26
Amazon Redshift V2 Connector post-upgrade tasks. . . . . . . . . . . . . . . . . . . . . . . . . . . . . 26
Tableau V2 Connector post-upgrade tasks. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 26
Tableau V3 Connector post-upgrade tasks. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 26
Taskflows post-upgrade tasks. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 27
Index. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 28
Preface
What's New contains a brief overview of new features, enhancements, and changed behaviors for the Winter
2019 release. It also includes upgrade steps that you might need to perform.
Informatica Resources
Informatica provides you with a range of product resources through the Informatica Network and other online
portals. Use the resources to get the most from your Informatica products and solutions and to learn from
other Informatica users and subject matter experts.
Informatica Documentation
Use the Informatica Documentation Portal to explore an extensive library of documentation for current and
recent product releases. To explore the Documentation Portal, visit https://docs.informatica.com.
Informatica maintains documentation for many products on the Informatica Knowledge Base in addition to
the Documentation Portal. If you cannot find documentation for your product or product version on the
Documentation Portal, search the Knowledge Base at https://search.informatica.com.
If you have questions, comments, or ideas about the product documentation, contact the Informatica
Documentation team at infa_documentation@informatica.com.
To find resources about Data Integration and to connect with other users, access the Data Integration community at:
https://network.informatica.com/community/informatica-network/products/cloud-integration
To find resources on using Application Integration (the Informatica Cloud Real Time service), access the
community at:
https://network.informatica.com/community/informatica-network/products/cloud-integration/cloud-application-integration/content
Developers can learn more and share tips at the Cloud Developer community:
https://network.informatica.com/community/informatica-network/products/cloud-integration/cloud-developers
Visit the Informatica Marketplace to try and buy Data Integration connectors, templates, and mapplets:
https://marketplace.informatica.com/community/collections/cloud_integration
To search the Knowledge Base, visit https://search.informatica.com. If you have questions, comments, or
ideas about the Knowledge Base, contact the Informatica Knowledge Base team at
KB_Feedback@informatica.com.
Subscribe to the Informatica Intelligent Cloud Services Trust Center to receive upgrade, maintenance, and
incident notifications. The Informatica Intelligent Cloud Services Status page displays the production status
of all the Informatica cloud products. All maintenance updates are posted to this page, and during an outage,
it has the most current information. To ensure that you are notified of updates and outages, you can subscribe to receive notifications for a single component or for all Informatica Intelligent Cloud Services components. Subscribing to all components is the best way to be certain that you never miss an update.
To subscribe, go to the Informatica Intelligent Cloud Services Status page and click SUBSCRIBE TO
UPDATES. You can then choose to receive notifications sent as emails, SMS text messages, webhooks, RSS
feeds, or any combination of the four.
For online support, click Submit Support Request in Informatica Intelligent Cloud Services. You can also use
Online Support to log a case. Online Support requires a login. You can request a login at
https://network.informatica.com/welcome.
The telephone numbers for Informatica Global Customer Support are available from the Informatica web site
at https://www.informatica.com/services-and-training/support-services/contact-us.html.
Chapter 1
Connectors
The Winter 2019 April release includes the following new connectors and enhanced connectors.
New Connectors
This release includes the following new connectors.
MemSQL V2 Connector
You can use MemSQL V2 Connector to connect to MemSQL from Cloud Data Integration and securely read data from or write data to a MemSQL database. Use the Bulk Insert option to write large volumes of data to the MemSQL database.
Enhanced Connectors
This release includes enhancements to the following connectors.
Coupa V2 Connector
You can configure the IsAPIGlobalNamespace field for a custom field to display the custom field under the root tag or the custom field tag in the field mapping.
Cvent Connector
You can use the ContactSnapshot object to retrieve a list of contacts pertaining to each event instead of all contacts in the account-wide address book.
For more information about the ContactSnapshot object and filters, see the Cloud Data Integration Cvent Connector Guide.
File Processor Connector
You can use File Processor Connector to sign the files in addition to encrypting and decrypting the files in a
local file system.
Litmos Connector
You can create mappings to read the following objects:
• OrgLearningpathCourses
• OrgCourseModules
• GetAllCourseCustomeFields
• GetCourseCustomeFieldById
Data catalog discovery
You can search for Enterprise Data Catalog objects on the Data Catalog page.
Search for objects to discover tables, views, and flat files in the catalog. You discover objects by entering a
search phrase that might occur in the object name, description, or other metadata such as data domain or
associated business glossary term. After you select an object from the search results, you can add it to a
new synchronization task, to a new mapping, or to a mapping that is currently open in Data Integration.
For more information about discovering objects to use as mapping sources, targets, or lookup objects, see
Mappings. For more information about discovering objects to use as synchronization task sources, see
Tasks.
File listener
This release includes the following new features and enhancements for file listener:
Mass ingestion
This release includes the following new features and enhancements for mass ingestion tasks:
You can also use a file listener as a source type to transfer files to target connectors.
You can use the following new connectors as sources and targets in mass ingestion tasks:
• Advanced FTP V2
• Advanced FTPS V2
• Advanced SFTP V2
The previous versions of these connectors are deprecated. Informatica recommends that you upgrade to the new versions before Informatica drops support for the previous versions.
Mapping inventory
If your organization uses Enterprise Data Catalog and you have the appropriate licenses, the Mapping
Designer includes the new Inventory panel. The Inventory panel lists the Enterprise Data Catalog objects that
you have discovered on the Data Catalog page and added to the mapping.
Each mapping has its own inventory. You can add inventory objects to the mapping as sources, targets, or
lookup objects. Objects remain in the inventory until you remove them.
Smart match
When you use smart match, Informatica Intelligent Cloud Services looks for common patterns in field names and automatically matches fields with similar names. For example, if you have an incoming field Cust_Name and a target field Customer_Name, smart match automatically links the Cust_Name field with the Customer_Name field.
To map fields with similar names, select Smart Match from the Automatch menu.
For more information about using smart match in transformations, see Transformations. For more
information about using smart match in mapping tasks and synchronization tasks, see Tasks.
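Informatica does not publish the matching algorithm, but the idea can be sketched with standard string similarity. In the following Python illustration, the field names and the 0.7 threshold are invented; it normalizes names and links each incoming field to its closest target field:

```python
from difflib import SequenceMatcher

def normalize(name):
    """Lowercase a field name and drop separators so Cust_Name ~ CustName."""
    return name.lower().replace("_", "").replace("-", "")

def smart_match(incoming_fields, target_fields, threshold=0.7):
    """Pair each incoming field with the most similar target field name."""
    links = {}
    for inc in incoming_fields:
        best, best_score = None, 0.0
        for tgt in target_fields:
            score = SequenceMatcher(None, normalize(inc), normalize(tgt)).ratio()
            if score > best_score:
                best, best_score = tgt, score
        if best_score >= threshold:
            links[inc] = best
    return links

print(smart_match(["Cust_Name", "Order_Id"], ["Customer_Name", "OrderID"]))
```

With these sample names, Cust_Name links to Customer_Name and Order_Id links to OrderID; the actual product may use different heuristics.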
Explore page
The Explore page includes the following enhancements.
Copy assets
You can copy assets within the same folder or into a different folder. When you copy assets into the current folder, you can either overwrite the original asset or keep both assets. When you copy assets into a different folder and an asset with the same name already exists, you can also choose to keep both assets. If you keep both assets, Informatica Intelligent Cloud Services appends "Copy x" to the new asset name, where x is the sequential copy number.
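The "Copy x" naming rule can be sketched as a small function. This is an illustration of the convention described above, not Informatica's implementation; the exact separator and placement of "Copy x" are assumptions:

```python
def copy_name(base_name, existing_names):
    """Append "Copy x" to a copied asset's name, where x is the next free number."""
    if base_name not in existing_names:
        return base_name
    x = 1
    while f"{base_name} Copy {x}" in existing_names:
        x += 1
    return f"{base_name} Copy {x}"

print(copy_name("m_orders", {"m_orders"}))                     # m_orders Copy 1
print(copy_name("m_orders", {"m_orders", "m_orders Copy 1"}))  # m_orders Copy 2
```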
• To facilitate working with large or complex structures, you can search for data values (nodes) in the visual
model using full or partial strings.
• From the search results you can select one or multiple nodes and perform actions on those nodes. You
can exclude or include nodes, collapse or expand the model under nodes, rename nodes, or change node
type to string.
• You can create a model based on an Excel file with multiple sheets; the model includes all sheets in the file. You can view and tune all sheets in the model and choose which sheet to display in the input data panel.
Taskflows
Taskflows include the following enhancements:
Taskflow as a service
You can invoke a taskflow as an API by publishing the taskflow as a service. When you publish a
taskflow, Data Integration generates the service URL and the SOAP service URL. You can use these
endpoint URLs to invoke the taskflow as an API. When you invoke a taskflow as an API, you can
dynamically provide input parameters for the tasks that the taskflow contains and perform
orchestration.
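For illustration, a minimal client call might look like the following Python sketch. The pod URL, taskflow name, and input parameter are invented, and the /active-bpel/rt/ path is an assumption; copy the actual service URL that Data Integration displays when you publish the taskflow:

```python
import json
from urllib import request

def build_taskflow_call(pod_url, taskflow_name, input_params):
    """Build an HTTP POST request that invokes a published taskflow as a REST API.

    pod_url and the /active-bpel/rt/ path are assumptions for this sketch;
    use the service URL generated when the taskflow is published.
    """
    url = f"{pod_url}/active-bpel/rt/{taskflow_name}"
    body = json.dumps(input_params).encode("utf-8")
    return request.Request(url, data=body,
                           headers={"Content-Type": "application/json"})

req = build_taskflow_call("https://na1.dm-us.informaticacloud.com",
                          "OrderLoadTaskflow", {"BatchDate": "2019-04-01"})
print(req.full_url)
```

Sending the request would also require authentication headers, which depend on your organization's session setup.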
Subtaskflows
When you create a taskflow, you can use the Subtaskflows step to embed and reuse a taskflow within
another taskflow. You can use a subtaskflow to reuse the same orchestration flow across multiple
branches of a taskflow or across different taskflows. You can then invoke the taskflow with different
sets of parameters.
Administrator
The Administrator service includes the following enhancements.
Connections search
You can search for connections on the Connections page in Administrator. You can search for connections
by name or description.
Service assignment for Secure Agent groups
You can enable and disable the Informatica Intelligent Cloud Services that each Secure Agent group runs.
For example, your organization uses Data Integration and Operational Insights. To balance the load across
Secure Agent groups, you might configure one Secure Agent group for Data Integration tasks and another for
Operational Insights auto-scaling and data collection.
To do this, for the first group, you enable Data Integration Server and disable Auto Scale App and OpsInsights
Data Collector App. For the second group, you enable Auto Scale App and OpsInsights Data Collector App
and disable Data Integration Server.
For more information about assigning services to Secure Agent groups, see Administrator in the
Administrator help.
Search jobs
You can use the find field on the All Jobs page to search for jobs.
The following image shows the find field on the All Jobs page:
REST API
The Informatica Intelligent Cloud Services REST API includes the following enhancements.
objects resource
Use the objects resource to request a list of your organization's assets. You can retrieve a list of all of your organization's assets, or you can filter the list by project, folder, asset type, last update time, and the user who last updated the asset.
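As a sketch of how a client might call the objects resource, the helper below builds the request URL. The base URL, the /public/core/v3/objects path, and the q filter expression are illustrative assumptions; see the REST API reference for the exact query syntax:

```python
from urllib.parse import urlencode

def objects_query(base_api_url, **params):
    """Build a GET URL for the objects resource with optional query parameters."""
    url = f"{base_api_url}/public/core/v3/objects"
    if params:
        url += "?" + urlencode(params)
    return url

# Hypothetical filter: mappings in one folder, URL-encoded automatically.
print(objects_query("https://na1.dm-us.informaticacloud.com/saas",
                    q="type=='MAPPING' and location=='Sales/Monthly'"))
```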
fetchState and loadState resources
Use the fetchState and loadState resources to synchronize a task's runtime state between organizations.
For example, in Organization A, a mapping task with a Sequence Generator transformation has a NEXTVAL value of 3270. The same task was migrated to Organization B; however, the NEXTVAL value in Organization B is 0. You want to synchronize the task's state between Organization A and Organization B so that the NEXTVAL value in both organizations is 3270. You use the fetchState and loadState resources to synchronize the NEXTVAL value so that you can run the task in Organization B while preserving the sequence of numbers.
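As a toy model of what the fetch-then-load sequence accomplishes (this is not the actual API, which works through REST calls and exported state packages), the function below overlays fetched runtime state onto the migrated task:

```python
def apply_fetched_state(target_task, fetched_state):
    """Overlay the source org's runtime state (e.g. Sequence Generator NEXTVAL)
    onto the migrated task so both orgs continue from the same value."""
    synced = dict(target_task)
    synced.update(fetched_state)
    return synced

org_a_state = {"NEXTVAL": 3270}                    # fetched from Organization A
org_b_task = {"name": "seq_task", "NEXTVAL": 0}    # migrated copy in Organization B
print(apply_fetched_state(org_b_task, org_a_state))  # {'name': 'seq_task', 'NEXTVAL': 3270}
```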
Changed behavior
The Winter 2019 March release includes the following changed behaviors.
Previously, object-level details for import and export logs were only available on the Import/Export Logs and
My Import/Export Logs pages for seven days.
For more information about import and export logs, see Monitor.
Migrate assets
Users with Administrator privileges can log into a parent organization, switch to a sub-organization, and
import or export Data Integration assets. Previously, administrators needed to log into the sub-organization
to import and export assets.
Transformation palette
When you create a mapping, you can now view only the transformations that your organization is licensed to
use, or you can view all transformations that are available in Data Integration. If a transformation that you
need is not licensed, you can contact Informatica Global Customer Support to get the license.
Use the licensed transformations icon at the bottom of the transformation palette to view either the licensed
transformations or all transformations.
The following image shows the licensed transformations icon:
Previously, when you created a mapping, the transformation palette displayed all transformations that were
available in Data Integration.
Taskflows
If you add a Data Task step, the Data Task fields appear on the Temp Fields tab of the Start step.
Previously, if you added a Data Task step, the Data Task fields appeared on the Input Fields tab of the Data
Task step.
If you included a Data Task step in a taskflow, after upgrade, the Data Task fields appear on the Temp Fields
tab of the Start step. The Data Task fields represent the input parameters of the task.
For more information about Data Task fields in taskflows, see Taskflows.
Administrator
The Administrator service includes the following behavior changes.
For example, if your organization has no access to Application Integration or API Manager, you cannot assign the Application Integration Business Manager, Application Integration Data Viewer, Deployer, or Operator roles to users.
Previously, administrators could assign any role to a user, regardless of the organization's licenses.
For more information about user roles, see Administrator in the Administrator help.
Renamed roles
Some service-specific roles are renamed to clarify which services the roles apply to.
For more information about user roles, see Administrator in the Administrator help.
You now register a Secure Agent using an install token that you generate in Administrator.
The following image shows the Generate Token option on the Runtime Environments page:
Previously, you registered a Secure Agent using your Informatica Intelligent Cloud Services user name and
password.
For more information about downloading, installing, and registering a Secure Agent, see Getting Started in the
Data Integration help or Administrator in the Administrator help.
Connectors
The Winter 2019 March release includes the following new connectors and enhanced connectors.
New connectors
This release includes the following new connectors.
Greenplum Connector
You can use Greenplum Connector to connect to Greenplum from Data Integration. Use Greenplum
Connector to read data from or write data to Greenplum. You can create a mapping task to process data
based on the data flow logic defined in a Greenplum mapping.
When you run a Greenplum mapping to read data from Greenplum, the Secure Agent invokes the Greenplum
database parallel file server, gpfdist, to read data.
When you run a Greenplum mapping task to write data to Greenplum, the Secure Agent creates a control file
to provide load specifications to the Greenplum gpload bulk loading utility, invokes the Greenplum gpload
bulk loading utility, and writes data to the named pipe. The Greenplum gpload bulk loading utility launches
gpfdist, which is Greenplum's file distribution program, that reads data from the named pipe and loads data
into the Greenplum target.
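For context, a gpload control file is a YAML document. The Secure Agent generates the real control file for you; the hand-written sketch below uses invented database, host, pipe, and table names only to show the general shape:

```yaml
VERSION: 1.0.0.1
DATABASE: salesdb          # invented database name
USER: gpadmin
HOST: mdw.example.com      # Greenplum master host (placeholder)
PORT: 5432
GPLOAD:
  INPUT:
    - SOURCE:
        FILE:
          - /tmp/infa_gpload_pipe   # named pipe the agent writes to
    - FORMAT: text
    - DELIMITER: ','
  OUTPUT:
    - TABLE: public.orders          # invented target table
    - MODE: insert
```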
Enhanced connectors
This release includes enhancements to the following connectors.
Amazon Redshift V2 Connector
This release includes the following enhancements for Amazon Redshift V2 Connector:
• In addition to the existing regions, you can also read data from or write data to the following regions:
- China (Beijing)
- China (Ningxia)
• You can select a cluster region name in the Cluster Region connection property, even though you specify
the cluster region name in the JDBC URL connection property.
• You can retain the null values when you read data from Amazon Redshift. To retain the null values, select
the Treat NULL Value as NULL option in the source advanced properties.
• When you create a target, you can view and edit the metadata of the target object on the Target Fields tab. You can edit the data type and precision, and define the primary key of the columns in the target object.
• You can specify the alias name generated by AWS Key Management Service (AWS KMS) in the Customer
Master Key ID connection property.
• In addition to the existing file formats, you can create external tables for the RCFile and SequenceFile file formats to use Amazon Redshift Spectrum.
• You can use the following changed data enhancements when you write data to the Amazon Redshift
target:
Write changed data to Amazon Redshift target
You can create a mapping to capture changed data from a CDC source, and then run the associated
mapping tasks to write the changed data to an Amazon Redshift target.
For example, you can capture changed data from an Oracle CDC source and write the changed data
to an Amazon Redshift target table.
You can resume the extraction of changed data from the point of interruption when a mapping task
fails or is stopped before completing the task. To resume processing changed data, set the Recovery
Strategy advanced session property to Resume from last checkpoint on the Schedule page when you
create or edit a mapping task.
Data Driven
You can retain the order of the records when you read data from a CDC source and write data to an
Amazon Redshift target.
Recovery Schema Name
You can specify the name of the schema that contains the recovery file to resume the extraction of changed data from the point of interruption. You add the recovery schema name in the Recovery Schema Name target property.
Amazon S3 V2 Connector
This release includes the following enhancements for Amazon S3 V2 Connector:
• In addition to the existing regions, you can also read data from or write data to the following regions:
- China (Beijing)
- China (Ningxia)
• You can create a mapping to read or write an ORC file to Amazon S3.
• You can read data from or write data to the files whose names are longer than 80 characters.
• You can encrypt an Avro or Parquet file using Server-Side Encryption as the encryption type when you
write the file to an Amazon S3 target.
• In addition to the existing compression formats, you can compress data in the following formats when
you read data from or write data to Amazon S3:
- Deflate
- Snappy
- Zlib
Anaplan V2 Connector
You can configure the proxy server settings for the Secure Agent to connect to Anaplan V2.
Hive Connector
You can use the Hortonworks 2.5 and Hortonworks 2.6 distributions for a read or write operation in Hive Connector.
Litmos Connector
You can create mappings to read the following objects:
• OrgCourses
• OrgLearningPaths
• OrgUserProgramResults
• OrgModules
• OrgModulesResult
• OrgLearningpathResults
Microsoft Azure SQL Data Warehouse Connector
This release includes the following enhancements for Microsoft Azure SQL Data Warehouse Connector:
• When you write data to Microsoft Azure SQL Data Warehouse, you can compress the files that are written to the Blob staging area in the gzip format.
• You can use custom queries when you read a source object.
• You can override the source object and the source object schema in the source advanced properties. The source object and schema defined in the advanced properties take precedence.
• You can override the target object and the target object schema in the target advanced properties. The target object and schema defined in the advanced properties take precedence.
• You can use the MD5() pushdown function when you use an ODBC connection to connect to Microsoft Azure SQL Data Warehouse.
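For reference, MD5() yields a 32-character hexadecimal digest regardless of which engine computes it. A quick check of the digest shape, using Python purely for illustration (character casing in the target database may differ):

```python
import hashlib

def md5_hex(value):
    """32-character hex digest, the shape of values that MD5() produces."""
    return hashlib.md5(value.encode("utf-8")).hexdigest()

digest = md5_hex("informatica")
print(digest, len(digest))  # prints the digest and 32
```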
ODBC Connector
This release includes the following enhancements for ODBC Connector:
• You can use the ODBC connection to connect to Google BigQuery. Specify the Google BigQuery subtype in
an ODBC connection to connect to Google BigQuery sources and targets.
• You can enable full or source pushdown optimization for Google BigQuery sources and targets. Specify
the Google BigQuery subtype in the ODBC connection properties to enable pushdown optimization when
you read data from or write data to Google BigQuery tables. The Secure Agent pushes the supported
functions to Google BigQuery sources and targets based on the pushdown optimization session property
value you specify.
Oracle Connector
This release includes the following enhancements for Oracle Connector:
• You can use an Oracle connection to connect to Oracle Database Cloud Service.
• You can configure additional properties for the Oracle connection to read metadata. Specify the
connection properties as semicolon-separated key-value pairs in the Metadata Advanced Connection
Properties field in the Oracle connection.
• You can configure additional properties for the Oracle connection to run mappings. Specify the
connection properties as semicolon-separated key-value pairs in the Runtime Advanced Connection
Properties field in the Oracle connection.
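The semicolon-separated key-value format can be illustrated with a small parser. The property names in the example are hypothetical; use only the properties that Informatica documents for the Oracle connection:

```python
def parse_advanced_properties(value):
    """Split 'key1=value1;key2=value2' advanced connection properties into a dict."""
    pairs = (item.split("=", 1) for item in value.split(";") if item)
    return {k.strip(): v.strip() for k, v in pairs}

# Example keys are invented for illustration only.
print(parse_advanced_properties("ConnectTimeout=30;EncryptionLevel=required"))
```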
REST V2 Connector
This release includes the following enhancements for REST V2 Connector:
Salesforce Connector
You can use version 44.0 of the Salesforce API to create a Salesforce connection and access Salesforce
objects.
• You can use the insert and update operations for a Non-Contact Linked Data Extension.
• You can configure the Salesforce Marketing Cloud connection properties to access data across all
business units in your Salesforce Marketing Cloud account.
Tableau V3 Connector
The Secure Agent displays the projects and data sources that are present on Tableau Server or Tableau
Online when you do not specify the schema in the Schema File Path connection property.
Teradata Connector
You can configure Kerberos authentication for a Teradata connection. To configure Kerberos authentication, select KRB5 as the authentication type, specify the Kerberos artifacts directory where the krb5.conf and IICSTPT.keytab files are located, and provide the Kerberos user name in the Teradata connection properties.
Changed behavior
This release includes the following behavior changes in connectors.
If you have mass ingestion tasks that use Advanced FTP connection as a source or target or both,
Informatica recommends that you replace the Advanced FTP connections with Advanced FTP V2
connections.
If you have mass ingestion tasks that use Advanced FTPS connection as a source or target or both,
Informatica recommends that you replace the Advanced FTPS connections with Advanced FTPS V2
connections.
If you have mass ingestion tasks that use Advanced SFTP connection as a source or target or both,
Informatica recommends that you replace the Advanced SFTP connections with Advanced SFTP V2
connections.
Tableau V2 Connector
This release includes the following changes in Tableau V2 Connector:
• You must install Red Hat Enterprise Linux version 7 or higher for Secure Agents that run on Linux operating systems to run mappings successfully.
Previously, Red Hat Enterprise Linux version 6 or higher was sufficient for Secure Agents installed on Linux operating systems.
MySQL Connector
MySQL Connector uses the MySQL JDBC and ODBC drivers, version 8.0.12.
Previously, MySQL Connector used the MySQL JDBC driver version 5.1.40 and ODBC driver version 5.3.7.
PostgreSQL Connector
The SSLv2 protocol support for PostgreSQL Connector has been removed. The SSLv2 option does not display
in the Crypto Protocol Versions list in the PostgreSQL connection properties.
Snowflake Connector
To address issues of timeout when using a custom SQL query as a source in a mapping, the Secure Agent
fetches the metadata using separate metadata calls.
Previously, the Secure Agent ran the SQL query for a few records to obtain the metadata and long running
queries would cause a timeout.
Tableau V2 Connector
This release includes the following changes in Tableau V2 Connector:
• You can add both the license of Tableau V2 Connector and Tableau V3 Connector in the same
organization.
Previously, you could not retain the licenses of Tableau V2 Connector and Tableau V3 Connector in the
same organization. You had to create a sub-organization and then add the license of Tableau V3
Connector to retain the license of Tableau V2 Connector.
Tableau V3 Connector
This release includes the following changes in Tableau V3 Connector:
• You can add both the license of Tableau V2 Connector and Tableau V3 Connector in the same
organization.
Previously, you could not retain the licenses of Tableau V2 Connector and Tableau V3 Connector in the
same organization. You had to create a sub-organization and then add the license of Tableau V3
Connector to retain the license of Tableau V2 Connector.
What's New guides for releases occurring within the last year of the current release are included in the
following community article: https://network.informatica.com/docs/DOC-17912
Files that you added to the following directory are preserved after the upgrade:
Perform the following steps to ensure that the Secure Agent is ready for the upgrade:
1. Ensure that each Secure Agent machine has sufficient disk space available for upgrade. To calculate the
free space required for upgrade, use the following formula:
Minimum required free space = 3 * (size of current Secure Agent installation directory -
space used for logs directory) + 1 GB
2. Ensure that no tasks run during the maintenance window. If you use Informatica Intelligent Cloud
Services to schedule tasks, you can configure a blackout period for the organization.
To configure a blackout period, in Administrator, select Schedules, and then click Blackout Period.
3. Close all applications and open files to avoid file lock issues, for example:
• Windows Explorer
• Notepad
• Windows Command Processor (cmd.exe)
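The free-space formula in step 1 can be expressed as a quick calculation. The example sizes below are made up; measure your own installation and logs directories:

```python
def min_free_space_gb(install_dir_gb, logs_dir_gb):
    """Free space the upgrade needs, per the formula in step 1 (sizes in GB)."""
    return 3 * (install_dir_gb - logs_dir_gb) + 1

# A 4 GB installation with a 1.5 GB logs directory needs 8.5 GB free.
print(min_free_space_gb(4.0, 1.5))  # 8.5
```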
1. Open the existing mapping or mapping task that has .csv files in the source or the target object.
3. Click the Formatting Option and select the file format as Flat.
Note: If there are multiple Microsoft Azure Data Lake Store sources or targets in the mapping, repeat step 3 for each source and target.
5. Optional. Verify if Data Preview is successful for the edited sources and targets.
Note: You can perform this task even after upgrading to Winter 2019, but before you re-run the existing
mappings.
Existing mappings from earlier versions fail if you do not use version 8.0.12 of the MySQL JDBC and ODBC
drivers.
• Install Red Hat Enterprise Linux version 7 or higher for the Secure Agents installed on Linux operating
systems.
• Assign the read, write, and execute permissions to the third-party libraries manually. To assign
permissions, perform the following steps:
1. From the command prompt, go to the following directory:
<INFA_AGENT_INSTALLED_LOCATION>/downloads/package-tableauV2.<latest_version>/package/rdtm
2. Enter the following command:
chmod 777 *
You must verify the taskflow for possible upgrade errors and manually save the taskflow. Otherwise, the
upgrade message appears each time you open the taskflow and you cannot run or publish the taskflow.
Scheduled taskflows will continue to run as is. However, if you open and edit a scheduled taskflow, you must
verify the scheduled taskflow for possible upgrade errors and manually save the taskflow. Otherwise, the
upgrade message appears each time you open the scheduled taskflow.
Index

A
Automatch 11

C
Cloud Application Integration community
  URL 5
Cloud Developer community
  URL 5
connections
  search 13
copy assets 12
customize Explore page 12

D
data catalog discovery
  for mappings 9
  for synchronization tasks 9
  mapping inventory 11
Data Integration community
  URL 5

E
Explore page 12

I
import export assets 14
Import Export logs 15
Informatica Global Customer Support
  contact information 6
Informatica Intelligent Cloud Services
  web site 5

M
maintenance outages 6
Mapping Designer
  Inventory panel 11
  transformation palette changes 15
mappings
  data catalog discovery 9
  inventory 11
Monitor service 14

R
REST API
  enhancements 12, 14
roles
  licensing changes 16
  renamed roles 16

S
search jobs 14
Secure Agent
  registration changes 16
  upgrade preparation 25
Secure Agent groups
  service assignment 13
Smart Match 11
status
  Informatica Intelligent Cloud Services 6
sub-organization import export 14
synchronization tasks
  data catalog discovery 9
system status 6

T
Taskflows
  enhancements 13
transformations
  palette changes 15
trust site
  description 6

U
upgrade notifications 6
upgrade preparation
  Secure Agent preparation 25

W
web site 5