Informatica® Cloud Data Integration

Winter 2019 April

What's New
April 2019
© Copyright Informatica LLC 2016, 2019

This software and documentation are provided only under a separate license agreement containing restrictions on use and disclosure. No part of this document may be
reproduced or transmitted in any form, by any means (electronic, photocopying, recording or otherwise) without prior consent of Informatica LLC.

U.S. GOVERNMENT RIGHTS Programs, software, databases, and related documentation and technical data delivered to U.S. Government customers are "commercial
computer software" or "commercial technical data" pursuant to the applicable Federal Acquisition Regulation and agency-specific supplemental regulations. As such,
the use, duplication, disclosure, modification, and adaptation is subject to the restrictions and license terms set forth in the applicable Government contract, and, to the
extent applicable by the terms of the Government contract, the additional rights set forth in FAR 52.227-19, Commercial Computer Software License.

Informatica, Informatica Cloud, Informatica Intelligent Cloud Services, PowerCenter, PowerExchange, and the Informatica logo are trademarks or registered trademarks
of Informatica LLC in the United States and many jurisdictions throughout the world. A current list of Informatica trademarks is available on the web at https://
www.informatica.com/trademarks.html. Other company and product names may be trade names or trademarks of their respective owners.

Portions of this software and/or documentation are subject to copyright held by third parties. Required third party notices are included with the product.

The information in this documentation is subject to change without notice. If you find any problems in this documentation, report them to us at
infa_documentation@informatica.com.

Informatica products are warranted according to the terms and conditions of the agreements under which they are provided. INFORMATICA PROVIDES THE
INFORMATION IN THIS DOCUMENT "AS IS" WITHOUT WARRANTY OF ANY KIND, EXPRESS OR IMPLIED, INCLUDING WITHOUT LIMITATION ANY WARRANTIES OF
MERCHANTABILITY, FITNESS FOR A PARTICULAR PURPOSE AND ANY WARRANTY OR CONDITION OF NON-INFRINGEMENT.

Publication Date: 2019-04-08


Table of Contents
Preface
    Informatica Resources
        Informatica Documentation
        Informatica Intelligent Cloud Services web site
        Informatica Intelligent Cloud Services Communities
        Informatica Intelligent Cloud Services Marketplace
        Data Integration connector documentation
        Informatica Knowledge Base
        Informatica Intelligent Cloud Services Trust Center
        Informatica Global Customer Support

Chapter 1: Winter 2019 April
    Connectors
        New Connectors
        Enhanced Connectors

Chapter 2: Winter 2019 March
    New features and enhancements
        Data catalog discovery
        File listener
        Mass ingestion
        Mapping inventory
        Field mapping smart match
        Explore page
        Intelligent Structure Discovery and Structure Parser transformation
        Taskflows
        Administrator
        Monitor
        REST API
    Changed behavior
        Import and export logs
        Migrate assets
        Transformation palette
        Taskflows
        Administrator
    Connectors
        New connectors
        Enhanced connectors
        Changed behavior
    Enhancements in previous releases

Chapter 3: Upgrading to Winter 2019 April
    Preparing for the upgrade
        Microsoft Azure Data Lake Store V3 Connector pre-upgrade tasks
        MySQL Connector pre-upgrade tasks
    After you upgrade
        Amazon Redshift V2 Connector post-upgrade tasks
        Tableau V2 Connector post-upgrade tasks
        Tableau V3 Connector post-upgrade tasks
        Taskflows post-upgrade tasks
Preface
What's New contains a brief overview of new features, enhancements, and changed behaviors for the Winter
2019 release. It also includes upgrade steps that you might need to perform.

Informatica Resources
Informatica provides you with a range of product resources through the Informatica Network and other online
portals. Use the resources to get the most from your Informatica products and solutions and to learn from
other Informatica users and subject matter experts.

Informatica Documentation
Use the Informatica Documentation Portal to explore an extensive library of documentation for current and
recent product releases. To explore the Documentation Portal, visit https://docs.informatica.com.

Informatica maintains documentation for many products on the Informatica Knowledge Base in addition to
the Documentation Portal. If you cannot find documentation for your product or product version on the
Documentation Portal, search the Knowledge Base at https://search.informatica.com.

If you have questions, comments, or ideas about the product documentation, contact the Informatica
Documentation team at infa_documentation@informatica.com.

Informatica Intelligent Cloud Services web site


You can access the Informatica Intelligent Cloud Services web site at http://www.informatica.com/cloud.
This site contains information about Data Integration editions and applications as well as information about
other Informatica Cloud integration services.

Informatica Intelligent Cloud Services Communities


Use the Informatica Intelligent Cloud Services Community to discuss and resolve technical issues. You can
also find technical tips, documentation updates, and answers to frequently asked questions.

Access the Informatica Intelligent Cloud Services Community at:

https://network.informatica.com/community/informatica-network/products/cloud-integration

To find resources on using Application Integration (the Informatica Cloud Real Time service), access the
community at:

https://network.informatica.com/community/informatica-network/products/cloud-integration/cloud-
application-integration/content

Developers can learn more and share tips at the Cloud Developer community:

https://network.informatica.com/community/informatica-network/products/cloud-integration/cloud-
developers

Informatica Intelligent Cloud Services Marketplace


Visit the Informatica Marketplace to try and buy Data Integration Connectors, templates, and mapplets:

https://marketplace.informatica.com/community/collections/cloud_integration

Data Integration connector documentation


You can access documentation for Data Integration Connectors at the Documentation Portal. To explore the
Documentation Portal, visit https://docs.informatica.com.

Informatica Knowledge Base


Use the Informatica Knowledge Base to find product resources such as how-to articles, best practices, video
tutorials, and answers to frequently asked questions.

To search the Knowledge Base, visit https://search.informatica.com. If you have questions, comments, or
ideas about the Knowledge Base, contact the Informatica Knowledge Base team at
KB_Feedback@informatica.com.

Informatica Intelligent Cloud Services Trust Center


The Informatica Intelligent Cloud Services Trust Center provides information about Informatica security
policies and real-time system availability.

You can access the trust center at https://www.informatica.com/trust-center.html.

Subscribe to the Informatica Intelligent Cloud Services Trust Center to receive upgrade, maintenance, and
incident notifications. The Informatica Intelligent Cloud Services Status page displays the production status
of all the Informatica cloud products. All maintenance updates are posted to this page, and during an outage,
it will have the most current information. To ensure you are notified of updates and outages, you can
subscribe to receive updates for a single component or all Informatica Intelligent Cloud Services
components. Subscribing to all components is the best way to be certain you never miss an update.

To subscribe, go to the Informatica Intelligent Cloud Services Status page and click SUBSCRIBE TO
UPDATES. You can then choose to receive notifications sent as emails, SMS text messages, webhooks, RSS
feeds, or any combination of the four.

Informatica Global Customer Support


You can contact a Customer Support Center by telephone or online.

For online support, click Submit Support Request in Informatica Intelligent Cloud Services. You can also use
Online Support to log a case. Online Support requires a login. You can request a login at
https://network.informatica.com/welcome.

The telephone numbers for Informatica Global Customer Support are available from the Informatica web site
at https://www.informatica.com/services-and-training/support-services/contact-us.html.

Chapter 1

Winter 2019 April


This section provides information about new features, enhancements, and behavior changes in the Winter
2019 April release of Informatica Intelligent Cloud Services Data Integration.

Connectors
The Winter 2019 April release includes the following new connectors and enhanced connectors.

New Connectors
This release includes the following new connectors.

Adobe Experience Platform Connector


You can use Adobe Experience Platform Connector to connect to Adobe Experience Platform from Cloud
Data Integration and securely read data from or write data to Adobe Experience Platform. Create an Adobe
Experience Platform connection and use the connection as a source or as a target in mapping tasks.

MemSQL V2 Connector
You can use MemSQL V2 Connector to connect to MemSQL from Cloud Data Integration and securely read
data from or write data to a MemSQL database. Use the Bulk Insert option to write large volumes of data to
the MemSQL database.

Enhanced Connectors
This release includes enhancements to the following connectors.

Coupa V2 Connector
You can configure the IsAPIGlobalNamespace field for a custom field to display the custom field under the
root tag or the custom field tag in the field mapping.

Cvent Connector
You can use the ContactSnapshot object to retrieve a list of contacts pertaining to each event instead of all
contacts present in the account-wide address book.

For more information about the ContactSnapshot object and filters, see the Cloud Data Integration Cvent
Connector Guide.

File Processor Connector
You can use File Processor Connector to sign files in a local file system, in addition to encrypting and
decrypting them.

Litmos Connector
You can create mappings to read the following objects:

• OrgLearningpathCourses
• OrgCourseModules
• GetAllCourseCustomeFields
• GetCourseCustomeFieldById



Chapter 2

Winter 2019 March


This section provides information about new features, enhancements, and behavior changes in the Winter
2019 March release of Informatica Intelligent Cloud Services Data Integration.

New features and enhancements


The Winter 2019 March release includes the following new features and enhancements.

Data catalog discovery


If your organization uses Enterprise Data Catalog and you have the appropriate licenses, you can perform a
search against the catalog and discover assets. You can use the assets that you discover as sources, targets,
and lookup objects in mappings and as sources in synchronization tasks.

You search for Enterprise Data Catalog objects on the Data Catalog page.

Search for objects to discover tables, views, and flat files in the catalog. You discover objects by entering a
search phrase that might occur in the object name, description, or other metadata such as data domain or
associated business glossary term. After you select an object from the search results, you can add it to a
new synchronization task, to a new mapping, or to a mapping that is currently open in Data Integration.

For more information about discovering objects to use as mapping sources, targets, or lookup objects, see
Mappings. For more information about discovering objects to use as synchronization task sources, see
Tasks.

File listener
This release includes the following new features and enhancements for file listener:

Regular expressions for file name pattern


You can use regular expressions to define the file name pattern to which the file listener listens.
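As an illustration, the following Python sketch shows how a file name pattern expressed as a regular expression selects files. The pattern and file names are hypothetical examples, not Informatica defaults:

import re

# Hypothetical pattern: daily order extracts named like "orders_20190315.csv".
pattern = re.compile(r"orders_\d{8}\.csv")

candidates = ["orders_20190315.csv", "orders_2019.csv", "invoices_20190315.csv"]
matched = [name for name in candidates if pattern.fullmatch(name)]
print(matched)  # ['orders_20190315.csv']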

Start and stop a file listener manually


A file listener runs at a defined frequency according to the configured run schedule. You can also start or
stop a file listener manually by using the Start and Stop buttons on the file listener page.

File listener monitoring


You can monitor file listener jobs in Monitor on the File Transfer Logs page.

Mass ingestion
This release includes the following new features and enhancements for mass ingestion tasks:

Mass ingestion task sources and targets

The following table lists the source and target connectors that are introduced in this release:

Connector                   Source   Target
Advanced FTP V2             Yes      Yes
Advanced FTPS V2            Yes      Yes
Advanced SFTP V2            Yes      Yes
Google Cloud Storage V2     Yes      Yes
Hadoop Data File Storage    Yes      Yes
Google BigQuery V2          No       Yes

You can also use a file listener as a source type to transfer files to target connectors.

New versions of advanced file transfer connectors


Effective in this release, Informatica introduces the following new connectors:

• Advanced FTP V2
• Advanced FTPS V2
• Advanced SFTP V2

The previous versions of these connectors are deprecated. Informatica recommends that you upgrade to the
new versions of the connectors before Informatica drops support for the previous versions.



Regular expression for file name pattern
You can use regular expressions to define source file name patterns.

Mapping inventory
If your organization uses Enterprise Data Catalog and you have the appropriate licenses, the Mapping
Designer includes the new Inventory panel. The Inventory panel lists the Enterprise Data Catalog objects that
you have discovered on the Data Catalog page and added to the mapping.

Each mapping has its own inventory. You can add inventory objects to the mapping as sources, targets, or
lookup objects. Objects remain in the inventory until you remove them.

The following image shows the Inventory panel in a new mapping:

1. Inventory icon. Shows and hides the Inventory panel.


2. "Select an object from the inventory" icon in the transformation Properties panel. Use to select an object from the
inventory as the source, target, or lookup object.

For more information about the mapping inventory, see Mappings.

Field mapping smart match


Use smart match to automatically map fields with similar names on the Field Mapping tab of
transformations, mapping tasks, and synchronization tasks.

When you use smart match, Informatica Intelligent Cloud Services looks for common patterns in field names
and automatically matches fields with similar names. For example, if you have an incoming field Cust_Name
and a target field Customer_Name, smart match automatically links the Cust_Name field with the
Customer_Name field.
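As a rough illustration of this kind of pattern-based matching, and not Informatica's actual algorithm, the following Python sketch normalizes field names into tokens and compares the tokens with a similarity ratio:

import difflib

def normalize(field):
    # Lowercase and split on underscores: "Cust_Name" -> ["cust", "name"]
    return field.lower().split("_")

def is_similar(incoming, target, threshold=0.6):
    # Compare corresponding name tokens; "cust" vs "customer" scores high
    # enough to link Cust_Name with Customer_Name.
    a, b = normalize(incoming), normalize(target)
    if len(a) != len(b):
        return False
    scores = [difflib.SequenceMatcher(None, x, y).ratio() for x, y in zip(a, b)]
    return min(scores) >= threshold

print(is_similar("Cust_Name", "Customer_Name"))  # True
print(is_similar("Cust_Name", "Order_Date"))     # False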

To map fields with similar names, select Smart Match from the Automatch menu.



The following image shows the Automatch menu in the Target transformation properties:

For more information about using smart match in transformations, see Transformations. For more
information about using smart match in mapping tasks and synchronization tasks, see Tasks.

Explore page
The Explore page includes the following enhancements.

Customize the Explore page


You can add the Created By, Created On, and Updated By columns to the Explore page header in the Projects
and Folders view and in the Asset Types view. Right-click the column heading area and select the column to
add.

Copy assets
You can copy assets within the same folder. When you copy assets into the current folder, you can either
overwrite the original asset or keep both assets. If you choose to keep both assets, Informatica Intelligent
Cloud Services appends "Copy x" to the new asset name, where x is the sequential copy number.

When you copy assets into a different folder and an asset with the same name already exists in that folder,
you can choose to keep both assets. Informatica Intelligent Cloud Services appends "Copy x" to the new
asset name, where x is the sequential copy number.

For more information about copying assets, see Asset Management.

Intelligent Structure Discovery and Structure Parser transformation


Intelligent Structure Discovery and the Structure Parser transformation have the following updates:

• To facilitate working with large or complex structures, you can search for data values (nodes) in the visual
model using full or partial strings.
• From the search results you can select one or multiple nodes and perform actions on those nodes. You
can exclude or include nodes, collapse or expand the model under nodes, rename nodes, or change node
type to string.
• You can create a model based on an Excel file with multiple sheets. The model includes all sheets in the
file. You can view and tune all sheets in the model and choose which sheet to display in the input data
panel.



• Multiple Structure Parser transformations can be deployed midstream, enabling flexible pipeline parsing.
The parsers can get input from other parsers.
• Intelligent Structure Discovery assigns the detected data types to the output ports. The ports are then
propagated correctly in the Structure Parser transformation mapping.

Taskflows
Taskflows include the following enhancements:

Taskflow as a service

You can invoke a taskflow as an API by publishing the taskflow as a service. When you publish a
taskflow, Data Integration generates the service URL and the SOAP service URL. You can use these
endpoint URLs to invoke the taskflow as an API. When you invoke a taskflow as an API, you can
dynamically provide input parameters for the tasks that the taskflow contains and perform
orchestration.
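As a sketch, invoking a published taskflow might look like the following. The service URL, credentials, and input parameter name are placeholders, not values from this release; use the endpoint URL that Data Integration generates when you publish the taskflow, and check the taskflow documentation for the exact authentication and payload format:

import requests

# Placeholder values. Use the service URL that Data Integration generates when
# you publish the taskflow; the parameter name and credentials are hypothetical.
service_url = "https://<pod>.informaticacloud.com/active-bpel/rt/MyTaskflow"
payload = {"input_start_date": "2019-03-01"}

response = requests.post(
    service_url,
    json=payload,
    auth=("my_user", "my_password"),  # basic authentication, as an assumption
)
response.raise_for_status()
print(response.status_code, response.text)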

Subtaskflows

When you create a taskflow, you can use the Subtaskflows step to embed and reuse a taskflow within
another taskflow. You can use a subtaskflow to reuse the same orchestration flow across multiple
branches of a taskflow or across different taskflows. You can then invoke the taskflow with different
sets of parameters.

For more information about taskflows, see Taskflows.

Administrator
The Administrator service includes the following enhancements.

Connections search
You can search for connections on the Connections page in Administrator. You can search for connections
by name or description.

For more information about connections, see Connections.

Service assignment for Secure Agent groups


By default, when you create a Secure Agent group, all services that your organization uses can use the group.
If your organization uses multiple services, the demand on the Secure Agent group can be high. To reduce
the potential demand on a Secure Agent group, you can now enable and disable specific Secure Agent
services for the group.

For example, your organization uses Data Integration and Operational Insights. To balance the load across
Secure Agent groups, you might configure one Secure Agent group for Data Integration tasks and another for
Operational Insights auto-scaling and data collection.

To do this, for the first group, you enable Data Integration Server and disable Auto Scale App and OpsInsights
Data Collector App. For the second group, you enable Auto Scale App and OpsInsights Data Collector App
and disable Data Integration Server.

For more information about assigning services to Secure Agent groups, see Administrator in the
Administrator help.



Monitor
You can search for jobs on the Running Jobs, All Jobs, or My Jobs pages. To find jobs, enter a full or partial
string in the Find field. You can search for jobs by instance name or error message.

The following image shows the Find field on the All Jobs page:

For more information about finding jobs, see Monitor.

REST API
The Informatica Intelligent Cloud Services REST API includes the following enhancements.

objects resource
Use the objects resource to retrieve a list of your organization's assets. You can retrieve a list of all of your
organization's assets, or you can filter the list by project, folder, asset type, last update time, and the user
who last updated the asset.
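As a hedged sketch, a request to the objects resource might look like the following. The base URL, session header, query syntax, and asset type code are assumptions based on v3 REST API conventions; see the REST API reference for the exact parameters:

import requests

# Assumed values: a v3 session ID from the login response and your pod's base URL.
base_url = "https://<pod>.informaticacloud.com/saas"
headers = {"INFA-SESSION-ID": "<sessionId>", "Accept": "application/json"}

# Filter the asset list by project location and asset type (both illustrative).
params = {"q": "location=='MyProject' and type=='MTT'", "limit": 50}

resp = requests.get(f"{base_url}/public/core/v3/objects", headers=headers, params=params)
resp.raise_for_status()
for obj in resp.json().get("objects", []):
    print(obj.get("path"), obj.get("type"))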

fetchState and loadState resources


Use the fetchState and loadState resources to synchronize state variables for assets across your
organizations.

For example, in Organization A, a mapping task with a Sequence Generator transformation has a NEXTVAL
value of 3270. The same task was migrated to Organization B; however, the NEXTVAL value in Organization B
is 0. You want to synchronize the task's state between Organization A and Organization B so that the
NEXTVAL value in both organizations is 3270. You use the fetchState and loadState resources to
synchronize the NEXTVAL value so that you can run the task in Organization B while preserving the sequence
of numbers.
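As a conceptual sketch only — the endpoint paths, payloads, and response handling below are assumptions, and the real flow may involve additional job-status calls — synchronizing state might look like this:

import requests

# Conceptual sketch: fetch the state package from the source organization,
# then load it into the target organization. Paths and payloads are assumptions.
def fetch_state(base_url, session_id, asset_id):
    resp = requests.post(
        f"{base_url}/public/core/v3/fetchState",
        headers={"INFA-SESSION-ID": session_id},
        json={"objects": [{"id": asset_id}]},
    )
    resp.raise_for_status()
    return resp.content  # state package for the matching asset

def load_state(base_url, session_id, state_package):
    resp = requests.post(
        f"{base_url}/public/core/v3/loadState",
        headers={"INFA-SESSION-ID": session_id},
        data=state_package,
    )
    resp.raise_for_status()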

Import and export job logs


Use the import resource to download import job logs. Use the export resource to download export job logs.
Informatica Intelligent Cloud Services returns the logs in text files.
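For example, downloading an export job log might look like the following sketch; the endpoint path and job ID are assumptions based on v3 API conventions:

import requests

# Assumed values: a v3 session ID and the ID of a completed export job.
base_url = "https://<pod>.informaticacloud.com/saas"
headers = {"INFA-SESSION-ID": "<sessionId>"}
export_job_id = "<exportJobId>"

resp = requests.get(f"{base_url}/public/core/v3/export/{export_job_id}/log", headers=headers)
resp.raise_for_status()
with open("export_log.txt", "wb") as f:
    f.write(resp.content)  # the log is returned as a text file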

Export file checksum validation


When you upload an import package, Informatica Intelligent Cloud Services uses checksum validation to
verify that the export file was not modified after it was created. If you want to upload an import package that
contains a modified export file, you can include the relaxChecksum parameter.
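A sketch of uploading an import package with the relaxChecksum parameter might look like the following; the endpoint path, form field name, and parameter placement are assumptions, so consult the REST API reference for the exact request:

import requests

# Assumed values and endpoint. relaxChecksum allows a modified export file.
base_url = "https://<pod>.informaticacloud.com/saas"
headers = {"INFA-SESSION-ID": "<sessionId>"}

with open("export_package.zip", "rb") as f:
    resp = requests.post(
        f"{base_url}/public/core/v3/import/package",
        headers=headers,
        params={"relaxChecksum": "true"},
        files={"package": f},
    )
resp.raise_for_status()
print(resp.json())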



file listener resource
You can use the file listener resource to create, update, and delete file listeners, and to view the status of a
file listener job. You can also start and stop a file listener resource manually.

mass ingestion tasks


Use the mass ingestion tasks resource to view, create, update, and delete mass ingestion tasks and to find
the location of mass ingestion tasks.

Changed behavior
The Winter 2019 March release includes the following changed behaviors.

Import and export logs


Object-level details for import and export logs are now available in the import or export log file. Detailed
import and export log files are available for download for seven days on the Import/Export Logs and My
Import/Export Logs pages. Object-level details are also available on the Import/Export Logs and My Import/
Export Logs pages for 24 hours.

Previously, object-level details for import and export logs were only available on the Import/Export Logs and
My Import/Export Logs pages for seven days.

For more information about import and export logs, see Monitor.

Migrate assets
Users with Administrator privileges can log into a parent organization, switch to a sub-organization, and
import or export Data Integration assets. Previously, administrators needed to log into the sub-organization
to import and export assets.

Transformation palette
When you create a mapping, you can now view only the transformations that your organization is licensed to
use, or you can view all transformations that are available in Data Integration. If a transformation that you
need is not licensed, you can contact Informatica Global Customer Support to get the license.

Use the licensed transformations icon at the bottom of the transformation palette to view either the licensed
transformations or all transformations.

The following image shows the licensed transformations icon:

Previously, when you created a mapping, the transformation palette displayed all transformations that were
available in Data Integration.

For more information about viewing licensed transformations, see Transformations.

Taskflows
If you add a Data Task step, the Data Task fields appear on the Temp Fields tab of the Start step.

Previously, if you added a Data Task step, the Data Task fields appeared on the Input Fields tab of the Data
Task step.

If you included a Data Task step in a taskflow, after upgrade, the Data Task fields appear on the Temp Fields
tab of the Start step. The Data Task fields represent the input parameters of the task.

For more information about Data Task fields in taskflows, see Taskflows.

Administrator
The Administrator service includes the following behavior changes.

License based roles


The roles that you can assign to users and groups now vary based on your organization's licenses. Enabling
and disabling roles based on your licenses helps to prevent administrators from assigning incorrect roles to
users and groups.

For example, if your organization has no access to Application Integration or API Manager, you cannot assign
the Application Integration Business Manager, Application Integration Data Viewer, Deployer, or Operator role



to any user or group in your organization. These roles appear disabled on the Users, User Groups, and User
Roles pages.

Previously, administrators could assign any role to a user, regardless of the organization's licenses.

For more information about user roles, see Administrator in the Administrator help.

Renamed roles
Some service-specific roles are renamed to clarify which services the roles apply to.

The following roles are renamed:

New role name                               Previous role name

Application Integration Business Manager    Business Manager

Application Integration Data Viewer         Data Viewer

Data Integration Data Previewer             Data Preview

For more information about user roles, see Administrator in the Administrator help.

Secure Agent registration


You now register a Secure Agent using your Informatica Intelligent Cloud Services user name and a security
token. Registering the agent using a token improves security because the token does not store sensitive
information such as user passwords.

The following image shows the Generate Token option on the Runtime Environments page:

Previously, you registered a Secure Agent using your Informatica Intelligent Cloud Services user name and
password.

For more information about downloading, installing, and registering a Secure Agent, see Getting Started in the
Data Integration help or Administrator in the Administrator help.

Connectors
The Winter 2019 March release includes the following new connectors and enhanced connectors.

New connectors
This release includes the following new connectors.

Advanced FTP V2 Connector


Advanced FTP V2 Connector enables you to securely transfer files to and from FTP servers. Advanced FTP
V2 Connector can be used only in mass ingestion tasks. Create an Advanced FTP V2 connection and use
the connection as a source or as a target in mass ingestion tasks. When you use an Advanced FTP V2 source
in a mass ingestion task, you can schedule the task to receive notifications from a file listener.

Advanced FTPS V2 Connector


Advanced FTPS V2 Connector enables you to securely transfer files to and from FTPS servers. Advanced
FTPS V2 Connector can be used only in mass ingestion tasks. Create an Advanced FTPS V2 connection
and use the connection as a source or as a target in mass ingestion tasks. When you use an Advanced FTPS
V2 source in a mass ingestion task, you can schedule the task to receive notifications from a file listener.

Advanced SFTP V2 Connector


Advanced SFTP V2 Connector enables you to securely transfer files to and from SFTP servers. Advanced
SFTP V2 Connector can be used only in mass ingestion tasks. Create an Advanced SFTP V2 connection
and use the connection as a source or as a target in mass ingestion tasks. When you use an Advanced SFTP
V2 source in a mass ingestion task, you can schedule the task to receive notifications from a file listener.

Db2 for i CDC Connector


You can use Db2 for i CDC Connector to connect to a PowerExchange CDC for Db2 environment from Data
Integration. Use Db2 for i CDC Connector to retrieve metadata for Db2 source tables on an IBM i system and
to extract change records that PowerExchange captured for these tables. Add Db2 for i sources in mappings,
and then run the associated mapping tasks to transmit change records to a Microsoft SQL Server or an
Oracle target.

Db2 for LUW CDC Connector


You can use Db2 for LUW CDC Connector to connect to a PowerExchange CDC for Db2 environment from
Data Integration. Use Db2 for LUW CDC Connector to retrieve metadata for Db2 source tables on Linux, UNIX,
or Windows and to extract change records that PowerExchange captured for these tables. Add Db2 for LUW
sources in mappings, and then run the associated mapping tasks to transmit change records to a Microsoft
SQL Server or an Oracle target.

Db2 Warehouse on Cloud Connector


You can use Db2 Warehouse on Cloud Connector to connect securely to IBM Db2 Warehouse on Cloud from
Data Integration. Use Db2 Warehouse on Cloud Connector to read data from or write data to IBM Db2
Warehouse on Cloud. Db2 Warehouse on Cloud sources and targets in mappings represent tables in IBM Db2
Warehouse on Cloud. You can create a mapping task to process data based on the data flow logic defined in
a Db2 Warehouse on Cloud mapping.

Google BigQuery V2 Connector


You can use Google BigQuery V2 Connector to connect to Google BigQuery from Data Integration. Use
Google BigQuery V2 Connector to read data from or write data to Google BigQuery. You can create a Google
BigQuery V2 connection and use the connection in mass ingestion tasks, mapping tasks, and mappings.
Create a mass ingestion task to transfer files from a Google Cloud Storage source to a Google BigQuery
target.

Hadoop Files V2 Connector


You can use Hadoop Files V2 Connector to connect to HDFS (Hadoop Distributed File System) from Data
Integration. Use Hadoop Files V2 Connector to securely read data from and write data to HDFS. You can read
or write structured, semi-structured, and unstructured data.



You can create a Hadoop Files V2 connection and use the connection in mass ingestion tasks, mappings, or
mapping tasks. Create a mass ingestion task to transfer files from a Hadoop Files source to any target that
mass ingestion tasks support, or to transfer files from any source that mass ingestion tasks support to a
Hadoop Files target. Create a mapping task to process data based on the data flow logic defined in a
mapping or integration template.

Greenplum Connector
You can use Greenplum Connector to connect to Greenplum from Data Integration. Use Greenplum
Connector to read data from or write data to Greenplum. You can create a mapping task to process data
based on the data flow logic defined in a Greenplum mapping.

When you run a Greenplum mapping to read data from Greenplum, the Secure Agent invokes the Greenplum
database parallel file server, gpfdist, to read data.

When you run a Greenplum mapping task to write data to Greenplum, the Secure Agent creates a control file
to provide load specifications to the Greenplum gpload bulk loading utility, invokes the gpload utility, and
writes data to a named pipe. The gpload utility launches gpfdist, Greenplum's file distribution program, which
reads data from the named pipe and loads the data into the Greenplum target.

Microsoft Azure Data Lake Store Gen2 Connector


You can use Microsoft Azure Data Lake Store Gen2 Connector to connect to Microsoft Azure Data Lake Store
Gen2 from Data Integration. Use Microsoft Azure Data Lake Store Gen2 Connector to read data from or write
data to Microsoft Azure Data Lake Store Gen2. You can create a mapping task to process data based on the
data flow logic defined in a Microsoft Azure Data Lake Store Gen2 mapping.

Enhanced connectors
This release includes enhancements to the following connectors.

Amazon Redshift V2 Connector


This release includes the following enhancements for Amazon Redshift V2 Connector:

• In addition to the existing regions, you can also read data from or write data to the following regions:
- China (Beijing)
- China (Ningxia)
• You can select a cluster region name in the Cluster Region connection property, even if you have specified
the cluster region name in the JDBC URL connection property.
• You can retain the null values when you read data from Amazon Redshift. To retain the null values, select
the Treat NULL Value as NULL option in the source advanced properties.
• When you create a target, you can view and edit the metadata of the target object on the Target Fields tab.
You can edit the data type and precision of the columns in the target object and define the primary key.
• You can specify the alias name generated by AWS Key Management Service (AWS KMS) in the Customer
Master Key ID connection property.
• In addition to the existing file formats, you can create external tables for the RCFile and SequenceFile file
formats to use Amazon Redshift Spectrum.

• You can use the following changed data enhancements when you write data to an Amazon Redshift
target:
Write changed data to Amazon Redshift target

You can create a mapping to capture changed data from a CDC source, and then run the associated
mapping tasks to write the changed data to an Amazon Redshift target.

For example, you can capture changed data from an Oracle CDC source and write the changed data
to an Amazon Redshift target table.

Capture Changed Data

You can resume the extraction of changed data from the point of interruption when a mapping task
fails or is stopped before completing the task. To resume processing changed data, set the Recovery
Strategy advanced session property to Resume from last checkpoint on the Schedule page when you
create or edit a mapping task.

Data Driven

To capture changed data, select Data Driven as the target operation.

Preserve Record Order on Write

You can retain the order of the records when you read data from a CDC source and write data to an
Amazon Redshift target.
Recovery Schema Name

You can specify the name of the schema that contains the recovery file to resume the extraction of
changed data from the point of interruption. You can add the recovery schema name in the Recovery
Schema Name target property.

Amazon S3 V2 Connector
This release includes the following enhancements for Amazon S3 V2 Connector:

• In addition to the existing regions, you can also read data from or write data to the following regions:
- China (Beijing)
- China (Ningxia)
• You can create a mapping to read or write an ORC file to Amazon S3.
• You can read data from or write data to files whose names are longer than 80 characters.
• You can encrypt an Avro or Parquet file using Server-Side Encryption as the encryption type when you
write the file to an Amazon S3 target.
• In addition to the existing compression formats, you can compress data in the following formats when
you read data from or write data to Amazon S3:
- Deflate

- Snappy

- Zlib

Anaplan V2 Connector
You can configure the proxy server settings for the Secure Agent to connect to Anaplan V2.

Google Analytics Connector


You can read data from reports from more than one property and more than one view in Google Analytics.
When you configure a mapping to read data from Google Analytics, you can specify comma-separated
property IDs and view IDs in the Property ID and View ID advanced source properties, respectively.



Google Cloud Storage V2 Connector
You can use a Google Cloud Storage V2 source in a mass ingestion task to transfer files to any target that
mass ingestion tasks support.

Hive Connector
You can use the following distributions for a read or write operation in Hive Connector:

Kerberos Cluster                        Non-Kerberos Cluster

Cloudera 5.8 to Cloudera 5.13           Cloudera 5.8 to Cloudera 5.13

Hortonworks 2.5 and Hortonworks 2.6     Hortonworks 2.5 and Hortonworks 2.6

HDInsight 3.6                           HDInsight 3.6

Litmos Connector
You can create mappings to read the following objects:

• OrgCourses
• OrgLearningPaths
• OrgUserProgramResults
• OrgModules
• OrgModulesResult
• OrgLearningpathResults

Microsoft Azure Data Lake Store V3 Connector


You can read and write binary files using the Microsoft Azure Data Lake Store V3 connection.

Microsoft Azure SQL Data Warehouse V3 Connector


This release includes the following enhancements for Microsoft Azure SQL Data Warehouse V3 Connector:

• When you write data to Microsoft Azure SQL Data Warehouse, you can compress the files that are written
to the Blob staging area in the gzip format.
• You can use custom queries when you read a source object.
• You can override the source object and the source object schema in the source advanced properties. The
source object and the source object schema defined in the source advanced properties take precedence.
• You can override the target object and the target object schema in the target advanced properties. The
target object and the target object schema defined in the target advanced properties take precedence.
• You can use the MD5() pushdown function when you use an ODBC connection to connect to Microsoft
Azure SQL Data Warehouse.

Microsoft Sharepoint Online Connector


You can use Microsoft Sharepoint Online Connector on Linux.

ODBC Connector
This release includes the following enhancements for ODBC Connector:

• You can use the ODBC connection to connect to Google BigQuery. Specify the Google BigQuery subtype in
an ODBC connection to connect to Google BigQuery sources and targets.
• You can enable full or source pushdown optimization for Google BigQuery sources and targets. Specify
the Google BigQuery subtype in the ODBC connection properties to enable pushdown optimization when
you read data from or write data to Google BigQuery tables. The Secure Agent pushes the supported
functions to Google BigQuery sources and targets based on the pushdown optimization session property
value you specify.

Oracle Connector
This release includes the following enhancements for Oracle Connector:

• You can use an Oracle connection to connect to Oracle Database Cloud Service.
• You can configure additional properties for the Oracle connection to read metadata. Specify the
connection properties as semicolon-separated key-value pairs in the Metadata Advanced Connection
Properties field in the Oracle connection.
• You can configure additional properties for the Oracle connection to run mappings. Specify the
connection properties as semicolon-separated key-value pairs in the Runtime Advanced Connection
Properties field in the Oracle connection.

REST V2 Connector
This release includes the following enhancements for REST V2 Connector:

• You can upload files with or without file boundaries.
• You can configure the connection timeout as an advanced field in the connection properties.

Salesforce Connector
You can use version 44.0 of the Salesforce API to create a Salesforce connection and access Salesforce
objects.

Salesforce Marketing Cloud Connector


This release includes the following enhancements for Salesforce Marketing Cloud Connector:

• You can use the insert and update operations for a Non-Contact Linked Data Extension.
• You can configure the Salesforce Marketing Cloud connection properties to access data across all
business units in your Salesforce Marketing Cloud account.

SAP Table Reader Connector


You can use bulk mode to read large amounts of data from SAP tables and improve performance.

Tableau V3 Connector
The Secure Agent displays the projects and data sources that are present on Tableau Server or Tableau
Online when you do not specify the schema in the Schema File Path connection property.

Teradata Connector
You can configure Kerberos authentication for a Teradata connection. To configure Kerberos authentication,
select KRB5 as the authentication type. Specify the Kerberos artifacts directory where the krb5.conf and
IICSTPT.keytab files are located, and provide the Kerberos user name in the Teradata connection properties.



For more information about configuring Kerberos authentication for a Teradata connection, see the Cloud
Data Integration Teradata Connector Guide.

Changed behavior
This release includes the following behavior changes in connectors.

Advanced FTP Connector (Deprecated)


Effective in the Winter 2019 March release, Advanced FTP Connector is deprecated. Informatica might drop
support for Advanced FTP Connector in a future release.

If you have mass ingestion tasks that use an Advanced FTP connection as a source, a target, or both,
Informatica recommends that you replace the Advanced FTP connections with Advanced FTP V2
connections.

Advanced FTPS Connector (Deprecated)


Effective in the Winter 2019 March release, Advanced FTPS Connector is deprecated. Informatica might drop
support for Advanced FTPS Connector in a future release.

If you have mass ingestion tasks that use an Advanced FTPS connection as a source, a target, or both,
Informatica recommends that you replace the Advanced FTPS connections with Advanced FTPS V2
connections.

Advanced SFTP Connector (Deprecated)


Effective in the Winter 2019 March release, Advanced SFTP Connector is deprecated. Informatica might drop
support for Advanced SFTP Connector in a future release.

If you have mass ingestion tasks that use an Advanced SFTP connection as a source, a target, or both,
Informatica recommends that you replace the Advanced SFTP connections with Advanced SFTP V2
connections.

Tableau V2 Connector
This release includes the following changes in Tableau V2 Connector:

• You must install Red Hat Enterprise Linux version 7 or higher for Secure Agents that run on Linux
operating systems to run the mappings successfully.
Previously, Red Hat Enterprise Linux version 6 or higher was required for Secure Agents installed on Linux
operating systems.

MySQL Connector
MySQL Connector uses the MySQL JDBC and ODBC drivers, version 8.0.12.

Previously, MySQL Connector used the MySQL JDBC driver version 5.1.40 and ODBC driver version 5.3.7.

PostgreSQL Connector
Support for the SSLv2 protocol has been removed from PostgreSQL Connector. The SSLv2 option no longer
appears in the Crypto Protocol Versions list in the PostgreSQL connection properties.

Snowflake Connector
To address timeout issues when you use a custom SQL query as a source in a mapping, the Secure Agent
now fetches the metadata using separate metadata calls.

Previously, the Secure Agent ran the SQL query for a few records to obtain the metadata, and long-running
queries could cause a timeout.

Tableau V2 Connector
This release includes the following changes in Tableau V2 Connector:

• You can add the licenses for both Tableau V2 Connector and Tableau V3 Connector in the same
organization.
Previously, you could not retain the licenses for Tableau V2 Connector and Tableau V3 Connector in the
same organization. You had to create a sub-organization and add the Tableau V3 Connector license to it
in order to retain the Tableau V2 Connector license.

Tableau V3 Connector
This release includes the following changes in Tableau V3 Connector:

• You can add the licenses for both Tableau V2 Connector and Tableau V3 Connector in the same
organization.
Previously, you could not retain the licenses for Tableau V2 Connector and Tableau V3 Connector in the
same organization. You had to create a sub-organization and add the Tableau V3 Connector license to it
in order to retain the Tableau V2 Connector license.

Enhancements in previous releases


You can find information on enhancements and changed behavior in previous Data Integration releases on
Informatica Network.

What's New guides for releases occurring within the last year of the current release are included in the
following community article: https://network.informatica.com/docs/DOC-17912



Chapter 3

Upgrading to Winter 2019 April


This section includes information about tasks that you might need to perform before or after an upgrade
from Data Integration Summer 2018 to Winter 2019 April.

Preparing for the upgrade


The Secure Agent upgrades the first time that you access the Informatica Intelligent Cloud Services Winter
2019 April release.

Files that you added to the following directory are preserved after the upgrade:

<Secure Agent installation directory>/apps/Data_Integration_Server/ext/deploy_to_main/bin/rdtm-extra

Perform the following steps to ensure that the Secure Agent is ready for the upgrade:

1. Ensure that each Secure Agent machine has sufficient disk space available for upgrade. To calculate the
free space required for upgrade, use the following formula:
Minimum required free space = 3 * (size of current Secure Agent installation directory -
space used for logs directory) + 1 GB
For example, if the Secure Agent installation directory uses 10 GB, of which the logs directory uses 2 GB,
the minimum required free space is 3 * (10 GB - 2 GB) + 1 GB = 25 GB.
2. Ensure that no tasks run during the maintenance window. If you use Informatica Intelligent Cloud
Services to schedule tasks, you can configure a blackout period for the organization.
To configure a blackout period, in Administrator, select Schedules, and then click Blackout Period.
3. Close all applications and open files to avoid file lock issues, for example:
• Windows Explorer
• Notepad
• Windows Command Processor (cmd.exe)

Microsoft Azure Data Lake Store V3 Connector pre-upgrade tasks


Before you upgrade to Winter 2019, edit the existing mappings that contain .csv files in the source or target
objects to update the file format.

1. Open the existing mapping or mapping task that has .csv files in the source or the target object.

2. Select the source or target as applicable.

3. Click the Formatting Option and select the file format as Flat.

Note: If there are multiple Microsoft Azure Data Lake Store sources or targets in the mapping, repeat step 3
for each source and target.

4. Save the mapping.

5. Optionally, verify that Data Preview is successful for the edited sources and targets.

Note: You can perform this task even after upgrading to Winter 2019, but before you re-run the existing
mappings.

MySQL Connector pre-upgrade tasks


Before you upgrade to Winter 2019, verify that you have version 8.0.12 of the MySQL JDBC and ODBC drivers
on the Secure Agent machine.

Existing mappings from earlier versions fail if you do not use version 8.0.12 of the MySQL JDBC and ODBC
drivers.

After you upgrade


Perform the following tasks after you upgrade to the Winter 2019 April release.

Amazon Redshift V2 Connector post-upgrade tasks


After you upgrade, in existing mappings in which you created the target using the Create Target option, you
cannot edit the metadata of the target object on the Target Fields tab.

Tableau V2 Connector post-upgrade tasks


After you upgrade, you must perform the following tasks for Tableau V2 Connector to run the mappings
successfully:

• Install Red Hat Enterprise Linux version 7 or higher for the Secure Agents installed on Linux operating
systems.
• Assign the read, write, and execute permissions to the third-party libraries manually. To assign
permissions, perform the following steps:
1. From the command prompt, go to the following directory:
<INFA_AGENT_INSTALLED_LOCATION>/downloads/package-tableauV2.<latest_version>/package/
rdtm
2. Enter the following command:
chmod 777 *

Tableau V3 Connector post-upgrade tasks


After you upgrade, you must assign the read, write, and execute permissions to the third-party libraries
manually to run the mappings successfully:

To assign permissions, perform the following steps:

1. From the command prompt, go to the following directory:


<INFA_AGENT_INSTALLED_LOCATION>/downloads/package-TableauV3.5/package/tableauv3/libs



2. Enter the following command:
chmod 777 *

Taskflows post-upgrade tasks


After you upgrade, when you open an existing taskflow or import a taskflow that was exported before the
Winter 2019 March release, a message appears stating that the taskflow definition has been upgraded and
that you must verify the taskflow for possible upgrade errors.

You must verify the taskflow for possible upgrade errors and manually save the taskflow. Otherwise, the
upgrade message appears each time you open the taskflow and you cannot run or publish the taskflow.

Scheduled taskflows will continue to run as is. However, if you open and edit a scheduled taskflow, you must
verify the scheduled taskflow for possible upgrade errors and manually save the taskflow. Otherwise, the
upgrade message appears each time you open the scheduled taskflow.


