The article assumes that you are using PowerCenter 10.2 Hotfix 1 and you want to replace a data warehouse table definition with an almost identical table definition from Snowflake. If the Snowflake table definition is different from the existing data warehouse table definition, you need to modify the mappings to address those changes.
Before you replace the data warehouse objects, such as Oracle, Netezza, or Teradata objects, with Snowflake objects, you must complete the prerequisite tasks. You can then replace the source definitions, target definitions, and transformations to build Snowflake mappings in the PowerCenter Designer. Update the session to include the configured Snowflake objects and the appropriate Snowflake connection in the PowerCenter Workflow Manager.
Perform the tasks on one sample mapping first. After you validate the changes, you can perform the tasks on the rest of the mappings.
1. Import the source, target, and lookup tables from Snowflake. Use the following guidelines when you import the objects:
If you want to configure pushdown optimization for a Snowflake mapping, use the ODBC connection to import the Snowflake tables and run the mapping.
To use an ODBC connection to connect to Snowflake, you must create an ODBC data source name (DSN) for Snowflake.
For instructions about creating an ODBC connection and configuring pushdown optimization, see the PowerExchange for Snowflake User Guide 10.2 HotFix 1 at the following website: https://kb.informatica.com/proddocs/Product%20Documentation/6/PWX_102HF1_SnowflakeUserGuideForPowerCenter_en.pdf
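For example, on Linux you can define the Snowflake DSN in the odbc.ini file used by your ODBC driver manager. The following entry is a sketch only; the DSN name, driver path, account URL, and object names are illustrative and depend on your Snowflake account and driver installation:

```ini
[snowflake_dsn]
; Path to the Snowflake ODBC driver library (installation-specific)
Driver    = /usr/lib64/snowflake/odbc/lib/libSnowflake.so
; Your Snowflake account URL
Server    = myaccount.snowflakecomputing.com
UID       = pc_user
Database  = ANALYTICS_DB
Schema    = PUBLIC
Warehouse = PC_WH
Role      = SYSADMIN
```

You then reference the DSN name, such as snowflake_dsn, when you create the ODBC connection in the Workflow Manager.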
2. Open the existing Source Qualifier transformation in the data warehouse mapping and save the configured properties, such as the SQL query and filter conditions, that exist in the transformation.
The following image shows an example of a custom SQL query used for a Snowflake source:
3. If the source qualifier includes a query override, import a source definition from Snowflake using a query similar to the query override.
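For example, if the original source qualifier used a query override such as the following, import the Snowflake source definition with an equivalent query. The database, table, and column names here are hypothetical:

```sql
SELECT ORDER_ID, CUSTOMER_ID, ORDER_AMOUNT
FROM SALES.PUBLIC.ORDERS
WHERE ORDER_STATUS = 'SHIPPED'
```

Note that Snowflake uses fully qualified names in the database.schema.table format, so adjust the table references from the original data warehouse query accordingly.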
4. Remove the outgoing links from the Source Qualifier, drag the corresponding Snowflake source into the mapping, and connect the ports from the new Source Qualifier (Application Source Qualifier). The source can be a table, view, or query from Snowflake.
5. Delete the old source definition and the old Source Qualifier from the mapping canvas.
Note: The existing Source Qualifier will not work with Snowflake.
6. Remove existing incoming links into the target definition, drag the corresponding target definition from Snowflake, and link the ports.
7. Delete the old target definition from the mapping canvas.
8. For each Lookup transformation that uses a data warehouse lookup table, drag the corresponding Snowflake table into the mapping and configure a partial pipeline.
9. Configure the Lookup transformation to use the partial pipeline instead of the original source from the existing data warehouse. If the data types of some ports do not match the corresponding fields in the Snowflake object, re-create the lookup using the pipeline source so that the field metadata is refreshed and consistent with the Snowflake object. For more information, see the section Example of Using Pipeline Lookups.
10. Validate and save the mapping.
Configure a Snowflake connection.
Configure the Snowflake connection for the Snowflake source, lookup, and target, as applicable, in an existing or a new session.
If you plan to run the session with pushdown optimization, configure an ODBC connection for the Snowflake source, lookup, and target, as applicable.
After you validate the changes for the sample mapping, repeat the steps for the rest of the mappings that you want to modify.
To configure a lookup for Snowflake tables, you must configure a pipeline lookup that includes a partial pipeline for a lookup table.
The following screenshot shows an example from a data warehouse mapping that uses a direct lookup: