FAQ: What are the steps to replace data warehouse objects with Snowflake objects in PowerCenter mappings and sessions in version 10.2 HotFix 1?
Answer

This article assumes that you are using PowerCenter 10.2 HotFix 1 and that you want to replace a data warehouse table definition with an almost identical table definition from Snowflake. If the Snowflake table definition differs from the existing data warehouse table definition, you must modify the mappings to address those differences.


Before you replace the data warehouse objects, such as Oracle, Netezza, or Teradata objects, with Snowflake objects, you must complete the prerequisite tasks. You can then replace the source definitions, target definitions, and transformations to build Snowflake mappings in the PowerCenter Designer. In the PowerCenter Workflow Manager, update the session to include the configured Snowflake objects and the appropriate Snowflake connection.


Prerequisites

 

Complete the following prerequisites: 

  • Back up the PowerCenter repository so that you retain a copy of the current state of the data warehouse objects for future requirements.
  • Identify a few data warehouse mappings, for example 5 to 10, that represent your data movement patterns. Include at least one mapping that contains a source, a target, and a lookup table that you plan to replace with Snowflake objects, with or without a query override, as applicable.
  • If you want to configure an SQL query, you must apply EBF-11911 on PowerCenter 10.2 HotFix 1. For information, contact Informatica Global Customer Support.


Steps for Replacing Data Warehouse Objects with Snowflake Objects

Perform the tasks on one sample mapping first. After you validate the changes, you can perform the tasks on the rest of the mappings.

 

In the PowerCenter Designer, perform the following tasks:

1. Import the source, target, and lookup tables from Snowflake. Use the following guidelines when you import the objects:

    • Import the Snowflake objects from the same database and schema that you plan to run the mappings against.
    • Use the appropriate connection to import the Snowflake schema.
      If you want to configure pushdown optimization for a Snowflake mapping, use the ODBC connection to import the Snowflake tables and run the mapping.
      To use an ODBC connection to connect to Snowflake, you must create an ODBC data source name (DSN) for Snowflake. A sample DSN entry is shown after these guidelines.
      For instructions about creating an ODBC connection and configuring pushdown optimization, see the PowerExchange for Snowflake User Guide 10.2 HotFix 1 at the following website: https://kb.informatica.com/proddocs/Product%20Documentation/6/PWX_102HF1_SnowflakeUserGuideForPowerCenter_en.pdf
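
The following is a minimal sketch of what a Snowflake DSN entry might look like in odbc.ini on Linux. The DSN name, driver path, account URL, and object names are placeholders, and the exact driver location and available parameters depend on the Snowflake ODBC driver version that you install, so treat this as an illustrative example rather than a definitive configuration. On Windows, you create the DSN through the ODBC Data Source Administrator instead.

    # Hypothetical odbc.ini entry for a Snowflake DSN (Linux)
    # All values are placeholders; adjust the driver path, account, and object names for your environment
    [PC_Snowflake_DSN]
    Driver      = /usr/lib64/snowflake/odbc/lib/libSnowflake.so
    Description = Snowflake DSN for PowerCenter pushdown optimization
    Server      = <account_name>.snowflakecomputing.com
    Database    = ANALYTICS_DB
    Schema      = PUBLIC
    Warehouse   = PC_WH
    Role        = PC_ROLE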
                                                                                                           ​            
2. Open the existing Source Qualifier transformation in the data warehouse mapping and record the configured properties, such as the SQL query override and filter conditions, that exist in the transformation.

The following image shows an example of a custom SQL query used for a Snowflake source:
SQl-Query.PNG

 
3. If the Source Qualifier includes a query override, import a source definition from Snowflake by using a query similar to the one used in the query override, as in the example that follows.
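
For illustration only, the following sketch shows a hypothetical query override from an existing Oracle mapping and the similar query that you might use when you import the Snowflake source definition. The database, schema, table, and column names are placeholders; adapt the SQL to your own override and to the Snowflake objects that the mapping will run against.

    -- Hypothetical query override in the existing data warehouse (Oracle) Source Qualifier
    SELECT C.CUSTOMER_ID,
           C.CUSTOMER_NAME,
           O.ORDER_DATE,
           O.ORDER_AMOUNT
    FROM   DW_SCHEMA.CUSTOMERS C
           INNER JOIN DW_SCHEMA.ORDERS O
                   ON C.CUSTOMER_ID = O.CUSTOMER_ID
    WHERE  O.ORDER_DATE >= TO_DATE('2018-01-01', 'YYYY-MM-DD')

    -- Similar query used to import the Snowflake source definition, pointing at the
    -- Snowflake database and schema that the mapping will run against
    SELECT C.CUSTOMER_ID,
           C.CUSTOMER_NAME,
           O.ORDER_DATE,
           O.ORDER_AMOUNT
    FROM   ANALYTICS_DB.PUBLIC.CUSTOMERS C
           INNER JOIN ANALYTICS_DB.PUBLIC.ORDERS O
                   ON C.CUSTOMER_ID = O.CUSTOMER_ID
    WHERE  O.ORDER_DATE >= TO_DATE('2018-01-01', 'YYYY-MM-DD')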

4. Remove the outgoing links from the existing Source Qualifier, drag the corresponding Snowflake source definition into the mapping, and connect the ports from the new Source Qualifier (Application Source Qualifier). The Snowflake source can be a table, a view, or a query.
 
5. Delete the old source definition and the old Source Qualifier from the mapping canvas. 
Note: The existing Source Qualifier will not work with Snowflake. 

6. Remove the existing incoming links to the target definition, drag the corresponding Snowflake target definition into the mapping, and link the ports.

7. Delete the old target definition from the mapping canvas. 

8. For each Lookup transformation that uses a data warehouse lookup table, drag the corresponding Snowflake table into the mapping and configure a partial pipeline.

9. Configure the Lookup transformation to use the partial pipeline instead of the original data warehouse source. If the data types of some ports do not match the corresponding fields in the Snowflake object, re-create the Lookup transformation by using the pipeline source so that the field metadata is refreshed and is consistent with the Snowflake object. For more information, see Example of Using Pipeline Lookups.

10. Validate and save the mapping.

 

In the PowerCenter Workflow Manager, perform the following tasks based on your requirement:  

  • Configure a Snowflake connection. 

  • Configure the Snowflake connection for the Snowflake source, lookup, and target, as applicable, in an existing or a new session. 

  • If you plan to use pushdown optimization, configure an ODBC connection to connect to Snowflake.

  • To run a session configured with pushdown optimization, configure an ODBC connection for the Snowflake source, lookup, and target, as applicable.  

 After you validate the changes for the sample mapping, repeat the steps for the rest of the mappings that you want to modify.



Example of Using Pipeline Lookups

To perform a lookup on a Snowflake table, you must configure a pipeline lookup that includes a partial pipeline for the lookup table.


The following screenshot shows an example from a data warehouse mapping that uses a direct lookup:

Lookup_1.PNG

The following screenshot shows the same mapping, which is modified to include a pipeline Lookup transformation for Snowflake:

Lookup_2.PNG

More Information

Applies To
Product: PowerCenter; PowerExchange
Product Version: PowerCenter 10.2 HotFix 1; PowerExchange Adapters 10.2

Last Modified Date: 1/21/2019 8:44 PM
ID: 547962