FAQ: Queries regarding the Hadoop, Hive, and Hadoop Files connectors in IICS
Answer

Is it possible to read directly from a file placed in HDFS in IICS?

You can use the Hadoop Files Connector to securely read data from and write data to complex files on the local file system or in HDFS. See the connector guide:
https://network.informatica.com/onlinehelp/IICS/prod/CDI/en/index.htm#page/cloud-data-integration-hadoop-files,-connector-guide/Hadoop_Files_Connector_Overview.html
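
For context, the read path the connector wraps is the standard Hadoop FileSystem client. The following is a minimal, hypothetical sketch in plain Java, not IICS code: the NameNode URI and file path are placeholders, and it assumes the Hadoop client libraries are on the classpath.

    import java.io.BufferedReader;
    import java.io.InputStreamReader;
    import java.nio.charset.StandardCharsets;

    import org.apache.hadoop.conf.Configuration;
    import org.apache.hadoop.fs.FileSystem;
    import org.apache.hadoop.fs.Path;

    public class HdfsReadSketch {
        public static void main(String[] args) throws Exception {
            Configuration conf = new Configuration();
            // Placeholder NameNode URI; use your cluster's fs.defaultFS value.
            conf.set("fs.defaultFS", "hdfs://namenode:8020");

            // Open a file in HDFS and stream it line by line.
            try (FileSystem fs = FileSystem.get(conf);
                 BufferedReader reader = new BufferedReader(new InputStreamReader(
                         fs.open(new Path("/data/sample.json")), // placeholder path
                         StandardCharsets.UTF_8))) {
                String line;
                while ((line = reader.readLine()) != null) {
                    System.out.println(line);
                }
            }
        }
    }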

Is it possible to write to Hadoop (Hive tables) with a non-Kerberos setup?

On a non-Kerberos cluster, you can perform the read operation even if the Secure Agent is installed outside the cluster. To perform the write operation successfully, you must install the Secure Agent on one of the nodes of the Hadoop cluster.

Is the Hadoop connector only for reading from Hive and Impala tables?

Yes. Currently, the Hadoop connector is certified only to read from Hive and Impala tables.

What types of files can be read or written using the Hadoop Files Connector?

You can read or write structured, semi-structured, and unstructured data using the Hadoop Files Connector. For example, you can read files in PDF, JSON, and JPG formats.

Can the Hive Connector be used in a non-Kerberos environment?

You can use the Hive Connector on both Kerberos and non-Kerberos clusters. You can read data from and write data to partitioned and bucketed tables in Hive.
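
To make "partitioned and bucketed" concrete, the sketch below creates and writes to such a table over the standard Hive JDBC driver. This is plain Hive JDBC rather than the IICS Hive Connector: the host, port, credentials, and table are placeholders, the hive-jdbc driver is assumed to be on the classpath, and on a Kerberos cluster the JDBC URL would additionally carry a principal (for example ";principal=hive/_HOST@EXAMPLE.COM").

    import java.sql.Connection;
    import java.sql.DriverManager;
    import java.sql.ResultSet;
    import java.sql.Statement;

    public class HivePartitionedTableSketch {
        public static void main(String[] args) throws Exception {
            // Placeholder HiveServer2 URL; adjust host, port, and database.
            String url = "jdbc:hive2://hiveserver2-host:10000/default";

            try (Connection conn = DriverManager.getConnection(url, "user", "");
                 Statement stmt = conn.createStatement()) {
                // A Hive table partitioned by country and bucketed by id.
                stmt.execute("CREATE TABLE IF NOT EXISTS sales (id INT, amount DOUBLE) "
                        + "PARTITIONED BY (country STRING) "
                        + "CLUSTERED BY (id) INTO 4 BUCKETS "
                        + "STORED AS ORC");
                // Write a row into one partition, then read it back.
                stmt.execute("INSERT INTO sales PARTITION (country='US') VALUES (1, 9.99)");
                try (ResultSet rs = stmt.executeQuery(
                        "SELECT id, amount FROM sales WHERE country = 'US'")) {
                    while (rs.next()) {
                        System.out.println(rs.getInt(1) + " " + rs.getDouble(2));
                    }
                }
            }
        }
    }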

More Information

Applies To
Product: Cloud Data Integration
Problem Type: Configuration
User Type: Architect
Project Phase: Configure

Last Modified Date: 11/11/2019 5:49 AM
ID: 569782