
ERROR: "[AUTOINST_1005] Internal error. Cannot find the archive configuration file TransparentInstallConfig.xml" while running Profiling jobs from Informatica EDC
Problem Description

While running profiling jobs from Informatica 'Enterprise Data Catalog' (EDC), execution fails. In the profile run logs, an error message similar to the following can be observed:

Profiling Job Log Trace

2020-05-06 09:38:04.633 <pool-18-thread-2> INFO: [IDP_1031] Number of parallel sessions = 1
2020-05-06 09:38:04.633 <pool-18-thread-2> INFO: [IDP_1120] Start time for profile Profile_customer_sales = 1588772284633
2020-05-06 09:38:04.635 <pool-18-thread-2> INFO: ---------------------------------
2020-05-06 09:38:04.635 <pool-18-thread-2> INFO: [Profile_customer_sales] Submitted (1) Waiting (0) Completed (0)
2020-05-06 09:38:04.635 <pool-18-thread-2> INFO: ---------------------------------
2020-05-06 09:38:04.916 <profile-mapping-thread-2> INFO: Time taken for generating actual mapping Profile_customer_sales_52681986757291326 for task Profile_customer_sales = 0.281(s)
2020-05-06 09:38:06.366 <profile-mapping-thread-2> INFO: [DS_10000] Fetching value for key: [INFA_HADOOP_DIST_DIR_SUFFIX].
2020-05-06 09:38:06.367 <profile-mapping-thread-2> INFO: [DS_10000] Fetching value for key: [INFA_HADOOP_DIST_DIR_SUFFIX].
2020-05-06 09:38:06.367 <profile-mapping-thread-2> INFO: [DS_10000] Putting key and value: [INFA_HADOOP_DIST_DIR_SUFFIX:HDP_3.1].
2020-05-06 09:38:07.385 <profile-mapping-thread-2> INFO: [HADOOP_API_0008] The Integration Service found Informatica Hadoop distribution directory [/data/informatica/1040/services/shared/hadoop/HDP_3.1/lib and /conf] for Hadoop class loader.
2020-05-06 09:38:07.386 <profile-mapping-thread-2> INFO: [HADOOP_API_0008] The Integration Service found Informatica Hadoop distribution directory [/data/informatica/1040/services/shared/hadoop/HDP_3.1/infaLib] for Hadoop class loader.
2020-05-06 09:38:07.386 <profile-mapping-thread-2> INFO: [HADOOP_API_0005] The Integration Service created a Hadoop class loader [/data/informatica/1040/services/shared/hadoop/HDP_3.1].
2020-05-06 09:38:09.082 <profile-mapping-thread-2> WARNING: [CORE_0003] An internal exception occurred with message: java.lang.RuntimeException: java.lang.RuntimeException: [AUTOINST_1005] Internal error. Cannot find the archive configuration file [/data/informatica/1040/services/shared/hadoop/HDP_3.1//services/shared/hadoop/TransparentInstallConfig.xml]. Contact Informatica Global Customer Support.
    at com.informatica.platform.dtm.autoinstaller.AutoInstaller.populatePatternLists(AutoInstaller.java:1581)
    at com.informatica.platform.dtm.autoinstaller.AutoInstaller.<init>(AutoInstaller.java:416)
    at com.informatica.platform.ldtm.util.autoinstaller.AutoInstallerUtils.syncBinaries(AutoInstallerUtils.java:353)
    at com.informatica.platform.ldtm.util.autoinstaller.AutoInstallerUtils.autoInstall(AutoInstallerUtils.java:152)
    at com.informatica.platform.ldtm.impl.TransformationMachineImpl.autoInstall(TransformationMachineImpl.java:1698)
    at com.informatica.platform.ldtm.impl.TransformationMachineImpl.engineSpecificInit(TransformationMachineImpl.java:1669)
    at com.informatica.platform.ldtm.impl.TransformationMachineImpl.<init>(TransformationMachineImpl.java:881)
    at com.informatica.platform.ldtm.TransformationMachineFactoryImpl.createInstance(TransformationMachineFactoryImpl.java:506)
    at com.informatica.ds.server.impl.TransformationMachineDISImpl.getLocalTxMachineInfo(TransformationMachineDISImpl.java:300)
    at com.informatica.ds.server.impl.TransformationMachineDISImpl.internalSubmitOperation(TransformationMachineDISImpl.java:782)
    at com.informatica.ds.server.impl.TransformationMachineDISImpl.internalSubmitOperation(TransformationMachineDISImpl.java:576)
    at com.informatica.ds.server.impl.TransformationMachineDISImpl.submitOperation(TransformationMachineDISImpl.java:416)
    at com.informatica.profiling.services.apiimpl.workflow.MappingExecutor.executeBlazeMappinginLDTM(MappingExecutor.java:517)
    at com.informatica.profiling.services.apiimpl.workflow.MappingExecutor.call(MappingExecutor.java:285)
    at com.informatica.profiling.services.apiimpl.workflow.MappingExecutor.call(MappingExecutor.java:1)
    at java.util.concurrent.FutureTask.run(FutureTask.java:266)
    at com.informatica.profiling.task.scheduler.TaskRunnable.run(TaskRunnable.java:70)
    at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
    at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
    at java.lang.Thread.run(Thread.java:748)

Cause

This issue occurs when the 'Custom Hadoop OS Path' attribute is configured in the 'Execution Options' section of the 'Data Integration Service' (DIS) used for the profiling job, and the configured location is not valid.

 

When the Informatica Domain and the Hadoop cluster server run on different OS platforms, the 'Custom Hadoop OS Path' location must be configured in the DIS. For instance, if the Informatica Domain runs on Red Hat Linux and the Hadoop cluster runs on SUSE Linux, this configuration is required.
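The doubled path in the error above suggests that the engine appends the distribution-relative path `services/shared/hadoop/TransparentInstallConfig.xml` to whatever directory 'Custom Hadoop OS Path' points at. A minimal shell sketch to check whether a configured value actually contains that file (the example path is taken from this article's log and is illustrative only, not a universal default):

```shell
# Hedged sketch: verifies that a candidate 'Custom Hadoop OS Path' value
# contains the archive configuration file the engine resolves beneath it.
check_custom_os_path() {
    custom_path="$1"
    cfg="$custom_path/services/shared/hadoop/TransparentInstallConfig.xml"
    if [ -f "$cfg" ]; then
        echo "OK: found $cfg"
        return 0
    else
        # An invalid 'Custom Hadoop OS Path' produces AUTOINST_1005 at run time.
        echo "MISSING: $cfg"
        return 1
    fi
}

# Example (replace with the value configured in the DIS 'Execution Options'):
# check_custom_os_path "/data/informatica/1040/services/shared/hadoop/HDP_3.1"
```

If the check reports the file as missing, either the configured directory is wrong or (as in the scenario below) the attribute should not be set at all.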

Solution

In this failure scenario, both the Informatica Domain server and the Hadoop cluster machines were running the same OS platform (Red Hat Linux), so the value of the 'Custom Hadoop OS Path' attribute in the DIS was removed and the service was recycled.


infa_dei_custom_hadoop_os_path.png 



Following this change, newly scheduled profiling jobs completed successfully.


More Information
When the Informatica Domain and the Hadoop cluster server are on different OS platforms, the 'Custom Hadoop OS Path' location must be configured.

For instance, when the Hadoop cluster runs on SUSE Linux and the Informatica Domain runs on RHEL, the SUSE Linux DEI binaries must be installed at a specific location on the RHEL Informatica Domain machine. After the DEI binaries are downloaded and extracted into a folder on the DIS machine, the 'Custom Hadoop OS Path' attribute under the 'Execution Options' section of the DIS should refer to the directory where the SUSE Linux DEI binaries were extracted.

 

The location specified as 'Custom Hadoop OS Path' should contain the following folders:

 

  • [Custom_Hadoop_OS_Path]/connectors
  • [Custom_Hadoop_OS_Path]/DataTransformation
  • [Custom_Hadoop_OS_Path]/externaljdbcjars
  • [Custom_Hadoop_OS_Path]/jre
  • [Custom_Hadoop_OS_Path]/ODBC7.1
  • [Custom_Hadoop_OS_Path]/plugins
  • [Custom_Hadoop_OS_Path]/services
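The folder checklist above can be automated with a short shell sketch (folder names are taken from this article; the function and example directory are hypothetical):

```shell
# Hedged sketch: checks that a candidate 'Custom Hadoop OS Path' directory
# contains the folders this article lists as required.
validate_custom_os_path() {
    base="$1"
    missing=0
    for d in connectors DataTransformation externaljdbcjars jre ODBC7.1 plugins services; do
        if [ ! -d "$base/$d" ]; then
            echo "MISSING folder: $base/$d"
            missing=1
        fi
    done
    [ "$missing" -eq 0 ] && echo "All expected folders present under $base"
    return $missing
}

# Example usage (replace with your extraction directory):
# validate_custom_os_path /opt/infa_suse_binaries
```

Running this against the directory you plan to set as 'Custom Hadoop OS Path' before recycling the DIS can catch an incomplete extraction early.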

 

 

For the installation instructions, refer to the following page in the Installation Guide:

 

https://docs.informatica.com/data-engineering/data-engineering-integration/10-4-1/integration-guide/hadoop-integration/before-you-begin/configure-the-data-integration-service/download-the-informatica-server-binaries-for-the-hadoop-environm.html 


Applies To
Product: Enterprise Data Catalog; Data Engineering Integration (Big Data Management); Data Engineering Quality (Big Data Quality)
Problem Type: Configuration; Product Feature
User Type: Administrator; Data Analyst
Project Phase: Configure; Implement
Product Version: Informatica 10.2.2; HotFix; Informatica 10.4
Database:
Operating System:
Other Software:

Last Modified Date: 7/28/2020 12:59 AM    ID: 623547