ERROR: "org.apache.hadoop.fs.azure.KeyProviderException: ExitCodeException exitCode=2: Error reading S/MIME message" while running mappings in an Azure HDInsight cluster using Informatica DEI
Problem Description

While running a mapping in an Azure HDInsight cluster using Informatica 'Data Engineering Integration' (DEI), formerly known as 'Big Data Management' (BDM), the mapping execution fails. A message similar to the following can be observed in the mapping log:

 

Mapping Log Trace

 

2018-03-07 15:06:49.773 <LdtmWorkflowTask-pool-1-thread-1> INFO: Hadoop_Native_Log :INFO org.apache.tez.client.TezClient: The url to track the Tez Session: http://indthsbdm002.informatica.com:8088/proxy/application_1520430810806_0003/

2018-03-07 15:06:59.381 <LdtmWorkflowTask-pool-1-thread-1> INFO: Hadoop_Native_Log :INFO org.apache.tez.client.TezClient: App did not succeed. Diagnostics: Application application_1520430810806_0003 failed 2 times (global limit =5; local limit is =2) due to AM Container for appattempt_1520430810806_0003_000002 exited with  exitCode: 1

For more detailed output, check the application tracking page: http://indthsbdm002.informatica.com:8088/cluster/app/application_1520430810806_0003 Then click on links to logs of each attempt.

Diagnostics: Exception from container-launch.

Container id: container_e01_1520430810806_0003_02_000001

Exit code: 1

Stack trace: ExitCodeException exitCode=1:

at org.apache.hadoop.util.Shell.runCommand(Shell.java:944)

at org.apache.hadoop.util.Shell.run(Shell.java:848)

at org.apache.hadoop.util.Shell$ShellCommandExecutor.execute(Shell.java:1142)

at org.apache.hadoop.yarn.server.nodemanager.DefaultContainerExecutor.launchContainer(DefaultContainerExecutor.java:237)

at org.apache.hadoop.yarn.server.nodemanager.containermanager.launcher.ContainerLaunch.call(ContainerLaunch.java:317)

at org.apache.hadoop.yarn.server.nodemanager.containermanager.launcher.ContainerLaunch.call(ContainerLaunch.java:83)

at java.util.concurrent.FutureTask.run(FutureTask.java:266)

at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)

at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)

at java.lang.Thread.run(Thread.java:748)

 

 

Container exited with a non-zero exit code 1

Failing this attempt. Failing the application.

2018-03-07 15:06:59.384 <LdtmWorkflowTask-pool-1-thread-1> SEVERE: [HIVE_1071] The Integration Service failed to initialize task [exec0]. See the additional error messages for more information.

2018-03-07 15:06:59.392 <LdtmWorkflowTask-pool-1-thread-1> INFO: Hadoop_Native_Log :INFO org.apache.hadoop.hive.ql.exec.tez.TezSessionPoolManager: Closing tez session default? False

 

In the 'syslog' of the YARN application created for the mapping execution on the Hadoop cluster, the following error message can be observed:

 

Hadoop Application Log

 

2018-03-06 10:40:37,447 [INFO] [main] |impl.TimelineClientImpl|: Timeline service address: http://headnodehost:8188/ws/v1/timeline/

2018-03-06 10:40:37,525 [INFO] [main] |impl.MetricsConfig|: loaded properties from hadoop-metrics2-azure-file-system.properties

2018-03-06 10:40:37,529 [INFO] [main] |sink.WasbAzureIaasSink|: Init starting.

2018-03-06 10:40:37,530 [INFO] [main] |sink.AzureIaasSink|: Init starting. Initializing MdsLogger.

2018-03-06 10:40:37,531 [INFO] [main] |sink.AzureIaasSink|: Init completed.

2018-03-06 10:40:37,531 [INFO] [main] |sink.WasbAzureIaasSink|: Init completed.

2018-03-06 10:40:37,536 [INFO] [main] |impl.MetricsSinkAdapter|: Sink azurefs2 started

2018-03-06 10:40:37,584 [INFO] [main] |impl.MetricsSystemImpl|: Scheduled snapshot period at 60 second(s).

2018-03-06 10:40:37,584 [INFO] [main] |impl.MetricsSystemImpl|: azure-file-system metrics system started

2018-03-06 10:40:37,611 [INFO] [main] |service.AbstractService|: Service org.apache.hadoop.yarn.client.api.impl.TimelineClientImpl failed in state STARTED; cause: org.apache.hadoop.fs.azure.AzureException: org.apache.hadoop.fs.azure.KeyProviderException: ExitCodeException exitCode=2: Error reading S/MIME message

139758724331160:error:0D0680A8:asn1 encoding routines:ASN1_CHECK_TLEN:wrong tag:tasn_dec.c:1197:

139758724331160:error:0D07803A:asn1 encoding routines:ASN1_ITEM_EX_D2I:nested asn1 error:tasn_dec.c:374:Type=CMS_ContentInfo

139758724331160:error:0D0D106E:asn1 encoding routines:B64_READ_ASN1:decode error:asn_mime.c:192:

139758724331160:error:0D0D40CB:asn1 encoding routines:SMIME_read_ASN1:asn1 parse error:asn_mime.c:517:

org.apache.hadoop.fs.azure.AzureException: org.apache.hadoop.fs.azure.KeyProviderException: ExitCodeException exitCode=2: Error reading S/MIME message

139758724331160:error:0D0680A8:asn1 encoding routines:ASN1_CHECK_TLEN:wrong tag:tasn_dec.c:1197:

139758724331160:error:0D07803A:asn1 encoding routines:ASN1_ITEM_EX_D2I:nested asn1 error:tasn_dec.c:374:Type=CMS_ContentInfo

139758724331160:error:0D0D106E:asn1 encoding routines:B64_READ_ASN1:decode error:asn_mime.c:192:

139758724331160:error:0D0D40CB:asn1 encoding routines:SMIME_read_ASN1:asn1 parse error:asn_mime.c:517:

at org.apache.hadoop.fs.azure.AzureNativeFileSystemStore.createAzureStorageSession(AzureNativeFileSystemStore.java:1044)

at org.apache.hadoop.fs.azure.AzureNativeFileSystemStore.initialize(AzureNativeFileSystemStore.java:507)

at org.apache.hadoop.fs.azure.NativeAzureFileSystem.initialize(NativeAzureFileSystem.java:1281)

at org.apache.hadoop.fs.FileSystem.createFileSystem(FileSystem.java:2795)

at org.apache.hadoop.fs.FileSystem.access$200(FileSystem.java:99)

at org.apache.hadoop.fs.FileSystem$Cache.getInternal(FileSystem.java:2829)

at org.apache.hadoop.fs.FileSystem$Cache.getUnique(FileSystem.java:2817)

at org.apache.hadoop.fs.FileSystem.newInstance(FileSystem.java:437)

at org.apache.hadoop.fs.FileSystem.newInstance(FileSystem.java:445)

at org.apache.hadoop.fs.FileSystem.newInstance(FileSystem.java:427)

at org.apache.hadoop.yarn.client.api.impl.FileSystemTimelineWriter.<init>(FileSystemTimelineWriter.java:115)

at org.apache.hadoop.yarn.client.api.impl.TimelineClientImpl.createTimelineWriter(TimelineClientImpl.java:320)

at org.apache.hadoop.yarn.client.api.impl.TimelineClientImpl.serviceStart(TimelineClientImpl.java:312)

at org.apache.hadoop.service.AbstractService.start(AbstractService.java:193)

at org.apache.tez.dag.history.ats.acls.ATSV15HistoryACLPolicyManager.initializeTimelineClient(ATSV15HistoryACLPolicyManager.java:70)

at org.apache.tez.dag.history.ats.acls.ATSV15HistoryACLPolicyManager.setConf(ATSV15HistoryACLPolicyManager.java:215)

at org.apache.tez.dag.history.logging.ats.ATSV15HistoryLoggingService.serviceInit(ATSV15HistoryLoggingService.java:150)

at org.apache.hadoop.service.AbstractService.init(AbstractService.java:163)

at org.apache.hadoop.service.CompositeService.serviceInit(CompositeService.java:107)

at org.apache.tez.dag.history.HistoryEventHandler.serviceInit(HistoryEventHandler.java:70)

at org.apache.hadoop.service.AbstractService.init(AbstractService.java:163)

at org.apache.tez.dag.app.DAGAppMaster.initServices(DAGAppMaster.java:1761)

at org.apache.tez.dag.app.DAGAppMaster.serviceInit(DAGAppMaster.java:569)

at org.apache.hadoop.service.AbstractService.init(AbstractService.java:163)

at org.apache.tez.dag.app.DAGAppMaster$7.run(DAGAppMaster.java:2389)

at java.security.AccessController.doPrivileged(Native Method)

at javax.security.auth.Subject.doAs(Subject.java:422)

at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1866)

at org.apache.tez.dag.app.DAGAppMaster.initAndStartAppMaster(DAGAppMaster.java:2386)

at org.apache.tez.dag.app.DAGAppMaster.main(DAGAppMaster.java:2190)

Cause

This issue generally occurs when there is a problem with the access key configuration of the Azure HDInsight cluster used to run the mapping.
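For context, on HDInsight the storage account key in 'core-site.xml' is normally stored encrypted, and the hadoop-azure driver is pointed at a shell script that decrypts it at runtime via 'org.apache.hadoop.fs.azure.ShellDecryptionKeyProvider'. A sketch of that configuration pattern (the account name and script path are placeholders; the exact path varies by cluster):

```xml
<!-- HDInsight default: the key is encrypted and is decrypted at
     runtime by the script named in fs.azure.shellkeyprovider.script -->
<property>
  <name>fs.azure.account.keyprovider.<STORAGE_ACCOUNT_NAME>.blob.core.windows.net</name>
  <value>org.apache.hadoop.fs.azure.ShellDecryptionKeyProvider</value>
</property>
<property>
  <name>fs.azure.shellkeyprovider.script</name>
  <value>/usr/lib/python2.7/dist-packages/hdinsight_common/decrypt.sh</value>
</property>
```

When the decryption script cannot parse the key material it is given, OpenSSL fails with the 'Error reading S/MIME message' output shown in the log above, and the key provider surfaces it as a KeyProviderException.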

Solution

Perform the following steps to resolve the issue:

 

  1. Log in to the Informatica Administrator console.
  2. Navigate to the 'Connections' tab.
  3. Edit the 'Cluster Configuration Object' (CCO) created for the Azure HDInsight cluster.
  4. Add or update the following properties in the 'core-site.xml' section of the CCO:

 

Property: fs.azure.account.keyprovider.<STORAGE_ACCOUNT_NAME>.blob.core.windows.net
Value: org.apache.hadoop.fs.azure.SimpleKeyProvider

Property: fs.azure.account.key.<STORAGE_ACCOUNT_NAME>.blob.core.windows.net
Value: <DECRYPTED_ACCESS_KEY>
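Expressed as 'core-site.xml' entries (the same name/value pairs that go into the CCO), the two properties above would look like this — the account name and key are placeholders:

```xml
<!-- Use the plain key provider and supply the decrypted key directly -->
<property>
  <name>fs.azure.account.keyprovider.<STORAGE_ACCOUNT_NAME>.blob.core.windows.net</name>
  <value>org.apache.hadoop.fs.azure.SimpleKeyProvider</value>
</property>
<property>
  <name>fs.azure.account.key.<STORAGE_ACCOUNT_NAME>.blob.core.windows.net</name>
  <value><DECRYPTED_ACCESS_KEY></value>
</property>
```

With 'SimpleKeyProvider', the driver reads the account key verbatim from the property value, so no decryption script is invoked.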

  

To obtain the decrypted access key for the Azure cluster, perform the following steps:

  1. Log in to one of the Azure Hadoop cluster nodes.
  2. Navigate to the directory specified as the value of the 'fs.azure.shellkeyprovider.script' property in the 'core-site.xml' of the CCO. Example: /usr/lib/python2.7/dist-packages/hdinsight_common/
  3. Run the following command to decrypt the account key:

 

./decrypt.sh <ENCRYPTED ACCOUNT KEY>

 

Note: The encrypted account key can be obtained from the 'fs.azure.account.key.<STORAGE_ACCOUNT_NAME>.blob.core.windows.net' property in the 'core-site.xml' of the CCO.

  4. Replace the encrypted account key in the 'fs.azure.account.key.<STORAGE_ACCOUNT_NAME>.blob.core.windows.net' property with the decrypted key obtained as the output of the previous step.
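As an illustration, the encrypted key can be pulled out of a 'core-site.xml' file with a short script before handing it to the decryption command. This is a hypothetical helper, not part of DEI or HDInsight; the function name and the file path you pass it are assumptions:

```python
import xml.etree.ElementTree as ET

def get_azure_account_key(core_site_path, account):
    """Return the (encrypted) value of the
    fs.azure.account.key.<account>.blob.core.windows.net property
    from a Hadoop core-site.xml file, or None if it is absent."""
    wanted = "fs.azure.account.key.{}.blob.core.windows.net".format(account)
    root = ET.parse(core_site_path).getroot()
    # core-site.xml is a flat list of <property><name/><value/></property>
    for prop in root.findall("property"):
        if prop.findtext("name") == wanted:
            return prop.findtext("value")
    return None
```

On a cluster node, the returned value would then be passed to the decryption script, e.g. `./decrypt.sh "<ENCRYPTED ACCOUNT KEY>"`, and the script's output is the decrypted key to place in the CCO.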

  5. Delete the following property, if present, from the 'core-site.xml' section of the CCO:

Property: fs.azure.shellkeyprovider.script
Value: <PATH_TO_DECRYPTION_PROGRAM_IN_AZURE_NODES>

 

  6. Once the properties are added to the CCO, save the changes.

  7. Recycle the 'Data Integration Service' (DIS) used to run the mapping.

  

After performing the above steps, re-run the mapping; it should now complete without the error.

 

More Information
Applies To
Product: Data Engineering Integration (Big Data Management); Data Engineering Quality (Big Data Quality); Data Engineering Streaming (Big Data Streaming); Enterprise Data Preparation
Problem Type: Configuration; Connectivity
User Type: Business Analyst; Administrator
Project Phase: Configure; Implement
Product Version: Informatica 10.2; Informatica 10.2.1; Informatica 10.2.1 Service Pack 1; Informatica 10.2.2; HotFix; Informatica 10.4
Database:
Operating System:
Other Software:

Last Modified Date: 3/31/2020 4:39 AM    ID: 526262