<?xml version="1.0" encoding="UTF-8"?>
<rss xmlns:content="http://purl.org/rss/1.0/modules/content/" xmlns:dc="http://purl.org/dc/elements/1.1/" xmlns:rdf="http://www.w3.org/1999/02/22-rdf-syntax-ns#" xmlns:taxo="http://purl.org/rss/1.0/modules/taxonomy/" version="2.0">
  <channel>
    <title>question Re: Exception while trying to get password for alias hadoop.security.group.mapping.ldap.bind.password: in Support Questions</title>
    <link>https://community.cloudera.com/t5/Support-Questions/Exception-while-trying-to-get-password-for-alias-hadoop/m-p/340060#M233257</link>
    <description>&lt;P&gt;I followed &lt;A href="https://docs.cloudera.com/documentation/enterprise/6/6.3/topics/cm_sg_ldap_grp_mappings.html#ldap_group_mapping" target="_blank"&gt;https://docs.cloudera.com/documentation/enterprise/6/6.3/topics/cm_sg_ldap_grp_mappings.html#ldap_group_mapping&lt;/A&gt; to set up OpenLDAP integration:&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;1.&amp;nbsp;Install OpenLDAP.&lt;/P&gt;&lt;P&gt;2.&amp;nbsp;Set the LDAP parameters per the documentation.&lt;/P&gt;&lt;P&gt;3. Restart all services.&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;</description>
    <pubDate>Wed, 30 Mar 2022 20:31:57 GMT</pubDate>
    <dc:creator>iamfromsky</dc:creator>
    <dc:date>2022-03-30T20:31:57Z</dc:date>
    <item>
      <title>Exception while trying to get password for alias hadoop.security.group.mapping.ldap.bind.password:</title>
      <link>https://community.cloudera.com/t5/Support-Questions/Exception-while-trying-to-get-password-for-alias-hadoop/m-p/340045#M233247</link>
      <description>&lt;P&gt;Hi, after integrating CDH with OpenLDAP, I found a WARNING like the one below in the container log: it tries to read the password file creds.localjceks and gets permission denied.&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;LI-CODE lang="markup"&gt;2022-03-31 00:53:13,420 WARN [main] org.apache.hadoop.security.LdapGroupsMapping: Exception while trying to get password for alias hadoop.security.group.mapping.ldap.ssl.keystore.password: 
java.io.IOException: Configuration problem with provider path.
	at org.apache.hadoop.conf.Configuration.getPasswordFromCredentialProviders(Configuration.java:2118)
	at org.apache.hadoop.conf.Configuration.getPassword(Configuration.java:2037)
	at org.apache.hadoop.security.LdapGroupsMapping.getPassword(LdapGroupsMapping.java:528)
	at org.apache.hadoop.security.LdapGroupsMapping.setConf(LdapGroupsMapping.java:473)
	at org.apache.hadoop.util.ReflectionUtils.setConf(ReflectionUtils.java:73)
	at org.apache.hadoop.util.ReflectionUtils.newInstance(ReflectionUtils.java:133)
	at org.apache.hadoop.security.Groups.&amp;lt;init&amp;gt;(Groups.java:104)
	at org.apache.hadoop.security.Groups.&amp;lt;init&amp;gt;(Groups.java:100)
	at org.apache.hadoop.security.Groups.getUserToGroupsMappingService(Groups.java:435)
	at org.apache.hadoop.security.UserGroupInformation.initialize(UserGroupInformation.java:341)
	at org.apache.hadoop.security.UserGroupInformation.ensureInitialized(UserGroupInformation.java:308)
	at org.apache.hadoop.security.UserGroupInformation.loginUserFromSubject(UserGroupInformation.java:895)
	at org.apache.hadoop.security.UserGroupInformation.getLoginUser(UserGroupInformation.java:861)
	at org.apache.hadoop.security.UserGroupInformation.getCurrentUser(UserGroupInformation.java:728)
	at org.apache.hadoop.yarn.server.nodemanager.containermanager.localizer.ContainerLocalizer.main(ContainerLocalizer.java:387)
Caused by: java.io.FileNotFoundException: /run/cloudera-scm-agent/process/9392-yarn-NODEMANAGER/creds.localjceks (Permission denied)
	at java.io.FileInputStream.open0(Native Method)
	at java.io.FileInputStream.open(FileInputStream.java:195)
	at java.io.FileInputStream.&amp;lt;init&amp;gt;(FileInputStream.java:138)
	at org.apache.hadoop.security.alias.LocalJavaKeyStoreProvider.getInputStreamForFile(LocalJavaKeyStoreProvider.java:83)
	at org.apache.hadoop.security.alias.AbstractJavaKeyStoreProvider.locateKeystore(AbstractJavaKeyStoreProvider.java:334)
	at org.apache.hadoop.security.alias.AbstractJavaKeyStoreProvider.&amp;lt;init&amp;gt;(AbstractJavaKeyStoreProvider.java:88)
	at org.apache.hadoop.security.alias.LocalJavaKeyStoreProvider.&amp;lt;init&amp;gt;(LocalJavaKeyStoreProvider.java:58)
	at org.apache.hadoop.security.alias.LocalJavaKeyStoreProvider.&amp;lt;init&amp;gt;(LocalJavaKeyStoreProvider.java:50)
	at org.apache.hadoop.security.alias.LocalJavaKeyStoreProvider$Factory.createProvider(LocalJavaKeyStoreProvider.java:177)
	at org.apache.hadoop.security.alias.CredentialProviderFactory.getProviders(CredentialProviderFactory.java:73)
	at org.apache.hadoop.conf.Configuration.getPasswordFromCredentialProviders(Configuration.java:2098)&lt;/LI-CODE&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;This warning doesn't affect the MapReduce job; I just want to know how to resolve it.&lt;/P&gt;</description>
      <pubDate>Wed, 30 Mar 2022 17:48:45 GMT</pubDate>
      <guid>https://community.cloudera.com/t5/Support-Questions/Exception-while-trying-to-get-password-for-alias-hadoop/m-p/340045#M233247</guid>
      <dc:creator>iamfromsky</dc:creator>
      <dc:date>2022-03-30T17:48:45Z</dc:date>
    </item>
    <item>
      <title>Re: Exception while trying to get password for alias hadoop.security.group.mapping.ldap.bind.password:</title>
      <link>https://community.cloudera.com/t5/Support-Questions/Exception-while-trying-to-get-password-for-alias-hadoop/m-p/340049#M233249</link>
      <description>&lt;P&gt;&lt;a href="https://community.cloudera.com/t5/user/viewprofilepage/user-id/7922"&gt;@iamfromsky&lt;/a&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;&lt;EM&gt;The path you are mentioning has a permissions issue. As the root user, can you run:&lt;BR /&gt;&lt;/EM&gt;&lt;/P&gt;&lt;LI-CODE lang="markup"&gt;# chmod 777 /run/cloudera-scm-agent/process/9392-yarn-NODEMANAGER/creds.localjceks&lt;/LI-CODE&gt;&lt;P&gt;&lt;EM&gt;Then retry; if that's successful, fine-tune the permissions.&lt;/EM&gt;&lt;/P&gt;&lt;P&gt;&lt;EM&gt;Hope that helps.&lt;/EM&gt;&lt;/P&gt;</description>
      <pubDate>Wed, 30 Mar 2022 18:38:49 GMT</pubDate>
      <guid>https://community.cloudera.com/t5/Support-Questions/Exception-while-trying-to-get-password-for-alias-hadoop/m-p/340049#M233249</guid>
      <dc:creator>Shelton</dc:creator>
      <dc:date>2022-03-30T18:38:49Z</dc:date>
    </item>
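The permission check-and-fix flow suggested in the reply above can be sketched in a self-contained way. This uses a throwaway temp file as a stand-in for the real creds.localjceks path (which only exists on a cluster host), and `chmod 777` only as a temporary diagnostic before tightening back down:

```shell
# Throwaway stand-in for /run/cloudera-scm-agent/process/<id>-yarn-NODEMANAGER/creds.localjceks
tmpdir=$(mktemp -d)
creds="$tmpdir/creds.localjceks"
touch "$creds"

chmod 700 "$creds"            # restrictive mode: only the owner can read
chmod 777 "$creds"            # the broad "does the error go away?" test
echo "after 777: $(stat -c '%a' "$creds")"

chmod 640 "$creds"            # fine-tuned: owner rw, group read, others none
echo "after 640: $(stat -c '%a' "$creds")"

rm -rf "$tmpdir"
```

Leaving a credential file at 777 is not a fix, only a way to confirm the failure is permission-related; the later replies in this thread move on to ownership and group membership instead.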
    <item>
      <title>Re: Exception while trying to get password for alias hadoop.security.group.mapping.ldap.bind.password:</title>
      <link>https://community.cloudera.com/t5/Support-Questions/Exception-while-trying-to-get-password-for-alias-hadoop/m-p/340051#M233251</link>
      <description>&lt;P&gt;As you know, this file lives under many paths (NameNode, DataNode, YARN, HBase), and it is created by CDH. Do you suggest changing the permissions on all of these paths? If I restart one of these roles, I think the file will be created again and its permissions will still be 700.&lt;/P&gt;</description>
      <pubDate>Wed, 30 Mar 2022 18:47:00 GMT</pubDate>
      <guid>https://community.cloudera.com/t5/Support-Questions/Exception-while-trying-to-get-password-for-alias-hadoop/m-p/340051#M233251</guid>
      <dc:creator>iamfromsky</dc:creator>
      <dc:date>2022-03-30T18:47:00Z</dc:date>
    </item>
    <item>
      <title>Re: Exception while trying to get password for alias hadoop.security.group.mapping.ldap.bind.password:</title>
      <link>https://community.cloudera.com/t5/Support-Questions/Exception-while-trying-to-get-password-for-alias-hadoop/m-p/340059#M233256</link>
      <description>&lt;P&gt;&lt;a href="https://community.cloudera.com/t5/user/viewprofilepage/user-id/7922"&gt;@iamfromsky&lt;/a&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;True, let me check on that and get back to you on the config.&lt;/P&gt;&lt;P&gt;Can you share your integration steps or document?&lt;/P&gt;</description>
      <pubDate>Wed, 30 Mar 2022 19:54:17 GMT</pubDate>
      <guid>https://community.cloudera.com/t5/Support-Questions/Exception-while-trying-to-get-password-for-alias-hadoop/m-p/340059#M233256</guid>
      <dc:creator>Shelton</dc:creator>
      <dc:date>2022-03-30T19:54:17Z</dc:date>
    </item>
    <item>
      <title>Re: Exception while trying to get password for alias hadoop.security.group.mapping.ldap.bind.password:</title>
      <link>https://community.cloudera.com/t5/Support-Questions/Exception-while-trying-to-get-password-for-alias-hadoop/m-p/340060#M233257</link>
      <description>&lt;P&gt;I followed &lt;A href="https://docs.cloudera.com/documentation/enterprise/6/6.3/topics/cm_sg_ldap_grp_mappings.html#ldap_group_mapping" target="_blank"&gt;https://docs.cloudera.com/documentation/enterprise/6/6.3/topics/cm_sg_ldap_grp_mappings.html#ldap_group_mapping&lt;/A&gt; to set up OpenLDAP integration:&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;1.&amp;nbsp;Install OpenLDAP.&lt;/P&gt;&lt;P&gt;2.&amp;nbsp;Set the LDAP parameters per the documentation.&lt;/P&gt;&lt;P&gt;3. Restart all services.&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;</description>
      <pubDate>Wed, 30 Mar 2022 20:31:57 GMT</pubDate>
      <guid>https://community.cloudera.com/t5/Support-Questions/Exception-while-trying-to-get-password-for-alias-hadoop/m-p/340060#M233257</guid>
      <dc:creator>iamfromsky</dc:creator>
      <dc:date>2022-03-30T20:31:57Z</dc:date>
    </item>
    <item>
      <title>Re: Exception while trying to get password for alias hadoop.security.group.mapping.ldap.bind.password:</title>
      <link>https://community.cloudera.com/t5/Support-Questions/Exception-while-trying-to-get-password-for-alias-hadoop/m-p/340082#M233265</link>
      <description>&lt;P&gt;&lt;a href="https://community.cloudera.com/t5/user/viewprofilepage/user-id/7922"&gt;@iamfromsky&lt;/a&gt;&amp;nbsp;,&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;Can you check if the yarn user belongs to the hadoop group in your machines (&lt;FONT face="courier new,courier"&gt;id yarn&lt;/FONT&gt;)? If not, try adding it to the group and check if it resolves your problem.&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;Cheers,&lt;/P&gt;&lt;P&gt;André&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;</description>
      <pubDate>Thu, 31 Mar 2022 05:11:42 GMT</pubDate>
      <guid>https://community.cloudera.com/t5/Support-Questions/Exception-while-trying-to-get-password-for-alias-hadoop/m-p/340082#M233265</guid>
      <dc:creator>araujo</dc:creator>
      <dc:date>2022-03-31T05:11:42Z</dc:date>
    </item>
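The group-membership check suggested above (`id yarn`, then adding the user to the group if missing) can be sketched generically. Here the current user and its primary group stand in for the `yarn` user and `hadoop` group, since those only exist on a cluster host:

```shell
# On a cluster host you would check:  id yarn
# and fix a missing membership with:  usermod -aG hadoop yarn   (as root)
user=$(id -un)                 # current user stands in for "yarn"
want=$(id -gn)                 # primary group stands in for "hadoop"

# id -nG lists the primary plus all supplementary groups, space separated
if id -nG "$user" | tr ' ' '\n' | grep -qx "$want"; then
  echo "$user is a member of $want"
else
  echo "$user is NOT a member of $want"
fi
```

Note that `usermod -aG` only takes effect for processes started after the change, so the affected roles would need a restart afterwards.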
    <item>
      <title>Re: Exception while trying to get password for alias hadoop.security.group.mapping.ldap.bind.password:</title>
      <link>https://community.cloudera.com/t5/Support-Questions/Exception-while-trying-to-get-password-for-alias-hadoop/m-p/340190#M233285</link>
      <description>&lt;LI-CODE lang="markup"&gt;[root@host243 ~]# id yarn
uid=979(yarn) gid=973(yarn) groups=973(yarn),982(hadoop),979(solr)
[root@host243 ~]# 
[root@host243 ~]# 
[root@host243 ~]# hdfs groups yarn
yarn : hadoop yarn&lt;/LI-CODE&gt;&lt;P&gt;The OpenLDAP users were imported from the OS users, so I think the OpenLDAP users and groups are the same as the OS users/groups.&lt;/P&gt;&lt;P&gt;There is one more thing I'd like to share: after integrating with OpenLDAP, I haven't deleted the OS users.&amp;nbsp;&lt;/P&gt;</description>
      <pubDate>Thu, 31 Mar 2022 21:02:20 GMT</pubDate>
      <guid>https://community.cloudera.com/t5/Support-Questions/Exception-while-trying-to-get-password-for-alias-hadoop/m-p/340190#M233285</guid>
      <dc:creator>iamfromsky</dc:creator>
      <dc:date>2022-03-31T21:02:20Z</dc:date>
    </item>
    <item>
      <title>Re: Exception while trying to get password for alias hadoop.security.group.mapping.ldap.bind.password:</title>
      <link>https://community.cloudera.com/t5/Support-Questions/Exception-while-trying-to-get-password-for-alias-hadoop/m-p/340226#M233286</link>
      <description>&lt;P&gt;&lt;a href="https://community.cloudera.com/t5/user/viewprofilepage/user-id/7922"&gt;@iamfromsky&lt;/a&gt;&amp;nbsp;,&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;What are the default permissions of the&amp;nbsp;creds.localjceks file?&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;Cheers,&lt;/P&gt;&lt;P&gt;André&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;</description>
      <pubDate>Thu, 31 Mar 2022 21:40:38 GMT</pubDate>
      <guid>https://community.cloudera.com/t5/Support-Questions/Exception-while-trying-to-get-password-for-alias-hadoop/m-p/340226#M233286</guid>
      <dc:creator>araujo</dc:creator>
      <dc:date>2022-03-31T21:40:38Z</dc:date>
    </item>
    <item>
      <title>Re: Exception while trying to get password for alias hadoop.security.group.mapping.ldap.bind.password:</title>
      <link>https://community.cloudera.com/t5/Support-Questions/Exception-while-trying-to-get-password-for-alias-hadoop/m-p/340230#M233289</link>
      <description>&lt;LI-CODE lang="markup"&gt;/run/cloudera-scm-agent/process/9506-IMPALA-impala-CATALOGSERVER-45e2ae1dbc69e00f769182717dd71aa8-ImpalaRoleDiagnosticsCollection/creds.localjceks
/run/cloudera-scm-agent/process/9478-hue-KT_RENEWER/creds.localjceks
/run/cloudera-scm-agent/process/9476-hue-HUE_SERVER/creds.localjceks
/run/cloudera-scm-agent/process/9471-impala-CATALOGSERVER/creds.localjceks
/run/cloudera-scm-agent/process/9462-impala-CATALOGSERVER/creds.localjceks
/run/cloudera-scm-agent/process/9456-sentry-SENTRY_SERVER/creds.localjceks
/run/cloudera-scm-agent/process/9455-oozie-OOZIE_SERVER/creds.localjceks
/run/cloudera-scm-agent/process/9454-hue-KT_RENEWER/creds.localjceks
/run/cloudera-scm-agent/process/9452-hue-HUE_SERVER/creds.localjceks
/run/cloudera-scm-agent/process/9448-hive-HIVEMETASTORE/creds.localjceks
/run/cloudera-scm-agent/process/9446-sentry-SENTRY_SERVER/creds.localjceks
/run/cloudera-scm-agent/process/9445-oozie-OOZIE_SERVER/creds.localjceks
/run/cloudera-scm-agent/process/9444-hue-KT_RENEWER/creds.localjceks
/run/cloudera-scm-agent/process/9442-hue-HUE_SERVER/creds.localjceks
/run/cloudera-scm-agent/process/9438-hive-HIVEMETASTORE/creds.localjceks
/run/cloudera-scm-agent/process/9437-hue-KT_RENEWER/creds.localjceks
/run/cloudera-scm-agent/process/9435-hue-HUE_SERVER/creds.localjceks
/run/cloudera-scm-agent/process/9429-impala-CATALOGSERVER/creds.localjceks
/run/cloudera-scm-agent/process/9424-oozie-OOZIE_SERVER/creds.localjceks
/run/cloudera-scm-agent/process/9420-hive-HIVEMETASTORE/creds.localjceks
/run/cloudera-scm-agent/process/9400-sentry-SENTRY_SERVER/creds.localjceks
/run/cloudera-scm-agent/process/9399-yarn-RESOURCEMANAGER/creds.localjceks
/run/cloudera-scm-agent/process/9388-yarn-JOBHISTORY/creds.localjceks
/run/cloudera-scm-agent/process/9413-hbase-REGIONSERVER/creds.localjceks
/run/cloudera-scm-agent/process/9411-hbase-MASTER/creds.localjceks
/run/cloudera-scm-agent/process/9377-hdfs-NAMENODE-nnRpcWait/creds.localjceks
/run/cloudera-scm-agent/process/9361-hdfs-NAMENODE/creds.localjceks
/run/cloudera-scm-agent/process/9351-HBaseShutdown/creds.localjceks
/run/cloudera-scm-agent/process/9343-hue-HUE_SERVER/creds.localjceks
/run/cloudera-scm-agent/process/9345-hue-KT_RENEWER/creds.localjceks
/run/cloudera-scm-agent/process/9339-hive-HIVEMETASTORE/creds.localjceks
/run/cloudera-scm-agent/process/9338-oozie-OOZIE_SERVER/creds.localjceks
/run/cloudera-scm-agent/process/9337-sentry-SENTRY_SERVER/creds.localjceks
/run/cloudera-scm-agent/process/9333-hue-KT_RENEWER/creds.localjceks&lt;/LI-CODE&gt;&lt;P&gt;Every role has its own creds.localjceks, and the default permission is 640. I picked a few roles' localjceks files for your review:&lt;/P&gt;&lt;LI-CODE lang="markup"&gt;[root@host21 ~]# ls -l /run/cloudera-scm-agent/process/9478-hue-KT_RENEWER/creds.localjceks
-rw-r----- 1 hue hue 1501 Mar 25 04:11 /run/cloudera-scm-agent/process/9478-hue-KT_RENEWER/creds.localjceks
[root@host21 ~]# ls -l /run/cloudera-scm-agent/process/9471-impala-CATALOGSERVER/creds.localjceks
-rw-r----- 1 impala impala 533 Mar 25 04:01 /run/cloudera-scm-agent/process/9471-impala-CATALOGSERVER/creds.localjceks
[root@host21 ~]# 
[root@host21 ~]# ls -l /run/cloudera-scm-agent/process/8788-hive-HIVEMETASTORE/creds.localjceks
-rw-r----- 1 hive hive 528 Mar  4 09:34 /run/cloudera-scm-agent/process/8788-hive-HIVEMETASTORE/creds.localjceks
[root@host21 ~]# ls -l /run/cloudera-scm-agent/process/9295-yarn-RESOURCEMANAGER/creds.localjceks
-rw-r----- 1 yarn hadoop 533 Mar 25 03:10 /run/cloudera-scm-agent/process/9295-yarn-RESOURCEMANAGER/creds.localjceks&lt;/LI-CODE&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;When I run Hive SQL or Sqoop, the permission-denied error on creds.localjceks happens.&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;</description>
      <pubDate>Thu, 31 Mar 2022 23:05:21 GMT</pubDate>
      <guid>https://community.cloudera.com/t5/Support-Questions/Exception-while-trying-to-get-password-for-alias-hadoop/m-p/340230#M233289</guid>
      <dc:creator>iamfromsky</dc:creator>
      <dc:date>2022-03-31T23:05:21Z</dc:date>
    </item>
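The listings above show mode 640 with owner/group such as `yarn hadoop`: only that owner and members of that group can read the file, and every other user gets "Permission denied". A small bash sketch decoding the symbolic mode makes the three permission classes explicit (again on a throwaway file, not the real jceks file):

```shell
f=$(mktemp)                    # stand-in for creds.localjceks
chmod 640 "$f"

mode=$(stat -c '%A' "$f")      # symbolic mode string, e.g. -rw-r-----
echo "mode: $mode"
echo "owner read: ${mode:1:1}" # 'r' -> the owning user may read
echo "group read: ${mode:4:1}" # 'r' -> members of the owning group may read
echo "other read: ${mode:7:1}" # '-' -> everyone else is denied

rm -f "$f"
```

This is why the group-membership question matters: with 640, read access for a job's user hinges entirely on whether that user is in the file's group.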
    <item>
      <title>Re: Exception while trying to get password for alias hadoop.security.group.mapping.ldap.bind.password:</title>
      <link>https://community.cloudera.com/t5/Support-Questions/Exception-while-trying-to-get-password-for-alias-hadoop/m-p/340233#M233291</link>
      <description>&lt;P&gt;&lt;a href="https://community.cloudera.com/t5/user/viewprofilepage/user-id/7922"&gt;@iamfromsky&lt;/a&gt;&amp;nbsp;,&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;Could you please share the output of this command:&lt;/P&gt;&lt;LI-CODE lang="markup"&gt;ls -ln /run/cloudera-scm-agent/process/9295-yarn-RESOURCEMANAGER/creds.localjceks&lt;/LI-CODE&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;Cheers,&lt;/P&gt;&lt;P&gt;André&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;</description>
      <pubDate>Fri, 01 Apr 2022 01:38:46 GMT</pubDate>
      <guid>https://community.cloudera.com/t5/Support-Questions/Exception-while-trying-to-get-password-for-alias-hadoop/m-p/340233#M233291</guid>
      <dc:creator>araujo</dc:creator>
      <dc:date>2022-04-01T01:38:46Z</dc:date>
    </item>
    <item>
      <title>Re: Exception while trying to get password for alias hadoop.security.group.mapping.ldap.bind.password:</title>
      <link>https://community.cloudera.com/t5/Support-Questions/Exception-while-trying-to-get-password-for-alias-hadoop/m-p/340234#M233292</link>
      <description>&lt;P&gt;Also this:&amp;nbsp;&lt;/P&gt;&lt;LI-CODE lang="markup"&gt;namei -l /run/cloudera-scm-agent/process/9295-yarn-RESOURCEMANAGER/creds.localjceks&lt;/LI-CODE&gt;</description>
      <pubDate>Fri, 01 Apr 2022 01:39:48 GMT</pubDate>
      <guid>https://community.cloudera.com/t5/Support-Questions/Exception-while-trying-to-get-password-for-alias-hadoop/m-p/340234#M233292</guid>
      <dc:creator>araujo</dc:creator>
      <dc:date>2022-04-01T01:39:48Z</dc:date>
    </item>
    <item>
      <title>Re: Exception while trying to get password for alias hadoop.security.group.mapping.ldap.bind.password:</title>
      <link>https://community.cloudera.com/t5/Support-Questions/Exception-while-trying-to-get-password-for-alias-hadoop/m-p/340237#M233295</link>
      <description>&lt;P&gt;Hi araujo,&lt;/P&gt;&lt;P&gt;Please refer to the information below:&lt;/P&gt;&lt;LI-CODE lang="markup"&gt;[appadmin@host21 ~]$ namei -l /run/cloudera-scm-agent/process/9295-yarn-RESOURCEMANAGER/creds.localjceks
f: /run/cloudera-scm-agent/process/9295-yarn-RESOURCEMANAGER/creds.localjceks
dr-xr-xr-x root         root         /
drwxr-xr-x root         root         run
drwxr-xr-x cloudera-scm cloudera-scm cloudera-scm-agent
drwxr-x--x root         root         process
drwxr-x--x yarn         hadoop       9295-yarn-RESOURCEMANAGER
-rw-r----- yarn         hadoop       creds.localjceks&lt;/LI-CODE&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;LI-CODE lang="markup"&gt;[appadmin@host21 ~]$ ls -ln /run/cloudera-scm-agent/process/9295-yarn-RESOURCEMANAGER/creds.localjceks
-rw-r----- 1 981 984 533 Mar 25 03:10 /run/cloudera-scm-agent/process/9295-yarn-RESOURCEMANAGER/creds.localjceks&lt;/LI-CODE&gt;&lt;P&gt;The creds.localjceks owner is 981:984; the output below shows the yarn user ID and the hadoop group ID.&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;LI-CODE lang="markup"&gt;[root@host21 ~]# cat /etc/passwd | grep 981
solr:x:987:981:Solr:/var/lib/solr:/sbin/nologin
yarn:x:981:975:Hadoop Yarn:/var/lib/hadoop-yarn:/bin/bash
[root@host21 ~]# 
[root@host21 ~]# cat /etc/group | grep hadoop
hadoop:x:984:hdfs,mapred,yarn&lt;/LI-CODE&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;</description>
      <pubDate>Fri, 01 Apr 2022 02:07:09 GMT</pubDate>
      <guid>https://community.cloudera.com/t5/Support-Questions/Exception-while-trying-to-get-password-for-alias-hadoop/m-p/340237#M233295</guid>
      <dc:creator>iamfromsky</dc:creator>
      <dc:date>2022-04-01T02:07:09Z</dc:date>
    </item>
    <item>
      <title>Re: Exception while trying to get password for alias hadoop.security.group.mapping.ldap.bind.password:</title>
      <link>https://community.cloudera.com/t5/Support-Questions/Exception-while-trying-to-get-password-for-alias-hadoop/m-p/340238#M233296</link>
      <description>&lt;P&gt;Actually, I don't know which user is supposed to access this file while running MapReduce (a Hive query or Sqoop; maybe other programs too).&lt;/P&gt;</description>
      <pubDate>Fri, 01 Apr 2022 02:10:07 GMT</pubDate>
      <guid>https://community.cloudera.com/t5/Support-Questions/Exception-while-trying-to-get-password-for-alias-hadoop/m-p/340238#M233296</guid>
      <dc:creator>iamfromsky</dc:creator>
      <dc:date>2022-04-01T02:10:07Z</dc:date>
    </item>
    <item>
      <title>Re: Exception while trying to get password for alias hadoop.security.group.mapping.ldap.bind.password:</title>
      <link>https://community.cloudera.com/t5/Support-Questions/Exception-while-trying-to-get-password-for-alias-hadoop/m-p/340258#M233300</link>
      <description>&lt;P&gt;Which log file are those errors coming from?&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;André&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;</description>
      <pubDate>Fri, 01 Apr 2022 04:54:38 GMT</pubDate>
      <guid>https://community.cloudera.com/t5/Support-Questions/Exception-while-trying-to-get-password-for-alias-hadoop/m-p/340258#M233300</guid>
      <dc:creator>araujo</dc:creator>
      <dc:date>2022-04-01T04:54:38Z</dc:date>
    </item>
    <item>
      <title>Re: Exception while trying to get password for alias hadoop.security.group.mapping.ldap.bind.password:</title>
      <link>https://community.cloudera.com/t5/Support-Questions/Exception-while-trying-to-get-password-for-alias-hadoop/m-p/340297#M233305</link>
      <description>&lt;LI-CODE lang="markup"&gt;	
Log Type: container-localizer-syslog

Log Upload Time: Thu Mar 31 02:24:54 +0800 2022

Log Length: 3720

2022-03-31 02:24:16,275 WARN [main] org.apache.hadoop.security.LdapGroupsMapping: Exception while trying to get password for alias hadoop.security.group.mapping.ldap.bind.password: 
java.io.IOException: Configuration problem with provider path.
	at org.apache.hadoop.conf.Configuration.getPasswordFromCredentialProviders(Configuration.java:2272)
	at org.apache.hadoop.conf.Configuration.getPassword(Configuration.java:2191)
	at org.apache.hadoop.security.LdapGroupsMapping.getPassword(LdapGroupsMapping.java:719)
	at org.apache.hadoop.security.LdapGroupsMapping.setConf(LdapGroupsMapping.java:616)
	at org.apache.hadoop.util.ReflectionUtils.setConf(ReflectionUtils.java:77)
	at org.apache.hadoop.util.ReflectionUtils.newInstance(ReflectionUtils.java:137)
	at org.apache.hadoop.security.Groups.&amp;lt;init&amp;gt;(Groups.java:106)
	at org.apache.hadoop.security.Groups.&amp;lt;init&amp;gt;(Groups.java:102)
	at org.apache.hadoop.security.Groups.getUserToGroupsMappingService(Groups.java:451)
	at org.apache.hadoop.security.UserGroupInformation.initialize(UserGroupInformation.java:352)
	at org.apache.hadoop.security.UserGroupInformation.ensureInitialized(UserGroupInformation.java:314)
	at org.apache.hadoop.security.UserGroupInformation.doSubjectLogin(UserGroupInformation.java:1973)
	at org.apache.hadoop.security.UserGroupInformation.createLoginUser(UserGroupInformation.java:743)
	at org.apache.hadoop.security.UserGroupInformation.getLoginUser(UserGroupInformation.java:693)
	at org.apache.hadoop.security.UserGroupInformation.getCurrentUser(UserGroupInformation.java:604)
	at org.apache.hadoop.yarn.server.nodemanager.containermanager.localizer.ContainerLocalizer.main(ContainerLocalizer.java:461)
Caused by: java.io.FileNotFoundException: /var/run/cloudera-scm-agent/process/26618-yarn-NODEMANAGER/creds.localjceks (Permission denied)
	at java.io.FileInputStream.open0(Native Method)
	at java.io.FileInputStream.open(FileInputStream.java:195)
	at java.io.FileInputStream.&amp;lt;init&amp;gt;(FileInputStream.java:138)
	at org.apache.hadoop.security.alias.LocalJavaKeyStoreProvider.getInputStreamForFile(LocalJavaKeyStoreProvider.java:83)
	at org.apache.hadoop.security.alias.AbstractJavaKeyStoreProvider.locateKeystore(AbstractJavaKeyStoreProvider.java:321)
	at org.apache.hadoop.security.alias.AbstractJavaKeyStoreProvider.&amp;lt;init&amp;gt;(AbstractJavaKeyStoreProvider.java:86)
	at org.apache.hadoop.security.alias.LocalJavaKeyStoreProvider.&amp;lt;init&amp;gt;(LocalJavaKeyStoreProvider.java:58)
	at org.apache.hadoop.security.alias.LocalJavaKeyStoreProvider.&amp;lt;init&amp;gt;(LocalJavaKeyStoreProvider.java:50)
	at org.apache.hadoop.security.alias.LocalJavaKeyStoreProvider$Factory.createProvider(LocalJavaKeyStoreProvider.java:177)
	at org.apache.hadoop.security.alias.CredentialProviderFactory.getProviders(CredentialProviderFactory.java:73)
	at org.apache.hadoop.conf.Configuration.getPasswordFromCredentialProviders(Configuration.java:2253)
	... 15 more
2022-03-31 02:24:16,438 INFO [main] org.apache.hadoop.yarn.server.nodemanager.containermanager.localizer.ContainerLocalizer: Disk Validator: yarn.nodemanager.disk-validator is loaded.
2022-03-31 02:24:17,272 WARN [ContainerLocalizer Downloader] org.apache.hadoop.ipc.Client: Exception encountered while connecting to the server : org.apache.hadoop.ipc.RemoteException(org.apache.hadoop.ipc.StandbyException): Operation category READ is not supported in state standby. Visit https://s.apache.org/sbnn-error
2022-03-31 02:24:19,294 WARN [ContainerLocalizer Downloader] org.apache.hadoop.ipc.Client: Exception encountered while connecting to the server : org.apache.hadoop.ipc.RemoteException(org.apache.hadoop.ipc.StandbyException): Operation category READ is not supported in state standby. Visit https://s.apache.org/sbnn-error&lt;/LI-CODE&gt;&lt;P&gt;It shows up in container-localizer-syslog. As you know, every map/reduce task has several logs: open the YARN web UI, pick any job, and click one of its map/reduce tasks to check the task details; we can see the logs below:&lt;/P&gt;&lt;LI-CODE lang="markup"&gt;container-localizer-syslog : Total file length is 3398 bytes.

prelaunch.err : Total file length is 0 bytes.

prelaunch.out : Total file length is 70 bytes.

stderr : Total file length is 1643 bytes.

stdout : Total file length is 0 bytes.

syslog : Total file length is 141307 bytes.&lt;/LI-CODE&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;</description>
      <pubDate>Fri, 01 Apr 2022 11:07:04 GMT</pubDate>
      <guid>https://community.cloudera.com/t5/Support-Questions/Exception-while-trying-to-get-password-for-alias-hadoop/m-p/340297#M233305</guid>
      <dc:creator>iamfromsky</dc:creator>
      <dc:date>2022-04-01T11:07:04Z</dc:date>
    </item>
    <item>
      <title>Re: Exception while trying to get password for alias hadoop.security.group.mapping.ldap.bind.password:</title>
      <link>https://community.cloudera.com/t5/Support-Questions/Exception-while-trying-to-get-password-for-alias-hadoop/m-p/340299#M233306</link>
      <description>&lt;P&gt;I'm sharing a whole Sqoop job log so you can check more details (this is just an example; a Hive query behaves the same).&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;LI-CODE lang="markup"&gt;[root@host243 ~]# sqoop export --connect jdbc:mysql://10.37.144.6:3306/xaxsuatdb?characterEncoding=utf-8 --username root --password xaxs2016  --table customer_feature --export-dir "/user/hive/warehouse/penglin.db/label_cus_kpi_hightable_h" --input-fields-terminated-by '\001' --input-null-string '\\N' --input-null-non-string '\\N' --update-key CUSTOMER_BP,ORG_CODE,TAG_ID,VERSION --columns CUSTOMER_BP,ORG_CODE,CPMO_COP,TAG_ID,TAG_NAME,TAG_VALUE,VERSION,UPDATE_TIME  --update-mode allowinsert -m 1;

Warning: /data/cloudera/parcels/CDH-6.2.0-1.cdh6.2.0.p0.967373/bin/../lib/sqoop/../accumulo does not exist! Accumulo imports will fail.
Please set $ACCUMULO_HOME to the root of your Accumulo installation.
SLF4J: Class path contains multiple SLF4J bindings.
SLF4J: Found binding in [jar:file:/data/cloudera/parcels/CDH-6.2.0-1.cdh6.2.0.p0.967373/jars/slf4j-log4j12-1.7.25.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: Found binding in [jar:file:/data/cloudera/parcels/CDH-6.2.0-1.cdh6.2.0.p0.967373/jars/log4j-slf4j-impl-2.8.2.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.
SLF4J: Actual binding is of type [org.slf4j.impl.Log4jLoggerFactory]
22/04/01 19:10:03 INFO sqoop.Sqoop: Running Sqoop version: 1.4.7-cdh6.2.0
22/04/01 19:10:04 WARN tool.BaseSqoopTool: Setting your password on the command-line is insecure. Consider using -P instead.
22/04/01 19:10:04 INFO manager.MySQLManager: Preparing to use a MySQL streaming resultset.
22/04/01 19:10:04 INFO tool.CodeGenTool: Beginning code generation
22/04/01 19:10:04 INFO manager.SqlManager: Executing SQL statement: SELECT t.* FROM `customer_feature` AS t LIMIT 1
22/04/01 19:10:04 INFO manager.SqlManager: Executing SQL statement: SELECT t.* FROM `customer_feature` AS t LIMIT 1
22/04/01 19:10:04 INFO orm.CompilationManager: HADOOP_MAPRED_HOME is /data/cloudera/parcels/CDH/lib/hadoop-mapreduce
22/04/01 19:10:05 ERROR orm.CompilationManager: Could not rename /tmp/sqoop-root/compile/7255ac988b70c7d9b5eb963a6f4946f5/customer_feature.java to /root/./customer_feature.java. Error: Destination '/root/./customer_feature.java' already exists
22/04/01 19:10:05 INFO orm.CompilationManager: Writing jar file: /tmp/sqoop-root/compile/7255ac988b70c7d9b5eb963a6f4946f5/customer_feature.jar
22/04/01 19:10:05 WARN manager.MySQLManager: MySQL Connector upsert functionality is using INSERT ON
22/04/01 19:10:05 WARN manager.MySQLManager: DUPLICATE KEY UPDATE clause that relies on table's unique key.
22/04/01 19:10:05 WARN manager.MySQLManager: Insert/update distinction is therefore independent on column
22/04/01 19:10:05 WARN manager.MySQLManager: names specified in --update-key parameter. Please see MySQL
22/04/01 19:10:05 WARN manager.MySQLManager: documentation for additional limitations.
22/04/01 19:10:05 INFO mapreduce.ExportJobBase: Beginning export of customer_feature
22/04/01 19:10:06 INFO Configuration.deprecation: mapred.jar is deprecated. Instead, use mapreduce.job.jar
22/04/01 19:10:06 WARN mapreduce.ExportJobBase: IOException checking input file header: java.io.EOFException
22/04/01 19:10:06 INFO Configuration.deprecation: mapred.reduce.tasks.speculative.execution is deprecated. Instead, use mapreduce.reduce.speculative
22/04/01 19:10:06 INFO Configuration.deprecation: mapred.map.tasks.speculative.execution is deprecated. Instead, use mapreduce.map.speculative
22/04/01 19:10:06 INFO Configuration.deprecation: mapred.map.tasks is deprecated. Instead, use mapreduce.job.maps
22/04/01 19:10:07 INFO client.ConfiguredRMFailoverProxyProvider: Failing over to rm127
22/04/01 19:10:07 INFO hdfs.DFSClient: Created token for hive: HDFS_DELEGATION_TOKEN owner=hive@DEV.ENN.CN, renewer=yarn, realUser=, issueDate=1648811407106, maxDate=1649416207106, sequenceNumber=176449, masterKeyId=2259 on ha-hdfs:nameservice1
22/04/01 19:10:07 INFO security.TokenCache: Got dt for hdfs://nameservice1; Kind: HDFS_DELEGATION_TOKEN, Service: ha-hdfs:nameservice1, Ident: (token for hive: HDFS_DELEGATION_TOKEN owner=hive@DEV.ENN.CN, renewer=yarn, realUser=, issueDate=1648811407106, maxDate=1649416207106, sequenceNumber=176449, masterKeyId=2259)
22/04/01 19:10:07 INFO mapreduce.JobResourceUploader: Disabling Erasure Coding for path: /user/hive/.staging/job_1648759620123_0052
22/04/01 19:10:09 INFO input.FileInputFormat: Total input files to process : 37
22/04/01 19:10:09 INFO input.FileInputFormat: Total input files to process : 37
22/04/01 19:10:09 INFO mapreduce.JobSubmitter: number of splits:2
22/04/01 19:10:09 INFO Configuration.deprecation: yarn.resourcemanager.zk-address is deprecated. Instead, use hadoop.zk.address
22/04/01 19:10:09 INFO Configuration.deprecation: yarn.resourcemanager.system-metrics-publisher.enabled is deprecated. Instead, use yarn.system-metrics-publisher.enabled
22/04/01 19:10:09 INFO mapreduce.JobSubmitter: Submitting tokens for job: job_1648759620123_0052
22/04/01 19:10:09 INFO mapreduce.JobSubmitter: Executing with tokens: [Kind: HDFS_DELEGATION_TOKEN, Service: ha-hdfs:nameservice1, Ident: (token for hive: HDFS_DELEGATION_TOKEN owner=hive@DEV.ENN.CN, renewer=yarn, realUser=, issueDate=1648811407106, maxDate=1649416207106, sequenceNumber=176449, masterKeyId=2259)]
22/04/01 19:10:09 INFO conf.Configuration: resource-types.xml not found
22/04/01 19:10:09 INFO resource.ResourceUtils: Unable to find 'resource-types.xml'.
22/04/01 19:10:09 INFO impl.YarnClientImpl: Submitted application application_1648759620123_0052
22/04/01 19:10:09 INFO mapreduce.Job: The url to track the job: http://host243.master.dev.cluster.enn.cn:8088/proxy/application_1648759620123_0052/
22/04/01 19:10:09 INFO mapreduce.Job: Running job: job_1648759620123_0052
22/04/01 19:10:17 INFO mapreduce.Job: Job job_1648759620123_0052 running in uber mode : false
22/04/01 19:10:17 INFO mapreduce.Job:  map 0% reduce 0%
22/04/01 19:10:26 INFO mapreduce.Job:  map 50% reduce 0%
22/04/01 19:10:38 INFO mapreduce.Job:  map 59% reduce 0%
22/04/01 19:10:44 INFO mapreduce.Job:  map 67% reduce 0%
22/04/01 19:10:50 INFO mapreduce.Job:  map 75% reduce 0%
22/04/01 19:10:56 INFO mapreduce.Job:  map 84% reduce 0%
22/04/01 19:11:02 INFO mapreduce.Job:  map 92% reduce 0%
22/04/01 19:11:07 INFO mapreduce.Job:  map 100% reduce 0%
22/04/01 19:11:07 INFO mapreduce.Job: Job job_1648759620123_0052 completed successfully
22/04/01 19:11:07 INFO mapreduce.Job: Counters: 34
        File System Counters
                FILE: Number of bytes read=0
                FILE: Number of bytes written=504986
                FILE: Number of read operations=0
                FILE: Number of large read operations=0
                FILE: Number of write operations=0
                HDFS: Number of bytes read=135301211
                HDFS: Number of bytes written=0
                HDFS: Number of read operations=113
                HDFS: Number of large read operations=0
                HDFS: Number of write operations=0
                HDFS: Number of bytes read erasure-coded=0
        Job Counters 
                Launched map tasks=2
                Other local map tasks=1
                Data-local map tasks=1
                Total time spent by all maps in occupied slots (ms)=106796
                Total time spent by all reduces in occupied slots (ms)=0
                Total time spent by all map tasks (ms)=53398
                Total vcore-milliseconds taken by all map tasks=53398
                Total megabyte-milliseconds taken by all map tasks=109359104
        Map-Reduce Framework
                Map input records=1500414
                Map output records=1500414
                Input split bytes=3902
                Spilled Records=0
                Failed Shuffles=0
                Merged Map outputs=0
                GC time elapsed (ms)=296
                CPU time spent (ms)=35690
                Physical memory (bytes) snapshot=930320384
                Virtual memory (bytes) snapshot=5727088640
                Total committed heap usage (bytes)=1557135360
                Peak Map Physical memory (bytes)=534118400
                Peak Map Virtual memory (bytes)=2866765824
        File Input Format Counters 
                Bytes Read=0
        File Output Format Counters 
                Bytes Written=0
22/04/01 19:11:07 INFO mapreduce.ExportJobBase: Transferred 129.0333 MB in 60.3414 seconds (2.1384 MB/sec)
22/04/01 19:11:07 INFO mapreduce.ExportJobBase: Exported 1500414 records.&lt;/LI-CODE&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;&amp;nbsp;As we can see, the Sqoop job finished successfully, but this error still appears in the container logs.&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;LI-CODE lang="markup"&gt;Log Type: container-localizer-syslog

Log Upload Time: Fri Apr 01 19:11:14 +0800 2022

Log Length: 3398

2022-04-01 19:10:18,487 WARN [main] org.apache.hadoop.security.LdapGroupsMapping: Exception while trying to get password for alias hadoop.security.group.mapping.ldap.bind.password: 
java.io.IOException: Configuration problem with provider path.
	at org.apache.hadoop.conf.Configuration.getPasswordFromCredentialProviders(Configuration.java:2272)
	at org.apache.hadoop.conf.Configuration.getPassword(Configuration.java:2191)
	at org.apache.hadoop.security.LdapGroupsMapping.getPassword(LdapGroupsMapping.java:719)
	at org.apache.hadoop.security.LdapGroupsMapping.setConf(LdapGroupsMapping.java:616)
	at org.apache.hadoop.util.ReflectionUtils.setConf(ReflectionUtils.java:77)
	at org.apache.hadoop.util.ReflectionUtils.newInstance(ReflectionUtils.java:137)
	at org.apache.hadoop.security.Groups.&amp;lt;init&amp;gt;(Groups.java:106)
	at org.apache.hadoop.security.Groups.&amp;lt;init&amp;gt;(Groups.java:102)
	at org.apache.hadoop.security.Groups.getUserToGroupsMappingService(Groups.java:451)
	at org.apache.hadoop.security.UserGroupInformation.initialize(UserGroupInformation.java:352)
	at org.apache.hadoop.security.UserGroupInformation.ensureInitialized(UserGroupInformation.java:314)
	at org.apache.hadoop.security.UserGroupInformation.doSubjectLogin(UserGroupInformation.java:1973)
	at org.apache.hadoop.security.UserGroupInformation.createLoginUser(UserGroupInformation.java:743)
	at org.apache.hadoop.security.UserGroupInformation.getLoginUser(UserGroupInformation.java:693)
	at org.apache.hadoop.security.UserGroupInformation.getCurrentUser(UserGroupInformation.java:604)
	at org.apache.hadoop.yarn.server.nodemanager.containermanager.localizer.ContainerLocalizer.main(ContainerLocalizer.java:461)
Caused by: java.io.FileNotFoundException: /var/run/cloudera-scm-agent/process/26878-yarn-NODEMANAGER/creds.localjceks (Permission denied)
	at java.io.FileInputStream.open0(Native Method)
	at java.io.FileInputStream.open(FileInputStream.java:195)
	at java.io.FileInputStream.&amp;lt;init&amp;gt;(FileInputStream.java:138)
	at org.apache.hadoop.security.alias.LocalJavaKeyStoreProvider.getInputStreamForFile(LocalJavaKeyStoreProvider.java:83)
	at org.apache.hadoop.security.alias.AbstractJavaKeyStoreProvider.locateKeystore(AbstractJavaKeyStoreProvider.java:321)
	at org.apache.hadoop.security.alias.AbstractJavaKeyStoreProvider.&amp;lt;init&amp;gt;(AbstractJavaKeyStoreProvider.java:86)
	at org.apache.hadoop.security.alias.LocalJavaKeyStoreProvider.&amp;lt;init&amp;gt;(LocalJavaKeyStoreProvider.java:58)
	at org.apache.hadoop.security.alias.LocalJavaKeyStoreProvider.&amp;lt;init&amp;gt;(LocalJavaKeyStoreProvider.java:50)
	at org.apache.hadoop.security.alias.LocalJavaKeyStoreProvider$Factory.createProvider(LocalJavaKeyStoreProvider.java:177)
	at org.apache.hadoop.security.alias.CredentialProviderFactory.getProviders(CredentialProviderFactory.java:73)
	at org.apache.hadoop.conf.Configuration.getPasswordFromCredentialProviders(Configuration.java:2253)
	... 15 more
2022-04-01 19:10:18,723 INFO [main] org.apache.hadoop.yarn.server.nodemanager.containermanager.localizer.ContainerLocalizer: Disk Validator: yarn.nodemanager.disk-validator is loaded.
2022-04-01 19:10:19,741 WARN [ContainerLocalizer Downloader] org.apache.hadoop.ipc.Client: Exception encountered while connecting to the server : org.apache.hadoop.ipc.RemoteException(org.apache.hadoop.ipc.StandbyException): Operation category READ is not supported in state standby. Visit https://s.apache.org/sbnn-error

Log Type: prelaunch.err

Log Upload Time: Fri Apr 01 19:11:14 +0800 2022

Log Length: 0


Log Type: prelaunch.out

Log Upload Time: Fri Apr 01 19:11:14 +0800 2022

Log Length: 70

Setting up env variables
Setting up job resources
Launching container

Log Type: stderr

Log Upload Time: Fri Apr 01 19:11:14 +0800 2022

Log Length: 0


Log Type: stdout

Log Upload Time: Fri Apr 01 19:11:14 +0800 2022

Log Length: 0


Log Type: syslog

Log Upload Time: Fri Apr 01 19:11:14 +0800 2022

Log Length: 45172

Showing 4096 bytes of 45172 total. Click here for the full log.

ainer_e483_1648759620123_0052_01_000002/transaction-api-1.1.jar:/data/yarn/nm/usercache/hive/appcache/application_1648759620123_0052/container_e483_1648759620123_0052_01_000002/commons-jexl-2.1.1.jar
java.io.tmpdir: /data/yarn/nm/usercache/hive/appcache/application_1648759620123_0052/container_e483_1648759620123_0052_01_000002/tmp
user.dir: /data/yarn/nm/usercache/hive/appcache/application_1648759620123_0052/container_e483_1648759620123_0052_01_000002
user.name: hive
************************************************************/
2022-04-01 19:10:22,636 INFO [main] org.apache.hadoop.conf.Configuration.deprecation: session.id is deprecated. Instead, use dfs.metrics.session-id
2022-04-01 19:10:23,151 INFO [main] org.apache.hadoop.mapred.Task:  Using ResourceCalculatorProcessTree : [ ]
2022-04-01 19:10:23,248 WARN [main] org.apache.hadoop.ipc.Client: Exception encountered while connecting to the server : org.apache.hadoop.ipc.RemoteException(org.apache.hadoop.ipc.StandbyException): Operation category READ is not supported in state standby. Visit https://s.apache.org/sbnn-error
2022-04-01 19:10:23,427 INFO [main] org.apache.hadoop.mapred.MapTask: Processing split: Paths:/user/hive/warehouse/penglin.db/label_cus_kpi_hightable_h/000005_0:0+1582218,/user/hive/warehouse/penglin.db/label_cus_kpi_hightable_h/000008_0:0+22415866,/user/hive/warehouse/penglin.db/label_cus_kpi_hightable_h/000014_0:0+23029717,/user/hive/warehouse/penglin.db/label_cus_kpi_hightable_h/000015_0:0+10901528,/user/hive/warehouse/penglin.db/label_cus_kpi_hightable_h/000017_0:0+17525005,/user/hive/warehouse/penglin.db/label_cus_kpi_hightable_h/000018_0:0+16289569,/user/hive/warehouse/penglin.db/label_cus_kpi_hightable_h/000036_0:0+43553385
2022-04-01 19:10:23,432 INFO [main] org.apache.hadoop.conf.Configuration.deprecation: map.input.file is deprecated. Instead, use mapreduce.map.input.file
2022-04-01 19:10:23,432 INFO [main] org.apache.hadoop.conf.Configuration.deprecation: map.input.start is deprecated. Instead, use mapreduce.map.input.start
2022-04-01 19:10:23,432 INFO [main] org.apache.hadoop.conf.Configuration.deprecation: map.input.length is deprecated. Instead, use mapreduce.map.input.length
2022-04-01 19:11:04,627 INFO [Thread-14] org.apache.sqoop.mapreduce.AutoProgressMapper: Auto-progress thread is finished. keepGoing=false
2022-04-01 19:11:04,671 INFO [main] org.apache.hadoop.mapred.Task: Task:attempt_1648759620123_0052_m_000000_0 is done. And is in the process of committing
2022-04-01 19:11:04,715 INFO [main] org.apache.hadoop.mapred.Task: Task 'attempt_1648759620123_0052_m_000000_0' done.
2022-04-01 19:11:04,728 INFO [main] org.apache.hadoop.mapred.Task: Final Counters for attempt_1648759620123_0052_m_000000_0: Counters: 26
	File System Counters
		FILE: Number of bytes read=0
		FILE: Number of bytes written=252493
		FILE: Number of read operations=0
		FILE: Number of large read operations=0
		FILE: Number of write operations=0
		HDFS: Number of bytes read=135298087
		HDFS: Number of bytes written=0
		HDFS: Number of read operations=22
		HDFS: Number of large read operations=0
		HDFS: Number of write operations=0
		HDFS: Number of bytes read erasure-coded=0
	Map-Reduce Framework
		Map input records=1500414
		Map output records=1500414
		Input split bytes=778
		Spilled Records=0
		Failed Shuffles=0
		Merged Map outputs=0
		GC time elapsed (ms)=231
		CPU time spent (ms)=34070
		Physical memory (bytes) snapshot=534118400
		Virtual memory (bytes) snapshot=2866765824
		Total committed heap usage (bytes)=788529152
		Peak Map Physical memory (bytes)=534118400
		Peak Map Virtual memory (bytes)=2866765824
	File Input Format Counters 
		Bytes Read=0
	File Output Format Counters 
		Bytes Written=0
2022-04-01 19:11:04,829 INFO [main] org.apache.hadoop.metrics2.impl.MetricsSystemImpl: Stopping MapTask metrics system...
2022-04-01 19:11:04,829 INFO [main] org.apache.hadoop.metrics2.impl.MetricsSystemImpl: MapTask metrics system stopped.
2022-04-01 19:11:04,829 INFO [main] org.apache.hadoop.metrics2.impl.MetricsSystemImpl: MapTask metrics system shutdown complete&lt;/LI-CODE&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;</description>
      <pubDate>Fri, 01 Apr 2022 11:15:29 GMT</pubDate>
      <guid>https://community.cloudera.com/t5/Support-Questions/Exception-while-trying-to-get-password-for-alias-hadoop/m-p/340299#M233306</guid>
      <dc:creator>iamfromsky</dc:creator>
      <dc:date>2022-04-01T11:15:29Z</dc:date>
    </item>
    <item>
      <title>Re: Exception while trying to get password for alias hadoop.security.group.mapping.ldap.bind.password:</title>
      <link>https://community.cloudera.com/t5/Support-Questions/Exception-while-trying-to-get-password-for-alias-hadoop/m-p/340600#M233352</link>
      <description>&lt;P&gt;&lt;a href="https://community.cloudera.com/t5/user/viewprofilepage/user-id/11191"&gt;@araujo&lt;/a&gt;&amp;nbsp; do you have any suggestion for this case ?&amp;nbsp;&amp;nbsp;&lt;/P&gt;</description>
      <pubDate>Tue, 05 Apr 2022 14:43:15 GMT</pubDate>
      <guid>https://community.cloudera.com/t5/Support-Questions/Exception-while-trying-to-get-password-for-alias-hadoop/m-p/340600#M233352</guid>
      <dc:creator>iamfromsky</dc:creator>
      <dc:date>2022-04-05T14:43:15Z</dc:date>
    </item>
  </channel>
</rss>

