Created on 07-06-2019 01:43 AM - edited 09-16-2022 07:29 AM
{ "traceback": [ [ "/opt/cloudera/parcels/CDH-6.2.0-1.cdh6.2.0.p0.967373/lib/hue/build/env/lib/python2.7/site-packages/Django-1.11-py2.7.egg/django/core/handlers/exception.py", 41, "inner", "response = get_response(request)" ], [ "/opt/cloudera/parcels/CDH-6.2.0-1.cdh6.2.0.p0.967373/lib/hue/build/env/lib/python2.7/site-packages/Django-1.11-py2.7.egg/django/core/handlers/base.py", 249, "_legacy_get_response", "response = self._get_response(request)" ], [ "/opt/cloudera/parcels/CDH-6.2.0-1.cdh6.2.0.p0.967373/lib/hue/build/env/lib/python2.7/site-packages/Django-1.11-py2.7.egg/django/core/handlers/base.py", 187, "_get_response", "response = self.process_exception_by_middleware(e, request)" ], [ "/opt/cloudera/parcels/CDH-6.2.0-1.cdh6.2.0.p0.967373/lib/hue/build/env/lib/python2.7/site-packages/Django-1.11-py2.7.egg/django/core/handlers/base.py", 185, "_get_response", "response = wrapped_callback(request, *callback_args, **callback_kwargs)" ], [ "/opt/cloudera/parcels/CDH-6.2.0-1.cdh6.2.0.p0.967373/lib/hue/build/env/lib/python2.7/site-packages/Django-1.11-py2.7.egg/django/utils/decorators.py", 185, "inner", "return func(*args, **kwargs)" ], [ "/opt/cloudera/parcels/CDH-6.2.0-1.cdh6.2.0.p0.967373/lib/hue/desktop/libs/indexer/src/indexer/api3.py", 229, "guess_field_types", "for col in table_metadata:" ] ] }
CDH version: CDH 6.2.0. Can you give me any advice on this issue?
I have UAT, PRD, and DEV environments, and all of them show this issue. I have checked all the machines with the Cloudera Manager health tests, and every check passes.
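For what it is worth, the traceback stops at the `for col in table_metadata:` line in `indexer/api3.py` without showing the actual exception message. One guess on my side (purely an assumption, not confirmed by the log) is that the metadata call returned `None`, in which case the loop itself would blow up; a minimal sketch of that failure mode:

```python
# Hypothetical illustration only: if the metadata fetch returns None instead
# of a list of columns, the loop fails before any useful error is produced.
def guess_field_types(table_metadata):
    # Simplified stand-in for the failing line in indexer/api3.py
    return [col for col in table_metadata]

try:
    guess_field_types(None)  # what a failed metadata call might hand back
except TypeError as exc:
    print(exc)  # 'NoneType' object is not iterable
```

If that guess is right, the real question is why the table metadata lookup comes back empty in the first place.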
Created on 07-06-2019 11:11 AM - edited 07-06-2019 11:13 AM
Hi, Cloudera support:
Here is a summary of the Hue issues I am seeing on CDH 6.2.0 (EL7) in my environments:
1. The Hue editor cannot execute Sqoop1.
2. Hue/Oozie cannot submit a Sqoop1 workflow.
3. Hue cannot add a table or database (see the traceback above).
4. Hue cannot add an external database such as MySQL via JDBC or the rdbms interpreter.
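For context on item 4, the rdbms connection is configured in hue.ini; the shape of the configuration involved looks roughly like this (section names are from my reading of the Hue docs, and all hosts and credentials below are placeholders, not my real values):

```ini
[librdbms]
  [[databases]]
    [[[mysql]]]
      nice_name="MySQL"
      engine=mysql
      host=db-host.example.com
      port=3306
      user=hue_user
      password=secret

[notebook]
  [[interpreters]]
    [[[mysql]]]
      name=MySQL
      interface=rdbms
```

Even with this kind of configuration in place, adding the external database fails as described.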
If the errors told me what I was missing I could act on them, but the painful part is that the error log tells me nothing about why this happened.
Is there anything I might have missed while installing CDH? Could you give me some advice on how to solve these problems?
Created 07-08-2019 05:51 AM
I have thought these issues through, and I am now fairly sure all of these functions go through Sqoop: the Hue "add table" wizard runs a temporary Oozie/Sqoop job, executing Sqoop from the Hue editor is also a temporary Oozie/Sqoop job, and the Oozie Sqoop action is of course a Sqoop job.
All of them fail. What is the root cause of these issues?
Created 07-08-2019 03:15 PM
https://issues.cloudera.org/browse/HUE-8717
I searched Google and found that this bug has been fixed, but the fix only targets Hue 4.4. Does Cloudera provide a patch for Hue 4.3?
Thanks.
Created 07-08-2019 05:28 PM
After sending my last comment about these small Hue bugs, I did some more testing and want to share more details about the Hue 4.3 bugs.
Hue provides three methods of using Sqoop:
1) Hue "add table": I believe this runs as a temporary Oozie job.
2) The Sqoop1 notebook in the Hue editor: you enter a Sqoop command and execute it just like the Sqoop CLI. I believe this is also a temporary Oozie job, because the errors are identical; I will show the error logs below.
3) The Oozie Sqoop document (workflow action).
One thing I can confirm is that 3) hits a bug; the review is here: https://review.cloudera.org/r/13630/diff/2#0
I applied that patch, which changes some code in
desktop/libs/liboozie/src/liboozie/submission2.py
Now it works fine, but only when going through HiveServer2: I must add the --hs2-url parameter. If I omit it, I get an error saying GSS initialization failed (I am using Kerberos).
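To make the --hs2-url requirement concrete, here is a sketch of the kind of import that succeeds for me, expressed as an argument list so the quoting is unambiguous (every host, database, path, and Kerberos principal below is a placeholder, not my real value):

```python
import subprocess

# Sketch of a Sqoop import that only succeeds once --hs2-url is present.
# All hosts, credentials, and the principal are placeholders.
sqoop_cmd = [
    "sqoop", "import",
    "--connect", "jdbc:mysql://db-host:3306/testdb",
    "--username", "sqoop_user",
    "--password-file", "/user/hue/sqoop.pwd",
    "--table", "tbls",
    "--hive-import",
    # Without this flag the Hive load step fails with a GSS error under
    # Kerberos; with it, Sqoop loads the data through HiveServer2.
    "--hs2-url",
    "jdbc:hive2://hs2-host:10000/default;principal=hive/_HOST@EXAMPLE.COM",
]

print(" ".join(sqoop_cmd))
# On a gateway host with Sqoop installed you would actually run it:
# subprocess.run(sqoop_cmd, check=True)
```

The same command without the final --hs2-url pair reproduces the GSS failure I describe below.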
OK, let's test all three methods of importing a table from MySQL to Hive.
1) The Hue "add table" function.
After setting up the MySQL JDBC connection and choosing one table (or all tables), the error is now just 'command'. Previously the error was not 'command'; it changed after I fixed 3) (the Oozie Sqoop document bug). Before that fix, the error was the one I described at the top of this thread.
2) The Sqoop notebook in the Hue editor. I used a Sqoop command that I had already verified with the Sqoop CLI and with the Oozie Sqoop document, and the error is also just 'command'.
3) The Sqoop document in Oozie. Yes, after changing the code per the patch, it succeeds. But note what I described above: I MUST add the --hs2-url parameter, and even though the job succeeds, the launcher log still shows errors like the following:
07:53:11.834 [main] INFO org.apache.hadoop.mapreduce.Job - Counters: 33
	File System Counters
		FILE: Number of bytes read=0
		FILE: Number of bytes written=546636
		FILE: Number of read operations=0
		FILE: Number of large read operations=0
		FILE: Number of write operations=0
		HDFS: Number of bytes read=85
		HDFS: Number of bytes written=7864
		HDFS: Number of read operations=6
		HDFS: Number of large read operations=0
		HDFS: Number of write operations=2
		HDFS: Number of bytes read erasure-coded=0
	Job Counters
		Launched map tasks=1
		Other local map tasks=1
		Total time spent by all maps in occupied slots (ms)=6597
		Total time spent by all reduces in occupied slots (ms)=0
		Total time spent by all map tasks (ms)=6597
		Total vcore-milliseconds taken by all map tasks=6597
		Total megabyte-milliseconds taken by all map tasks=6755328
	Map-Reduce Framework
		Map input records=92
		Map output records=92
		Input split bytes=85
		Spilled Records=0
		Failed Shuffles=0
		Merged Map outputs=0
		GC time elapsed (ms)=72
		CPU time spent (ms)=2810
		Physical memory (bytes) snapshot=390754304
		Virtual memory (bytes) snapshot=2658488320
		Total committed heap usage (bytes)=530055168
		Peak Map Physical memory (bytes)=390754304
		Peak Map Virtual memory (bytes)=2658488320
	File Input Format Counters
		Bytes Read=0
	File Output Format Counters
		Bytes Written=7864
07:53:11.841 [main] INFO org.apache.sqoop.mapreduce.ImportJobBase - Transferred 7.6797 KB in 20.8706 seconds (376.7984 bytes/sec)
07:53:11.845 [main] INFO org.apache.sqoop.mapreduce.ImportJobBase - Retrieved 92 records.
07:53:11.845 [main] INFO org.apache.sqoop.mapreduce.ImportJobBase - Publishing Hive/Hcat import job data to Listeners for table tbls
07:53:11.849 [main] INFO org.apache.sqoop.hive.HiveServer2Client - Loading uploaded data into Hive.
07:53:11.881 [main] INFO org.apache.sqoop.manager.SqlManager - Executing SQL statement: SELECT t.* FROM `tbls` AS t LIMIT 1
07:53:11.889 [main] INFO org.apache.sqoop.hive.HiveServer2ConnectionFactory - Creating connection to HiveServer2 as: hdfs (auth:SIMPLE)
07:53:11.900 [main] INFO org.apache.hive.jdbc.Utils - Supplied authorities: 192.168.71.236:10000
07:53:11.901 [main] INFO org.apache.hive.jdbc.Utils - Resolved authority: 192.168.71.236:10000
07:53:12.667 [main] INFO org.apache.sqoop.hive.HiveClientCommon - Export directory is contains the _SUCCESS file only, removing the directory.
07:53:12.673 [main] INFO org.apache.sqoop.hive.HiveServer2Client - Hive import complete.
<<< Invocation of Sqoop command completed <<<
Hadoop Job IDs executed by Sqoop: job_1561790724479_0195
java.lang.reflect.InvocationTargetException
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:498)
	at org.apache.oozie.action.hadoop.LauncherAM.runActionMain(LauncherAM.java:410)
	at org.apache.oozie.action.hadoop.LauncherAM.access$300(LauncherAM.java:55)
	at org.apache.oozie.action.hadoop.LauncherAM$2.run(LauncherAM.java:223)
	at java.security.AccessController.doPrivileged(Native Method)
	at javax.security.auth.Subject.doAs(Subject.java:422)
	at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1875)
	at org.apache.oozie.action.hadoop.LauncherAM.run(LauncherAM.java:217)
	at org.apache.oozie.action.hadoop.LauncherAM$1.run(LauncherAM.java:153)
	at java.security.AccessController.doPrivileged(Native Method)
	at javax.security.auth.Subject.doAs(Subject.java:422)
	at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1875)
	at org.apache.oozie.action.hadoop.LauncherAM.main(LauncherAM.java:141)
Caused by: java.lang.SecurityException: Intercepted System.exit(0)
	at org.apache.oozie.action.hadoop.security.LauncherSecurityManager.checkExit(LauncherSecurityManager.java:57)
	at java.lang.Runtime.exit(Runtime.java:107)
	at java.lang.System.exit(System.java:971)
	at org.apache.sqoop.Sqoop.main(Sqoop.java:252)
	at org.apache.oozie.action.hadoop.SqoopMain.runSqoopJob(SqoopMain.java:214)
	at org.apache.oozie.action.hadoop.SqoopMain.run(SqoopMain.java:199)
	at org.apache.oozie.action.hadoop.LauncherMain.run(LauncherMain.java:104)
	at org.apache.oozie.action.hadoop.SqoopMain.main(SqoopMain.java:51)
	... 16 more
Intercepting System.exit(0)
If I do not add --hs2-url, it tells me GSS initialization failed.
The Sqoop CLI always succeeds, with no problems at all.
OK, that's everything. Whether these are bugs or not, how can they be fixed? Could you give me some advice? Thank you very much.