Support Questions


Hue + add table failed

Expert Contributor


{ "traceback": [
    [ "/opt/cloudera/parcels/CDH-6.2.0-1.cdh6.2.0.p0.967373/lib/hue/build/env/lib/python2.7/site-packages/Django-1.11-py2.7.egg/django/core/handlers/", 41, "inner", "response = get_response(request)" ],
    [ "/opt/cloudera/parcels/CDH-6.2.0-1.cdh6.2.0.p0.967373/lib/hue/build/env/lib/python2.7/site-packages/Django-1.11-py2.7.egg/django/core/handlers/", 249, "_legacy_get_response", "response = self._get_response(request)" ],
    [ "/opt/cloudera/parcels/CDH-6.2.0-1.cdh6.2.0.p0.967373/lib/hue/build/env/lib/python2.7/site-packages/Django-1.11-py2.7.egg/django/core/handlers/", 187, "_get_response", "response = self.process_exception_by_middleware(e, request)" ],
    [ "/opt/cloudera/parcels/CDH-6.2.0-1.cdh6.2.0.p0.967373/lib/hue/build/env/lib/python2.7/site-packages/Django-1.11-py2.7.egg/django/core/handlers/", 185, "_get_response", "response = wrapped_callback(request, *callback_args, **callback_kwargs)" ],
    [ "/opt/cloudera/parcels/CDH-6.2.0-1.cdh6.2.0.p0.967373/lib/hue/build/env/lib/python2.7/site-packages/Django-1.11-py2.7.egg/django/utils/", 185, "inner", "return func(*args, **kwargs)" ],
    [ "/opt/cloudera/parcels/CDH-6.2.0-1.cdh6.2.0.p0.967373/lib/hue/desktop/libs/indexer/src/indexer/", 229, "guess_field_types", "for col in table_metadata:" ]
] }
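The last frame above is the `for col in table_metadata:` loop in the indexer's `guess_field_types`, which suggests the metadata lookup returned something other than a list of columns. A minimal sketch of a defensive guard, assuming an error payload (a dict) can come back in place of the column list; the function body and dict shapes here are illustrative, not Hue's actual code:

```python
def guess_field_types(table_metadata):
    """Illustrative guard around the loop from the traceback above:
    fail with a clear message instead of crashing on an unexpected
    error payload. Not Hue's actual implementation."""
    if not isinstance(table_metadata, list):
        # A failed metastore/indexer call could return an error dict
        # here; iterating it would produce an opaque failure later.
        raise ValueError(
            "expected a list of column dicts, got %s"
            % type(table_metadata).__name__
        )
    return [
        {"name": col["name"], "type": col.get("type", "string")}
        for col in table_metadata
    ]
```

A guard like this would at least surface a readable error instead of the bare traceback.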


CDH version: CDH 6.2.0. Can you give me advice on this issue?


I have UAT, PRD, and DEV environments, and all of them have this issue. I have checked all the machines with the CM health test function; every check passes.




Expert Contributor

Hi, Cloudera support:


Here is a summary of the Hue issues I am seeing on CDH 6.2.0 on EL7 in my environment.


1. The Hue editor cannot execute Sqoop1.

2. Hue/Oozie cannot submit a Sqoop1 workflow.

3. Hue cannot add a table or database, as in the example above.

4. Hue cannot add an external database such as MySQL using JDBC or the rdbms interface.


If the errors told me I had missed something, I could fix it, but the painful part is that the error log tells me nothing about why this happened.


Did I miss anything while installing CDH? Could you give me advice on how to solve these problems?



Expert Contributor

I have thought about these issues, and I am now sure they all go through Sqoop: Hue "add table" runs a temporary Oozie/Sqoop job, executing Sqoop from the Hue editor is also a temporary Oozie/Sqoop job, and the Oozie Sqoop action is itself a Sqoop job.


All of these fail. What is the root cause of these issues?

Expert Contributor


I searched Google and found that this bug has been fixed, but the fix is only for Hue 4.4. Has Cloudera provided a patch for Hue 4.3?



Expert Contributor

After sending my last comment on these small Hue bugs, I did more testing, and I want to share more details about the Hue 4.3 bugs.


Hue provides three methods to use Sqoop:

1) Hue "add table" — I believe this runs as a temporary Oozie job.

2) The Sqoop1 notebook in the Hue editor — you enter a Sqoop command and execute it just like the Sqoop CLI. I believe this is also a temporary Oozie job, because the errors are identical; I will show the error logs below.

3) An Oozie Sqoop document.


One thing I can confirm is that 3) is a bug; the link is:


and I got a patch that changes some code in 



Now it works fine, but only when using HiveServer2. What I mean is that I must add the --hs2-url parameter; if I omit it, I get errors saying GSS init failed (I am using Kerberos).
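To make this concrete, here is a minimal sketch of the argument list I mean; the JDBC URLs, principal, user, and table name below are placeholders, not my real cluster values:

```python
def build_sqoop_import_args(mysql_url, table, hs2_url, hs2_user):
    """Assemble a sqoop import command as an argument list.
    Omitting --hs2-url on a Kerberized cluster is what triggered
    the GSS init failure described above."""
    return [
        "sqoop", "import",
        "--connect", mysql_url,   # source MySQL JDBC URL
        "--table", table,         # source table
        "--hive-import",          # load into Hive after the import
        "--hs2-url", hs2_url,     # HiveServer2 JDBC URL (required here)
        "--hs2-user", hs2_user,
    ]

args = build_sqoop_import_args(
    "jdbc:mysql://db-host:3306/metastore",  # placeholder host/db
    "tbls",
    "jdbc:hive2://hs2-host:10000/default;principal=hive/_HOST@EXAMPLE.COM",
    "hdfs",
)
```

With the --hs2-url and --hs2-user pair present, the Hive load goes through HiveServer2 instead of the embedded Hive CLI.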


OK, let's test all three of these methods to sqoop a table from MySQL to Hive.


1) Hue "add table" function



After setting up the MySQL JDBC connection and choosing a table (or all tables), the error is 'command'. Previously the error was something else; it became 'command' after I fixed 3) (the Oozie Sqoop document bug). Before that fix, the error was the one I described at the top of this message.


2) Sqoop notebook in the Hue editor: I used a Sqoop command that I had already tested in the Sqoop CLI and in an Oozie Sqoop document; the error is also 'command'.



3) Sqoop document in Oozie: yes, I changed some code following the patch, and it now succeeds. But note what I described above: I MUST add the --hs2-url parameter, and even though the job succeeds, it still prints some errors like the ones below:

07:53:11.834 [main] INFO  org.apache.hadoop.mapreduce.Job - Counters: 33
	File System Counters
		FILE: Number of bytes read=0
		FILE: Number of bytes written=546636
		FILE: Number of read operations=0
		FILE: Number of large read operations=0
		FILE: Number of write operations=0
		HDFS: Number of bytes read=85
		HDFS: Number of bytes written=7864
		HDFS: Number of read operations=6
		HDFS: Number of large read operations=0
		HDFS: Number of write operations=2
		HDFS: Number of bytes read erasure-coded=0
	Job Counters 
		Launched map tasks=1
		Other local map tasks=1
		Total time spent by all maps in occupied slots (ms)=6597
		Total time spent by all reduces in occupied slots (ms)=0
		Total time spent by all map tasks (ms)=6597
		Total vcore-milliseconds taken by all map tasks=6597
		Total megabyte-milliseconds taken by all map tasks=6755328
	Map-Reduce Framework
		Map input records=92
		Map output records=92
		Input split bytes=85
		Spilled Records=0
		Failed Shuffles=0
		Merged Map outputs=0
		GC time elapsed (ms)=72
		CPU time spent (ms)=2810
		Physical memory (bytes) snapshot=390754304
		Virtual memory (bytes) snapshot=2658488320
		Total committed heap usage (bytes)=530055168
		Peak Map Physical memory (bytes)=390754304
		Peak Map Virtual memory (bytes)=2658488320
	File Input Format Counters 
		Bytes Read=0
	File Output Format Counters 
		Bytes Written=7864
07:53:11.841 [main] INFO  org.apache.sqoop.mapreduce.ImportJobBase - Transferred 7.6797 KB in 20.8706 seconds (376.7984 bytes/sec)
07:53:11.845 [main] INFO  org.apache.sqoop.mapreduce.ImportJobBase - Retrieved 92 records.
07:53:11.845 [main] INFO  org.apache.sqoop.mapreduce.ImportJobBase - Publishing Hive/Hcat import job data to Listeners for table tbls
07:53:11.849 [main] INFO  org.apache.sqoop.hive.HiveServer2Client - Loading uploaded data into Hive.
07:53:11.881 [main] INFO  org.apache.sqoop.manager.SqlManager - Executing SQL statement: SELECT t.* FROM `tbls` AS t LIMIT 1
07:53:11.889 [main] INFO  org.apache.sqoop.hive.HiveServer2ConnectionFactory - Creating connection to HiveServer2 as: hdfs (auth:SIMPLE)
07:53:11.900 [main] INFO  org.apache.hive.jdbc.Utils - Supplied authorities:
07:53:11.901 [main] INFO  org.apache.hive.jdbc.Utils - Resolved authority:
07:53:12.667 [main] INFO  org.apache.sqoop.hive.HiveClientCommon - Export directory is contains the _SUCCESS file only, removing the directory.
07:53:12.673 [main] INFO  org.apache.sqoop.hive.HiveServer2Client - Hive import complete.

<<< Invocation of Sqoop command completed <<<

Hadoop Job IDs executed by Sqoop: job_1561790724479_0195

	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at sun.reflect.NativeMethodAccessorImpl.invoke(
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(
	at java.lang.reflect.Method.invoke(
	at org.apache.oozie.action.hadoop.LauncherAM.runActionMain(
	at org.apache.oozie.action.hadoop.LauncherAM.access$300(
	at org.apache.oozie.action.hadoop.LauncherAM$
	at Method)
	at org.apache.oozie.action.hadoop.LauncherAM$
	at Method)
	at org.apache.oozie.action.hadoop.LauncherAM.main(
Caused by: java.lang.SecurityException: Intercepted System.exit(0)
	at java.lang.Runtime.exit(
	at java.lang.System.exit(
	at org.apache.sqoop.Sqoop.main(
	at org.apache.oozie.action.hadoop.SqoopMain.runSqoopJob(
	at org.apache.oozie.action.hadoop.SqoopMain.main(
	... 16 more
Intercepting System.exit(0)
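On the `Intercepting System.exit(0)` lines above: as far as I can tell, the Oozie launcher traps the child program's exit call so it can record the exit code, so an intercepted exit with code 0 looks consistent with a successful run rather than a real failure. A rough Python analogue of that interception pattern (illustrative only, not Oozie's implementation):

```python
import sys

def run_intercepting_exit(main):
    """Run main() but trap sys.exit, returning the exit code to the
    caller instead of letting the process die (0 means success)."""
    try:
        main()
    except SystemExit as e:
        # sys.exit(0) still raises SystemExit; the launcher records
        # the code and treats 0 as a normal completion.
        return e.code if e.code is not None else 0
    return 0

# The wrapped program's exit is observed, not fatal:
code = run_intercepting_exit(lambda: sys.exit(0))
```

So the scary-looking stack trace may just be the launcher logging how it caught the exit.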

If I don't add --hs2-url, it tells me GSS init failed.


The Sqoop CLI always succeeds, with no problems at all.


OK, that's all. Bugs or not, how can I fix these? Could you give me some advice? Thank you very much.