Member since 02-17-2017
19 Posts
3 Kudos Received
0 Solutions
12-10-2019
01:24 AM
Hi, have you tried submitting applications to another queue? Could you tell us whether it worked on other queues? The default queue has reached 99.7% capacity, and this may cause issues. Thanks, AK
05-28-2019
01:44 AM
@Carol Elliott,
1. The remote server needs to direct its output to your local machine via the DISPLAY environment variable.
   - echo $DISPLAY -> localhost:10.0 (with PuTTY, $DISPLAY is automatically set to point at display 10 or above when X forwarding is active)
   - Add an entry to your local SSH configuration file:
     Host remote.host.name
         ForwardX11 yes
2. The local machine must accept the connection after authenticating it. Authentication is done in one of two ways:
   - using ssh, which forwards the X connection
   - making an entry in the host access mechanism (the magic-cookie mechanism)
Hence, ask the remote side to direct its output to you; until it does, you cannot see the sqoop commands running on the remote server on your display.
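The steps above can be sketched as follows. This is a minimal sketch, assuming OpenSSH on the local machine and X11 forwarding permitted by the remote sshd; remote.host.name and the user name are placeholders:

```shell
# ~/.ssh/config on the local machine (placeholder host name):
#   Host remote.host.name
#       ForwardX11 yes

# Connect with X forwarding (-X has the same effect as the config entry):
ssh -X user@remote.host.name

# In the remote shell, verify that the display is forwarded back to you:
echo "$DISPLAY"    # typically localhost:10.0 or above when forwarding works
```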
01-15-2018
09:21 AM
The relevant logs are hiveserver2.log and hivemetastore.log. A couple of things to check. First, check whether the Hive table's directory already exists in HDFS. As far as I know, you should not use the "--hive-overwrite" parameter together with "--delete-target-dir". Just remove "--hive-overwrite" and re-run your query. After the existing table directory has been removed, the sqoop job cannot re-create the Hive table's directory in HDFS normally; I think it is a bug. Second, check that the column types match between the Oracle table and the Hive table. The warning "had to be cast to a less precise type in Hive" relates to how Hive maps data types from Oracle, MySQL, PostgreSQL, etc., for example an Oracle TIMESTAMP column mapped to a Hive STRING column.
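A minimal sketch of the suggested re-run. The connection string, credentials, table, and column names here are hypothetical placeholders; --map-column-hive is one way to make the Oracle-to-Hive type mapping explicit rather than relying on the default cast:

```shell
# Hypothetical connection details; substitute your own.
# Note: --hive-overwrite is omitted, per the advice above.
sqoop import \
  --connect jdbc:oracle:thin:@//dbhost:1521/ORCL \
  --username scott -P \
  --table EMPLOYEES \
  --hive-import \
  --delete-target-dir \
  --map-column-hive HIRE_DATE=string   # map Oracle TIMESTAMP explicitly
```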
01-10-2018
09:06 PM
This worked for us. Thank you, @Sridhar Reddy.
11-16-2018
12:42 AM
@carol elliott you need to set the client option for an unsupported terminal prior to launching via nohup:
export HADOOP_CLIENT_OPTS="-Djline.terminal=jline.UnsupportedTerminal"
nohup beeline -f foo.sql -u ${jdbcurl} >> nohup_beelineoutput.out &
03-17-2017
08:49 PM
hive.exec.scratchdir on this cluster is /tmp/hive. I don't know why the user appears to be exceeding the quota on a personal directory.
03-17-2017
06:26 AM
4 Kudos
@Carol Elliott can you try the '--target-dir' option? This will import the files into the /dest directory: sqoop import --connect <connect-str> --table <tableName> --target-dir /dest