
Hive stops logging once Spark is installed

Hello everyone,

I have a very mysterious problem. On a Hive server without Spark, hive.log is written as expected.
As soon as I install Spark, hive.log is no longer written.
The cause is the environment variable SPARK_HOME: if I take the following block
out of /usr/lib/hive/bin/hive, hive.log is written again.

if [[ -z "$SPARK_HOME" ]]
then
  bin=`dirname "$0"`
  # many hadoop installs are in dir/{spark,hive,hadoop,..}
  sparkHome=$(readlink -f $bin/../../spark)
  if [[ -d $sparkHome ]]
  then
    export SPARK_HOME=$sparkHome
  fi
fi

The same problem occurs if I set SPARK_HOME manually.
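For what it's worth, here is a quick way to confirm the behaviour without editing the launcher script. This is only a sketch: it assumes the standard CDH path /usr/lib/hive/bin/hive, a Spark install under /usr/lib/spark, and a GNU env that supports -u.

# Run Hive once with SPARK_HOME removed from the environment
env -u SPARK_HOME /usr/lib/hive/bin/hive -e "show databases;"

# Run it again with SPARK_HOME set, e.g. to the assumed CDH Spark directory
SPARK_HOME=/usr/lib/spark /usr/lib/hive/bin/hive -e "show databases;"

# After each run, check whether hive.log was updated at the location
# configured by hive.log.dir in hive-log4j.properties
# (e.g. /var/log/hive or /tmp/<user>, depending on your setup).

In my case, only the first run produces a hive.log.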

Does anybody have an idea or a hint? Many thanks in advance!

The CDH version in use is 5.7.4.

Regards, Daniel

 
