
Hive stops logging when Spark is installed

Hello everyone,

I have a very mysterious problem. On a Hive server without Spark, hive.log is written as expected.
As soon as I install Spark, hive.log is no longer written.
The cause is the SPARK_HOME environment variable: if I take the following block
out of /usr/lib/hive/bin/hive, hive.log is written again.

if [[ -z "$SPARK_HOME" ]]
then
  bin=`dirname "$0"`
  # many hadoop installs are in dir/{spark,hive,hadoop,..}
  sparkHome=$(readlink -f $bin/../../spark)
  if [[ -d $sparkHome ]]
  then
    export SPARK_HOME=$sparkHome
  fi
fi
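
For reference, this is how I reproduce it from a shell. The log path is an assumption on my part: /tmp/$USER/hive.log is the stock hive-log4j.properties default (hive.log.dir), so adjust it to your configuration.

# With SPARK_HOME unset, the query updates hive.log as expected
unset SPARK_HOME
hive -e 'show databases;' > /dev/null
ls -l /tmp/$USER/hive.log

# With SPARK_HOME set (here the usual CDH location), hive.log stays untouched
export SPARK_HOME=/usr/lib/spark
hive -e 'show databases;' > /dev/null
ls -l /tmp/$USER/hive.log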

The same problem occurs if I set SPARK_HOME manually.
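
One test that might narrow it down: pass the logging settings explicitly on the command line and see whether hive.log comes back. DRFA is the file appender name from the stock hive-log4j.properties; the log directory here is just an example, not necessarily yours.

# Force the log4j root logger onto the file appender for a single session;
# if hive.log appears now, the issue is which log4j configuration gets loaded
export SPARK_HOME=/usr/lib/spark
hive --hiveconf hive.log.dir=/var/log/hive \
     --hiveconf hive.root.logger=INFO,DRFA \
     -e 'show databases;'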

Does anybody have an idea or a hint? Many thanks in advance!

The running CDH version is 5.7.4.

Regards, Daniel