Support Questions


How do I suppress INFO and WARNING messages when connecting to Hive?

Contributor

I am making a connection to Hive in Java with the following:

Connection conn = DriverManager.getConnection(connectionUrl, userName, password);

I then immediately get these INFO messages:

Aug 30, 2016 5:54:53 PM org.apache.hive.jdbc.Utils parseURL
INFO: Supplied authorities: hdb:10000
Aug 30, 2016 5:54:53 PM org.apache.hive.jdbc.Utils parseURL
INFO: Resolved authority: hdb:10000
SLF4J: Failed to load class "org.slf4j.impl.StaticLoggerBinder".
SLF4J: Defaulting to no-operation (NOP) logger implementation
SLF4J: See http://www.slf4j.org/codes.html#StaticLoggerBinder for further details.

On the command line these messages go to STDERR, so I could redirect STDERR to /dev/null, but then I would also lose any real error messages written to STDERR, which is not acceptable.

Is there a URL connection option, or something similar, that suppresses INFO and WARNING messages when Hive makes a connection?

1 ACCEPTED SOLUTION

Contributor

I figured it out. I had to do the following:

1. Add slf4j-api.jar, slf4j-log4j12.jar, log4j.jar, hive-jdbc.jar, and hadoop-common.jar to my classpath.

2. Create a log4j.properties file with:

log4j.rootLogger=WARN, console
log4j.appender.console=org.apache.log4j.ConsoleAppender
log4j.appender.console.layout=org.apache.log4j.PatternLayout
log4j.appender.console.layout.conversionPattern=%5p [%t] (%F:%L) - %m%n

3. Pass the path to the log4j.properties file to the JVM:

-Dlog4j.configuration=file:/path/to/log4j.properties

That is a lot of work just to change the logging level. I wish Hive would default to showing only ERROR and FATAL messages instead of all WARN and INFO messages.
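An alternative that avoids the separate properties file is to configure log4j programmatically before opening the connection. This is only a sketch, assuming the same jars as step 1 are on the classpath (so SLF4J binds to log4j); the class name, URL, and credentials are placeholders:

import java.sql.Connection;
import java.sql.DriverManager;

import org.apache.log4j.ConsoleAppender;
import org.apache.log4j.Level;
import org.apache.log4j.Logger;
import org.apache.log4j.PatternLayout;

public class QuietHiveConnect {
  public static void main(String[] args) throws Exception {
    // Equivalent of the log4j.properties above, done in code:
    // a console appender plus a root level that hides INFO (and WARN).
    Logger root = Logger.getRootLogger();
    root.removeAllAppenders();
    root.addAppender(new ConsoleAppender(
        new PatternLayout("%5p [%t] (%F:%L) - %m%n")));
    root.setLevel(Level.ERROR);   // or Level.WARN to match the properties file

    // Placeholder connection details -- replace with your own.
    Connection conn = DriverManager.getConnection(
        "jdbc:hive2://hdb:10000/default", "userName", "password");
    System.out.println("connected");
    conn.close();
  }
}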


6 REPLIES

Guru

You should be able to specify "hiveConfs" in the JDBC URL and set "hive.root.logger" to something like "FATAL" or "ERROR"; that should suppress some of the INFO messages. Hive also reads these values from log4j.properties, so another option is to set the level there.

Reference:

https://docs.hortonworks.com/HDPDocuments/HDP2/HDP-2.3.0/bk_dataintegration/content/hive-jdbc-odbc-d...
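For reference, a sketch of how a hiveConfs entry would be passed in the URL, based on the documented format jdbc:hive2://<host>:<port>/<db>;<sessionConfs>?<hiveConfs>#<hiveVars>. The host, port, and credentials are placeholders, and the specific key/value is the suggestion above, not something verified to work (the follow-up below shows HiveServer2 may reject it):

import java.sql.Connection;
import java.sql.DriverManager;

public class HiveConfUrlSketch {
  public static void main(String[] args) throws Exception {
    Class.forName("org.apache.hive.jdbc.HiveDriver");
    // hiveConfs follow the '?' separator in the URL;
    // hive.root.logger=ERROR,console is the setting suggested above.
    String url = "jdbc:hive2://hdb:10000/default?hive.root.logger=ERROR,console";
    Connection conn = DriverManager.getConnection(url, "userName", "password");
    System.out.println("connected");
    conn.close();
  }
}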

Contributor

I tried adding it both as otherSessionConfs and as hiveConfs, but neither worked. With hiveConfs, I got this error message:

Failed to open new session: org.apache.hive.service.cli.HiveSQLException: java.lang.IllegalArgumentException: hive configuration hive.root.logger does not exists.

Guru
@Jon Roberts

Can you try something like this:

[hive@node1 ~]$ cat TestJdbcClient.java
import java.sql.*;
import org.apache.hadoop.security.UserGroupInformation;


public class TestJdbcClient {
  public static void main (String args[]) {
    try {
      org.apache.hadoop.conf.Configuration conf = new org.apache.hadoop.conf.Configuration();
      conf.set("hadoop.security.authentication", "Kerberos");
      UserGroupInformation.setConfiguration(conf);
      UserGroupInformation.loginUserFromKeytab("hive/node1.hortonworks.com@HWX.COM", "/etc/security/keytabs/hive.service.keytab");
      Class.forName("org.apache.hive.jdbc.HiveDriver");
      System.out.println("getting connection");
      Connection con = DriverManager.getConnection("jdbc:hive2://node1.hortonworks.com:10000/default;hive.root.logger=ERROR,DFRA;principal=hive/node1.hortonworks.com@HWX.COM");
      System.out.println("got connection");

      Statement stmt = con.createStatement();
      String sql = "show tables";
      System.out.println("Running: " + sql);
      ResultSet res = stmt.executeQuery(sql);
      if (res.next()) {
        System.out.println(res.getString(1));
        while (res.next()) {
          System.out.println(res.getString(1));
        }
      }
      con.close();
    }
    catch (Exception e) {
      e.printStackTrace();
    }
  }
}


This is my result:

[hive@node1 ~]$ java -cp /usr/hdp/2.3.4.0-3485/hive/lib/hive-jdbc-1.2.1.2.3.4.0-3485-standalone.jar:/usr/hdp/2.3.4.0-3485/hadoop/client/commons-configuration-1.6.jar:/etc/hive/conf/hive-site.xml:/usr/hdp/2.3.4.0-3485/hadoop/client/hadoop-common-2.7.1.2.3.4.0-3485.jar:/usr/hdp/2.3.4.0-3485/hadoop-yarn/lib/log4j-1.2.17.jar:/usr/hdp/2.3.4.0-3485/hadoop/client/slf4j-log4j12.jar:/usr/hdp/2.3.4.0-3485/hadoop/hadoop-auth-2.7.1.2.3.4.0-3485.jar:/usr/hdp/2.3.4.0-3485/hive-hcatalog/share/webhcat/svr/lib/xercesImpl-2.9.1.jar:. TestJdbcClient
log4j:WARN No appenders could be found for logger (org.apache.hadoop.metrics2.lib.MutableMetricsFactory).
log4j:WARN Please initialize the log4j system properly.
log4j:WARN See http://logging.apache.org/log4j/1.2/faq.html#noconfig for more info.
getting connection
got connection
Running: show tables
csvinternal
customera
cvsexternal
mytemp
sample_07
src
temp_source
test
testabc
testnormal
testnormal1
tgt

Contributor

Your example also displays the unwanted log4j WARN messages.


Explorer

Can you help me with more detail on this? How exactly should I do it?

When I connect to Hive or run any command in Hive, I get plenty of INFO and WARN messages, and my query results get lost in all that noise. How do I get rid of them?