Member since: 09-04-2018
Posts: 33
Kudos Received: 2
Solutions: 2
My Accepted Solutions
Title | Views | Posted
---|---|---
 | 14957 | 10-15-2018 09:26 AM
 | 22139 | 09-15-2018 08:53 PM
10-29-2018 03:16 PM
@Christos Stefanopoulos HDP 3.0 has a different way of integrating Apache Hive with Apache Spark, using the Hive Warehouse Connector. The article below explains the steps: https://community.hortonworks.com/content/kbentry/223626/integrating-apache-hive-with-apache-spark-hive-war.html
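For quick reference, a rough PySpark sketch of what the HWC usage from that article looks like. The JAR/zip paths, version, and JDBC URL below are placeholders, and the snippet assumes it is run inside a pyspark shell launched with the connector; see the article for the exact configuration.

# Launch pyspark with the HWC artifacts (paths/version are placeholders), e.g.:
# pyspark --jars /usr/hdp/current/hive_warehouse_connector/hive-warehouse-connector-assembly-<version>.jar \
#         --py-files /usr/hdp/current/hive_warehouse_connector/pyspark_hwc-<version>.zip \
#         --conf spark.sql.hive.hiveserver2.jdbc.url="jdbc:hive2://<hs2-host>:10000/default"

from pyspark_llap import HiveWarehouseSession

# Build a Hive Warehouse Connector session on top of the existing SparkSession.
hive = HiveWarehouseSession.session(spark).build()

hive.showDatabases().show()                            # list Hive databases
df = hive.executeQuery("SELECT * FROM mydb.mytable")   # read a managed Hive table (placeholder name)
df.show()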
10-29-2018 03:13 PM
HDP 3.0 has a different way of integrating Apache Hive with Apache Spark, using the Hive Warehouse Connector. The article below explains the steps: https://community.hortonworks.com/content/kbentry/223626/integrating-apache-hive-with-apache-spark-hive-war.html
10-15-2018 09:31 AM
@Tongzhou Zhou I copied /etc/hive/hive-site.xml from the Hive conf directory to /etc/spark2/ and then removed the properties below from /etc/spark2/conf/hive-site.xml. It's working now; I can see the Hive databases in Spark (pyspark, spark-shell, spark-sql, etc.).
hive.tez.cartesian-product.enabled
hive.metastore.warehouse.external.dir
hive.server2.webui.use.ssl
hive.heapsize
hive.server2.webui.port
hive.materializedview.rewriting.incremental
hive.server2.webui.cors.allowed.headers
hive.driver.parallel.compilation
hive.tez.bucket.pruning
hive.hook.proto.base-directory
hive.load.data.owner
hive.execution.mode
hive.service.metrics.codahale.reporter.classes
hive.strict.managed.tables
hive.create.as.insert.only
hive.optimize.dynamic.partition.hashjoin
hive.server2.webui.enable.cors
hive.metastore.db.type
hive.txn.strict.locking.mode
hive.metastore.transactional.event.listeners
hive.tez.input.generate.consistent.splits
Can you please try this and let me know if you still face the issue?
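In case it helps to automate the steps above, here is a rough sketch (paths assumed, not tested on a live cluster) that copies Hive's hive-site.xml into the Spark 2 conf dir and strips the properties listed above using Python's standard library:

# strip_hive_props.py - copy Hive's hive-site.xml into the Spark conf dir and
# drop the Hive-3-only properties that Spark 2.3's bundled Hive client rejects.
import shutil
import xml.etree.ElementTree as ET

SRC = "/etc/hive/conf/hive-site.xml"    # assumed Hive conf location
DST = "/etc/spark2/conf/hive-site.xml"  # assumed Spark conf location

REMOVE = {
    "hive.tez.cartesian-product.enabled",
    "hive.metastore.warehouse.external.dir",
    "hive.server2.webui.use.ssl",
    "hive.heapsize",
    "hive.server2.webui.port",
    "hive.materializedview.rewriting.incremental",
    "hive.server2.webui.cors.allowed.headers",
    "hive.driver.parallel.compilation",
    "hive.tez.bucket.pruning",
    "hive.hook.proto.base-directory",
    "hive.load.data.owner",
    "hive.execution.mode",
    "hive.service.metrics.codahale.reporter.classes",
    "hive.strict.managed.tables",
    "hive.create.as.insert.only",
    "hive.optimize.dynamic.partition.hashjoin",
    "hive.server2.webui.enable.cors",
    "hive.metastore.db.type",
    "hive.txn.strict.locking.mode",
    "hive.metastore.transactional.event.listeners",
    "hive.tez.input.generate.consistent.splits",
}

shutil.copyfile(SRC, DST)
tree = ET.parse(DST)
root = tree.getroot()  # the <configuration> element
for prop in list(root.findall("property")):
    if prop.findtext("name") in REMOVE:
        root.remove(prop)
tree.write(DST)

The exact property set and paths may differ on your cluster, so treat this only as a starting point.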
10-15-2018 09:26 AM
1 Kudo
I copied /etc/hive/hive-site.xml from the Hive conf directory to /etc/spark2/ and then removed the properties below from /etc/spark2/conf/hive-site.xml. It's working now; I can see the Hive databases in Spark (pyspark, spark-shell, spark-sql, etc.).
hive.tez.cartesian-product.enabled
hive.metastore.warehouse.external.dir
hive.server2.webui.use.ssl
hive.heapsize
hive.server2.webui.port
hive.materializedview.rewriting.incremental
hive.server2.webui.cors.allowed.headers
hive.driver.parallel.compilation
hive.tez.bucket.pruning
hive.hook.proto.base-directory
hive.load.data.owner
hive.execution.mode
hive.service.metrics.codahale.reporter.classes
hive.strict.managed.tables
hive.create.as.insert.only
hive.optimize.dynamic.partition.hashjoin
hive.server2.webui.enable.cors
hive.metastore.db.type
hive.txn.strict.locking.mode
hive.metastore.transactional.event.listeners
hive.tez.input.generate.consistent.splits
Do you see any consequences?
10-13-2018 05:54 PM
@Felix Albani If I don't copy hive-site.xml from the Hive conf directory to the Spark conf directory, I can't see the Hive databases in Spark (pyspark and spark-shell). Could you please explain which properties I should add to Spark's hive-site.xml, and where I should set hive.metastore.uris? If I copy only the property below from the Hive conf to the Spark conf, will that work?
Technical stack details: HDP 3.0, Spark 2.3, Hive 3.1
<configuration>
<property>
<name>hive.metastore.uris</name>
<!-- hostname must point to the Hive metastore URI in your cluster -->
<value>thrift://hostname:9083</value>
<description>URI for client to contact metastore server</description>
</property>
</configuration>
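For context, the alternative I am thinking of trying from pyspark instead of copying the whole file; a minimal sketch, assuming the metastore URI (the hostname below is a placeholder) and not yet verified on HDP 3.0:

from pyspark.sql import SparkSession

# Point Spark's Hive client at the metastore without a full hive-site.xml.
# spark.hadoop.* entries are copied into the underlying Hadoop configuration;
# "thrift://hostname:9083" is a placeholder for the real metastore URI.
spark = (SparkSession.builder
         .appName("hive-metastore-test")
         .config("spark.hadoop.hive.metastore.uris", "thrift://hostname:9083")
         .enableHiveSupport()
         .getOrCreate())

spark.sql("SHOW DATABASES").show()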
10-11-2018 11:40 AM
df.write.format("orc").mode("overwrite").saveAsTable("database.table-name")
When I create a Hive table through Spark like this, I am able to query the table from Spark, but I am having an issue accessing the table data through Hive. I get the error below:
Error: java.io.IOException: java.lang.IllegalArgumentException: bucketId out of range: -1 (state=,code=0)
I am able to view the table metadata.
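One workaround I am considering (not verified) is to give the table an explicit location so Spark creates it as an external, non-ACID table, since Hive 3 treats managed tables as ACID by default; the path and table name below are placeholders:

# Sketch only: writing with an explicit path makes Spark register the table as
# EXTERNAL, so Hive 3 should not try to read it as a managed ACID table.
(df.write
   .format("orc")
   .mode("overwrite")
   .option("path", "/warehouse/tablespace/external/hive/database.db/table_name")
   .saveAsTable("database.table_name"))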
10-11-2018 09:09 AM
@Tongzhou Zhou Sorry for the delayed response. After copying hive-site.xml from the Hive conf dir to the Spark conf dir, I am able to access Hive databases from pyspark and spark-shell, but I am also getting the same error when initiating a spark-sql session. Did you find the best way to use Hive databases across all the Spark entry points (spark-sql, pyspark, spark-shell, spark-submit, etc.)?
10-11-2018 08:58 AM
I am facing an issue while initiating a spark-sql session. Initially, when I started the Spark session, only the default database was visible (not Hive's default database, but Spark's own default). In order to see the Hive databases, I copied hive-site.xml from the Hive conf dir to the Spark conf dir. After copying hive-site.xml, I get the error below.
$ spark-sql
WARN HiveConf: HiveConf of name hive.tez.cartesian-product.enabled does not exist
WARN HiveConf: HiveConf of name hive.metastore.warehouse.external.dir does not exist
WARN HiveConf: HiveConf of name hive.server2.webui.use.ssl does not exist
WARN HiveConf: HiveConf of name hive.heapsize does not exist
WARN HiveConf: HiveConf of name hive.server2.webui.port does not exist
WARN HiveConf: HiveConf of name hive.materializedview.rewriting.incremental does not exist
WARN HiveConf: HiveConf of name hive.server2.webui.cors.allowed.headers does not exist
WARN HiveConf: HiveConf of name hive.driver.parallel.compilation does not exist
WARN HiveConf: HiveConf of name hive.tez.bucket.pruning does not exist
WARN HiveConf: HiveConf of name hive.hook.proto.base-directory does not exist
WARN HiveConf: HiveConf of name hive.load.data.owner does not exist
WARN HiveConf: HiveConf of name hive.execution.mode does not exist
WARN HiveConf: HiveConf of name hive.service.metrics.codahale.reporter.classes does not exist
WARN HiveConf: HiveConf of name hive.strict.managed.tables does not exist
WARN HiveConf: HiveConf of name hive.create.as.insert.only does not exist
WARN HiveConf: HiveConf of name hive.optimize.dynamic.partition.hashjoin does not exist
WARN HiveConf: HiveConf of name hive.server2.webui.enable.cors does not exist
WARN HiveConf: HiveConf of name hive.metastore.db.type does not exist
WARN HiveConf: HiveConf of name hive.txn.strict.locking.mode does not exist
WARN HiveConf: HiveConf of name hive.metastore.transactional.event.listeners does not exist
WARN HiveConf: HiveConf of name hive.tez.input.generate.consistent.splits does not exist
INFO metastore: Trying to connect to metastore with URI thrift://<host-name>:9083
INFO metastore: Connected to metastore.
INFO SessionState: Created local directory: /tmp/7b9d5455-e71a-4bd5-aa4b-385758b575a8_resources
INFO SessionState: Created HDFS directory: /tmp/hive/spark/7b9d5455-e71a-4bd5-aa4b-385758b575a8
INFO SessionState: Created local directory: /tmp/spark/7b9d5455-e71a-4bd5-aa4b-385758b575a8
INFO SessionState: Created HDFS directory: /tmp/hive/spark/7b9d5455-e71a-4bd5-aa4b-385758b575a8/_tmp_space.db
Exception in thread "main" java.lang.NoClassDefFoundError: org/apache/tez/dag/api/SessionNotRunning
at org.apache.hadoop.hive.ql.session.SessionState.start(SessionState.java:529)
at org.apache.spark.sql.hive.thriftserver.SparkSQLCLIDriver$.main(SparkSQLCLIDriver.scala:133)
at org.apache.spark.sql.hive.thriftserver.SparkSQLCLIDriver.main(SparkSQLCLIDriver.scala)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at org.apache.spark.deploy.JavaMainApplication.start(SparkApplication.scala:52)
at org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$runMain(SparkSubmit.scala:904)
at org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:198)
at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:228)
at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:137)
at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
Caused by: java.lang.ClassNotFoundException: org.apache.tez.dag.api.SessionNotRunning
at java.net.URLClassLoader.findClass(URLClassLoader.java:381)
at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:331)
at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
... 13 more
INFO ShutdownHookManager: Shutdown hook called
INFO ShutdownHookManager: Deleting directory /tmp/spark-911cc8f5-f53b-4ae6-add3-0c745581bead
$
I am able to run pyspark and spark-shell sessions successfully, and the Hive databases are visible to me in those sessions. The error is related to Tez, and I have confirmed that the Tez services are running fine. I am able to access Hive tables through HiveServer2. I am using HDP 3.0, where the Hive execution engine is Tez (MapReduce has been removed).
10-10-2018 11:53 AM
I am getting errors while installing HBase on a 3-node cluster. I am using HDP 3.0 on Oracle Linux machines and have Ranger installed on the cluster.
Error in the Ambari UI during installation:
ERROR: KeeperErrorCode = NoNode for /hbase-unsecure/meta-region-server
NotImplementedError: fstat unimplemented unsupported or native support failed to load; see http://wiki.jruby.org/Native-Libraries
Error in the HBase log dir (/var/log/hbase/hbase-hbase-regionserver-<hostname>.out):
INFO [main] internal.NativeLibraryLoader: /tmp/liborg_apache_hbase_thirdparty_netty_transport_native_epoll_x86_641206763580541674939.so exists but cannot be executed even when execute permissions set; check volume for "noexec" flag; use -Dio.netty.native.workdir=[path] to set native working directory separately.
ERROR [main] regionserver.HRegionServer: Failed construction RegionServer
java.lang.UnsatisfiedLinkError: failed to load the required native library
at org.apache.hbase.thirdparty.io.netty.channel.epoll.Epoll.ensureAvailability(Epoll.java:81)
at org.apache.hbase.thirdparty.io.netty.channel.epoll.EpollEventLoop.<clinit>(EpollEventLoop.java:55)
at org.apache.hbase.thirdparty.io.netty.channel.epoll.EpollEventLoopGroup.newChild(EpollEventLoopGroup.java:134)
at org.apache.hbase.thirdparty.io.netty.channel.epoll.EpollEventLoopGroup.newChild(EpollEventLoopGroup.java:35)
at org.apache.hbase.thirdparty.io.netty.util.concurrent.MultithreadEventExecutorGroup.<init>(MultithreadEventExecutorGroup.java:84)
at org.apache.hbase.thirdparty.io.netty.util.concurrent.MultithreadEventExecutorGroup.<init>(MultithreadEventExecutorGroup.java:58)
at org.apache.hbase.thirdparty.io.netty.util.concurrent.MultithreadEventExecutorGroup.<init>(MultithreadEventExecutorGroup.java:47)
at org.apache.hbase.thirdparty.io.netty.channel.MultithreadEventLoopGroup.<init>(MultithreadEventLoopGroup.java:59)
at org.apache.hbase.thirdparty.io.netty.channel.epoll.EpollEventLoopGroup.<init>(EpollEventLoopGroup.java:104)
at org.apache.hbase.thirdparty.io.netty.channel.epoll.EpollEventLoopGroup.<init>(EpollEventLoopGroup.java:91)
at org.apache.hbase.thirdparty.io.netty.channel.epoll.EpollEventLoopGroup.<init>(EpollEventLoopGroup.java:68)
at org.apache.hadoop.hbase.util.NettyEventLoopGroupConfig.<init>(NettyEventLoopGroupConfig.java:61)
at org.apache.hadoop.hbase.regionserver.HRegionServer.setupNetty(HRegionServer.java:673)
at org.apache.hadoop.hbase.regionserver.HRegionServer.<init>(HRegionServer.java:532)
at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
at org.apache.hadoop.hbase.regionserver.HRegionServer.constructRegionServer(HRegionServer.java:2977)
at org.apache.hadoop.hbase.regionserver.HRegionServerCommandLine.start(HRegionServerCommandLine.java:63)
at org.apache.hadoop.hbase.regionserver.HRegionServerCommandLine.run(HRegionServerCommandLine.java:87)
at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:76)
at org.apache.hadoop.hbase.util.ServerCommandLine.doMain(ServerCommandLine.java:149)
at org.apache.hadoop.hbase.regionserver.HRegionServer.main(HRegionServer.java:2995)
Caused by: java.lang.UnsatisfiedLinkError: /tmp/liborg_apache_hbase_thirdparty_netty_transport_native_epoll_x86_641206763580541674939.so: /tmp/liborg_apache_hbase_thirdparty_netty_transport_native_epoll_x86_641206763580541674939.so: failed to map segment from shared object: Operation not permitted
at java.lang.ClassLoader$NativeLibrary.load(Native Method)
at java.lang.ClassLoader.loadLibrary0(ClassLoader.java:1941)
at java.lang.ClassLoader.loadLibrary(ClassLoader.java:1824)
at java.lang.Runtime.load0(Runtime.java:809)
at java.lang.System.load(System.java:1086)
at org.apache.hbase.thirdparty.io.netty.util.internal.NativeLibraryUtil.loadLibrary(NativeLibraryUtil.java:36)
at org.apache.hbase.thirdparty.io.netty.util.internal.NativeLibraryLoader.loadLibrary(NativeLibraryLoader.java:243)
at org.apache.hbase.thirdparty.io.netty.util.internal.NativeLibraryLoader.load(NativeLibraryLoader.java:187)
at org.apache.hbase.thirdparty.io.netty.channel.epoll.Native.loadNativeLibrary(Native.java:207)
at org.apache.hbase.thirdparty.io.netty.channel.epoll.Native.<clinit>(Native.java:65)
at org.apache.hbase.thirdparty.io.netty.channel.epoll.Epoll.<clinit>(Epoll.java:33)
... 23 more
Suppressed: java.lang.UnsatisfiedLinkError: /tmp/liborg_apache_hbase_thirdparty_netty_transport_native_epoll_x86_641206763580541674939.so: /tmp/liborg_apache_hbase_thirdparty_netty_transport_native_epoll_x86_641206763580541674939.so: failed to map segment from shared object: Operation not permitted
at java.lang.ClassLoader$NativeLibrary.load(Native Method)
at java.lang.ClassLoader.loadLibrary0(ClassLoader.java:1941)
at java.lang.ClassLoader.loadLibrary(ClassLoader.java:1824)
at java.lang.Runtime.load0(Runtime.java:809)
at java.lang.System.load(System.java:1086)
at org.apache.hbase.thirdparty.io.netty.util.internal.NativeLibraryUtil.loadLibrary(NativeLibraryUtil.java:36)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at org.apache.hbase.thirdparty.io.netty.util.internal.NativeLibraryLoader$1.run(NativeLibraryLoader.java:263)
at java.security.AccessController.doPrivileged(Native Method)
at org.apache.hbase.thirdparty.io.netty.util.internal.NativeLibraryLoader.loadLibraryByHelper(NativeLibraryLoader.java:255)
at org.apache.hbase.thirdparty.io.netty.util.internal.NativeLibraryLoader.loadLibrary(NativeLibraryLoader.java:233)
... 27 more
Suppressed: java.lang.UnsatisfiedLinkError: no org_apache_hbase_thirdparty_netty_transport_native_epoll_x86_64 in java.library.path
at java.lang.ClassLoader.loadLibrary(ClassLoader.java:1867)
at java.lang.Runtime.loadLibrary0(Runtime.java:870)
at java.lang.System.loadLibrary(System.java:1122)
at org.apache.hbase.thirdparty.io.netty.util.internal.NativeLibraryUtil.loadLibrary(NativeLibraryUtil.java:38)
at org.apache.hbase.thirdparty.io.netty.util.internal.NativeLibraryLoader.loadLibrary(NativeLibraryLoader.java:243)
at org.apache.hbase.thirdparty.io.netty.util.internal.NativeLibraryLoader.load(NativeLibraryLoader.java:124)
... 26 more
Suppressed: java.lang.UnsatisfiedLinkError: no org_apache_hbase_thirdparty_netty_transport_native_epoll_x86_64 in java.library.path
at java.lang.ClassLoader.loadLibrary(ClassLoader.java:1867)
at java.lang.Runtime.loadLibrary0(Runtime.java:870)
at java.lang.System.loadLibrary(System.java:1122)
at org.apache.hbase.thirdparty.io.netty.util.internal.NativeLibraryUtil.loadLibrary(NativeLibraryUtil.java:38)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at org.apache.hbase.thirdparty.io.netty.util.internal.NativeLibraryLoader$1.run(NativeLibraryLoader.java:263)
at java.security.AccessController.doPrivileged(Native Method)
at org.apache.hbase.thirdparty.io.netty.util.internal.NativeLibraryLoader.loadLibraryByHelper(NativeLibraryLoader.java:255)
at org.apache.hbase.thirdparty.io.netty.util.internal.NativeLibraryLoader.loadLibrary(NativeLibraryLoader.java:233)
... 27 more
Suppressed: java.lang.UnsatisfiedLinkError: could not load a native library: org_apache_hbase_thirdparty_netty_transport_native_epoll
at org.apache.hbase.thirdparty.io.netty.util.internal.NativeLibraryLoader.load(NativeLibraryLoader.java:205)
at org.apache.hbase.thirdparty.io.netty.channel.epoll.Native.loadNativeLibrary(Native.java:210)
... 25 more
Caused by: java.io.FileNotFoundException: META-INF/native/liborg_apache_hbase_thirdparty_netty_transport_native_epoll.so
at org.apache.hbase.thirdparty.io.netty.util.internal.NativeLibraryLoader.load(NativeLibraryLoader.java:161)
... 26 more
Suppressed: java.lang.UnsatisfiedLinkError: no org_apache_hbase_thirdparty_netty_transport_native_epoll in java.library.path
at java.lang.ClassLoader.loadLibrary(ClassLoader.java:1867)
at java.lang.Runtime.loadLibrary0(Runtime.java:870)
at java.lang.System.loadLibrary(System.java:1122)
at org.apache.hbase.thirdparty.io.netty.util.internal.NativeLibraryUtil.loadLibrary(NativeLibraryUtil.java:38)
at org.apache.hbase.thirdparty.io.netty.util.internal.NativeLibraryLoader.loadLibrary(NativeLibraryLoader.java:243)
at org.apache.hbase.thirdparty.io.netty.util.internal.NativeLibraryLoader.load(NativeLibraryLoader.java:124)
... 26 more
Suppressed: java.lang.UnsatisfiedLinkError: no org_apache_hbase_thirdparty_netty_transport_native_epoll in java.library.path
at java.lang.ClassLoader.loadLibrary(ClassLoader.java:1867)
at java.lang.Runtime.loadLibrary0(Runtime.java:870)
at java.lang.System.loadLibrary(System.java:1122)
at org.apache.hbase.thirdparty.io.netty.util.internal.NativeLibraryUtil.loadLibrary(NativeLibraryUtil.java:38)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at org.apache.hbase.thirdparty.io.netty.util.internal.NativeLibraryLoader$1.run(NativeLibraryLoader.java:263)
at java.security.AccessController.doPrivileged(Native Method)
at org.apache.hbase.thirdparty.io.netty.util.internal.NativeLibraryLoader.loadLibraryByHelper(NativeLibraryLoader.java:255)
at org.apache.hbase.thirdparty.io.netty.util.internal.NativeLibraryLoader.loadLibrary(NativeLibraryLoader.java:233)
... 27 more
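Following the hint in the log itself ("check volume for noexec flag"), this is a small diagnostic sketch I plan to run on the region server host to confirm whether /tmp is mounted noexec, which would explain the "failed to map segment from shared object: Operation not permitted" error:

# Diagnostic sketch: look for a "noexec" mount option on /tmp in /proc/mounts.
noexec_found = False
with open("/proc/mounts") as mounts:
    for line in mounts:
        device, mountpoint, fstype, options = line.split()[:4]
        if mountpoint == "/tmp" and "noexec" in options.split(","):
            print("/tmp is mounted noexec:", line.strip())
            noexec_found = True
if not noexec_found:
    print("no separate noexec /tmp mount found; check the parent filesystem's mount options")

If /tmp does turn out to be noexec, remounting it with exec or pointing -Dio.netty.native.workdir at a writable, executable directory (as the log suggests) may help.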
10-10-2018 10:15 AM
Very useful information.