Member since: 10-13-2016
Posts: 6
Kudos Received: 0
Solutions: 1

My Accepted Solutions
Title | Views | Posted
---|---|---
 | 1168 | 02-23-2017 05:28 PM
02-23-2017 05:28 PM
I also found out from the Phoenix bulk-load example that I need to provide HADOOP_CLASSPATH:

HADOOP_CLASSPATH=/usr/hdp/current/hbase-master/lib/hbase-protocol.jar:/usr/hdp/current/hbase-master/conf hadoop jar $1/phoenix-client.jar org.apache.phoenix.mapreduce.CsvBulkLoadTool -Dfs.permissions.umask-mode=000 --table TEST.FLIGHT_SEGMENT --input /tmp/test/Segments.tsv -z devone1.lab.com:2181/hbase-unsecure
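A more readable sketch of the same invocation, with the client-jar path written out as in the question below (the original used a script argument, $1):

```sh
# hbase-protocol.jar plus the HBase conf dir (where hbase-site.xml and
# zookeeper.znode.parent live) must be visible to the bulk-load tool.
export HADOOP_CLASSPATH=/usr/hdp/current/hbase-master/lib/hbase-protocol.jar:/usr/hdp/current/hbase-master/conf

hadoop jar /usr/hdp/current/phoenix-client/phoenix-client.jar \
  org.apache.phoenix.mapreduce.CsvBulkLoadTool \
  -Dfs.permissions.umask-mode=000 \
  --table TEST.FLIGHT_SEGMENT \
  --input /tmp/test/Segments.tsv \
  -z devone1.lab.com:2181/hbase-unsecure
```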
02-20-2017 03:59 PM
Running the Phoenix bulk load:

hadoop jar /usr/hdp/current/phoenix-client/phoenix-client.jar org.apache.phoenix.mapreduce.CsvBulkLoadTool -Dfs.permissions.umask-mode=000 --table TEST.FLIGHT_SEGMENT --input /tmp/test/Segments.tsv -z devone1.lab.com:2181/hbase-unsecure

I get an NPE. I verified that HBase has the correct zookeeper.znode.parent = /hbase-unsecure, and I am running the jar as the hdfs user. Any thoughts?

Exception in thread "main" java.sql.SQLException: java.lang.RuntimeException: java.lang.NullPointerException
at org.apache.phoenix.query.ConnectionQueryServicesImpl$13.call(ConnectionQueryServicesImpl.java:2590)
at org.apache.phoenix.query.ConnectionQueryServicesImpl$13.call(ConnectionQueryServicesImpl.java:2327)
at org.apache.phoenix.util.PhoenixContextExecutor.call(PhoenixContextExecutor.java:78)
at org.apache.phoenix.query.ConnectionQueryServicesImpl.init(ConnectionQueryServicesImpl.java:2327)
at org.apache.phoenix.jdbc.PhoenixDriver.getConnectionQueryServices(PhoenixDriver.java:233)
at org.apache.phoenix.jdbc.PhoenixEmbeddedDriver.createConnection(PhoenixEmbeddedDriver.java:142)
at org.apache.phoenix.jdbc.PhoenixDriver.connect(PhoenixDriver.java:202)
at java.sql.DriverManager.getConnection(DriverManager.java:664)
at java.sql.DriverManager.getConnection(DriverManager.java:208)
at org.apache.phoenix.util.QueryUtil.getConnection(QueryUtil.java:305)
at org.apache.phoenix.util.QueryUtil.getConnection(QueryUtil.java:296)
at org.apache.phoenix.mapreduce.AbstractBulkLoadTool.loadData(AbstractBulkLoadTool.java:209)
at org.apache.phoenix.mapreduce.AbstractBulkLoadTool.run(AbstractBulkLoadTool.java:183)
at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:76)
at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:90)
at org.apache.phoenix.mapreduce.CsvBulkLoadTool.main(CsvBulkLoadTool.java:101)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at org.apache.hadoop.util.RunJar.run(RunJar.java:233)
at org.apache.hadoop.util.RunJar.main(RunJar.java:148)
Caused by: java.lang.RuntimeException: java.lang.NullPointerException
at org.apache.hadoop.hbase.client.RpcRetryingCaller.callWithoutRetries(RpcRetryingCaller.java:208)
at org.apache.hadoop.hbase.client.ClientScanner.call(ClientScanner.java:326)
at org.apache.hadoop.hbase.client.ClientScanner.nextScanner(ClientScanner.java:301)
at org.apache.hadoop.hbase.client.ClientScanner.initializeScannerInConstruction(ClientScanner.java:166)
at org.apache.hadoop.hbase.client.ClientScanner.<init>(ClientScanner.java:161)
at org.apache.hadoop.hbase.client.HTable.getScanner(HTable.java:794)
at org.apache.hadoop.hbase.MetaTableAccessor.fullScan(MetaTableAccessor.java:602)
at org.apache.hadoop.hbase.MetaTableAccessor.tableExists(MetaTableAccessor.java:366)
at org.apache.hadoop.hbase.client.HBaseAdmin.tableExists(HBaseAdmin.java:405)
at org.apache.phoenix.query.ConnectionQueryServicesImpl$13.call(ConnectionQueryServicesImpl.java:2358)
... 21 more
Caused by: java.lang.NullPointerException
at org.apache.hadoop.hbase.zookeeper.ZooKeeperWatcher.getMetaReplicaNodes(ZooKeeperWatcher.java:395)
at org.apache.hadoop.hbase.zookeeper.MetaTableLocator.blockUntilAvailable(MetaTableLocator.java:562)
at org.apache.hadoop.hbase.client.ZooKeeperRegistry.getMetaRegionLocation(ZooKeeperRegistry.java:61)
at org.apache.hadoop.hbase.client.ConnectionManager$HConnectionImplementation.locateMeta(ConnectionManager.java:1192)
at org.apache.hadoop.hbase.client.ConnectionManager$HConnectionImplementation.locateRegion(ConnectionManager.java:1159)
at org.apache.hadoop.hbase.client.RpcRetryingCallerWithReadReplicas.getRegionLocations(RpcRetryingCallerWithReadReplicas.java:300)
at org.apache.hadoop.hbase.client.ScannerCallableWithReplicas.call(ScannerCallableWithReplicas.java:156)
at org.apache.hadoop.hbase.client.ScannerCallableWithReplicas.call(ScannerCallableWithReplicas.java:60)
at org.apache.hadoop.hbase.client.RpcRetryingCaller.callWithoutRetries(RpcRetryingCaller.java:200)
... 30 more
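One quick way to confirm the znode parent the client actually resolves is to list it in ZooKeeper directly. A minimal sketch, assuming the stock HDP zkCli.sh location; the host and znode path are taken from the command above:

```sh
# A healthy path should list child znodes such as meta-region-server;
# "Node does not exist" would point at a wrong zookeeper.znode.parent.
echo "ls /hbase-unsecure" | /usr/hdp/current/zookeeper-client/bin/zkCli.sh \
  -server devone1.lab.com:2181
```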
Labels:
- Apache HBase
- Apache Phoenix
02-02-2017 04:59 PM
The only servlet dependencies I see are:

- javax.servlet.jsp:jsp-api:jar:2.1:runtime (from hadoop-common)
- javax.servlet:javax.servlet-api:jar:3.1.0:provided (version managed from 3.0.1; scope managed from compile) (from hadoop-client)
- org.glassfish.grizzly:grizzly-http-servlet:jar:2.1.2:compile (from hadoop-client)
- org.glassfish:javax.servlet:jar:3.1:compile (from hadoop-client)
- org.eclipse.jetty.orbit:javax.servlet:jar:3.0.0.v201112011016:compile (from hadoop-client)

Which should I remove? I have uploaded the dependencies in the attachment (dependencies.txt).
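For context, a listing like the one above can be reproduced by filtering the Maven dependency tree for servlet artifacts. A minimal sketch, run from the project root; the `:*servlet*` wildcard is just one way to match the relevant artifactIds:

```sh
# -Dverbose also reports conflicts and managed versions, which is where
# the "(version managed from ...)" annotations above come from.
mvn dependency:tree -Dverbose -Dincludes=':*servlet*'
```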
02-01-2017 10:04 PM
I keep getting a NoClassDefFoundError when using the Java API to create a SparkSession. My Java code is just a simple SparkSession:

spark = SparkSession.builder()
    .master("localhost")
    .config("SPARK_MAJOR_VERSION", "2")
    .config("SPARK_HOME", "/usr/hdp/current/spark2-client")
    .appName("Spark E2D")
    .getOrCreate();

Here is the output:

Using Spark's default log4j profile: org/apache/spark/log4j-defaults.properties
17/02/01 13:53:44 INFO SparkContext: Running Spark version 2.0.0
17/02/01 13:53:44 WARN NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
17/02/01 13:53:44 INFO SecurityManager: Changing view acls to: hdfs
17/02/01 13:53:44 INFO SecurityManager: Changing modify acls to: hdfs
17/02/01 13:53:44 INFO SecurityManager: Changing view acls groups to:
17/02/01 13:53:44 INFO SecurityManager: Changing modify acls groups to:
17/02/01 13:53:44 INFO SecurityManager: SecurityManager: authentication disabled; ui acls disabled; users with view permissions: Set(hdfs); groups with view permissions: Set(); users with modify permissions: Set(hdfs); groups with modify permissions: Set()
17/02/01 13:53:44 INFO Utils: Successfully started service 'sparkDriver' on port 37105.
17/02/01 13:53:44 INFO SparkEnv: Registering MapOutputTracker
17/02/01 13:53:44 INFO SparkEnv: Registering BlockManagerMaster
17/02/01 13:53:44 INFO DiskBlockManager: Created local directory at /tmp/blockmgr-0b01321d-cafe-4de1-b3f8-f232ce42754b
17/02/01 13:53:44 INFO MemoryStore: MemoryStore started with capacity 863.4 MB
17/02/01 13:53:44 INFO SparkEnv: Registering OutputCommitCoordinator

Exception in thread "main" java.lang.NoClassDefFoundError: javax/servlet/FilterRegistration
at org.spark_project.jetty.servlet.ServletContextHandler.<init>(ServletContextHandler.java:142)
at org.spark_project.jetty.servlet.ServletContextHandler.<init>(ServletContextHandler.java:135)
at org.spark_project.jetty.servlet.ServletContextHandler.<init>(ServletContextHandler.java:129)
at org.spark_project.jetty.servlet.ServletContextHandler.<init>(ServletContextHandler.java:99)
at org.apache.spark.ui.JettyUtils$.createServletHandler(JettyUtils.scala:128)
at org.apache.spark.ui.JettyUtils$.createServletHandler(JettyUtils.scala:115)
at org.apache.spark.ui.WebUI.attachPage(WebUI.scala:80)
at org.apache.spark.ui.WebUI$$anonfun$attachTab$1.apply(WebUI.scala:64)
at org.apache.spark.ui.WebUI$$anonfun$attachTab$1.apply(WebUI.scala:64)
at scala.collection.mutable.ResizableArray$class.foreach(ResizableArray.scala:59)
at scala.collection.mutable.ArrayBuffer.foreach(ArrayBuffer.scala:48)
at org.apache.spark.ui.WebUI.attachTab(WebUI.scala:64)
at org.apache.spark.ui.SparkUI.initialize(SparkUI.scala:68)
at org.apache.spark.ui.SparkUI.<init>(SparkUI.scala:81)
at org.apache.spark.ui.SparkUI$.create(SparkUI.scala:215)
at org.apache.spark.ui.SparkUI$.createLiveUI(SparkUI.scala:157)
at org.apache.spark.SparkContext.<init>(SparkContext.scala:443)
at org.apache.spark.SparkContext$.getOrCreate(SparkContext.scala:2256)
at org.apache.spark.sql.SparkSession$Builder$$anonfun$8.apply(SparkSession.scala:831)
at org.apache.spark.sql.SparkSession$Builder$$anonfun$8.apply(SparkSession.scala:823)
at scala.Option.getOrElse(Option.scala:121)
at org.apache.spark.sql.SparkSession$Builder.getOrCreate(SparkSession.scala:823)

My POM dependencies:

<dependencies>
  <dependency>
    <groupId>org.apache.hbase</groupId>
    <artifactId>hbase-client</artifactId>
    <version>1.2.4</version>
  </dependency>
  <dependency>
    <groupId>org.json</groupId>
    <artifactId>json</artifactId>
    <version>20090211</version>
  </dependency>
  <dependency>
    <groupId>junit</groupId>
    <artifactId>junit</artifactId>
    <scope>test</scope>
  </dependency>
  <dependency>
    <groupId>org.apache.hadoop</groupId>
    <artifactId>hadoop-common</artifactId>
    <version>2.7.3</version>
    <exclusions>
      <exclusion>
        <groupId>javax.servlet</groupId>
        <artifactId>servlet-api</artifactId>
      </exclusion>
    </exclusions>
  </dependency>
  <dependency>
    <groupId>org.slf4j</groupId>
    <artifactId>slf4j-api</artifactId>
  </dependency>
  <dependency>
    <groupId>joda-time</groupId>
    <artifactId>joda-time</artifactId>
  </dependency>
  <dependency>
    <groupId>org.apache.commons</groupId>
    <artifactId>commons-csv</artifactId>
  </dependency>
  <dependency>
    <groupId>commons-io</groupId>
    <artifactId>commons-io</artifactId>
  </dependency>
  <!-- Spark -->
  <dependency>
    <groupId>org.apache.spark</groupId>
    <artifactId>spark-core_2.11</artifactId>
    <version>2.0.0</version>
  </dependency>
  <dependency>
    <groupId>org.apache.spark</groupId>
    <artifactId>spark-sql_2.11</artifactId>
    <version>2.0.0</version>
  </dependency>
</dependencies>
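In case it is useful: javax.servlet.FilterRegistration was introduced in the Servlet 3.0 API, so this NoClassDefFoundError usually means an old Servlet 2.x servlet-api jar is arriving transitively and shadowing the 3.x one. A hedged sketch of one candidate fix, repeating the exclusion the POM above already applies to hadoop-common on hbase-client as well (hbase-client 1.2.x can pull servlet-api 2.5 in through its Hadoop dependencies; confirm the actual offending path with the dependency tree first):

```xml
<dependency>
  <groupId>org.apache.hbase</groupId>
  <artifactId>hbase-client</artifactId>
  <version>1.2.4</version>
  <exclusions>
    <!-- Servlet 2.x has no javax.servlet.FilterRegistration;
         excluding it lets the Servlet 3.x API win. -->
    <exclusion>
      <groupId>javax.servlet</groupId>
      <artifactId>servlet-api</artifactId>
    </exclusion>
  </exclusions>
</dependency>
```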
Labels:
- Apache Spark