Not able to load Phoenix table in Spark

Contributor

@Guilherme Braccialli

I have created a table in Phoenix and tried to load it using the DataFrame API in spark-shell.

I am getting the error "org.apache.phoenix.schema.TableNotFoundException: ERROR 1012 (42M03): Table undefined. tableName=TEST"


Note:

1. I checked in the HBase shell, and the table I created using the Phoenix shell exists in HBase too.

2. I am launching spark-shell as below:

spark-shell --master yarn-client \
  --jars /usr/hdp/current/phoenix-client/phoenix-4.4.0.2.3.4.0-3485-client.jar,/usr/hdp/current/phoenix-client/lib/phoenix-spark-4.4.0.2.3.4.0-3485.jar \
  --conf "spark.executor.extraClassPath=/usr/hdp/current/phoenix-client/phoenix-4.4.0.2.3.4.0-3485-client.jar"

- Attached the snapshots: phoenixtable.png, spark-phoenix-tableread2.png
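
The read in spark-shell looks roughly like the following (a reconstruction from the code shown in the final reply below, with the table name taken from the error message, since the snapshots are only attachments):

// Reconstruction: generic JDBC data source with the Phoenix driver.
// <host> stands for the ZooKeeper quorum host used in the cluster.
val df = sqlContext.read.format("jdbc").options(
  Map(
    "driver" -> "org.apache.phoenix.jdbc.PhoenixDriver",
    "url" -> "jdbc:phoenix:<host>:2181/hbase-unsecure",
    "dbtable" -> "TEST")
).load()
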
1 ACCEPTED SOLUTION

Master Guru

Try:

/usr/hdp/current/phoenix-client/bin/sqlline.py myserver:2181:/hbase-unsecure

Check your table and table space, and make sure the table is there. Is there a schema?

Then try the same query:

0: jdbc:phoenix:coolserverhortonworks> !tables
+------------+--------------+--------------+---------------+----------+------------+----------------------------+-----------------+--------------+-----------------+---------------+---+
| TABLE_CAT  | TABLE_SCHEM  |  TABLE_NAME  |  TABLE_TYPE   | REMARKS  | TYPE_NAME  | SELF_REFERENCING_COL_NAME  | REF_GENERATION  | INDEX_STATE  | IMMUTABLE_ROWS  | SALT_BUCKETS  | M |
+------------+--------------+--------------+---------------+----------+------------+----------------------------+-----------------+--------------+-----------------+---------------+---+
|            | SYSTEM       | CATALOG      | SYSTEM TABLE  |          |            |                            |                 |              | false           | null          | f |
|            | SYSTEM       | FUNCTION     | SYSTEM TABLE  |          |            |                            |                 |              | false           | null          | f |
|            | SYSTEM       | SEQUENCE     | SYSTEM TABLE  |          |            |                            |                 |              | false           | null          | f |
|            | SYSTEM       | STATS        | SYSTEM TABLE  |          |            |                            |                 |              | false           | null          | f |
|            |              | PHILLYCRIME  | TABLE         |          |            |                            |                 |              | false           | null          | f |
|            |              | PRICES       | TABLE         |          |            |                            |                 |              | false           | null          | f |
|            |              | TABLE1       | TABLE         |          |            |                            |                 |              | false           | null          | f |
+------------+--------------+--------------+---------------+----------+------------+----------------------------+-----------------+--------------+-----------------+---------------+---+
0: jdbc:phoenix:coolhortonworks>

https://phoenix.apache.org/phoenix_spark.html
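
For comparison, the phoenix-spark page linked above reads through the Phoenix connector rather than the generic JDBC source. A minimal sketch, assuming the same spark-shell session and substituting your own ZooKeeper quorum, znode and table (TABLE1 from the !tables listing is used here):

import org.apache.phoenix.spark._

// zkUrl takes host:port:/znode, matching the sqlline connect string above
val df = sqlContext.read
  .format("org.apache.phoenix.spark")
  .options(Map("table" -> "TABLE1", "zkUrl" -> "myserver:2181:/hbase-unsecure"))
  .load()

df.show()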


6 REPLIES


It is actually weird; if the table is not found in the metadata cache, it should catch the exception and try to update the cache from the server. Not sure why the exception is propagated out so early.

Super Collaborator

@anjul tiwari

Have you tried changing the HBase znode to '<host>:2181/hbase-unsecure' instead of /hbase?

Contributor

I tried with <host>:2181/hbase-unsecure, but I still get the same issue.

Screenshots attached: 7896-screen-shot-2016-09-22-at-15904-pm.png, 7897-screen-shot-2016-09-22-at-15404-pm.png


Contributor

Thanks a lot @Timothy Spann for your help. I am now able to load the Phoenix table in Spark. I was missing a colon (:) between the port number (2181) and the HBase znode (hbase-unsecure) while loading the table in Spark.

Now, after correcting that, Spark loads the Phoenix table.

Contributor

Hi @Timothy Spann and @Jasper, I found the cause of the issue now.

The issue was that I was not putting a colon (:) between the port (2181) and the HBase znode (hbase-unsecure) in the JDBC URL while loading the table in spark-shell.

- Earlier I was loading the table in spark-shell as below, which was giving me the TableNotFoundException:

val jdbcDF = sqlContext.read.format("jdbc").options(
  Map(
    "driver" -> "org.apache.phoenix.jdbc.PhoenixDriver",
    "url" -> "jdbc:phoenix:<host>:2181/hbase-unsecure",
    "dbtable" -> "TEST_TABLE2")
).load()

- But now, after putting a colon (:) between the port number (2181) and the HBase znode (hbase-unsecure), I am able to load the table:

val jdbcDF = sqlContext.read.format("jdbc").options(
  Map(
    "driver" -> "org.apache.phoenix.jdbc.PhoenixDriver",
    "url" -> "jdbc:phoenix:<host>:2181:/hbase-unsecure",
    "dbtable" -> "TEST_TABLE2")
).load()
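
As a quick sanity check after the corrected load (a hypothetical follow-up, not shown in the original post):

// Confirm the Phoenix table actually came back as a DataFrame
jdbcDF.printSchema()
jdbcDF.show(10)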