Support Questions


How to use SnappyData 0.7 on HDP 2.5.0?

New Contributor

/usr/hdp/2.5.0.0-1245/spark2/bin/spark-shell --conf spark.snappydata.store.sys-disk-dir=quickstartdatadir --conf spark.snappydata.store.log-file=quickstartdatadir/quickstart.log --conf spark.sql.warehouse.dir=file:/hadoop/snappydata/warehouse --jars file:/home/hadoop/snappydata-0.7-s_2.11.jar
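As an aside, the same `--conf` settings can be collected into a standard Spark properties file, which keeps the launch command short. The property names below are copied verbatim from the command above; the file name is arbitrary:

```shell
# Write the settings from the spark-shell invocation above into a
# properties file ("key value" pairs, one per line, standard Spark format).
cat > snappy-quickstart.conf <<'EOF'
spark.snappydata.store.sys-disk-dir quickstartdatadir
spark.snappydata.store.log-file quickstartdatadir/quickstart.log
spark.sql.warehouse.dir file:/hadoop/snappydata/warehouse
EOF

# The launch then becomes (same jar as above):
# /usr/hdp/2.5.0.0-1245/spark2/bin/spark-shell \
#   --properties-file snappy-quickstart.conf \
#   --jars file:/home/hadoop/snappydata-0.7-s_2.11.jar
```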

When I get to the step that creates the snappy table, an error comes up:

snappy.createTable(tableName = "colTable",
  provider = "column", // Create a SnappyData column table
  schema = tableSchema,
  options = Map.empty[String, String], // Map for options
  allowExisting = false)

error:

17/03/10 11:02:14 INFO HiveClientUtil: Default warehouse location is /hadoop/snappydata/warehouse
17/03/10 11:02:14 INFO HiveClientUtil: Using SnappyStore as metastore database, dbURL = jdbc:snappydata:;mcast-port=0;disable-streaming=true;default-persistent=true
17/03/10 11:02:15 INFO metastore: Trying to connect to metastore with URI thrift://bigdata1:9083
17/03/10 11:02:15 INFO metastore: Connected to metastore.
17/03/10 11:02:15 INFO HiveClientUtil: Initializing HiveMetastoreConnection version 1.2.1 using Spark classes.
17/03/10 11:02:15 INFO metastore: Trying to connect to metastore with URI thrift://bigdata1:9083
17/03/10 11:02:15 INFO metastore: Connected to metastore.
17/03/10 11:02:15 INFO SessionState: Created local directory: /tmp/20569da9-f824-41e8-b89d-a7c2e0ccf177_resources
17/03/10 11:02:15 INFO SessionState: Created HDFS directory: /tmp/hive/hadoop/20569da9-f824-41e8-b89d-a7c2e0ccf177
17/03/10 11:02:15 INFO SessionState: Created local directory: /tmp/hadoop/20569da9-f824-41e8-b89d-a7c2e0ccf177
17/03/10 11:02:15 INFO SessionState: Created HDFS directory: /tmp/hive/hadoop/20569da9-f824-41e8-b89d-a7c2e0ccf177/_tmp_space.db
17/03/10 11:02:15 INFO HiveClientImpl: Warehouse location for Hive client (version 1.2.1) is /hadoop/snappydata/warehouse
17/03/10 11:02:15 INFO SessionState: Created local directory: /tmp/511a82b4-c992-48c2-8ca0-7a5d1515c14e_resources
17/03/10 11:02:15 INFO SessionState: Created HDFS directory: /tmp/hive/hadoop/511a82b4-c992-48c2-8ca0-7a5d1515c14e
17/03/10 11:02:15 INFO SessionState: Created local directory: /tmp/hadoop/511a82b4-c992-48c2-8ca0-7a5d1515c14e
17/03/10 11:02:15 INFO SessionState: Created HDFS directory: /tmp/hive/hadoop/511a82b4-c992-48c2-8ca0-7a5d1515c14e/_tmp_space.db
17/03/10 11:02:15 INFO HiveClientImpl: Warehouse location for Hive client (version 1.2.1) is /hadoop/snappydata/warehouse

org.apache.hadoop.hive.ql.metadata.HiveException: MetaException(message:java.security.AccessControlException: Permission denied: user=hadoop, path="file:/hadoop":root:root:drwxr-xr-x)
  at org.apache.hadoop.hive.ql.metadata.Hive.createDatabase(Hive.java:312)
  at org.apache.spark.sql.hive.client.HiveClientImpl$$anonfun$createDatabase$1.apply$mcV$sp(HiveClientImpl.scala:291)
  at org.apache.spark.sql.hive.client.HiveClientImpl$$anonfun$createDatabase$1.apply(HiveClientImpl.scala:291)
  at org.apache.spark.sql.hive.client.HiveClientImpl$$anonfun$createDatabase$1.apply(HiveClientImpl.scala:291)
  at org.apache.spark.sql.hive.client.HiveClientImpl$$anonfun$withHiveState$1.apply(HiveClientImpl.scala:262)
  at org.apache.spark.sql.hive.client.HiveClientImpl.liftedTree1$1(HiveClientImpl.scala:209)
  at org.apache.spark.sql.hive.client.HiveClientImpl.retryLocked(HiveClientImpl.scala:208)
  at org.apache.spark.sql.hive.client.HiveClientImpl.withHiveState(HiveClientImpl.scala:251)
  at org.apache.spark.sql.hive.client.HiveClientImpl.createDatabase(HiveClientImpl.scala:290)
  at org.apache.spark.sql.hive.SnappyStoreHiveCatalog.<init>(SnappyStoreHiveCatalog.scala:94)
  at org.apache.spark.sql.internal.SnappySessionState.catalog$lzycompute(SnappySessionState.scala:198)
  at org.apache.spark.sql.internal.SnappySessionState.catalog(SnappySessionState.scala:198)
  at org.apache.spark.sql.SnappySession.sessionCatalog$lzycompute(SnappySession.scala:118)
  at org.apache.spark.sql.SnappySession.sessionCatalog(SnappySession.scala:118)
  at org.apache.spark.sql.SnappySession.createTable(SnappySession.scala:719)
  ... 58 elided
Caused by: org.apache.hadoop.hive.metastore.api.MetaException: java.security.AccessControlException: Permission denied: user=hadoop, path="file:/hadoop":root:root:drwxr-xr-x
  at org.apache.hadoop.hive.metastore.api.ThriftHiveMetastore$create_database_result$create_database_resultStandardScheme.read(ThriftHiveMetastore.java:14412)
  at org.apache.hadoop.hive.metastore.api.ThriftHiveMetastore$create_database_result$create_database_resultStandardScheme.read(ThriftHiveMetastore.java:14380)
  at org.apache.hadoop.hive.metastore.api.ThriftHiveMetastore$create_database_result.read(ThriftHiveMetastore.java:14314)
  at org.apache.thrift.TServiceClient.receiveBase(TServiceClient.java:78)
  at org.apache.hadoop.hive.metastore.api.ThriftHiveMetastore$Client.recv_create_database(ThriftHiveMetastore.java:625)
  at org.apache.hadoop.hive.metastore.api.ThriftHiveMetastore$Client.create_database(ThriftHiveMetastore.java:612)
  at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.createDatabase(HiveMetaStoreClient.java:644)
  at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
  at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
  at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
  at java.lang.reflect.Method.invoke(Method.java:497)
  at org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.invoke(RetryingMetaStoreClient.java:156)
  at com.sun.proxy.$Proxy23.createDatabase(Unknown Source)
  at org.apache.hadoop.hive.ql.metadata.Hive.createDatabase(Hive.java:306)
  ... 72 more

However, the permissions on /hadoop look correct: drwxr-xr-x. 3 hadoop hadoop 4096 Mar 9 13:40 hadoop. Any advice? Is SnappyData 0.7 incompatible with HDP 2.5.0? Thank you!
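One detail worth checking: the exception reports the path as file:/hadoop, i.e. the local filesystem on the metastore host, so the ls output to compare against is the local one on that host, not HDFS. A quick way to print exactly the attributes the AccessControlException shows (owner:group and mode), demonstrated on /tmp as a stand-in path:

```shell
# Print owner:group and mode for a local path; these are the same
# attributes the AccessControlException reports after the path
# ("...:root:root:drwxr-xr-x"). /tmp is used here as a stand-in;
# on the cluster you would run this against /hadoop on the
# metastore host.
stat -c '%U:%G %A' /tmp
```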

2 Replies

Expert Contributor

Before going into compatibility: your logs show that the /hadoop directory has different permissions than you expect:

java.security.AccessControlException: Permission denied: user=hadoop, path="file:/hadoop":root:root:drwxr-xr-x

/hadoop is owned by root:root with write permission only for root, so no other user can write to it. Change the ownership to root:hadoop and the permissions to 775.
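Concretely, that advice amounts to the two commands below, run as root on the host where the metastore resolves file:/hadoop. Since they need root, the sketch demonstrates the mode change on a throwaway directory; substitute /hadoop (and the chown) on the real system:

```shell
# On the real host, as root, the fix would be:
#   chown root:hadoop /hadoop
#   chmod 775 /hadoop
# Demonstrated here on a scratch directory so it runs without root:
dir=$(mktemp -d)
chmod 755 "$dir"        # like the reported /hadoop: only the owner may write
stat -c '%a' "$dir"     # prints 755
chmod 775 "$dir"        # group members gain write access
stat -c '%a' "$dir"     # prints 775
rmdir "$dir"
```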

Hope it helps.

New Contributor

Thanks for your reply. I changed the ownership to root:hadoop with permission 777, but the same error occurs.
