Member since: 11-22-2016
Posts: 3
Kudos Received: 0
Solutions: 0
03-20-2017
08:39 AM
Thanks for your reply. I changed the ownership to root:hadoop with permission 777, but the same error occurs.
03-14-2017
11:20 AM
I launched the shell with:

/usr/hdp/2.5.0.0-1245/spark2/bin/spark-shell \
  --conf spark.snappydata.store.sys-disk-dir=quickstartdatadir \
  --conf spark.snappydata.store.log-file=quickstartdatadir/quickstart.log \
  --conf spark.sql.warehouse.dir=file:/hadoop/snappydata/warehouse \
  --jars file:/home/hadoop/snappydata-0.7-s_2.11.jar
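Inside the shell, the session and schema are set up roughly as in the quickstart (tableSchema below is a stand-in for illustration; my actual schema differs):

import org.apache.spark.sql.SnappySession
import org.apache.spark.sql.types._

// Wrap the existing SparkContext from spark-shell in a SnappySession.
val snappy = new SnappySession(spark.sparkContext)

// Stand-in schema; my real columns differ.
val tableSchema = StructType(Array(
  StructField("CustKey", IntegerType, false),
  StructField("CustName", StringType, false)))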
When I reached the step that creates the table:

snappy.createTable(
  tableName = "colTable",
  provider = "column",                  // Create a SnappyData Column table
  schema = tableSchema,
  options = Map.empty[String, String],  // Map for options.
  allowExisting = false)
This error came up:

17/03/10 11:02:14 INFO HiveClientUtil: Default warehouse location is /hadoop/snappydata/warehouse
17/03/10 11:02:14 INFO HiveClientUtil: Using SnappyStore as metastore database, dbURL = jdbc:snappydata:;mcast-port=0;disable-streaming=true;default-persistent=true
17/03/10 11:02:15 INFO metastore: Trying to connect to metastore with URI thrift://bigdata1:9083
17/03/10 11:02:15 INFO metastore: Connected to metastore.
17/03/10 11:02:15 INFO HiveClientUtil: Initializing HiveMetastoreConnection version 1.2.1 using Spark classes.
17/03/10 11:02:15 INFO metastore: Trying to connect to metastore with URI thrift://bigdata1:9083
17/03/10 11:02:15 INFO metastore: Connected to metastore.
17/03/10 11:02:15 INFO SessionState: Created local directory: /tmp/20569da9-f824-41e8-b89d-a7c2e0ccf177_resources
17/03/10 11:02:15 INFO SessionState: Created HDFS directory: /tmp/hive/hadoop/20569da9-f824-41e8-b89d-a7c2e0ccf177
17/03/10 11:02:15 INFO SessionState: Created local directory: /tmp/hadoop/20569da9-f824-41e8-b89d-a7c2e0ccf177
17/03/10 11:02:15 INFO SessionState: Created HDFS directory: /tmp/hive/hadoop/20569da9-f824-41e8-b89d-a7c2e0ccf177/_tmp_space.db
17/03/10 11:02:15 INFO HiveClientImpl: Warehouse location for Hive client (version 1.2.1) is /hadoop/snappydata/warehouse
17/03/10 11:02:15 INFO SessionState: Created local directory: /tmp/511a82b4-c992-48c2-8ca0-7a5d1515c14e_resources
17/03/10 11:02:15 INFO SessionState: Created HDFS directory: /tmp/hive/hadoop/511a82b4-c992-48c2-8ca0-7a5d1515c14e
17/03/10 11:02:15 INFO SessionState: Created local directory: /tmp/hadoop/511a82b4-c992-48c2-8ca0-7a5d1515c14e
17/03/10 11:02:15 INFO SessionState: Created HDFS directory: /tmp/hive/hadoop/511a82b4-c992-48c2-8ca0-7a5d1515c14e/_tmp_space.db
17/03/10 11:02:15 INFO HiveClientImpl: Warehouse location for Hive client (version 1.2.1) is /hadoop/snappydata/warehouse
org.apache.hadoop.hive.ql.metadata.HiveException: MetaException(message:java.security.AccessControlException: Permission denied: user=hadoop, path="file:/hadoop":root:root:drwxr-xr-x)
  at org.apache.hadoop.hive.ql.metadata.Hive.createDatabase(Hive.java:312)
  at org.apache.spark.sql.hive.client.HiveClientImpl$$anonfun$createDatabase$1.apply$mcV$sp(HiveClientImpl.scala:291)
  at org.apache.spark.sql.hive.client.HiveClientImpl$$anonfun$createDatabase$1.apply(HiveClientImpl.scala:291)
  at org.apache.spark.sql.hive.client.HiveClientImpl$$anonfun$createDatabase$1.apply(HiveClientImpl.scala:291)
  at org.apache.spark.sql.hive.client.HiveClientImpl$$anonfun$withHiveState$1.apply(HiveClientImpl.scala:262)
  at org.apache.spark.sql.hive.client.HiveClientImpl.liftedTree1$1(HiveClientImpl.scala:209)
  at org.apache.spark.sql.hive.client.HiveClientImpl.retryLocked(HiveClientImpl.scala:208)
  at org.apache.spark.sql.hive.client.HiveClientImpl.withHiveState(HiveClientImpl.scala:251)
  at org.apache.spark.sql.hive.client.HiveClientImpl.createDatabase(HiveClientImpl.scala:290)
  at org.apache.spark.sql.hive.SnappyStoreHiveCatalog.<init>(SnappyStoreHiveCatalog.scala:94)
  at org.apache.spark.sql.internal.SnappySessionState.catalog$lzycompute(SnappySessionState.scala:198)
  at org.apache.spark.sql.internal.SnappySessionState.catalog(SnappySessionState.scala:198)
  at org.apache.spark.sql.SnappySession.sessionCatalog$lzycompute(SnappySession.scala:118)
  at org.apache.spark.sql.SnappySession.sessionCatalog(SnappySession.scala:118)
  at org.apache.spark.sql.SnappySession.createTable(SnappySession.scala:719)
  ... 58 elided
Caused by: org.apache.hadoop.hive.metastore.api.MetaException: java.security.AccessControlException: Permission denied: user=hadoop, path="file:/hadoop":root:root:drwxr-xr-x
  at org.apache.hadoop.hive.metastore.api.ThriftHiveMetastore$create_database_result$create_database_resultStandardScheme.read(ThriftHiveMetastore.java:14412)
  at org.apache.hadoop.hive.metastore.api.ThriftHiveMetastore$create_database_result$create_database_resultStandardScheme.read(ThriftHiveMetastore.java:14380)
  at org.apache.hadoop.hive.metastore.api.ThriftHiveMetastore$create_database_result.read(ThriftHiveMetastore.java:14314)
  at org.apache.thrift.TServiceClient.receiveBase(TServiceClient.java:78)
  at org.apache.hadoop.hive.metastore.api.ThriftHiveMetastore$Client.recv_create_database(ThriftHiveMetastore.java:625)
  at org.apache.hadoop.hive.metastore.api.ThriftHiveMetastore$Client.create_database(ThriftHiveMetastore.java:612)
  at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.createDatabase(HiveMetaStoreClient.java:644)
  at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
  at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
  at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
  at java.lang.reflect.Method.invoke(Method.java:497)
  at org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.invoke(RetryingMetaStoreClient.java:156)
  at com.sun.proxy.$Proxy23.createDatabase(Unknown Source)
  at org.apache.hadoop.hive.ql.metadata.Hive.createDatabase(Hive.java:306)
  ... 72 more

However, ls on my machine shows that /hadoop is owned by hadoop:hadoop:

drwxr-xr-x. 3 hadoop hadoop 4096 Mar 9 13:40 hadoop
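One detail I notice: the AccessControlException is returned by the metastore server (it surfaces through recv_create_database), so the owner it reports for file:/hadoop (root:root) may reflect how that path looks on the metastore host bigdata1, not on the node where I ran ls. As a rough diagnostic sketch, this prints what the same spark-shell resolves locally for the path, just to compare against the error:

import org.apache.hadoop.fs.Path

// Resolve file:/hadoop through the Hadoop FileSystem API and print the
// owner, group, and permission bits the local client sees.
val p = new Path("file:/hadoop")
val fs = p.getFileSystem(spark.sparkContext.hadoopConfiguration)
val st = fs.getFileStatus(p)
println(s"${st.getOwner}:${st.getGroup} ${st.getPermission}")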
Any advice? Is SnappyData 0.7 not compatible with HDP 2.5.0? Thank you!
Labels:
Apache Spark
11-22-2016
10:58 AM
@mliem: How did you integrate Kerberos with OpenLDAP? Could you give me some advice?