Member since: 09-28-2015
Posts: 34
Kudos Received: 10
Solutions: 4
My Accepted Solutions
| Title | Views | Posted |
|---|---|---|
| | 993 | 01-20-2016 08:09 PM |
| | 4618 | 12-08-2015 09:50 PM |
| | 833 | 12-02-2015 10:42 PM |
| | 1772 | 10-09-2015 05:28 PM |
02-09-2016 04:05 PM
What is the solution to this problem? I also encountered it: I was able to delete and re-install, but it failed with the same error.
01-20-2016 10:10 PM
We suspect a Hive table's CBO statistics are incorrect. How can we check them, and to re-compute them, should we just run these commands again?
analyze table t compute statistics; analyze table t compute statistics for columns;
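As a sketch of how the current statistics could be inspected before re-running the ANALYZE commands (the table name `t` is from the question above; the column name `c` is a placeholder):

```sql
-- Table-level stats (numRows, rawDataSize, totalSize) appear under
-- "Table Parameters" in the output:
DESCRIBE FORMATTED t;

-- Column-level stats (min, max, num_nulls, distinct_count) for one
-- column, available in Hive 0.14+:
DESCRIBE FORMATTED t c;

-- Re-compute both levels:
ANALYZE TABLE t COMPUTE STATISTICS;
ANALYZE TABLE t COMPUTE STATISTICS FOR COLUMNS;
```

If the numbers shown by DESCRIBE FORMATTED look stale (e.g. numRows far off from the actual row count), re-running ANALYZE should refresh them.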
Labels:
- Apache Hive
01-20-2016 08:09 PM
We set the previous config version as current, and it works.
01-20-2016 07:17 PM
1 Kudo
A client is unhappy that "Hive on Spark" is not supported yet. It is more a Hive feature than a Spark one: https://cwiki.apache.org/confluence/display/Hive/Hive+on+Spark What is our stance here? Do we know when it will be available?
Labels:
- Apache Hive
- Apache Spark
01-19-2016 06:41 PM
Ambari 1.7: after decommissioning nodes (delete host), we see new config versions created for HDFS, YARN, HBase, Hive..., but they are identical to the previous configs. Is this a known issue? Can we make the previous version current to avoid the "restart required" labels?
Labels:
- Apache Ambari
12-15-2015 10:51 PM
Cannot bring up 2 NodeManagers; the other 19 work fine.
I got this:
FATAL nodemanager.NodeManager (NodeManager.java:initAndStartNodeManager(465)) - Error starting NodeManager java.lang.UnsatisfiedLinkError: Could not load library. Reasons: [no leveldbjni64-1.8 in java.library.path, no leveldbjni-1.8 in java.library.path, no leveldbjni in java.library.path, Permission denied]
It is not caused by /tmp permissions ("chmod 777 /tmp" did not help). Any clue?
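One common cause of this leveldbjni UnsatisfiedLinkError (an assumption here, not confirmed in the thread) is a noexec mount on the JVM's temp directory: leveldbjni extracts its native library into java.io.tmpdir and must execute it, so exec permission matters even when the directory is world-writable. A quick check, with /var/tmp/yarn as a hypothetical alternative tmpdir:

```shell
# Is /tmp mounted noexec on the two failing nodes?
# Compare the output against one of the 19 healthy nodes.
mount | grep -w /tmp

# If noexec shows up, one workaround is pointing the NodeManager JVM
# at an exec-mounted directory (path is a placeholder) in yarn-env.sh:
# export YARN_NODEMANAGER_OPTS="$YARN_NODEMANAGER_OPTS -Djava.io.tmpdir=/var/tmp/yarn"
```

Running the same check on a working node and diffing the mount flags is a cheap way to confirm or rule this out.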
Labels:
- Apache YARN
12-08-2015 09:50 PM
1 Kudo
Figured it out: it has to be HiveContext, not SQLContext. After making the change below, it works:
HiveContext hiveContext = new org.apache.spark.sql.hive.HiveContext(sc.sc());
// SQLContext sqlContext = new org.apache.spark.sql.SQLContext(sc);
12-08-2015 08:45 PM
I am using Spark 1.3.1. It seems that saveAsTable() creates a table in Spark's internal source format, which Hive cannot read. http://stackoverflow.com/questions/31482798/save-spark-dataframe-to-hive-table-not-readable-because-parquet-not-a-sequence