Member since: 10-01-2015
Posts: 3933
Kudos Received: 1150
Solutions: 374
My Accepted Solutions
| Title | Views | Posted |
|---|---|---|
| | 3657 | 05-03-2017 05:13 PM |
| | 3015 | 05-02-2017 08:38 AM |
| | 3275 | 05-02-2017 08:13 AM |
| | 3220 | 04-10-2017 10:51 PM |
| | 1684 | 03-28-2017 02:27 AM |
02-16-2016
09:11 PM
@Robin Dong
Take a look at WebHCat and WebHDFS: https://cwiki.apache.org/confluence/display/Hive/WebHCat+UsingWebHCat and https://hadoop.apache.org/docs/current/hadoop-project-dist/hadoop-hdfs/WebHDFS.html
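A minimal sketch of the two REST calls, assuming hypothetical hostnames and the default WebHDFS (50070) and WebHCat (50111) ports; the actual `curl` calls are commented out because they need a live cluster:

```shell
# Hypothetical NameNode host; adjust for your cluster
NN=namenode.example.com
WEBHDFS_URL="http://${NN}:50070/webhdfs/v1/tmp?op=LISTSTATUS&user.name=hdfs"
echo "$WEBHDFS_URL"
# List a directory over WebHDFS (needs a running cluster):
#   curl -i "$WEBHDFS_URL"
# Run a Hive DDL statement over WebHCat (templeton):
#   curl -d execute="show tables;" -d user.name=hdfs \
#        "http://${NN}:50111/templeton/v1/ddl"
```

Both APIs return JSON, so they are easy to drive from scripts without any Hadoop client installed.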
02-16-2016
04:56 PM
@hoda moradi please add the HDP repositories to your pom, and make sure to add dependencies for our versions of Spark and Kafka; 0.8.1 is old. https://community.hortonworks.com/questions/626/how-do-i-resolve-maven-dependencies-for-building-a.html
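A sketch of the `pom.xml` entries this refers to, assuming the (historical) Hortonworks release repository URL; the artifact and version strings below are placeholders, so check the repository for the exact HDP-suffixed versions:

```xml
<repositories>
  <repository>
    <id>hortonworks</id>
    <url>http://repo.hortonworks.com/content/repositories/releases/</url>
  </repository>
</repositories>
<dependencies>
  <!-- HDP builds carry an HDP suffix in the version; this value is a
       placeholder for illustration, not a verified coordinate -->
  <dependency>
    <groupId>org.apache.spark</groupId>
    <artifactId>spark-streaming-kafka_2.10</artifactId>
    <version>1.5.2.2.3.4.0-SNAPSHOT</version>
  </dependency>
</dependencies>
```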
02-16-2016
04:17 PM
@marko check whether a firewall is blocking the port on each node.
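A quick way to check this, assuming a hypothetical server hostname and port; the network commands are commented out since they depend on your environment:

```shell
PORT=8020   # hypothetical: use the port your service listens on
echo "checking port ${PORT}"
# On the server node, confirm the service is actually listening:
#   ss -ltn | grep ":${PORT}"
# From the client node, test reachability through the firewall:
#   nc -vz server.example.com "$PORT"
# If blocked, inspect the firewall rules (e.g. on RHEL/CentOS):
#   iptables -L -n | grep "$PORT"
```

If `ss` shows the listener but `nc` from another node times out, the firewall (or a security group) is the likely culprit.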
02-16-2016
04:05 PM
@marko please use the steps here http://hortonworks.com/hadoop-tutorial/apache-spark-1-6-technical-preview-with-hdp-2-3/
02-16-2016
01:09 PM
Account info can be found by following the instructions in the Tools or SmartSense tab of the support portal.
02-16-2016
01:09 PM
@Vinod Nerella you need to go to support.hortonworks.com; in the Tools tab you can generate a SmartSense ID. Please view this slide deck: http://www.slideshare.net/dbist/hortonworks-smartsense?from_m_app=android
... View more
02-16-2016
11:19 AM
@Sunile Manjeet this is an educated guess, but we're talking about YARN queues with a round-robin scheduler implementation, i.e. replacing the Capacity Scheduler with another scheduler algorithm. I don't think it's viable unless the cluster is dedicated to Hive; it would be cool to try having two root queues backed by different schedulers. http://m.linuxjournal.com/content/how-yarn-changed-hadoop-job-scheduling With that idea in mind, you would assign an available queue to each user in round-robin fashion.
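The scheduler swap mentioned above is configured in `yarn-site.xml`; a sketch using the stock FairScheduler class (per-user round-robin queue assignment would still need custom queue-placement logic on top):

```xml
<property>
  <name>yarn.resourcemanager.scheduler.class</name>
  <value>org.apache.hadoop.yarn.server.resourcemanager.scheduler.fair.FairScheduler</value>
</property>
```

Note that YARN runs exactly one scheduler for the whole ResourceManager, which is why the "two root queues with different schedulers" idea would require changes beyond configuration.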
02-16-2016
11:11 AM
@cokorda putra susila Spark 1.5.2 is officially supported in HDP 2.3.4; please upgrade to HDP 2.3.4 for that version of Spark. We will support upgrading individual components like Spark in the future.
02-16-2016
11:08 AM
@rbalam you might have a stale PID file with wrong permissions; I've seen this before. Delete the PID file, check the permissions on the directory, and make sure the directory is not mounted noexec.
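The stale-PID check can be sketched like this; the pid-file path is hypothetical, and the simulated PID is above the Linux `pid_max` ceiling so `kill -0` reliably reports it as dead:

```shell
# Hypothetical pid-file path, used for illustration only
PID_FILE=/tmp/demo-stale-service.pid
echo 4194304 > "$PID_FILE"        # simulate a stale pid (above pid_max)
pid=$(cat "$PID_FILE")
# kill -0 sends no signal; it only tests whether the process exists
if ! kill -0 "$pid" 2>/dev/null; then
    echo "pid $pid is dead; removing stale $PID_FILE"
    rm -f "$PID_FILE"
fi
# Also check ownership/permissions and that the mount is not noexec:
#   ls -ld "$(dirname "$PID_FILE")"
#   findmnt -no OPTIONS /tmp | grep noexec
```

Most Hadoop init scripts refuse to start (or start and immediately die) when a leftover PID file points at a dead or foreign process, which is why deleting it is usually the fix.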
02-15-2016
08:42 PM
@Pedro Gandola good find, glad you were able to identify this.