Member since: 01-23-2017
Posts: 14
Kudos Received: 0
Solutions: 0
05-27-2019
01:22 PM
Regarding step 3, "Stop all services via Ambari": what does that mean exactly? All the services of the cluster, including for example Kafka, Spark, etc., or only the services belonging to HDFS? I need to know.
09-04-2018
12:24 PM
When "Check MapReduce2" stops working (and it used to work), it can be fixed by deleting the file cache:
rm -rf /tmp/hadoop/yarn/local/filecache/*
11-02-2017
07:45 AM
I use HDP 2.4. When I use deploy mode client in spark-submit it runs well. I do not know where I need to put the jar for deploy mode cluster: do I need to install it on all nodes, or do you think I should add it to an HDFS path? The file /etc/spark/conf/hive-site.xml is not found in HDP 2.4; I think this file is managed by Ambari. I see a similar file in /usr/hdp/current/spark-client/conf, and in that path the file does appear.
11-01-2017
06:51 PM
Please, any comments or ideas? https://community.hortonworks.com/answers/146868/view.html
11-01-2017
06:49 PM
https://community.hortonworks.com/answers/146868/view.html
11-01-2017
06:47 PM
@aervits, could you share any ideas?
10-31-2017
07:23 AM
Please, any idea on how to solve this, or on what to google, or a blog or book I could read about it.
10-30-2017
07:47 PM
Hi guys, I have an HDP 2.4 cluster. We are working with Spark and we use spark-submit with --deploy-mode client, and the cluster runs well. When we try --deploy-mode cluster, we cannot execute Hive DDL the way we did in client deploy mode. I made a simple test case so you can reproduce the problem.

In the hive shell I create the database and a partitioned table:

CREATE DATABASE TEST;
CREATE TABLE test.testcase (
  FIELD STRING
) PARTITIONED BY (p_kpart DATE)
STORED AS ORC
LOCATION "hdfs://rsicluster01/tmp/delete/test.db/testcase";

We create a python file named testcase.py:

#!/usr/bin/env python
# -*- coding: utf-8 -*-
from pyspark import SparkContext, SparkConf
from pyspark.sql import HiveContext
conf = SparkConf()
sc = SparkContext(conf=conf)
sqlContext = HiveContext(sc)
ori_query_p = "ALTER TABLE TEST.TESTCASE ADD IF NOT EXISTS PARTITION (p_kpart= '{0}')".format("2017-10-30")
print ("ori_query_p:", ori_query_p)
addpart = sqlContext.sql(ori_query_p)
print ("Alter OK")
We run this code using:

spark-submit --deploy-mode cluster --master yarn testcase.py

It generated the following log (I cut a lot of lines):

('ori_query_p:', "ALTER TABLE TEST.TESTCASE ADD IF NOT EXISTS PARTITION (p_kpart= '2017-10-30')")
Traceback (most recent call last):
File "testcase.py", line 12, in <module>
addpart = sqlContext.sql(ori_query_p)
File "/tmp/hadoop/yarn/local/usercache/hdfs/appcache/application_1509376396249_0059/container_e09_1509376396249_0059_02_000001/pyspark.zip/pyspark/sql/context.py", line 583, in sql
File "/tmp/hadoop/yarn/local/usercache/hdfs/appcache/application_1509376396249_0059/container_e09_1509376396249_0059_02_000001/pyspark.zip/pyspark/sql/context.py", line 691, in _ssql_ctx
Exception: ("You must build Spark with Hive. Export 'SPARK_HIVE=true' and run build/sbt assembly", Py4JJavaError(u'An error occurred while calling None.org.apache.spark.sql.hive.HiveContext.\n', JavaObject id=o50))
How can I configure the HDP 2.4 cluster so that Hive DDL works with --deploy-mode cluster?
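One workaround often suggested for this HiveContext error in yarn-cluster mode is to ship the Hive client configuration with the application, since the driver runs inside a YARN container that may not see the local Spark conf directory. This is an assumption based on the traceback, not a confirmed fix; a sketch of the submit command, using the conf path mentioned earlier in this thread:

```shell
# Hedged workaround sketch: ship hive-site.xml with the app so the driver,
# running in a YARN container, can build a HiveContext in cluster mode.
spark-submit \
  --master yarn \
  --deploy-mode cluster \
  --files /usr/hdp/current/spark-client/conf/hive-site.xml \
  testcase.py
```

If this works in client mode but not cluster mode, comparing the two drivers' working directories for hive-site.xml is a quick way to confirm the diagnosis.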
04-12-2017
10:06 AM
I have a similar problem. I can use the jdbc format, but I cannot use the zkUrl samples from https://phoenix.apache.org/phoenix_spark.html with format("org.apache.phoenix.spark"). It may be solved in Spark 1.5, but the problem continues in Spark 1.6. When I use the write interface, it runs well. Please post the fix when you find out anything.
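For what it's worth, read failures with format("org.apache.phoenix.spark") on HDP are often classpath issues rather than API issues. A hedged sketch of shipping the Phoenix client jar explicitly; the jar path is the usual HDP default and the script name is a placeholder:

```shell
# Assumption: the Phoenix-Spark read path needs the Phoenix client jar on the
# Spark classpath. Path is the HDP default; my_phoenix_read_job.py is hypothetical.
spark-submit \
  --master yarn \
  --jars /usr/hdp/current/phoenix-client/phoenix-client.jar \
  my_phoenix_read_job.py
```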
04-05-2017
01:39 PM
I tried to use it on HDP 2.4.0 and cannot run the sample files. I have an HDP 2.4 cluster.

mvn clean package test

Cause: java.net.BindException: Address already in use
        at sun.nio.ch.Net.bind0(Native Method)
        at sun.nio.ch.Net.bind(Net.java:433)
....
[INFO] HBase Spark Connector Project Parent POM ........... SUCCESS [  1.542 s]
[INFO] HBase Spark Connector Project Core ................. FAILURE [01:04 min]
[INFO] HBase Spark Connector Project Examples ............. SKIPPED
01-23-2017
12:58 PM
I tried to install from the public one. Are you suggesting that I cannot install from a public repo, and can only install from a local or localhost repo?
01-23-2017
11:44 AM
I did a more of the hdp.repo file:

more hdp.repo
#VERSION_NUMBER=2.4.2.0-258
[HDP-2.4.2.0]
name=HDP Version - HDP-2.4.2.0
baseurl=http://public-repo-1.hortonworks.com/HDP/centos7/2.x/updates/2.4.2.0
gpgcheck=1
gpgkey=http://public-repo-1.hortonworks.com/HDP/centos7/2.x/updates/2.4.2.0/RPM-GPG-KEY/RPM-GPG-KEY-Jenkins
enabled=1
priority=1

[HDP-UTILS-1.1.0.20]
name=HDP Utils Version - HDP-UTILS-1.1.0.20
baseurl=http://public-repo-1.hortonworks.com/HDP-UTILS-1.1.0.20/repos/centos7
gpgcheck=1
gpgkey=http://public-repo-1.hortonworks.com/HDP/centos7/2.x/updates/2.4.2.0/RPM-GPG-KEY/RPM-GPG-KEY-Jenkins
enabled=1
priority=1

I ran yum clean all and yum clean metadata, and the result is similar:

Determining fastest mirrors
 * base: anorien.csc.warwick.ac.uk
 * epel: epel.besthosting.ua
 * extras: anorien.csc.warwick.ac.uk
 * updates: centos.mirroring.pulsant.co.uk
Error: No matching Packages to list
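If the keys really are run together on single lines in the actual file (as in the paste above, where baseurl, gpgcheck and gpgkey share a line), yum will not see a baseurl at all and can report "No matching Packages". A quick sanity check, assuming a standard shell; the sample file here is illustrative:

```shell
# Each yum repo key must sit on its own line. Count sections and baseurl lines;
# a correctly formatted file has one "baseurl=" per [section].
cat > hdp.repo.sample <<'EOF'
[HDP-2.4.2.0]
name=HDP Version - HDP-2.4.2.0
baseurl=http://public-repo-1.hortonworks.com/HDP/centos7/2.x/updates/2.4.2.0
gpgcheck=1
enabled=1
[HDP-UTILS-1.1.0.20]
name=HDP Utils Version - HDP-UTILS-1.1.0.20
baseurl=http://public-repo-1.hortonworks.com/HDP-UTILS-1.1.0.20/repos/centos7
gpgcheck=1
enabled=1
EOF
sections=$(grep -c '^\[' hdp.repo.sample)
baseurls=$(grep -c '^baseurl=' hdp.repo.sample)
echo "sections=$sections baseurls=$baseurls"
```

Running the same two greps against /etc/yum.repos.d/hdp.repo and getting mismatched counts would point to a mangled file; re-downloading it with wget would be the next step.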
01-23-2017
11:13 AM
Hi, I downloaded the repo using:

wget -nv http://public-repo-1.hortonworks.com/HDP/centos7/2.x/updates/2.4.2.0/hdp.repo -O /etc/yum.repos.d/hdp.repo
2017-01-23 12:09:50 URL:http://public-repo-1.hortonworks.com/HDP/centos7/2.x/updates/2.4.2.0/hdp.repo [575/575] -> "/etc/yum.repos.d/hdp.repo" [1]

I launch the command yum info ranger:

Loaded plugins: fastestmirror
Loading mirror speeds from cached hostfile
 * base: mirror.mhd.uk.as44574.net
 * epel: epel.besthosting.ua
 * extras: anorien.csc.warwick.ac.uk
 * updates: anorien.csc.warwick.ac.uk
Error: No matching Packages to list

I think I did not download the correct repo file?
01-23-2017
10:07 AM
Hi guys, I am trying to register a new version using the Ambari wizard: Versions -> Register Version. Version details: HDP 2.4.2.0. I check the Red Hat 7 box (I am using CentOS).

HDP: http://public-repo-1.hortonworks.com/HDP/centos7/2.x/updates/2.4.2.0/
HDP-UTILS: http://public-repo-1.hortonworks.com/HDP-UTILS-1.1.0.20/repos/

The message generated was: "Some of the repositories failed validation. Make changes to the base url or skip validation if you are sure that urls are correct." Any ideas? I would like to move from HDP-2.3.4.0 to HDP-2.4.2.0. I have permanent internet access through a proxy and I can download from external repos. Thank you in advance.
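One thing worth checking: a repo base URL validates only if <baseurl>/repodata/repomd.xml is reachable, and the HDP-UTILS URL above stops at /repos/ without an OS directory, unlike the HDP URL, which includes centos7. That the missing suffix is the cause is an assumption; a sketch of the probe, with the network call commented out:

```shell
# Build the metadata URL that yum/Ambari actually fetches for a candidate base URL.
# The centos7 suffix on HDP-UTILS is an assumption to test, not a confirmed fix.
base="http://public-repo-1.hortonworks.com/HDP-UTILS-1.1.0.20/repos/centos7"
url="${base}/repodata/repomd.xml"
echo "$url"
# curl -sfI "$url" >/dev/null && echo "repo OK" || echo "repo BAD"
```

Probing both candidate URLs this way (with and without the OS suffix) from the Ambari host, through the same proxy, would show which one the wizard can actually validate.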