Member since: 07-11-2017
Posts: 3
Kudos Received: 0
Solutions: 1
My Accepted Solutions
| Title | Views | Posted |
|---|---|---|
|  | 10552 | 07-31-2017 07:50 AM |
07-31-2017 07:50 AM
It was a version-compatibility problem between the Spark installed by Ambari and the Spark version imported with Python.
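For anyone hitting the same issue: a quick way to confirm the mismatch is to compare the driver-side pyspark version with the version the cluster reports. The helper below is only a minimal sketch (the version strings are made-up examples); with a standalone master, the driver and cluster generally need to agree at least on major.minor.

```python
def versions_compatible(local_version: str, cluster_version: str) -> bool:
    """Spark standalone expects driver and master to agree on major.minor."""
    local = local_version.split(".")[:2]
    cluster = cluster_version.split(".")[:2]
    return local == cluster

# In practice, compare pyspark.__version__ on the client machine with the
# Spark version shown by Ambari (or by sc.version on the cluster itself).
print(versions_compatible("2.1.1", "2.1.0"))  # True: same major.minor
print(versions_compatible("2.1.1", "2.2.0"))  # False: mismatch -> connection failures
```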
07-27-2017 10:13 AM
I've installed a one-node cluster on an Amazon machine using Ambari. I'm trying to use Spark from another machine through pySpark. This is my code:

```python
from pyspark import SparkConf, SparkContext

conf = SparkConf().setAppName('hello').setMaster('spark://MYIP:7077')
sc = SparkContext(conf=conf)
```

The problem is that I get a connection refused when I run the program:

```
WARN StandaloneAppClient$ClientEndpoint: Failed to connect to master "MYIP"
```

So I tried this command to start the master:

```
./sbin/start-master.sh
```

And now I get this error:
```
17/07/27 12:07:15 WARN StandaloneAppClient$ClientEndpoint: Failed to connect to master XX.XXX.XXX.XX:7077
org.apache.spark.SparkException: Exception thrown in awaitResult:
	at org.apache.spark.util.ThreadUtils$.awaitResult(ThreadUtils.scala:205)
	at org.apache.spark.rpc.RpcTimeout.awaitResult(RpcTimeout.scala:75)
	at org.apache.spark.rpc.RpcEnv.setupEndpointRefByURI(RpcEnv.scala:100)
	at org.apache.spark.rpc.RpcEnv.setupEndpointRef(RpcEnv.scala:108)
	at org.apache.spark.deploy.client.StandaloneAppClient$ClientEndpoint$$anonfun$tryRegisterAllMasters$1$$anon$1.run(StandaloneAppClient.scala:106)
	at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511)
	at java.util.concurrent.FutureTask.run(FutureTask.java:266)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
	at java.lang.Thread.run(Thread.java:748)
Caused by: java.lang.RuntimeException: java.io.StreamCorruptedException: invalid stream header: 01000C31
```

This is not a port problem, because port 7077 is open. I can't find an answer to this problem on the forum; do you have any idea?
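Before digging into Spark itself, it can help to separate plain network reachability from protocol problems. The sketch below (the hostname is a placeholder for your own master's address) just attempts a TCP connect: a refused or timed-out connect points at security groups or the master's bind address, while a successful connect followed by an `invalid stream header` error is commonly reported when the client and master run different Spark versions.

```python
import socket

def port_open(host: str, port: int, timeout: float = 3.0) -> bool:
    """Return True if a TCP connection to host:port succeeds within timeout."""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        return False

# Placeholder host: substitute the master's public IP for your cluster.
# 7077 is the default standalone master RPC port, 8080 its web UI.
for port in (7077, 8080):
    print(port, port_open("MYIP_PLACEHOLDER", port))
```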
Labels:
- Apache Spark
07-11-2017 07:36 AM
I'm trying to access HBase, located on another instance, from a Java program. This is the code:

```java
protected void instantiateHBase() {
    config = HBaseConfiguration.create();
    config.set("hbase.zookeeper.quorum", "MYIP");
    config.set("hbase.zookeeper.property.clientPort", "2181");
    config.set("zookeeper.znode.parent", "/hbase-unsecure");
    try {
        TableName nameTable = TableName.valueOf(NAMETABLE);
        Connection conn = ConnectionFactory.createConnection(config);
        hTable = conn.getTable(nameTable);
    } catch (IOException e) {
        e.printStackTrace();
    }
}
```

There is no error at this step, so I think the connection to the server is okay. But when I try to insert a row into the table, the program stays blocked: I get no error, but nothing happens. This is the insertion code:

```java
Put p = new Put(Bytes.toBytes(informations.getString("id")));
p.add(Bytes.toBytes("informations"), Bytes.toBytes("carId"), Bytes.toBytes(carId));
hTable.put(p);
```

The execution blocks at the line `hTable.put(p);`.
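Not a definitive fix, but one way to make a hang like this diagnosable: by default the HBase client retries failed RPCs for a very long time, so an unreachable region server looks like a silent block rather than an error. Lowering the client-side timeouts in the client's `hbase-site.xml` (the property names below are standard HBase client settings; the values are only examples) turns the block into an exception that names the host the client cannot reach. On EC2 that host is often the region server's private hostname, learned from ZooKeeper but not resolvable from outside the VPC.

```xml
<!-- Client-side hbase-site.xml: fail fast instead of retrying for minutes. -->
<configuration>
  <property>
    <name>hbase.client.retries.number</name>
    <value>3</value>
  </property>
  <property>
    <name>hbase.rpc.timeout</name>
    <value>10000</value> <!-- 10 s per RPC; example value -->
  </property>
  <property>
    <name>hbase.client.operation.timeout</name>
    <value>30000</value> <!-- 30 s per client operation, e.g. put(); example value -->
  </property>
</configuration>
```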
Labels:
- Apache HBase