Member since: 06-21-2018
Posts: 27
Kudos Received: 0
Solutions: 0
08-15-2018
10:15 AM
I am trying to save data to Elasticsearch using Spark. I am working with XML files, and as you know Elasticsearch accepts only JSON, so I need to convert the XML files into JSON using Scala. Could you help me?
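One common way to do the conversion is the spark-xml package. A minimal sketch, assuming the com.databricks:spark-xml artifact is on the classpath and that each record sits under a <record> element (the row tag, paths, and object name are placeholders):

import org.apache.spark.sql.SparkSession

object XmlToJson {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder()
      .appName("XmlToJson")
      .master("local[*]")
      .getOrCreate()

    // Parse the XML into a DataFrame; adjust rowTag to your record element.
    val df = spark.read
      .format("com.databricks.spark.xml")
      .option("rowTag", "record")
      .load("../input.xml")

    // Write line-delimited JSON, one document per line.
    df.write.json("../output-json")
  }
}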
07-23-2018
12:04 PM
I am trying to save a JSON file to Elasticsearch, but I am getting this error: Exception in thread "main" java.lang.AbstractMethodError. I am using Spark 2.3.0 and Scala 2.11.6. Please find the code below:

import org.apache.spark.sql.SQLContext
import org.elasticsearch.spark.sql._
import org.apache.spark.{SparkConf, SparkContext}

object OrangetoES {
  def main(args: Array[String]) {
    // es.index.auto.create must be set before the SparkContext is created
    val conf = new SparkConf().setAppName("OrangetoES").setMaster("local[*]")
    conf.set("es.index.auto.create", "true")
    val sc = new SparkContext(conf)
    val sqlc = new SQLContext(sc)
    val df = sqlc.read.json("../Orange.json")
    df.saveToEs("orangetoes/people")
  }
}
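For what it's worth, an AbstractMethodError at runtime usually points to a binary-incompatible dependency rather than a bug in this code: with Spark 2.3.0 and Scala 2.11 the elasticsearch-hadoop connector must be a Spark 2.x / Scala 2.11 build. A hedged build.sbt sketch; the 6.3.0 version is an assumption and should be matched to your Elasticsearch cluster:

// build.sbt: Spark 2.x / Scala 2.11 build of the ES connector (version is an example)
libraryDependencies += "org.elasticsearch" % "elasticsearch-spark-20_2.11" % "6.3.0"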
07-19-2018
12:50 PM
I am trying to save a JSON file in Elasticsearch using Spark and Scala, but the code is not working. Here is the code:

import org.apache.spark
import org.apache.spark.{SparkConf, SparkContext}
import org.elasticsearch.spark._
import org.apache.spark.sql._

object JsontoES {
  def main(args: Array[String]): Unit = {
    val conf = new SparkConf().setAppName("HelloWorld").setMaster("local")
    val sc = new SparkContext(conf)
    // Note: setting this after the SparkContext is created has no effect
    conf.set("es.index.auto.create", "true")
    val spark = SparkSession
      .builder()
      .appName("Spark SQL basic example")
      .config("spark.some.config.option", "some-value")
      .getOrCreate()
    val jsonpath = "/home/aymenstien/Bureau/mydoc/Orange.json"
    //val ds = spark.read.json(jsonpath)
    // esJsonRDD/esRDD read FROM an Elasticsearch resource ("index/type"),
    // not from a local file path, so this call cannot load the JSON file
    sc.esJsonRDD(jsonpath)
    val RDD = sc.esRDD("radio/artists")
    //rddds.saveToEs("jsontoes/json-trips")
    //sc.makeRDD(Seq(ds)).saveToEs("jsontoEs/docs")
  }
}
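For comparison, a minimal sketch of what a working job might look like, assuming the goal is to index the local JSON file; the index name jsontoes/docs is illustrative, and es.index.auto.create is set before anything else is created:

import org.apache.spark.sql.SparkSession
import org.elasticsearch.spark.sql._

object JsonToEsSketch {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder()
      .appName("JsonToEsSketch")
      .master("local[*]")
      .config("es.index.auto.create", "true")
      .getOrCreate()

    // Read the local JSON file into a DataFrame, then index it.
    val ds = spark.read.json("/home/aymenstien/Bureau/mydoc/Orange.json")
    ds.saveToEs("jsontoes/docs")
  }
}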
07-05-2018
11:28 AM
I am trying to execute this command:

aymenstien@aymenstien-VPCEH2Q1E:/usr/share/logstash$ ./bin/logstash -f /home/aymenstien/Bureau/fb.conf

Here is the config file:

input {
  file {
    path => "/home/aymenstien/Bureau/mydoc/*"
    start_position => "beginning"
    codec => json
    sincedb_path => "/home/aymenstien/Bureau/mydoc/postj1.sincedb"
  }
}
output {
  stdout { codec => rubydebug }
  elasticsearch {
    hosts => "http://localhost:9200"
    index => "fbpost"
    document_type => "post"
    timeout => 30
    workers => 1
  }
}

And I am getting the result of the execution below:

aymenstien@aymenstien-VPCEH2Q1E:/usr/share/logstash$ ./bin/logstash -f /home/aymenstien/Bureau/fb.conf
WARNING: Could not find logstash.yml which is typically located in $LS_HOME/config or /etc/logstash. You can specify the path using --path.settings. Continuing using the defaults
Could not find log4j2 configuration at path /usr/share/logstash/config/log4j2.properties. Using default config which logs errors to the console
[FATAL] 2018-07-05 12:47:56.496 [main] runner - An unexpected error occurred! {:error=>#<ArgumentError: Path "/usr/share/logstash/data" must be a writable directory. It is not writable.>, :backtrace=>["/usr/share/logstash/logstash-core/lib/logstash/settings.rb:448:in `validate'", "/usr/share/logstash/logstash-core/lib/logstash/settings.rb:230:in `validate_value'", "/usr/share/logstash/logstash-core/lib/logstash/settings.rb:141:in `block in validate_all'", "org/jruby/RubyHash.java:1343:in `each'", "/usr/share/logstash/logstash-core/lib/logstash/settings.rb:140:in `validate_all'", "/usr/share/logstash/logstash-core/lib/logstash/runner.rb:279:in `execute'", "/usr/share/logstash/vendor/bundle/jruby/2.3.0/gems/clamp-0.6.5/lib/clamp/command.rb:67:in `run'", "/usr/share/logstash/logstash-core/lib/logstash/runner.rb:238:in `run'", "/usr/share/logstash/vendor/bundle/jruby/2.3.0/gems/clamp-0.6.5/lib/clamp/command.rb:132:in `run'", "/usr/share/logstash/lib/bootstrap/environment.rb:73:in `<main>'"]}
[ERROR] 2018-07-05 12:47:56.554 [main] Logstash - java.lang.IllegalStateException: Logstash stopped processing because of an error: (SystemExit) exit

Can anyone help me?
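The fatal line says it directly: /usr/share/logstash/data must be a writable directory. A hedged fix sketch; the username comes from the shell prompt above and the settings path is the usual package location, both assumptions:

# Option 1: give your user ownership of Logstash's data directory
sudo chown -R aymenstien:aymenstien /usr/share/logstash/data

# Option 2: run Logstash with the packaged settings directory (also restores logstash.yml/log4j2)
sudo ./bin/logstash -f /home/aymenstien/Bureau/fb.conf --path.settings /etc/logstash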
06-27-2018
09:29 AM
I am working with Sqoop. I can import and export data between MySQL and Hive or HBase with the command:

sqoop import \
  --connect "jdbc:mysql://localhost:3306/retail_db" \
  --username=retail_dba \
  --password=password \
  --table departments \
  --as-sequencefile \
  --target-dir=/user/departments

It works fine for me. Now I am trying to import and export from a different machine (my friend's machine). What do I have to do? Thanks
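To pull from another machine, the JDBC URL has to point at that machine's hostname or IP instead of localhost, and MySQL there must accept remote connections for your user. A hedged sketch; the IP 192.168.1.20 and the credentials are placeholders:

sqoop import \
  --connect "jdbc:mysql://192.168.1.20:3306/retail_db" \
  --username=retail_dba \
  --password=password \
  --table departments \
  --target-dir=/user/departments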
06-21-2018
11:33 AM
I am looking for the current stable release of Hadoop.
03-15-2018
09:08 PM
# mysql --host=hadoop --port=3306 -u admin -p
Enter password:
ERROR 2003 (HY000): Can't connect to MySQL server on 'hadoop' (111)
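Error 2003 with code (111) means the TCP connection was refused, which typically happens when mysqld is bound to 127.0.0.1 only. A hedged sketch of the usual fix; the config path is the Ubuntu MySQL 5.7 default and may differ on your system:

# /etc/mysql/mysql.conf.d/mysqld.cnf
# bind-address = 127.0.0.1    <- comment this out, or widen it:
bind-address = 0.0.0.0

# then restart the service
sudo service mysql restart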
03-15-2018
03:36 PM
@Geoffrey Shelton Okot Yes, the Ambari port is 8080. I want to connect to the MySQL service.
03-15-2018
03:26 PM
I am trying to connect to this URL: http://hadoop:3306, but it is not working. Can someone explain this to me?
03-09-2018
03:09 PM
root@hadoop:/home/kali# netstat -tnlpa | grep 3306
tcp 0 0 127.0.0.1:3306 0.0.0.0:* LISTEN 978/mysqld
root@hadoop:/home/kali# less /var/log/mysql/mysql.lo
/var/log/mysql/mysql.lo: No such file or directory
03-09-2018
12:18 AM
I am trying to execute this command line:

sqoop import --connect jdbc:mysql://hadoop:3306/Testdb --table widgets -m 4 --username root --password mypwd --driver com.mysql.jdbc.Driver

But I am getting this error:

SLF4J: Class path contains multiple SLF4J bindings.
SLF4J: Found binding in [jar:file:/usr/hdp/2.6.0.3-8/hadoop/lib/slf4j-log4j12-1.7.10.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: Found binding in [jar:file:/usr/hdp/2.6.0.3-8/accumulo/lib/slf4j-log4j12.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.
SLF4J: Actual binding is of type [org.slf4j.impl.Log4jLoggerFactory]
18/03/09 01:15:55 INFO sqoop.Sqoop: Running Sqoop version: 1.4.6.2.6.0.3-8
18/03/09 01:15:55 WARN tool.BaseSqoopTool: Setting your password on the command-line is insecure. Consider using -P instead.
18/03/09 01:15:55 WARN sqoop.ConnFactory: Parameter --driver is set to an explicit driver however appropriate connection manager is not being set (via --connection-manager). Sqoop is going to fall back to org.apache.sqoop.manager.GenericJdbcManager. Please specify explicitly which connection manager should be used next time.
18/03/09 01:15:55 INFO manager.SqlManager: Using default fetchSize of 1000
18/03/09 01:15:55 INFO tool.CodeGenTool: Beginning code generation
18/03/09 01:15:55 ERROR manager.SqlManager: Error executing statement: com.mysql.jdbc.exceptions.jdbc4.CommunicationsException: Communications link failure
03-09-2018
12:04 AM
I get this error:

18/03/09 01:01:51 ERROR tool.ImportTool: Encountered IOException running import job: java.io.IOException: No columns to generate for ClassWriter
at org.apache.sqoop.orm.ClassWriter.generate(ClassWriter.java:1659)
at org.apache.sqoop.tool.CodeGenTool.generateORM(CodeGenTool.java:107)
at org.apache.sqoop.tool.ImportTool.importTable(ImportTool.java:488)
at org.apache.sqoop.tool.ImportTool.run(ImportTool.java:615)
at org.apache.sqoop.Sqoop.run(Sqoop.java:147)
at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:76)
at org.apache.sqoop.Sqoop.runSqoop(Sqoop.java:183)
at org.apache.sqoop.Sqoop.runTool(Sqoop.java:225)
at org.apache.sqoop.Sqoop.runTool(Sqoop.java:234)
at org.apache.sqoop.Sqoop.main(Sqoop.java:243)
03-08-2018
11:55 PM
Everything is OK. Maybe I have to add the port, as in jdbc:mysql://hadoop:3306, right?
03-08-2018
11:50 PM
The last packet sent successfully to the server was 0 milliseconds ago. The driver has not received any packets from the server.
com.mysql.jdbc.exceptions.jdbc4.CommunicationsException: Communications link failure
03-08-2018
11:48 PM
# su -l sqoop -c "sqoop import --connect jdbc:mysql://hadoop/Testdb --table widgets -m 4 --username root --password CHANGEME --driver com.mysql.jdbc.Driver"
com.mysql.jdbc.exceptions.jdbc4.CommunicationsException: Communications link failure
03-08-2018
11:38 PM
$ cp -f /usr/share/java/mysql-connector-java.jar /usr/hdp/current/sqoop-client/lib/
cp: '/usr/share/java/mysql-connector-java.jar' and '/usr/hdp/current/sqoop-client/lib/mysql-connector-java.jar' are the same file
03-08-2018
11:24 PM
$ ls -ltr /usr/share/java/mysql-connector-java-5.1.17.jar
ls: cannot access '/usr/share/java/mysql-connector-java-5.1.17.jar': No such file or directory
03-08-2018
11:02 PM
I am trying to import a table named 'widgets' in the database 'Testdb' with the command below:

sqoop import --connect jdbc:mysql://hadoop/Testdb --table widgets -m 4 --username root -P

I get the error: "Got exception running Sqoop: java.lang.RuntimeException: Could not load db driver class: com.mysql.jdbc.Driver"
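"Could not load db driver class" means the MySQL JDBC connector jar is not on Sqoop's classpath. A hedged sketch; libmysql-java is the Ubuntu package name and the HDP lib path matches the later replies in this thread, both assumptions:

# install the connector, then copy it where the Sqoop client looks
sudo apt-get install libmysql-java
cp /usr/share/java/mysql-connector-java.jar /usr/hdp/current/sqoop-client/lib/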
03-07-2018
05:32 AM
I stopped some services and started the ones I need. Thanks for your answer.
03-07-2018
05:21 AM
I clicked yes and I got all these alerts. In this case, what should I do?
03-07-2018
05:03 AM
Is it normal to continue the installation of the services with these warnings, as shown in the picture below?
03-06-2018
10:37 PM
Awesome, it works. Thank you very much!
03-06-2018
07:33 PM
$ ambari-agent start
ambari-agent: command not found
03-06-2018
06:51 PM
ssh: connect to host hadoop port 22: Connection refused
SSH command execution finished
host=hadoop, exitcode=255
Command end time 2018-03-06 19:50:20
ERROR: Bootstrap of host hadoop fails because previous action finished with non-zero exit code (255)
ERROR MESSAGE: ssh: connect to host hadoop port 22: Connection refused
STDOUT:
ssh: connect to host hadoop port 22: Connection refused
... View more
03-06-2018
02:39 PM
I am trying to install HDP locally on Ubuntu 16.04.
I installed Ambari and launched the cluster install wizard. I chose "Use Local Repository", but I left it empty; I don't know what I have to put there.
Then I entered my host 'hadoop' and the SSH private key.
At the end, I get an error during the installation of the Ambari agent, as shown in the picture below.
11-13-2017
11:42 AM
Hello, I tried to connect to localhost but the connection was closed. I tried to start the PostgreSQL service, but it failed; PostgreSQL won't work. How can I find ambari-server.log? (I am just a beginner.) Please find the screenshots.
11-11-2017
01:19 PM
Hi everyone, I hope all is well with you all. I am running the HDP 2.6 Docker version on Ubuntu. I was trying to start HDP with "./start_sandbox-hdp.sh", but it looks like it's not going to start; there are a lot of problems and a lot of errors. I am a beginner in Big Data. Who can help me resolve these problems? Please find the screenshot below. Thanks 🙂