Member since: 06-21-2018
Posts: 27
Kudos Received: 0
Solutions: 0
08-15-2018
10:15 AM
I am trying to save data to Elasticsearch using Spark. I am working with XML files, and since Elasticsearch accepts only JSON documents, I need to convert the XML files to JSON using Scala. Could you help me?
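A minimal sketch of one way to do this, assuming the spark-xml and elasticsearch-hadoop (elasticsearch-spark) connectors are on the classpath (for example via --packages); the row tag, file path, Elasticsearch host and index name below are placeholders to adapt:

import org.apache.spark.sql.SparkSession
import org.elasticsearch.spark.sql._ // saveToEs comes from the elasticsearch-spark artifact

val spark = SparkSession.builder()
  .appName("xml-to-es")
  .config("es.nodes", "localhost") // Elasticsearch host (assumption)
  .config("es.port", "9200")
  .getOrCreate()

// Read the XML files into a DataFrame; "record" is a placeholder row tag.
val df = spark.read
  .format("com.databricks.spark.xml")
  .option("rowTag", "record")
  .load("/path/to/xml/*.xml")

// The connector serializes each row as a JSON document, so no explicit
// XML-to-JSON step is needed before indexing.
df.saveToEs("myindex/doc") // placeholder index/type

// If the JSON text itself is wanted, Spark can produce it directly.
df.toJSON.show(5, truncate = false)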
Labels:
- Apache Spark
07-05-2018
11:28 AM
I am trying to execute this command:
aymenstien@aymenstien-VPCEH2Q1E:/usr/share/logstash$ ./bin/logstash -f /home/aymenstien/Bureau/fb.conf
Here is the config file:
input {
file {
path => "/home/aymenstien/Bureau/mydoc/*"
start_position => beginning
codec => json
sincedb_path => "/home/aymenstien/Bureau/mydoc/postj1.sincedb"
}
}
output {
stdout { codec => rubydebug }
elasticsearch {
hosts => "http://localhost:9200"
index => "fbpost"
document_type => "post"
timeout => 30
workers => 1
}
}
and this is the result of the execution:
aymenstien@aymenstien-VPCEH2Q1E:/usr/share/logstash$ ./bin/logstash -f /home/aymenstien/Bureau/fb.conf
WARNING: Could not find logstash.yml which is typically located in $LS_HOME/config or /etc/logstash. You can specify the path using --path.settings. Continuing using the defaults
Could not find log4j2 configuration at path /usr/share/logstash/config/log4j2.properties. Using default config which logs errors to the console
[FATAL] 2018-07-05 12:47:56.496 [main] runner - An unexpected error occurred! {:error=>#<ArgumentError: Path "/usr/share/logstash/data" must be a writable directory. It is not writable.>, :backtrace=>["/usr/share/logstash/logstash-core/lib/logstash/settings.rb:448:in `validate'", "/usr/share/logstash/logstash-core/lib/logstash/settings.rb:230:in `validate_value'", "/usr/share/logstash/logstash-core/lib/logstash/settings.rb:141:in `block in validate_all'", "org/jruby/RubyHash.java:1343:in `each'", "/usr/share/logstash/logstash-core/lib/logstash/settings.rb:140:in `validate_all'", "/usr/share/logstash/logstash-core/lib/logstash/runner.rb:279:in `execute'", "/usr/share/logstash/vendor/bundle/jruby/2.3.0/gems/clamp-0.6.5/lib/clamp/command.rb:67:in `run'", "/usr/share/logstash/logstash-core/lib/logstash/runner.rb:238:in `run'", "/usr/share/logstash/vendor/bundle/jruby/2.3.0/gems/clamp-0.6.5/lib/clamp/command.rb:132:in `run'", "/usr/share/logstash/lib/bootstrap/environment.rb:73:in `<main>'"]}
[ERROR] 2018-07-05 12:47:56.554 [main] Logstash - java.lang.IllegalStateException: Logstash stopped processing because of an error: (SystemExit) exit
Can anyone help me?
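For what it's worth, the fatal error above is about /usr/share/logstash/data not being writable by the user running Logstash, not about the pipeline config itself. A sketch of two common workarounds, assuming a package-style install (the exact paths are assumptions):

# Option 1: make the default data directory writable for the current user
sudo chown -R $USER /usr/share/logstash/data

# Option 2: point Logstash at a writable data directory and at the packaged settings
./bin/logstash --path.settings /etc/logstash \
  --path.data /home/aymenstien/logstash-data \
  -f /home/aymenstien/Bureau/fb.conf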
06-27-2018
09:29 AM
I am working with Sqoop. I can import and export data between MySQL and Hive or HBase with this command:
sqoop import \
  --connect "jdbc:mysql://localhost:3306/retail_db" \
  --username=retail_dba \
  --password=password \
  --table departments \
  --as-sequencefile \
  --target-dir=/user/departments
It works fine for me. Now I am trying to import and export from a different machine (my friend's machine). What do I have to do? Thanks.
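For the remote case, a hedged sketch: only the --connect URL changes, pointing at the other machine, and MySQL on that machine must accept remote connections (bind-address not restricted to 127.0.0.1, and a user granted access from your host). <friend-host> below is a placeholder for your friend's hostname or IP:

sqoop import \
  --connect "jdbc:mysql://<friend-host>:3306/retail_db" \
  --username=retail_dba \
  -P \
  --table departments \
  --as-sequencefile \
  --target-dir=/user/departments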
Labels:
- Apache Hadoop
- Apache NiFi
- Apache Sqoop
06-21-2018
11:33 AM
I am looking for the current stable release of Hadoop.
Labels:
- Apache Hadoop
03-15-2018
09:08 PM
# mysql --host=hadoop --port=3306 -u admin -p
Enter password:
ERROR 2003 (HY000): Can't connect to MySQL server on 'hadoop' (111)
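Error 111 is "connection refused": nothing is accepting connections on that host and port from the client's side, and a default MySQL install often listens only on 127.0.0.1. A sketch of the usual checks and fix, assuming a Debian/Ubuntu-style MySQL setup (the config path, account and password are placeholders):

# see what mysqld is actually bound to
netstat -tnlp | grep 3306

# if it shows 127.0.0.1:3306 only, set bind-address = 0.0.0.0 in the MySQL
# config (e.g. /etc/mysql/mysql.conf.d/mysqld.cnf) and restart
sudo service mysql restart

# allow the account to connect from remote hosts (MySQL 5.x syntax)
mysql -u root -p -e "GRANT ALL PRIVILEGES ON Testdb.* TO 'admin'@'%' IDENTIFIED BY '<password>'; FLUSH PRIVILEGES;"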
03-15-2018
03:36 PM
@Geoffrey Shelton Okot Yes, the Ambari port is 8080; I want to connect to the MySQL service.
03-15-2018
03:26 PM
I am trying to connect to this URL: http://hadoop:3306 but it is not working. Can someone explain this to me?
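For context on why that fails: MySQL does not speak HTTP, so opening http://hadoop:3306 in a browser will not work even when the server is running. You connect with the mysql client or a JDBC URL instead, for example:

# command-line client
mysql --host=hadoop --port=3306 -u admin -p

# JDBC URL as used by Sqoop and other Java tools
jdbc:mysql://hadoop:3306/Testdb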
Labels:
- Apache Hadoop
- Apache Sqoop
03-09-2018
03:09 PM
root@hadoop:/home/kali# netstat -tnlpa | grep 3306
tcp 0 0 127.0.0.1:3306 0.0.0.0:* LISTEN 978/mysqld
root@hadoop:/home/kali# less /var/log/mysql/mysql.lo
/var/log/mysql/mysql.lo: No such file or directory
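Two notes on the output above, both assuming a stock Debian/Ubuntu MySQL layout: the 127.0.0.1:3306 line means mysqld is listening on loopback only (so remote clients will keep getting error 111 until bind-address is changed), and the error log normally lives at a different path than the one tried:

sudo less /var/log/mysql/error.log  # default error-log location on Debian/Ubuntu (assumption)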
03-09-2018
12:18 AM
I am trying to execute this command line:
$ sqoop import --connect jdbc:mysql://hadoop:3306/Testdb --table widgets -m 4 --username root --password mypwd --driver com.mysql.jdbc.Driver
but I am getting this error:
SLF4J: Class path contains multiple SLF4J bindings.
SLF4J: Found binding in [jar:file:/usr/hdp/2.6.0.3-8/hadoop/lib/slf4j-log4j12-1.7.10.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: Found binding in [jar:file:/usr/hdp/2.6.0.3-8/accumulo/lib/slf4j-log4j12.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.
SLF4J: Actual binding is of type [org.slf4j.impl.Log4jLoggerFactory]
18/03/09 01:15:55 INFO sqoop.Sqoop: Running Sqoop version: 1.4.6.2.6.0.3-8
18/03/09 01:15:55 WARN tool.BaseSqoopTool: Setting your password on the command-line is insecure. Consider using -P instead.
18/03/09 01:15:55 WARN sqoop.ConnFactory: Parameter --driver is set to an explicit driver however appropriate connection manager is not being set (via --connection-manager). Sqoop is going to fall back to org.apache.sqoop.manager.GenericJdbcManager. Please specify explicitly which connection manager should be used next time.
18/03/09 01:15:55 INFO manager.SqlManager: Using default fetchSize of 1000
18/03/09 01:15:55 INFO tool.CodeGenTool: Beginning code generation
18/03/09 01:15:55 ERROR manager.SqlManager: Error executing statement: com.mysql.jdbc.exceptions.jdbc4.CommunicationsException: Communications link failure
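Two hints from the log itself: Sqoop warns that an explicit --driver forces the generic JDBC manager, and the communications link failure is the same symptom as the earlier error 2003 (MySQL not reachable from the node running the job). A hedged re-run, dropping --driver so Sqoop picks its MySQL-specific manager and prompting for the password with -P as the log suggests:

sqoop import \
  --connect jdbc:mysql://hadoop:3306/Testdb \
  --table widgets \
  -m 4 \
  --username root \
  -P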
Labels:
- Apache Hadoop
- Apache Sqoop
03-09-2018
12:04 AM
I get this error:
18/03/09 01:01:51 ERROR tool.ImportTool: Encountered IOException running import job: java.io.IOException: No columns to generate for ClassWriter
at org.apache.sqoop.orm.ClassWriter.generate(ClassWriter.java:1659)
at org.apache.sqoop.tool.CodeGenTool.generateORM(CodeGenTool.java:107)
at org.apache.sqoop.tool.ImportTool.importTable(ImportTool.java:488)
at org.apache.sqoop.tool.ImportTool.run(ImportTool.java:615)
at org.apache.sqoop.Sqoop.run(Sqoop.java:147)
at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:76)
at org.apache.sqoop.Sqoop.runSqoop(Sqoop.java:183)
at org.apache.sqoop.Sqoop.runTool(Sqoop.java:225)
at org.apache.sqoop.Sqoop.runTool(Sqoop.java:234)
at org.apache.sqoop.Sqoop.main(Sqoop.java:243)
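For what it's worth, this ClassWriter error usually means the metadata query returned nothing, most often because the connection to MySQL failed (as in the communications-link error above) or the table name case does not match. A hedged re-run with --verbose normally surfaces the underlying SQL error:

sqoop import \
  --connect jdbc:mysql://hadoop:3306/Testdb \
  --table widgets \
  --username root \
  -P \
  --verbose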