Member since: 12-14-2015
Posts: 45
Kudos Received: 20
Solutions: 5
My Accepted Solutions
Title | Views | Posted |
---|---|---|
 | 1449 | 05-03-2016 02:27 PM |
 | 1364 | 04-27-2016 02:22 PM |
 | 29161 | 04-27-2016 08:00 AM |
 | 1416 | 04-21-2016 02:29 PM |
 | 5764 | 02-03-2016 08:24 AM |
04-29-2016
12:52 PM
Hello, I came across a particular issue. The active NameNode (master01) returned a socket timeout on ZKFC, and shortly afterwards an automatic failover brought master02 to active. However, master01 was left in a stalemate: in Ambari the NameNode appeared up, but with no HA state (active or standby), while the NameNode process on the server was running and answering calls. The NameNode log shows no errors; the ZKFC log contains some SocketTimeout exceptions (log attached). To resolve the situation we had to restart the NameNode service on master01, which came back up in standby as expected. I then performed several manual failovers, all with positive results. The system log shows no errors, the LAN was always up, and there were no communication errors with the server. As I wrote above, the NameNode service is up and running on both servers. Any idea what might have happened? PS: the HDFS service works correctly. nn-errors.txt
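For reference, the HA state can be checked and a manual failover triggered from the command line (a minimal sketch; nn1 and nn2 stand for the NameNode service IDs defined in hdfs-site.xml, which may differ in your cluster):

    # check the HA state reported by each NameNode
    hdfs haadmin -getServiceState nn1
    hdfs haadmin -getServiceState nn2

    # trigger a manual failover from nn1 to nn2
    hdfs haadmin -failover nn1 nn2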
Labels:
- Apache Hadoop
04-28-2016
03:59 PM
Hi @Ludovic Rouleau, I think the only place where the distcp could happen is in the first action:
action name = "shell_date". Look at the script it loads; you should find something like:
hadoop distcp hdfs://nn1:8020/xxx hdfs://nn2:8020/xxx Alternatively, you could skip the first action and start the workflow directly from the second one, changing <start to="shell_date"/> to <start to="maj_t"/> (see the sketch below).
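Both things can be checked from the command line (a minimal sketch; the HDFS paths here are hypothetical, replace them with your workflow's actual location):

    # see which node the workflow currently starts from
    hdfs dfs -cat /user/oozie/workflows/myapp/workflow.xml | grep '<start'

    # look for a distcp call inside the script run by the shell_date action
    hdfs dfs -cat /user/oozie/workflows/myapp/shell_date.sh | grep distcp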
04-28-2016
10:37 AM
OK, the upgrade is finished now and everything works correctly. Thanks @Ignacio Pérez Torres
04-28-2016
08:47 AM
Great! I can install it now; I'll try to complete the process.
04-27-2016
02:25 PM
PS: you can download the HDP Windows version here:
http://hortonworks.com/downloads/#data-platform
04-27-2016
02:22 PM
http://docs.hortonworks.com/HDPDocuments/HDP2/HDP-2.4.0-Win/bk_HDP_Install_Win/content/ref-9bdea823-d29d-47f2-9434-86d5460b9aa9.1.html
Windows 10 is not officially supported (only Windows Server editions are), but I followed this guide to install HDP on a single Windows 10 x64 machine and it works.
04-27-2016
10:47 AM
You're welcome.
04-27-2016
09:10 AM
Can you attach the full log? That said, I think the problem is --target-dir: use /user/root/test instead, for two reasons: 1) Does the root user have permission to write to /user/maria_dev? By default it does not.
2) The target directory must not already exist, or the job fails with this error:
ERROR tool.ImportTool: Encountered IOException running import job: org.apache.hadoop.mapred.FileAlreadyExistsException: Output directory hdfs://sandbox.hortonworks.com:8020/user/root/test already exists
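A quick way to check the permissions and clean up before re-running (a minimal sketch; the paths match the error above, adjust them as needed):

    # check who owns /user/maria_dev and whether root can write there
    hdfs dfs -ls /user

    # remove the leftover target directory so the import can run again
    hdfs dfs -rm -r /user/root/test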
04-27-2016
08:00 AM
Hi, the correct syntax is: select empid, ename from emp WHERE eid > '200' and $CONDITIONS (no quotes around $CONDITIONS)
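For context, this is how the query fits into a full Sqoop command (a minimal sketch; the connection string, credentials, and target directory are placeholders):

    # dbhost, testdb, the credentials and the target dir are hypothetical
    sqoop import \
      --connect "jdbc:mysql://dbhost:3306/testdb" \
      --username user --password pass \
      --query "select empid, ename from emp WHERE eid > '200' and \$CONDITIONS" \
      --target-dir /user/root/emp \
      -m 1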
04-21-2016
02:29 PM
3 Kudos
You need to use Sqoop for a direct import into Hive from SQL Server. First download the SQL Server JDBC driver:
https://www.microsoft.com/en-us/download/details.aspx?id=11774
Place the jar on the Sqoop master server under:
/usr/hdp/current/sqoop-server/lib
Then use this command for the import:
sqoop import --connect "jdbc:sqlserver://[SQL_SERVER_NAME]:[SQL_PORT];databaseName=[DB_NAME]" --username "[SQL_USERNAME]" --password "[PASSWORD]" --query '[INSERT QUERY HERE] WHERE $CONDITIONS' -m 1 --hive-import --hive-database [DB_HIVE] --create-hive-table --hive-table [HIVE_TABLE]
If you use --query, the query must contain WHERE $CONDITIONS.
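For illustration, a filled-in version of the command (a minimal sketch; the server name, port, credentials, and database/table names are all hypothetical):

    # hypothetical values; replace with your own server, credentials and tables
    sqoop import \
      --connect "jdbc:sqlserver://sqlhost:1433;databaseName=SalesDB" \
      --username "sqoopuser" --password "secret" \
      --query "SELECT empid, ename FROM emp WHERE \$CONDITIONS" \
      -m 1 \
      --hive-import --hive-database default \
      --create-hive-table --hive-table emp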