Member since: 03-14-2016
Posts: 4721
Kudos Received: 1111
Solutions: 874
My Accepted Solutions
| Title | Views | Posted |
|---|---|---|
| | 2445 | 04-27-2020 03:48 AM |
| | 4880 | 04-26-2020 06:18 PM |
| | 3976 | 04-26-2020 06:05 PM |
| | 3219 | 04-13-2020 08:53 PM |
| | 4925 | 03-31-2020 02:10 AM |
02-21-2017
03:00 AM
@jayaprakash gadi If you are using Ambari to start and stop your services, then you should make the changes via Ambari's "hadoop-env.sh" instead; otherwise Ambari will override your system-level (global) setting when it starts the process. For Spark executors you can also pass your own log4j.properties: https://community.hortonworks.com/questions/36027/log4jproperties-override-spark-executor.html
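For example, here is a hedged sketch of shipping a custom log4j.properties to the executors with spark-submit (the local path /tmp/log4j.properties, the class com.example.MyApp and the jar myapp.jar are placeholders for your own application):

# Ship a custom log4j.properties to each executor and point the executor JVMs at it
spark-submit \
  --files /tmp/log4j.properties \
  --conf "spark.executor.extraJavaOptions=-Dlog4j.configuration=log4j.properties" \
  --conf "spark.driver.extraJavaOptions=-Dlog4j.configuration=file:/tmp/log4j.properties" \
  --class com.example.MyApp myapp.jar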
02-20-2017
12:04 PM
@Oriane Your error [ssh_exchange_identification: read] is very similar to the one described in the following link; it is basically OS-configuration related: https://codeplanet.io/connection-reset-by-peer/
This message can appear for a number of reasons, but in particular it may mean that your ssh logins are now being blocked for security reasons. This block is triggered automatically whenever several login failures are made in a short space of time. If this happens to you, wait a while then try again - the block is only temporary.
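If you want to confirm whether the guest OS itself is rejecting the logins, a few hedged diagnostic commands (assuming a CentOS-style guest such as the sandbox):

# Check whether TCP wrappers are blocking the connection
cat /etc/hosts.deny

# Look for recent sshd errors and failed logins
tail -n 50 /var/log/secure

# See how many unauthenticated connections sshd allows at once
grep -i maxstartups /etc/ssh/sshd_config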
02-20-2017
11:42 AM
@Oriane Your screenshot suggests that you are running [ssh root@127.0.0.1 -p 2222] from inside the sandbox itself, which is not right. The output in your attached screenshot "2222conectionrefused.png" shows that you are already inside the sandbox. You should SSH from your local machine (laptop) to the sandbox VM instead.
02-20-2017
11:28 AM
1 Kudo
@Oriane It looks like all of those "command not found" errors are occurring because you are doing SSH on the incorrect port 2122. You should be using the following: ssh root@127.0.0.1 -p 2222 . Can you try this?
02-20-2017
11:17 AM
1 Kudo
@Oriane I noticed that you are using port 2122 for SSH: ssh root@127.0.0.1 -p 2122 . Can you please try port 2222 instead and see if it works: ssh root@127.0.0.1 -p 2222
02-20-2017
11:07 AM
@Oriane Also, the "update" command is actually a Postgres command; to run it you will need to log in to the Ambari DB first, as mentioned in my previous update. The default Ambari DB username is "ambari" and the password is "bigdata". Example:
[root@sandbox tmp]# psql -Uambari ambari
Password for user ambari: bigdata
psql (8.4.20)
Type "help" for help.
ambari=> update ambari.users set user_password='538916f8943ec225d97a9a86a2c6ec0818c1cd400e09e03b660fdaaec4af29ddbb6f2b1033b81b00' where user_name='admin';
Are you able to access the Web Terminal: http://127.0.0.1:4200 ? If yes, can you try running the same commands from the web terminal to see if they work? Ideally the commands should be present in the following locations: [root@sandbox tmp]# which ambari-server
/usr/sbin/ambari-server
[root@sandbox tmp]# which ambari-agent
/usr/sbin/ambari-agent
[root@sandbox tmp]# which ambari-admin-password-reset
/usr/sbin/ambari-admin-password-reset
02-20-2017
10:58 AM
@Oriane Additionally, regarding "http://127.0.0.1:8888" not being accessible, I would suggest checking whether "Port Forwarding" is set up properly. The following link provides more detailed information about port forwarding and guest port configuration: https://community.hortonworks.com/articles/65914/how-to-add-ports-to-the-hdp-25-virtualbox-sandbox.html
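For reference, a minimal sketch of adding the forwarding rule from the host's command line, assuming the VirtualBox sandbox described in the linked article (the VM name "Hortonworks Sandbox" and the rule name are placeholders; the same setting can also be made in the VirtualBox UI):

# Forward host port 8888 to guest port 8888 on the NAT adapter (run while the VM is powered off)
VBoxManage modifyvm "Hortonworks Sandbox" --natpf1 "sandbox-8888,tcp,,8888,,8888"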
02-20-2017
10:44 AM
1 Kudo
@Saurabh You can try the following code:

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FileStatus;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;

public class FileStatusChecker {
    public static void main(String[] args) throws Exception {
        try {
            FileSystem fs = FileSystem.get(new Configuration());
            // You need to pass in your own HDFS path here
            FileStatus[] status = fs.listStatus(new Path("hdfs://sandbox.hortonworks.com:8020/testing/ambari-server.log"));
            for (int i = 0; i < status.length; i++) {
                String path = status[i].getPath().toString();
                String owner = status[i].getOwner();
                System.out.println("\n\t PATH: " + path + "\t OWNER: " + owner);
            }
        } catch (Exception e) {
            System.out.println("File not found");
            e.printStackTrace();
        }
    }
}

Here in the above code you can pass either a specific file, new Path("hdfs://sandbox.hortonworks.com:8020/testing/ambari-server.log"), or a directory: new Path("hdfs://sandbox.hortonworks.com:8020/testing")
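A quick way to compile and run it on the sandbox (a minimal sketch; the file name FileStatusChecker.java and the use of "hadoop classpath" to pick up the client jars are assumptions about your setup):

# Compile against the Hadoop client jars already present on the sandbox
javac -cp "$(hadoop classpath)" FileStatusChecker.java

# Run it with the same classpath plus the current directory
java -cp "$(hadoop classpath):." FileStatusChecker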
02-20-2017
10:24 AM
@Oriane
If you are using the HDP 2.5 sandbox in a VM, then regarding the "ambari-admin-password-reset: command not found" issue, you can find the answer here: https://community.hortonworks.com/questions/58247/hdp-25-sandboxvm-commandsscripts-are-not-found.html - One other way to reset the Ambari admin password is to update the "users" table in the Ambari database as follows:
[root@sandbox tmp]# psql -Uambari ambari
Password for user ambari:
psql (8.4.20)
Type "help" for help.
ambari=> update ambari.users set user_password='538916f8943ec225d97a9a86a2c6ec0818c1cd400e09e03b660fdaaec4af29ddbb6f2b1033b81b00' where user_name='admin';

- Your screenshot also suggests that you are trying to start the agent with the wrong command: ambari agent start (incorrect). The correct command is: ambari-agent start
For the server it should be: ambari-server start

The following link provides a great walkthrough of using the sandbox: http://hortonworks.com/hadoop-tutorial/learning-the-ropes-of-the-hortonworks-sandbox/#install-sandbox
02-20-2017
03:56 AM
1 Kudo
@Anandha L Ranganathan The "MySQLSyntaxErrorException" looks strange here; it looks like a bug. Which version of Ambari are you using? - However, as a quick workaround you might want to try the following approach to delete the "PRESTO" service completely (an Ambari REST API alternative is sketched after these steps).

1. Stop ambari-server: ambari-server stop

2. Take an Ambari database backup for safety: mysqldump $dbname >/tmp/ambari_db_dump.sql
Example:
mysqldump ambari >/tmp/ambari_db_dump.sql

3. Now run the following commands in the Ambari database. This will clear the PRESTO service entries from your Ambari DB:
delete from hostcomponentstate where service_name = 'PRESTO';
delete from hostcomponentdesiredstate where service_name = 'PRESTO';
delete from servicecomponentdesiredstate where service_name = 'PRESTO';
delete from servicedesiredstate where service_name = 'PRESTO';
delete from serviceconfighosts where service_config_id in (select service_config_id from serviceconfig where service_name = 'PRESTO');
delete from serviceconfigmapping where service_config_id in (select service_config_id from serviceconfig where service_name = 'PRESTO');
delete from serviceconfig where service_name = 'PRESTO';
delete from requestresourcefilter where service_name = 'PRESTO';
delete from requestoperationlevel where service_name = 'PRESTO';
delete from clusterservices where service_name ='PRESTO';
delete from clusterconfig where type_name like 'presto%';
delete from clusterconfigmapping where type_name like 'presto%';

4. Now restart ambari-server: ambari-server start
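As an alternative to editing the database by hand, Ambari also exposes a REST API for removing a stopped service; a hedged sketch (admin:admin, ambari-host and MyCluster are placeholders for your credentials, Ambari host and cluster name):

# Stop the PRESTO service first
curl -u admin:admin -H "X-Requested-By: ambari" -X PUT \
  -d '{"RequestInfo":{"context":"Stop PRESTO"},"Body":{"ServiceInfo":{"state":"INSTALLED"}}}' \
  http://ambari-host:8080/api/v1/clusters/MyCluster/services/PRESTO

# Then delete the service definition
curl -u admin:admin -H "X-Requested-By: ambari" -X DELETE \
  http://ambari-host:8080/api/v1/clusters/MyCluster/services/PRESTO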