Member since: 11-07-2016
Posts: 637
Kudos Received: 253
Solutions: 144
My Accepted Solutions
| Title | Views | Posted |
|---|---|---|
| | 2721 | 12-06-2018 12:25 PM |
| | 2863 | 11-27-2018 06:00 PM |
| | 2194 | 11-22-2018 03:42 PM |
| | 3567 | 11-20-2018 02:00 PM |
| | 6276 | 11-19-2018 03:24 PM |
01-09-2018
05:48 PM
@Amogh Suman, Can you please try running the commands below, then check whether core-site.xml is created in the /usr/hdp/current/hadoop-client/conf/ folder and whether /etc/hadoop/conf is created?
# curl -k -u {username}:{password} -H "X-Requested-By:ambari" -i -X PUT -d '{"HostRoles": {"state": "INSTALLED"}}' http://{ambari-host}:{ambari-port}/api/v1/clusters/{clustername}/hosts/{hostname}/host_components/HDFS_CLIENT
# curl -k -u {username}:{password} -H "X-Requested-By:ambari" -i -X PUT -d '{"HostRoles": {"state": "INSTALLED"}}' http://{ambari-host}:{ambari-port}/api/v1/clusters/{clustername}/hosts/{hostname}/host_components/YARN_CLIENT
# curl -k -u {username}:{password} -H "X-Requested-By:ambari" -i -X PUT -d '{"HostRoles": {"state": "INSTALLED"}}' http://{ambari-host}:{ambari-port}/api/v1/clusters/{clustername}/hosts/{hostname}/host_components/MAPREDUCE2_CLIENT
# yum install -y hadoop hadoop-hdfs hadoop-libhdfs hadoop-yarn hadoop-mapreduce hadoop-client openssl
In the curl commands, replace {username} with the Ambari username, {password} with the Ambari password, {ambari-host} with the Ambari server hostname, {ambari-port} with the Ambari port (default 8080), {clustername} with the cluster name, and {hostname} with the target host.
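For illustration, the first command with every placeholder filled in might look like the sketch below; the credentials, hostnames, and cluster name here are made-up example values, not details from this thread:
# curl -k -u admin:admin -H "X-Requested-By:ambari" -i -X PUT -d '{"HostRoles": {"state": "INSTALLED"}}' http://ambari.example.com:8080/api/v1/clusters/cl1/hosts/node1.example.com/host_components/HDFS_CLIENT
Thanks, Aditya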
01-09-2018
05:22 PM
2 Kudos
@Gagandeep Singh Chawla, The right choice depends on your use case.
1) The main disadvantage of fs -cp is that all data has to transit through the machine you issue the command on, so the time taken grows with the amount of data you want to copy. DistCp is distributed, as its name implies, so there is no bottleneck of this kind.
2) distcp runs a MapReduce job behind the scenes, whereas the cp command just invokes the FileSystem copy for every file.
3) If there are existing jobs running, distcp might take longer depending on the memory/resources consumed by the already running jobs. In this case cp would be better.
4) Also, distcp works between 2 clusters.
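As a quick illustration of both approaches (the namenode addresses and paths below are placeholders, not values from this thread):
# hadoop distcp hdfs://nn1:8020/source/path hdfs://nn2:8020/target/path  # MapReduce-based copy, works across clusters
# hadoop fs -cp /source/path /target/path  # copies all data through the client machine
Thanks, Aditya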
01-09-2018
04:06 PM
@Amogh Suman, I have seen a couple of questions from you related to missing files. There is a chance that your sandbox is corrupted. I suggest downloading the latest sandbox and using it. For the above question, can you please list the files in the folder '/usr/hdp/current/hadoop-client/conf/' and check if core-site.xml exists? Also check if the folder '/etc/hadoop/conf' exists.
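For example, checks along these lines (using the same paths mentioned above) would confirm both:
# ls -l /usr/hdp/current/hadoop-client/conf/ | grep core-site.xml
# ls -ld /etc/hadoop/conf
Thanks, Aditya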
01-09-2018
09:01 AM
1 Kudo
@Ravikiran Dasari, Option 1: using Hive queries. Let us assume there are two tables, A and B, and you want to copy data from B to A:
insert into table A select * from B; -- use this when both tables have the same schema
insert into table A select col1, col2, col3 from B; -- use this when the schemas differ and you want to insert selected columns from table B
Option 2: using Sqoop. You can schedule a Sqoop job to do this in incremental mode; the links below give more info, and a rough sketch follows them.
https://community.hortonworks.com/questions/10710/sqoop-incremental-import-working-fine-now-i-want-k.html
https://sqoop.apache.org/docs/1.4.2/SqoopUserGuide.html#_incremental_imports
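As a rough sketch of the Sqoop route (the JDBC URL, database credentials, table name, check column, and target directory are all placeholders, not details from this thread), a saved job in incremental append mode can be re-run on a schedule and tracks the last imported value itself:
# sqoop job --create copy_b_to_a -- import --connect jdbc:mysql://dbhost/mydb --username dbuser -P --table B --target-dir /apps/hive/warehouse/a --incremental append --check-column id --last-value 0
# sqoop job --exec copy_b_to_a  # each run imports only rows with id greater than the stored last-value
Thanks, Aditya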
01-09-2018
08:48 AM
@Ravikiran Dasari, -S is for running the hive shell in silent mode. Yes, you can create 2 tables with the same schema; just make sure that the table names are different.
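As a minimal sketch (the table names here are placeholders), the second table can clone the first one's schema directly:
# hive -S -e "create table A (name string)"
# hive -S -e "create table B like A"  # B gets the same schema as A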
01-08-2018
11:35 AM
1 Kudo
@yassine sihi, Try removing the directory manually and then installing HBase again:
rm -rf /usr/hdp/2.5.0.0-1245/hbase/conf
Thanks, Aditya
01-08-2018
10:50 AM
1 Kudo
@Ravikiran Dasari, If you are using a shell script, then you can use the script below to create a Hive table with a timestamp in its name:
# curr_timestamp=`date +%s`
# hive -S -e "create table test_$curr_timestamp(name string)"
# hive -S -e "show tables"
> test_1515408162 // output of show tables
Thanks, Aditya
01-08-2018
07:11 AM
1 Kudo
@Gayathri Devi, You can use the script below.
beeline -u "{connection-string}" -e "show tables" | grep $1
# grep exits with status 0 if the table name was found in the output
if [ $? -eq 0 ]
then
  echo "table found"
else
  echo "table not found"
fi
Put the content in a file, say checktable.sh, and run the steps below:
chmod +x checktable.sh
./checktable.sh {tablename to check}
Thanks, Aditya
01-05-2018
01:54 PM
@yassine sihi, Can you please paste the curl command that you used? According to the error response, the URL looks malformed.
01-05-2018
11:30 AM
@yassine sihi, You should get a response like the one below, and in the Ambari GUI operations you should also see a request to install the HBase client.
HTTP/1.1 202 Accepted
X-Frame-Options: DENY
X-XSS-Protection: 1; mode=block
X-Content-Type-Options: nosniff
Cache-Control: no-store
Pragma: no-cache
Set-Cookie: AMBARISESSIONID=1s6kicjia68zn3c1f77kpzx1o;Path=/;HttpOnly
Expires: Thu, 01 Jan 1970 00:00:00 GMT
User: admin
Content-Type: text/plain
Vary: Accept-Encoding, User-Agent
Content-Length: 136
{
  "href" : "http://172.31.194.7:8080/api/v1/clusters/cl1/requests/30",
  "Requests" : {
    "id" : 30,
    "status" : "Accepted"
  }
}
What is the error that you are facing while running the command?
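For reference, a request of the following shape is what produces that 202 response; this is a sketch using the same placeholder convention as the client-install commands earlier in this thread, not a command taken from this post:
# curl -k -u {username}:{password} -H "X-Requested-By:ambari" -i -X PUT -d '{"HostRoles": {"state": "INSTALLED"}}' http://{ambari-host}:{ambari-port}/api/v1/clusters/{clustername}/hosts/{hostname}/host_components/HBASE_CLIENT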