Member since
01-25-2016
345
Posts
86
Kudos Received
25
Solutions
My Accepted Solutions
Title | Views | Posted
---|---|---
 | 4998 | 10-20-2017 06:39 PM
 | 3532 | 03-30-2017 06:03 AM
 | 2585 | 02-16-2017 04:55 PM
 | 16096 | 02-01-2017 04:38 PM
 | 1141 | 01-24-2017 08:36 PM
10-18-2017
08:06 AM
Thanks Deepesh and Divakar
04-06-2017
12:04 AM
2 Kudos
We are working on the HDP 2.6 Sandbox and are in the final stages of testing; we expect to have something available soon. @Saumitra Buragohain
03-31-2017
09:08 AM
Thank you for the video link. It was very helpful.
02-22-2017
04:08 PM
3 Kudos
@Divakar Annapureddy yes, via WebHCat and WebHDFS.

WebHCat

```shell
# this will execute a Hive query and save the result to an HDFS file
# in your home directory called output
curl -s -d execute="select+*+from+sample_08;" \
     -d statusdir="output" \
     'http://localhost:50111/templeton/v1/hive?user.name=root'

# if you ls the directory, it will have two files, stderr and stdout
hdfs dfs -ls output

# if the job succeeded, you can cat the stdout file and view the results
hdfs dfs -cat output/stdout
```

WebHDFS

```shell
# list the output directory; notice the WebHDFS port
curl -i "http://sandbox.hortonworks.com:50070/webhdfs/v1/user/root/output/?op=LISTSTATUS"

# read the output file
curl -i -L "http://sandbox.hortonworks.com:50070/webhdfs/v1/user/root/output/stdout?op=OPEN"

# rename a file; if you get a "dr.who" error, add &user.name=root (or any other
# user appropriate to the context) to the request
curl -i -X PUT "http://sandbox.hortonworks.com:50070/webhdfs/v1/user/root/output/stdout?op=RENAME&user.name=root&destination=/user/root/newname"

# read the renamed file
curl -i -L "http://sandbox.hortonworks.com:50070/webhdfs/v1/user/root/newname?op=OPEN"
```
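If you want to script these WebHDFS calls rather than type curl by hand, the URL scheme and the LISTSTATUS response format are easy to handle from Python. A minimal sketch, assuming the sandbox host/port above; the sample JSON mirrors the shape WebHDFS returns for `op=LISTSTATUS`:

```python
import json

WEBHDFS = "http://sandbox.hortonworks.com:50070/webhdfs/v1"

def webhdfs_url(path, op, user="root", **params):
    """Build a WebHDFS REST URL for the given HDFS path and operation."""
    query = "&".join([f"op={op}", f"user.name={user}"] +
                     [f"{k}={v}" for k, v in params.items()])
    return f"{WEBHDFS}{path}?{query}"

def list_filenames(liststatus_json):
    """Extract file names from a WebHDFS LISTSTATUS JSON response."""
    statuses = json.loads(liststatus_json)["FileStatuses"]["FileStatus"]
    return [s["pathSuffix"] for s in statuses]

# sample LISTSTATUS payload (trimmed to the fields used here)
sample = '{"FileStatuses":{"FileStatus":[{"pathSuffix":"stderr"},{"pathSuffix":"stdout"}]}}'

print(webhdfs_url("/user/root/output", "LISTSTATUS"))
print(list_filenames(sample))  # ['stderr', 'stdout']
```

The same builder covers the rename case: `webhdfs_url("/user/root/output/stdout", "RENAME", destination="/user/root/newname")` reproduces the PUT URL from the curl example above.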
02-16-2017
04:55 PM
@Baruch AMOUSSOU DJANGBAN You don't have the Ambari 2.4.2 version available in your repo. First set up a repo with Ambari Server 2.4.2 and run `yum list ambari-server` again. Ideally your command will then show something like: Available Packages: ambari-server 2.4.2; Installed Packages: ambari-server 2.4.0. Once the 2.4.2 package shows as available, try the upgrade again.
02-03-2017
12:04 AM
Thanks for your inputs. The reason for the change: we can avoid a dependency on third-party tools like F5 if ZooKeeper does the same job.
02-06-2017
02:42 PM
You can check whether the values you configured are actually in use with this command: `hdfs getconf -confKey yarn.nodemanager.resource.memory-mb` or from the ResourceManager UI at `<rm.host>:8088/conf`. If the values are in sync, you can also check the ResourceManager logs for further information.
01-30-2017
05:18 PM
I would recommend you take a look at Beaker Notebook: http://beakernotebook.com/getting-started?scroll It is an easy-to-use polyglot notebook interface.
01-24-2017
08:36 PM
1 Kudo
@PJ It's hard to tell, but I would suggest going with the hotfix if you don't have time to validate all your production jobs against the HDP 2.5.0 stack. If you do have time, I would suggest installing HDP 2.5.0 in your Dev environment first and testing all your production jobs there, to ensure they all run without any issues.