Member since
12-10-2017
9
Posts
1
Kudos Received
0
Solutions
07-20-2018
03:03 PM
Thanks for sharing the detailed information; I will try to update spark-env to use YARN as the default mode.
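For reference, a minimal sketch of that change. In recent Spark versions the default master is typically set in conf/spark-defaults.conf rather than spark-env.sh; the file path below is an assumption for a typical install, not something stated in this thread:

```properties
# conf/spark-defaults.conf — assumed location; adjust for your distribution.
# Makes spark-submit default to YARN when no --master flag is given.
spark.master            yarn
spark.submit.deployMode client
```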
12-12-2017
09:35 AM
@Gaurav Parmar Here is the documentation of the Cluster Applications API you are using. As you can see under "Query Parameters Supported", you can filter jobs by time frame with four parameters: startedTimeBegin, startedTimeEnd, finishedTimeBegin, and finishedTimeEnd. All four are specified in milliseconds since the Unix epoch, so you have to convert your time interval to epoch timestamps first. For example, last week runs from 2017-12-04 00:00:01 = 1512345601000 to 2017-12-10 23:59:59 = 1512950399000. To list all applications that were started and finished this week, you can use: http://hostname:8088/ws/v1/cluster/apps?startedTimeBegin=1512345601000&finishedTimeEnd=1512950399000&states=FINISHED
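The conversion and URL construction above can be sketched in a few lines of Python. This is a sketch, not from the thread: the hostname is a placeholder, and the dates are the example week mentioned above, interpreted as UTC:

```python
from datetime import datetime, timezone

# Placeholder ResourceManager address — replace with your own host:port.
RM = "http://hostname:8088"

def to_epoch_ms(dt: datetime) -> int:
    """Convert an aware datetime to milliseconds since the Unix epoch."""
    return int(dt.timestamp() * 1000)

# Last week, as in the example: 2017-12-04 00:00:01 to 2017-12-10 23:59:59 UTC.
begin = to_epoch_ms(datetime(2017, 12, 4, 0, 0, 1, tzinfo=timezone.utc))
end = to_epoch_ms(datetime(2017, 12, 10, 23, 59, 59, tzinfo=timezone.utc))

url = (f"{RM}/ws/v1/cluster/apps"
       f"?startedTimeBegin={begin}&finishedTimeEnd={end}&states=FINISHED")
print(begin)  # 1512345601000
print(end)    # 1512950399000
print(url)
```

Note that timestamps built from local times will differ from these values unless your machine's timezone is UTC, which is why the datetimes above are created with an explicit tzinfo.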
12-10-2017
07:17 PM
1 Kudo
@Gaurav Parmar If you are asking about the numbers 1324256400 (GMT: Monday, December 19, 2011 1:00:00 AM) and 1324303200 (GMT: Monday, December 19, 2011 2:00:00 PM), those are epoch (Unix) timestamps. I am not sure how or when your use case will supply the timestamp, but here is one reference for converting human-readable dates and times to timestamps and vice versa: https://www.epochconverter.com/
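The same conversion can be done without an online tool; a small sketch in Python using the two example timestamps from this post:

```python
from datetime import datetime, timezone

# Epoch seconds -> human-readable GMT dates, for the two example values above.
readable = [
    datetime.fromtimestamp(ts, tz=timezone.utc).strftime("%A, %B %d, %Y %H:%M:%S GMT")
    for ts in (1324256400, 1324303200)
]
print(readable[0])  # Monday, December 19, 2011 01:00:00 GMT
print(readable[1])  # Monday, December 19, 2011 14:00:00 GMT

# The reverse: a human-readable GMT date back to epoch seconds.
epoch = int(datetime(2011, 12, 19, 1, 0, 0, tzinfo=timezone.utc).timestamp())
print(epoch)  # 1324256400
```

Remember that the YARN REST API expects milliseconds, so multiply epoch seconds by 1000 before using them as query parameters.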
12-11-2017
03:50 PM
What should the URL/command be when we need to access Hadoop jobs for a specified time duration?