Member since: 09-18-2015
Posts: 9
Kudos Received: 27
Solutions: 1
My Accepted Solutions
Title | Views | Posted |
---|---|---|
 | 949 | 05-01-2016 04:54 PM |
12-28-2016 03:53 PM
2 Kudos
@Brad Bukacek Jr By design, the HBase REST server base64-encodes the content in its responses. So all of your content, including the column family, the qualifier, and the raw cell value, will be encoded. You just need to create a custom JSON deserializer. Here is an excellent blog post on this subject: https://blog.layer4.fr/2016/11/16/hbase-rest-api-knox-java/ It has a section about your exact problem.
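To illustrate the decoding step, here is a minimal sketch (in Python rather than Java, for brevity) that walks the standard HBase REST JSON shape, `{"Row": [{"key": ..., "Cell": [{"column": ..., "$": ...}]}]}`, and base64-decodes the row key, column, and value; the function name is my own:

```python
import base64
import json

def decode_hbase_cells(payload: str):
    """Decode the base64-encoded fields of an HBase REST JSON response.

    Assumes the standard shape: {"Row": [{"key": ..., "Cell": [...]}]},
    where "column" and "$" (the cell value) are base64 strings.
    """
    rows = []
    for row in json.loads(payload).get("Row", []):
        cells = []
        for cell in row.get("Cell", []):
            cells.append({
                "column": base64.b64decode(cell["column"]).decode("utf-8"),
                "timestamp": cell.get("timestamp"),
                "value": base64.b64decode(cell["$"]).decode("utf-8"),
            })
        rows.append({
            "key": base64.b64decode(row["key"]).decode("utf-8"),
            "Cell": cells,
        })
    return rows
```

A custom Jackson deserializer in Java would do the equivalent, as described in the blog post above.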
11-23-2015 09:40 PM
@Brad Bukacek Jr Blog: https://www.linkedin.com/pulse/yarn-queues-hadoop-neeraj-sabharwal The Capacity Scheduler view has an option for the user mapping format.
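For reference, the mapping format the view exposes corresponds to the `yarn.scheduler.capacity.queue-mappings` property; a sketch with illustrative user, group, and queue names (the names are examples, not from your cluster):

```properties
# u:<user>:<queue> maps a user, g:<group>:<queue> maps a group,
# and u:%user:%user maps each user to a queue with the same name.
yarn.scheduler.capacity.queue-mappings=u:alice:engineering,g:analysts:bi,u:%user:default
# When false, a queue explicitly requested by an application wins over the mapping.
yarn.scheduler.capacity.queue-mappings-override.enable=false
```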
10-16-2015 02:23 AM
Please see this; it's supported. Example properties:

#cluster1 properties
cluster1.oozie_url = http://node-1.example.com:11000/oozie/
cluster1.oozie_location = /usr/lib/oozie/bin
cluster1.qa_host = node-1.example.com
cluster1.service_user = falcon
cluster1.password = rgautam
cluster1.hadoop_url = node-1.example.com:8020
cluster1.hadoop_location = /usr/lib/hadoop/bin/hadoop
cluster1.hostname = http://node-1.example.com:15000
cluster1.cluster_readonly = webhdfs://node-1.example.com:50070
cluster1.cluster_execute = node-1.example.com:8032
cluster1.cluster_write = hdfs://node-1.example.com:8020
cluster1.activemq_url = tcp://node-1.example.com:61616?daemon=true
cluster1.storeLocation = hdfs://node-1.example.com:8020/apps/falcon
cluster1.colo = default
cluster1.namenode.kerberos.principal = nn/node-1.example.com@none
cluster1.hive.metastore.kerberos.principal = hive/node-1.example.com@none
cluster1.hcat_endpoint = thrift://node-1.example.com:9083
cluster1.service_stop_cmd = /usr/lib/falcon/bin/falcon-stop
cluster1.service_start_cmd = /usr/lib/falcon/bin/falcon-start
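As a side note, a prefixed properties file like the one above is easy to load programmatically; a minimal sketch (the function name is my own) that groups `cluster1.key = value` lines by cluster prefix:

```python
def parse_cluster_properties(text: str) -> dict:
    """Group 'clusterN.key = value' lines by their cluster prefix.

    Comment lines starting with '#' and blank lines are skipped.
    """
    clusters = {}
    for line in text.splitlines():
        line = line.strip()
        if not line or line.startswith("#") or "=" not in line:
            continue
        name, value = line.split("=", 1)
        # Split only on the first dot, so keys like
        # namenode.kerberos.principal stay intact.
        prefix, _, key = name.strip().partition(".")
        clusters.setdefault(prefix, {})[key] = value.strip()
    return clusters
```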
11-28-2017 09:10 AM
Hi Team, I have tried the above and I see the job status KILLED after running the workflow. After launching Oozie, I can see the workflow change status from RUNNING to KILLED. Is there a way to troubleshoot this? I can run hadoop fs -ls commands on my S3 bucket, so I definitely have access. I suspect it's the S3 URL. I tried downloading the XML, changing the URL, and re-uploading, with no luck. Any other suggestions? I appreciate all your help/support in advance. Regards, Anil
10-08-2015 08:20 PM
To elaborate a bit more, there is an InvokeHTTP processor that is able to use basic authentication. To connect, the processor has a property called "Basic Authentication Password", which the user enters in the UI when configuring the processor. Since it is a password, it is treated as a sensitive property: once set, it can no longer be seen in the UI, and it is stored encrypted. Also, when the flow is exported as a template, sensitive properties are not transferred.
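What the processor sends on the wire is a standard HTTP Basic `Authorization` header (RFC 7617); a minimal sketch of how that header value is built (the function name and credentials are illustrative):

```python
import base64

def basic_auth_header(username: str, password: str) -> str:
    """Build an HTTP Basic Authorization header value (RFC 7617):
    'Basic ' followed by base64("username:password")."""
    token = base64.b64encode(f"{username}:{password}".encode("utf-8")).decode("ascii")
    return f"Basic {token}"
```

The sensitive-property handling in NiFi protects the stored password; the header itself is only as safe as the transport, so the request should go over HTTPS.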
10-31-2016 06:08 PM
Bojan, if you are using Apache NiFi 1.0.0 or later, use this guide by @Bryan Bende.