Member since: 10-21-2016
Posts: 7
Kudos Received: 3
Solutions: 1
My Accepted Solutions
| Title | Views | Posted |
|---|---|---|
|  | 3817 | 10-21-2016 06:01 PM |
04-16-2019 01:15 PM
@Jay Kumar SenSharma Referring to the PUT command used for triggering the alert manually: is it possible to pass parameters or custom information to the script that gets triggered by the alert? If yes, is it via the headers or the body of the PUT request? Thanks.
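For context, a minimal sketch of the kind of manual trigger being referenced, assuming the Ambari Alerts API's run_now flag; the credentials, host, cluster name, and alert definition id below are placeholders:

```bash
# Sketch: force an Ambari alert definition to run immediately.
# admin:admin, ambari_host, MyCluster, and the definition id 42 are
# placeholders; the run_now=true parameter is assumed from the
# Ambari Alerts API.
curl -u admin:admin \
     -H "X-Requested-By: ambari" \
     -X PUT \
     "http://ambari_host:8080/api/v1/clusters/MyCluster/alert_definitions/42?run_now=true"
```

Note that this request carries no body of its own; whether Ambari would forward extra headers or body content on to the alert's script is exactly the open question above.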
04-16-2019 01:08 PM
Referring to the PUT command used for triggering the alert manually: is it possible to pass parameters or custom information to the script that gets triggered by the alert? If yes, is it via the headers or the body of the PUT request? Thanks.
05-19-2017 05:56 AM
1 Kudo
I have configured the S3 keys (access key and secret key) in a jceks file using the hadoop credential API. The commands used are:

```bash
hadoop credential create fs.s3a.access.key -provider jceks://hdfs@nn_hostname/tmp/s3creds_test.jceks
hadoop credential create fs.s3a.secret.key -provider jceks://hdfs@nn_hostname/tmp/s3creds_test.jceks
```

Then I open a connection to the Spark Thrift Server using beeline, passing the jceks file path in the connection string:

```bash
beeline -u "jdbc:hive2://hostname:10001/;principal=hive/_HOST@?hadoop.security.credential.provider.path=jceks://hdfs@nn_hostname/tmp/s3creds_test.jceks;"
```

Now, when I try to create an external table with its location on S3, it fails with the exception below:

```sql
CREATE EXTERNAL TABLE IF NOT EXISTS test_table_on_s3 (col1 String, col2 String)
row format delimited fields terminated by ','
LOCATION 's3a://bucket_name/kalmesh/';
```

Exception:

```
Error: org.apache.spark.sql.execution.QueryExecutionException: FAILED: Execution Error, return code 1 from org.apache.hadoop.hive.ql.exec.DDLTask. MetaException(message:Got exception: java.nio.file.AccessDeniedException s3a://bucket_name/kalmesh: getFileStatus on s3a://bucket_name/kalmesh: com.amazonaws.services.s3.model.AmazonS3Exception: Forbidden (Service: Amazon S3; Status Code: 403; Error Code: 403 Forbidden; Request ID: request_id), S3 Extended Request ID: extended_request_id=) (state=,code=0)
```

However, this works fine with the Hive Thrift Server.

HDP version: HDP 2.5
Spark version: 1.6
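One quick sanity check, as a sketch assuming the same provider path as above, is to confirm that both keys actually landed in the jceks file:

```bash
# List the credential aliases stored in the provider; both
# fs.s3a.access.key and fs.s3a.secret.key should appear.
hadoop credential list -provider jceks://hdfs@nn_hostname/tmp/s3creds_test.jceks
```

If both aliases are listed, one possibility is that the Spark 1.6 Thrift Server is not propagating hadoop.security.credential.provider.path from the JDBC URL into its own Hadoop configuration, which would be consistent with the same setup working against the Hive Thrift Server.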
Labels:
- Apache Hive
- Apache Spark
10-21-2016 06:01 PM
2 Kudos
Change the NFS dump directory from /tmp/.hdfs-nfs to /tmp/.nfs or /tmp/.hdfsnfs (i.e. remove the hyphen from the directory name) via the Ambari configs page. This will solve the issue.
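For illustration, a sketch of making the same change from the command line, assuming the gateway reads the directory from the nfs.dump.dir property in hdfs-site and that Ambari's bundled configs.sh helper is available; ambari_host and MyCluster are placeholders:

```bash
# Hypothetical example: set nfs.dump.dir (hdfs-site) to a directory name
# without a hyphen via Ambari's configs.sh. The NFS Gateway must be
# restarted afterwards for the change to take effect.
/var/lib/ambari-server/resources/scripts/configs.sh set ambari_host MyCluster hdfs-site "nfs.dump.dir" "/tmp/.hdfsnfs"
```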