Created 12-03-2024 08:52 AM
I'm running several ETL flows in CDF/NiFi. The flows all write data to Iceberg tables and run on schedules ranging from hourly to weekly (i.e., infrequent writes). Each data flow deployment contains a few independent DAGs that all end with the PutIceberg processor, so multiple separate ETL processes run at different intervals within the same NiFi deployment.
The problem occurs pretty rarely (~once a month) and only for some deployments. Occasionally, the PutIceberg processor errors with 'Failed to acquire a SAS token' (see the full error log below).
Once it starts, the error recurs every time the processor runs until I restart the flow, after which it runs fine again. It seems to happen more often when the processor runs on a daily interval. My flows use NiFi runtime 1.27.0.2.3.14.0-14 and I'm on CDP Public Cloud on Azure.
```
org.apache.hadoop.fs.azurebfs.contracts.exceptions.SASTokenProviderException: Failed to acquire a SAS token for create-file on [my-data-warehouse-bucket-and-table]/metadata/bb545710-14ea-4b07-b0f5-668978be4e8d-m1.avro due to org.apache.hadoop.security.AccessControlException: org.apache.ranger.raz.intg.RangerRazException: <!doctype html><html lang="en"><head><title>HTTP Status 401 – Unauthorized</title><style type="text/css">body {font-family:Tahoma,Arial,sans-serif;} h1, h2, h3, b {color:white;background-color:#525D76;} h1 {font-size:22px;} h2 {font-size:16px;} h3 {font-size:14px;} p {font-size:12px;} a {color:black;} .line {height:1px;background-color:#525D76;border:none;}</style></head><body><h1>HTTP Status 401 – Unauthorized</h1><hr class="line" /><p><b>Type</b> Status Report</p><p><b>Message</b> org.apache.hadoop.security.authentication.util.SignerException: Invalid signature</p><p><b>Description</b> The request has not been applied to the target resource because it lacks valid authentication credentials for that resource.</p><hr class="line" /><h3>Apache Tomcat/8.5.96</h3></body></html>; HttpStatus: 401
at org.apache.hadoop.fs.azurebfs.services.AbfsClient.appendSASTokenToQuery(AbfsClient.java:1233)
at org.apache.hadoop.fs.azurebfs.services.AbfsClient.appendSASTokenToQuery(AbfsClient.java:1199)
at org.apache.hadoop.fs.azurebfs.services.AbfsClient.createPath(AbfsClient.java:396)
at org.apache.hadoop.fs.azurebfs.AzureBlobFileSystemStore.conditionalCreateOverwriteFile(AzureBlobFileSystemStore.java:625)
at org.apache.hadoop.fs.azurebfs.AzureBlobFileSystemStore.createFile(AzureBlobFileSystemStore.java:568)
at org.apache.hadoop.fs.azurebfs.AzureBlobFileSystem.create(AzureBlobFileSystem.java:335)
at org.apache.hadoop.fs.FileSystem.create(FileSystem.java:1177)
at org.apache.hadoop.fs.FileSystem.create(FileSystem.java:1157)
at org.apache.hadoop.fs.FileSystem.create(FileSystem.java:1046)
at org.apache.iceberg.hadoop.HadoopOutputFile.createOrOverwrite(HadoopOutputFile.java:85)
... 19 common frames omitted
```
Created 12-03-2024 09:07 AM
@MattWho @SAMSAL @venkatsambath Hi! Do you have any insights here? Thanks!
Regards,
Diana Torres
Created 12-04-2024 02:38 AM
Hi,
we experience the very same issue, but on AWS. It is Kerberos authentication (authN) related.
The Kerberos ticket expires, which leads to failures whenever an AWS STS or Azure SAS token is about to be acquired.
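To illustrate the mechanism: below is a minimal Java sketch, assuming a keytab-based login, of how a Hadoop client re-acquires its TGT before the write that triggers the SAS/STS request. The principal, keytab path, and ABFS URI are placeholders, and inside NiFi the Kerberos credentials configured on the processor or controller service normally handle this re-login for you; this is only roughly what the client relies on under the hood.
```java
import java.security.PrivilegedExceptionAction;
import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.security.UserGroupInformation;

public class TgtRenewalSketch {
    public static void main(String[] args) throws Exception {
        Configuration conf = new Configuration();
        conf.set("hadoop.security.authentication", "kerberos");
        UserGroupInformation.setConfiguration(conf);

        // Keytab-based login lets the client re-acquire its own TGT;
        // principal and keytab path are placeholders.
        UserGroupInformation ugi = UserGroupInformation.loginUserFromKeytabAndReturnUGI(
                "etl-service@EXAMPLE.COM",
                "/etc/security/keytabs/etl-service.keytab");

        // Re-login if the ticket is near expiry, *before* the operation that
        // requests the SAS/STS token. A stale TGT is what surfaces as the
        // 401 "Invalid signature" from the RAZ endpoint.
        ugi.checkTGTAndReloginFromKeytab();

        // Hypothetical ABFS path, only to show where the renewed credentials are used.
        Path out = new Path("abfs://container@account.dfs.core.windows.net/tmp/tgt-probe");
        ugi.doAs((PrivilegedExceptionAction<Void>) () -> {
            FileSystem fs = out.getFileSystem(conf);
            fs.create(out, true).close();
            return null;
        });
    }
}
```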
Created 12-04-2024 04:52 AM
Good to know it's not an Azure-specific issue, then. Thanks @DanielR
Created 12-10-2024 04:33 AM
Do you know how to keep it from expiring, or how to renew the token from within NiFi?
Created 12-09-2024 05:07 PM
@ipson Has the reply helped resolve your issue? If so, please mark the appropriate reply as the solution, as it will make it easier for others to find the answer in the future. Thanks.
Regards,
Diana Torres