Support Questions
Find answers, ask questions, and share your expertise

Zeppelin Cannot access - Public Cloud

New Contributor

Hi

 

I am having a challenge with access control for credentials while trying to execute a Spark process in Zeppelin. Can anyone help?

 

ERROR idbroker.AbstractIDBClient: Cloud Access Broker response: { "error": "There is no mapped role for the group(s) associated with the authenticated user.",

 

Regards

Lakshmi Segu

 


Contributor

Hello Lakshmi

 

- Does the role you are mapping to the user in the IDBroker Mappings section have the right S3 bucket specified? Or are you using the same bucket created during the Data Lake deployment?
	- Can you also make sure that Spark is configured to point to the S3 bucket [1]? For Spark, the S3 bucket name must be defined in the property "spark.yarn.access.hadoopFileSystems".
	Example: if using a DataHub cluster, access the DH in the Management Console > CM-UI > Clusters > Spark > Configuration > create a file named "spark-defaults.conf" (or update the existing file) with the property:
	spark.yarn.access.hadoopFileSystems=s3a://bucket_name

or 

DL > Manage Access > IDBroker Mappings > Edit > assign the Data Access Role.
DH > Manage Access > assign yourself the required roles.
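Putting the two steps together, here is a minimal sketch of the configuration and a quick verification from Zeppelin. The bucket name "my-data-bucket" and the file path are placeholders; substitute the bucket that your Data Access Role is actually mapped to:

```
# spark-defaults.conf (CM-UI > Clusters > Spark > Configuration)
# Grants the Spark job's delegation tokens access to the bucket.
spark.yarn.access.hadoopFileSystems=s3a://my-data-bucket
```

After redeploying the client configuration and restarting the Zeppelin interpreter, a simple read in a %spark paragraph (for example, spark.read.text("s3a://my-data-bucket/some-file.txt")) should succeed instead of returning the "no mapped role" error.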

 

Super Collaborator

Hello @LakshmiSegu 

 

We hope your query was addressed by Shehbaz's response. In summary:

(I) Ensure your username has an IDBroker Mapping (Actions > Manage Access > IDBroker Mappings).

(II) Include the "spark.yarn.access.hadoopFileSystems" parameter to point to the S3 path [1].
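For step (I), the mappings can also be checked from the CDP CLI rather than the UI. A minimal sketch, assuming the CLI is installed and configured and that the environment is named "my-environment" (a placeholder):

```
# Lists the user/group-to-cloud-role mappings for the environment;
# the authenticated user (or one of their groups) must appear here
# for IDBroker to hand out credentials.
cdp environments get-id-broker-mappings --environment-name my-environment
```

If the user is missing from the output, add a mapping under Actions > Manage Access > IDBroker Mappings and sync it to the environment.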

 

Regards, Smarak

 

[1] https://docs.cloudera.com/runtime/7.2.15/developing-spark-applications/topics/spark-s3.html

 
