Support Questions
Find answers, ask questions, and share your expertise

No sample data added to Apache Atlas Server

Explorer

Hi,

I have created HDP on AWS, but the Atlas web UI is not working. I have installed Atlas, HBase, Kafka, and Ambari Infra (Solr). I tried to load the sample model and data using:

bin/quick_start.py  http://localhost:21000/

But it throws an exception:

Creating sample types:
Exception in thread "main" org.apache.atlas.AtlasServiceException: Metadata service API org.apache.atlas.AtlasBaseClient$APIInfo@5d534f5d failed with status 409 (Conflict)
Response Body ({"errorCode":"ATLAS-409-00-001","errorMessage":"Given type Dimension already exists"})
    at org.apache.atlas.AtlasBaseClient.callAPIWithResource(AtlasBaseClient.java:337)
    at org.apache.atlas.AtlasBaseClient.callAPIWithResource(AtlasBaseClient.java:287)
    at org.apache.atlas.AtlasBaseClient.callAPI(AtlasBaseClient.java:429)
    at org.apache.atlas.AtlasClientV2.createAtlasTypeDefs(AtlasClientV2.java:217)
    at org.apache.atlas.examples.QuickStartV2.createTypes(QuickStartV2.java:191)
    at org.apache.atlas.examples.QuickStartV2.runQuickstart(QuickStartV2.java:147)
    at org.apache.atlas.examples.QuickStartV2.main(QuickStartV2.java:132)
No sample data added to Apache Atlas Server.

Expert Contributor

The existing quick start is unforgiving, in the sense that it stops (throws an exception) if a type it is trying to create already exists in the database.

If you don't have any data in the database, I would recommend truncating the Atlas tables and then re-running quick start.

Using HBase shell (hbase shell) you could use:

  • truncate 'ATLAS_ENTITY_AUDIT_EVENTS'
  • truncate 'atlas_titan'

Hope this helps.

Explorer

Hi @Ashutosh Mestry,

I am not able to truncate the HBase table, or even disable it. When I tried to truncate the HBase table, it threw this error:

Truncating 'ATLAS_ENTITY_AUDIT_EVENTS' table (it may take a while): ERROR: Unknown table ATLAS_ENTITY_AUDIT_EVENTS!

I created HDP on AWS, so when I tried to disable the HBase table it threw a permission error:

ERROR: org.apache.hadoop.hbase.security.AccessDeniedException: Insufficient permissions for user 'cloudbreak' (action=create)

I tried the below command as the atlas user:

su atlas -c '/usr/hdp/current/atlas-server/bin/quick_start.py'

I only have the Atlas UI user (admin) and password (admin). I am not sure which password I should use here.

Expert Contributor

By default, the user name and password are the same (both are admin).

I am not familiar with HDP on AWS.

One other thing you could potentially try is to use curl commands to delete the existing classifications.

First, retrieve the classification definition using:

curl -X GET -u admin:admin -H 'Content-Type: application/json' "http://localhost:21000/api/atlas/v2/types/classificationdef/name/PII" > pii.json

Now, this would work only if you don't have any entities associated with that classification. You will also need to massage the contents you got from the previous step. I have attached a sample (pii-colljson.zip).

curl -X DELETE -u admin:admin -H 'Content-Type: application/json' -d @pii-coll.json http://localhost:21000/api/atlas/v2/types/typedefs
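The "massaging" step above can be sketched in Python. This is only an assumption of what the attached sample looks like: the DELETE /api/atlas/v2/types/typedefs endpoint takes an AtlasTypesDef envelope, so the single classification definition fetched into pii.json has to be wrapped in a "classificationDefs" list (the attribute values below are placeholders, not the real PII definition):

```python
import json

# Placeholder standing in for the JSON fetched from
# GET .../types/classificationdef/name/PII (real contents will differ).
pii_def = {
    "name": "PII",
    "category": "CLASSIFICATION",
    "superTypes": [],
    "attributeDefs": [],
}

# Wrap the single definition in the AtlasTypesDef envelope expected by
# DELETE /api/atlas/v2/types/typedefs; the other arrays stay empty.
payload = {
    "enumDefs": [],
    "structDefs": [],
    "classificationDefs": [pii_def],
    "entityDefs": [],
}

# Write the file referenced by the curl -d @pii-coll.json command.
with open("pii-coll.json", "w") as f:
    json.dump(payload, f, indent=2)
```

With pii-coll.json written this way, the DELETE command above should remove the classification, provided no entities still carry it.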

Explorer

Hi @Ashutosh Mestry,

I found the solution. I added the sample data to the Atlas server successfully using the below command:

sudo su atlas -c '/usr/hdp/current/atlas-server/bin/quick_start.py'

Thank you for your help.

Expert Contributor

Thanks for letting me know. I am glad your problem is solved.
