Member since
04-10-2017
25
Posts
2
Kudos Received
1
Solution
My Accepted Solutions
Title | Views | Posted
---|---|---
 | 2482 | 02-10-2021 08:35 AM
03-24-2021
10:41 AM
Hi Christ, Hope you are doing well! I see that you would like to display the Atlas UI in different languages. On testing, I didn't need to make any changes to the service configs; this can be achieved through the web browser settings themselves: a) In Chrome: go to Settings >> search for 'language' >> open the Languages tab and move your preferred language to the top of the list. b) Alternatively, right-click on the Atlas UI webpage, click 'Translate to', and select the required language. That quickly changes the language for just your current Atlas UI session, if that suits your requirement. Ref: https://support.google.com/chrome/answer/173424?co=GENIE.Platform%3DDesktop&hl=en Regards, Shantanu Singh
02-18-2021
07:21 PM
Hi CV, I see that you have set up standalone Atlas on a CDH edge node. Though I haven't tested this setup, Apache Atlas should ideally ship with the pre-defined system types (DataSet). Let me know if you have tried creating custom entities via the Atlas API to check whether they get reflected in the Atlas UI, or whether the Atlas Hive import script populates the metadata[1]: <atlas package>/hook-bin/import-hive.sh Also, are you observing any exceptions in the Atlas Metadata Server application logs? You can also attach your Atlas configs here to compare and validate against CDP/HDP cluster configs in case something is missing. [1] https://atlas.apache.org/1.2.0/Hook-Hive.html
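If it helps, here is a minimal sketch of the JSON body you would POST to the Atlas v2 entity API to create a custom entity and check whether it shows up in the UI. The type, qualifiedName, cluster name, host, and credentials below are illustrative assumptions, not values from this thread:

```python
import json

# Minimal Atlas v2 entity payload for a hive_table entity.
# qualifiedName/cluster name are hypothetical placeholders.
entity = {
    "entity": {
        "typeName": "hive_table",
        "attributes": {
            "qualifiedName": "default.test_table@cluster1",
            "name": "test_table",
        },
    }
}

payload = json.dumps(entity)
# This body would be POSTed to the v2 entity endpoint, e.g.:
#   curl -u admin:admin -H 'Content-Type: application/json' \
#        -X POST http://<atlas-host>:21000/api/atlas/v2/entity -d @entity.json
print(payload)
```

If the POST succeeds but the entity still doesn't appear in the UI, that points at the UI/index side rather than type registration.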
02-11-2021
04:55 PM
1 Kudo
Hello there, The below property is required if you would like to enable RPC encryption[1]: hadoop.rpc.protection = privacy authentication: authentication only (the default); integrity: integrity check in addition to authentication; privacy: data encryption in addition to integrity. RPC encryption[2]: The most common way for a client to interact with a Hadoop cluster is through RPC. A client connects to a NameNode (NN) over the RPC protocol to read or write a file. RPC connections in Hadoop use Java's Simple Authentication and Security Layer (SASL), which supports encryption. When the hadoop.rpc.protection property is set to 'privacy', data sent over RPC is encrypted with symmetric keys. Ref [1]: https://docs.cloudera.com/HDPDocuments/HDP3/HDP-3.1.5/configuring-wire-encryption/content/enabling_rpc_encryption.html Kindly check the additional references below for wire encryption, including an RPC encryption blog post with a detailed explanation[3]: Ref [2]: https://docs.cloudera.com/HDPDocuments/HDP3/HDP-3.1.5/configuring-wire-encryption/content/wire_encryption.html Ref [3]: https://blog.cloudera.com/wire-encryption-hadoop/ Ref [4]: Apache Jira Ref: Hadoop in Secure Mode: https://hadoop.apache.org/docs/r2.8.0/hadoop-project-dist/hadoop-common/SecureMode.html Hope this helps! Let me know if you have any queries.
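For reference, setting this in core-site.xml (or the equivalent Ambari/Cloudera Manager safety valve for core-site) would look like the following sketch:

```xml
<!-- core-site.xml: enable data encryption on Hadoop RPC -->
<property>
  <name>hadoop.rpc.protection</name>
  <value>privacy</value>
</property>
```

Note that clients and servers negotiate the protection level over SASL, so the value (or list of values) must be compatible across the cluster.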
02-10-2021
08:35 AM
Hi David, Is the issue observed on a specific host? Can you try moving the affected keytabs to another directory location, and then from the Ambari UI perform "Only regenerate keytabs for missing hosts and components"? The Ambari Agent logs should report this with a trace like the below: Missing keytabs:
Keytab: /etc/security/keytabs/smokeuser.headless.keytab Principal: Ambari ideally modifies the keytab file if its content does not match the keytabs in the Ambari cache. Hope this helps!
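As a small illustration, you could pull the missing keytab paths out of an Ambari Agent log excerpt like the trace above. The log snippet below is modeled on that trace; the principal value shown is a hypothetical example:

```python
import re

# Sample Ambari Agent log excerpt, modeled on the "Missing keytabs"
# trace quoted above; the principal is a hypothetical placeholder.
log_excerpt = """Missing keytabs:
 Keytab: /etc/security/keytabs/smokeuser.headless.keytab Principal: ambari-qa@EXAMPLE.COM"""

# Extract every keytab path reported as missing.
missing = re.findall(r"Keytab:\s+(\S+)", log_excerpt)
print(missing)
```

That gives you the list of keytab files to move aside before triggering the regeneration from Ambari.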
02-09-2021
06:11 PM
Hello there, I understand your use case is to save some HDFS space. I haven't tested compression options for HDFS-level files[2]. Alternatively, you may consider reviewing HDFS Erasure Coding[1] if that suits your requirement: Erasure Coding in HDFS significantly reduces storage overhead while achieving similar or better fault tolerance through the use of parity cells (similar to RAID 5). Prior to the introduction of EC, HDFS relied exclusively on 3x replication for fault tolerance, meaning that a 1 GB file used 3 GB of raw disk space. With EC, the same level of fault tolerance can be achieved using only 1.5 GB of raw disk space. Please refer to the article below[1] for more insight into EC: Ref [1]: https://blog.cloudera.com/hdfs-erasure-coding-in-production/ Ref [2]: https://docs.cloudera.com/cloudera-manager/7.2.6/managing-clusters/topics/cm-choosing-configuring-data-compression.html
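The storage numbers above work out as follows. This sketch assumes the common RS(6,3) Reed-Solomon policy (6 data cells plus 3 parity cells per stripe), which is where the 1.5x figure comes from:

```python
# Raw disk usage for a 1 GB file: 3x replication vs RS(6,3) erasure coding.
file_gb = 1.0

# 3x replication: every block is stored three times.
replication_raw = file_gb * 3

# RS(6,3): 6 data cells + 3 parity cells per stripe -> 9/6 = 1.5x overhead.
data_cells, parity_cells = 6, 3
ec_raw = file_gb * (data_cells + parity_cells) / data_cells

print(replication_raw, ec_raw)  # 3.0 GB vs 1.5 GB of raw disk
```

Both schemes tolerate the loss of any 3 cells/replicas per group, which is why the fault tolerance is comparable despite the halved footprint.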
02-09-2021
05:19 PM
If you are still looking into this, the references below might help provide some insight into the AtlasRelationship v2 APIs: As with other typeDefs, an AtlasRelationshipDef has a name, and once created the RelationshipDef also has a guid; the name and the guid are the two ways a RelationshipDef is identified. Ref: https://stackoverflow.com/questions/57385463/simple-example-for-adding-relationships-between-atlas-entities Additional Refs: https://atlas.apache.org/api/v2/json_AtlasRelationshipDef.html https://atlas.apache.org/api/v2/resource_RelationshipREST.html
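As a rough sketch, a minimal AtlasRelationshipDef body for the v2 typedefs API could look like the below. The type names and end names here are hypothetical placeholders, not values from the linked references:

```python
import json

# Minimal AtlasRelationshipDef sketch; "my_table"/"my_column" are
# hypothetical entity types used only for illustration.
relationship_def = {
    "name": "my_table_columns",
    "typeVersion": "1.0",
    "relationshipCategory": "COMPOSITION",
    "endDef1": {"type": "my_table", "name": "columns",
                "isContainer": True, "cardinality": "SET"},
    "endDef2": {"type": "my_column", "name": "table",
                "isContainer": False, "cardinality": "SINGLE"},
}

# This would be POSTed inside a typedefs envelope to
# /api/atlas/v2/types/typedefs on the Atlas server.
payload = json.dumps({"relationshipDefs": [relationship_def]})
print(payload)
```

Once the def is registered, individual relationships between entity instances are created via the RelationshipREST endpoint linked above.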