Member since: 02-13-2017
Posts: 59
Kudos Received: 0
Solutions: 1
My Accepted Solutions
| Title | Views | Posted |
|---|---|---|
| | 2201 | 11-01-2017 12:30 PM |
02-25-2022
03:51 PM
1 Kudo
@regeamor As this is an older post, you would have a better chance of receiving a resolution by starting a new thread. A new thread is also an opportunity to provide details specific to your environment, which will help others give you a more accurate answer. You can link this thread as a reference in your new post. Thanks!
07-25-2021
01:34 PM
@USMAN_HAIDER Did you perform the step below? Kerberos must be specified as the security mechanism for the Hadoop infrastructure, starting with the HDFS service. Enable Cloudera Manager Server security for the cluster on an HDFS service; after you do so, Cloudera Manager Server automatically enables Hadoop security on the MapReduce and YARN services associated with that HDFS service. In the Cloudera Manager Admin Console:
1. Select Clusters > HDFS-n.
2. Click the Configuration tab.
3. Select HDFS-n for the Scope filter.
4. Select Security for the Category filter.
5. Scroll (or search) to find the Hadoop Secure Authentication property.
6. Click the Kerberos button to select Kerberos.
Please revert. An equivalent change via the Cloudera Manager REST API is sketched below.
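For reference, a minimal sketch of the same change through the Cloudera Manager REST API. The host, API version, cluster name, service name, and credentials below are placeholders, and the property name hadoop_security_authentication should be verified against your CM version:

```python
# Hedged sketch: set the HDFS "Hadoop Secure Authentication" property to kerberos
# through the Cloudera Manager REST API. All names and credentials are assumptions.
import requests

CM_API = "https://cm-host:7183/api/v41"   # assumed CM host and API version
AUTH = ("admin", "admin-password")        # assumed CM admin credentials

resp = requests.put(
    f"{CM_API}/clusters/Cluster1/services/HDFS-1/config",
    auth=AUTH,
    json={"items": [{"name": "hadoop_security_authentication", "value": "kerberos"}]},
    verify=False,  # only if CM uses a self-signed TLS certificate
)
resp.raise_for_status()
print(resp.json())
```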
12-14-2020
08:22 PM
@ASIF123 Can you share the stack trace of the error, and also the commands you used to delete the notebook?
12-13-2020
04:32 AM
Hi, As per this link: https://zeppelin.apache.org/docs/0.8.0/usage/rest_api/interpreter.html#restart-an-interpreter to restart a Zeppelin interpreter we need to know the interpreter id, which we are unable to find. Also, an interpreter restart can only be performed by an admin user, so those credentials also need to be passed via the script. Thanks, ASIF.
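For reference, the interpreter id is returned by the interpreter-settings endpoint of the same REST API. A minimal sketch, assuming Zeppelin runs at http://zeppelin-host:8080 with Shiro login enabled (the interpreter name and credentials are placeholders):

```python
# Hedged sketch: log in as an admin, look up the interpreter id by name,
# then restart that interpreter via the Zeppelin 0.8 REST API.
import requests

BASE = "http://zeppelin-host:8080/api"   # assumed Zeppelin URL
session = requests.Session()

# Shiro login; only an admin account may restart interpreters
session.post(f"{BASE}/login",
             data={"userName": "admin", "password": "admin-password"}).raise_for_status()

# GET /api/interpreter/setting lists all interpreter settings with their ids
settings = session.get(f"{BASE}/interpreter/setting").json()["body"]
interpreter_id = next(s["id"] for s in settings if s["name"] == "spark")

# PUT /api/interpreter/setting/restart/<id> restarts the interpreter
session.put(f"{BASE}/interpreter/setting/restart/{interpreter_id}").raise_for_status()
```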
11-02-2020
02:57 AM
Asif, please see if the table creation query below helps. It is used while restoring the HBase table:
create 'atlas_titan',
  {NAME => 'e', BLOOMFILTER => 'ROW', VERSIONS => '1', IN_MEMORY => 'false', KEEP_DELETED_CELLS => 'FALSE', DATA_BLOCK_ENCODING => 'FAST_DIFF', TTL => '2592000', COMPRESSION => 'GZ', MIN_VERSIONS => '0', BLOCKCACHE => 'true', BLOCKSIZE => '65536', REPLICATION_SCOPE => '0'},
  {NAME => 'g', BLOOMFILTER => 'ROW', VERSIONS => '1', IN_MEMORY => 'false', KEEP_DELETED_CELLS => 'FALSE', DATA_BLOCK_ENCODING => 'FAST_DIFF', TTL => '2592000', COMPRESSION => 'GZ', MIN_VERSIONS => '0', BLOCKCACHE => 'true', BLOCKSIZE => '65536', REPLICATION_SCOPE => '0'},
  {NAME => 'i', BLOOMFILTER => 'ROW', VERSIONS => '1', IN_MEMORY => 'false', KEEP_DELETED_CELLS => 'FALSE', DATA_BLOCK_ENCODING => 'FAST_DIFF', TTL => '2592000', COMPRESSION => 'GZ', MIN_VERSIONS => '0', BLOCKCACHE => 'true', BLOCKSIZE => '65536', REPLICATION_SCOPE => '0'},
  {NAME => 'l', BLOOMFILTER => 'ROW', VERSIONS => '1', IN_MEMORY => 'false', KEEP_DELETED_CELLS => 'FALSE', DATA_BLOCK_ENCODING => 'FAST_DIFF', TTL => '2592000', COMPRESSION => 'GZ', MIN_VERSIONS => '0', BLOCKCACHE => 'true', BLOCKSIZE => '65536', REPLICATION_SCOPE => '0'},
  {NAME => 'm', BLOOMFILTER => 'ROW', VERSIONS => '1', IN_MEMORY => 'false', KEEP_DELETED_CELLS => 'FALSE', DATA_BLOCK_ENCODING => 'FAST_DIFF', TTL => '2592000', COMPRESSION => 'GZ', MIN_VERSIONS => '0', BLOCKCACHE => 'true', BLOCKSIZE => '65536', REPLICATION_SCOPE => '0'},
  {NAME => 's', BLOOMFILTER => 'ROW', VERSIONS => '1', IN_MEMORY => 'false', KEEP_DELETED_CELLS => 'FALSE', DATA_BLOCK_ENCODING => 'FAST_DIFF', TTL => '2592000', COMPRESSION => 'GZ', MIN_VERSIONS => '0', BLOCKCACHE => 'true', BLOCKSIZE => '65536', REPLICATION_SCOPE => '0'}
10-29-2020
12:22 AM
Hello @ASIF123 Atlas Janus is an HBase table created with a column-family specification only. The DDL of the concerned Atlas table is available in the HMaster UI and Master logs when the Atlas service is initialised for the first time. However, columns are added over time, and each row of the table can carry different column qualifiers. In short, there is no static definition of the Atlas Janus table. Any Phoenix table or view mapping requires explicit specification of the column family and column qualifier, so the Phoenix table or view DDL will be specific to the customer's environment. Your team can review the Atlas Janus table output via a scan and confirm the column qualifiers against each column family; the Phoenix table or view can then be created accordingly, along the lines sketched below. - Smarak
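As an illustration only: the table name, column family, and qualifier below are placeholders that must be replaced with values observed in your own scan output. The sketch assumes the Phoenix Query Server is reachable and the python-phoenixdb client is installed:

```python
# Hedged sketch: create a Phoenix view over the Atlas Janus HBase table with an
# explicit column family/qualifier mapping. Replace the placeholder qualifier
# with one confirmed from your scan of the table.
import phoenixdb

conn = phoenixdb.connect("http://pqs-host:8765/", autocommit=True)  # assumed PQS URL
cursor = conn.cursor()

cursor.execute("""
    CREATE VIEW IF NOT EXISTS "atlas_janus" (
        "pk" VARBINARY PRIMARY KEY,            -- maps the HBase row key
        "g"."placeholder_qualifier" VARBINARY  -- column family "g", qualifier from your scan
    )
""")
```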
09-17-2018
03:28 PM
@ASIF Khan Remove the users.xml and authorizations.xml files from the NiFi installation directory, then restart NiFi. The authorizer will rebuild new versions of these files based on the current configuration in your authorizers.xml file.
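A minimal sketch of that procedure, assuming NiFi is installed under /opt/nifi with the generated files in its conf directory (paths are placeholders):

```python
# Hedged sketch: move the generated users.xml and authorizations.xml aside
# (a backup rather than an outright delete), then restart NiFi so the
# authorizer regenerates them from authorizers.xml.
import shutil
import subprocess
from pathlib import Path

NIFI_HOME = Path("/opt/nifi")        # assumed install directory
CONF_DIR = NIFI_HOME / "conf"

for name in ("users.xml", "authorizations.xml"):
    f = CONF_DIR / name
    if f.exists():
        shutil.move(str(f), str(f) + ".bak")

subprocess.run([str(NIFI_HOME / "bin" / "nifi.sh"), "restart"], check=True)
```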
02-26-2018
08:09 AM
@ASIF Khan Please refer to the following link to learn more about Shiro authentication and roles: https://zeppelin.apache.org/docs/0.6.2/security/shiroauthentication.html
09-12-2017
03:24 PM
1 Kudo
Hi @ASIF Khan Hadoop 2.6.0 was HDP 2.2.x. You can view the chart near the bottom of this page, which shows the HDP versions, Hadoop versions, and component versions: https://hortonworks.com/products/data-center/hdp/
11-01-2017
12:30 PM
The issue has been resolved: we had to enable the Backup Master for HBase.