Member since: 12-21-2021
Posts: 10
Kudos Received: 0
Solutions: 0
02-01-2022
08:45 AM
Hi, I have installed Ranger in my CDP cluster with Ranger Admin, Tagsync, and Usersync instances. I have a problem when trying to access the admin web UI (port 6080) with the credentials admin:admin or ranger:ranger: the interface shows "Unable to connect to DB", and the ranger-admin log contains:

WARN com.mchange.v2.resourcepool.BasicResourcePool: Having failed to acquire a resource, com.mchange.v2.resourcepool.BasicResourcePool@2bf8ec30 is interrupting all Threads waiting on a resource to check out. Will try again in response to new client requests.
WARN com.mchange.v2.resourcepool.BasicResourcePool: com.mchange.v2.resourcepool.BasicResourcePool$ScatteredAcquireTask@781133d5 -- Acquisition Attempt Failed!!! Clearing pending acquires. While trying to acquire a needed new resource, we failed to succeed more than the maximum number of allowed acquisition attempts (30). Last acquisition attempt exception: org.postgresql.util.PSQLException: FATAL: password authentication failed for user "ranger"

Is there a way to verify what the username and password for the web UI are? Otherwise, is there a way to create a Hive security policy through commands without accessing the UI? Thank you.
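For what it's worth, a sketch of two things that can be tried here, assuming a PostgreSQL-backed Ranger on a host and service names that are placeholders (your DB host, credentials, and Hive service name will differ): verifying the DB credentials Ranger Admin was configured with, and creating a Hive policy without the UI via Ranger's public v2 REST API.

```shell
# 1) The PSQLException points at the DB password Ranger Admin uses (db_user /
#    db_password in its configuration), not the web UI login. Test those DB
#    credentials directly against PostgreSQL (host and password are placeholders):
PGPASSWORD='<db_password_from_ranger_config>' psql -h <db_host> -U ranger -d ranger -c '\conninfo'

# 2) Once Ranger Admin itself is healthy, a Hive policy can be created through
#    the REST API instead of the UI. "cm_hive", the policy name, database, and
#    user below are all hypothetical examples:
curl -u admin:admin -H 'Content-Type: application/json' \
  -X POST 'http://<ranger-host>:6080/service/public/v2/api/policy' \
  -d '{
        "service": "cm_hive",
        "name": "sales_db_select",
        "resources": {
          "database": {"values": ["sales"]},
          "table":    {"values": ["*"]},
          "column":   {"values": ["*"]}
        },
        "policyItems": [{
          "users": ["analyst1"],
          "accesses": [{"type": "select", "isAllowed": true}]
        }]
      }'
```

Note that the REST call requires a working admin login, so the DB connectivity problem has to be fixed first either way.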
01-21-2022
05:44 AM
Indeed @DigitalPlumber, EvaluateXPath can't handle variables, only an XPath expression. QueryRecord might be an alternative if I were using another type of file, but since I'm using an XML file and that processor performs an SQL query, what could my parameters in the query be?
01-20-2022
02:41 AM
Hi @DigitalPlumber, thanks for your response. I'm actually dealing with an XML file, so I need to use EvaluateXPath to extract values from it (my flow is GetFile -> SplitXml -> EvaluateXPath). On the other hand, I have a CSV file that I'm extracting a value from (in another flow: GetFile -> UpdateAttribute -> QueryRecord -> ReplaceText -> UpdateAttribute). My goal is to pass the value from that UpdateAttribute into EvaluateXPath, so I can extract a value from my XML file based on the output value extracted from my CSV.
01-19-2022
08:38 AM
Hi, I'm trying to pass an output value of one processor dynamically into another and use it as a new property. Here is my flow: ReplaceText (gives the output) -> UpdateAttribute (I created a property whose value references the output) -> EvaluateXPath (I want to pass this value into a new property here). I tried passing it as ${property_name} and $property_name, but nothing works. Is it impossible to have a property reference another one in the EvaluateXPath processor? Thanks.
Labels:
- Apache NiFi
01-18-2022
01:23 AM
Thank you for your response. So do you think the data is sometimes lost because I shut down the virtual machines every day? It is very unusual, since the issue is not happening every time: sometimes HDFS is corrupt and shows missing blocks, and sometimes it's healthy from the start, so I don't know what to make of it.
01-17-2022
08:19 AM
I suggest you set your path with "\" instead of "/", for example: C:\Users\kiran\Downloads\mysql\mysql-connector-java-8.0.17.jar. It worked for me!
01-14-2022
06:14 AM
Thank you @GangWar. I had tried these steps before, and it turns out I had been modifying these parameters in the wrong core-site.xml file. It works just fine now 😉
12-27-2021
08:45 AM
Hello, when trying to access HBase through Hue, I get this error in the log: WARN org.eclipse.jetty.server.HttpChannel: handleException / org.apache.hadoop.security.authorize.AuthorizationException: User: hbase is not allowed to impersonate hue. Note that I have set up a Thrift Server instance on HBase and it is running fine.
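As a hedged pointer: an AuthorizationException of the form "User: X is not allowed to impersonate Y" usually means Hadoop's proxy-user (impersonation) settings don't allow the hbase user to act on behalf of hue. A minimal core-site.xml sketch (the wildcards are an assumption for testing; narrow them to your actual hosts and groups in production, and on CDP this would typically go into the core-site.xml safety valve in Cloudera Manager):

```xml
<!-- Sketch: allow the hbase user to impersonate other users (such as hue).
     "*" is deliberately broad; restrict hosts/groups for production use. -->
<property>
  <name>hadoop.proxyuser.hbase.hosts</name>
  <value>*</value>
</property>
<property>
  <name>hadoop.proxyuser.hbase.groups</name>
  <value>*</value>
</property>
```

The affected services need a restart after the change for the proxy-user configuration to take effect.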
Labels:
- Apache HBase
12-24-2021
06:40 AM
Try cleaning your metadata with the ./hbase-cleanup.sh --cleanAll command and restarting your services. If you get "Regionservers are not expired. Exiting without cleaning hbase data", stop the HBase service before running the command.
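A sketch of that sequence, assuming the script lives in your HBase bin directory (the path below is a placeholder, and note that --cleanAll wipes HBase's data in ZooKeeper and HDFS, so only use it when you can afford to lose that state):

```shell
# Stop HBase first (via Cloudera Manager or your init system), then:
cd /opt/hbase/bin            # path is an assumption; adjust to your install
./hbase-cleanup.sh --cleanAll
# Restart HBase (and dependent services such as the Thrift Server) afterwards.
```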
12-21-2021
07:41 AM
Hi, I am facing issues with HDFS in my Cloudera Manager cluster. I have a cluster of 4 virtual machines (1 master and 3 slaves) in the cloud, and at every shutdown and restart of the cluster/VMs, HDFS shows some missing blocks. I have found a workaround: deleting the missing files and restarting the service solves the problem, but sometimes the missing files contain important data related to other Hadoop services, which causes issues when they are removed. Is there another solution that avoids deleting data? Thank you.
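Before deleting anything, the standard HDFS CLI can at least show which files and DataNodes are affected; a diagnostic sketch (the file path is a placeholder, and these are run as the HDFS superuser):

```shell
# List the files that currently have corrupt or missing blocks:
hdfs fsck / -list-corruptfileblocks

# For a specific affected file, show its blocks and which DataNodes held them:
hdfs fsck /path/to/file -files -blocks -locations

# If the NameNode is stuck in safe mode because of the missing blocks:
hdfs dfsadmin -safemode leave
```

If the blocks reappear as healthy once all DataNodes are back up, the problem is likely that the DataNode data directories sit on non-persistent VM storage, which is worth checking before treating the files as lost.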
Labels:
- Cloudera Manager
- HDFS