Member since: 11-06-2016
Posts: 42
Kudos Received: 25
Solutions: 3
My Accepted Solutions
Title | Views | Posted
---|---|---
 | 7208 | 05-17-2017 01:38 PM
 | 5917 | 02-07-2017 01:06 PM
 | 4158 | 03-08-2016 07:25 PM
10-22-2019
07:50 PM
Hi @Jonas Straub, following your article I created a collection with the curl command below and got a 401 error:

curl --negotiate -u : 'http://myhost:8983/solr/admin/collections?action=CREATE&name=col&numShards=1&replicationFactor=1&collection.configName=_default&wt=json'

{
  "responseHeader":{ "status":0, "QTime":31818 },
  "failure":{
    "myhost:8983_solr":"org.apache.solr.client.solrj.impl.HttpSolrClient$RemoteSolrException: Error from server at http://myhost:8983/solr: Expected mime type application/octet-stream but got text/html. <html><head><title>Error 401 Authentication required</title></head><body><h2>HTTP ERROR 401</h2><p>Problem accessing /solr/admin/cores. Reason: Authentication required</p></body></html>"
  }
}

While debugging the Solr source code I found that this exception is returned by coreContainer.getZkController().getOverseerCollectionQueue().offer(Utils.toJson(m), timeout), so I suspect Solr is not authenticating to ZooKeeper: when I replace the Kerberized ZooKeeper with a non-Kerberos ZooKeeper, the collection is created successfully. How can I solve this with a Kerberized ZooKeeper?
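A minimal sketch of how Solr is typically given ZooKeeper credentials in this kind of setup; the paths, keytab, and principal below are placeholders, not the article's exact configuration. Solr's internal ZooKeeper client reads the Client section of a JAAS file passed to the JVM, separately from the SPNEGO settings used for HTTP.

```bash
# Hypothetical paths and principal -- adjust to the cluster; this is a sketch, not a definitive config.
cat > /etc/solr/conf/zk-jaas.conf <<'EOF'
Client {
  com.sun.security.auth.module.Krb5LoginModule required
  useKeyTab=true
  keyTab="/etc/security/keytabs/solr.service.keytab"
  storeKey=true
  useTicketCache=false
  principal="solr/myhost@EXAMPLE.COM";
};
EOF

# In solr.in.sh: point Solr's JVM at the JAAS file so its embedded ZooKeeper client
# can authenticate to the Kerberized ensemble.
SOLR_OPTS="$SOLR_OPTS -Djava.security.auth.login.config=/etc/solr/conf/zk-jaas.conf"
```

If Solr cannot authenticate to ZooKeeper, operations that go through the Overseer queue (such as collection CREATE) can fail with exactly this kind of opaque error on the Collections API.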
03-21-2018
05:20 AM
Hi all, I am facing the same issue. After enabling Kerberos, the Impala services do not start. I tried to start them manually, but that failed as well. In the UI all three instances show as started, but when I run the impala-shell command I get the error, and the Java processes for Impala are not present either. My hostname is also in lowercase. Please help me troubleshoot this issue. Thanks in advance.
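Not a definitive procedure, just a few hedged first checks for this situation; the keytab path, principal, and log location below are placeholders and will differ on a managed cluster.

```bash
# Placeholder paths/principal -- substitute the keytab actually generated for the Impala daemon.
klist -kt /path/to/impala.keytab                           # confirm the principal (host part must be the lowercase FQDN)
kinit -kt /path/to/impala.keytab impala/$(hostname -f)     # verify a ticket can actually be obtained with that keytab
grep -iE 'kerberos|gssapi' /var/log/impalad/impalad.ERROR  # look for authentication errors in the daemon log
```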
02-07-2017
01:06 PM
@Ajay @prsingh A Ranger HDFS policy was configured, but since Hadoop ACLs are in place it should have let the falcon user in. As a workaround we tried a different path and that worked. We have been unable to reproduce the problem since, so I guess we can close this ticket for now. Thanks.
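For reference, a hedged way to double-check the claim that the HDFS ACLs should have admitted the falcon user; the path below is a placeholder, not the one from this ticket.

```bash
hdfs dfs -getfacl /apps/falcon/staging   # POSIX ACL entries the NameNode falls back to
hdfs dfs -ls -d   /apps/falcon/staging   # owner/group/permission bits on the directory itself
```

With the Ranger HDFS plugin enabled, the Ranger audit view also shows whether access was granted by a Ranger policy or by the fallback to HDFS permissions/ACLs, which helps explain behaviour like this.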
01-04-2017
10:09 PM
@rgangappa After restarting Ambari I saw the same issue, but now it works. I am still not sure why this happened.
08-30-2016
04:37 PM
@ssoldatov For some reason my syntax is not coming through properly. I did put in the ZooKeeper node, and I want to use a specific keytab.
07-26-2016
04:47 AM
1 Kudo
@Jonas Straub Thanks Jonas. The issue above turned out to be caused by a hard-coded value for the index lock in solr.in.sh, but I am also seeing the issue you mention on one of the SolrCloud nodes. It makes sense to increase the sleep value for the Solr graceful-shutdown process.
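For reference, a sketch of the two knobs mentioned above; the values are assumptions, and the SOLR_STOP_WAIT variable only exists in newer Solr releases (older bin/solr scripts hard-code the shutdown sleep and have to be edited in place).

```bash
# solr.in.sh -- set the index lock type explicitly instead of relying on a hard-coded value
SOLR_OPTS="$SOLR_OPTS -Dsolr.lock.type=hdfs"

# Graceful-shutdown wait (newer Solr releases); a longer wait gives cores more time
# to close cleanly and release their index locks before the script force-kills the JVM.
SOLR_STOP_WAIT=180
```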
03-04-2016
04:01 PM
1 Kudo
@Jagdish Saripella See this http://atlas.incubator.apache.org/Security.html
03-08-2016
07:25 PM
1 Kudo
@Neeraj Sabharwal @vpoornalingam @Artem Ervits Update: looking at Admin > Stack and Versions > Versions, I found that HDP (on one of the environments) had not been finalized to 2.3. After finalizing the stack, the Atlas software is now picked up. Thanks all for your help.
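As a side note, the same finalization state can be checked from the Ambari REST API; the host, credentials, and cluster name below are placeholders.

```bash
# Lists the stack versions known to the cluster along with their state (e.g. CURRENT vs. not yet finalized).
curl -u admin:admin "http://ambari-host:8080/api/v1/clusters/mycluster/stack_versions"
```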
07-21-2016
12:30 PM
I am using the Atlas-Ranger sandbox machine. Is it possible to delete the tags that are present in the Atlas UI? If yes, how can we delete them using the REST API or some other technique? Here is the Atlas UI I am using on the sandbox machine: atlas-home-screen.png
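A hedged sketch of the REST call that removes a tag ("trait") from a single entity in the sandbox-era Atlas v1 API; the host, port, credentials, entity GUID, and tag name are placeholders.

```bash
# Removes the named trait (tag) from one entity; the GUID can be taken from the entity's details view in the Atlas UI.
curl -u admin:admin -X DELETE \
  "http://sandbox.hortonworks.com:21000/api/atlas/entities/<entity-guid>/traits/<tag-name>"
```

As far as I know, deleting the tag definition itself (the type) was not supported in those early Atlas releases, only removing it from entities.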
01-27-2016
06:51 AM
1 Kudo
Hi @Jagdish Saripella Okay, I tried to run your script on my sandbox and found that you need commas in your "STORE raw_data INTO 'hbase..." command, like this:

STORE raw_data INTO 'hbase://test1' USING org.apache.pig.backend.hadoop.hbase.HBaseStorage('test_data:firstname,test_data:lastname,test_data:age,test_data:profession');

You also have to pre-create the table, for example from the HBase shell: create 'test1', 'test_data'. If you keep the header row it will be loaded as well, with rowkey='Custno'; most likely that is not what you want. Hint: next time you have trouble with Pig, switch debug mode on by running "SET debug 'on'". That is how I discovered that HBaseStorage was trying to add a single column using all the text in the brackets without commas; with commas it correctly creates four columns.
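A small shell sketch of the end-to-end steps described above; the Pig script file name is a placeholder.

```bash
echo "create 'test1', 'test_data'" | hbase shell   # pre-create the table and its column family
pig -x mapreduce load_to_hbase.pig                 # script contains SET debug 'on'; and the corrected STORE statement
```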