Member since: 01-19-2017
Posts: 3679
Kudos Received: 632
Solutions: 372

My Accepted Solutions
| Title | Views | Posted |
|---|---|---|
| | 781 | 06-04-2025 11:36 PM |
| | 1358 | 03-23-2025 05:23 AM |
| | 672 | 03-17-2025 10:18 AM |
| | 2431 | 03-05-2025 01:34 PM |
| | 1591 | 03-03-2025 01:09 PM |
02-23-2016
07:46 AM
@Roberto Sancho I would go for SpagoBI; see these ratings. I have a PhD research paper from the University of Geneva that confirms this choice too.
02-22-2016
05:31 PM
1 Kudo
@Kunal Gaikwad "Failed to connect to node3.dtitsupport247.net:50075; No route to host"!
Your host node3 is not reachable; the most probable cause is an issue with the network settings or routing tables.
- Verify the entries in your /etc/hosts
- Check the default gateway
- Check that the firewall is off
Just check that you didn't miss a step in the attached doc, then retry!
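The three checks above can be run from the shell on the client node. A minimal sketch; the hostname is a placeholder (the original question's host was node3.dtitsupport247.net), so substitute your own:

```shell
#!/bin/sh
# Placeholder host -- replace with the DataNode that is unreachable.
HOST=localhost

# 1. Name resolution: does /etc/hosts (or DNS) know this host?
getent hosts "$HOST" || echo "no entry for $HOST -- fix /etc/hosts"

# 2. Routing: show the default gateway the kernel will use.
ip route show default 2>/dev/null || netstat -rn 2>/dev/null || echo "no routing tool found"

# 3. Firewall: on RHEL/CentOS 6 (typical for HDP 2.x), iptables is the usual culprit.
# service iptables status    # and, if needed: service iptables stop
```

If step 1 fails, fix /etc/hosts first; "No route to host" on a resolvable name usually points at step 2 or 3.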
02-22-2016
08:09 AM
@PURUSHOTHAM D If you are using Chrome, have a look at this doc, or just get a download manager for Windows.
02-22-2016
07:06 AM
1 Kudo
@Amit Sharma Good question. I am not sure how it is related to the protocol version; at the very least, the error message is definitely misleading. This is the only workaround I found. The issue is related to restricting access to the Hive metastore service by allowing it to impersonate only a subset of Kerberos users. This can be done by setting the hadoop.proxyuser.hive.groups property in core-site.xml on the Hive metastore host. The error has something to do with the org.apache.thrift.protocol client_protocol; my reasoning was to give the hive user a wildcard privilege, like root. There is a JIRA out there for this. As I see it resolved your problem, you can accept this as the answer. Cheers!
02-21-2016
10:43 PM
1 Kudo
@Amit Sharma Gracefully stop all the services using Ambari, then restart them all using Ambari. At times, after a reboot of the server, you will need to manually start those services that won't be started by Ambari, but subsequent Ambari Start All / Stop All operations will work correctly. Keep me posted.
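A Start All can also be issued through Ambari's REST API instead of the UI. A minimal sketch only; the host, credentials, and cluster name below are assumptions, and the leading echo makes it a dry run (remove it to actually send the request):

```shell
#!/bin/sh
# Assumed values -- replace with your Ambari host, credentials, and cluster name.
AMBARI=http://ambari-host:8080
CLUSTER=mycluster

# Dry run: print the request instead of sending it (drop 'echo' to execute).
echo curl -u admin:admin -H 'X-Requested-By: ambari' -X PUT \
  -d '{"RequestInfo":{"context":"Start All Services"},"Body":{"ServiceInfo":{"state":"STARTED"}}}' \
  "$AMBARI/api/v1/clusters/$CLUSTER/services"
```

The same endpoint with `"state":"INSTALLED"` performs a Stop All.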
02-21-2016
09:50 PM
1 Kudo
@Amit Sharma If you have Ambari, you can add the properties via Services -> HDFS -> Configs -> Advanced -> Custom core-site. Add the properties below:
hadoop.proxyuser.hive.hosts=*
hadoop.proxyuser.hive.groups=*
Then restart all the affected services.
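Outside Ambari, those two properties end up in core-site.xml on the NameNode/metastore hosts as entries like these (the wildcard values are from the workaround above; narrow them for production):

```xml
<!-- core-site.xml: allow the hive user to impersonate other users -->
<property>
  <name>hadoop.proxyuser.hive.hosts</name>
  <value>*</value>
</property>
<property>
  <name>hadoop.proxyuser.hive.groups</name>
  <value>*</value>
</property>
```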
02-20-2016
07:28 PM
@Prakash Punj Have a look at this doc: Fine-Grained Permissions with HDFS ACLs.
02-18-2016
08:12 PM
1 Kudo
@Cecilia Posadas Add the mysql-connector-java.jar library to the lib directory inside the Oozie project root directory, where the job.properties and workflow.xml files are located. A better solution is to add the mysql-connector-java-*.jar once to the share/lib/sqoop directory in HDFS. Please do that and let me know.
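A minimal sketch of the second option. The driver path and sharelib directory below are assumptions for a typical HDP install, and the HDFS commands are left commented since they need a live cluster:

```shell
#!/bin/sh
# Assumed local location of the JDBC driver on the Oozie host.
JAR=/usr/share/java/mysql-connector-java.jar

if [ -f "$JAR" ]; then
  echo "found JDBC driver: $JAR"
else
  echo "JDBC driver not found -- install mysql-connector-java first"
fi

# Then push it once into the Sqoop sharelib in HDFS (run as the oozie user)
# and tell Oozie to pick it up:
#   hdfs dfs -put "$JAR" /user/oozie/share/lib/lib_<timestamp>/sqoop/
#   oozie admin -oozie http://<oozie-host>:11000/oozie -sharelibupdate
```

The sharelib copy is done once per cluster, whereas the per-project lib directory has to be repeated for every workflow.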
02-17-2016
06:01 AM
@Aditya Goyal I would opt to install a MySQL server to resolve the stalemate!
02-16-2016
05:49 AM
2 Kudos
@Ojustwin Naik If you are using HDP 2.3, here is the solution; there was a JIRA for this that has already been resolved: link. Let me know if that helped.