Member since: 01-18-2016
Posts: 6
Kudos Received: 4
Solutions: 0
06-23-2016
06:22 PM
1 Kudo
I would like to inspect the underlying back-end graph data stored in the Atlas Titan graph DB. I am using the default Titan on BerkeleyDB. Do you know of any query tool that can do this? My main motivation is to delete some entities from the repository, since the default REST API does not support this yet.
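For illustration, this is roughly the kind of session I am hoping to be able to run. It is only a sketch based on the Gremlin console shipped with the Titan 0.5.x distribution; the properties file path is a placeholder for whatever storage configuration Atlas actually uses, and Atlas would have to be stopped first since BerkeleyDB JE only allows one process to open the store.
./bin/gremlin.sh                                        # from a Titan 0.5.x download
gremlin> g = TitanFactory.open('/path/to/atlas-titan.properties')   // placeholder path
gremlin> g.V.count()                                    // how many vertices are in the repo
gremlin> v = g.V.next(); v.getPropertyKeys()            // inspect one vertex's properties
gremlin> g.removeVertex(v); g.commit()                  // delete it and commit
I am aware that deleting vertices directly underneath Atlas could leave dangling references, so any pointers on a safer approach are welcome.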
Labels: Apache Atlas
04-19-2016
02:55 PM
I ran into the same problem, where Ambari says Sqoop is installed, but the sqoop directory is not there on the data nodes.
I am running a cluster, but it should be the same for the sandbox.
The current answer does not address this; the only way I found to fix it is to uninstall the Sqoop client and re-install it with Ambari.
Unfortunately, the current web UI does not allow uninstalling clients.
Fortunately, you can do it through API calls.
The command syntax is as follows:
URL=https://${AMBARI_HOST}/api/v1/clusters/${CLUSTER_NAME}/hosts/${HOST_FQDN}/host_components/SQOOP
curl -k -u admin:admin -H "X-Requested-By:ambari" -i -X DELETE $URL
After that, you can re-install the sqoop client from the Web UI.
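For example, with the variables filled in it looks roughly like this (the host names and cluster name below are made up, and the default admin:admin credentials are assumed):
URL=https://ambari.example.com/api/v1/clusters/mycluster/hosts/datanode01.example.com/host_components/SQOOP
curl -k -u admin:admin -H "X-Requested-By:ambari" -i -X DELETE $URL
Run the DELETE once per host that shows the stale Sqoop client, then add the client back from the host's page in the web UI.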
01-22-2016
03:58 PM
Thanks jwitt. I understand, but those parts talk about supporting different levels of access within one flow for different users. What I am looking for is creating different flows for the same user or for different users. Is the latter currently possible in NiFi? Or is having one flow per instance the underlying basic assumption of NiFi?
01-22-2016
12:39 AM
1 Kudo
We are currently building a multi-tenant Hadoop platform and would like to use NiFi as our data flow management tool. I am having a hard time figuring out how to create more than one NiFi data flow. The reason I want to do this is that the platform will potentially be used by different teams; we don't want everybody to keep adding processor groups to the same flow, and we don't want all users to be able to see other people's flows. I am still using anonymous authentication, so I don't know whether the flows will be per-user once I configure user authentication, which I am working on. Any help is appreciated.
Labels: Apache NiFi
01-20-2016
06:04 PM
2 Kudos
I have found out how to resolve the "Connection failed on host sandbox.hortonworks.com" failures for my sandbox. The fix is to add sandbox.hortonworks.com to the no_proxy variable in /etc/profile. It turns out I was receiving this error because I am using the sandbox behind a corporate proxy and have http_proxy and https_proxy set up. Even though sandbox.hortonworks.com resolves to a local IP in the /etc/hosts file, that local IP is not always localhost; it is an IP in the 10.0.X.X range if you are using NAT. As a result, the request was hitting the proxy first, which caused the failure.
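Concretely, the relevant part of my /etc/profile ended up looking roughly like this (the proxy URL is a placeholder for whatever your corporate proxy is):
# corporate proxy (placeholder value)
export http_proxy=http://proxy.example.com:8080
export https_proxy=http://proxy.example.com:8080
# keep sandbox traffic off the proxy
export no_proxy="localhost,127.0.0.1,sandbox.hortonworks.com"
You then need to source /etc/profile (or log out and back in) before the change takes effect for Ambari's checks.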
01-18-2016
05:11 PM
I have scanned all the Java classes in the Flume 1.5.2 lib, and I have two jars related to Kafka, kafka-client-0.8.2-beta.jar and kafka_2.10-0.8.2-beta.jar. However, they don't seem to provide a source or sink. Flume 1.6 contains explicit classes for a Kafka source and sink. Therefore, I doubt the Kafka source and sink work with Flume 1.5.2 in HDP 2.3+. I am also looking into ways to get Flume to work with Kafka in HDP 2.3. If any of you have working examples and can share your conf file, I would highly appreciate it.
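For reference, the kind of agent configuration I have in mind for the Flume 1.6 classes looks roughly like this; I have not verified it on HDP 2.3, and the agent name, topic, and ZooKeeper quorum below are placeholders:
# hypothetical agent: Kafka source -> memory channel -> logger sink
agent.sources = kafka-src
agent.channels = mem-ch
agent.sinks = log-sink
# Flume 1.6 Kafka source; ZooKeeper quorum and topic are placeholders
agent.sources.kafka-src.type = org.apache.flume.source.kafka.KafkaSource
agent.sources.kafka-src.zookeeperConnect = zk-host:2181
agent.sources.kafka-src.topic = my-topic
agent.sources.kafka-src.groupId = flume
agent.sources.kafka-src.channels = mem-ch
agent.channels.mem-ch.type = memory
agent.channels.mem-ch.capacity = 10000
# logger sink just to verify events are flowing
agent.sinks.log-sink.type = logger
agent.sinks.log-sink.channel = mem-ch
If anyone has something like this actually running against the Flume shipped with HDP 2.3, please correct me.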