Member since: 07-13-2020
Posts: 58
Kudos Received: 2
Solutions: 10
My Accepted Solutions
| Title | Views | Posted |
|---|---|---|
|  | 1221 | 09-04-2020 12:33 AM |
|  | 7776 | 08-25-2020 12:39 AM |
|  | 2439 | 08-24-2020 02:40 AM |
|  | 2162 | 08-21-2020 01:06 AM |
|  | 1155 | 08-20-2020 02:46 AM |
08-07-2020
02:25 AM
Can you try removing this: force_https_protocol=PROTOCOL_TLSv1_2? This setting was needed for previous versions of Ambari but probably not for the newer ones. Another thing to try is running without SSL, to determine whether the communication itself is broken or the SSL config is the problem. Hope this helps.
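If it helps, a sketch of where that setting typically lives on an agent host, assuming a default install layout (the path and section may differ on your setup):

```
# /etc/ambari-agent/conf/ambari-agent.ini (assumed default path)
[security]
# Remove or comment out this line, then restart the agent
# (e.g. with `ambari-agent restart`):
# force_https_protocol=PROTOCOL_TLSv1_2
```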
08-04-2020
02:20 AM
Hi @asra, we have a workaround in place for now. I am in touch with our security team to understand the root cause but haven't figured it out yet. All I did was deploy a machine with a UI (Windows or Ubuntu, since CentOS / Red Hat are command-line only) within the same network as your Ambari server. This bypasses any proxy / firewall / group policy settings. If that is not an option, please provide detailed info about your setup and I will try to point out things you can try.
07-28-2020
11:39 PM
Ah OK, I didn't check the documentation, my bad. But the question remains whether it will ignore the directory on all nodes or only the old nodes. I am interested in how this turns out. Maybe you can do a quick trial? I don't have a dev environment to try it on at the moment.
07-24-2020
03:32 AM
The policy is synced to all the nodes. You can check that in Ranger -> Audit -> Plugins. If it is not, you should check whether you have an access policy for the node identities.
07-24-2020
12:32 AM
You can use the flow you have in mind, but I would suggest some optimizations:

- PutHDFS & PutHiveQL: create an external table over the HDFS directory so you can skip PutHiveQL entirely (see the sketch below).
- The rest of your flow just builds SQL statements: as I suggested before, look at PutDatabaseRecord, since it has a record reader (in your case AvroReader) and lets you specify what type of SQL statement should be generated. This handles all the conversions for you and makes the flow much faster.

For retrieving specific columns you can still use the same flow; just change the query you retrieve with GetFile. Hope this helps.
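A minimal sketch of the external-table idea, assuming PutHDFS lands Avro files in the target directory; the table name, columns, and path below are hypothetical:

```sql
-- Hypothetical table/columns/path; point LOCATION at the PutHDFS directory
CREATE EXTERNAL TABLE IF NOT EXISTS staging_events (
  id BIGINT,
  payload STRING
)
STORED AS AVRO
LOCATION '/data/staging/events';
```

Once the table points at that directory, every file PutHDFS writes becomes queryable immediately, so the PutHiveQL step can be dropped.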
07-23-2020
02:30 AM
Hi, yes, you can use SelectHiveQL to run multiple HiveQL statements. I would suggest putting the SQL statements in a file (or files) and using the ListFile and FetchFile processors before SelectHiveQL. The query is read from the flowfile content, so you do not need a variable or anything. Finally, you can use PutDatabaseRecord to write the results to the SQL DB. Ideal flow with no transformations: ListFile -> FetchFile -> SelectHiveQL -> PutDatabaseRecord. Hope this helps.
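For illustration, a statements file picked up by ListFile/FetchFile could look like this (database, table, and columns are made up):

```sql
-- query1.sql: FetchFile loads this into the flowfile content,
-- and SelectHiveQL executes it against Hive
SELECT id, name, created_at
FROM my_database.my_table
WHERE created_at >= '2020-07-01';
```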
07-23-2020
01:52 AM
I am not 100% sure, but I don't think you can add more disks just for the new machines. HDFS does round-robin writes across all disks, so you have to either keep the same number of disks or also increase the disks on the existing DataNodes. Then you update dfs.datanode.data.dir accordingly.
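For reference, that property is a comma-separated list of directories in hdfs-site.xml; the mount points below are examples only:

```xml
<!-- hdfs-site.xml: one entry per disk, adjust paths to your mounts -->
<property>
  <name>dfs.datanode.data.dir</name>
  <value>/grid/disk1/hdfs/data,/grid/disk2/hdfs/data,/grid/disk3/hdfs/data</value>
</property>
```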
07-23-2020
01:40 AM
Turns out it was either the antivirus or the group policies that was blocking the connection. I am trying to get more info and will update the post, but if someone stumbles onto this, they should check these first.
07-15-2020
05:15 AM
1 Kudo
You can try this endpoint: http://<ambari-server>:8080/api/v1/stacks/{stackName}/versions/{stackVersion}/services. To get help on the API, use http://<ambari-server>:8080/api-docs, where you can try out API calls to understand what the API returns.
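As a concrete example, a call against an HDP 3.1 stack might look like this; the stack name/version and the admin:admin credentials are assumptions, adjust them to your cluster:

```
curl -u admin:admin "http://<ambari-server>:8080/api/v1/stacks/HDP/versions/3.1/services"
```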
07-15-2020
03:46 AM
Any gurus here who can help?