Member since
08-02-2017
16
Posts
8
Kudos Received
0
Solutions
07-23-2018
06:21 PM
See this link: https://community.hortonworks.com/articles/4103/hiveserver2-jdbc-connection-url-examples.html
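The linked article covers the URL format in detail. As a quick sketch (the host, port, and database values below are placeholders, not real endpoints), a plain HiveServer2 JDBC URL in binary transport mode can be assembled like this:

```python
# Build a basic HiveServer2 JDBC URL (no ZooKeeper discovery, no SSL).
# host, port, and database are placeholder values for illustration.
host = "hs2-node.example.com"
port = 10000          # default HiveServer2 binary-mode port
database = "default"

url = f"jdbc:hive2://{host}:{port}/{database}"
print(url)  # jdbc:hive2://hs2-node.example.com:10000/default
```

The article also shows variants with ZooKeeper service discovery and Kerberos principals, which append extra parameters after the database name.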
07-14-2018
05:29 AM
6 Kudos
I can't find the Hive View on HDP 3, and I didn't find any information about this in the documentation. What should I use to replace the Hive View on HDP 3?
05-08-2018
05:17 PM
Hi @Alex Ye,
I looked at my NiFi logback.xml and my lines match the following: <encoder>
<pattern>%date %level [%thread] %logger{40} %msg%n</pattern>
</encoder>
05-05-2018
10:04 PM
Hi @Alex Ye, If I understand correctly, your problem is that the log shows two lines with the same information. Is that it? To help with this, let me know: how many instances of NiFi do you have in your cluster?
05-02-2018
06:36 PM
Whenever you perform a save operation, the files are created according to the number of partitions of your DataFrame, and this process generates file names like "part-xxxxx", so this is the complete file name.
The file name will never differ from this pattern; what varies is how many files are generated. Sorry if I misunderstood what you want.
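As a small illustration of the naming convention (pure Python, just simulating the part-file names; the exact suffix can vary with the Spark version and output format):

```python
# Spark writes one "part-" file per partition, with the partition index
# zero-padded to 5 digits. Simulate the names for a 4-partition DataFrame.
num_partitions = 4
file_names = [f"part-{i:05d}" for i in range(num_partitions)]
print(file_names)
# ['part-00000', 'part-00001', 'part-00002', 'part-00003']
```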
05-01-2018
09:59 PM
Hi @Prudhvi Rao Shedimbi, What exactly do you need? I ask because if you simply want to read a saved file, you only need to point at the folder and all of its contents will be read: sc.textFile("foldername/*"). If what you want is to write one single file from the previous processing of a DataFrame, you can do it with "df.rdd.repartition(1).saveAsTextFile('HDFSFolder/FileName')" (repartitioning to 1 collapses everything into one partition), and so only one file "part-00000" will be generated. If you are using a library like the Databricks CSV one, you can do: df.write.format("csv").save("/HDFSFolder/FileName.csv") Is that it?
12-05-2017
01:24 PM
Hi @Sidi Mahmoud RHIL, I searched for information about this and found the following: the current Apache Hive release doesn't support this feature. In the Apache Hive Jira HIVE-4847 you can see this is a new feature; the patch is available, but it has yet to be merged and tagged to a particular release. We can expect this feature in a later release.
08-19-2017
08:38 PM
I don't know Python, but if you don't specifically need to use Python, you can use NiFi. NiFi has many processors for this purpose: you can get files from FTP, the local file system, or HDFS, and ingest them into HDFS. I hope this helps.