Member since: 03-04-2019
Posts: 67
Kudos Received: 2
Solutions: 3
My Accepted Solutions
Title | Views | Posted
---|---|---
 | 4277 | 03-18-2020 01:42 AM
 | 2195 | 03-11-2020 01:09 AM
 | 2633 | 12-16-2019 04:17 AM
02-03-2020
01:28 PM
1 Kudo
Dear Jay, what can I say? Excellent answer. You are really one of the best here.
01-21-2020
03:43 PM
Hi @yasminela
Apologies for the temporary unavailability of the HDP Sandbox. The problem appears to have been fully resolved prior to 01-17-2020, and the HDP Sandbox should be available for everyone to download. Let us know if you find that is not the case for you.
Bill
01-06-2020
11:42 PM
Hi @EricL. Thank you very much... but the same use case applies here. This is the output data structure we require from a JSON:

{
  "name": [
    {
      "use": "official",    // "tab1.use" is the column and value
      "family": "family",   // "tab1.family" is the column and value
      "given": [            // this column we need to create from tab1 fname & lname
        "first1",           // "first1" comes from tab1.fname
        "last1"             // "last1" comes from tab1.lname
      ]
    },
    {
      "use": "usual",       // "tab2.use" is the column and value
      "given": [            // here we need to create a column combining fname & lname
        "first1 last1"      // "first1 last1" comes from tab1.fname & tab1.lname
      ]
    }
  ]
}
Here we want to create a column (name) from the above columns. The data above is a JSON structure, but I want it in Hive as table columns; then we can convert it back into JSON for our use cases (see the sketch below). Note: the structure matters here.
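For illustration, a minimal HiveQL sketch of how such a nested structure could be built. The tables tab1/tab2, their columns, and the join key id are assumptions based on the description above:

SELECT
  array(
    named_struct(
      'use',    t1.`use`,
      'family', t1.family,
      'given',  array(t1.fname, t1.lname)
    ),
    named_struct(
      'use',    t2.`use`,
      'family', CAST(NULL AS STRING),  -- array() needs both structs to share one schema
      'given',  array(concat(t1.fname, ' ', t1.lname))
    )
  ) AS name
FROM tab1 t1
JOIN tab2 t2 ON t1.id = t2.id;         -- hypothetical join key

A to_json-style UDF (e.g. from the Brickhouse library) could then render the resulting array<struct<...>> column back into the JSON shown above.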
Thanks
HadoopHelp
12-17-2019
07:48 AM
Hi all, here are more details about the above: https://community.cloudera.com/t5/Support-Questions/HDInsight-Vs-HDP-Service-on-Azure-Vs-HDP-on-Azure-IaaS/m-p/166424

Thanks,
HadoopHelp
12-17-2019
05:48 AM
1 Kudo
@Bindal Do the following steps:

sandbox-hdp login: root
root@sandbox-hdp.hortonworks.com's password: .....
[root@sandbox-hdp ~]# mkdir -p /tmp/data
[root@sandbox-hdp ~]# cd /tmp/data

Now you should be in /tmp/data; to validate that, run:

[root@sandbox-hdp ~]# pwd

Copy your riskfactor1.csv to this directory using a tool like WinSCP or MobaXterm (see my screenshot using WinSCP). My question is: where is the riskfactor1.csv file located? If that's not clear, you can upload it using the Ambari Files view: first navigate to /bindal/data, then select Upload; please see the attached screenshot to upload the file from your laptop. After the successful upload, you can run your Zeppelin job and keep me posted.
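If you prefer the command line over the Ambari view, a minimal sketch for pushing the file into HDFS (the target directory /bindal/data is taken from above; adjust as needed):

[root@sandbox-hdp ~]# hdfs dfs -mkdir -p /bindal/data                       # create the HDFS target directory
[root@sandbox-hdp ~]# hdfs dfs -put /tmp/data/riskfactor1.csv /bindal/data/ # copy from local disk into HDFS
[root@sandbox-hdp ~]# hdfs dfs -ls /bindal/data                             # verify the upload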
12-16-2019
04:17 AM
Hi @rohitmalhotra. Thanks! I already found the solution. The problem was that I was not able to get the exact HDFS path, as I mentioned in my last post. I checked the core-site.xml file for the Azure HDInsight cluster's DFS file path.

Thanks,
HadoopHelp
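For reference, a sketch of the relevant core-site.xml entry on an HDInsight cluster; the container and storage-account names below are placeholders:

<property>
  <name>fs.defaultFS</name>
  <value>wasb://mycontainer@myaccount.blob.core.windows.net</value> <!-- placeholder container/account -->
</property>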
12-09-2019
01:50 AM
Hi @redwuie. You can do this with an Azure HDInsight cluster, as in the link below (note this is not on-premises): http://dbmentors.blogspot.com/2018/02/integrating-hadoop-cluster-with.html As for my own experience, I tried hard to do the same with an on-premises Hadoop cluster, but it did not work. There is only a single solution for that: if you want to move data from an on-premises Hadoop cluster to Azure Data Lake or Blob storage, you have to use Azure Data Box.

Thanks,
HadoopHelp
08-06-2019
10:55 AM
I'm happy to see you resolved your issue. Please mark the appropriate reply as the solution, as it will make it easier for others to find the answer in the future.
07-26-2019
02:18 PM
> Is there any option to find empty directory using HDFS command directly?

You can list/find empty directories using the 'org.apache.solr.hadoop.HdfsFindTool'. To check/test whether a given directory is empty with the hdfs tool, you can use -du or -test; please see the FileSystemShell documentation [0].

test
Usage: hadoop fs -test -[defsz] URI
Options:
-d: if the path is a directory, return 0.
-e: if the path exists, return 0.
-f: if the path is a file, return 0.
-s: if the path is not empty, return 0.
-r: if the path exists and read permission is granted, return 0.
-w: if the path exists and write permission is granted, return 0.
-z: if the file is zero length, return 0.
Example:
hadoop fs -test -e filename

du
Usage: hadoop fs -du [-s] [-h] [-x] URI [URI ...]
Displays sizes of files and directories contained in the given directory, or the length of a file in case it's just a file.
Options:
The -s option will result in an aggregate summary of file lengths being displayed, rather than the individual files. Without the -s option, calculation is done by going 1-level deep from the given path.
The -h option will format file sizes in a "human-readable" fashion (e.g., 64.0m instead of 67108864).
The -x option will exclude snapshots from the result calculation. Without the -x option (default), the result is always calculated from all INodes, including all snapshots under the given path.
The du returns three columns with the following format:
size disk_space_consumed_with_all_replicas full_path_name
Example:
hadoop fs -du /user/hadoop/dir1 /user/hadoop/file1 hdfs://nn.example.com/user/hadoop/dir1
Exit Code: Returns 0 on success and -1 on error.

[0] https://hadoop.apache.org/docs/current/hadoop-project-dist/hadoop-common/FileSystemShell.html
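Putting the two together, a minimal shell sketch for listing empty directories under a given path (assumes a Bash shell with the hadoop client on the PATH; /data is a placeholder):

# 'hadoop fs -count' prints: DIR_COUNT FILE_COUNT CONTENT_SIZE PATHNAME.
# An empty directory counts only itself: DIR_COUNT=1 and FILE_COUNT=0.
for d in $(hadoop fs -ls /data | awk '/^d/ {print $NF}'); do
  set -- $(hadoop fs -count "$d")
  if [ "$1" -eq 1 ] && [ "$2" -eq 0 ]; then
    echo "empty: $d"
  fi
done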
07-26-2019
09:00 AM
Hi @dennisli, I hope you are doing well. Please confirm the values of the fields below, per my comments:

<property>
  <name>javax.jdo.option.ConnectionDriverName</name>
  <value>com.mysql.jdbc.Driver</value> <!-- this is the metadata DB driver -->
</property>
<property>
  <name>javax.jdo.option.ConnectionUserName</name>
  <value>hive</value> <!-- this is the metadata DB username -->
</property>
<property>
  <name>javax.jdo.option.ConnectionPassword</name>
  <value>mypassword</value> <!-- this is the metadata DB password -->
</property>

Thanks,
HadoopHelp
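For completeness, the metastore connection URL usually sits alongside these properties; the host and database name below are placeholders:

<property>
  <name>javax.jdo.option.ConnectionURL</name>
  <value>jdbc:mysql://myhost:3306/hive?createDatabaseIfNotExist=true</value> <!-- placeholder host/db -->
</property>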