Member since: 08-31-2017
Posts: 30
Kudos Received: 0
Solutions: 0
12-23-2018
01:01 PM
To upload files from your Windows machine to a Linux machine, you can use a tool like WinSCP. You configure the session for the Linux machine almost identically to the configuration in PuTTY, and it gives you a GUI for copying files. For the other direction, accessing the Windows machine from Linux, you either need to configure an FTP or, better, an SFTP server on Windows that exposes the NTFS path, or you share the folder over Windows networking and install Samba, an implementation of the Windows networking protocols, on the Linux machine.
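If you prefer the command line over a GUI, scp also works, assuming the OpenSSH client is available on the Windows machine (it ships with recent Windows 10/11 releases) and an SSH server is running on the Linux machine. A minimal sketch; the host name, user name, and paths are placeholders:

scp C:\data\example.csv hduser@linuxhost:/home/hduser/
scp hduser@linuxhost:/home/hduser/report.txt C:\data\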
01-05-2018
05:26 AM
This would also work (note that the directory has to be passed as a java.io.File, not a plain String):

import java.io.File

def getListOfFiles(dir: File): List[File] =
  dir.listFiles.filter(_.isFile).toList

val files = getListOfFiles(new File("/tmp"))
12-04-2017
05:26 PM
@Chaitanya D, thanks a lot for your kind words. Glad that it helped you. Could you please accept the answer? That will really help other community users.
12-01-2017
02:40 PM
@Chaitanya D, glad that it worked. Could you please open a new question instead of piggybacking on this one? Follow-up questions here will divert the main thread. Please tag me in the new question and I am happy to help 🙂
11-01-2017
04:57 AM
@Chaitanya D Please run the HDFS service check from the Ambari Server UI to see whether all the DataNodes are healthy and running.

java.lang.Exception: org.apache.hadoop.ipc.RemoteException(java.io.IOException): File
/user/hduser/sqoop_import/customers/_temporary/0/_temporary/attempt_local270107642_0001_m_000000_0/part-m-00000
could only be replicated to 0 nodes instead of minReplication (=1). There are 0 datanode(s) running and no node(s) are excluded in this operation.

The above error indicates that no DataNodes are running, or that the DataNodes are not healthy. So please check that Sqoop is using the correct hdfs-site.xml / core-site.xml in its classpath, i.e. the configuration of a cluster with valid, running DataNodes. You can also run your Sqoop command with the "--verbose" option to see the "Classpath" setting and confirm it includes the correct "hadoop/conf" directory, something like "/usr/hdp/2.6.0.3-8/hadoop/conf".

Please check that the DataNode process is running and try to put some file to HDFS to see whether your HDFS write operations work fine:

# ps -ef | grep DataNode
# su - hdfs
# hdfs dfs -put /var/log/messages /tmp
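A quick way to confirm how many DataNodes the NameNode actually sees is the dfsadmin report; a minimal sketch, run as the hdfs user (the grep pattern assumes the usual report wording):

# su - hdfs
# hdfs dfsadmin -report | grep -i "live datanodes"

If this reports 0 live DataNodes, start the DataNode service (from Ambari or with the service scripts) before re-running the Sqoop import.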