Member since: 01-19-2017
Posts: 3679
Kudos Received: 632
Solutions: 372

My Accepted Solutions
| Title | Views | Posted |
|---|---|---|
|  | 855 | 06-04-2025 11:36 PM |
|  | 1430 | 03-23-2025 05:23 AM |
|  | 718 | 03-17-2025 10:18 AM |
|  | 2577 | 03-05-2025 01:34 PM |
|  | 1691 | 03-03-2025 01:09 PM |
03-13-2018
05:02 PM
@Abdul Saboor Try to increase the memory on your mappers. Take a look at mapreduce.map.memory.mb, mapreduce.reduce.memory.mb, mapreduce.map.java.opts and mapreduce.reduce.java.opts. In the example below I have set it to 1 GB; please substitute values appropriate for your environment:
$ sqoop import -D mapreduce.map.memory.mb=1024 -D mapreduce.map.java.opts=-Xmx768m --connect jdbc:oracle:thin:@oradbhost:15xx:DB --table test --username SYSTEM -P -m 1
Please revert.
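For reference, here is a fuller sketch with all four properties set. The host, port and table are just the placeholders from the command above, and the 2 GB container with a roughly 80% JVM heap is only an illustrative rule of thumb; tune the numbers to your environment:
$ sqoop import \
    -D mapreduce.map.memory.mb=2048 \
    -D mapreduce.map.java.opts=-Xmx1638m \
    -D mapreduce.reduce.memory.mb=2048 \
    -D mapreduce.reduce.java.opts=-Xmx1638m \
    --connect jdbc:oracle:thin:@oradbhost:15xx:DB \
    --table test --username SYSTEM -P -m 1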
03-13-2018
10:56 AM
@hadoop hdfs Did you install the JCE Unlimited Strength policy files? Have you tried restarting the cluster? If it still fails, can you run the commands below?
$ su - hdfs
$ hadoop namenode -recover
Please revert.
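One quick way to verify whether the unlimited-strength JCE policy is actually active (assuming $JAVA_HOME points at the JDK your cluster uses) is to print the maximum allowed AES key length; 128 means the default limited policy is in effect, while a much larger value means the unlimited policy is installed:
$ $JAVA_HOME/bin/jrunscript -e 'print(javax.crypto.Cipher.getMaxAllowedKeyLength("AES"));'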
03-13-2018
06:38 AM
@Bill Brooks That error is due to the startup timeout setting. You can alter it by adding the server.startup.web.timeout property (measured in seconds) to /etc/ambari-server/conf/ambari.properties, for example:
server.startup.web.timeout=120
Incidentally, Ambari may already be accessible; have you tried opening http://ambari-server:8080?
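As a concrete sketch (assuming root or sudo access on the Ambari server host), appending the property and restarting Ambari would look like this:
$ echo "server.startup.web.timeout=120" >> /etc/ambari-server/conf/ambari.properties
$ ambari-server restart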
03-12-2018
11:59 AM
@maha Rm Can you describe your Kafka cluster? Is it a standalone cluster, and would you like to sink the data to HDFS? Here is a link to the Confluent HDFS connector.
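For reference, a minimal standalone sink configuration for the Confluent HDFS connector looks roughly like the sketch below; the connector name, topic and NameNode URL are placeholders, so substitute your own:
name=hdfs-sink
connector.class=io.confluent.connect.hdfs.HdfsSinkConnector
tasks.max=1
topics=test_hdfs
hdfs.url=hdfs://namenode-host:8020
flush.size=3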
03-11-2018
07:22 PM
@Mudassar Hussain No worries, I was worried you got stuck and didn't revert. HCC is full of solutions, so don't hesitate to update the thread if you encounter any problems. If the solution provided resolves your issue, that's great; what you need to do is accept the answer and close the thread. Cheers
03-09-2018
05:33 PM
@Mudassar Hussain You haven't given your feedback on the method and solution I provided. Please understand that members go a long way to help out, and feedback on whether the solution resolved your problem is appreciated. If it did, accept the answer to reward the user and close the thread, so that others with a similar problem can use it as a solution.
03-08-2018
11:13 PM
@Aymen Rahal Did you copy the JDBC driver (mysql-connector-java*) to the $SQOOP_HOME/lib directory of your Sqoop installation? Then re-run the command.
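As a sketch, copying the connector jar into the Sqoop lib directory and verifying it is there would look like this (the jar path and version are placeholders):
$ cp /path/to/mysql-connector-java-*.jar $SQOOP_HOME/lib/
$ ls $SQOOP_HOME/lib/ | grep mysql-connector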
03-08-2018
04:18 PM
@Rohit Khose Question: are you connecting from a client outside the cluster? What's the hostname of the client? Can you explain how you are executing from the client? Are you using a JAAS configuration file?
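In case it helps, a typical client JAAS file looks roughly like the sketch below; the section name depends on the service (for example KafkaClient for Kafka clients or Client for ZooKeeper clients), and the keytab path and principal are placeholders:
KafkaClient {
  com.sun.security.auth.module.Krb5LoginModule required
  useKeyTab=true
  storeKey=true
  keyTab="/etc/security/keytabs/client.keytab"
  principal="client@EXAMPLE.COM";
};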
03-08-2018
01:04 PM
@hema moger Great! Could you 'Accept' the answer by clicking the Accept button below? That would be a great help to community users trying to find a solution to this kind of error quickly.