Member since: 05-16-2016
Posts: 783
Kudos Received: 112
Solutions: 39
My Accepted Solutions
| Title | Views | Posted |
|---|---|---|
| | 1403 | 06-12-2019 09:27 AM |
| | 2427 | 05-27-2019 08:29 AM |
| | 4347 | 05-27-2018 08:49 AM |
| | 3781 | 05-05-2018 10:47 PM |
| | 2423 | 05-05-2018 07:32 AM |
06-27-2023 01:54 PM
Thanks, it solved the problem.
07-18-2022 02:31 AM
@newbieone, as this is an older post, you would have a better chance of receiving a resolution by starting a new thread. A new thread is also an opportunity to provide details specific to your environment, which will help others give you a more accurate answer. You can link this thread as a reference in your new post.
02-21-2022 09:31 PM
Try running `INVALIDATE METADATA;`. In the end, clearing the browser cache is what worked for me.
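For anyone who wants to try it from the command line, here is a minimal sketch using impala-shell; `mydb.mytable` is a placeholder table name:

```bash
# Refresh Impala's view of the whole catalog (can be slow on large clusters)
impala-shell -q "INVALIDATE METADATA;"

# Cheaper alternative: invalidate only the table that changed
# (mydb.mytable is a placeholder)
impala-shell -q "INVALIDATE METADATA mydb.mytable;"
```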
01-21-2022 03:08 PM
Hi Nagaraj (or anyone), can you please share the steps if you remember them? I am hitting:
ERROR org.apache.hadoop.hdfs.server.namenode.ha.StandbyCheckpointer: Exception in doCheckpoint
java.io.IOException: Exception during image upload: java.lang.NoClassDefFoundError: org/apache/http/client/utils/URIBuilder
Caused by: java.lang.NoClassDefFoundError: org/apache/http/client/utils/URIBuilder
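A NoClassDefFoundError for org/apache/http/client/utils/URIBuilder generally means the Apache HttpClient jar is missing from the NameNode's classpath. A minimal sketch of one way to check, assuming the Hadoop installation lives under /opt/hadoop (a placeholder path):

```bash
# Look for the HttpClient jar anywhere under the Hadoop install
# (/opt/hadoop is a placeholder; adjust for your layout)
find /opt/hadoop -name 'httpclient*.jar'

# Confirm the class is actually packaged inside the jar that was found
unzip -l /opt/hadoop/share/hadoop/common/lib/httpclient-*.jar | grep URIBuilder
```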
12-07-2021 03:57 AM
Yes, you can update.
10-18-2021 10:59 AM
Usually, "Exception: java.io.IOException: Exceeded MAX_FAILED_UNIQUE_FETCHES; bailing-out" is caused by communication issues among the Hadoop cluster nodes. To resolve it, check the following:
a) Whether there are any communication problems among the Hadoop cluster nodes.
b) Whether the SSL certificate of any DataNode has expired (if the Hadoop cluster is SSL-enabled).
c) Whether SSL changes were made without restarting the services that use SSL; if so, restart those services across the cluster. A quick certificate check is sketched below.
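For item (b), a minimal sketch of checking a DataNode's TLS certificate expiry with openssl; datanode01 is a placeholder host, and 9865 is the default DataNode HTTPS port in Hadoop 3 (older clusters typically use 50475):

```bash
# Print the certificate's expiry date for a DataNode's HTTPS endpoint
# (datanode01:9865 is a placeholder; adjust host and port for your cluster)
echo | openssl s_client -connect datanode01:9865 2>/dev/null \
  | openssl x509 -noout -enddate
```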
10-15-2021 07:43 AM
How do I solve this issue? "Input path does not exist."
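In case it helps others hitting the same error, a minimal sketch of verifying that the input path actually exists in HDFS; /user/foo/input is a placeholder path:

```bash
# Check whether the input path exists (/user/foo/input is a placeholder)
hdfs dfs -test -e /user/foo/input && echo "exists" || echo "missing"

# List the parent directory to spot typos or permission problems
hdfs dfs -ls /user/foo
```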
10-13-2021 11:11 PM
How did you resolve the issue? I am facing a similar problem with a DataNode not starting.
05-24-2021 06:46 PM
Hi, below are some pointers, in case you have not already tried them (which I believe you are using):
1. Map the column to TIMESTAMP explicitly (--map-column-hive <col_name>=TIMESTAMP).
2. Keep in mind that the underlying column should be BIGINT.
The main issue is the format: Parquet represents the time in milliseconds, whereas Impala interprets a BIGINT as seconds, so to get the correct value you need to handle it at query level (divide by 1000), as sketched below.
Regards,
Jay
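A minimal sketch of that query-level conversion via impala-shell, assuming a hypothetical table mydb.mytable with a millisecond-epoch BIGINT column event_ts:

```bash
# Divide milliseconds by 1000 so Impala's seconds-based cast yields the right time
# (mydb.mytable and event_ts are placeholders)
impala-shell -q "SELECT CAST(event_ts / 1000 AS TIMESTAMP) AS event_time FROM mydb.mytable LIMIT 5;"
```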