Member since: 06-07-2016
Posts: 923
Kudos Received: 322
Solutions: 115
My Accepted Solutions
| Title | Views | Posted |
|---|---|---|
| | 4114 | 10-18-2017 10:19 PM |
| | 4353 | 10-18-2017 09:51 PM |
| | 14885 | 09-21-2017 01:35 PM |
| | 1853 | 08-04-2017 02:00 PM |
| | 2430 | 07-31-2017 03:02 PM |
12-12-2016
04:07 PM
Can you please share the result of the following command:
describe 'TABLE_NAME'
12-12-2016
05:49 AM
@Xiaojie Ma Can you check the KEEP_DELETED_CELLS setting for your column family? https://hbase.apache.org/book.html#cf.keep.deleted Also, have you set hbase.hstore.time.to.purge.deletes? If so, what is its value?
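To check and change that setting from the HBase shell, something like the following works; a sketch, where 'my_table' and 'cf1' are placeholder names, not ones from the question:

```
# Show the column family attributes, including KEEP_DELETED_CELLS
describe 'my_table'

# If needed, turn it on for one column family
# (on older HBase versions the table must be disabled first)
alter 'my_table', {NAME => 'cf1', KEEP_DELETED_CELLS => 'TRUE'}
```

hbase.hstore.time.to.purge.deletes, by contrast, is a server-side setting that goes in hbase-site.xml, not in the shell.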
12-06-2016
05:08 PM
@subash sharma Thanks. This confirms your Kerberos hanging issue but does not tell us much. Can you please share the logs?
12-06-2016
02:44 PM
@subash sharma We cannot see any screenshot attached. Better than a screenshot, can you attach the Ambari logs?
12-06-2016
02:40 PM
@hello hadoop The logs must be there; without them, we can't help. Can you please check the log files under /var/log/hive?
12-05-2016
06:33 PM
@hello hadoop To have the LinuxContainerExecutor impersonate the actual submitting user, you need to set the following property to false: yarn.nodemanager.linux-container-executor.nonsecure-mode.limit-users. Once you do this, the user "hdpuser001" must exist on all nodes in the cluster; otherwise, any part of the job that runs on a node without this user will fail.
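In Ambari this is set under the YARN configs; expressed as a plain yarn-site.xml fragment, it would look like this (a sketch of just the one property, not a full config file):

```xml
<property>
  <!-- When false, containers run as the submitting user instead of the
       default nobody-style user, so that user must exist on every node -->
  <name>yarn.nodemanager.linux-container-executor.nonsecure-mode.limit-users</name>
  <value>false</value>
</property>
```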
12-05-2016
05:10 PM
@Gurpreet Singh You need to use the GetFile processor to pick up the log file. Then use SplitText or some filter processor, based on your requirements, to parse the log file (or simply send the whole file), and route it to the PutEmail processor to send the email. https://nifi.apache.org/docs/nifi-docs/components/org.apache.nifi.processors.standard.PutEmail/
12-05-2016
01:51 AM
@PJ The balancer does not balance between individual disks on a single data node. Assuming the data nodes are fairly balanced across the cluster, why does it matter? If you have multiple nodes and this one disk is reaching capacity, that shouldn't affect how Hadoop works: as new data comes in, Hadoop is smart enough to place blocks on other nodes. Now, a question that comes to mind is: do you have anything other than HDFS data on this disk? Can you do a cat /proc/mounts to check what is mounted on your /dev/sdi? There might be other things taking up that space.
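A quick way to run those checks together; the device name /dev/sdi comes from the question, but the /grid/sdi mount point below is an assumption you should replace with your own:

```shell
# List mounted filesystems and pick out the suspect device
# (no match just means the device is not mounted; not an error here)
grep sdi /proc/mounts || true

# Per-filesystem usage: confirms which mount is actually filling up
df -h

# Largest directories under the mount point, to spot non-HDFS data
du -x -sh /grid/sdi/* 2>/dev/null | sort -h | tail -n 5
```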
11-28-2016
09:27 PM
@Qi Wang You did not install Storm when you installed HDP. You need to add the Storm service as well, then restart Metron.
11-28-2016
05:32 PM
2 Kudos
If you are installing the Ambari agents manually, then you do not need passwordless SSH. https://docs.hortonworks.com/HDPDocuments/Ambari-2.2.1.0/bk_ambari_reference_guide/content/ch_amb_ref_installing_ambari_agents_manually.html
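The manual flow from that guide boils down to the following, run on each host; this sketch assumes a RHEL/CentOS package manager, and ambari.server.fqdn is a placeholder for your actual Ambari server hostname:

```
# Install the agent from the configured Ambari repo
yum install -y ambari-agent

# Point the agent at the Ambari server instead of the default localhost
sed -i 's/hostname=localhost/hostname=ambari.server.fqdn/' \
    /etc/ambari-agent/conf/ambari-agent.ini

# Start the agent; it registers itself with the server over the agent port
ambari-agent start
```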