Member since: 09-18-2015
Posts: 3274
Kudos Received: 1159
Solutions: 426

My Accepted Solutions
| Title | Views | Posted |
|---|---|---|
| | 2599 | 11-01-2016 05:43 PM |
| | 8667 | 11-01-2016 05:36 PM |
| | 4912 | 07-01-2016 03:20 PM |
| | 8238 | 05-25-2016 11:36 AM |
| | 4376 | 05-24-2016 05:27 PM |
12-16-2015
02:16 PM
@Yue Chen Can you share the error messages, please?
12-16-2015
01:39 PM
@lobna tonn You get Spark 1.4.1 when you install HDP 2.3.2, but if you want to upgrade to 1.5.2 then we can help you with that. The core business model means 100% open source. Read this
12-16-2015
01:24 PM
@Nilesh The solution is to treat production as production 🙂 and keep backups of the NameNode directories.
12-16-2015
12:49 PM
@Manoj A hadoop.proxyuser.hive.hosts is the hostname from which the superuser hive can connect. Based on that definition, try running the test as the hive user. http://docs.hortonworks.com/HDPDocuments/HDP1/HDP-..
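For reference, a minimal core-site.xml sketch of the proxyuser properties mentioned above; the hostname and group values here are placeholders, not from the original thread:

```xml
<!-- Allow the hive superuser to impersonate other users -->
<property>
  <name>hadoop.proxyuser.hive.hosts</name>
  <!-- host(s) from which hive may connect; * would allow any host -->
  <value>hiveserver1.example.com</value>
</property>
<property>
  <name>hadoop.proxyuser.hive.groups</name>
  <!-- groups whose members hive is allowed to impersonate -->
  <value>users</value>
</property>
```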
12-16-2015
12:34 PM
1 Kudo
@Alex Raj Example, with n1, n2, n3 as the ZooKeeper quorum addresses: hbase org.apache.hadoop.hbase.mapreduce.CopyTable --peer.adr=n1,n2,n3:2181:/hbase table
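Spelled out as a full invocation, assuming zk1,zk2,zk3 are the destination cluster's ZooKeeper nodes and my_table is the table to copy (both are placeholder names):

```shell
# Copy my_table from the local cluster to the peer cluster whose
# ZooKeeper quorum is zk1,zk2,zk3 (client port 2181, znode /hbase).
# The table name is the final positional argument.
hbase org.apache.hadoop.hbase.mapreduce.CopyTable \
  --peer.adr=zk1,zk2,zk3:2181:/hbase \
  my_table
```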
12-16-2015
12:02 PM
1 Kudo
@Nilesh I believe there was a recent crash or reboot of the servers, or some other operation that caused the lag. The most recent txid is 293929, but the NN is looking for 221561. You have to provide the edit logs. If it's a dummy or lab cluster, you may be able to restart the NN by formatting it. **It can cause data loss.**
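A rough sequence for this situation, sketched as shell commands; the metadata path is an example (check dfs.namenode.name.dir in your hdfs-site.xml for the real one):

```shell
# 1. Back up the NameNode metadata directory before touching anything
tar czf /tmp/nn-backup-$(date +%F).tar.gz /hadoop/hdfs/namenode

# 2. Try the built-in recovery mode, which can skip over corrupt or
#    missing edit-log segments (this may discard the most recent edits)
hdfs namenode -recover

# 3. Only as a last resort on a lab cluster: reformat (destroys metadata!)
# hdfs namenode -format
```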
12-16-2015
11:57 AM
1 Kudo
@lobna tonn Please see this: https://hortonworks.com/press-releases/hortonworks-accelerates-spark-at-scale-for-the-enterprise/ Spark 1.5.2 is part of the HDP stack. You can try running wordcount, but I highly recommend looking into the core business model too.
12-16-2015
11:54 AM
@Anshul Sisodia Please provide log file entries or a snapshot.
12-16-2015
11:46 AM
2 Kudos
@lobna tonn Hi Lobna, I highly recommend doing more research on the business model and core technology of these vendors. You can start with a POC (prepare a use case) to load your own data, or start with https://github.com/hortonworks/hive-testbench once the cluster is up. My LinkedIn address is in my profile. Please feel free to add me and we can talk about it.
12-16-2015
03:02 AM
2 Kudos
@Divya S Run rpm -qa | grep -i hadoop, then you can run zypper remove for each package. Also, when you install Ambari, you will get warnings and it will give you a script to clean up the existing install, for example: python /usr/lib/python2.6/site-packages/ambari_agent/HostCleanup.py This is for CentOS: https://community.hortonworks.com/content/idea/138...
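The cleanup steps above, sketched as commands; the package name in the zypper line is a placeholder for whatever the rpm query actually reports on your host:

```shell
# List any Hadoop packages still installed (SLES, rpm-based)
rpm -qa | grep -i hadoop

# Remove each leftover package reported above, e.g.:
zypper remove hadoop-hdfs

# Ambari ships a cleanup script for remnants of a previous install
python /usr/lib/python2.6/site-packages/ambari_agent/HostCleanup.py
```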