Member since: 02-02-2016
Posts: 583
Kudos Received: 518
Solutions: 98

My Accepted Solutions
Title | Views | Posted
---|---|---
 | 3273 | 09-16-2016 11:56 AM
 | 1375 | 09-13-2016 08:47 PM
 | 5482 | 09-06-2016 11:00 AM
 | 3179 | 08-05-2016 11:51 AM
 | 5259 | 08-03-2016 02:58 PM
06-13-2016
01:46 PM
2 Kudos
@Ashnee Sharma I'm guessing you are facing this issue with Phoenix 4.4.0 on Spark 1.5, since you are using HDP 2.3.4, is that right? I noticed that even HDP 2.4.2 ships Phoenix 4.4.0, so there is little chance of getting it fixed by an upgrade. I would suggest contacting Hortonworks Support for an official patch or workaround.
06-13-2016
01:25 PM
Sure, we will discuss this on the https://community.hortonworks.com/questions/39369/pheniox-patch-apply.html thread. Let's close this one.
06-13-2016
01:22 PM
Hi @Ashnee Sharma, please accept this answer if the provided information works for you, so we can close this thread.
06-13-2016
11:57 AM
Hi @Aidan Condron, did you try the above steps?
06-13-2016
11:40 AM
@Ashnee Sharma I can see a Phoenix patch in https://issues.apache.org/jira/browse/PHOENIX-2287. If you are using Phoenix, then you probably need to apply it in your Phoenix environment.
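In case it helps, here is a rough sketch of the usual Apache JIRA patch workflow, assuming you build Phoenix from source; the attachment URL, source path, and patch strip level below are assumptions, so adjust them to your environment.

# Hypothetical example: apply a JIRA patch to a Phoenix source tree and rebuild.
# Download the patch attached to the JIRA issue (the attachment id is a placeholder).
wget https://issues.apache.org/jira/secure/attachment/<attachment-id>/PHOENIX-2287.patch

# Apply it against your Phoenix source checkout (path is an assumption;
# use -p0 instead of -p1 if the patch was generated without a/ b/ prefixes).
cd /path/to/phoenix-4.4.0-src
patch -p1 < ../PHOENIX-2287.patch

# Rebuild the Phoenix artifacts, skipping tests to save time.
mvn clean package -DskipTests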
06-13-2016
10:53 AM
@Ashnee Sharma Instead of downgrading the Spark version, it would be better to work on the actual issue you are facing on Spark 1.5, i.e. java.lang.ClassCastException: org.apache.spark.sql.catalyst.expressions.GenericMutableRow cannot be cast to org.apache.spark.sql.Row
06-13-2016
10:18 AM
3 Kudos
I don't think you can do it through Ambari; instead, you can install it manually by building it from the Spark 1.4 source code. We recommend keeping the single Spark version that comes with the HDP stack. https://spark.apache.org/docs/1.4.1/building-spark.html
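For reference, a minimal build sketch along the lines of the linked documentation; the Hadoop profile and version flags are assumptions, so match them to your cluster.

# Hypothetical example: download and build Spark 1.4.1 from source.
wget https://archive.apache.org/dist/spark/spark-1.4.1/spark-1.4.1.tgz
tar -xzf spark-1.4.1.tgz
cd spark-1.4.1

# Build with YARN support; the Hadoop profile and version below are assumptions,
# pick the ones matching your cluster (see the linked build documentation).
build/mvn -Pyarn -Phadoop-2.6 -Dhadoop.version=2.6.0 -DskipTests clean package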
06-13-2016
10:15 AM
3 Kudos
Are you looking for the JobTracker or the ResourceManager port? https://docs.hortonworks.com/HDPDocuments/HDP2/HDP-2.4.2-Win/bk_HDP_Install_Win/content/ref-79239257-778e-42a9-9059-d982d0c08885.1.html
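If it is the ResourceManager you are after, one way to check the configured ports is to read them straight out of yarn-site.xml; the config path below is the usual HDP client location but may differ on your cluster.

# Hypothetical example: list the ResourceManager address/port settings
# from the client configuration (path assumes a standard HDP layout).
grep -A1 "yarn.resourcemanager" /etc/hadoop/conf/yarn-site.xml | grep -E "name|value"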
06-12-2016
09:10 PM
4 Kudos
@Pedro Rodgers If the schema is the same across all 100 text files, then it is better to create a Hive external table, since you already have those files on HDFS. Example: if you have all the files under the "/user/test/dummy/data" directory, then run the command below to create the external Hive table and point it at the HDFS location (note that the partition column goes only in the PARTITIONED BY clause, not in the column list):

CREATE EXTERNAL TABLE user(
  userId BIGINT,
  type INT,
  level TINYINT
)
COMMENT 'User Information'
PARTITIONED BY (date String)
LOCATION '/user/test/dummy/data';

Then create the folder date=2011-11-11 inside /user/test/dummy/data/ and put the data files for date 2011-11-11 into that folder. Once you are done, you also need to add the partition to the Hive metastore:

ALTER TABLE user ADD PARTITION(date='2011-11-11');
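To make the flow concrete, here is a sketch of the same steps from the shell; the data file name and paths are assumptions based on the example above.

# Hypothetical example: create the partition directory, load a data file,
# register the partition, and verify it (paths and file names are assumptions).
hdfs dfs -mkdir -p /user/test/dummy/data/date=2011-11-11
hdfs dfs -put part-00000.txt /user/test/dummy/data/date=2011-11-11/

# Register the new partition with the metastore and confirm it is visible.
hive -e "ALTER TABLE user ADD PARTITION(date='2011-11-11');"
hive -e "SHOW PARTITIONS user;"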
06-11-2016
05:04 PM
4 Kudos
@Thiago If you are using DistCp and WebHDFS to copy data between a secure cluster and a nonsecure cluster, you can do it as follows:
https://docs.hortonworks.com/HDPDocuments/HDP2/HDP-2.3.2/bk_Sys_Admin_Guides/content/ref-c8ffaa14-eaf8-48a6-9791-307283d5d29d.1.html

Set ipc.client.fallback-to-simple-auth-allowed to true in core-site.xml on the secure cluster side:

<property>
  <name>ipc.client.fallback-to-simple-auth-allowed</name>
  <value>true</value>
</property>

Then run DistCp from the secure cluster side:

hadoop distcp webhdfs://nonsecure-cluster webhdfs://secure-cluster
hadoop distcp webhdfs://secure-cluster webhdfs://nonsecure-cluster
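For example, with the default NameNode HTTP port for WebHDFS (50070 on Hadoop 2.x), a concrete invocation might look like the sketch below; the hostnames and paths are hypothetical.

# Hypothetical example: pull data from the nonsecure cluster into the secure
# cluster over WebHDFS (hostnames, port, and paths are assumptions).
hadoop distcp \
  webhdfs://nn-nonsecure.example.com:50070/data/source \
  webhdfs://nn-secure.example.com:50070/data/target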