Member since: 09-02-2017
Posts: 16
Kudos Received: 3
Solutions: 0
02-24-2019
08:02 PM
@jsensharma @asirna any guidance on how to upgrade Spark in HDP 2.6 and HDP 3? In particular, how to compile Spark from source and then deploy it?
02-24-2019
06:48 PM
@asirna @jsensharma could you give a hint on how to upgrade to Spark 2.4.0 on HDP 2.6.5 or HDP 3?
02-09-2019
11:25 AM
What about using Spark to fetch data from HBase and load it into Solr? This would allow transformations. Spark has great connectors for both HBase and Solr.
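A rough sketch of that pipeline, assuming the Hortonworks shc HBase connector and the Lucidworks spark-solr connector are on the classpath (the table catalog, column names, Solr collection, and ZooKeeper address below are all placeholders, not values from this thread):

```scala
import org.apache.spark.sql.SparkSession

object HbaseToSolr {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder().appName("hbase-to-solr").getOrCreate()

    // Hypothetical shc table catalog: maps HBase row key / columns to DataFrame fields
    val catalog = """{
      |"table":{"namespace":"default", "name":"docs"},
      |"rowkey":"key",
      |"columns":{
      |  "id":{"cf":"rowkey", "col":"key", "type":"string"},
      |  "body":{"cf":"d", "col":"body", "type":"string"}
      |}}""".stripMargin

    // Read from HBase through the shc data source
    val df = spark.read
      .options(Map("catalog" -> catalog))
      .format("org.apache.spark.sql.execution.datasources.hbase")
      .load()

    // Arbitrary Spark transformations can happen here before indexing
    df.filter(df("body").isNotNull)
      .write
      .format("solr")
      .options(Map("zkhost" -> "zk1:2181/solr", "collection" -> "docs"))
      .save()
  }
}
```

Submitted with spark-submit and the two connector jars on --jars; this is only an illustration of the shape of such a job, not a tested configuration.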
02-07-2019
09:13 PM
1 Kudo
Maybe it is possible to install 2.4.0 manually and keep 2.3.2 aside. This would not break anything, but would provide 2.4.0 if needed.
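A minimal sketch of that side-by-side approach (the paths and package name are assumptions; the HDP-managed Spark 2.3.2 under /usr/hdp is left untouched):

```shell
# Unpack a stock Apache Spark 2.4.0 build next to the HDP install
cd /opt
wget https://archive.apache.org/dist/spark/spark-2.4.0/spark-2.4.0-bin-hadoop2.7.tgz
tar -xzf spark-2.4.0-bin-hadoop2.7.tgz

# Point the new Spark at the existing cluster configuration
export SPARK_HOME=/opt/spark-2.4.0-bin-hadoop2.7
export HADOOP_CONF_DIR=/etc/hadoop/conf

# Invoke it explicitly; the HDP-managed spark-submit is unaffected
$SPARK_HOME/bin/spark-submit --master yarn --deploy-mode client --version
```

Jobs that need 2.4.0 call this spark-submit by full path, while Ambari keeps managing the stock 2.3.2.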
01-23-2019
10:51 AM
1 Kudo
Is there any way to update HDP to Spark 2.4.0 manually, then?
01-10-2019
02:19 PM
I have tested the example and I get this error:

java.lang.NoSuchMethodError: org.apache.spark.sql.hive.HiveSessionCatalog.<init>(Lorg/apache/spark/sql/hive/HiveExternalCatalog;Lorg/apache/spark/sql/catalyst/catalog/GlobalTempViewManager;Lorg/apache/spark/sql/hive/HiveMetastoreCatalog;Lorg/apache/spark/sql/catalyst/analysis/FunctionRegistry;Lorg/apache/spark/sql/internal/SQLConf;Lorg/apache/hadoop/conf/Configuration;Lorg/apache/spark/sql/catalyst/parser/ParserInterface;Lorg/apache/spark/sql/catalyst/catalog/FunctionResourceLoader;)V
    at org.apache.spark.sql.hive.llap.LlapSessionCatalog.<init>(LlapSessionCatalog.scala:49)
    at org.apache.spark.sql.hive.llap.LlapSessionStateBuilder.catalog$lzycompute(LlapSessionStateBuilder.scala:33)
    at org.apache.spark.sql.hive.llap.LlapSessionStateBuilder.catalog(LlapSessionStateBuilder.scala:32)
    at org.apache.spark.sql.hive.llap.LlapSessionStateBuilder.catalog(LlapSessionStateBuilder.scala:26)
    at org.apache.spark.sql.hive.HiveSessionStateBuilder$$anon$1.<init>(HiveSessionStateBuilder.scala:68)
    at org.apache.spark.sql.hive.HiveSessionStateBuilder.analyzer(HiveSessionStateBuilder.scala:68)
    at org.apache.spark.sql.internal.BaseSessionStateBuilder$$anonfun$build$2.apply(BaseSessionStateBuilder.scala:293)
    at org.apache.spark.sql.internal.BaseSessionStateBuilder$$anonfun$build$2.apply(BaseSessionStateBuilder.scala:293)
    at org.apache.spark.sql.internal.SessionState.analyzer$lzycompute(SessionState.scala:79)
    at org.apache.spark.sql.internal.SessionState.analyzer(SessionState.scala:79)
    at org.apache.spark.sql.execution.QueryExecution.analyzed$lzycompute(QueryExecution.scala:57)
    at org.apache.spark.sql.execution.QueryExecution.analyzed(QueryExecution.scala:55)
    at org.apache.spark.sql.execution.QueryExecution.assertAnalyzed(QueryExecution.scala:47)
    at org.apache.spark.sql.Dataset$.ofRows(Dataset.scala:74)
    at org.apache.spark.sql.SparkSession.sql(SparkSession.scala:638)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at py4j.reflection.MethodInvoker.invoke(MethodInvoker.java:244)
    at py4j.reflection.ReflectionEngine.invoke(ReflectionEngine.java:357)
    at py4j.Gateway.invoke(Gateway.java:282)
    at py4j.commands.AbstractCommand.invokeMethod(AbstractCommand.java:132)
    at py4j.commands.CallCommand.execute(CallCommand.java:79)
    at py4j.GatewayConnection.run(GatewayConnection.java:214)
    at java.lang.Thread.run(Thread.java:745)
01-10-2019
10:21 AM
@Sandeep Nemuri the mentioned article is about the Spark Thrift server. I am looking for code for Spark itself, something equivalent to this (from https://github.com/hortonworks-spark/spark-llap):

val hive = com.hortonworks.spark.sql.hive.llap.HiveWarehouseBuilder.session(spark).build()
hive.execute("describe extended web_sales").show(100, false)
01-09-2019
07:19 PM
Hi, the spark-llap connector states it is compatible with HDP 2.6.x (https://github.com/hortonworks-spark/spark-llap#compatibility). However, I am not able to find any documentation on how to make it work. There are no HiveWarehouseBuilder or LlapContext Scala classes in the 2.3.0 branch. I have been able to compile and configure it, and to run Spark configured correctly. Right now I am looking for some guidance on how to import the library in Scala and run some queries. Thanks
10-02-2018
07:14 PM
Hi, I'd like to migrate Ambari to another server that is not part of the Hadoop cluster. Right now, Ambari is installed on a NameNode server. Are there any guidelines for that migration? Thanks
Labels: Apache Ambari
10-02-2018
07:09 PM
Hi, I guess there is nothing wrong with it:

log4j.rootLogger = warn,xa_log_appender

# xa_logger
log4j.appender.xa_log_appender=org.apache.log4j.DailyRollingFileAppender
log4j.appender.xa_log_appender.file=${logdir}/xa_portal.log
log4j.appender.xa_log_appender.datePattern='.'yyyy-MM-dd
log4j.appender.xa_log_appender.append=true
log4j.appender.xa_log_appender.layout=org.apache.log4j.PatternLayout
log4j.appender.xa_log_appender.layout.ConversionPattern=%d [%t] %-5p %C{6} (%F:%L) - %m%n
log4j.appender.xa_log_appender.MaxFileSize={{ranger_xa_log_maxfilesize}}MB

# xa_log_appender : category and additivity
log4j.category.org.springframework=warn,xa_log_appender
log4j.additivity.org.springframework=false
log4j.category.org.apache.ranger=info,xa_log_appender
log4j.additivity.org.apache.ranger=false
log4j.category.xa=info,xa_log_appender
log4j.additivity.xa=false

# perf_logger
log4j.appender.perf_appender=org.apache.log4j.DailyRollingFileAppender
log4j.appender.perf_appender.file=${logdir}/ranger_admin_perf.log
log4j.appender.perf_appender.datePattern='.'yyyy-MM-dd
log4j.appender.perf_appender.append=true
log4j.appender.perf_appender.layout=org.apache.log4j.PatternLayout
log4j.appender.perf_appender.layout.ConversionPattern=%d [%t] %m%n

# sql_appender
log4j.appender.sql_appender=org.apache.log4j.DailyRollingFileAppender
log4j.appender.sql_appender.file=${logdir}/xa_portal_sql.log
log4j.appender.sql_appender.datePattern='.'yyyy-MM-dd
log4j.appender.sql_appender.append=true
log4j.appender.sql_appender.layout=org.apache.log4j.PatternLayout
log4j.appender.sql_appender.layout.ConversionPattern=%d [%t] %-5p %C{6} (%F:%L) - %m%n

# sql_appender : category and additivity
log4j.category.org.hibernate.SQL=warn,sql_appender
log4j.additivity.org.hibernate.SQL=false
log4j.category.jdbc.sqlonly=fatal,sql_appender
log4j.additivity.jdbc.sqlonly=false
log4j.category.jdbc.sqltiming=warn,sql_appender
log4j.additivity.jdbc.sqltiming=false
log4j.category.jdbc.audit=fatal,sql_appender
log4j.additivity.jdbc.audit=false
log4j.category.jdbc.resultset=fatal,sql_appender
log4j.additivity.jdbc.resultset=false
log4j.category.jdbc.connection=fatal,sql_appender
log4j.additivity.jdbc.connection=false
09-27-2018
09:59 AM
Hi, I have Ranger set up for Hive on a Kerberized cluster. Ranger works well for data masking and table access. I have configured Ranger to store the audit logs in hdfs://0.0.0.0:9000/ranger/audit. The folder is populated with both daily folders and daily files. When Hive is freshly started, the daily log files are populated for a day. However, the day after, they are empty, and I need to restart Hive to make the files populate again. I cannot see any error in hiveserver2.log (this happens with both LLAP and Hive 1).
Labels: Apache Hive, Apache Ranger
09-06-2018
01:11 PM
Hi, if I run klist as the hive Unix user, I observe the ticket being refreshed every minute. hive is the only user with this behavior, and there is no crontab refreshing it. I cannot figure out why the ticket is being refreshed that often! HDP 2.6.5, Hive 2.1
Labels: Apache Hive
05-11-2018
10:06 AM
Hi. I had the same issue, and setting hive.llap.client.consistent.splits to false made it work. However, I also had to set hive.llap.execution.mode=none; to make it work. AFAIK I am now using hive2.10 without LLAP, but with doAs.
02-17-2018
09:36 AM
Yes, I do: "Package libgfortran-4.8.5-16.el7_4.1.x86_64 already installed and latest version"
02-16-2018
11:31 PM
While running Spark ML, I am getting these warnings:

WARN BLAS: Failed to load implementation from: com.github.fommil.netlib.NativeSystemBLAS
WARN BLAS: Failed to load implementation from: com.github.fommil.netlib.NativeRefBLAS

Some information on the web (https://stackoverflow.com/questions/43286771/spark-with-openblas-on-emr/43398522 and http://www.spark.tc/blas-libraries-in-mllib/) indicates building Spark from source. How can I easily build from source in HDP? Is it possible to build and then replace the HDP Spark folder without breaking things? Thanks.
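A sketch of one way to do such a build, assuming the cluster runs a Spark 2.x close enough to the HDP version (the tag and profile list are assumptions; the netlib-lgpl profile bundles the netlib-java native BLAS bridge that those warnings are about):

```shell
# Fetch the Spark source at a tag matching the cluster's version
git clone https://github.com/apache/spark.git
cd spark
git checkout v2.3.0

# Build a runnable distribution with the netlib-lgpl profile so MLlib
# can delegate to a native BLAS (OpenBLAS/ATLAS must be installed on the nodes)
./dev/make-distribution.sh --name native-blas --tgz \
  -Pyarn -Phive -Phive-thriftserver -Phadoop-2.7 -Pnetlib-lgpl -DskipTests
```

Rather than overwriting the HDP-managed /usr/hdp/current/spark2-client, it is safer to unpack the resulting tarball alongside it and point SPARK_HOME at the custom build, so Ambari-managed components are not affected.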
09-02-2017
05:07 PM
1 Kudo
hadoop jar avro-tools-1.8.2.jar getschema hdfs_archive/mydoc.avro would also do the job. Instead of java -jar, you can run it directly against a file on HDFS:

hadoop jar avro-tools-1.8.2.jar getschema hdfsPathTOAvroFile.avro