Member since: 09-15-2015
Posts: 457
Kudos received: 507
Solutions: 90
My Accepted Solutions
| Title | Views | Posted |
|---|---|---|
| | 15549 | 11-01-2016 08:16 AM |
| | 10964 | 11-01-2016 07:45 AM |
| | 8335 | 10-25-2016 09:50 AM |
| | 1883 | 10-21-2016 03:50 AM |
| | 3698 | 10-14-2016 03:12 PM |
11-09-2015
05:53 PM
Thanks @Neeraj, I will open a support ticket as well.
11-09-2015
05:51 PM
I am getting the same, or at least a similar, error after my Ranger installation. It looks like a bug in Ambari 2.1.2. Be careful: your installation might have messed up the "kdc_type" (go to <ambari_host>/api/v1/clusters/<clustername> and search for kdc_type; if the value is something like "Existing MIT KDC", then you probably have to change it back to avoid further issues). Update: I discussed this problem with support and we came to the conclusion that this was a strange hiccup that coincidentally happened at the same time on our two clusters. I was not able to reproduce the issue, and it does not seem to have come up anywhere else. If you still have the logs, please open a support case and forward them to support/engineering.
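If you want to check this from the command line, here is a minimal sketch. The cluster name and the saved JSON below are made-up placeholders; against a real cluster you would fetch the same endpoint with curl and your Ambari admin credentials instead of writing the file by hand:

```shell
# Hypothetical example of the JSON returned by
# GET <ambari_host>/api/v1/clusters/<clustername>, saved to cluster.json.
# (On a live cluster: curl -s -u admin:<password> "http://<ambari_host>:8080/api/v1/clusters/<clustername>" > cluster.json)
cat > cluster.json <<'EOF'
{
  "Clusters": {
    "cluster_name": "mycluster",
    "kdc_type": "Existing MIT KDC"
  }
}
EOF

# Pull out the kdc_type value to see whether it was changed
grep -o '"kdc_type" *: *"[^"]*"' cluster.json
```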
11-09-2015
05:20 PM
What's your Ambari version? This might be an Ambari 2.1.2 issue; my Ranger installation just failed with the same error.
11-09-2015
01:10 PM
@Sean Roberts Just finalized my blueprint installation with a modified hive-env. I basically just configured an existing MySQL database in hive-env in the blueprint; Ambari added the rest of the hive-env variables to the configuration, so there are no config values missing.
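For anyone trying the same thing, a minimal sketch of what the relevant blueprint fragment could look like. The property names are the standard Hive/Ambari ones as I remember them; the database host, name, user, and password are placeholders you would replace with your own:

```json
{
  "configurations": [
    {
      "hive-env": {
        "hive_database": "Existing MySQL Database",
        "hive_database_type": "mysql"
      }
    },
    {
      "hive-site": {
        "javax.jdo.option.ConnectionDriverName": "com.mysql.jdbc.Driver",
        "javax.jdo.option.ConnectionURL": "jdbc:mysql://db.example.com/hive",
        "javax.jdo.option.ConnectionUserName": "hive",
        "javax.jdo.option.ConnectionPassword": "hivepassword"
      }
    }
  ]
}
```

The rest of hive-env can be left out; as noted above, Ambari fills in the remaining defaults when the blueprint is registered.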
11-06-2015
08:32 AM
Thanks for the article! Have you tested the visualization with bigger datasets as well? I am curious how the UI works with bigger datasets or queries that need some time to calculate.
11-04-2015
07:09 PM
1 Kudo
You could use the official SolrBolt from Lucidworks (https://github.com/LucidWorks/storm-solr) and put your messages into Solr using batch sizes of 1000 or even 10000. As Andrew pointed out, the second option is to write your messages to HDFS and afterwards use the job jar to load the data into Solr. The command looks something like this:

hadoop jar /opt/lucidworks-hdpsearch/job/lucidworks-hadoop-job-2.0.3.jar \
  com.lucidworks.hadoop.ingest.IngestJob \
  -Dlww.commit.on.close=true \
  -cls com.lucidworks.hadoop.ingest.DirectoryIngestMapper \
  --collection my_collection \
  -i /data/* \
  -of com.lucidworks.hadoop.io.LWMapRedOutputFormat \
  --solrServer http://c6601.ambari.apache.org:8983/solr
11-03-2015
01:15 PM
1 Kudo
I guess one of the reasons is the fact that IPython is not an Apache project and doesn't support Scala code. The distribution of IPython is huge though (200K notebooks on GitHub: http://nbviewer.ipython.org/gist/parente/facb555dfbae28e817e0).
11-03-2015
01:06 PM
1 Kudo
There is some interesting information in this Hacker News thread: https://news.ycombinator.com/item?id=9463809 "I'm one of the committers of Apache Zeppelin (incubating). Zeppelin is inspired by the IPython notebook and many other amazing pieces of software that have a notebook interface.
I know the IPython notebook has a long history and a large community; I really like it. Zeppelin is a young, new project compared to the IPython notebook.
Zeppelin and the IPython notebook are both open source. The IPython notebook is led by the IPython Development Team. Zeppelin is under the Apache Software Foundation and is being developed in the Apache way, from copyrights, development process, and decision making to community development.
Zeppelin is focusing on providing an analytical environment on top of the Hadoop ecosystem. I'm not sure about IPython's direction, but I don't think it's the same as Zeppelin's.
I see many projects that have a notebook interface: not only IPython and Zeppelin, but also Databricks Cloud, Spark Notebook, Beaker, and many others. I'm sure they all have their own advantages. I hope all of them are beloved by users."
11-03-2015
10:49 AM
Thanks for the official Oracle bug reference. I'll see if I can raise this internally, and we can change the default/recommended JDK to 1.8.0_60 in our docs.