Member since: 03-02-2016
Posts: 19
Kudos Received: 28
Solutions: 4
My Accepted Solutions
| Title | Views | Posted |
|---|---|---|
| | 7110 | 03-22-2016 11:15 AM |
| | 2895 | 03-16-2016 06:34 AM |
| | 4922 | 03-09-2016 04:30 PM |
| | 5436 | 03-04-2016 03:53 AM |
06-08-2016
07:55 AM
I already attached the log. Below are the links to the script:

Run script: https://github.com/intel-hadoop/Big-Data-Benchmark-for-Big-Bench/blob/master/bin/runBenchmark
BigBench bin folder: https://github.com/intel-hadoop/Big-Data-Benchmark-for-Big-Bench/blob/master/bin
BigBench documentation: https://github.com/intel-hadoop/Big-Data-Benchmark-for-Big-Bench
Exact script: https://github.com/intel-hadoop/Big-Data-Benchmark-for-Big-Bench/blob/master/engines/hive/queries/q20/run.sh
06-08-2016
06:55 AM
q20-hive-engine-validation-power-test-0.txt

Hello, I am working on the BigBench performance benchmark on HDP 2.3. While running the 20th query (q20), I am facing the below issue:

Files /root/BigBench/Big-Data-Benchmark-for-Big-Bench/engines/hive/queries/q20/results/q20-result and /dev/fd/62 differ
Validation of /root/BigBench/Big-Data-Benchmark-for-Big-Bench/engines/hive/queries/q20/results/q20-result failed: Query returned incorrect results
Validation failed: Query results are not OK
cat: Unable to write to output stream.

Please find the error log attached.
05-20-2016
01:52 PM
1 Kudo
I am trying to fetch data from ArcSight into Elasticsearch using Flume 1.6. Unfortunately, the official Flume release does not support Elasticsearch 2.3, so I used the third-party sink from https://github.com/lucidfrontier45/ElasticsearchSink2. Everything mentioned in that link works, but when Flume runs it produces the serializer error below:

java.lang.IllegalArgumentException: org.apache.flume.sink.elasticsearch.ElasticSearchDynamicSerializer is not an ElasticSearchEventSerializer
at com.frontier45.flume.sink.elasticsearch2.ElasticSearchSink.configure(ElasticSearchSink.java:278)
at org.apache.flume.conf.Configurables.configure(Configurables.java:41)
at org.apache.flume.node.AbstractConfigurationProvider.loadSinks(AbstractConfigurationProvider.java:413)
at org.apache.flume.node.AbstractConfigurationProvider.getConfiguration(AbstractConfigurationProvider.java:98)
at org.apache.flume.node.PollingPropertiesFileConfigurationProvider$FileWatcherRunnable.run(PollingPropertiesFileConfigurationProvider.java:140)
at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:471)
at java.util.concurrent.FutureTask.runAndReset(FutureTask.java:304)
at java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.access$301(ScheduledThreadPoolExecutor.java:178)
at java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.run(ScheduledThreadPoolExecutor.java:293)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
at java.lang.Thread.run(Thread.java:745)
2016-05-20 09:36:25,003 (conf-file-poller-0) [ERROR - org.apache.flume.node.AbstractConfigurationProvider.loadSinks(AbstractConfigurationProvider.java:427)] Sink k1 has been removed due to an error during configuration
java.lang.IllegalArgumentException: org.apache.flume.sink.elasticsearch.ElasticSearchDynamicSerializer is not an ElasticSearchEventSerializer
at com.frontier45.flume.sink.elasticsearch2.ElasticSearchSink.configure(ElasticSearchSink.java:278)
at org.apache.flume.conf.Configurables.configure(Configurables.java:41)
at org.apache.flume.node.AbstractConfigurationProvider.loadSinks(AbstractConfigurationProvider.java:413)
at org.apache.flume.node.AbstractConfigurationProvider.getConfiguration(AbstractConfigurationProvider.java:98)
at org.apache.flume.node.PollingPropertiesFileConfigurationProvider$FileWatcherRunnable.run(PollingPropertiesFileConfigurationProvider.java:140)
at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:471)
at java.util.concurrent.FutureTask.runAndReset(FutureTask.java:304)
at java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.access$301(ScheduledThreadPoolExecutor.java:178)
at java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.run(ScheduledThreadPoolExecutor.java:293)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
at java.lang.Thread.run(Thread.java:745)
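For reference, the sink configuration that triggers this error looks roughly like the fragment below. This is a hedged sketch: the agent/sink/channel names (a1, k1, c1) and the host/index values are illustrative assumptions, not taken from the original post; the exception suggests the configured serializer class does not implement the ElasticSearchEventSerializer interface that this build of the sink expects.

```properties
# Hypothetical Flume agent config; a1/k1/c1 and the values are illustrative.
a1.sinks.k1.type = com.frontier45.flume.sink.elasticsearch2.ElasticSearchSink
a1.sinks.k1.hostNames = es-host:9300
a1.sinks.k1.indexName = arcsight
a1.sinks.k1.serializer = org.apache.flume.sink.elasticsearch.ElasticSearchDynamicSerializer
a1.sinks.k1.channel = c1
```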
Labels:
- Apache Flume
- Apache Hadoop
03-22-2016
11:15 AM
1 Kudo
The problem is with Java 1.8. BigBench is compatible with Java 1.7; after rerunning with Java 1.7, it works fine.
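A minimal sketch of the workaround, assuming an OpenJDK 7 install at the path below (the path is an example; substitute your system's java-1.7.0 location):

```shell
# Point the shell (and hence BigBench) at a JDK 7 install.
# The JAVA_HOME path is an assumption; adjust to where JDK 7 lives on your node.
export JAVA_HOME=/usr/lib/jvm/java-1.7.0-openjdk
export PATH="$JAVA_HOME/bin:$PATH"
```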
03-16-2016
05:43 PM
2 Kudos
While running the BigBench benchmark on HDP 2.3.0.0 (deployed via Ambari), the following error occurred during the data generation stage:

JAVA_HOME=/usr/lib/jvm/java-1.7.0-openjdk-1.7.0.51-2.4.5.5.el7.x86_64/jre-abrt/bin/java

/tmp/pdgfLog/1/pdgf.log:

DEBUG main pdgf.generator.BigBenchReviewGenerator - 'Clothing & Accessories_Tops & Tees'
DEBUG main pdgf.generator.BigBenchReviewGenerator - 'Toys & Games_Electronics for Kids'
DEBUG main pdgf.generator.BigBenchReviewGenerator - 'Toys & Games_Vehicles & Remote-Control'
DEBUG main pdgf.core.dataGenerator.scheduler.DefaultPartitioner - Using default Pre-Partitioner from class pdgf.core.dataGenerator.scheduler.TemplatePartitioner
11: <generation>
84: <schema name="default">
85: <tables>
554: <table name="product_reviews">
555: <scheduler name="DefaultScheduler">
556: <partitioner name="pdgf.core.dataGenerator.scheduler.TemplatePartitioner">
DEBUG main pdgf.output.FileOutputSkeleton - def path != null: '"/user/root/benchmarks/bigbench/data_refresh/"+table.getName()+"/"' && !Constants.OUTPUT_FILE_KEEP_OUTPUTDIR:false => ignoring specified <ouputDir> nodes
DEBUG main pdgf.core.dataGenerator.scheduler.DefaultPartitioner - Using default Pre-Partitioner from class pdgf.core.dataGenerator.scheduler.TemplatePartitioner 11: <generation>
59: <scheduler name="DefaultScheduler">
60: <partitioner name="pdgf.core.dataGenerator.scheduler.TemplatePartitioner" staticTableOnAllNodes="false">
DEBUG main pdgf.output.FileOutputSkeleton - def path != null: '"/user/root/benchmarks/bigbench/data_refresh/"+table.getName()+"/"' && !Constants.OUTPUT_FILE_KEEP_OUTPUTDIR:false => ignoring specified <ouputDir> nodes
DEBUG main pdgf.core.dataGenerator.DataGenerator - MemoryAllocatorInterface: add Element: <schema name="bigbench"><table name="store"><field name="s_rec_start_date"><gen name="DateTimeGenerator">
WARN main pdgf.core.dataGenerator.DataGenerator - A 'pdgf.core.exceptions.ConfigurationException Exception occurred during initialization.
Message: The template contains errors: java.lang.RuntimeException: java.io.IOException: invalid constant type: 18
Copy this class in an IDE of your choice to ease debugging:
private class TemplateTester extends pdgf.generator.template.NextValueTemplate {
public void getValue(pdgf.plugin.AbstractPDGFRandom rng,pdgf.core.dataGenerator.beans.FieldValueDTO fvdto, pdgf.core.dataGenerator.beans.GenerationContext gc) throws Exception{
fvdto.setBothValues(generator(0, rng, gc, fvdto) + " " + generator(1, rng, gc, fvdto));
}
}
Location:
14: <schema name="bigbench">
2076: <table name="store">
2170: <field name="s_manager" primary="false" size="40" type="VARCHAR">
2171: <gen_NullGenerator name="NullGenerator" probability="${NULL_CHANCE}">
2172: <gen_TemplateGenerator name="TemplateGenerator">
DebugInformation:
:pdgf.core.exceptions.ConfigurationException: The template contains errors: java.lang.RuntimeException: java.io.IOException: invalid constant type: 18
Copy this class in an IDE of your choice to ease debugging:
private class TemplateTester extends pdgf.generator.template.NextValueTemplate {
public void getValue(pdgf.plugin.AbstractPDGFRandom rng,pdgf.core.dataGenerator.beans.FieldValueDTO fvdto, pdgf.core.dataGenerator.beans.GenerationContext gc) throws Exception{
fvdto.setBothValues(generator(0, rng, gc, fvdto) + " " + generator(1, rng, gc, fvdto));
}
}
at pdgf.generator.template.NextValueTemplate.instance(NextValueTemplate.java:97)
at pdgf.generator.TemplateGenerator.initialize(TemplateGenerator.java:102)
at pdgf.core.dbSchema.Element.initStage8_initialize_(Element.java:514)
at pdgf.core.dbSchema.Element.initStage8_initialize_(Element.java:528)
at pdgf.core.dbSchema.Element.initStage8_initialize_(Element.java:528)
at pdgf.core.dbSchema.Element.initStage8_initialize_(Element.java:528)
at pdgf.core.dbSchema.Element.initStage8_initialize_(Element.java:528)
at pdgf.core.dbSchema.Project.initStage8_initialize_(Project.java:722)
at pdgf.core.dataGenerator.DataGenerator.initRootProject(DataGenerator.java:171)
at pdgf.core.dataGenerator.DataGenerator.initialize(DataGenerator.java:139)
at pdgf.core.dataGenerator.DataGenerator.start(DataGenerator.java:214)
at pdgf.actions.StartAction.execute(StartAction.java:112)
at pdgf.actions.ActionPrioritySortObject.execute(ActionPrioritySortObject.java:50)
at pdgf.Controller.parseCmdLineArgs(Controller.java:1248)
at pdgf.Controller.start(Controller.java:1385)
at pdgf.Controller.main(Controller.java:1226)
03-16-2016
06:34 AM
5 Kudos
Initially I installed the Ambari agent manually and did not uninstall it properly. So while Ambari was deploying the agent, the script pointed to the wrong link:

resource_management -> /usr/lib/ambari-agent/lib/resource_management

instead of:

resource_management -> /usr/lib/ambari-server/lib/resource_management

Solution: go to /usr/lib/python2.6/site-packages/ and create the correct link:

cd /usr/lib/python2.6/site-packages/
ln -s /usr/lib/ambari-server/lib/resource_management

After that, agent deployment works fine.
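The repair can be sketched as below. The demo uses temporary directories in place of the real /usr/lib paths so it is safe to run anywhere; in the actual fix, substitute /usr/lib/python2.6/site-packages for SITE and /usr/lib/ambari-server/lib/resource_management for GOOD.

```shell
# Stand-ins for the real directories (assumption: same layout as in the post).
SITE=$(mktemp -d)    # plays /usr/lib/python2.6/site-packages
GOOD=$(mktemp -d)    # plays /usr/lib/ambari-server/lib/resource_management

# The stale link left behind by the manual ambari-agent install.
ln -s /usr/lib/ambari-agent/lib/resource_management "$SITE/resource_management"

# The fix: remove the stale link and recreate it against ambari-server's copy.
rm -f "$SITE/resource_management"
ln -s "$GOOD" "$SITE/resource_management"
```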
03-16-2016
06:33 AM
2 Kudos
Traceback (most recent call last):
  File "/usr/lib/python2.6/site-packages/ambari_server/bootstrap.py", line 41, in <module>
    from resource_management.core.shell import quote_bash_args
ImportError: No module named resource_management.core.shell
03-11-2016
11:53 AM
1 Kudo
Added the path below to flume-env.sh:

FLUME_CLASSPATH="/home/hadoop/hadoop/share/hadoop/hdfs/"

The HDFS sink is now working fine.
03-09-2016
04:30 PM
3 Kudos
Set FLUME_CLASSPATH=/root/flume/lib/ and copied the common jar files from the Hadoop folder to the Flume folder:

cp /root/hadoop/share/hadoop/common/*.jar /root/flume/lib
cp /root/hadoop/share/hadoop/common/lib/*.jar /root/flume/lib

The above error is now rectified.
03-09-2016
07:22 AM
3 Kudos
org.apache.flume.sink.DefaultSinkFactory.create:42) - Creating instance of sink: hdfs-sink, type: hdfs
09 Mar 2016 02:07:33,594 ERROR [conf-file-poller-0] (org.apache.flume.node.PollingPropertiesFileConfigurationProvider$FileWatcherRunnable.run:145) - Failed to start agent because dependencies were not found in classpath. Error follows.
java.lang.NoClassDefFoundError: org/apache/hadoop/io/SequenceFile$CompressionType
at org.apache.flume.sink.hdfs.HDFSEventSink.configure(HDFSEventSink.java:239)
at org.apache.flume.conf.Configurables.configure(Configurables.java:41)
at org.apache.flume.node.AbstractConfigurationProvider.loadSinks(AbstractConfigurationProvider.java:413)
at org.apache.flume.node.AbstractConfigurationProvider.getConfiguration(AbstractConfigurationProvider.java:98)
at org.apache.flume.node.PollingPropertiesFileConfigurationProvider$FileWatcherRunnable.run(PollingPropertiesFileConfigurationProvider.java:140)
at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:471)
at java.util.concurrent.FutureTask.runAndReset(FutureTask.java:304)
at java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.access$301(ScheduledThreadPoolExecutor.java:178)
at java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.run(ScheduledThreadPoolExecutor.java:293)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
at java.lang.Thread.run(Thread.java:744) Caused by: java.lang.ClassNotFoundException: org.apache.hadoop.io.SequenceFile$CompressionType
at java.net.URLClassLoader$1.run(URLClassLoader.java:366)
at java.net.URLClassLoader$1.run(URLClassLoader.java:355)
at java.security.AccessController.doPrivileged(Native Method)
at java.net.URLClassLoader.findClass(URLClassLoader.java:354)
at java.lang.ClassLoader.loadClass(ClassLoader.java:425)
at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:308)
at java.lang.ClassLoader.loadClass(ClassLoader.java:358)
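The NoClassDefFoundError above means the Hadoop client jars were not on Flume's classpath when the HDFS sink was configured. For context, a minimal sink fragment that exercises this code path might look like the sketch below; the agent/channel names and the HDFS path are illustrative assumptions (only the sink name hdfs-sink comes from the log):

```properties
# Hypothetical agent config; a1, c1, and the path are illustrative.
a1.sinks.hdfs-sink.type = hdfs
a1.sinks.hdfs-sink.hdfs.path = hdfs://namenode:8020/flume/events
a1.sinks.hdfs-sink.hdfs.fileType = DataStream
a1.sinks.hdfs-sink.channel = c1
```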
Labels:
- Apache Flume
- Apache Hadoop