Member since: 04-22-2016
Posts: 931
Kudos Received: 46
Solutions: 26
My Accepted Solutions
| Title | Views | Posted |
|---|---|---|
| | 1498 | 10-11-2018 01:38 AM |
| | 1865 | 09-26-2018 02:24 AM |
| | 1825 | 06-29-2018 02:35 PM |
| | 2415 | 06-29-2018 02:34 PM |
| | 5358 | 06-20-2018 04:30 PM |
09-15-2016
05:07 PM
I have placed the Twitter-related jars in the 'resources' folder, as shown below:
[root@hadoop1 resources]# pwd
/root/SparkStreaming/src/main/resources
[root@hadoop1 resources]# ls
dstream-twitter_2.11-0.1.0-SNAPSHOT.jar target twitter4j-core-4.0.4.zip twitter4j-core-4.0.4.jar twitter4j-stream-4.0.4.jar
[root@hadoop1 resources]# jar tvf dstream-twitter_2.11-0.1.0-SNAPSHOT.jar | grep TwitterUtils
5862 Wed Aug 10 11:11:36 EDT 2016 org/apache/spark/streaming/twitter/TwitterUtils$.class
4749 Wed Aug 10 11:11:34 EDT 2016 org/apache/spark/streaming/twitter/TwitterUtils.class
[root@hadoop1 resources]#
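A side note on sbt project layout (my observation, not from the thread): sbt does not put jars from `src/main/resources` on the compile classpath; that directory is for files bundled into the artifact. Unmanaged jars belong in `lib/` at the project root, roughly:

```
# from the sbt project root (/root/SparkStreaming in this thread)
mkdir -p lib
mv src/main/resources/*.jar lib/
```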
09-15-2016
05:05 PM
Attachments: utilitiesscala.txt, printtweetsscala.txt. My build file is below; I am also attaching the source code.
[root@hadoop1 SparkStreaming]# more build.sbt
name := "SparkStreaming"
version := "1.0"
scalaVersion := "2.11.7"
resolvers += "Akka Repository" at "http://repo.akka.io/releases/"
libraryDependencies ++= Seq("org.apache.spark" % "spark-core_2.10" % "1.2.1","org.apache.spark" % "spark-mllib_2.10" % "1.2.1")
scalacOptions += "-deprecation"
Compilation error:
[root@hadoop1 SparkStreaming]# sbt compile
[info] Set current project to SparkStreaming (in build file:/root/SparkStreaming/)
[info] Compiling 2 Scala sources to /root/SparkStreaming/target/scala-2.11/classes...
[error] /root/SparkStreaming/src/main/scala/PrintTweets.scala:7: object twitter is not a member of package org.apache.spark.streaming
[error] import org.apache.spark.streaming.twitter._
[error] ^
[error] /root/SparkStreaming/src/main/scala/PrintTweets.scala:29: not found: value TwitterUtils
[error] val tweets = TwitterUtils.createStream(ssc, None)
[error] ^
[error] two errors found
[error] (compile:compile) Compilation failed
[error] Total time: 3 s, completed Sep 15, 2016 12:57:15 PM
thanks
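As a hedged sketch of a likely fix (versions and the Bahir artifact are my assumptions, not taken from the thread): the `_2.10` suffix on the Spark artifacts conflicts with `scalaVersion := "2.11.7"`, and neither spark-core nor spark-mllib contains the twitter receiver, so `TwitterUtils` cannot resolve. A build.sbt along these lines should compile:

```
name := "SparkStreaming"

version := "1.0"

scalaVersion := "2.11.7"

// %% appends the Scala binary suffix (_2.11) so it always matches scalaVersion
libraryDependencies ++= Seq(
  "org.apache.spark" %% "spark-core" % "2.0.0",
  "org.apache.spark" %% "spark-streaming" % "2.0.0",
  // the twitter receiver moved out of Spark core into Apache Bahir
  "org.apache.bahir" %% "spark-streaming-twitter" % "2.0.0"
)
```

With this, the local twitter4j/dstream-twitter jars should no longer be needed at all, since sbt resolves them transitively.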
Labels:
- Apache Spark
09-15-2016
02:01 PM
1 Kudo
I need to know the right forum for posting Flume-related questions.
Labels:
- Apache Flume
- Apache Hadoop
09-14-2016
10:05 PM
I want to do the same, but in a cluster environment. How can I create more brokers on different nodes?
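A hedged sketch, assuming Apache Kafka: you run one broker process per node, each with its own copy of server.properties in which only `broker.id` (and the listener address) differ, all pointing at the same ZooKeeper ensemble. Hostnames below are illustrative:

```
# server.properties on node 2 (node 1 would use broker.id=1, and so on)
broker.id=2
listeners=PLAINTEXT://hadoop2.example.com:9092
log.dirs=/kafka-logs
# every broker joins the same cluster via a shared ZooKeeper connect string
zookeeper.connect=hadoop1.example.com:2181,hadoop2.example.com:2181,hadoop3.example.com:2181
```

Then start the broker on each node with `kafka-server-start.sh` against that node's server.properties.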
09-14-2016
03:35 AM
2 Kudos
Gouri, you were right: it was a privileges issue on the Linux /tmp/hive folder; I had been changing the permission of the HDFS /tmp/hive folder instead. I can access beeline now and can connect to the Hive metastore. I have other issues, though, for which I will open a new post. Thanks for your help.
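For anyone hitting the same error: the fix described above is on the local filesystem, not HDFS. A minimal sketch (the 1777 sticky-bit mode is my assumption, following the usual scratch-dir convention):

```shell
# the scratch dir is on the LOCAL filesystem, not HDFS, so plain chmod applies;
# 1777 (world-writable with sticky bit, like /tmp itself) is an assumed convention
mkdir -p /tmp/hive
chmod -R 1777 /tmp/hive
ls -ld /tmp/hive
```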
09-13-2016
08:42 PM
Hi Gouri, HiveServer2 is already running on the Hive node hadoop2. Do you want me to kill the process and run the command you gave with nohup, so it runs in the background?
[root@hadoop2 hive]# ps -ef | grep hiveserver2
hive 8474 1 2 16:33 ? 00:00:09 /usr/lib/jvm/java-1.7.0-openjdk-1.7.0.111.x86_64/bin/java -Xmx1024m -Dhdp.version=2.4.3.0-227 -Djava.net.preferIPv4Stack=true -Dhdp.version=2.4.3.0-227 -Dhadoop.log.dir=/var/log/hadoop/hive -Dhadoop.log.file=hadoop.log -Dhadoop.home.dir=/usr/hdp/2.4.3.0-227/hadoop -Dhadoop.id.str=hive -Dhadoop.root.logger=INFO,console -Djava.library.path=:/usr/hdp/current/hadoop-client/lib/native/Linux-amd64-64:/usr/hdp/2.4.3.0-227/hadoop/lib/native -Dhadoop.policy.file=hadoop-policy.xml -Djava.net.preferIPv4Stack=true -Xmx1024m -XX:MaxPermSize=512m -Xmx4467m -Dhadoop.security.logger=INFO,NullAppender org.apache.hadoop.util.RunJar /usr/hdp/2.4.3.0-227/hive/lib/hive-service-1.2.1000.2.4.3.0-227.jar org.apache.hive.service.server.HiveServer2 --hiveconf hive.aux.jars.path=file:///usr/hdp/current/hive-webhcat/share/hcatalog/hive-hcatalog-core.jar -hiveconf hive.metastore.uris= -hiveconf hive.log.file=hiveserver2.log -hiveconf hive.log.dir=/var/log/hive
root 10843 2790 0 16:40 pts/0 00:00:00 grep hiveserver2
[root@hadoop2 hive]#
09-13-2016
07:46 PM
Attachment: hive.zip. I bounced all the servers and restarted all components; I am attaching the new log files. Can you please check whether you still see the hive user permission issues on /tmp/hive? I am still getting permission denied.
09-13-2016
07:29 PM
I also tried this command; it looks like something is wrong in the database?
[root@hadoop2 java]# metatool -listFSRoot
WARNING: Use "yarn jar" to launch YARN applications.
Initializing HiveMetaTool..
16/09/13 15:24:05 INFO metastore.ObjectStore: ObjectStore, initialize called
16/09/13 15:24:05 INFO DataNucleus.Persistence: Property datanucleus.cache.level2 unknown - will be ignored
16/09/13 15:24:05 INFO DataNucleus.Persistence: Property hive.metastore.integral.jdo.pushdown unknown - will be ignored
16/09/13 15:24:05 ERROR Datastore.Schema: Failed initialising database.
Unable to open a test connection to the given database. JDBC url = jdbc:mysql://hadoop2.tolls.dot.state.fl.us/hive?createDatabaseIfNotExist=true, username = hive. Terminating connection pool (set lazyInit to true if you expect to start your database after your app). Original Exception: ------
java.sql.SQLException: Access denied for user 'hive'@'hadoop2.tolls.dot.state.fl.us' (using password: YES)
at com.mysql.jdbc.SQLError.createSQLException(SQLError.java:1073)
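"Access denied for user" means MySQL itself rejected the hive user's password for that host, i.e. the metastore password in the JDBC config and the grant in MySQL disagree. Two hedged checks (hostname copied from the error above; the password is a placeholder for whatever your site uses):

```
# 1. try the exact credentials the metastore is using
mysql -u hive -p -h hadoop2.tolls.dot.state.fl.us hive

# 2. if that fails, re-grant as the MySQL root user (password is a placeholder)
mysql -u root -p -e "GRANT ALL PRIVILEGES ON hive.* TO 'hive'@'hadoop2.tolls.dot.state.fl.us' IDENTIFIED BY 'hive_password'; FLUSH PRIVILEGES;"
```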
09-13-2016
07:28 PM
I added the permissions for the hive user on /tmp/hive; still no luck:
[hive@hadoop2 ~]$ id
uid=502(hive) gid=502(hadoop) groups=502(hadoop)
[hive@hadoop2 ~]$
[hive@hadoop2 ~]$ uname -a > a.a
[hive@hadoop2 ~]$ hdfs dfs -copyFromLocal a.a /tmp/hive/b.b
[hive@hadoop2 ~]$
09-13-2016
06:44 PM
Attachment: hive.zip. There are many files in this folder; I am uploading them all.