Member since: 04-22-2016
Posts: 931
Kudos Received: 46
Solutions: 26
11-10-2016
04:25 AM
The error was fixed by moving the following files into the respective directories under /root/openCV/opencv/samples/java/sbt/src/main/java:
[root@hadoop1 java]#
[root@hadoop1 java]# ls
build.sbt DetectFaceDemo.java.orig HelloOpenCV.java lib project target
[root@hadoop1 java]#
[root@hadoop1 java]# ls lib
libopencv_java249.so opencv-249.jar
[root@hadoop1 java]#
[root@hadoop1 java]# cd ..
[root@hadoop1 main]# pwd
/root/openCV/opencv/samples/java/sbt/src/main
[root@hadoop1 main]# ls
java origscala resources
[root@hadoop1 main]# ls resources
AverageMaleFace.jpg img1.png img2.png lbpcascade_frontalface.xml lena.png
[root@hadoop1 main]#
11-10-2016
02:49 AM
I am following this tutorial and also copied the library file it mentions, but I still get the same errors: http://docs.opencv.org/2.4/doc/tutorials/introduction/desktop_java/java_dev_intro.html
The resources directory is /root/openCV/opencv/samples/java/sbt/src/main/resources
[root@hadoop1 resources]# ls
AverageMaleFace.jpg img1.png img2.png lbpcascade_frontalface.xml
[root@hadoop1 resources]#
11-10-2016
02:10 AM
I can successfully compile the program below, but when I run it I get errors:
[root@hadoop1 java]# more HelloOpenCV.java
import org.opencv.core.Core;
import org.opencv.core.Mat;
import org.opencv.core.MatOfRect;
import org.opencv.core.Point;
import org.opencv.core.Rect;
import org.opencv.core.Scalar;
import org.opencv.highgui.Highgui;
import org.opencv.objdetect.CascadeClassifier;
//
// Detects faces in an image, draws boxes around them, and writes the results
// to "faceDetection.png".
//
class DetectFaceDemo {
public void run() {
System.out.println("\nRunning DetectFaceDemo");
// Create a face detector from the cascade file in the resources
// directory.
CascadeClassifier faceDetector = new CascadeClassifier(getClass().getResource("/lbpcascade_frontalface.xml").getPath());
Mat image = Highgui.imread(getClass().getResource("/lena.png").getPath());
// Detect faces in the image.
// MatOfRect is a special container class for Rect.
MatOfRect faceDetections = new MatOfRect();
faceDetector.detectMultiScale(image, faceDetections);
System.out.println(String.format("Detected %s faces", faceDetections.toArray().length));
// Draw a bounding box around each face.
for (Rect rect : faceDetections.toArray()) {
Core.rectangle(image, new Point(rect.x, rect.y), new Point(rect.x + rect.width, rect.y + rect.height), new Scalar(0, 255, 0));
}
// Save the visualized detection.
String filename = "faceDetection.png";
System.out.println(String.format("Writing %s", filename));
Highgui.imwrite(filename, image);
}
}
public class HelloOpenCV {
public static void main(String[] args) {
System.out.println("Hello, OpenCV");
// Load the native library.
System.loadLibrary(Core.NATIVE_LIBRARY_NAME);
new DetectFaceDemo().run();
}
}
The build file is shown below. "-Djava.library.path" points to the local "./lib" folder, where the opencv-249.jar file is located:
[root@hadoop1 java]# more build.sbt
name := "DetectFaceDemo"
scalaVersion := "2.11.6"
scalacOptions ++= Seq(
"-unchecked",
"-deprecation",
"-optimize",
"-Xlint"
)
javaOptions in run += "-Djava.library.path=./lib"
lazy val root = (project in file("."))
fork := true
[root@hadoop1 java]# ls ./lib
opencv-249.jar
[root@hadoop1 java]#
"sbt package" successfully creates a jar file, but when I run it using sbt I get the following error: it complains about a library "opencv_java249", whereas the file produced by compilation is "opencv-249.jar".
[root@hadoop1 java]# pwd
/root/openCV/opencv/samples/java/sbt/src/main/java
[root@hadoop1 java]#
[root@hadoop1 java]# ls
build.sbt DetectFaceDemo.java.orig HelloOpenCV.java lib project target
[root@hadoop1 java]#
[root@hadoop1 java]# sbt run
[info] Set current project to DetectFaceDemo (in build file:/root/openCV/opencv/samples/java/sbt/src/main/java/)
[info] Compiling 1 Java source to /root/openCV/opencv/samples/java/sbt/src/main/java/target/scala-2.11/classes...
[info] Running HelloOpenCV
[info] Hello, OpenCV
[error] Exception in thread "main" java.lang.UnsatisfiedLinkError: no opencv_java249 in java.library.path
[error] at java.lang.ClassLoader.loadLibrary(ClassLoader.java:1889)
[error] at java.lang.Runtime.loadLibrary0(Runtime.java:849)
[error] at java.lang.System.loadLibrary(System.java:1088)
[error] at HelloOpenCV.main(HelloOpenCV.java:47)
java.lang.RuntimeException: Nonzero exit code returned from runner: 1
at scala.sys.package$.error(package.scala:27)
[trace] Stack trace suppressed: run last compile:run for the full output.
[error] (compile:run) Nonzero exit code returned from runner: 1
[error] Total time: 1 s, completed Nov 9, 2016 9:04:45 PM
On the other hand, if I try to run it via java directly, I get a different error:
[root@hadoop1 scala-2.11]# java -cp detectfacedemo_2.11-0.1-SNAPSHOT.jar HelloOpenCV
Hello, OpenCV
Exception in thread "main" java.lang.NoClassDefFoundError: org/opencv/core/Core
at HelloOpenCV.main(HelloOpenCV.java:47)
Caused by: java.lang.ClassNotFoundException: org.opencv.core.Core
at java.net.URLClassLoader$1.run(URLClassLoader.java:366)
at java.net.URLClassLoader$1.run(URLClassLoader.java:355)
at java.security.AccessController.doPrivileged(Native Method)
at java.net.URLClassLoader.findClass(URLClassLoader.java:354)
at java.lang.ClassLoader.loadClass(ClassLoader.java:425)
at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:308)
at java.lang.ClassLoader.loadClass(ClassLoader.java:358)
... 1 more
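For what it's worth, the two names are not supposed to match: System.loadLibrary(Core.NATIVE_LIBRARY_NAME) asks for the base name of the *native* shared library ("opencv_java249"), not the jar. On Linux the JVM maps that base name to lib<name>.so, so it searches java.library.path for libopencv_java249.so, which is a separate artifact from opencv-249.jar. A small Python sketch of the mapping (the helper name is made up for illustration; it mimics Java's System.mapLibraryName):

```python
def native_library_filename(base, platform="linux"):
    """Illustrates how the JVM maps a base library name passed to
    System.loadLibrary to a platform-specific filename."""
    if platform == "linux":
        return "lib{}.so".format(base)
    if platform == "mac":
        return "lib{}.dylib".format(base)
    if platform == "windows":
        return "{}.dll".format(base)
    raise ValueError("unknown platform: " + platform)

# The JVM therefore searches java.library.path for:
print(native_library_filename("opencv_java249"))  # libopencv_java249.so
```

So two separate things are needed: libopencv_java249.so must be in the directory named by -Djava.library.path (for the UnsatisfiedLinkError), and opencv-249.jar must be on the classpath (for the NoClassDefFoundError when running via plain java).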
Labels:
- Security
11-03-2016
05:37 PM
1 Kudo
This post fixed the issue: http://stackoverflow.com/questions/19189979/cannot-run-flume-because-of-jar-conflict
11-03-2016
04:19 PM
I removed twitter4j 4.0.4 and put in twitter4j 3.0.3, but now I am getting another error:
16/11/03 12:17:47 INFO twitter.TwitterSource: Twitter source Twitter started.
16/11/03 12:17:47 INFO twitter4j.TwitterStreamImpl: Establishing connection.
16/11/03 12:17:47 INFO twitter4j.TwitterStreamImpl: 404:The URI requested is invalid or the resource requested, such as a user, does not exist.
11-03-2016
03:01 PM
1 Kudo
I have used this flume-ng command dozens of times, but now it is failing with the error below.
16/11/03 10:58:46 INFO twitter.TwitterSource: Access Token Secret: 'xxxxxxxxxxxxxxxxxxxxxxxxxx'
16/11/03 10:58:46 ERROR node.AbstractConfigurationProvider: Source Twitter has been removed due to an error during configuration
java.lang.IllegalStateException: consumer key/secret pair already set.
The command is:
flume-ng agent --conf ./conf/ -f conf/twitter-to-hdfs.properties --name TwitterAgent -Dflume.root.logger=WARN,console -Dtwitter4j.http.proxyHost=dotatofwproxy.tolls.dot.state.fl.us -Dtwitter4j.http.proxyPort=8080
Labels:
- Apache Flume
- Apache Hadoop
11-03-2016
02:12 PM
I am confused. Shouldn't the twitter data be the same for everyone? I am looking at the links you have mentioned here, and the twitter data is different everywhere. Looking at my data above, please advise whether it is the right twitter record, and if not, why I am getting this format.
11-03-2016
01:58 PM
I followed the steps but am getting all NULLs. I compiled the SerDe and copied the json-serde-1.3.8-SNAPSHOT.jar file to the $FLUME_HOME/lib folder.
hive> CREATE EXTERNAL TABLE tweetdata3 (
> id string,
> person struct<email:string, first_name:string, last_name:string, location:struct<address:string, city:string, state:string, zipcode:string>, text:string, url:string>)
> ROW FORMAT SERDE 'org.apache.hive.hcatalog.data.JsonSerDe'
> LOCATION '/user/flume/tweets';
OK
Time taken: 0.197 seconds
hive> desc tweetdata3;
OK
id string from deserializer
person struct<email:string,first_name:string,last_name:string,location:struct<address:string,city:string,state:string,zipcode:string>,text:string,url:string> from deserializer
Time taken: 0.266 seconds, Fetched: 2 row(s)
hive>
> SELECT id, person.first_name, person.last_name, person.email,
> person.location.address, person.location.city, person.location.state,
> person.location.zipcode, person.text, person.url
> FROM tweetdata3 LIMIT 5;
OK
790657073453424645 NULL NULL NULL NULL NULL NULL NULL NULL NULL
790657073453424645 NULL NULL NULL NULL NULL NULL NULL NULL NULL
Time taken: 0.282 seconds, Fetched: 2 row(s)
hive> SELECT id, person.first_name, person.last_name, person.email,
> person.location.address, person.location.city, person.location.state,
> person.location.zipcode, person.text, person.url
> FROM tweetdata3 LIMIT 5;
OK
790657073453424645 NULL NULL NULL NULL NULL NULL NULL NULL NULL
790657073453424645 NULL NULL NULL NULL NULL NULL NULL NULL NULL
Time taken: 0.063 seconds, Fetched: 2 row(s)
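One likely cause of the all-NULL rows is a key mismatch: the SerDe returns NULL for any column whose name has no matching key in the JSON, and the records Flume wrote (see the sample in my 11-02 post) use top-level keys like user_name and text, not a person struct with first_name/email/location. A quick hedged Python check (the sample record is trimmed from that post; column names are from the tweetdata3 DDL above):

```python
import json

# A trimmed version of one record that Flume wrote.
record = ('{"id": "790657073453424645",'
          ' "user_name": {"string": "Freihandelsabkommen"},'
          ' "text": {"string": "RT ..."}}')

table_columns = {"id", "person"}     # columns declared on tweetdata3
json_keys = set(json.loads(record))  # keys actually present in the data

# Only "id" overlaps, so every other column deserializes to NULL.
print(sorted(table_columns & json_keys))
```

If the overlap is only "id", the fix is to declare columns matching the actual keys (user_name, text, retweet_count, ...) rather than an assumed person struct.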
11-02-2016
09:14 PM
1 Kudo
I want to load a JSON record into Hive:
{"id":"790657073453424645","user_friends_count":{"int":121},"user_location":{"string":"Europa"},"user_description":{"string":"MITMACHEN \r\n\r\nIm Kampf gegen die EntDemokratisierung durch Freihandelsabkommen! \r\n\r\nStop TTIP - Stop TAFTA\r\n\r\nThe Fight against USA TTIP !"},"user_statuses_count":{"int":7561},"user_followers_count":{"int":1380},"user_name":{"string":"Freihandelsabkommen"},"user_screen_name":{"string":"Stop_TTIP"},"created_at":{"string":"2016-10-24T16:51:50Z"},"text":{"string":"RT @alikonkret: Da wurde die Meinungsmache im Kommentar versteckt um die scheinbare Neutralität zu wahren. #CETA #Wallonia https://t.co/ViA…"},"retweet_count":{"long":0},"retweeted":{"boolean":true},"in_reply_to_user_id":{"long":-1},"source":{"string":"<a href=\"http://www.tweetcaster.com\" rel=\"nofollow\">TweetCaster for Android</a>"},"in_reply_to_status_id":{"long":-1},"media_url_https":null,"expanded_url":null}
I only have the skeleton command:
CREATE EXTERNAL TABLE tweetdata3(
) ROW FORMAT DELIMITED Fields terminated by ',' STORED as textfile location '/user/flume/tweets';
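Note that this record is Avro-style JSON: each nullable field is wrapped in a one-key object naming its type, e.g. {"int":121} or {"string":"Europa"}, which is why a plain DELIMITED table will not parse it cleanly. Before settling on a Hive schema, it can help to unwrap those type tags and look at the flat values. A minimal Python sketch, assuming the wrapper keys are the usual Avro primitive type names (unwrap is a made-up helper for illustration):

```python
import json

AVRO_TYPES = {"int", "long", "string", "boolean", "float", "double"}

def unwrap(value):
    """Unwrap Avro-union JSON values: {"int": 121} -> 121.
    Anything else (plain values, null) is returned unchanged."""
    if isinstance(value, dict) and len(value) == 1:
        (tag, inner), = value.items()
        if tag in AVRO_TYPES:
            return inner
    return value

record = json.loads('{"id":"790657073453424645",'
                    '"user_friends_count":{"int":121},'
                    '"user_location":{"string":"Europa"},'
                    '"media_url_https":null}')
flat = {key: unwrap(val) for key, val in record.items()}
print(flat["user_friends_count"])  # 121
print(flat["user_location"])       # Europa
```

The flattened keys and their value types (string, int, long, boolean) then map directly onto the column list of the CREATE EXTERNAL TABLE statement.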
Labels:
- Apache Hadoop
- Apache Hive
10-27-2016
05:22 PM
As you can see, I can't read it with the json module:
[hdfs@hadoop1 ~]$ more a.py
#!/usr/bin/env python
import json
with open('FlumeData.1477426267073') as f:
data = f.read()
jsondata = json.loads(data)
print jsondata
[hdfs@hadoop1 ~]$ python a.py
Traceback (most recent call last):
File "a.py", line 7, in <module>
jsondata = json.loads(data)
File "/usr/lib64/python2.6/json/__init__.py", line 307, in loads
return _default_decoder.decode(s)
File "/usr/lib64/python2.6/json/decoder.py", line 319, in decode
obj, end = self.raw_decode(s, idx=_w(s, 0).end())
File "/usr/lib64/python2.6/json/decoder.py", line 338, in raw_decode
raise ValueError("No JSON object could be decoded")
ValueError: No JSON object could be decoded
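A single json.loads over the whole file fails if the file is not exactly one JSON document, e.g. if Flume wrote one JSON event per line, or if the file is an Avro container (binary header) rather than plain text. A hedged sketch that parses one event per line and counts lines that don't parse, so you can see which case you're in (the sample lines are made up; run it against FlumeData.1477426267073 instead):

```python
import json

def load_events(lines):
    """Parse one JSON event per line; count lines that are not valid JSON.
    If every line fails, the file is probably not line-delimited JSON at
    all (e.g. an Avro container file)."""
    events, bad = [], 0
    for line in lines:
        line = line.strip()
        if not line:
            continue
        try:
            events.append(json.loads(line))
        except ValueError:
            bad += 1
    return events, bad

sample = ['{"id": "1"}', 'not json at all', '{"id": "2"}']
events, bad = load_events(sample)
print(len(events), bad)  # 2 1
```

With a real file you would use `with open('FlumeData.1477426267073') as f: load_events(f)`; a high bad count on every line suggests checking the Flume sink serializer (text vs. avro_event) rather than the Python side.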