Member since: 04-22-2016
Posts: 931
Kudos Received: 46
Solutions: 26
My Accepted Solutions
Title | Views | Posted
---|---|---
 | 1490 | 10-11-2018 01:38 AM
 | 1859 | 09-26-2018 02:24 AM
 | 1819 | 06-29-2018 02:35 PM
 | 2410 | 06-29-2018 02:34 PM
 | 5347 | 06-20-2018 04:30 PM
11-16-2016
05:44 PM
Yes, you were right. Somehow Java was upgraded to a new release (1.7.0_121). I fixed the issue by creating a symbolic link, as follows:

[root@hadoop1 jvm]# pwd
/usr/lib/jvm
[root@hadoop1 jvm]# ln -s /usr/lib/jvm/java-1.7.0-openjdk-1.7.0.121.x86_64 /usr/lib/jvm/java-1.7.0-openjdk-1.7.0.111.x86_64
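The same fix can be scripted. A minimal Python sketch, using the JVM paths from the transcript above (adjust them to the versions actually on your box), that recreates the old path as a symlink to the new install only when it is not already linked:

```python
import os

# Paths taken from the post; adjust to the JVM versions on your machine.
NEW_JVM = "/usr/lib/jvm/java-1.7.0-openjdk-1.7.0.121.x86_64"
OLD_JVM = "/usr/lib/jvm/java-1.7.0-openjdk-1.7.0.111.x86_64"

def link_old_jvm(new_path, old_path):
    """Point the old JVM path at the new install via a symlink.

    Equivalent of: ln -s new_path old_path
    Returns the path the old name resolves to.
    """
    if os.path.islink(old_path):
        return os.readlink(old_path)  # already linked; report current target
    os.symlink(new_path, old_path)
    return new_path
```

Running it twice is safe: the second call just reports the existing link target instead of failing.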
11-10-2016
06:42 AM
@Sami Ahmad By default the current directory (where the *.class file is) is already on the classpath, so you do not need to worry about it. The only change you need in your case is one of the following:

1. Make sure that "HelloOpenCV.class" and "lbpcascade_frontalface.xml" are colocated, and use this code:

CascadeClassifier faceDetector = new CascadeClassifier(getClass().getResource("lbpcascade_frontalface.xml").getPath());

OR

2. Create a JAR in which both "HelloOpenCV.class" and "lbpcascade_frontalface.xml" are present; the same code then works unchanged.
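The fix above is Java-specific: getResource locates the XML relative to the class rather than the current working directory. For illustration only, here is the same colocation idea sketched in Python, with a hypothetical helper and the resource name from the post; it builds a path relative to a given script file instead of relying on where the program was launched from:

```python
import os

def resource_path(base_file, name):
    """Return the path of a file that sits next to base_file,
    independent of the current working directory."""
    return os.path.join(os.path.dirname(os.path.abspath(base_file)), name)

# Hypothetical usage, mirroring the colocated-XML setup in the Java fix:
# cascade_path = resource_path(__file__, "lbpcascade_frontalface.xml")
```

The design point is the same in both languages: resolve data files relative to the code that owns them, not relative to wherever the process happens to be started.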
11-10-2016
04:25 AM
The error was fixed by moving the following files into the respective directories: /root/openCV/opencv/samples/java/sbt/src/main/java
[root@hadoop1 java]#
[root@hadoop1 java]# ls
build.sbt DetectFaceDemo.java.orig HelloOpenCV.java lib project target
[root@hadoop1 java]#
[root@hadoop1 java]# ls lib
libopencv_java249.so opencv-249.jar
[root@hadoop1 java]#
[root@hadoop1 java]# cd ..
[root@hadoop1 main]# pwd
/root/openCV/opencv/samples/java/sbt/src/main
[root@hadoop1 main]# ls
java origscala resources
[root@hadoop1 main]# ls resources
AverageMaleFace.jpg img1.png img2.png lbpcascade_frontalface.xml lena.png
[root@hadoop1 main]#
11-03-2016
05:37 PM
1 Kudo
This post fixed the issue: http://stackoverflow.com/questions/19189979/cannot-run-flume-because-of-jar-conflict
01-14-2019
03:30 PM
1 Kudo
Check out this article on mapping JSON to Hive columns: https://medium.com/datadriveninvestor/analyzing-twitter-feeds-using-hive-7e074025f295
10-27-2016
05:22 PM
As you can see, I can't read it as JSON:

[hdfs@hadoop1 ~]$ more a.py
#!/usr/bin/env python
import json
with open('FlumeData.1477426267073') as f:
    data = f.read()
jsondata = json.loads(data)
print jsondata
[hdfs@hadoop1 ~]$ python a.py
Traceback (most recent call last):
File "a.py", line 7, in <module>
jsondata = json.loads(data)
File "/usr/lib64/python2.6/json/__init__.py", line 307, in loads
return _default_decoder.decode(s)
File "/usr/lib64/python2.6/json/decoder.py", line 319, in decode
obj, end = self.raw_decode(s, idx=_w(s, 0).end())
File "/usr/lib64/python2.6/json/decoder.py", line 338, in raw_decode
raise ValueError("No JSON object could be decoded")
ValueError: No JSON object could be decoded
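The traceback suggests the file does not contain a single JSON document. Flume typically writes one event (one JSON object) per line, and calling json.loads on the whole file then fails exactly like this. A minimal sketch, assuming one JSON object per line (the post used Python 2.6; this is written for modern Python):

```python
import json

def read_json_lines(path):
    """Parse a file with one JSON object per line (JSON Lines),
    skipping blank lines, instead of json.loads on the whole file."""
    records = []
    with open(path) as f:
        for line in f:
            line = line.strip()
            if line:
                records.append(json.loads(line))
    return records
```

If any single line still fails to parse, the data is not line-delimited JSON and needs inspection (e.g. events split across lines or non-JSON headers mixed in).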
01-29-2018
09:37 AM
@Sami Ahmad did you finally find a solution? A year later, I'm still getting the same error.
12-05-2016
02:24 PM
If you are trying to use https://github.com/hortonworks/hive-json, these two files have version 2.2.2 hardcoded:

bin/shred-json: gsonVersion = "2.2.2"
bin/find-json-schema: gsonVersion = "2.2.2"

Just change them to 2.6.2 and run mvn again.
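That edit can also be scripted. A hedged sketch that rewrites a hardcoded gsonVersion assignment in a script's text, using the version strings from the post (the file names and exact assignment format are as shown there; verify them in your checkout):

```python
import re

def bump_gson_version(text, new_version="2.6.2"):
    """Rewrite any hardcoded gsonVersion = "x.y.z" assignment
    to the given version, tolerating whitespace variations."""
    return re.sub(r'(gsonVersion\s*=\s*")[\d.]+(")',
                  r'\g<1>%s\g<2>' % new_version,
                  text)
```

Apply it to the contents of bin/shred-json and bin/find-json-schema, write the files back, then rerun mvn.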
10-12-2016
09:59 PM
2 Kudos
I fixed this error by using another SerDe:
"org.openx.data.jsonserde.JsonSerDe"
09-29-2016
06:20 PM
@Sami Ahmad glad it's working! That's interesting. And thanks for clarifying; I wasn't sure if the environment variable was set in the spark2 directory (vs. spark1).