Member since: 04-22-2016
Posts: 931
Kudos Received: 46
Solutions: 26
My Accepted Solutions
Views | Posted
---|---
1498 | 10-11-2018 01:38 AM
1868 | 09-26-2018 02:24 AM
1826 | 06-29-2018 02:35 PM
2418 | 06-29-2018 02:34 PM
5366 | 06-20-2018 04:30 PM
11-17-2016
10:11 PM
I really need to get Atlas up and running. Can anyone please help?
11-17-2016
04:08 PM
application.zip — I added both files and bounced the Atlas server, but I am still seeing the same errors. Uploading the new application.log file. There are no errors in the application log, and the strange thing is that now it is not reading the users-credentials file:
[root@hadoop1 atlas]# grep Error application.log
[root@hadoop1 atlas]# grep policy-store.txt application.log
2016-11-17 10:58:32,134 INFO - [main:] ~ reading the file/etc/atlas/conf/policy-store.txt (FileReaderUtil:40)
[root@hadoop1 atlas]# grep users-credentials.properteies application.log
[root@hadoop1 atlas]#
[root@hadoop1 conf]# pwd
/etc/atlas/conf
[root@hadoop1 conf]# ls -ltr
total 32
-rw-r--r-- 1 root root 1336 Nov 16 11:18 application.properties
-rwxr-xr-x 1 root root 1265 Nov 16 11:18 client.properties
drwxr-xr-x 3 atlas hadoop 4096 Nov 16 11:18 solr
-rw-r--r-- 1 atlas hadoop 3259 Nov 16 11:54 atlas-log4j.xml
-rwxr-xr-x 1 atlas hadoop 1611 Nov 16 12:24 atlas-env.sh
-rw-r--r-- 1 atlas hadoop 325 Nov 17 10:33 policy-store.txt
-rw-r--r-- 1 atlas hadoop 81 Nov 17 10:34 users-credentials.properties
-rw-r--r-- 1 atlas hadoop 3247 Nov 17 10:58 atlas-application.properties
[root@hadoop1 conf]#
[root@hadoop1 conf]# more policy-store.txt
adminPolicy;;admin:rwud;;ROLE_ADMIN:rwud;;type:*,entity:*,operation:*,taxonomy:*,term:*
userReadPolicy;;readUser1:r,readUser2:r;;DATA_SCIENTIST:r;;type:*,entity:*,operation:*,taxonomy:*,term:*
userWritePolicy;;writeUser1:rwu,writeUser2:rwu;;BUSINESS_GROUP:rwu,DATA_STEWARD:rwud;;type:*,entity:*,operation:*,taxonomy:*,term:*
[root@hadoop1 conf]#
[root@hadoop1 conf]# more users-credentials.properties
admin=ADMIN::8c6976e5b5410415bde908bd4dee15dfb167a9c873fc4bb8a81f6f2ab448a918 -
[root@hadoop1 conf]#
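One note on the credentials format: the line above looks like user=GROUP::sha256-of-password, and the hash shown matches the SHA-256 of "admin". A minimal standalone sketch (class name hypothetical) to check a candidate password against that hash:

```java
import java.nio.charset.StandardCharsets;
import java.security.MessageDigest;

// Hash a candidate password and compare it with the value stored in
// users-credentials.properties. "admin" is an assumption based on the
// hash seen above.
public class HashCheck {
    public static void main(String[] args) throws Exception {
        MessageDigest md = MessageDigest.getInstance("SHA-256");
        byte[] digest = md.digest("admin".getBytes(StandardCharsets.UTF_8));
        StringBuilder hex = new StringBuilder();
        for (byte b : digest) {
            hex.append(String.format("%02x", b & 0xff));
        }
        // Expected: 8c6976e5b5410415bde908bd4dee15dfb167a9c873fc4bb8a81f6f2ab448a918
        System.out.println(hex);
    }
}
```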
11-17-2016
04:00 AM
application.zip — yes, it is listening and the process is up; attaching the log file.
[root@hadoop1 atlas]# lsof -i :21000
COMMAND PID USER FD TYPE DEVICE SIZE/OFF NODE NAME
java 13343 atlas 251u IPv4 251676 0t0 TCP *:irtrans (LISTEN)
[root@hadoop1 atlas]#
[root@hadoop1 atlas]# ps -ef | grep 13343
atlas 13343 1 0 13:37 ? 00:00:49 /usr/lib/jvm/java-1.7.0-openjdk-1.7.0.111.x86_64/bin/java -Datlas.log.dir=/var/log/atlas -Datlas.log.file=application.log -Datlas.home=/usr/hdp/2.5.0.0-1245/atlas -Datlas.conf=/etc/atlas/conf -Xms2048m -Xmx2048m -XX:MaxNewSize=600m -XX:MaxPermSize=512m -server -XX:SoftRefLRUPolicyMSPerMB=0 -XX:+CMSClassUnloadingEnabled -XX:+UseConcMarkSweepGC -XX:+CMSParallelRemarkEnabled -XX:+PrintTenuringDistribution -XX:+HeapDumpOnOutOfMemoryError -XX:HeapDumpPath=/var/log/atlas/atlas_server.hprof -Xloggc:-worker.log -verbose:gc -XX:+UseGCLogFileRotation -XX:NumberOfGCLogFiles=10 -XX:GCLogFileSize=1m -XX:+PrintGCDetails -XX:+PrintHeapAtGC -XX:+PrintGCTimeStamps -Dlog4j.configuration=atlas-log4j.xml -classpath /etc/atlas/conf:/usr/hdp/current/atlas-server/server/webapp/atlas/WEB-INF/classes:/usr/hdp/current/atlas-server/server/webapp/atlas/WEB-INF/lib/atlas-titan-0.7.0.2.5.0.0-1245.jar:/usr/hdp/current/atlas-server/server/webapp/atlas/WEB-INF/lib/*:/usr/hdp/2.5.0.0-1245/atlas/libext/*:/etc/hbase/conf org.apache.atlas.Atlas -app /usr/hdp/current/atlas-server/server/webapp/atlas
root 23419 22082 0 22:58 pts/0 00:00:00 grep 13343
[root@hadoop1 atlas]#
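Since the process is listening, the next thing worth checking is whether the admin status endpoint (the one being polled in the logs) answers at all. A minimal sketch using plain java.net, nothing Atlas-specific; host and port are taken from the lsof output above:

```java
import java.io.BufferedReader;
import java.io.InputStream;
import java.io.InputStreamReader;
import java.net.HttpURLConnection;
import java.net.URL;

// Probe the Atlas admin status endpoint and print the HTTP code and body.
public class AtlasStatusCheck {
    public static void main(String[] args) throws Exception {
        URL url = new URL("http://hadoop1:21000/api/atlas/admin/status");
        HttpURLConnection conn = (HttpURLConnection) url.openConnection();
        conn.setConnectTimeout(5000);
        conn.setReadTimeout(5000);
        int code = conn.getResponseCode();
        System.out.println("HTTP " + code);
        // For 4xx/5xx responses the body comes from the error stream instead.
        InputStream body = code < 400 ? conn.getInputStream() : conn.getErrorStream();
        if (body != null) {
            try (BufferedReader in = new BufferedReader(new InputStreamReader(body))) {
                String line;
                while ((line = in.readLine()) != null) {
                    System.out.println(line);
                }
            }
        }
    }
}
```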
11-16-2016
06:06 PM
atlas-install-screenshot.jpg — please see the attached picture showing the successful installation of Atlas. Also, curl is returning a 504 error:
[root@hadoop1 atlas]# curl -sL -w "%{http_code} %{url_effective}\\n" "http://hadoop1:21000" -o /dev/null
504 http://hadoop1:21000/
[root@hadoop1 atlas]#
11-16-2016
05:53 PM
Even though the HDP 2.5 console shows the Atlas server and clients as started with no errors, I can't reach the Atlas URL hadoop1:21000. I am seeing the following in /var/log/atlas/application.log and /var/log/atlas/audit.log:
[root@hadoop1 atlas]# pwd
/var/log/atlas
[root@hadoop1 atlas]# tail -100f application.log
2016-11-16 12:35:06,875 INFO - [qtp964346941-15 - 7e9b2f05-70b3-49eb-b782-fb602a194250:] ~ Audit: UNKNOWN/127.0.0.1-127.0.0.1 performed request GET http://localhost:21000/api/atlas/admin/status (127.0.0.1) at time 2016-11-16T17:35Z (AUDIT:104)
2016-11-16 12:36:06,832 INFO - [qtp964346941-17 - d8126dcf-bafe-4ea5-8169-853fb70b89dd:] ~ Audit: UNKNOWN/127.0.0.1 performed request GET http://localhost:21000/api/atlas/admin/status (127.0.0.1) at time 2016-11-16T17:36Z (AuditFilter:91)
2016-11-16 12:36:06,832 INFO - [qtp964346941-17 - d8126dcf-bafe-4ea5-8169-853fb70b89dd:] ~ Audit: UNKNOWN/127.0.0.1-127.0.0.1 performed request GET http://localhost:21000/api/atlas/admin/status (127.0.0.1) at time 2016-11-16T17:36Z (AUDIT:104)
2016-11-16 12:37:06,911 INFO - [qtp964346941-16 - e51663cb-3132-4bfd-abc7-28c941c1ae29:] ~ Audit: UNKNOWN/127.0.0.1 performed request GET http://localhost:21000/api/atlas/admin/status (127.0.0.1) at time 2016-11-16T17:37Z (AuditFilter:91)
2016-11-16 12:37:06,911 INFO - [qtp964346941-16 - e51663cb-3132-4bfd-abc7-28c941c1ae29:] ~ Audit: UNKNOWN/127.0.0.1-127.0.0.1 performed request GET http://localhost:21000/api/atlas/admin/status (127.0.0.1) at time 2016-11-16T17:37Z (AUDIT:104)
tail -100f audit.log
2016-11-16 12:37:06,911 Audit: UNKNOWN/127.0.0.1-127.0.0.1 performed request GET http://localhost:21000/api/atlas/admin/status (127.0.0.1) at time 2016-11-16T17:37Z
2016-11-16 12:38:06,913 Audit: UNKNOWN/127.0.0.1-127.0.0.1 performed request GET http://localhost:21000/api/atlas/admin/status (127.0.0.1) at time 2016-11-16T17:38Z
11-16-2016
05:44 PM
Yes, you were right. Somehow Java was upgraded to a new release (121). I fixed the issue by creating a symbolic link as follows:
[root@hadoop1 jvm]# pwd
/usr/lib/jvm
[root@hadoop1 jvm]# ln -s /usr/lib/jvm/java-1.7.0-openjdk-1.7.0.121.x86_64 /usr/lib/jvm/java-1.7.0-openjdk-1.7.0.111.x86_64
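If it helps anyone verifying the same fix, a small sketch (class name hypothetical) that checks the exact path Ambari's hook complained about and reports which JVM is actually running:

```java
import java.io.File;

// Confirm the path from the Ambari error resolves to a usable java binary
// after the symlink, and print the JVM that executes this check.
public class JavaHomeCheck {
    public static void main(String[] args) {
        File javaExec = new File(
            "/usr/lib/jvm/java-1.7.0-openjdk-1.7.0.111.x86_64/bin/java");
        System.out.println(javaExec.getAbsolutePath()
            + " exists=" + javaExec.exists()
            + " executable=" + javaExec.canExecute());
        System.out.println("Running JVM: " + System.getProperty("java.home")
            + " (" + System.getProperty("java.version") + ")");
    }
}
```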
11-15-2016
10:04 PM
I am getting the following error, even though Java is installed on every node:
Traceback (most recent call last):
File "/var/lib/ambari-agent/cache/stacks/HDP/2.0.6/hooks/before-ANY/scripts/hook.py", line 35, in <module>
BeforeAnyHook().execute()
File "/usr/lib/python2.6/site-packages/resource_management/libraries/script/script.py", line 280, in execute
method(env)
File "/var/lib/ambari-agent/cache/stacks/HDP/2.0.6/hooks/before-ANY/scripts/hook.py", line 32, in hook
setup_java()
File "/var/lib/ambari-agent/cache/stacks/HDP/2.0.6/hooks/before-ANY/scripts/shared_initialization.py", line 181, in setup_java
raise Fail(format("Unable to access {java_exec}. Confirm you have copied jdk to this host."))
resource_management.core.exceptions.Fail: Unable to access /usr/lib/jvm/java-1.7.0-openjdk-1.7.0.111.x86_64/bin/java. Confirm you have copied jdk to this host.
Error: Error: Unable to run the custom hook script ['/usr/bin/python', '/var/lib/ambari-agent/cache/stacks/HDP/2.0.6/hooks/before-ANY/scripts/hook.py', 'ANY', '/var/lib/ambari-agent/data/command-2222.json', '/var/lib/ambari-agent/cache/stacks/HDP/2.0.6/hooks/before-ANY', '/var/lib/ambari-agent/data/structured-out-2222.json', 'INFO', '/var/lib/ambari-agent/tmp']
11-10-2016
06:24 AM
Found the solution online: put the following in the code to find the classpath, then copy the "xml" and "png" files to that location.
System.out.println(System.getProperty("java.class.path"));
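Building on that one-liner, a slightly fuller sketch (class name hypothetical) that prints the classpath and also checks whether the cascade file is visible as a classpath resource; with getResource(), a leading "/" means relative to the classpath root, not the filesystem root:

```java
import java.net.URL;

// Print the effective classpath, then check whether the cascade file can be
// found on it. A null result means the file was not copied to the classpath.
public class ClasspathCheck {
    public static void main(String[] args) {
        System.out.println(System.getProperty("java.class.path"));
        URL cascade = ClasspathCheck.class.getResource("/lbpcascade_frontalface.xml");
        System.out.println("lbpcascade_frontalface.xml -> " + cascade);
    }
}
```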
11-10-2016
04:50 AM
I am following the popular [DetectFaceDemo](http://docs.opencv.org/2.4/doc/tutorials/introduction/desktop_java/java_dev_intro.html) tutorial but I can't seem to run the packaged jar. I have placed the following files in their respective directories as shown below:
[root@hadoop1 java]# pwd
/root/openCV/opencv/samples/java/sbt/src/main/java
[root@hadoop1 java]#
[root@hadoop1 java]# ls
build.sbt DetectFaceDemo.java.orig HelloOpenCV.java lib project target
[root@hadoop1 java]#
[root@hadoop1 java]# ls lib
libopencv_java249.so opencv-249.jar
[root@hadoop1 java]#
[root@hadoop1 java]# cd ..
[root@hadoop1 main]# pwd
/root/openCV/opencv/samples/java/sbt/src/main
[root@hadoop1 main]# ls
java origscala resources
[root@hadoop1 main]# ls resources
AverageMaleFace.jpg img1.png img2.png lbpcascade_frontalface.xml lena.png
[root@hadoop1 main]#
I am getting an error when I run the jar. I have a feeling the code is not picking up the image and the xml file from the src/main/resources folder, even though I have provided the absolute path to the resources folder.
[root@hadoop1 java]# sbt run
[info] Set current project to DetectFaceDemo (in build file:/root/openCV/opencv/samples/java/sbt/src/main/java/)
[info] Running HelloOpenCV
[info] Hello, OpenCV
[info]
[info] Running DetectFaceDemo
[error] Exception in thread "main" java.lang.NullPointerException
[error] at DetectFaceDemo.run(HelloOpenCV.java:20)
[error] at HelloOpenCV.main(HelloOpenCV.java:48)
java.lang.RuntimeException: Nonzero exit code returned from runner: 1
at scala.sys.package$.error(package.scala:27)
[trace] Stack trace suppressed: run last compile:run for the full output.
[error] (compile:run) Nonzero exit code returned from runner: 1
[error] Total time: 1 s, completed Nov 9, 2016 11:20:34 PM
The source code is below:
[root@hadoop1 java]# more HelloOpenCV.java
import org.opencv.core.Core;
import org.opencv.core.Mat;
import org.opencv.core.MatOfRect;
import org.opencv.core.Point;
import org.opencv.core.Rect;
import org.opencv.core.Scalar;
import org.opencv.highgui.Highgui;
import org.opencv.objdetect.CascadeClassifier;
//
// Detects faces in an image, draws boxes around them, and writes the results
// to "faceDetection.png".
//
class DetectFaceDemo {
    public void run() {
        System.out.println("\nRunning DetectFaceDemo");
        // Create a face detector from the cascade file in the resources
        // directory.
        CascadeClassifier faceDetector = new CascadeClassifier(getClass().getResource("/root/openCV/opencv/samples/java/sbt/src/main/resources/lbpcascade_frontalface.xml").getPath());
        Mat image = Highgui.imread(getClass().getResource("/root/openCV/opencv/samples/java/sbt/src/main/resources/lena.png").getPath());
        // Detect faces in the image.
        // MatOfRect is a special container class for Rect.
        MatOfRect faceDetections = new MatOfRect();
        faceDetector.detectMultiScale(image, faceDetections);
        System.out.println(String.format("Detected %s faces", faceDetections.toArray().length));
        // Draw a bounding box around each face.
        for (Rect rect : faceDetections.toArray()) {
            Core.rectangle(image, new Point(rect.x, rect.y), new Point(rect.x + rect.width, rect.y + rect.height), new Scalar(0, 255, 0));
        }
        // Save the visualized detection.
        String filename = "faceDetection.png";
        System.out.println(String.format("Writing %s", filename));
        Highgui.imwrite(filename, image);
    }
}

public class HelloOpenCV {
    public static void main(String[] args) {
        System.out.println("Hello, OpenCV");
        // Load the native library.
        System.loadLibrary(Core.NATIVE_LIBRARY_NAME);
        new DetectFaceDemo().run();
    }
}
[root@hadoop1 java]#
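For what it's worth, the NullPointerException at HelloOpenCV.java:20 is consistent with getResource() returning null: it expects a classpath-relative name, not an absolute filesystem path, so the .getPath() call blows up. A minimal sketch of the likely fix (paths taken from the listing above; class name hypothetical) that passes the filesystem paths straight to OpenCV:

```java
import org.opencv.core.Core;
import org.opencv.core.Mat;
import org.opencv.highgui.Highgui;
import org.opencv.objdetect.CascadeClassifier;

// Load the cascade and the image by filesystem path instead of via
// getClass().getResource(), which only resolves classpath-relative names.
public class DetectFaceFixed {
    public static void main(String[] args) {
        System.loadLibrary(Core.NATIVE_LIBRARY_NAME);
        String dir = "/root/openCV/opencv/samples/java/sbt/src/main/resources/";
        CascadeClassifier faceDetector =
            new CascadeClassifier(dir + "lbpcascade_frontalface.xml");
        Mat image = Highgui.imread(dir + "lena.png");
        // If either line prints a failure, the path is wrong, not the detection code.
        System.out.println("cascade loaded=" + !faceDetector.empty()
            + ", image loaded=" + !image.empty());
    }
}
```

Alternatively, keep getResource() but use a classpath-relative name such as "/lbpcascade_frontalface.xml", assuming sbt copies src/main/resources onto the runtime classpath.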