Member since: 11-22-2016
Posts: 83
Kudos Received: 23
Solutions: 13
My Accepted Solutions
Title | Views | Posted |
---|---|---|
 | 2046 | 08-03-2018 08:13 PM |
 | 1824 | 06-02-2018 05:24 PM |
 | 1222 | 05-31-2018 07:54 PM |
 | 1988 | 02-08-2018 12:38 AM |
 | 1421 | 02-07-2018 11:38 PM |
04-04-2024
10:47 PM
1 Kudo
Hello, is there any resolution for this error? I am facing the same issue with my Atlas installation.
11-04-2023
03:48 AM
Hi, I am facing the same issue even after changing the parameter offsets.topic.replication.factor to 1 in the Kafka configuration. Note that I have CDP 7.1.7 and 8 brokers in total. I am able to import using import-hive.sh but not using the hook. Any suggestion will be appreciated. Thanks, Syed.
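For reference, a minimal sketch of the broker setting mentioned above (the server.properties location and the broker restart step vary by distribution; a replication factor of 1 is only sensible for test setups):

```
# Kafka broker configuration (server.properties), as referenced above
offsets.topic.replication.factor=1
```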
02-20-2018
04:06 AM
Introduction

This post enumerates the steps necessary to set up an Atlas development environment using IntelliJ on Mac and Windows. The setup uses BerkeleyDB as the backend and embedded Solr as the index engine. Setups with other backend and index engine variations are similar but involve additional steps.

Prerequisites

These should be present on your machine before you begin:
- Git, for cloning the repository. Git Shell is useful if you are switching between Mac and Windows.
- Maven, for performing command-line builds.
- IntelliJ Community Edition or higher.
- BerkeleyDB, as the backend.

Code Base Setup

Download the code base from its GitHub location. Clone it under c:\work\Apache\atlas on Windows and ~/Apache/atlas on Mac. Change directory to that location and initiate a build using mvn clean install package, as sketched below.
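A minimal sketch of the clone-and-build step (the GitHub URL is the Apache Atlas repository; -DskipTests is an optional assumption to shorten the first build):

```
# Clone the code base (Mac paths shown; use c:\work\Apache\atlas on Windows)
git clone https://github.com/apache/atlas.git ~/Apache/atlas
cd ~/Apache/atlas

# Build from the command line, as described above
mvn clean install package -DskipTests
```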
Deploy Directory Setup

Create a deploy directory, say c:\work\deploy on Windows or ~/Deploy on Mac, with the structure below it:

- conf: Copy atlas-application.properties, users-credentials.properties, policy-store.txt, atlas-log4j.xml, and atlas-env.sh here. Use the contents of the attached ZIP.
- data: During runtime, the backend database will create its files here. This is a useful location to check when troubleshooting.
- data/solr: Copy the contents of c:\work\Apache\atlas\repository\src\test\resources\solr to c:\work\deploy\data\solr on Windows (from ~/Apache/atlas/repository/src/test/resources/solr to ~/Deploy/data/solr on Mac).
- libext: Copy the BerkeleyDB JAR here, say je-5.0.73.jar.
- logs: Logs will be created here.
- models: Copy the contents of c:\work\Apache\atlas\addons\models (or ~/Apache/atlas/addons/models on Mac).
- webapp (optional): Deploy the contents of atlas.war here if you are developing on the client side (UI).
- bin (optional): Empty for now.

When done, your directory layout should look like the sketch below.
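A sketch of the resulting layout (Windows shown; the Mac layout under ~/Deploy is analogous):

```
c:\work\deploy\
    conf\       <- atlas-application.properties, users-credentials.properties, ...
    data\
        solr\
    libext\     <- je-5.0.73.jar
    logs\
    models\
    webapp\     (optional)
    bin\        (optional)
```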
WinUtils (Windows only)

Install WinUtils (link below). Copy WinUtils.exe from C:\Program Files (x86)\WinUtil\WinUtil.exe to C:\Users\ashut\.m2\repository\org\apache\hadoop\bin\WinUtils.exe.

IntelliJ: 'Atlas - Local' Configuration

From IntelliJ's Run/Edit Configurations menu option, create a new configuration and call it 'Atlas - Local'. Details are:
- Type: Application
- Main class: org.apache.atlas.Atlas
- VM options: these should reflect the location of the deploy directory created above. Add the following:
  -Datlas.home=C:\work\deploy\
  -Datlas.conf=C:\work\deploy\conf
  -Datlas.data=C:\work\deploy\data
  -Datlas.log.dir=C:\work\deploy\logs
  -Dembedded.solr.directory=C:\work\deploy\data
  (See screen shot Profile-2.)
- Program arguments: --port 31000
This is needed so that the Atlas instance being run from IntelliJ does not clash with another instance running on the development VM.

- Working directory: Set this to the location of webapp in your code base (in my case, c:\work\Apache\atlas on Windows and ~/Apache/atlas/ on Mac).
- Use classpath of module: atlas-webapp

See the screen shots below.

Debug Run

Within IntelliJ:
- Set the newly created configuration as active.
- From View/Tool Windows/Maven Projects, enable the Maven Projects side pane.
- From Profiles, select Berkeley-elasticsearch, graph-provider-default, and graph-provider-janus.
- Use Run/Debug - 'Atlas - Local' from the menu.
- Check if the server is up by accessing http://localhost:31000/, as sketched below.

Screen Shots

Atlas - Local profile: (screen shots attached)
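A quick way to check the server from a terminal (a sketch: the admin/version endpoint and the admin/admin default credentials from users-credentials.properties are assumptions about this build):

```
# Should return the Atlas version as JSON once the server is up
curl -u admin:admin http://localhost:31000/api/atlas/admin/version
```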
Attachments

- conf-directory.zip: Contents of the configuration directory.

References

- How to install Maven on Windows
- WinUtils download

Credits

Thanks to Apoorv Naik (@anaik) for the investigation, for coming up with the setup steps, and for helping me with the many setups.
11-07-2017
03:20 PM
Thanks for the lead 🙂
10-10-2017
03:01 PM
2 Kudos
@Lou Richard Good to know that the issue is resolved. It would be great if you could mark this HCC thread as answered by clicking the "Accept" button; that way other HCC users can quickly find the solution when they encounter the same issue. As it was a long thread, here is a brief summary for HCC users who might encounter this issue.

Issue: Atlas installation was failing with the following error:

```
File "/usr/hdp/2.6.1.0-129/atlas/bin/atlas_config.py", line 232, in runProcess
  p = subprocess.Popen(commandline, stdout=stdoutFile, stderr=stderrFile, shell=shell)
File "/usr/lib64/python2.7/subprocess.py", line 711, in __init__
  errread, errwrite)
File "/usr/lib64/python2.7/subprocess.py", line 1327, in _execute_child
  raise child_exception
OSError: [Errno 2] No such file or directory
```

Solution: Make sure that JAVA_HOME is set correctly on the host, that it points to a valid JDK (not a JRE), and that it is set properly as a global environment variable.
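A minimal sketch of setting JAVA_HOME globally (the JDK path and the profile.d location are illustrative; adjust them to your installation):

```
# /etc/profile.d/java.sh (illustrative location for a global variable)
export JAVA_HOME=/usr/lib/jvm/java-1.8.0-openjdk   # must be a JDK, not a JRE
export PATH=$JAVA_HOME/bin:$PATH
```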
The atlas.war can then be extracted manually:

```
# cd /usr/hdp/2.6.1.0-129/atlas/server/webapp/
# mv atlas ../
# /usr/lib/jvm/jre-1.8.0-openjdk/bin/jar -xf atlas.war
```

Cause: The "jar" utility comes with the JDK (inside $JAVA_HOME/bin) and is used by the "atlas_config.py" script to extract atlas.war.
06-16-2017
04:18 PM
1 Kudo
These are good questions; I hope I am able to do justice to them with my answers. To elaborate a little more on what @Sarath Subramanian said: Kafka does the work of relaying the notifications from Hive to Atlas. Hive publishes to a topic, and Atlas subscribes to that topic and thus receives the notifications. There has been some discussion on using Atlas for MySQL and Oracle. I have not seen any implementation yet, but it is possible, provided these two products have notification mechanisms. From what I know, they have database change triggers that can be used to call a REST API, push a message onto a queue, or publish to Kafka. For Oracle, this is what I found. Hope this helps.
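As a minimal sketch of watching that relay in action, one can consume the hook topic directly (ATLAS_HOOK is the default Atlas notification topic; the broker address is illustrative):

```
# Messages published by the Hive hook, which Atlas subscribes to
kafka-console-consumer.sh --bootstrap-server localhost:9092 \
  --topic ATLAS_HOOK --from-beginning
```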