Member since: 04-18-2018
Posts: 23
Kudos Received: 0
Solutions: 0
05-21-2018
12:50 AM
Hi Edgar, true, it's an Ubuntu VM in Azure. The problem I am trying to solve is that the sandbox script stops right after starting the Ambari server. Basically, I am just trying to get to HDFS, Kafka, and Jupyter Notebooks so I can do some data processing with PySpark. I used to start the sandbox and then use those apps/tools from there. Plus, I could use the Ambari server in the sandbox. If there is a way to run these without starting the sandbox, that would be great; however, I am not sure how to do that. I am kind of new to this. How would I do an upgrade from 2.6.1 to 2.6.4? I will search on how to do this, but if you have something, that would be great too.
05-11-2018
02:10 AM
Hello, I have a fresh Ubuntu VM and I would like to install Hortonworks HDP 2.6.4. You must get this a lot, but I am a student and would like to try out the Docker sandbox, Kafka, Hadoop, and all of the applications and tools that come with the bundle. Can you help me get started with where to download and how to install HDP 2.6.4?
Labels:
- Hortonworks Data Platform (HDP)
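As a starting point, the sandbox is usually deployed as a Docker image on a fresh VM. A minimal sketch of the first steps on Ubuntu follows; the image name and tag are assumptions to verify against the official sandbox deployment guide before use.

```shell
# Assumed image name/tag - confirm against the Hortonworks (Cloudera) sandbox
# deployment guide before pulling; the image is many gigabytes.
IMAGE="hortonworks/sandbox-hdp:2.6.4"

if command -v docker >/dev/null 2>&1; then
  # Docker is present; the sandbox image would be fetched like this:
  echo "Docker found; pull the sandbox image with: docker pull $IMAGE"
else
  # Docker still needs to be installed first.
  echo "Docker is not installed; on Ubuntu try: sudo apt-get install docker.io"
fi
```

After the image is pulled, the bundled start_sandbox-hdp.sh script (or the vendor's deploy script) creates and starts the container.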
05-10-2018
04:40 AM
Hi @Aziz Herch, can you help me out? Thanks, Paul
05-10-2018
02:55 AM
Thanks, I will give this a try!
05-09-2018
05:19 AM
Hello, Can you install NiFi on Hortonworks HDP 2.6.1? If so, can you direct me to installation steps? Thanks, Paul
05-09-2018
03:21 AM
Hello, I know there are several posts about this problem, but I could not figure out which one best solves mine. Here is what I have. After running start_sandbox-hdp.sh, it stops at: "Ambari server 'start' completed successfully". I believe my Docker has been updated. I see some posts saying to modify the sandbox startup script, but I am not sure what to change. I have Hortonworks HDP 2.6.1 installed. Any help would be much appreciated.

Here is other information about my environment.

OS:
Distributor ID: Ubuntu
Description: Ubuntu 16.04.4 LTS
Release: 16.04
Codename: xenial

Could it be because of low disk space? 96% of the space on /dev/sda1 (and the overlay root filesystem) is used. Would this cause the sandbox to not start?

Filesystem 1K-blocks Used Available Use% Mounted on
overlay 64989928 61996332 2977212 96% /
tmpfs 65536 0 65536 0% /dev
tmpfs 16459536 0 16459536 0% /sys/fs/cgroup
/dev/sdc1 67075056 1651732 65423324 3% /data
/dev/sda1 64989928 61996332 2977212 96% /etc/resolv.conf
/dev/sda1 64989928 61996332 2977212 96% /etc/hostname
/dev/sda1 64989928 61996332 2977212 96% /etc/hosts
shm 65536 0 65536 0% /dev/shm

The start_sandbox-hdp.sh file:

#!/bin/bash
echo "Waiting for docker daemon to start up:"
until docker ps 2>&1| grep STATUS>/dev/null; do sleep 1; done; >/dev/null
docker ps -a | grep sandbox-hdp
if [ $? -eq 0 ]; then
docker start sandbox-hdp
else
docker run --name sandbox-hdp --hostname "sandbox.hortonworks.com" --privileged -v /data:/data -d \
-p 1111:111 \
-p 1000:1000 \
-p 1100:1100 \
-p 1220:1220 \
-p 1988:1988 \
-p 2049:2049 \
-p 2100:2100 \
-p 2181:2181 \
-p 3000:3000 \
-p 4040-4050:4040-4050 \
-p 9999:9999 \
-p 4200:4200 \
-p 4242:4242 \
-p 5007:5007 \
-p 5011:5011 \
-p 6001:6001 \
-p 6003:6003 \
-p 6008:6008 \
-p 6080:6080 \
-p 6188:6188 \
-p 8000:8000 \
-p 8005:8005 \
-p 8020:8020 \
-p 8032:8032 \
-p 8040:8040 \
-p 8042:8042 \
-p 8080:8080 \
-p 8082:8082 \
-p 8086:8086 \
-p 8088:8088 \
-p 8090:8090 \
-p 8091:8091 \
-p 8188:8188 \
-p 8443:8443 \
-p 8744:8744 \
-p 8765:8765 \
-p 8886:8886 \
-p 8888:8888 \
-p 8889:8889 \
-p 8983:8983 \
-p 8993:8993 \
-p 9000:9000 \
-p 9995:9995 \
-p 9996:9996 \
-p 10000:10000 \
-p 10001:10001 \
-p 10015:10015 \
-p 10016:10016 \
bash
fi
echo "Waiting for docker daemon to start up:"

In the ambari-agent.out file, this is repeated:

WARNING: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
javax.jdo.option.ConnectionPassword has been successfully created.
org.apache.hadoop.security.alias.JavaKeyStoreProvider has been updated.
2018-04-14 02:07:31,850 - Testing the JVM's JCE policy to see it if supports an unlimited key length.
SLF4J: Failed to load class "org.slf4j.impl.StaticLoggerBinder".
SLF4J: Defaulting to no-operation (NOP) logger implementation
SLF4J: See http://www.slf4j.org/codes.html#StaticLoggerBinder for further details.
Apr 14, 2018 2:08:44 AM org.apache.hadoop.util.NativeCodeLoader <clinit>
WARNING: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
oozie.service.JPAService.jdbc.password has been successfully created.
org.apache.hadoop.security.alias.JavaKeyStoreProvider has been updated.
2018-04-14 02:09:33,283 - Testing the JVM's JCE policy to see it if supports an unlimited key length.
2018-04-14 02:10:34,443 - Testing the JVM's JCE policy to see it if supports an unlimited key length.
2018-04-14 02:11:35,382 - Testing the JVM's JCE policy to see it if supports an unlimited key length.
2018-04-14 02:12:35,604 - Testing the JVM's JCE policy to see it if supports an unlimited key length.
2018-04-14 02:13:35,659 - Testing the JVM's JCE policy to see it if supports an unlimited key length.
2018-04-14 02:14:35,635 - Testing the JVM's JCE policy to see it if supports an unlimited key length.
"ambari-agent.out" 112L, 10714C
... View more
Labels:
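Since the df output above shows the root filesystem at 96%, a quick check like the following can confirm whether disk pressure is the issue before digging into the startup script. The 90% threshold and the prune suggestion are assumptions, not a confirmed fix for this hang.

```shell
# Read the Use% column for the root filesystem (POSIX output format
# avoids line wrapping for long filesystem names).
USAGE=$(df -P / | awk 'NR==2 {gsub("%","",$5); print $5}')

# 90% is an arbitrary warning threshold for illustration.
if [ "$USAGE" -gt 90 ]; then
  echo "Root filesystem is ${USAGE}% full; the sandbox may fail to start."
  echo "Consider reclaiming space, e.g. with: docker system prune"
else
  echo "Root filesystem at ${USAGE}% - disk space is probably not the problem."
fi
```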
05-02-2018
02:23 AM
Hi @Shu, does NiFi support writing files to C:\ on Win 10 when using Chrome and the PutFile processor? Is there a bug in NiFi? I am running out of options for getting this working. Thanks for your help.
05-01-2018
10:38 PM
Thanks for your reply. I am new to Spark on CentOS. In which file do I make this change?
05-01-2018
05:23 AM
Hello, I am trying to install, configure, and run Jupyter Notebook on the Hortonworks Docker Sandbox HDP with CentOS 7. I followed these steps: https://community.hortonworks.com/articles/39128/tutorial-installconfigure-ipython-and-createrun-py.html

When I run /start_ipython_notebook.sh I get the following error:

SPARK_MAJOR_VERSION is set to 2, using Spark2
Error in pyspark startup:
IPYTHON and IPYTHON_OPTS are removed in Spark 2.0+. Remove these from the environment and set PYSPARK_DRIVER_PYTHON and PYSPARK_DRIVER_PYTHON_OPTS instead.

I am not sure what this means or what to do. Where in CentOS do I go to set environment variables like the ones mentioned in the error? Thanks, Paul
Labels:
- Hortonworks Data Platform (HDP)
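The error message itself names the replacement variables. A minimal sketch of what could go into ~/.bashrc (or into start_ipython_notebook.sh in place of the old IPYTHON lines) follows; using "jupyter" as the driver and port 8889 are assumptions for a typical notebook setup, not values from the tutorial.

```shell
# Spark 2.x replacement for the removed IPYTHON / IPYTHON_OPTS variables.
# "jupyter" and the port are illustrative choices - adjust to your setup.
export PYSPARK_DRIVER_PYTHON=jupyter
export PYSPARK_DRIVER_PYTHON_OPTS="notebook --no-browser --port=8889"

# Make sure the removed Spark 1.x variables are not still set.
unset IPYTHON IPYTHON_OPTS
```

After sourcing these (e.g. `source ~/.bashrc`), running `pyspark` should launch the notebook instead of failing.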
05-01-2018
04:02 AM
Thanks. I found the kafka directory.
05-01-2018
03:36 AM
I turned on DEBUG, checked the bulletin board, and I get this:

3:27:42 UTC
INFO
1967bc4d-0163-1000-1af5-b40c46be92b6
PutFile[id=1967bc4d-0163-1000-1af5-b40c46be92b6] Produced copy of StandardFlowFileRecord[uuid=58d18018-a864-4c12-a98d-498bbf8d19d1,claim=StandardContentClaim [resourceClaim=StandardResourceClaim[id=1525145201278-982, container=default, section=982], offset=336986, length=168493],offset=0,name=3706261132832753,size=168493] at location c:/test/3706261132832753

I made sure the account has Full Control on c:\test. Still, there are no files being written to c:\test. I even switched the / to \ in the path.
05-01-2018
01:13 AM
@Shu, do you have other ideas on how to get the PutFile processor to write results to a log file? I am still not seeing files written to any path. I don't have access to the web server, and NiFi is not installed on my Win 10 box; I am using NiFi via Chrome.
04-29-2018
11:52 PM
Where would nifi-app.log be located? I am on a Win 10 desktop (client) using NiFi with Chrome. I don't have access to the server. Where would these logs be on my desktop?
04-29-2018
11:41 PM
Interesting, the success route now has 9 in the queue and 1.45 MB. Nothing is going down the failure route. Is there other configuration needed for the funnel?
04-29-2018
11:28 PM
Hello, I have the Kafka Broker started on my Oracle VM VirtualBox. So it's there, I think. On another VM (an Ubuntu server with Hortonworks Sandbox HDP) I see Kafka was installed under /usr/hdp/current/kafka-broker/. However, I don't see this on the Hortonworks Docker Sandbox HDP for Oracle VM VirtualBox. On that other VM, I used to go to a bin directory and run commands like:

bin/kafka-server-start.sh conf/server.properties
bin/kafka-topics.sh
bin/kafka-console-consumer.sh
bin/kafka-console-producer.sh

These created topics, a consumer, and a producer so Kafka would be streaming data. Where on the Hortonworks Docker Sandbox HDP for Oracle VM VirtualBox are these folders and commands for Kafka? I am just looking for more information about how to use Kafka on the Hortonworks Docker Sandbox HDP.
Labels:
- Apache Kafka
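If the broker is baked into the Docker sandbox image, the CLI tools should live at the same path as on the Ubuntu sandbox, but inside the container. A hedged sketch of the usual workflow follows; the path check guards against running it on the wrong host, and the broker port 6667 (the HDP default) and topic name are assumptions to verify.

```shell
# Path as seen on the Ubuntu sandbox; on the Docker sandbox, first enter
# the container with: docker exec -it sandbox-hdp bash
KAFKA_HOME=/usr/hdp/current/kafka-broker
TOPIC=test

if [ -d "$KAFKA_HOME/bin" ]; then
  # Create a topic (ZooKeeper address and port 6667 are assumed defaults).
  "$KAFKA_HOME/bin/kafka-topics.sh" --create --zookeeper localhost:2181 \
      --replication-factor 1 --partitions 1 --topic "$TOPIC"
  # Publish one test message to the new topic.
  echo "hello" | "$KAFKA_HOME/bin/kafka-console-producer.sh" \
      --broker-list localhost:6667 --topic "$TOPIC"
else
  echo "Kafka not found at $KAFKA_HOME; run this inside the sandbox container."
fi
```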
04-29-2018
09:55 PM
There is no data written to c:\data. Is it a permission issue? There are 10.1 MB in, but 0 bytes out. Could there be something wrong with the PutFile settings? For example, these are both checked under Automatically Terminate Relationships in the Settings tab:

failure: Files that could not be written to the output directory for some reason are transferred to this relationship
success: Files that have been successfully written to the output directory are transferred to this relationship
04-29-2018
08:56 PM
Hello, I am using NiFi 1.4.0 with Chrome Version 65.0.3325.181 on a Win 10 box. I have a PutFile processor and I am looking to write data to an output file on my local hard drive, like c:\data. The process works and I have c:\data in the Directory field, but I am not getting data written to c:\data. The folder is empty, and I am not sure what else I can do. Can you help? Thanks, Paul
Labels:
- Apache NiFi
04-19-2018
02:19 AM
Thanks, this starts my Ambari server, but when I run sudo ./start_sandbox-hdp.sh it hangs on "Ambari Server 'start' completed successfully." I will create a new post for this issue.
04-19-2018
02:00 AM
Now when I run my sudo ./start_sandbox-hdp.sh it hangs. It gets stuck on "Ambari Server 'start' completed successfully."

Server PID at: /var/run/ambari-server/ambari-server.pid
Server out at: /var/log/ambari-server/ambari-server.out
Server log at: /var/log/ambari-server/ambari-server.log
Waiting for server start.............................................................
Server started listening on 8080
DB configs consistency check: no errors and warnings were found.
Ambari Server 'start' completed successfully.
04-19-2018
01:27 AM
OK, I set it to 120. I did add the line to the file. I don't see the error now, and the Ambari server says it's started.
04-19-2018
12:38 AM
Yes, I am using the HDP sandbox. I logged in with ssh -p 2222 root@localhost, so I can see the ambari.properties file now. However, I don't see this line in the file:

server.startup.web.timeout=120

Should I add it to the file if it is missing?
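Appending the property when it is absent can be done safely from that ssh session. A minimal sketch, assuming the standard ambari.properties location inside the sandbox:

```shell
# Run inside the sandbox (e.g. after: ssh -p 2222 root@localhost).
CONF=/etc/ambari-server/conf/ambari.properties

if [ -f "$CONF" ]; then
  # Append the timeout only if no such property already exists.
  grep -q '^server.startup.web.timeout=' "$CONF" \
    || echo 'server.startup.web.timeout=120' >> "$CONF"
  # Show the resulting setting for confirmation.
  grep 'server.startup.web.timeout' "$CONF"
else
  echo "$CONF not found; are you inside the sandbox container?"
fi
```

An `ambari-server restart` would then be needed for the new timeout to take effect.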
04-18-2018
04:58 AM
Hello, I have a Hadoop environment using Hortonworks Data Platform 2.6.1 (https://hortonworks.com/products/data-platforms/hdp/), so I have Ambari Server installed inside a Docker container on my Linux VM. When I start the Ambari Server, I get:

DB configs consistency check: no errors and warnings were found.
ERROR: Exiting with exit code 1.
REASON: Server not yet listening on http port 8080 after 50 seconds. Exiting.

I have read that I need to change the timeout setting in the ambari.properties file, i.e. edit "/etc/ambari-server/conf/ambari.properties" and increase the following property value to 120 or 150 seconds:

server.startup.web.timeout=120

However, it seems that I have to be inside the container to change the setting, but I can't start the container. I don't see that folder on the host, so I am thinking I need to be inside my container to change the file. How can I look for ambari.properties from inside the container?
Labels:
- Apache Ambari
- Docker
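One way to reach the file from the host is `docker exec`, which works as long as the container process itself is running, even while Ambari inside it is failing to listen. A sketch, assuming the container is named sandbox-hdp (adjust to whatever `docker ps -a` shows):

```shell
# Assumed container name - check 'docker ps -a' for the real one.
CONTAINER=sandbox-hdp

if command -v docker >/dev/null 2>&1 \
   && docker ps --format '{{.Names}}' | grep -q "^${CONTAINER}$"; then
  # Inspect the property from the host without an interactive shell.
  docker exec "$CONTAINER" \
    grep 'server.startup.web.timeout' /etc/ambari-server/conf/ambari.properties
else
  echo "$CONTAINER is not running; try 'docker start $CONTAINER'"
  echo "or log in directly with: ssh -p 2222 root@localhost"
fi
```

For interactive edits, `docker exec -it sandbox-hdp bash` opens a shell inside the container.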