Member since: 09-03-2022
Posts: 6
Kudos Received: 0
Solutions: 2
My Accepted Solutions
Title | Views | Posted |
---|---|---|
 | 1556 | 11-03-2022 08:18 AM |
 | 1067 | 09-03-2022 12:05 PM |
11-03-2022 08:18 AM
I learned that I needed to enable the admin (root) account. The maria_dev and raj_ops accounts provided with the sandbox apparently do not have the privileges to access or edit those files. I logged into the CLI as root with the default password "hadoop", completed the mandatory password change, and the account was enabled. With root enabled I was able to access and edit the files over a WinSCP connection. I didn't retry Ambari, though I'm fairly sure that option would have worked as well; when I initially browsed the directory in Ambari, the etc folder and many others were not showing at all, as if hidden, which I think was due to the limited privileges of the other two accounts.
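For anyone following along, here is a rough sketch of the sequence I used. Port 2222 is the usual SSH port mapping for the HDP sandbox on a local VM, so adjust if your setup differs:

ssh root@127.0.0.1 -p 2222            # first login uses the default password "hadoop"
# the sandbox then forces a password change for root
vi /etc/hadoop/conf/core-site.xml     # afterwards the config files can be edited

After that, a WinSCP connection as root could save changes to the same files.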
10-30-2022 11:53 AM
Hello, as part of a class I am in, we have tasks to edit the Hadoop configuration files to include or change certain properties. I have found numerous articles detailing the importance of these properties, but none explaining HOW to change them, either through the CLI or the web interface. We are using the Hortonworks HDP Sandbox in a virtual machine.

For instance, in one of our tasks we are to configure wire encryption. I understand that I need to edit the core-site.xml configuration file to include the property hadoop.rpc.protection=privacy, but I do not understand how to do so. In the CLI, the furthest I can get is changing directory to /etc/hadoop/conf, where the file is located, and viewing the file with the vi command. How can I add a property to the file, or replace the value if it is an existing property? I have searched Google repeatedly for a guide but have had no luck.

I also tried going through Ambari -> Files to find core-site.xml, but I cannot find it. Perhaps the maria_dev profile we are given does not have permission to view that file, or it cannot be accessed through Ambari. I also tried a WinSCP connection to access and edit the file directly from my computer, but when I tried to save the file I got a "Permission denied" message in response.

Very sorry if this is a basic question, and I greatly appreciate your help!
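For reference, I believe the end result should look something like this inside the configuration element of core-site.xml, based on the standard Hadoop *-site.xml property format (a sketch, not verified):

<configuration>
  <!-- ... existing properties ... -->
  <property>
    <name>hadoop.rpc.protection</name>
    <value>privacy</value>
  </property>
</configuration>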
Labels:
- Hortonworks Data Platform (HDP)
09-19-2022 08:27 PM
See the PDF document within the Google Drive folder I linked to.
09-19-2022 07:06 PM
Hello all,
I am a student at Purdue Global University. For several classes we have been instructed to use Oracle VM VirtualBox Manager with an installation of Cloudera-Quickstart-VM-5.13.0-0-Virtualbox. Last semester this installation worked fine throughout my class. This semester another class requires it as well, but when I went to boot the original installation I was surprised to find it won't start: I am running into what I think is a kernel panic on boot. I have tried removing the Cloudera instance and reinstalling it using the PDF guide provided to us by the school, but the kernel boot error remained after the reinstall. I have also tried shutting down and restarting my computer, as well as changing the VM's chipset setting from PIIX3 to ICH9, as suggested in another thread. Linked below are a video screen capture of what I see on boot, the logs, and the PDF we use for installation.
<Link Hidden>
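In case it helps anyone reproduce the chipset change, it can also be made from the host command line while the VM is powered off; the VM name below is just a placeholder for whatever the instance is called in your VirtualBox Manager:

VBoxManage modifyvm "cloudera-quickstart-vm-5.13.0-0-virtualbox" --chipset ich9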
Labels:
- Quickstart VM
09-03-2022 12:05 PM
Apologies all, I think I figured it out. I had been pasting the script directly as copied from the PDF, which didn't work. I then tried typing it in manually without the -- in front of each option, but apparently those double hyphens are required. Once I entered each line separately with the --'s included, the script ran and Sqoop started the import.
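For anyone hitting the same errors, the working form of the command looks like this (note the leading double hyphens on every long option; the connection values are the ones from the tutorial):

sqoop import-all-tables \
  -m 1 \
  --connect jdbc:mysql://quickstart:3306/retail_db \
  --username=retail_dba \
  --password=cloudera \
  --compression-codec=snappy \
  --as-parquetfile \
  --warehouse-dir=/user/hive/warehouse \
  --hive-import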
09-03-2022 11:39 AM
Hello all,
I am a student taking a course in SQL. The final module of this course has us installing Oracle VM VirtualBox and Cloudera to utilize Hadoop.
I went through a detailed instruction manual for the VirtualBox installation, including an extension pack install and a guest account setup within the VirtualBox environment. I ran into no problems with this installation process (guide linked below for reference).
Installation_ClouderaQuickstartVirtualMachine
Afterwards, we were to follow a Cloudera basic tutorial guide to get started (linked below).
Cloudera Quickstart Beginners Tutorial
However, when attempting to enter the first script into the terminal as provided, I began running into errors. The script and the errors received are shown below.
[cloudera@quickstart ~]$ sqoop import-all-tables \
> m 1 \
> connect jdbc:mysql://quickstart:3306/retail_db \
> username=retail_dba \
> password=cloudera \
> compression-codec=snappy \
> as-parquetfile \
> warehouse-dir=/user/hive/warehouse \
> hive-import
Initial errors received:
Warning: /usr/lib/sqoop/../accumulo does not exist! Accumulo imports will fail.
Please set $ACCUMULO_HOME to the root of your Accumulo installation.
22/09/02 17:02:00 INFO sqoop.Sqoop: Running Sqoop version: 1.4.6-cdh5.13.0
22/09/02 17:02:00 ERROR tool.BaseSqoopTool: Error parsing arguments for import-all-tables:
22/09/02 17:02:00 ERROR tool.BaseSqoopTool: Unrecognized argument:
22/09/02 17:02:00 ERROR tool.BaseSqoopTool: Unrecognized argument: -m
22/09/02 17:02:00 ERROR tool.BaseSqoopTool: Unrecognized argument: 1
22/09/02 17:02:00 ERROR tool.BaseSqoopTool: Unrecognized argument:
22/09/02 17:02:00 ERROR tool.BaseSqoopTool: Unrecognized argument: --connect
22/09/02 17:02:00 ERROR tool.BaseSqoopTool: Unrecognized argument: jdbc:mysql://quickstart:3306/retail_db
22/09/02 17:02:00 ERROR tool.BaseSqoopTool: Unrecognized argument:
22/09/02 17:02:00 ERROR tool.BaseSqoopTool: Unrecognized argument: --username
22/09/02 17:02:00 ERROR tool.BaseSqoopTool: Unrecognized argument: retail_dba
22/09/02 17:02:00 ERROR tool.BaseSqoopTool: Unrecognized argument:
22/09/02 17:02:00 ERROR tool.BaseSqoopTool: Unrecognized argument: --password
22/09/02 17:02:00 ERROR tool.BaseSqoopTool: Unrecognized argument: cloudera
22/09/02 17:02:00 ERROR tool.BaseSqoopTool: Unrecognized argument:
22/09/02 17:02:00 ERROR tool.BaseSqoopTool: Unrecognized argument: --compression-codec
22/09/02 17:02:00 ERROR tool.BaseSqoopTool: Unrecognized argument: snappy
22/09/02 17:02:00 ERROR tool.BaseSqoopTool: Unrecognized argument:
22/09/02 17:02:00 ERROR tool.BaseSqoopTool: Unrecognized argument: --as-parquetfile
22/09/02 17:02:00 ERROR tool.BaseSqoopTool: Unrecognized argument:
22/09/02 17:02:00 ERROR tool.BaseSqoopTool: Unrecognized argument: --warehouse-dir
22/09/02 17:02:00 ERROR tool.BaseSqoopTool: Unrecognized argument: /user/hive/warehouse
22/09/02 17:02:00 ERROR tool.BaseSqoopTool: Unrecognized argument:
22/09/02 17:02:00 ERROR tool.BaseSqoopTool: Unrecognized argument: --hive-import
I found a post on this forum indicating that I need to set the Accumulo home directory for the script to use. I ran the following commands as directed:
sudo mkdir /var/lib/accumulo          # create a directory for Accumulo to use
ACCUMULO_HOME='/var/lib/accumulo'     # point the variable at it
export ACCUMULO_HOME                  # export it to the current shell session
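(Note that the export only applies to the current shell session; I believe adding the export line to ~/.bash_profile would make it persist across logins.)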
Running these commands appeared to clear the initial Accumulo warning, but I am still receiving the argument-parsing errors shown below.
[cloudera@quickstart ~]$ sqoop import-all-tables \
> m 1 \
> connect jdbc:mysql://quickstart:3306/retail_db \
> username=retail_dba \
> password=cloudera \
> compression-codec=snappy \
> as-parquetfile \
> warehouse-dir=/user/hive/warehouse \
> hive-import
22/09/03 09:54:35 INFO sqoop.Sqoop: Running Sqoop version: 1.4.6-cdh5.13.0
22/09/03 09:54:35 ERROR tool.BaseSqoopTool: Error parsing arguments for import-all-tables:
22/09/03 09:54:35 ERROR tool.BaseSqoopTool: Unrecognized argument: m
22/09/03 09:54:35 ERROR tool.BaseSqoopTool: Unrecognized argument: 1
22/09/03 09:54:35 ERROR tool.BaseSqoopTool: Unrecognized argument: connect
22/09/03 09:54:35 ERROR tool.BaseSqoopTool: Unrecognized argument: jdbc:mysql://quickstart:3306/retail_db
22/09/03 09:54:35 ERROR tool.BaseSqoopTool: Unrecognized argument: username=retail_dba
22/09/03 09:54:35 ERROR tool.BaseSqoopTool: Unrecognized argument: password=cloudera
22/09/03 09:54:35 ERROR tool.BaseSqoopTool: Unrecognized argument: compression-codec=snappy
22/09/03 09:54:35 ERROR tool.BaseSqoopTool: Unrecognized argument: as-parquetfile
22/09/03 09:54:35 ERROR tool.BaseSqoopTool: Unrecognized argument: warehouse-dir=/user/hive/warehouse
22/09/03 09:54:35 ERROR tool.BaseSqoopTool: Unrecognized argument: hive-import
Try --help for usage instructions.
I tried reaching out to my professor about these issues, but his only response was to follow the guide and the scripts provided... Hoping someone here can help!
Best regards!
Labels:
- Apache Sqoop
- Training