Member since: 09-22-2020
Posts: 9
Kudos Received: 1
Solutions: 1
My Accepted Solutions
Title | Views | Posted |
---|---|---|
(title not captured) | 1698 | 09-24-2020 08:34 AM |
01-20-2021 07:28 AM
I am confused about whether HDP is free or paid. Cloudera doesn't let me download HDP unless I am subscribed to it, but I can still install HDP services via Ambari. What is the difference between downloading and installing HDP from Cloudera and installing HDP services via Ambari?
10-11-2020 11:11 PM
@balajip Hi, thank you for your answer. For the student table it works, but for the table I create with a join operation I get the same error. Here is the query I used:
CREATE TABLE testjoin as SELECT * FROM table1 t1 FULL JOIN table2 t2 ON t1.bu_id = t2.bu_id;
The table is created, but when I run a SELECT on testjoin I get the same error:
Error: Error while compiling statement: FAILED: RuntimeException java.lang.NullPointerException (state=42000,code=40000)
What could be the reason? P.S. I have had this issue for the past three days; before that it was working. I also reinstalled Hive, but it didn't help.
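For anyone trying to reproduce this, here are the statements from the reply above gathered into one block (table1, table2, and bu_id are the names as given in the post):

```sql
-- create the joined table, then query it; statements as given in the reply
CREATE TABLE testjoin AS
SELECT *
FROM table1 t1
FULL JOIN table2 t2
  ON t1.bu_id = t2.bu_id;

-- this is the statement that fails with the RuntimeException/NullPointerException
SELECT * FROM testjoin;
```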
10-11-2020 06:47 PM
I created a Hive table using the following query:
create table students (name varchar(64), age int, gpa decimal(3, 2));
and inserted data using this query:
insert into table students values ('fred flinstone', 445, 1.34);
Neither statement shows any error, but when I view the data with select * from students I get the following error:
Error: Error while compiling statement: FAILED: NullPointerException null (state=42000,code=40000)
I get the same error when I join tables in Hive. Any help? Thank you.
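The statements from this post, collected into one runnable block for anyone trying to reproduce the error (names and values exactly as posted):

```sql
-- table definition and sample row as given in the post
CREATE TABLE students (name VARCHAR(64), age INT, gpa DECIMAL(3, 2));
INSERT INTO TABLE students VALUES ('fred flinstone', 445, 1.34);

-- this is the query that returns the NullPointerException
SELECT * FROM students;
```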
Labels:
- Apache Hive
09-24-2020 08:34 AM
1 Kudo
I solved my problem. In my case, one of the tables' names started with an underscore ("_"), because of which two single quotes were automatically added to the path of the HDFS directory where the copy of the file was stored. I renamed the table to remove the underscore, and now I can import the table into the Hive database. I think special characters like that are not parsed easily by Hive or HDFS.
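A sketch of that workaround, assuming the rename is done in the MariaDB source database before re-running the Sqoop import; the new table name here is hypothetical, chosen only to drop the leading underscore:

```sql
-- hypothetical rename in the source database; any target name works as long
-- as it no longer starts with "_"
RENAME TABLE `_asdamkindo_personal_ska_pendidikan` TO `asdamkindo_personal_ska_pendidikan`;
```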
09-24-2020 08:02 AM
Since I am executing Sqoop as the hdfs user (hdfs@master>), it should have access to all the directories inside the warehouse, right? If I run the command without specifying --warehouse-dir, I still get the same error. I also disabled permissions in hdfs-site.xml by changing hdfs.permission.enabled from true to false.
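As a side note, the stock Hadoop key for permission checking is dfs.permissions.enabled in hdfs-site.xml rather than hdfs.permission.enabled. A quick way to confirm what the hdfs user can actually see, using the path from this thread, might be:

```shell
# check that the warehouse path exists, who owns it, and its permission bits
hdfs dfs -ls /warehouse
hdfs dfs -ls /warehouse/siki
```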
09-24-2020 02:49 AM
I am running this command as the hdfs user (hdfs@master>):
sqoop import-all-tables --connect jdbc:mysql://10.11.11.15:6306/siki_asmet?serverTimezone=Asia/Jakarta --username micronics -P --hive-import --warehouse-dir /warehouse/siki --hive-database siki_ods --exclude-tables "Sheet1$" --m 1;
When I run it, I get the following error:
FAILED: SemanticException Line 1:17 Invalid path ''hdfs://master.lpjk.com:8020/warehouse/siki/_asdamkindo_personal_ska_pendidikan'': No files matching path hdfs://master.lpjk.com:8020/warehouse/siki/_asdamkindo_personal_ska_pendidikan (state=42000,code=40000)
But when I try to run the command again, it says the file already exists.
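A possible cleanup-and-retry sketch, assuming the "file already exists" message on the second run comes from partial output left behind by the failed import; paths and options are taken from the command above, and the extra --exclude-tables entry is the table whose name starts with an underscore:

```shell
# remove the partial output left by the failed import (destructive; only the
# path the error message complained about)
hdfs dfs -rm -r /warehouse/siki/_asdamkindo_personal_ska_pendidikan

# retry the import, skipping the problematic table as well as Sheet1$
sqoop import-all-tables \
  --connect "jdbc:mysql://10.11.11.15:6306/siki_asmet?serverTimezone=Asia/Jakarta" \
  --username micronics -P \
  --hive-import \
  --warehouse-dir /warehouse/siki \
  --hive-database siki_ods \
  --exclude-tables "Sheet1$,_asdamkindo_personal_ska_pendidikan" \
  --m 1
```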
Labels:
- Apache Hive
- Apache Sqoop
- HDFS
09-23-2020 12:26 AM
I used this command to list the tables in a MariaDB database:
sqoop list-tables --connect jdbc:mysql://10.11.11.15:6306/siki_asmet --username micronics -P
and I get this error:
20/09/23 14:41:02 ERROR manager.CatalogQueryManager: Failed to list tables java.sql.SQLException: The server time zone value 'WIB' is unrecognized or represents more than one time zone. You must configure either the server or JDBC driver (via the 'serverTimezone' configuration property) to use a more specific time zone value if you want to utilize time zone support.
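The usual fix, and the one visible in the 09-24-2020 import command above, is to pin serverTimezone in the JDBC URL so the MySQL Connector/J driver does not have to interpret the server's 'WIB' abbreviation; a minimal sketch with the same connection details:

```shell
# quote the URL so the shell does not treat the '?' as a glob character
sqoop list-tables \
  --connect "jdbc:mysql://10.11.11.15:6306/siki_asmet?serverTimezone=Asia/Jakarta" \
  --username micronics -P
```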
Labels:
- Apache Hive
- Apache Sqoop