Member since: 09-24-2015
178 Posts
113 Kudos Received
28 Solutions
My Accepted Solutions
| Title | Views | Posted |
|---|---|---|
| | 3410 | 05-25-2016 02:39 AM |
| | 3634 | 05-03-2016 01:27 PM |
| | 844 | 04-26-2016 07:59 PM |
| | 14472 | 03-24-2016 04:10 PM |
| | 2083 | 02-02-2016 11:50 PM |
10-06-2015
02:46 AM
AFAIK, the hive-site.xml file can be uploaded to HDFS and used with the same name. Here is a Falcon project I created that does exactly that: https://github.com/sainib/hadoop-data-pipeline The workflow defines the path to hive-site.xml using a param like this - https://github.com/sainib/hadoop-data-pipeline/blob/master/falcon/workflow/workflow.xml#L95 - and the param is defined in the process entity file like this - https://github.com/sainib/hadoop-data-pipeline/blob/master/falcon/process/processData.xml#L31 I am wondering whether Falcon is looking for hive-site.xml where it actually is, or where you think it should be. Can you give it a try with the absolute HDFS path of the file?
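The pattern above can be sketched like this; the property name and HDFS path are illustrative, not copied from the linked repo. In the Falcon process entity you define a property pointing at the HDFS copy of hive-site.xml:

```xml
<!-- processData.xml (sketch): property names/paths are examples only -->
<properties>
    <property name="hiveSiteXml" value="/apps/data-pipeline/config/hive-site.xml"/>
</properties>
```

The workflow then references it as `${hiveSiteXml}`, e.g. in an Oozie Hive action's `<job-xml>${hiveSiteXml}</job-xml>` element, so the action picks up the Hive configuration from HDFS.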
10-06-2015
02:37 AM
This is likely one of the following issues: 1) If you enabled Kerberos manually, an error may have been made along the way; check that the JSVC package is installed. 2) If you used Ambari to enable Kerberos, make sure the DataNodes are being started as root, i.e. make sure Ambari Server is running as root.
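For the manual-Kerberos case, the relevant settings live in hadoop-env.sh; this is a sketch with typical HDP defaults, so verify the JSVC path against your install. A secure DataNode binds privileged ports, so it is started as root via jsvc and then drops to the configured user:

```shell
# hadoop-env.sh (sketch): secure DataNode startup for a Kerberized cluster.
# Started as root via jsvc, then drops privileges to this user:
export HADOOP_SECURE_DN_USER=hdfs
# Location of the jsvc binary; path varies by distro (assumed HDP default here):
export JSVC_HOME=/usr/lib/bigtop-utils
```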
10-05-2015
07:52 PM
I am using the Phoenix "psql" utility to ingest some file data, and it fails with many generic "ERROR 201 (22000): Illegal data" errors without further detail. It is difficult to figure out which column's data or datatype caused the failure. Is there any way to get more details?

```
[root@dev HbaseData]# psql.py -d "|" localhost tgb_counter.csv
......
15/09/17 17:57:55 ERROR util.CSVCommonsLoader: Error upserting record [263, 1442437680, 1442437730, , 1442437703, 9, Moundville, UNKNOWN, Standard Outdoor TGB w/ Argus/Alpha Supply, 20150916, 20150916, 0, 0, 3, 0, 0, 18, 2, 3, 8, 0, 0, 18, 2, 3, 8, 0, 0, 0, 0, 0, 0, 61, 61, 61, 1442437716, 1442437806, ALOHA, 901162486, ALOHA/Double density reduced, ALOHA, 901112486, ALOHA/Normal, Poll Response/Priority, 901137486, Poll Response/Priority, ALOHA, 901187486, ALOHA/Double density reduced, , , , , , , , , , , , ]: java.sql.SQLException: ERROR 201 (22000): Illegal data.
15/09/17 17:57:55 ERROR util.CSVCommonsLoader: Error upserting record [263, 1442437860, 1442437913, , 1442437888, 10, Moundville, UNKNOWN, Standard Outdoor TGB w/ Argus/Alpha Supply, 20150916, 20150916, 5, 0, 1, 0, 0, 20, 6, 21, 11, 4, 0, 20, 6, 21, 11, 0, 0, 0, 0, 0, 0, 61, 61, 61, 1442437995, 1442438125, ALOHA, 901162486, ALOHA/Double density reduced, ALOHA, 901112486, ALOHA/Normal, Poll Response/Priority, 901137486, Poll Response/Priority, ALOHA, 901187486, ALOHA/Double density reduced, , , , , , , , , , , , ]: java.sql.SQLException: ERROR 201 (22000): Illegal data.
```
Labels:
- Apache HBase
- Apache Phoenix
10-05-2015
07:02 PM
A prospect recently evaluated Drill, and while it worked for structured / self-describing formats without creating a schema, their experience was that the data-type resolution slowed performance down. In any case, HWX does not officially support Drill, so the onus will be on the customer to resolve any Drill-related issues when using it with HDP. On the other hand, my comment to customers is that Hive provides a consistent approach, with semantics that are well known to database developers. Additionally, larger community involvement and the maturity of the product have hardened Hive over a number of years. JsonSerDe is the easy way to handle JSON in HDP: in return for a one-time table creation, you get better performance than with Drill, which does not seem like a bad trade-off at all.
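The one-time table creation mentioned above looks roughly like this; the table name, columns, and HDFS location are illustrative, and on HDP the JsonSerDe class ships in the hive-hcatalog-core jar:

```sql
-- Sketch of a Hive table over raw JSON files (names/paths are examples).
-- Each line of the files at LOCATION is expected to be one JSON object.
CREATE EXTERNAL TABLE events_json (
  id BIGINT,
  event_type STRING,
  payload STRING
)
ROW FORMAT SERDE 'org.apache.hive.hcatalog.data.JsonSerDe'
STORED AS TEXTFILE
LOCATION '/data/raw/events';
```

After that, the JSON is queryable with plain HiveQL (`SELECT event_type, COUNT(*) FROM events_json GROUP BY event_type;`) with no per-query schema discovery.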
10-05-2015
06:38 PM
I think it just takes some extra time when an architectural change is made to the cluster, but eventually Ambari displays it correctly. I noticed something similar after enabling HA, where Ambari took 5+ minutes to show the correct number of DataNodes. If the problem persists, it may be a different issue.
10-04-2015
07:07 PM
Hi Saptak, you reported multiple issues:

1) /usr/hdp/2.2.0.0-2041/sqoop/sqoop/bin/../../hcatalog does not exist!
Ans: Are you working with HDP 2.2 or 2.3? This could be a hardcoded path in one of the scripts that needs to be updated.

2) Access denied for user 'root'@'%' to database 'flightinfo'
Ans: MySQL access is typically limited to localhost by default. In a multi-node cluster, you need to open up that permission for the nodes that will be connecting to the MySQL server. You can either do this on a per-server basis (for all data nodes) or, for a non-prod environment, simply open up the permission for root@'%'. See this article for detailed instructions - https://rtcamp.com/tutorials/mysql/remote-access/

3) cp: cannot create regular file '/home/horton/solutions/flightdelays_clean.pig': Permission denied
Ans: Make sure the user you are logged in as has permission to write to the directory you are writing to.
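For issue (2), opening up remote access in a non-prod environment can be sketched as follows; the database name comes from the error message, while the password is a placeholder, and this is the MySQL 5.x GRANT syntax:

```sql
-- Non-prod sketch: allow root to reach the flightinfo database from any host.
-- Replace 'your_password' with the real root password.
GRANT ALL PRIVILEGES ON flightinfo.* TO 'root'@'%' IDENTIFIED BY 'your_password';
FLUSH PRIVILEGES;
```

For production, prefer granting access host-by-host (one `'root'@'hostname'` grant per connecting node) rather than the `%` wildcard.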
09-30-2015
09:01 PM
Ganesh - you are right that the process XML has to be changed, but are you not able to edit the XML directly via the UI?
09-28-2015
09:52 PM
```
2015-09-28 16:28:28,937 FATAL conf.Configuration (Configuration.java:loadResource(2638)) - error parsing conf file:/etc/hadoop/2.3.0.0-2557/0/xasecure-audit.xml
java.io.FileNotFoundException: /etc/hadoop/2.3.0.0-2557/0/xasecure-audit.xml (No such file or directory)
	at java.io.FileInputStream.open(Native Method)
	at java.io.FileInputStream.<init>(FileInputStream.java:146)
	at java.io.FileInputStream.<init>(FileInputStream.java:101)
```
09-28-2015
09:51 PM
1 Kudo
Labels:
- Apache Hadoop
- Apache Ranger
09-24-2015
03:52 PM
1 Kudo
I'm trying to get a demo up and running with Atlas, and I am running into issues just trying to view the Hive DB and tables. Can someone please:
- provide the resolution / pointers?
- provide a way to clean the Atlas demo and start again?
- share how the engineering team is visualizing the data in the graph database in Atlas?

Please see below -
2 - Output graph page errors out
3 - Exception causing the error is below
4 - I tried importing hive metadata but it errors out
Labels:
- Apache Atlas