Member since: 04-11-2016
Posts: 535
Kudos Received: 148
Solutions: 77
| Title | Views | Posted |
|---|---|---|
| | 6459 | 09-17-2018 06:33 AM |
| | 1438 | 08-29-2018 07:48 AM |
| | 2277 | 08-28-2018 12:38 PM |
| | 1644 | 08-03-2018 05:42 AM |
| | 1546 | 07-27-2018 04:00 PM |
01-03-2017
09:57 AM
@Asier Gomez Click on the exception and share the complete stack trace. In most cases, the issue is related to the hadoop.proxyuser.<user_name>.hosts and hadoop.proxyuser.<user_name>.groups configurations in core-site under the HDFS configs.
... View more
12-29-2016
10:17 AM
3 Kudos
The table definition clause "LINES TERMINATED BY" only supports the newline character '\n' right now. This is a known limitation, and HIVE-11996 has already been raised for it.
To handle newline characters within the data, you can use the Omniture data format, which uses an EscapedLineReader to get around Omniture's escaped tabs and newlines.
Please note that the data files need to include a '\' character before each newline within the data. Run the below commands in sequence; the required jars are attached along with the data file (note: the jars are available in the HDFS /tmp folder):
add jar /tmp/omnituredata-1.0.2-SNAPSHOT-jar-with-dependencies.jar;
add jar /tmp/omnituredata-1.0.2-SNAPSHOT-javadoc.jar;
add jar /tmp/omnituredata-1.0.2-SNAPSHOT-sources.jar;
add jar /tmp/omnituredata-1.0.2-SNAPSHOT.jar;
CREATE TABLE test8(id string,desc string)
ROW FORMAT DELIMITED FIELDS TERMINATED BY '\t'
STORED AS INPUTFORMAT 'org.rassee.omniture.hadoop.mapred.OmnitureDataFileInputFormat'
OUTPUTFORMAT 'org.apache.hadoop.hive.ql.io.IgnoreKeyTextOutputFormat'
LOCATION '/apps/hive/warehouse/test8';
A sample file under the HDFS location '/apps/hive/warehouse/test8' looks like this:
[hive@sindhu root]$ hdfs dfs -cat /apps/hive/warehouse/test8/file.txt
id desc
1 Hi\
I am a member and would like to open savings accts for both my kids aged 12 and 16.\
Is that possible and what documents do I need to bring?\
Also do I need to make an appt first?\
Thx!
4 rows selected (0.165 seconds)
Also, a table with the TEXT input format does not understand the escaped newline characters:
0: jdbc:hive2://sindhu:2181/> CREATE TABLE test9(id string,desc string)
0: jdbc:hive2://sindhu:2181/> ROW FORMAT DELIMITED FIELDS TERMINATED BY '\t'
0: jdbc:hive2://sindhu:2181/> STORED AS textfile LOCATION '/apps/hive/warehouse/test9';
No rows affected (0.209 seconds)
0: jdbc:hive2://sindhu:2181/> select * from test9;
+---------------------------------------------------------------------------------------+-------------------+--+
| test9.id | test9.desc |
+---------------------------------------------------------------------------------------+-------------------+--+
| id | desc |
| 1 | Hi\ |
| I am a member and would like to open savings accts for both my kids aged 12 and 16.\ | NULL |
| Is that possible and what documents do I need to bring?\ | NULL |
| Also do I need to make an appt first?\ | NULL |
| Thx! | NULL |
| 2 | hi jihidp\ |
| uiunoo! | NULL |
| 3 | hi who are you\ |
| talking with | NULL |
+---------------------------------------------------------------------------------------+-------------------+--+
... View more
12-26-2016
10:52 AM
@chennuri gouri shankar This is a known issue with this version of Ambari when the database used is MySQL. Manually create the required table using the following CREATE TABLE statement:
CREATE TABLE DS_JOBIMPL_<REPLACE THIS WITH THE NUMBER IN THE ACTUAL TABLE NAME> (
ds_id character varying(255) NOT NULL,
ds_applicationid character varying(2800),
ds_conffile character varying(2800),
ds_dagid character varying(2800),
ds_dagname character varying(2800),
ds_database character varying(2800),
ds_datesubmitted bigint,
ds_duration bigint,
ds_forcedcontent character varying(2800),
ds_globalsettings character varying(2800),
ds_logfile character varying(2800),
ds_owner character varying(2800),
ds_queryfile character varying(2800),
ds_queryid character varying(2800),
ds_referrer character varying(2800),
ds_sessiontag character varying(2800),
ds_sqlstate character varying(2800),
ds_status character varying(2800),
ds_statusdir character varying(2800),
ds_statusmessage character varying(2800),
ds_title character varying(2800)
);
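For illustration, a minimal sketch of applying the statement, assuming the default Ambari database name and user (both "ambari") and that the actual table name (for example DS_JOBIMPL_271, taken from the Hive View error message) has been substituted into a file create_ds_jobimpl.sql; adjust names and credentials to your environment:
# Run the CREATE TABLE statement against the Ambari database
mysql -u ambari -p ambari < create_ds_jobimpl.sql
# Verify that the table now exists (the table name pattern here is only an example)
mysql -u ambari -p ambari -e "SHOW TABLES LIKE 'DS_JOBIMPL%';"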
... View more
12-19-2016
07:16 AM
1 Kudo
@Rajendra Kalepu The --validate option validates the data imported into the HDFS/Hive table against the source table by comparing the source table row count with the number of rows copied. Refer to the link for details.
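A minimal sketch of an import with validation; the JDBC URL, credentials, table, and target directory below are placeholders. After the copy completes, --validate compares the source table row count with the number of rows written and flags a mismatch between the two counts:
# Placeholder connection string, credentials, table, and target directory
sqoop import \
  --connect jdbc:mysql://dbhost:3306/sales \
  --username sqoop_user -P \
  --table orders \
  --target-dir /user/rajendra/orders \
  --validate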
... View more
12-15-2016
08:28 AM
2 Kudos
@Yukti Agrawal First check from the Resource Manager UI whether the query creates an application and whether there are any issues on the application side.
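If you prefer the command line to the Resource Manager UI, a quick sketch (the application ID below is only a placeholder):
# List applications currently running on the cluster
yarn application -list -appStates RUNNING
# Fetch the logs of the application started for the query (ID is a placeholder)
yarn logs -applicationId application_1481200000000_0042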
... View more
12-09-2016
10:39 AM
1 Kudo
@subash sharma Please add the proxy properties for the root user: hadoop.proxyuser.root.groups=* and hadoop.proxyuser.root.hosts=*.
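For reference, a sketch of how these properties look in core-site.xml (in Ambari, add them under the HDFS configs and restart the affected services); tighten the wildcards to specific hosts and groups if your security policy requires it:
<property>
  <name>hadoop.proxyuser.root.hosts</name>
  <value>*</value>
</property>
<property>
  <name>hadoop.proxyuser.root.groups</name>
  <value>*</value>
</property>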
... View more
12-02-2016
12:31 PM
@Pooja Sahu Try the below create statement:
CREATE EXTERNAL TABLE fix_map (tag MAP<INT, STRING>)
ROW FORMAT DELIMITED
FIELDS TERMINATED BY '2'
COLLECTION ITEMS TERMINATED BY '1'
MAP KEYS TERMINATED BY '='
LOCATION '/user/pooja/fix/';
... View more
12-02-2016
11:30 AM
1 Kudo
@Gayathri Reddy G The issue could be the MapReduce mappers running out of memory due to a high fetch size. Use the sqoop command with the MapReduce memory parameters, for example:
sqoop import -Dmapreduce.map.memory.mb=8192 -Dmapreduce.map.java.opts=-Xmx7200m ..
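A fuller sketch of the same idea with placeholder connection details; --fetch-size can also be lowered so that each mapper buffers fewer rows from the database at a time:
# Placeholder JDBC URL, credentials, table name, and target directory
sqoop import \
  -Dmapreduce.map.memory.mb=8192 \
  -Dmapreduce.map.java.opts=-Xmx7200m \
  --connect jdbc:mysql://dbhost:3306/sales \
  --username sqoop_user -P \
  --table transactions \
  --target-dir /user/gayathri/transactions \
  --fetch-size 1000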
... View more
12-02-2016
11:18 AM
@Takashi Nasu Are you able to run the queries now? If yes, then the issue was the missing HDFS directory and the lack of a Ranger policy in place.
... View more
12-02-2016
10:44 AM
1 Kudo
@Takashi Nasu Create the directory /user/admin under HDFS as below and then try running the query:
hdfs dfs -mkdir /user/admin
hdfs dfs -chown -R admin:admin /user/admin
... View more