Member since: 04-18-2014
Posts: 49
Kudos Received: 0
Solutions: 2
My Accepted Solutions
| Title | Views | Posted |
|---|---|---|
| | 13771 | 05-13-2014 07:38 AM |
| | 14828 | 05-12-2014 10:22 AM |
09-28-2017
01:40 PM
Hi, I am getting some Unicode error messages when running the job.

Run:

./spark-submit --packages org.apache.spark:spark-sql-kafka-0-10_2.11:2.2.0 file.py

Error:

UnicodeEncodeError: 'ascii' codec can't encode character u'\u2019' in position 39: ordinal not in range(128)

Any help? How can I fix this issue?
09-25-2015
12:26 PM
When I run:

sudo -u hdfs hdfs fsck / -listcourrptblocks

I get:

fsck: Illegal option '-listcourrptblocks'
09-25-2015
06:58 AM
Hi, when I run the fsck command it shows the total blocks to be 68 (avg. block size 286572 B). How can I have only 68 blocks?

[hdfs@cluster1 ~]$ hdfs fsck /
Connecting to namenode via http://cluster1.abc:50070
FSCK started by hdfs (auth:SIMPLE) from /192.168.101.241 for path / at Fri Sep 25 09:51:56 EDT 2015
....................................................................Status: HEALTHY
 Total size:    19486905 B
 Total dirs:    569
 Total files:   68
 Total symlinks:        0
 Total blocks (validated):      68 (avg. block size 286572 B)
 Minimally replicated blocks:   68 (100.0 %)
 Over-replicated blocks:        0 (0.0 %)
 Under-replicated blocks:       0 (0.0 %)
 Mis-replicated blocks:         0 (0.0 %)
 Default replication factor:    3
 Average block replication:     1.9411764
 Corrupt blocks:                0
 Missing replicas:              0 (0.0 %)
 Number of data-nodes:          3
 Number of racks:               1
FSCK ended at Fri Sep 25 09:51:56 EDT 2015 in 41 milliseconds

The filesystem under path '/' is HEALTHY

This is what I get when I run the hdfs dfsadmin -report command:

[hdfs@cluster1 ~]$ hdfs dfsadmin -report
Configured Capacity: 5715220577895 (5.20 TB)
Present Capacity: 5439327449088 (4.95 TB)
DFS Remaining: 5439303270400 (4.95 TB)
DFS Used: 24178688 (23.06 MB)
DFS Used%: 0.00%
Under replicated blocks: 0
Blocks with corrupt replicas: 0
Missing blocks: 0
Missing blocks (with replication factor 1): 504

Also, when I run a Hive job, it does not go beyond "Running job: job_1443147339086_0002". Could it be related? Any suggestion? Thank you!
Labels:
- Apache Hive
- HDFS
09-24-2015
05:52 PM
Hi, I am getting the below messages when I start Hive. I just installed CDH 5.4.

From Cloudera Manager:

Canary test failed to create file in directory /tmp/.cloudera_health_monitoring_canary_files.

From the server:

Caused by: org.apache.hadoop.ipc.RemoteException(org.apache.hadoop.hdfs.server.namenode.SafeModeException): Cannot create directory /tmp/hive/hdfs/5502ca90-629f-4c7e-afd5-dada9535d15c. Name node is in safe mode. The reported blocks 404 needs additional 504 blocks to reach the threshold 0.9990 of total blocks 908. The number of live datanodes 3 has reached the minimum number 0. Safe mode will be turned off automatically once the thresholds have been reached.

How do I fix it? Thanks!
05-07-2015
11:54 AM
This query gives me output. What is the difference between COLUMNS_OLD and COLUMNS_V2?

select TBL_NAME, COLUMN_NAME, TYPE_NAME
from TBLS t
left join COLUMNS_OLD c on c.SD_ID = t.SD_ID
where COLUMN_NAME regexp 'idorder';
05-06-2015
01:51 PM
The query comes back fast, but not all Hive columns/tables are displayed. I know there is a table with a column idMfg, but when I search for it, it does not show up. Would creating an index help here?
05-06-2015
12:11 PM
It seems like not all columns are indexed. How do I index the columns?
04-29-2015
12:51 PM
It is always handy to search tables and columns in the metadata. How do I search for tables in Hive based on a column name? I know we can do it via the metastore. I logged into the MySQL metastore but did not find a way to search. I found a way to search for a column name but could not find a way to relate it to the table name:

select COLUMN_NAME from COLUMNS_V2 where COLUMN_NAME regexp 'repo';

I also searched the TBLS table, but there is no column name there. Any help?
Labels:
- Apache Hive
01-22-2015
10:10 AM
Put it here: /opt/cloudera/parcels/CDH-4.3.1-1.cdh4.3.1.p0.110/lib/sqoop/lib/mysql-connector-java-5.1.29-bin.jar
05-13-2014
07:38 AM
I added this line to workflow.xml and it resolved the issue. I also copied hive-site.xml to the workflow directory.

<job-xml>hive-site.xml</job-xml>