Member since: 07-31-2013
Posts: 1924
Kudos Received: 462
Solutions: 311
My Accepted Solutions
| Title | Views | Posted |
|---|---|---|
| | 1952 | 07-09-2019 12:53 AM |
| | 11789 | 06-23-2019 08:37 PM |
| | 9077 | 06-18-2019 11:28 PM |
| | 10031 | 05-23-2019 08:46 PM |
| | 4442 | 05-20-2019 01:14 AM |
10-07-2015
01:48 PM
Hi, thanks for the help. I am now able to set the minutes parameter.
10-02-2015
07:52 AM
Once again, thanks. Regarding loading the full XML resource: I understand that is generally the better approach, but it is not permitted in our environment, so we ended up adding the properties one by one. I am trying to convince the ops team to adopt the conf-folder approach. Thanks, MG
10-01-2015
04:00 AM
Adding a new disk to a DN/NM configuration is as simple as appending the path where the disk is mounted to their data.dirs/local-dirs keys. To change log directories, you can use the CM configuration fields to set the new locations, or alternatively move the log dirs and create symlinks from the original paths to the new ones. Does this help?
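As a concrete sketch of the first step, these are the relevant keys (the /data/2 mount point is a hypothetical example; in a CM-managed cluster, set these through the CM configuration pages rather than editing the files by hand):

```xml
<!-- hdfs-site.xml: append the new mount to the DataNode data dirs -->
<property>
  <name>dfs.datanode.data.dir</name>
  <value>/data/1/dfs/dn,/data/2/dfs/dn</value>
</property>

<!-- yarn-site.xml: append the new mount to the NodeManager local dirs -->
<property>
  <name>yarn.nodemanager.local-dirs</name>
  <value>/data/1/yarn/nm,/data/2/yarn/nm</value>
</property>
```

Restart the affected DataNode and NodeManager roles after the change so the new directories are picked up.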
10-01-2015
03:19 AM
1 Kudo
Unless your NameNode is down, the only other likely cause, aside from firewall and similar misconfigurations, is that the NameNode port (and perhaps other services' ports) is listening on a different network interface than the internal-IP one your clients use. You can verify this with netstat on the refusing service's host.
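For example, assuming the default NameNode RPC port 8020 (substitute your configured port), the check on the NameNode host would look like:

```shell
# List listening TCP sockets and filter for the NameNode RPC port
netstat -tlnp | grep ':8020'

# A NameNode reachable from other hosts binds the wildcard address or
# the internal IP clients actually use, e.g.:
#   tcp  0  0  0.0.0.0:8020  0.0.0.0:*  LISTEN  12345/java
# A line like 127.0.0.1:8020 means only local clients can connect.
```

If the socket is bound to the wrong interface, adjust the service's bind-address configuration (or the host's hostname/IP resolution) accordingly.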
09-28-2015
11:57 AM
We have a similar issue when reading from an HBase snapshot. Here is our scenario:

1. Created a table in HBase: create 'TEST_HBASE_TABLE','cf'
2. Created a Hive external HBase table named TEST_EXTERNAL_HBASE_TABLE
3. Created a snapshot: snapshot 'TEST_HBASE_TABLE','TEST_SNAPSHOT'
4. Set the Hive HBase property: set hive.hbase.snapshot.name=TEST_SNAPSHOT
5. Executed the query: select * from TEST_EXTERNAL_HBASE_TABLE

We got the exception below. Need some help fixing this issue.

2015-09-28 16:10:29,025 WARN org.apache.hadoop.security.UserGroupInformation: PriviledgedActionException as:hbase_user (auth:SIMPLE) cause:org.apache.hive.service.cli.HiveSQLException: java.io.IOException: java.io.IOException: java.lang.NoClassDefFoundError: Could not initialize class org.apache.hadoop.hbase.io.hfile.CacheStats
org.apache.hive.service.cli.HiveSQLException: java.io.IOException: java.io.IOException: java.lang.NoClassDefFoundError: Could not initialize class org.apache.hadoop.hbase.io.hfile.CacheStats
at org.apache.hive.service.cli.operation.SQLOperation.getNextRowSet(SQLOperation.java:329)
at org.apache.hive.service.cli.operation.OperationManager.getOperationNextRowSet(OperationManager.java:250)
at org.apache.hive.service.cli.session.HiveSessionImpl.fetchResults(HiveSessionImpl.java:699)
at sun.reflect.GeneratedMethodAccessor4.invoke(Unknown Source)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:497)
at org.apache.hive.service.cli.session.HiveSessionProxy.invoke(HiveSessionProxy.java:78)
at org.apache.hive.service.cli.session.HiveSessionProxy.access$000(HiveSessionProxy.java:36)
at org.apache.hive.service.cli.session.HiveSessionProxy$1.run(HiveSessionProxy.java:63)
at java.security.AccessController.doPrivileged(Native Method)
at javax.security.auth.Subject.doAs(Subject.java:422)
at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1671)
at org.apache.hive.service.cli.session.HiveSessionProxy.invoke(HiveSessionProxy.java:59)
at com.sun.proxy.$Proxy17.fetchResults(Unknown Source)
at org.apache.hive.service.cli.CLIService.fetchResults(CLIService.java:451)
at org.apache.hive.service.cli.thrift.ThriftCLIService.FetchResults(ThriftCLIService.java:676)
at org.apache.hive.service.cli.thrift.TCLIService$Processor$FetchResults.getResult(TCLIService.java:1553)
at org.apache.hive.service.cli.thrift.TCLIService$Processor$FetchResults.getResult(TCLIService.java:1538)
at org.apache.thrift.ProcessFunction.process(ProcessFunction.java:39)
at org.apache.thrift.TBaseProcessor.process(TBaseProcessor.java:39)
at org.apache.hive.service.auth.TSetIpAddressProcessor.process(TSetIpAddressProcessor.java:56)
at org.apache.thrift.server.TThreadPoolServer$WorkerProcess.run(TThreadPoolServer.java:285)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
at java.lang.Thread.run(Thread.java:745)
Caused by: java.io.IOException: java.io.IOException: java.lang.NoClassDefFoundError: Could not initialize class org.apache.hadoop.hbase.io.hfile.CacheStats
at org.apache.hadoop.hive.ql.exec.FetchOperator.getNextRow(FetchOperator.java:507)
at org.apache.hadoop.hive.ql.exec.FetchOperator.pushRow(FetchOperator.java:414)
at org.apache.hadoop.hive.ql.exec.FetchTask.fetch(FetchTask.java:138)
at org.apache.hadoop.hive.ql.Driver.getResults(Driver.java:1657)
at org.apache.hive.service.cli.operation.SQLOperation.getNextRowSet(SQLOperation.java:324)
... 24 more
Caused by: java.io.IOException: java.lang.NoClassDefFoundError: Could not initialize class org.apache.hadoop.hbase.io.hfile.CacheStats
at org.apache.hadoop.hbase.regionserver.HRegion.initializeRegionStores(HRegion.java:951)
at org.apache.hadoop.hbase.regionserver.HRegion.initializeRegionInternals(HRegion.java:841)
at org.apache.hadoop.hbase.regionserver.HRegion.initialize(HRegion.java:814)
at org.apache.hadoop.hbase.regionserver.HRegion.openHRegion(HRegion.java:5828)
at org.apache.hadoop.hbase.regionserver.HRegion.openHRegion(HRegion.java:5794)
at org.apache.hadoop.hbase.regionserver.HRegion.openHRegion(HRegion.java:5765)
at org.apache.hadoop.hbase.client.ClientSideRegionScanner.<init>(ClientSideRegionScanner.java:57)
at org.apache.hadoop.hbase.mapreduce.TableSnapshotInputFormatImpl$RecordReader.initialize(TableSnapshotInputFormatImpl.java:190)
at org.apache.hadoop.hbase.mapred.TableSnapshotInputFormat$TableSnapshotRecordReader.<init>(TableSnapshotInputFormat.java:96)
at org.apache.hadoop.hbase.mapred.TableSnapshotInputFormat.getRecordReader(TableSnapshotInputFormat.java:150)
at org.apache.hadoop.hive.hbase.HiveHBaseTableSnapshotInputFormat.getRecordReader(HiveHBaseTableSnapshotInputFormat.java:74)
09-27-2015
07:13 AM
> How can I have only 68 blocks?

That depends on how much data your HDFS is carrying. Is the number much lower than expected, and does it not match the file listing from 'hadoop fs -ls -R /'? The space report says only about 23 MB is used by HDFS, so the block count looks OK to me.

> Also, when I run hive job, it does not go beyond "Running job: job_1443147339086_0002". Could it be related?

That would be unrelated, but to resolve it, consider raising the values under YARN -> Configuration -> Container Memory (NodeManager) and Container Virtual CPUs (NodeManager).
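To cross-check the block count yourself, something like the following should work (stock Hadoop commands; the root path / is just an example scope):

```shell
# Block and byte totals as the NameNode sees them
hdfs fsck / | grep -E 'Total blocks|Total size'

# Count the files in the namespace for comparison; each small file
# occupies at least one block, so 68 blocks for ~23 MB is plausible.
hadoop fs -ls -R / | grep -v '^d' | wc -l
```

If the two numbers are wildly inconsistent, that would point to missing or corrupt blocks, which fsck also reports.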