Member since: 12-16-2015
Posts: 23
Kudos Received: 6
Solutions: 1
My Accepted Solutions
Title | Views | Posted
---|---|---
| 6126 | 09-15-2016 08:19 AM
06-22-2016 03:10 AM
Thank you.
06-21-2016 07:06 AM
I want to download the file through the browser only.
06-21-2016 06:19 AM
I am able to download an HDFS file /org/project/archived/data/hive/warehouse/Stats/2016_06_20.txt in a browser through Knox using the URL below.
http://hostname:8443/knox/nm1/webhdfs/v1/org/project/archived/data/hive/warehouse/Stats/2016_06_20.txt?op=OPEN
Now I have a file in a Hadoop archive, as below.
har:///org/project/archived/data/hive/warehouse/test.har/Stats/2016_06_20.txt
How can I do the same for the above file?
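The working plain-HDFS case above can be sketched as a small URL builder. This is not the author's code: the gateway host, port, topology path, and file path are taken from the post, and actually fetching the URL would still need credentials (and, over HTTPS, the Knox certificate in the JVM truststore).

```java
// Minimal sketch of assembling the WebHDFS OPEN URL quoted in the post.
public class WebHdfsOpenUrl {

    // Build "<gateway>/webhdfs/v1<hdfsPath>?op=OPEN" for a plain HDFS file.
    static String openUrl(String gateway, String hdfsPath) {
        return gateway + "/webhdfs/v1" + hdfsPath + "?op=OPEN";
    }

    public static void main(String[] args) {
        String url = openUrl(
                "http://hostname:8443/knox/nm1", // gateway from the post
                "/org/project/archived/data/hive/warehouse/Stats/2016_06_20.txt");
        System.out.println(url);
        // The resulting URL can then be opened in a browser, or fetched
        // programmatically with java.net.HttpURLConnection once
        // authentication is in place.
    }
}
```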
Labels:
- Apache Hadoop
- Apache Knox
01-12-2016 04:58 AM
My table 'a' is in text format and 'b' is in ORC format. When I keep both as text, it works fine.
01-12-2016 04:01 AM
Yes, I have permission to write. I am able to run the query for a smaller dataset, but not for a larger dataset.
01-11-2016 05:05 AM
1 Kudo
I am running an INSERT OVERWRITE query as below.
INSERT OVERWRITE DIRECTORY '/org/data/tmp/webapptempspace/UC3/log'
SELECT a.* FROM a JOIN b ON ucase(a.name) = ucase(b.name);
It works fine when one of the tables has a smaller dataset, but when both tables have huge data, it throws the error below.
Failed with exception Unable to move source /org/data/tmp/webapptempspace/UC3/log/.hive-staging_hive_2016-01-11_04-31-06_067_6297667876520454770-1/-ext-10000 to destination
Labels:
- Apache Hadoop
- Apache Hive
12-21-2015 05:52 PM
I have added the custom jar to my remote server too. Yes, I am using HiveServer2. The code was working in HDP 2.1, but I am getting this issue after upgrading to 2.3.
12-21-2015 10:22 AM
1 Kudo
I have created a UDF and added it to Hive as a function as below.
CREATE FUNCTION HVDB_Analysis_Archived.UDF_ForLogDatatoXML AS 'UDF_ForLogDatatoXML' USING JAR 'hdfs://InnovationLab/tmp/UDF_ForLogDatatoXML.jar';
I am able to use this function through the Hive CLI, but through a Java program using a JDBC connection (with Knox configuration) I get the error below.
Exception in thread "main" org.apache.hive.service.cli.HiveSQLException: Error while compiling statement: FAILED: SemanticException Line 0:-1 Invalid function 'HVDB_Analysis_Archived.UDF_ForLogDatatoXML'
at org.apache.hive.jdbc.Utils.verifySuccess(Utils.java:255)
at org.apache.hive.jdbc.Utils.verifySuccessWithInfo(Utils.java:241)
at org.apache.hive.jdbc.HiveStatement.execute(HiveStatement.java:247)
at org.apache.hive.jdbc.HiveStatement.executeQuery(HiveStatement.java:378)
at find_data_from_hive_uc3.main(find_data_from_hive_uc3.java:60)
Caused by: org.apache.hive.service.cli.HiveSQLException: Error while compiling statement: FAILED: SemanticException Line 0:-1 Invalid function 'HVDB_XDLogFileAnalysis_Archived.UDF_ForLogDatatoXML'
at org.apache.hive.service.cli.operation.Operation.toSQLException(Operation.java:315)
at org.apache.hive.service.cli.operation.SQLOperation.prepare(SQLOperation.java:112)
at org.apache.hive.service.cli.operation.SQLOperation.runInternal(SQLOperation.java:181)
at org.apache.hive.service.cli.operation.Operation.run(Operation.java:257)
at org.apache.hive.service.cli.session.HiveSessionImpl.executeStatementInternal(HiveSessionImpl.java:388)
at org.apache.hive.service.cli.session.HiveSessionImpl.executeStatementAsync(HiveSessionImpl.java:375)
at org.apache.hive.service.cli.CLIService.executeStatementAsync(CLIService.java:274)
at org.apache.hive.service.cli.thrift.ThriftCLIService.ExecuteStatement(ThriftCLIService.java:486)
at org.apache.hive.service.cli.thrift.TCLIService$Processor$ExecuteStatement.getResult(TCLIService.java:1313)
at org.apache.hive.service.cli.thrift.TCLIService$Processor$ExecuteStatement.getResult(TCLIService.java:1298)
at org.apache.thrift.ProcessFunction.process(ProcessFunction.java:39)
at org.apache.thrift.TBaseProcessor.process(TBaseProcessor.java:39)
at org.apache.thrift.server.TServlet.doPost(TServlet.java:83)
at org.apache.hive.service.cli.thrift.ThriftHttpServlet.doPost(ThriftHttpServlet.java:171)
at javax.servlet.http.HttpServlet.service(HttpServlet.java:727)
at javax.servlet.http.HttpServlet.service(HttpServlet.java:820)
at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:565)
at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:479)
at org.eclipse.jetty.server.session.SessionHandler.doHandle(SessionHandler.java:225)
at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1031)
at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:406)
at org.eclipse.jetty.server.session.SessionHandler.doScope(SessionHandler.java:186)
at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:965)
at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:117)
at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:111)
at org.eclipse.jetty.server.Server.handle(Server.java:349)
at org.eclipse.jetty.server.AbstractHttpConnection.handleRequest(AbstractHttpConnection.java:449)
at org.eclipse.jetty.server.AbstractHttpConnection$RequestHandler.content(AbstractHttpConnection.java:925)
at org.eclipse.jetty.http.HttpParser.parseNext(HttpParser.java:952)
at org.eclipse.jetty.http.HttpParser.parseAvailable(HttpParser.java:235)
at org.eclipse.jetty.server.AsyncHttpConnection.handle(AsyncHttpConnection.java:76)
at org.eclipse.jetty.io.nio.SelectChannelEndPoint.handle(SelectChannelEndPoint.java:609)
at org.eclipse.jetty.io.nio.SelectChannelEndPoint$1.run(SelectChannelEndPoint.java:45)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
at java.lang.Thread.run(Thread.java:745)
Labels:
- Apache Hadoop
- Apache Hive
- Apache Knox
12-17-2015 05:27 AM
Thank you @Alex Miller. I imported the Knox SSL certificate into cacerts and used the connection string below.
"jdbc:hive2://knoxserver.net:443/;ssl=true;transportMode=http;httpPath=knox/nn01/hive", "username", "pwd"
It finally worked. 🙂
12-16-2015 04:25 PM
I changed the connection string to the one below.
"jdbc:hive2://knoxserver.net:443/default;ssl=false;hive.server2.transport.mode=http;hive.server2.thrift.http.path=knox/sandbox/hive", "username", "pwd"
And I am getting the error below.
INFO: Transport Used for JDBC connection: null
Exception in thread "main" java.sql.SQLException: Could not open client transport with JDBC Uri: jdbc:hive2://knoxserver.net:443/default;ssl=false;hive.server2.transport.mode=http;hive.server2.thrift.http.path=knox/sandbox/hive: null
at org.apache.hive.jdbc.HiveConnection.openTransport(HiveConnection.java:237)