Member since 02-07-2018 · 33 Posts · 1 Kudos Received · 0 Solutions
09-07-2018
09:50 AM
I'm using HDFS version 2.7.3.2.6; PolyBase is on SQL 2013. The PolyBase error is attached as polybase-error.txt, and the YARN error as yarnerror.txt.
07-15-2018
11:32 PM
The ambari-server log says: Error Processing URI: /api/v1/views/HIVE/versions/2.0.0/instances/AUTO_HIVE20_INSTANCE/resources/ddl/databases/kyctext_kpcustomers/tables/customers/info - (java.lang.NumberFormatException) For input string: "6180443254". I have already set the AMBARI_JVM_ARGS variable in ambari-env.sh to -Xmx4096m -XX:PermSize=128m -XX:MaxPermSize=128m, and I still get that error.
05-11-2018
02:19 PM
Netstat shows only one PID listening on port 8671.
05-11-2018
02:03 PM
I tried to change the port, but the error still appears. I changed the port to 8671: Failed to start ping port listener of: Could not open port 8671 because port already used by another process:
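For anyone hitting the same thing: the agent's ping port lives in /etc/ambari-agent/conf/ambari-agent.ini, so changing it means editing that file and restarting the agent. A minimal sketch, run against a temp copy so nothing real is touched (the port number is the one tried in this thread):

```shell
# Sketch: change the ambari-agent ping port in a copy of the config.
# /etc/ambari-agent/conf/ambari-agent.ini normally carries ping_port under
# the [agent] section; this works on a temp copy so no live file is modified.
CONF=$(mktemp)
cat > "$CONF" <<'EOF'
[agent]
hostname=datanode4
ping_port=8670
EOF

# Replace the ping_port value (8671 is the port tried in this thread).
sed -i 's/^ping_port=.*/ping_port=8671/' "$CONF"

grep '^ping_port=' "$CONF"
```

On a real node the same edit would be followed by `ambari-agent restart`.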
05-11-2018
01:17 PM
Can I just use any port for the ping port, as long as it is not used by another application?
05-11-2018
07:51 AM
I'm running it as the root user.

[root@datanode4 ~]# ls -ld /var/run/ambari-agent
drwxr-xr-x 2 root root 4096 May 11 15:45 /var/run/ambari-agent
[root@datanode4 ~]# ls -l /var/run/ambari-agent/ambari-agent.pid
-rw-r--r-- 1 root root 5 May 11 15:45 /var/run/ambari-agent/ambari-agent.pid
[root@datanode4 ~]# cat /var/log/ambari-agent/ambari-agent.out
Failed to start ping port listener of: Could not open port 8670 because port already used by another process:
UID PID PPID C STIME TTY TIME CMD
root 19874 10487 57 15:31 ? 00:00:16 /usr/bin/python /usr/lib/python2
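The `CMD` column above is truncated, but the PID column already identifies the owner of the port. A small sketch of pulling the PID out of a netstat-style listing (the sample line below is illustrative, not captured from this node):

```shell
# Sketch: given a `netstat -tnlp`-style line, extract the PID that owns the port.
# The sample line is illustrative, not taken from a live system.
LINE='tcp  0  0 0.0.0.0:8670  0.0.0.0:*  LISTEN  19874/python'
PID=$(echo "$LINE" | awk '{print $NF}' | cut -d/ -f1)
echo "$PID"

# On a live node you would run, as root:
#   netstat -tnlp | grep ':8670 '
# and then `ps -fp <pid>` to see what that process is before deciding to stop it.
```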
05-11-2018
07:07 AM
/var/log/ambari-agent/ambari-agent.out says: /var/run/ambari-agent/ambari-agent.pid already exists, exiting. It only appears on one datanode.
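A common cause of this message is a pid file left behind by an agent that did not shut down cleanly. A sketch of the stale-pid check, using a temp directory and a deliberately bogus PID so the real /var/run/ambari-agent is untouched:

```shell
# Sketch: clear a stale ambari-agent pid file before restarting the agent.
# Uses a temp directory and an impossible PID so nothing real is affected.
RUN_DIR=$(mktemp -d)
echo 99999999 > "$RUN_DIR/ambari-agent.pid"

# If no process with that PID exists, the pid file is stale and safe to remove.
PID=$(cat "$RUN_DIR/ambari-agent.pid")
if ! kill -0 "$PID" 2>/dev/null; then
    rm -f "$RUN_DIR/ambari-agent.pid"
fi

ls "$RUN_DIR"
```

On the real node the directory would be /var/run/ambari-agent, followed by `ambari-agent restart`.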
04-22-2018
11:26 PM
Hi, have you solved this problem? I am having the same problem as yours.
04-15-2018
07:38 AM
Thank you for your fast response, @Geoffrey Shelton Okot. I already know that --map-column-hive is used to override the default mapping from SQL type to Hive type. Is there any other way to get rid of that error without using --map-column-hive?
04-15-2018
07:17 AM
I'm only encountering this error when I import from MSSQL on Windows Server 2003. When I import from MySQL 5.5 to MySQL 5.6 there is no problem. I'm using sqljdbc4.jar as the SQL driver. I have also attached a screenshot of the error. Please help me get rid of this error without mapping the timestamp column to string.
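One workaround sometimes suggested (an assumption on my part, not something confirmed in this thread) is to cast the datetime column to a string on the SQL Server side with a free-form --query import, so no Hive-side remapping is needed. Table and column names below are placeholders:

```shell
# Sketch (hypothetical names): cast the datetime column in T-SQL with a
# free-form --query import, instead of remapping it via --map-column-hive.
# Style 121 is SQL Server's ODBC-canonical datetime format with milliseconds.
SQOOP_CMD="sqoop import \
  --connect 'jdbc:sqlserver://host:1433;databaseName=mydb' \
  --username user --password-file /user/me/.pw \
  --query 'SELECT id, name, CONVERT(VARCHAR(23), created_at, 121) AS created_at FROM dbo.MyTable WHERE \$CONDITIONS' \
  --split-by id --target-dir /tmp/mytable_import"
echo "$SQOOP_CMD"
```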
04-01-2018
06:47 PM
ERROR:
[HY000] [Hortonworks][Hardy] (35) Error from server: error code: '1' error message: 'Error while processing statement: FAILED: Execution Error, return code 1 from org.apache.hadoop.hive.ql.exec.tez.TezTask'.

Here's my code:

using (OdbcConnection mycon1 = new OdbcConnection(conn.hiveconnection()))
{
    mycon1.Open();
    using (cmd = mycon1.CreateCommand())
    {
        cmd.CommandText = "insert into table tblname values(1,'test@gmail.com','1990-11-03','09123456789','NULL', 'FEMALE','','test123456789','2017-03-14 11:37:56.0', '0000-00-00 00-00-00.0',1,0,0,'test','address', '-','lname', '-','fname','03/28/2018 15:54:53','0000-00-00 00-00-00.0')";
        cmd.ExecuteNonQuery();
    }
}
mycon1.Close();

The error is thrown on this line: cmd.ExecuteNonQuery();
I've attached a screenshot from the Tez view.
The app runs as the hadoop user.
I am using the Hortonworks ODBC driver.
Is there something wrong with my query? When I run it in Ambari it works.
Does anyone know this kind of problem? Please, somebody help. Thanks.
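Two things worth checking in the statement itself: 'NULL' in quotes inserts the literal string NULL rather than a null value, and '0000-00-00 00-00-00.0' is not a parseable Hive timestamp. To separate a driver problem from a query problem, the same statement can be run through beeline; a sketch with a placeholder JDBC URL (the VALUES list is elided here):

```shell
# Sketch: run the same INSERT through beeline to separate an ODBC-driver
# problem from a Hive/Tez problem. Host, port, and VALUES are placeholders.
BEELINE_CMD="beeline -u 'jdbc:hive2://hiveserver:10000/default' -n hadoop \
  -e \"insert into table tblname values(...)\""
echo "$BEELINE_CMD"
```

If beeline fails the same way, the problem is in the statement or Tez, not the ODBC driver.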
04-01-2018
06:21 PM
@Rahul Soni Yes, sir. That's what I see in the ambari-server logs with regard to the error I posted above.
03-31-2018
03:27 PM
1. It does not happen with other tables whose size is below 5 GB.
2. It is not Kerberized.
3. Yes, I attached a screenshot of the logs.
03-31-2018
01:54 PM
hive-20-error.png — this is what I see in the Ambari server logs. Please check whether you have experienced the same thing.
03-30-2018
09:55 PM
I just experienced this kind of error in Hive View 2.0, in the TABLES tab, when I try to view the table properties of a table about 5 GB in size.
03-27-2018
10:26 AM
This error occurs every time I select the table in the Tables tab in Hive View 2.0. The source table from MySQL does not have an int primary key, and the table size is 5.5 GB. If you have any idea how to fix it, I need your help. Thanks.
03-27-2018
09:06 AM
Thank you for answering my questions, sir. Do you mean I have to alter the table in MySQL to add a surrogate int PK?
03-23-2018
10:00 PM
Okay, thanks a lot for explaining it to me so well. I really have a lot of things to learn for this job.
03-23-2018
04:57 AM
The import succeeds if I set the mappers to 1. I also notice that when I'm not using 1 mapper, the YARN memory gets fully consumed.
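If the job only survives with one mapper, the per-mapper memory request may simply be too large for the queue to run several at once. A sketch of capping parallelism while sizing the mapper containers explicitly (the figures are illustrative, not tuned for this cluster):

```shell
# Sketch: cap parallelism and size per-mapper memory explicitly so several
# mappers fit in the YARN queue. 3 mappers / 2 GB are illustrative figures;
# java.opts is conventionally ~80% of the container size.
SQOOP_OPTS="-Dmapreduce.map.memory.mb=2048 -Dmapreduce.map.java.opts=-Xmx1638m"
NUM_MAPPERS=3
echo "sqoop import $SQOOP_OPTS -m $NUM_MAPPERS ..."
```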
03-23-2018
04:54 AM
Here are the stderr and stdout: stdout.txt & stderr.txt
03-23-2018
04:46 AM
No, I don't use --split-by, whether mappers is set to 1 or any other value.
03-23-2018
04:26 AM
Yes, I am using --direct.
03-23-2018
04:08 AM
import -Dorg.apache.sqoop.splitter.allow_text_splitter=true --connect jdbc:mysql://x.x.x.x:xxxx/kpcustomers --username root --password ******* --table customers --fields-terminated-by '|' -m 5 --hive-import --hive-overwrite --hive-table testing.customers --direct --verbose
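One detail in that command: an unquoted | after --fields-terminated-by would be taken by the shell as a pipe before Sqoop ever sees it, so the delimiter needs quoting. A quick demonstration of what the quoted form actually passes through as an argument:

```shell
# Sketch: show what sqoop receives when the delimiter is quoted. The shell
# splits the quoted arguments normally; '|' arrives as a literal argument
# instead of starting a pipeline.
set -- --fields-terminated-by '|' -m 5
DELIM_ARG="$2"
echo "delimiter argument: $DELIM_ARG"
```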
03-23-2018
02:52 AM
I'm importing a 5.6 GB table, and the error is: Error: java.io.IOException: mysqldump terminated with status 2
at org.apache.sqoop.mapreduce.MySQLDumpMapper.map(MySQLDumpMapper.java:485)
at org.apache.sqoop.mapreduce.MySQLDumpMapper.map(MySQLDumpMapper.java:49)
at org.apache.hadoop.mapreduce.Mapper.run(Mapper.java:146)
at org.apache.hadoop.mapred.MapTask.runNewMapper(MapTask.java:787)
at org.apache.hadoop.mapred.MapTask.run(MapTask.java:341)
at org.apache.hadoop.mapred.YarnChild$2.run(YarnChild.java:170)
at java.security.AccessController.doPrivileged(Native Method)
at javax.security.auth.Subject.doAs(Subject.java:422)
at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1866)
at org.apache.hadoop.mapred.YarnChild.main(YarnChild.java:164)
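Since --direct shells out to mysqldump on whichever nodes run the map tasks, a status-2 exit often means the binary is missing on a worker or that worker cannot reach MySQL. A sketch of the per-node presence check:

```shell
# Sketch: --direct requires mysqldump on every node that runs a map task.
# This helper reports whether the binary is on PATH; run it on each worker.
check_mysqldump() {
    if command -v mysqldump >/dev/null 2>&1; then
        echo present
    else
        echo missing
    fi
}
RESULT=$(check_mysqldump)
echo "mysqldump: $RESULT"
```

The MySQL-side connectivity and privileges for the import user would need a separate check from the same nodes.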
03-06-2018
09:23 PM
mssql-incremental-workflow.txt, sqoop-incrementam-mssql.txt

23747 [main] INFO org.apache.sqoop.orm.CompilationManager - Writing jar file: /tmp/sqoop-yarn/compile/5b6f42ea55506c6953c0204fe6bf82b7/dbo.Test.jar
23767 [main] INFO org.apache.sqoop.manager.SqlManager - Executing SQL statement: SELECT t.* FROM dbo.Test AS t WHERE 1=0
23831 [main] ERROR org.apache.sqoop.manager.SqlManager - SQL exception accessing current timestamp: com.microsoft.sqlserver.jdbc.SQLServerException: Line 1: Incorrect syntax near ')'.
com.microsoft.sqlserver.jdbc.SQLServerException: Line 1: Incorrect syntax near ')'.
at com.microsoft.sqlserver.jdbc.SQLServerException.makeFromDatabaseError(SQLServerException.java:197)
at com.microsoft.sqlserver.jdbc.SQLServerStatement.getNextResult(SQLServerStatement.java:1493)
at com.microsoft.sqlserver.jdbc.SQLServerStatement.doExecuteStatement(SQLServerStatement.java:775)
at com.microsoft.sqlserver.jdbc.SQLServerStatement$StmtExecCmd.doExecute(SQLServerStatement.java:676)
at com.microsoft.sqlserver.jdbc.TDSCommand.execute(IOBuffer.java:4575)
at com.microsoft.sqlserver.jdbc.SQLServerConnection.executeCommand(SQLServerConnection.java:1400)
at com.microsoft.sqlserver.jdbc.SQLServerStatement.executeCommand(SQLServerStatement.java:179)
at com.microsoft.sqlserver.jdbc.SQLServerStatement.executeStatement(SQLServerStatement.java:154)
at com.microsoft.sqlserver.jdbc.SQLServerStatement.executeQuery(SQLServerStatement.java:611)
at org.apache.sqoop.manager.SqlManager.getCurrentDbTimestamp(SqlManager.java:987)
at org.apache.sqoop.tool.ImportTool.initIncrementalConstraints(ImportTool.java:328)
at org.apache.sqoop.tool.ImportTool.importTable(ImportTool.java:498)
at org.apache.sqoop.tool.ImportTool.run(ImportTool.java:615)
at org.apache.sqoop.tool.JobTool.execJob(JobTool.java:243)
at org.apache.sqoop.tool.JobTool.run(JobTool.java:298)
at org.apache.sqoop.Sqoop.run(Sqoop.java:147)
at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:76)
at org.apache.sqoop.Sqoop.runSqoop(Sqoop.java:183)
at org.apache.sqoop.Sqoop.runTool(Sqoop.java:225)
at org.apache.sqoop.Sqoop.runTool(Sqoop.java:234)
at org.apache.sqoop.Sqoop.main(Sqoop.java:243)
at org.apache.oozie.action.hadoop.SqoopMain.runSqoopJob(SqoopMain.java:197)
at org.apache.oozie.action.hadoop.SqoopMain.run(SqoopMain.java:179)
at org.apache.oozie.action.hadoop.LauncherMain.run(LauncherMain.java:58)
at org.apache.oozie.action.hadoop.SqoopMain.main(SqoopMain.java:48)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:497)
at org.apache.oozie.action.hadoop.LauncherMapper.map(LauncherMapper.java:237)
at org.apache.hadoop.mapred.MapRunner.run(MapRunner.java:54)
at org.apache.hadoop.mapred.MapTask.runOldMapper(MapTask.java:453)
at org.apache.hadoop.mapred.MapTask.run(MapTask.java:343)
at org.apache.hadoop.mapred.YarnChild$2.run(YarnChild.java:170)
at java.security.AccessController.doPrivileged(Native Method)
at javax.security.auth.Subject.doAs(Subject.java:422)
at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1866)
at org.apache.hadoop.mapred.YarnChild.main(YarnChild.java:164)
23833 [main] ERROR org.apache.sqoop.tool.ImportTool - Encountered IOException running import job: java.io.IOException: Could not get current time from database
at org.apache.sqoop.tool.ImportTool.initIncrementalConstraints(ImportTool.java:330)
at org.apache.sqoop.tool.ImportTool.importTable(ImportTool.java:498)
at org.apache.sqoop.tool.ImportTool.run(ImportTool.java:615)
at org.apache.sqoop.tool.JobTool.execJob(JobTool.java:243)
at org.apache.sqoop.tool.JobTool.run(JobTool.java:298)
at org.apache.sqoop.Sqoop.run(Sqoop.java:147)
at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:76)
at org.apache.sqoop.Sqoop.runSqoop(Sqoop.java:183)
at org.apache.sqoop.Sqoop.runTool(Sqoop.java:225)
at org.apache.sqoop.Sqoop.runTool(Sqoop.java:234)
at org.apache.sqoop.Sqoop.main(Sqoop.java:243)
at org.apache.oozie.action.hadoop.SqoopMain.runSqoopJob(SqoopMain.java:197)
at org.apache.oozie.action.hadoop.SqoopMain.run(SqoopMain.java:179)
at org.apache.oozie.action.hadoop.LauncherMain.run(LauncherMain.java:58)
at org.apache.oozie.action.hadoop.SqoopMain.main(SqoopMain.java:48)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:497)
at org.apache.oozie.action.hadoop.LauncherMapper.map(LauncherMapper.java:237)
at org.apache.hadoop.mapred.MapRunner.run(MapRunner.java:54)
at org.apache.hadoop.mapred.MapTask.runOldMapper(MapTask.java:453)
at org.apache.hadoop.mapred.MapTask.run(MapTask.java:343)
at org.apache.hadoop.mapred.YarnChild$2.run(YarnChild.java:170)
at java.security.AccessController.doPrivileged(Native Method)
at javax.security.auth.Subject.doAs(Subject.java:422)
at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1866)
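The failing call is getCurrentDbTimestamp, and "Incorrect syntax near ')'" is consistent with the generic JDBC manager's current-timestamp query being sent to SQL Server, which wants T-SQL's GETDATE() instead. One thing worth trying (a suggestion, not verified against this job; the job arguments below are placeholders) is forcing the SQL Server connection manager explicitly:

```shell
# Sketch (placeholder connect string, table, and columns): force Sqoop's
# SQL Server connection manager, whose current-timestamp query is valid T-SQL,
# instead of letting the job fall back to the generic JDBC manager.
SQOOP_CMD="sqoop job --create mssql_inc -- import \
  --connection-manager org.apache.sqoop.manager.SQLServerManager \
  --connect 'jdbc:sqlserver://host:1433;databaseName=mydb' \
  --table dbo.Test \
  --incremental lastmodified --check-column updated_at --merge-key id \
  --target-dir /data/test"
echo "$SQOOP_CMD"
```

Passing --driver tends to push Sqoop onto the generic manager, so it is worth checking whether the workflow's Sqoop action includes that argument.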
02-26-2018
01:01 AM
Is there any way to get rid of this error without using --map-column-hive?
02-12-2018
08:55 PM
I am working on incremental import. My aim is to do an incremental import into Hive through the UI using a Sqoop action, but the incremental import does not reflect in Hive even though the job executes successfully. Here is my workflow: sqoopjob.png. Screenshots of the job process: process1.png, process2.png, process3.png, process4.png. By the way, I can already execute a plain Sqoop import using the Ambari Workflow Management UI. I really need your help badly on this one. Thanks.
02-11-2018
06:06 PM
What changes need to be made if I'm importing from a MySQL RDBMS into Hive via incremental import?
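For MySQL specifically, the usual changes are the JDBC URL and driver plus an incremental mode with a check column and last value. A sketch with hypothetical column names:

```shell
# Sketch (hypothetical check column): a MySQL incremental import is the same
# import command with a MySQL JDBC URL plus --incremental/--check-column/
# --last-value; Sqoop prints the new last-value to carry into the next run.
SQOOP_CMD="sqoop import \
  --connect 'jdbc:mysql://host:3306/kpcustomers' --username root -P \
  --table customers \
  --incremental append --check-column id --last-value 0 \
  --target-dir /data/customers"
echo "$SQOOP_CMD"
```

Saving this as a `sqoop job` lets Sqoop store and update --last-value automatically between runs.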