Member since 02-07-2018 · 33 Posts · 1 Kudos Received · 0 Solutions
07-15-2018
11:32 PM
The ambari-server log says: Error Processing URI: /api/v1/views/HIVE/versions/2.0.0/instances/AUTO_HIVE20_INSTANCE/resources/ddl/databases/kyctext_kpcustomers/tables/customers/info - (java.lang.NumberFormatException) For input string: "6180443254". I have already set the AMBARI_JVM_ARGS variable in ambari-env.sh to -Xmx4096m -XX:PermSize=128m -XX:MaxPermSize=128m, and I still get that error.
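For what it's worth, the value in the exception, 6180443254, is larger than Integer.MAX_VALUE (2147483647), so any code path that parses it with Integer.parseInt will throw exactly this exception; heap settings cannot fix an int overflow. A minimal Java sketch (class and method names here are illustrative, not Hive View internals):

```java
public class IntOverflowDemo {
    // The value from the Ambari log: it fits in a long, but not in an int.
    static final String VALUE = "6180443254";

    // Returns true if the string parses as a 32-bit int.
    static boolean fitsInInt(String s) {
        try {
            Integer.parseInt(s);  // throws NumberFormatException above 2^31 - 1
            return true;
        } catch (NumberFormatException e) {
            return false;
        }
    }

    public static void main(String[] args) {
        System.out.println("fits in int: " + fitsInInt(VALUE));       // false
        System.out.println("as long:     " + Long.parseLong(VALUE));  // 6180443254
    }
}
```

The same string parses fine as a long, which suggests the failure is in whatever view code reads this table statistic as an int, not in JVM memory settings.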
Labels:
- Apache Hive
05-11-2018
02:19 PM
netstat shows only one PID listening on 8671.
05-11-2018
02:03 PM
I tried changing the port to 8671, but the error still appears: Failed to start ping port listener of: Could not open port 8671 because port already used by another process:
05-11-2018
01:17 PM
Can I use any port for the ping port, as long as it is not used by another application?
05-11-2018
07:51 AM
I'm running it as the root user.

[root@datanode4 ~]# ls -ld /var/run/ambari-agent
drwxr-xr-x 2 root root 4096 May 11 15:45 /var/run/ambari-agent
[root@datanode4 ~]# ls -l /var/run/ambari-agent/ambari-agent.pid
-rw-r--r-- 1 root root 5 May 11 15:45 /var/run/ambari-agent/ambari-agent.pid
[root@datanode4 ~]# cat /var/log/ambari-agent/ambari-agent.out
Failed to start ping port listener of: Could not open port 8670 because port already used by another process:
UID PID PPID C STIME TTY TIME CMD
root 19874 10487 57 15:31 ? 00:00:16 /usr/bin/python /usr/lib/python2
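For reference, the agent's message is the standard TCP "address already in use" bind failure, so any free port will work once nothing else is listening on it. A small Java sketch (class name and ports are illustrative) that reproduces the condition:

```java
import java.io.IOException;
import java.net.InetAddress;
import java.net.ServerSocket;

public class PortInUseDemo {
    // Try to listen on the given loopback port; returns null on success,
    // or the bind error (e.g. "Address already in use") on failure.
    static String tryBind(int port) {
        try (ServerSocket s = new ServerSocket(port, 50, InetAddress.getLoopbackAddress())) {
            return null;
        } catch (IOException e) {
            return "Could not open port " + port + ": " + e.getMessage();
        }
    }

    // Hold a listener on a free ephemeral port, then try to bind it a second
    // time -- this reproduces the agent's "port already used" condition.
    static boolean secondBindFails() {
        try (ServerSocket held = new ServerSocket(0, 50, InetAddress.getLoopbackAddress())) {
            return tryBind(held.getLocalPort()) != null;
        } catch (IOException e) {
            return false;
        }
    }

    public static void main(String[] args) {
        System.out.println("second bind fails: " + secondBindFails());
    }
}
```

In practice something like `netstat -anp | grep 8670` shows which PID holds the listener; if it is a stale ambari-agent process, stopping it frees the port.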
05-11-2018
07:07 AM
/var/log/ambari-agent/ambari-agent.out says: /var/run/ambari-agent/ambari-agent.pid already exists, exiting. It only appears on one datanode.
Labels:
- Apache Ambari
04-22-2018
11:26 PM
Hi, have you solved this problem? I am having the same problem as yours.
04-15-2018
07:38 AM
Thank you for your fast response @Geoffrey Shelton Okot. I already know that --map-column-hive is used to override the default mapping from SQL type to Hive type. Is there any other way to get rid of that error without using --map-column-hive?
04-15-2018
07:17 AM
I only encounter this error when I import from MSSQL on Windows Server 2003; importing from MySQL 5.5 to MySQL 5.6 works without any problem. I'm using sqljdbc4.jar as the SQL driver. I've also attached a screenshot of the error. Please help me get rid of this error without mapping the timestamp column to string.
Labels:
- Apache Sqoop
04-01-2018
06:47 PM
ERROR:
HY000] [Hortonworks][Hardy] (35) Error from server: error code: '1' error message: 'Error while processing statement: FAILED: Execution Error, return code 1 from org.apache.hadoop.hive.ql.exec.tez.TezTask'.

Here's my code:

using (OdbcConnection mycon1 = new OdbcConnection(conn.hiveconnection()))
{
    mycon1.Open();
    using (cmd = mycon1.CreateCommand())
    {
        cmd.CommandText = "insert into table tblname values(1,'test@gmail.com','1990-11-03','09123456789','NULL','FEMALE','','test123456789','2017-03-14 11:37:56.0','0000-00-00 00-00-00.0',1,0,0,'test','address','-','lname','-','fname','03/28/2018 15:54:53','0000-00-00 00-00-00.0')";
        cmd.ExecuteNonQuery();
    }
}

The error is thrown at the line cmd.ExecuteNonQuery(). I've attached a screenshot from Tez View.
The app runs as the hadoop user, and I'm using the Hortonworks ODBC driver.
Is there something wrong with my query? When I run it in Ambari it works.
Does anyone know this kind of problem? Please help. Thanks.
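As an aside (a guess at the cause, not a confirmed diagnosis): the literals '0000-00-00 00-00-00.0' and '03/28/2018 15:54:53' in the statement above are not in the yyyy-MM-dd HH:mm:ss[.f] form that java.sql.Timestamp and Hive's TIMESTAMP type expect, and MySQL's zero-date has no Hive equivalent. A small Java check (class and method names are illustrative):

```java
import java.sql.Timestamp;

public class TimestampLiteralCheck {
    // Returns true if the literal parses as a java.sql.Timestamp,
    // i.e. matches yyyy-MM-dd HH:mm:ss[.f...].
    static boolean isValidTimestamp(String literal) {
        try {
            Timestamp.valueOf(literal);
            return true;
        } catch (IllegalArgumentException e) {
            return false;
        }
    }

    public static void main(String[] args) {
        System.out.println(isValidTimestamp("2017-03-14 11:37:56.0"));  // true
        System.out.println(isValidTimestamp("0000-00-00 00-00-00.0"));  // false: dashes in time part
        System.out.println(isValidTimestamp("03/28/2018 15:54:53"));    // false: slashes in date part
    }
}
```

If the Tez failure is timestamp-related, rewriting those literals in the canonical form (or inserting NULL) would be the first thing to try.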
Labels:
- Apache Hive
04-01-2018
06:21 PM
@Rahul Soni Yes, sir, that's what I see in the ambari-server logs regarding the error I posted above.
03-31-2018
03:27 PM
1. It does not happen with other tables whose size is below 5 GB.
2. It is not Kerberized.
3. Yes, I attached a screenshot of the logs.
03-31-2018
01:54 PM
hive-20-error.png — this is what I see in the ambari-server logs. Please check whether you have experienced the same thing.
03-30-2018
09:55 PM
I just experienced this kind of error in Hive View 2.0: in the TABLES tab, when I try to view the table properties of a table about 5 GB in size.
03-27-2018
10:26 AM
1 Kudo
This error occurs every time I select the table in the Tables tab in Hive View 2.0. The source table from MySQL does not have an int primary key, and the table size is 5.5 GB. If you have any idea how to fix it, I need your help. Thanks.
Labels:
- Apache Hive
03-27-2018
09:06 AM
Thank you for answering my questions, sir. Do you mean I have to alter the table in MySQL to add a surrogate int PK?
03-23-2018
10:00 PM
Okay, thanks a lot for explaining it to me so well. I really have a lot to learn for this job.
03-23-2018
04:57 AM
The import succeeds if I set the mappers to 1. I also notice that when I use more than one mapper, the YARN memory is fully consumed.
03-23-2018
04:54 AM
Here are the stderr and stdout: stdout.txt & stderr.txt
03-23-2018
04:46 AM
No, I don't use --split-by when I set mappers to 1, or in any other case.
03-23-2018
04:26 AM
Yes, I am using --direct.
03-23-2018
04:08 AM
import -Dorg.apache.sqoop.splitter.allow_text_splitter=true --connect jdbc:mysql://x.x.x.x:xxxx/kpcustomers --username root --password ******* --table customers --fields-terminated-by '|' -m 5 --hive-import --hive-overwrite --hive-table testing.customers --direct --verbose
03-23-2018
02:52 AM
I'm importing a 5.6 GB table and the error is:
Error: java.io.IOException: mysqldump terminated with status 2
at org.apache.sqoop.mapreduce.MySQLDumpMapper.map(MySQLDumpMapper.java:485)
at org.apache.sqoop.mapreduce.MySQLDumpMapper.map(MySQLDumpMapper.java:49)
at org.apache.hadoop.mapreduce.Mapper.run(Mapper.java:146)
at org.apache.hadoop.mapred.MapTask.runNewMapper(MapTask.java:787)
at org.apache.hadoop.mapred.MapTask.run(MapTask.java:341)
at org.apache.hadoop.mapred.YarnChild$2.run(YarnChild.java:170)
at java.security.AccessController.doPrivileged(Native Method)
at javax.security.auth.Subject.doAs(Subject.java:422)
at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1866)
at org.apache.hadoop.mapred.YarnChild.main(YarnChild.java:164)
Labels:
- Apache Hadoop
- Apache Sqoop
- MapReduce
- Security
02-26-2018
01:01 AM
Is there any way to get rid of this error without using --map-column-hive?
Labels:
- Apache Hive
- Apache Sqoop
02-11-2018
06:06 PM
What changes are needed if I'm importing from a MySQL RDBMS to Hive with incremental imports?
02-07-2018
06:54 PM
Goals:
Set up Hadoop with 7 VMs:
- 1 for the Ambari server
- 1 for the masternode/namenode
- 1 for the resourcemanager
- 1 for the secondarynamenode
- 3 for datanodes
I'd like to ask for help on how to set it up and what services I should install on each server. Thanks in advance.
Labels:
- Apache Ambari