Member since: 07-01-2016
Posts: 38
Kudos Received: 11
Solutions: 5
My Accepted Solutions
Title | Views | Posted
---|---|---
 | 1331 | 09-21-2016 12:23 AM
 | 1499 | 09-16-2016 01:10 PM
 | 1509 | 09-04-2016 05:47 PM
 | 2389 | 08-08-2016 01:44 AM
 | 1165 | 07-18-2016 12:09 AM
09-01-2016
02:21 AM
2 Kudos
Hi, I have upgraded Ambari 2.2 to 2.4. Everything went well as per the steps, but the Ambari service failed to start, and the following is the error:

2016-08-31 21:52:28,082 INFO - ******************************* Check database started *******************************
2016-08-31 21:52:31,647 INFO - Checking for configs not mapped to any cluster
2016-08-31 21:52:31,653 INFO - Checking for configs selected more than once
2016-08-31 21:52:31,655 INFO - Checking for hosts without state
2016-08-31 21:52:31,657 INFO - Checking host component states count equals host component desired states count
2016-08-31 21:52:31,660 INFO - Checking services and their configs
2016-08-31 21:52:33,669 ERROR - Unexpected error, database check failed
java.lang.NullPointerException
at org.apache.ambari.server.checks.DatabaseConsistencyCheckHelper.checkServiceConfigs(DatabaseConsistencyCheckHelper.java:543)
at org.apache.ambari.server.checks.DatabaseConsistencyChecker.main(DatabaseConsistencyChecker.java:115)

Thank you for your help.
Thanks, Ram
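The NullPointerException above comes from the database consistency check that Ambari 2.4 runs at startup. As a stopgap only (the underlying config inconsistency should still be found and fixed), the check can be skipped; a sketch, assuming the standard ambari-server CLI on this version:

```shell
# Stopgap sketch: start Ambari 2.4 without the database consistency check.
# The underlying inconsistency should still be investigated and repaired.
ambari-server start --skip-database-check
```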
Labels:
- Apache Ambari
08-08-2016
01:44 AM
Hi all, I would like to post the solution that worked for me. I deleted the data from the following tables in the Ambari database:
a) request
b) stage
c) host_role_command
d) execution_command
e) requestoperationlevel
f) requestresourcefilter
Thank you for your help.
Thanks, Ram
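A sketch of that cleanup, assuming MySQL as the Ambari database. Stop ambari-server first and take a backup; the delete order below clears child tables before their parents so foreign-key constraints are satisfied, but verify the constraint order against your own schema:

```sql
-- Assumes ambari-server is stopped and the database is backed up.
-- Clear child tables before parents to satisfy foreign keys
-- (order is an assumption; check your schema's constraints).
DELETE FROM execution_command;
DELETE FROM host_role_command;
DELETE FROM requestresourcefilter;
DELETE FROM requestoperationlevel;
DELETE FROM stage;
DELETE FROM request;
```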
08-04-2016
09:38 PM
Sharma, thank you for your help. Here is the error from the Ambari log:

04 Aug 2016 17:20:36,026 ERROR [pool-9-thread-9] BaseProvider:240 - Caught exception getting JMX metrics : Connection refused, skipping same exceptions for next 5 minutes

One of the agent logs has the following error:

ERROR 2016-08-04 16:21:36,953 HostInfo.py:229 - Checking java processes failed

Please let me know if you need more information.
Thank you,
Ram
08-04-2016
09:35 PM
Thank you for your reply. I tried the above:

a) Went through /var/logs and grepped for ERROR. I identified the following in the Ambari agent logs:

ERROR 2016-08-03 16:19:01,144 Controller.py:350 - Connection to hdp-cent7-01 was lost (details=Request to https://hdp-cent7-01:8441/agent/v1/heartbeat/hdp-cent7-02 failed due to Error occured during connecting to the server: ('The read operation timed out',))
ERROR 2016-08-03 16:20:27,315 Controller.py:350 - Connection to hdp-cent7-01 was lost (details=Request to https://hdp-cent7-01:8441/agent/v1/heartbeat/hdp-cent7-02 failed due to Error occured during connecting to the server: ('The read operation timed out',))

Based on the above, I followed this article: https://community.hortonworks.com/articles/49075/heartbeat-lost-due-to-ambari-agent-error-unable-to.html I uninstalled ambari-agent as well as ambari-server and reinstalled them. However, it is still not working, and I noticed the following error in the Ambari server log:

04 Aug 2016 17:21:07,213 WARN [C3P0PooledConnectionPoolManager[identityToken->2w0zzb9io96x8a18kxg2w|3fc2959f]-HelperThread-#2] BasicResourcePool:223 - com.mchange.v2.resourcepool.BasicResourcePool$ScatteredAcquireTask@1d91d05d -- Acquisition Attempt Failed!!! Clearing pending acquires. While trying to acquire a needed new resource, we failed to succeed more than the maximum number of allowed acquisition attempts (30). Last acquisition attempt exception:
com.mysql.jdbc.exceptions.jdbc4.MySQLNonTransientConnectionException: Data source rejected establishment of connection, message from server: "Too many connections"
at sun.reflect.GeneratedConstructorAccessor174.newInstance(Unknown Source)
at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
at java.lang.reflect.Constructor.newInstance(Constructor.java:422)
at com.mysql.jdbc.Util.handleNewInstance(Util.java:411)
at com.mysql.jdbc.Util.getInstance(Util.java:386)
at com.mysql.jdbc.SQLError.createSQLException(SQLError.java:1015)
at com.mysql.jdbc.SQLError.createSQLException(SQLError.java:989)
at com.mysql.jdbc.SQLError.createSQLException(SQLError.java:975)
at com.mysql.jdbc.MysqlIO.doHandshake(MysqlIO.java:1112)
at com.mysql.jdbc.ConnectionImpl.coreConnect(ConnectionImpl.java:2488)
at com.mysql.jdbc.ConnectionImpl.connectOneTryOnly(ConnectionImpl.java:2521)
at com.mysql.jdbc.ConnectionImpl.createNewIO(ConnectionImpl.java:2306)
at com.mysql.jdbc.ConnectionImpl.<init>(ConnectionImpl.java:839)
at com.mysql.jdbc.JDBC4Connection.<init>(JDBC4Connection.java:49)
at sun.reflect.GeneratedConstructorAccessor171.newInstance(Unknown Source)
at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
at java.lang.reflect.Constructor.newInstance(Constructor.java:422)
at com.mysql.jdbc.Util.handleNewInstance(Util.java:411)
at com.mysql.jdbc.ConnectionImpl.getInstance(ConnectionImpl.java:421)
at com.mysql.jdbc.NonRegisteringDriver.connect(NonRegisteringDriver.java:350)
at com.mchange.v2.c3p0.DriverManagerDataSource.getConnection(DriverManagerDataSource.java:175)
at com.mchange.v2.c3p0.WrapperConnectionPoolDataSource.getPooledConnection(WrapperConnectionPoolDataSource.java:220)
at com.mchange.v2.c3p0.WrapperConnectionPoolDataSource.getPooledConnection(WrapperConnectionPoolDataSource.java:206)
at com.mchange.v2.c3p0.impl.C3P0PooledConnectionPool$1PooledConnectionResourcePoolManager.acquireResource(C3P0PooledConnectionPool.java:203)
at com.mchange.v2.resourcepool.BasicResourcePool.doAcquire(BasicResourcePool.java:1138)
at com.mchange.v2.resourcepool.BasicResourcePool.doAcquireAndDecrementPendingAcquiresWithinLockOnSuccess(BasicResourcePool.java:1125)
at com.mchange.v2.resourcepool.BasicResourcePool.access$700(BasicResourcePool.java:44)
at com.mchange.v2.resourcepool.BasicResourcePool$ScatteredAcquireTask.run(BasicResourcePool.java:1870)
at com.mchange.v2.async.ThreadPoolAsynchronousRunner$PoolThread.run(ThreadPoolAsynchronousRunner.java:696)

b) I tested SSH to all nodes from the Ambari server and did not find any issues.

Here is the final error I am seeing:

WARN [qtp-ambari-client-29] ServletHandler:563 - /api/v1/clusters/txhubdevcluster01/hosts/hdp-cent7-03.rd.allscripts.com/host_components/FLUME_HANDLER
javax.persistence.RollbackException: Exception [EclipseLink-4002] (Eclipse Persistence Services - 2.6.2.v20151217-774c696): org.eclipse.persistence.exceptions.DatabaseException
Internal Exception: com.mysql.jdbc.exceptions.jdbc4.MySQLIntegrityConstraintViolationException: Cannot add or update a child row: a foreign key constraint fails (Unknown error code)

Thanks, Ram
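The "Too many connections" error in the trace above usually means the MySQL server's connection limit has been exhausted (possibly by leaked connections from the repeated reinstalls). A hedged sketch for inspecting and raising the limit; the value 500 is an arbitrary example, not a recommendation:

```sql
-- Run in the MySQL client as an administrative user.
-- Check the configured limit and current usage:
SHOW VARIABLES LIKE 'max_connections';
SHOW STATUS LIKE 'Threads_connected';
-- Raise the limit on the running server (500 is an example value);
-- persist the change in my.cnf so it survives a restart.
SET GLOBAL max_connections = 500;
```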
08-04-2016
03:29 AM
Hi, I created a cluster using Ambari 2.2.2 on CentOS 7.2. It worked for about four days, and I was able to ingest data using Flume. All of a sudden, I am not able to start any service using Ambari. The background-operations progress bar is not appearing, and I am seeing the following exception in the Ambari server log:

03 Aug 2016 17:15:53,863 ERROR [pool-9-thread-256] BaseProvider:240 - Caught exception getting JMX metrics : Connection refused, skipping same exceptions for next 5 minutes
java.net.ConnectException: Connection refused
at java.net.PlainSocketImpl.socketConnect(Native Method)
at java.net.AbstractPlainSocketImpl.doConnect(AbstractPlainSocketImpl.java:350)
at java.net.AbstractPlainSocketImpl.connectToAddress(AbstractPlainSocketImpl.java:206)
at java.net.AbstractPlainSocketImpl.connect(AbstractPlainSocketImpl.java:188)
at java.net.SocksSocketImpl.connect(SocksSocketImpl.java:392)
at java.net.Socket.connect(Socket.java:589)
at sun.net.NetworkClient.doConnect(NetworkClient.java:175)
at sun.net.www.http.HttpClient.openServer(HttpClient.java:432)
at sun.net.www.http.HttpClient.openServer(HttpClient.java:527)
at sun.net.www.http.HttpClient.<init>(HttpClient.java:211)
at sun.net.www.http.HttpClient.New(HttpClient.java:308)
at sun.net.www.http.HttpClient.New(HttpClient.java:326)
at sun.net.www.protocol.http.HttpURLConnection.getNewHttpClient(HttpURLConnection.java:1168)
at sun.net.www.protocol.http.HttpURLConnection.plainConnect0(HttpURLConnection.java:1104)
at sun.net.www.protocol.http.HttpURLConnection.plainConnect(HttpURLConnection.java:998)
at sun.net.www.protocol.http.HttpURLConnection.connect(HttpURLConnection.java:932)
at sun.net.www.protocol.http.HttpURLConnection.getInputStream0(HttpURLConnection.java:1512)
at sun.net.www.protocol.http.HttpURLConnection.getInputStream(HttpURLConnection.java:1440)
at java.net.HttpURLConnection.getResponseCode(HttpURLConnection.java:480)
at org.apache.ambari.server.controller.internal.URLStreamProvider.processURL(URLStreamProvider.java:209)
at org.apache.ambari.server.controller.internal.URLStreamProvider.processURL(URLStreamProvider.java:133)
at org.apache.ambari.server.controller.internal.URLStreamProvider.readFrom(URLStreamProvider.java:107)
at org.apache.ambari.server.controller.internal.URLStreamProvider.readFrom(URLStreamProvider.java:112)
at org.apache.ambari.server.controller.jmx.JMXPropertyProvider.populateResource(JMXPropertyProvider.java:212)
at org.apache.ambari.server.controller.metrics.ThreadPoolEnabledPropertyProvider$1.call(ThreadPoolEnabledPropertyProvider.java:180)
at org.apache.ambari.server.controller.metrics.ThreadPoolEnabledPropertyProvider$1.call(ThreadPoolEnabledPropertyProvider.java:178)
at java.util.concurrent.FutureTask.run(FutureTask.java:266)
at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511)
at java.util.concurrent.FutureTask.run(FutureTask.java:266)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
at java.lang.Thread.run(Thread.java:745)
03 Aug 2016 17:16:09,289 INFO [qtp-ambari-client-445] RequestScheduleR

I did the following:
a) Removed the contents of /var/lib/ambari-agent/data and restarted all Ambari agents
b) Restarted the Ambari server

I really appreciate your help.
Thanks, Ram
Labels:
- Apache Ambari
07-22-2016
06:03 PM
Here are the details:

a) The following is the SHOW CREATE TABLE result for the table created with Spark SQL:

CREATE TABLE `testtabletmp1`(
  `person_key` bigint,
  `pat_last` string,
  `pat_first` string,
  `pat_dob` timestamp,
  `pat_zip` string,
  `pat_gender` string,
  `pat_chksum1` bigint,
  `pat_chksum2` bigint,
  `dimcreatedgmt` timestamp,
  `pat_mi` string,
  `h_keychksum` string,
  `patmd5` string)
ROW FORMAT SERDE 'org.apache.hadoop.hive.ql.io.orc.OrcSerde'
STORED AS INPUTFORMAT 'org.apache.hadoop.hive.ql.io.orc.OrcInputFormat'
OUTPUTFORMAT 'org.apache.hadoop.hive.ql.io.orc.OrcOutputFormat'
LOCATION 'hdfs://hdp-cent7-01:8020/apps/hive/warehouse/datawarehouse.db/testtabledimtmp1'
TBLPROPERTIES (
  'orc.compress'='SNAPPY',
  'transient_lastDdlTime'='1469207216')

b) The original table was created when we Sqooped the data from SQL Server using a Sqoop import:

CREATE TABLE `testtabledim`(
  `person_key` bigint,
  `pat_last` varchar(35),
  `pat_first` varchar(35),
  `pat_dob` timestamp,
  `pat_zip` char(5),
  `pat_gender` char(1),
  `pat_chksum1` bigint,
  `pat_chksum2` bigint,
  `dimcreatedgmt` timestamp,
  `pat_mi` char(1),
  `h_keychksum` string,
  `patmd5` string)
ROW FORMAT SERDE 'org.apache.hadoop.hive.ql.io.orc.OrcSerde'
STORED AS INPUTFORMAT 'org.apache.hadoop.hive.ql.io.orc.OrcInputFormat'
OUTPUTFORMAT 'org.apache.hadoop.hive.ql.io.orc.OrcOutputFormat'
LOCATION 'hdfs://hdp-cent7-01:8020/apps/hive/warehouse/datawarehouse.db/testtabledim'
TBLPROPERTIES (
  'COLUMN_STATS_ACCURATE'='false',
  'last_modified_by'='hdfs',
  'last_modified_time'='1469026541',
  'numFiles'='1',
  'numRows'='-1',
  'orc.compress'='SNAPPY',
  'rawDataSize'='-1',
  'totalSize'='11144909',
  'transient_lastDdlTime'='1469026541')

If I use the first script with Spark SQL and store the file as ORC with Snappy compression, it works. If I store the ORC file with Snappy compression and use Hive to create the table using the first script, it also works fine.
But if I take an existing table, alter it with a new column using the Spark HiveContext, and save as ORC with Snappy compression, I get the following error: ORC does not support type conversion from STRING to VARCHAR. If I use the same ORC file but use Hive to create the table with the second script, I still get the same error. I noticed that some columns are defined as VARCHAR(35), and I think those columns may be the issue. After I changed VARCHAR to STRING and CHAR to STRING, it worked fine. I am still investigating the best way to handle VARCHAR/CHAR types through a Spark DataFrame. Please let me know if you need more information. Thank you for your help.
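A minimal sketch of the workaround described above: cast the CHAR/VARCHAR-backed columns to string before writing ORC, so the file schema matches the STRING-typed table. Spark 1.x HiveContext API is assumed, and `sc` is an existing SparkContext; verify column names against your own table:

```scala
// Sketch (assumptions: Spark 1.x, existing SparkContext `sc`).
// Cast CHAR/VARCHAR columns to STRING before writing ORC, so Hive's
// ORC reader does not hit "type conversion from STRING to VARCHAR".
import org.apache.spark.sql.hive.HiveContext

val hiveContext = new HiveContext(sc)
val df = hiveContext.table("datawarehouse.testtabledim")
val cleaned = df
  .withColumn("pat_last",   df("pat_last").cast("string"))
  .withColumn("pat_first",  df("pat_first").cast("string"))
  .withColumn("pat_zip",    df("pat_zip").cast("string"))
  .withColumn("pat_gender", df("pat_gender").cast("string"))
  .withColumn("pat_mi",     df("pat_mi").cast("string"))
cleaned.write.format("orc").mode("overwrite")
  .saveAsTable("datawarehouse.testtabletmp1")
```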
07-22-2016
04:29 PM
I executed the above statement and identified that we created the table with:

TBLPROPERTIES (
  'COLUMN_STATS_ACCURATE'='false',
  'last_modified_by'='hdfs',
  'last_modified_time'='1469026541',
  'numFiles'='1',
  'numRows'='-1',
  'orc.compress'='SNAPPY',
  'rawDataSize'='-1',
  'totalSize'='11144909',
  'transient_lastDdlTime'='1469026541'

I noticed that while storing the ORC file I did not provide a compress option; I used option("compression", "snappy") while saving the file, and it appears the compression is not working. Can you please help?
Thanks, Ram
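For what it's worth, a hedged sketch of two things to try; which key (if either) this particular Spark version honors for the ORC writer is an assumption to verify, since the "compression" writer option was only reliably supported in later Spark releases:

```scala
// Assumption 1: set the ORC default codec via the Hive conf before writing.
hiveContext.setConf("hive.exec.orc.default.compress", "SNAPPY")

// Assumption 2: pass the ORC-native property name as a writer option
// instead of the generic "compression" key.
dataframe.write
  .format("orc")
  .option("orc.compress", "SNAPPY")
  .save("testtabletmp1")
```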
07-21-2016
09:06 PM
1 Kudo
Hi, thank you for your reply. I will post the results. However, I followed these steps:
a) Loaded the data from the existing table testtable into a DataFrame using HiveContext
b) Added a column to the DataFrame using withColumn
c) Created the new table (testtabletmp) using Spark SQL, with the new column, stored as ORC
d) Saved the DataFrame as ORC: dataframe.write.format("orc").save("testtabletmp")

With the above steps, I am able to access the table from Hive. I will post the results of SHOW CREATE TABLE testtable tomorrow.
Thanks, Ram
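The steps above can be sketched roughly as follows (Spark 1.x HiveContext API assumed; `sc` is an existing SparkContext, and the new column name is a placeholder):

```scala
// Sketch of steps a)-d) above (names and types are illustrative).
import org.apache.spark.sql.hive.HiveContext
import org.apache.spark.sql.functions.lit

val hiveContext = new HiveContext(sc)
// a) load the existing table into a DataFrame
val df = hiveContext.table("testtable")
// b) add a column with withColumn (here: a null string placeholder)
val df2 = df.withColumn("patmd5", lit(null).cast("string"))
// c) create the new table via Spark SQL (DDL elided in the original post)
// d) save the DataFrame as ORC
df2.write.format("orc").save("testtabletmp")
```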
07-20-2016
03:24 AM
I Sqooped the data from SQL Server and stored it in Hive as an ORC table named testtable in a data warehouse. I read the data into a DataFrame using Spark and added a column using withColumn, then issued an ALTER to add the column: alter table testtable add columns (PatMD5 VARCHAR(50)) using hiveContext.sql, which changed the table. I then saved the DataFrame using: dataframe.write.format("orc").mode(SaveMode.Overwrite).save("testtable")

I am able to save the file as ORC, but when I query it using Hue or Beeline, I get the following error: ORC does not support type conversion from STRING to VARCHAR

I also tried alter table testtable add columns (PatMD5 STRING); again I am able to save the file as ORC but not able to query it from Hive. Can anyone help?
Thanks in advance, Ram
Labels:
- Apache Hive
- Apache Spark
07-18-2016
12:09 AM
Hi all, I researched this issue further and found an alternative solution. If you define the function as follows:

val parsePatientfun = udf { (thestruct: Row) => thestruct.getAs[String]("City") }

you can get to the fields of the StructType.
Thanks, Ram
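A slightly fuller sketch of how that UDF might be applied; the DataFrame `df`, its struct column name "address", and the output column name are hypothetical, and the same Spark version as the original post is assumed:

```scala
// Sketch: applying the struct-reading UDF from the post above.
// `df` and the column names "address"/"city" are illustrative only.
import org.apache.spark.sql.Row
import org.apache.spark.sql.functions.udf

val parsePatientfun = udf { (thestruct: Row) =>
  thestruct.getAs[String]("City")  // pull one field out of the struct
}
val withCity = df.withColumn("city", parsePatientfun(df("address")))
```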