<?xml version="1.0" encoding="UTF-8"?>
<rss xmlns:content="http://purl.org/rss/1.0/modules/content/" xmlns:dc="http://purl.org/dc/elements/1.1/" xmlns:rdf="http://www.w3.org/1999/02/22-rdf-syntax-ns#" xmlns:taxo="http://purl.org/rss/1.0/modules/taxonomy/" version="2.0">
  <channel>
    <title>Re: phoenix sqlline.py error in Support Questions</title>
    <link>https://community.cloudera.com/t5/Support-Questions/phoenix-sqlline-py-error/m-p/317288#M227094</link>
    <description>&lt;P&gt;Hi, it looks like you are not passing the correct connection parameter when running sqlline.py. Please use the command below and check again.&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;P class="p1"&gt;/usr/hdp/current/phoenix-client/bin/sqlline.py zkhosts:2181/&amp;lt;ZooKeeper Znode Parent&amp;gt;&lt;/P&gt;&lt;P class="p1"&gt;&amp;nbsp;&lt;/P&gt;&lt;P class="p1"&gt;You can find the "ZooKeeper Znode Parent" value under Ambari -&amp;gt; HBase -&amp;gt; Configs. Also, if you haven't enabled Phoenix, please enable it from the Ambari UI -&amp;gt; HBase -&amp;gt; Configs -&amp;gt; Enable Phoenix.&lt;/P&gt;</description>
    <pubDate>Fri, 28 May 2021 07:45:05 GMT</pubDate>
    <dc:creator>arunek95</dc:creator>
    <dc:date>2021-05-28T07:45:05Z</dc:date>
    <item>
      <title>phoenix sqlline.py error</title>
      <link>https://community.cloudera.com/t5/Support-Questions/phoenix-sqlline-py-error/m-p/317184#M227050</link>
      <description>&lt;P&gt;When running sqlline.py, I get the following error:&lt;/P&gt;&lt;P&gt;Also, the file "/usr/lib/phoenix/phoenix-server.jar" exists on all the HBase Master and RegionServer nodes in the cluster.&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;[root@test01 bin]# ./sqlline.py&lt;BR /&gt;Picked up _JAVA_OPTIONS: -Xmx2048m -XX:MaxPermSize=512m -Djava.awt.headless=true&lt;BR /&gt;OpenJDK 64-Bit Server VM warning: ignoring option MaxPermSize=512m; support was removed in 8.0&lt;BR /&gt;Setting property: [incremental, false]&lt;BR /&gt;Setting property: [isolation, TRANSACTION_READ_COMMITTED]&lt;BR /&gt;issuing: !connect jdbc:phoenix: none none org.apache.phoenix.jdbc.PhoenixDriver&lt;BR /&gt;Connecting to jdbc:phoenix:&lt;BR /&gt;SLF4J: Class path contains multiple SLF4J bindings.&lt;BR /&gt;SLF4J: Found binding in [jar:file:/usr/lib/phoenix/phoenix-4.7.0.2.6.1.0-129-client.jar!/org/slf4j/impl/StaticLoggerBinder.class]&lt;BR /&gt;SLF4J: Found binding in [jar:file:/usr/lib/hadoop/lib/slf4j-log4j12-1.7.25.jar!/org/slf4j/impl/StaticLoggerBinder.class]&lt;BR /&gt;SLF4J: See &lt;A href="http://www.slf4j.org/codes.html#multiple_bindings" target="_blank"&gt;http://www.slf4j.org/codes.html#multiple_bindings&lt;/A&gt; for an explanation.&lt;BR /&gt;21/05/26 17:18:52 WARN util.NativeCodeLoader: Unable to load native-hadoop library for your platform... 
using builtin-java classes where applicable&lt;BR /&gt;21/05/26 17:18:53 WARN shortcircuit.DomainSocketFactory: The short-circuit local reads feature cannot be used because libhadoop cannot be loaded.&lt;BR /&gt;Error: org.apache.hadoop.hbase.DoNotRetryIOException: Class org.apache.phoenix.coprocessor.MetaDataEndpointImpl cannot be loaded Set hbase.table.sanity.checks to false at conf or table descriptor if you want to bypass sanity checks&lt;BR /&gt;at org.apache.hadoop.hbase.master.HMaster.warnOrThrowExceptionForFailure(HMaster.java:2051)&lt;BR /&gt;at org.apache.hadoop.hbase.master.HMaster.sanityCheckTableDescriptor(HMaster.java:1897)&lt;BR /&gt;at org.apache.hadoop.hbase.master.HMaster.createTable(HMaster.java:1799)&lt;BR /&gt;at org.apache.hadoop.hbase.master.MasterRpcServices.createTable(MasterRpcServices.java:488)&lt;BR /&gt;at org.apache.hadoop.hbase.protobuf.generated.MasterProtos$MasterService$2.callBlockingMethod(MasterProtos.java)&lt;BR /&gt;at org.apache.hadoop.hbase.ipc.RpcServer.call(RpcServer.java:2399)&lt;BR /&gt;at org.apache.hadoop.hbase.ipc.CallRunner.run(CallRunner.java:124)&lt;BR /&gt;at org.apache.hadoop.hbase.ipc.RpcExecutor$Handler.run(RpcExecutor.java:311)&lt;BR /&gt;at org.apache.hadoop.hbase.ipc.RpcExecutor$Handler.run(RpcExecutor.java:291) (state=08000,code=101)&lt;BR /&gt;org.apache.phoenix.exception.PhoenixIOException: org.apache.hadoop.hbase.DoNotRetryIOException: Class org.apache.phoenix.coprocessor.MetaDataEndpointImpl cannot be loaded Set hbase.table.sanity.checks to false at conf or table descriptor if you want to bypass sanity checks&lt;BR /&gt;at org.apache.hadoop.hbase.master.HMaster.warnOrThrowExceptionForFailure(HMaster.java:2051)&lt;BR /&gt;at org.apache.hadoop.hbase.master.HMaster.sanityCheckTableDescriptor(HMaster.java:1897)&lt;BR /&gt;at org.apache.hadoop.hbase.master.HMaster.createTable(HMaster.java:1799)&lt;BR /&gt;at org.apache.hadoop.hbase.master.MasterRpcServices.createTable(MasterRpcServices.java:488)&lt;BR /&gt;at 
org.apache.hadoop.hbase.protobuf.generated.MasterProtos$MasterService$2.callBlockingMethod(MasterProtos.java)&lt;BR /&gt;at org.apache.hadoop.hbase.ipc.RpcServer.call(RpcServer.java:2399)&lt;BR /&gt;at org.apache.hadoop.hbase.ipc.CallRunner.run(CallRunner.java:124)&lt;BR /&gt;at org.apache.hadoop.hbase.ipc.RpcExecutor$Handler.run(RpcExecutor.java:311)&lt;BR /&gt;at org.apache.hadoop.hbase.ipc.RpcExecutor$Handler.run(RpcExecutor.java:291)&lt;/P&gt;&lt;P&gt;at org.apache.phoenix.util.ServerUtil.parseServerException(ServerUtil.java:111)&lt;BR /&gt;at org.apache.phoenix.query.ConnectionQueryServicesImpl.ensureTableCreated(ConnectionQueryServicesImpl.java:1135)&lt;BR /&gt;at org.apache.phoenix.query.ConnectionQueryServicesImpl.createTable(ConnectionQueryServicesImpl.java:1427)&lt;BR /&gt;at org.apache.phoenix.schema.MetaDataClient.createTableInternal(MetaDataClient.java:2190)&lt;BR /&gt;at org.apache.phoenix.schema.MetaDataClient.createTable(MetaDataClient.java:872)&lt;BR /&gt;at org.apache.phoenix.compile.CreateTableCompiler$2.execute(CreateTableCompiler.java:194)&lt;BR /&gt;at org.apache.phoenix.jdbc.PhoenixStatement$2.call(PhoenixStatement.java:343)&lt;BR /&gt;at org.apache.phoenix.jdbc.PhoenixStatement$2.call(PhoenixStatement.java:331)&lt;BR /&gt;at org.apache.phoenix.call.CallRunner.run(CallRunner.java:53)&lt;BR /&gt;at org.apache.phoenix.jdbc.PhoenixStatement.executeMutation(PhoenixStatement.java:329)&lt;BR /&gt;at org.apache.phoenix.jdbc.PhoenixStatement.executeUpdate(PhoenixStatement.java:1421)&lt;BR /&gt;at org.apache.phoenix.query.ConnectionQueryServicesImpl$13.call(ConnectionQueryServicesImpl.java:2390)&lt;BR /&gt;at org.apache.phoenix.query.ConnectionQueryServicesImpl$13.call(ConnectionQueryServicesImpl.java:2339)&lt;BR /&gt;at org.apache.phoenix.util.PhoenixContextExecutor.call(PhoenixContextExecutor.java:78)&lt;BR /&gt;at org.apache.phoenix.query.ConnectionQueryServicesImpl.init(ConnectionQueryServicesImpl.java:2339)&lt;BR /&gt;at 
org.apache.phoenix.jdbc.PhoenixDriver.getConnectionQueryServices(PhoenixDriver.java:237)&lt;BR /&gt;at org.apache.phoenix.jdbc.PhoenixEmbeddedDriver.createConnection(PhoenixEmbeddedDriver.java:150)&lt;BR /&gt;at org.apache.phoenix.jdbc.PhoenixDriver.connect(PhoenixDriver.java:205)&lt;BR /&gt;at sqlline.DatabaseConnection.connect(DatabaseConnection.java:157)&lt;BR /&gt;at sqlline.DatabaseConnection.getConnection(DatabaseConnection.java:203)&lt;BR /&gt;at sqlline.Commands.connect(Commands.java:1064)&lt;BR /&gt;at sqlline.Commands.connect(Commands.java:996)&lt;BR /&gt;at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)&lt;BR /&gt;at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)&lt;BR /&gt;at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)&lt;BR /&gt;at java.lang.reflect.Method.invoke(Method.java:498)&lt;BR /&gt;at sqlline.ReflectiveCommandHandler.execute(ReflectiveCommandHandler.java:36)&lt;BR /&gt;at sqlline.SqlLine.dispatch(SqlLine.java:804)&lt;BR /&gt;at sqlline.SqlLine.initArgs(SqlLine.java:588)&lt;BR /&gt;at sqlline.SqlLine.begin(SqlLine.java:656)&lt;BR /&gt;at sqlline.SqlLine.start(SqlLine.java:398)&lt;BR /&gt;at sqlline.SqlLine.main(SqlLine.java:292)&lt;BR /&gt;Caused by: org.apache.hadoop.hbase.DoNotRetryIOException: org.apache.hadoop.hbase.DoNotRetryIOException: Class org.apache.phoenix.coprocessor.MetaDataEndpointImpl cannot be loaded Set hbase.table.sanity.checks to false at conf or table descriptor if you want to bypass sanity checks&lt;BR /&gt;at org.apache.hadoop.hbase.master.HMaster.warnOrThrowExceptionForFailure(HMaster.java:2051)&lt;BR /&gt;at org.apache.hadoop.hbase.master.HMaster.sanityCheckTableDescriptor(HMaster.java:1897)&lt;BR /&gt;at org.apache.hadoop.hbase.master.HMaster.createTable(HMaster.java:1799)&lt;BR /&gt;at org.apache.hadoop.hbase.master.MasterRpcServices.createTable(MasterRpcServices.java:488)&lt;BR /&gt;at 
org.apache.hadoop.hbase.protobuf.generated.MasterProtos$MasterService$2.callBlockingMethod(MasterProtos.java)&lt;BR /&gt;at org.apache.hadoop.hbase.ipc.RpcServer.call(RpcServer.java:2399)&lt;BR /&gt;at org.apache.hadoop.hbase.ipc.CallRunner.run(CallRunner.java:124)&lt;BR /&gt;at org.apache.hadoop.hbase.ipc.RpcExecutor$Handler.run(RpcExecutor.java:311)&lt;BR /&gt;at org.apache.hadoop.hbase.ipc.RpcExecutor$Handler.run(RpcExecutor.java:291)&lt;/P&gt;&lt;P&gt;at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)&lt;BR /&gt;at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)&lt;BR /&gt;at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)&lt;BR /&gt;at java.lang.reflect.Constructor.newInstance(Constructor.java:423)&lt;BR /&gt;at org.apache.hadoop.ipc.RemoteException.instantiateException(RemoteException.java:106)&lt;BR /&gt;at org.apache.hadoop.ipc.RemoteException.unwrapRemoteException(RemoteException.java:95)&lt;BR /&gt;at org.apache.hadoop.hbase.client.RpcRetryingCaller.translateException(RpcRetryingCaller.java:226)&lt;BR /&gt;at org.apache.hadoop.hbase.client.RpcRetryingCaller.translateException(RpcRetryingCaller.java:240)&lt;BR /&gt;at org.apache.hadoop.hbase.client.RpcRetryingCaller.callWithRetries(RpcRetryingCaller.java:140)&lt;BR /&gt;at org.apache.hadoop.hbase.client.HBaseAdmin.executeCallable(HBaseAdmin.java:4403)&lt;BR /&gt;at org.apache.hadoop.hbase.client.HBaseAdmin.createTableAsyncV2(HBaseAdmin.java:748)&lt;BR /&gt;at org.apache.hadoop.hbase.client.HBaseAdmin.createTable(HBaseAdmin.java:669)&lt;BR /&gt;at org.apache.phoenix.query.ConnectionQueryServicesImpl.ensureTableCreated(ConnectionQueryServicesImpl.java:1067)&lt;BR /&gt;... 
30 more&lt;BR /&gt;Caused by: org.apache.hadoop.hbase.ipc.RemoteWithExtrasException(org.apache.hadoop.hbase.DoNotRetryIOException): org.apache.hadoop.hbase.DoNotRetryIOException: Class org.apache.phoenix.coprocessor.MetaDataEndpointImpl cannot be loaded Set hbase.table.sanity.checks to false at conf or table descriptor if you want to bypass sanity checks&lt;BR /&gt;at org.apache.hadoop.hbase.master.HMaster.warnOrThrowExceptionForFailure(HMaster.java:2051)&lt;BR /&gt;at org.apache.hadoop.hbase.master.HMaster.sanityCheckTableDescriptor(HMaster.java:1897)&lt;BR /&gt;at org.apache.hadoop.hbase.master.HMaster.createTable(HMaster.java:1799)&lt;BR /&gt;at org.apache.hadoop.hbase.master.MasterRpcServices.createTable(MasterRpcServices.java:488)&lt;BR /&gt;at org.apache.hadoop.hbase.protobuf.generated.MasterProtos$MasterService$2.callBlockingMethod(MasterProtos.java)&lt;BR /&gt;at org.apache.hadoop.hbase.ipc.RpcServer.call(RpcServer.java:2399)&lt;BR /&gt;at org.apache.hadoop.hbase.ipc.CallRunner.run(CallRunner.java:124)&lt;BR /&gt;at org.apache.hadoop.hbase.ipc.RpcExecutor$Handler.run(RpcExecutor.java:311)&lt;BR /&gt;at org.apache.hadoop.hbase.ipc.RpcExecutor$Handler.run(RpcExecutor.java:291)&lt;/P&gt;&lt;P&gt;at org.apache.hadoop.hbase.ipc.RpcClientImpl.call(RpcClientImpl.java:1225)&lt;BR /&gt;at org.apache.hadoop.hbase.ipc.AbstractRpcClient.callBlockingMethod(AbstractRpcClient.java:213)&lt;BR /&gt;at org.apache.hadoop.hbase.ipc.AbstractRpcClient$BlockingRpcChannelImplementation.callBlockingMethod(AbstractRpcClient.java:287)&lt;BR /&gt;at org.apache.hadoop.hbase.protobuf.generated.MasterProtos$MasterService$BlockingStub.createTable(MasterProtos.java:62907)&lt;BR /&gt;at org.apache.hadoop.hbase.client.ConnectionManager$HConnectionImplementation$4.createTable(ConnectionManager.java:1832)&lt;BR /&gt;at org.apache.hadoop.hbase.client.HBaseAdmin$5.call(HBaseAdmin.java:757)&lt;BR /&gt;at org.apache.hadoop.hbase.client.HBaseAdmin$5.call(HBaseAdmin.java:749)&lt;BR /&gt;at 
org.apache.hadoop.hbase.client.RpcRetryingCaller.callWithRetries(RpcRetryingCaller.java:126)&lt;BR /&gt;... 34 more&lt;BR /&gt;sqlline version 1.1.8&lt;BR /&gt;0: jdbc:phoenix:&amp;gt;&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;Any help is much appreciated.&lt;/P&gt;&lt;P&gt;Thanks,&lt;/P&gt;</description>
      <pubDate>Wed, 26 May 2021 22:26:49 GMT</pubDate>
      <guid>https://community.cloudera.com/t5/Support-Questions/phoenix-sqlline-py-error/m-p/317184#M227050</guid>
      <dc:creator>ryu</dc:creator>
      <dc:date>2021-05-26T22:26:49Z</dc:date>
    </item>
    <item>
      <title>Re: phoenix sqlline.py error</title>
      <link>https://community.cloudera.com/t5/Support-Questions/phoenix-sqlline-py-error/m-p/317288#M227094</link>
      <description>&lt;P&gt;Hi, it looks like you are not passing the correct connection parameter when running sqlline.py. Please use the command below and check again.&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;P class="p1"&gt;/usr/hdp/current/phoenix-client/bin/sqlline.py zkhosts:2181/&amp;lt;ZooKeeper Znode Parent&amp;gt;&lt;/P&gt;&lt;P class="p1"&gt;&amp;nbsp;&lt;/P&gt;&lt;P class="p1"&gt;You can find the "ZooKeeper Znode Parent" value under Ambari -&amp;gt; HBase -&amp;gt; Configs. Also, if you haven't enabled Phoenix, please enable it from the Ambari UI -&amp;gt; HBase -&amp;gt; Configs -&amp;gt; Enable Phoenix.&lt;/P&gt;</description>
      <pubDate>Fri, 28 May 2021 07:45:05 GMT</pubDate>
      <guid>https://community.cloudera.com/t5/Support-Questions/phoenix-sqlline-py-error/m-p/317288#M227094</guid>
      <dc:creator>arunek95</dc:creator>
      <dc:date>2021-05-28T07:45:05Z</dc:date>
    </item>
    <item>
      <title>Re: phoenix sqlline.py error</title>
      <link>https://community.cloudera.com/t5/Support-Questions/phoenix-sqlline-py-error/m-p/317746#M227306</link>
      <description>&lt;P&gt;Could you please take a look at the question below as well? Any help is appreciated; it is closely related.&lt;/P&gt;&lt;P&gt;&lt;A href="https://community.cloudera.com/t5/Support-Questions/org-apache-phoenix-coprocessor-SystemCatalogRegionObserver/td-p/317745" target="_self"&gt;https://community.cloudera.com/t5/Support-Questions/org-apache-phoenix-coprocessor-SystemCatalogRegionObserver/td-p/317745&lt;/A&gt;&lt;/P&gt;</description>
      <pubDate>Sun, 06 Jun 2021 01:01:09 GMT</pubDate>
      <guid>https://community.cloudera.com/t5/Support-Questions/phoenix-sqlline-py-error/m-p/317746#M227306</guid>
      <dc:creator>gurucgi</dc:creator>
      <dc:date>2021-06-06T01:01:09Z</dc:date>
    </item>
    <item>
      <title>Re: phoenix sqlline.py error</title>
      <link>https://community.cloudera.com/t5/Support-Questions/phoenix-sqlline-py-error/m-p/321417#M228394</link>
      <description>&lt;P&gt;Hello &lt;a href="https://community.cloudera.com/t5/user/viewprofilepage/user-id/84981"&gt;@ryu&lt;/a&gt;&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;As mentioned by &lt;a href="https://community.cloudera.com/t5/user/viewprofilepage/user-id/77994"&gt;@arunek95&lt;/a&gt;, we assume Phoenix is enabled for the cluster. If not, please enable Phoenix and try the command again.&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;The logging indicates HDP v2.6.1.0 with Phoenix v4.7. The directory "/usr/lib/phoenix/" holds the Phoenix client, and you mentioned the same directory holds the Phoenix server JAR as well. Please verify that the permissions on the JAR are correct, and confirm via "jar -tvf" on the Phoenix server JAR that the class "MetaDataEndpointImpl" is included in it.&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;The error indicates that Phoenix is failing while creating the SYSTEM tables (on the first connection to Phoenix). In our internal setup, the Phoenix server JAR is also present in the HBase lib path, as a symlink pointing to the Phoenix server JAR in the Phoenix lib path:&lt;/P&gt;&lt;LI-CODE lang="markup"&gt;/usr/hdp/&amp;lt;Version&amp;gt;/hbase/lib/phoenix-server.jar -&amp;gt; /usr/hdp/&amp;lt;Version&amp;gt;/phoenix/phoenix-server.jar&lt;/LI-CODE&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;Please ensure the Phoenix server JAR is present in the HBase lib directory as well. Additionally, review the HBase Master logs to check for the error message at the HBase level.&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;- Smarak&lt;/P&gt;</description>
      <pubDate>Fri, 23 Jul 2021 10:41:23 GMT</pubDate>
      <guid>https://community.cloudera.com/t5/Support-Questions/phoenix-sqlline-py-error/m-p/321417#M228394</guid>
      <dc:creator>smdas</dc:creator>
      <dc:date>2021-07-23T10:41:23Z</dc:date>
    </item>
  </channel>
</rss>

