Member since 10-20-2016 · 6 Posts · 0 Kudos Received · 1 Solution
04-20-2024 01:44 AM
Hi folks, I'm trying to build a local data engineering environment based around HBase and Phoenix. I've manually downloaded and configured all the packages, and I'm trying to connect to Phoenix via the Phoenix Query Server (PQS). The PQS is running, but when I try to connect I get the following error:

[hduser@dataengineering ~]$ sqlline-thin.py
Picked up _JAVA_OPTIONS: -Xmx4096m
Setting property: [incremental, false]
Setting property: [isolation, TRANSACTION_READ_COMMITTED]
issuing: !connect -p driver org.apache.phoenix.queryserver.client.Driver -p user "none" -p password "none" "jdbc:phoenix:thin:url=http://localhost:8765;serialization=PROTOBUF"
Connecting to jdbc:phoenix:thin:url=http://localhost:8765;serialization=PROTOBUF
java.lang.IllegalArgumentException: Cannot fetch parser for Response with missing class name
at org.apache.calcite.avatica.remote.ProtobufTranslationImpl.getParserForResponse(ProtobufTranslationImpl.java:332)
at org.apache.calcite.avatica.remote.ProtobufTranslationImpl.parseResponse(ProtobufTranslationImpl.java:437)
at org.apache.calcite.avatica.remote.RemoteProtobufService._apply(RemoteProtobufService.java:52)
at org.apache.calcite.avatica.remote.ProtobufService.apply(ProtobufService.java:81)
at org.apache.calcite.avatica.remote.Driver.connect(Driver.java:176)
at sqlline.DatabaseConnection.connect(DatabaseConnection.java:135)
at sqlline.DatabaseConnection.getConnection(DatabaseConnection.java:192)
at sqlline.Commands.connect(Commands.java:1364)
at sqlline.Commands.connect(Commands.java:1244)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at sqlline.ReflectiveCommandHandler.execute(ReflectiveCommandHandler.java:38)
at sqlline.SqlLine.dispatch(SqlLine.java:730)
at sqlline.SqlLine.initArgs(SqlLine.java:410)
at sqlline.SqlLine.begin(SqlLine.java:515)
at sqlline.SqlLine.start(SqlLine.java:267)
at sqlline.SqlLine.main(SqlLine.java:206)
at org.apache.phoenix.queryserver.client.SqllineWrapper.main(SqllineWrapper.java:64)
sqlline version 1.9.0

I have no problems using Phoenix's sqlline.py:

[hduser@dataengineering ~]$ sqlline.py
Picked up _JAVA_OPTIONS: -Xmx4096m
Setting property: [incremental, false]
Setting property: [isolation, TRANSACTION_READ_COMMITTED]
issuing: !connect -p driver org.apache.phoenix.jdbc.PhoenixDriver -p user "none" -p password "none" "jdbc:phoenix:"
Connecting to jdbc:phoenix:
Connected to: Phoenix (version 5.1)
Driver: PhoenixEmbeddedDriver (version 5.1)
Autocommit status: true
Transaction isolation: TRANSACTION_READ_COMMITTED
sqlline version 1.9.0
0: jdbc:phoenix:> !quit

Any ideas?
Labels:
- Apache HBase
- Apache Phoenix
06-30-2023 06:03 AM
So basically no NiFi as part of the free 60-day trial.
06-30-2023 01:19 AM
Hi all, is it possible to install NiFi in the Cloudera CDP Private Cloud Base trial? It's not available in the list of services, and when I try to add the parcel (https://archive.cloudera.com/p/cfm2/2.1.5.1000/redhat7/yum/tars/parcel/CFM-2.1.5.1000-56-el7.parcel) it requires authentication. It's strange, as just about every other service is there. Thanks.
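For context (an assumption based on how Cloudera's paywalled repositories generally work, not confirmed for the trial): the /p/ prefix in that URL indicates the authenticated paywall, which requires the download credentials tied to a valid subscription, so a trial license may simply not grant access to CFM parcels. With valid paywall credentials, they can be embedded in the repository URL; the USERNAME and PASSWORD below are placeholders:

```
# Check whether your paywall credentials can reach the CFM parcel
# (placeholders — substitute the download credentials from your Cloudera account):
curl -I "https://USERNAME:PASSWORD@archive.cloudera.com/p/cfm2/2.1.5.1000/redhat7/yum/tars/parcel/CFM-2.1.5.1000-56-el7.parcel"
```

A 401 response with trial credentials would confirm that CFM is outside the trial's entitlement.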
Labels:
- Apache NiFi
- Cloudera
- Cloudera on premises
12-14-2022 01:45 AM
Hi folks, I'm new to CDP and I'm trying to figure out which roles I need in order to view Data Hub Clusters, as well as the associated Environment and Data Lake pages, in CDP Public Cloud. I've been granted the following account/resource roles: "IamViewer" and "EnvironmentUser". This lets me see specific Data Hub Clusters, but when I click the links for either Environments or Data Lakes, no information appears. The EnvironmentUser role has the following policy attached:

[
{
"crn": "crn:altus:iam:us-west-1:altus:policy:EnvironmentUserPolicy",
"policyStatements": [
{
"rights": [
"datahub/read",
"datahub/write",
"datalake/read",
"environments/read"
],
"resources": [
"*"
]
}
]
}
]

Given the policy has environments/read and datalake/read, what am I missing here? Thanks.
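As a sanity check on the policy document itself (this only verifies the JSON grants the rights on paper; it says nothing about how CDP actually evaluates entitlements or UI visibility), here is a small sketch that parses the posted EnvironmentUserPolicy and confirms which rights it contains. The grants helper is hypothetical, written just for this illustration:

```python
import json

# The policy document exactly as posted above (EnvironmentUserPolicy).
POLICY = json.loads("""
[
  {
    "crn": "crn:altus:iam:us-west-1:altus:policy:EnvironmentUserPolicy",
    "policyStatements": [
      {
        "rights": ["datahub/read", "datahub/write", "datalake/read", "environments/read"],
        "resources": ["*"]
      }
    ]
  }
]
""")

def grants(policy, right):
    """Return True if any statement in any policy in the list grants `right`."""
    return any(
        right in stmt.get("rights", [])
        for pol in policy
        for stmt in pol.get("policyStatements", [])
    )

print(grants(POLICY, "environments/read"))  # True  - the right is present
print(grants(POLICY, "datalake/write"))     # False - not granted by this policy
```

So the rights are present in the document; if the pages still render empty, the gap is more likely in how the role was assigned (e.g. scoped to specific resources rather than the environment) than in the policy text itself.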
Labels:
- Cloudera