Member since: 07-06-2017
Posts: 53
Kudos Received: 12
Solutions: 5
My Accepted Solutions
| Title | Views | Posted |
|---|---|---|
| | 16234 | 05-03-2018 08:01 AM |
| | 10021 | 10-11-2017 08:17 AM |
| | 10768 | 07-20-2017 07:04 AM |
| | 1261 | 04-05-2017 07:32 AM |
| | 3246 | 03-09-2017 12:05 PM |
05-04-2018
03:49 AM
Glad I could help.
05-03-2018
08:01 AM
2 Kudos
Hello, Support fixed it for me. The workaround is the following: copy snappy-java-1.1.4.jar to /opt/cloudera/parcels/SPARK2/lib/spark2/jars/ on each node where such executors are running. It can be downloaded from http://repo1.maven.org/maven2/org/xerial/snappy/snappy-java/1.1.4/snappy-java-1.1.4.jar. Tested and confirmed working.
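For reference, a rough shell sketch of that workaround (path and URL taken from the post above; the availability of wget on the nodes is an assumption):
# Download the snappy-java 1.1.4 jar from Maven Central
wget http://repo1.maven.org/maven2/org/xerial/snappy/snappy-java/1.1.4/snappy-java-1.1.4.jar
# Copy it into the Spark2 parcel's jars directory; repeat on every node that runs Spark2 executors
sudo cp snappy-java-1.1.4.jar /opt/cloudera/parcels/SPARK2/lib/spark2/jars/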
04-24-2018
07:23 AM
1 Kudo
Hi, a bit trivial, but: 1/ Do you have the folder /var/log/spark2/lineage present on the Gateway instance of Spark2? 2/ Is spark:spark listed as the owner of said folder, if it exists? I have a similar issue (CDH 5.14.1 / Spark 2.3, the issue appeared after Spark 2.3 got enabled) through YARN & the Workbench, for which a case is open. In the meantime, I disabled Navigator Lineage in Cloudera Manager (Spark2 Configuration / config.navigator.lineage_enabled), which allowed my colleagues to work.
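A minimal shell sketch of checks 1/ and 2/ on the gateway host, assuming the spark user and group exist there as on a standard CDH install:
# Verify the lineage log directory exists and is owned by spark:spark
ls -ld /var/log/spark2/lineage
# If it is missing, create it with the expected ownership
sudo mkdir -p /var/log/spark2/lineage
sudo chown spark:spark /var/log/spark2/lineage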
10-11-2017
08:17 AM
2 Kudos
The password policy / password format was indeed the culprit:
1. Set the password validation policy to LOW (a shell sketch of this step follows after the output below)
2. ALTER USER 'root'@'localhost' IDENTIFIED BY 'Password1234';
3. [root@scmtst ~]# /usr/share/cmf/schema/scm_prepare_database.sh -uroot -pPassword1234 --verbose mysql scmdb scmuser PAssword1234
JAVA_HOME=/usr/java/latest/
Verifying that we can write to /etc/cloudera-scm-server
Database type: mysql
Database user: root
Executing: /usr/java/latest//bin/java -cp /usr/share/java/mysql-connector-java.jar:/usr/share/java/oracle-connector-java.jar:/usr/share/cmf/schema/../lib/* com.cloudera.enterprise.dbutil.DbProvisioner --create -h localhost -u root -H localhost -U scmuser -d scmdb -t mysql
Wed Oct 11 17:14:57 CEST 2017 WARN: Establishing SSL connection without server's identity verification is not recommended. According to MySQL 5.5.45+, 5.6.26+ and 5.7.6+ requirements SSL connection must be established by default if explicit option isn't set. For compliance with existing applications not using SSL the verifyServerCertificate property is set to 'false'. You need either to explicitly disable SSL by setting useSSL=false, or set useSSL=true and provide truststore for server certificate verification.
Creating SCM configuration file in /etc/cloudera-scm-server
Created db.properties file:
# Auto-generated by scm_prepare_database.sh on Wed 11 Oct 17:14:58 CEST 2017
#
# For information describing how to configure the Cloudera Manager Server
# to connect to databases, see the "Cloudera Manager Installation Guide."
#
com.cloudera.cmf.db.type=mysql
com.cloudera.cmf.db.host=localhost
com.cloudera.cmf.db.name=scmdb
com.cloudera.cmf.db.user=scmuser
com.cloudera.cmf.db.setupType=EXTERNAL
com.cloudera.cmf.db.password=PAssword1234
Executing: /usr/java/latest//bin/java -cp /usr/share/java/mysql-connector-java.jar:/usr/share/java/oracle-connector-java.jar:/usr/share/cmf/schema/../lib/* com.cloudera.enterprise.dbutil.DbCommandExecutor /etc/cloudera-scm-server/db.properties com.cloudera.cmf.db.
Wed Oct 11 17:14:59 CEST 2017 WARN: Establishing SSL connection without server's identity verification is not recommended. According to MySQL 5.5.45+, 5.6.26+ and 5.7.6+ requirements SSL connection must be established by default if explicit option isn't set. For compliance with existing applications not using SSL the verifyServerCertificate property is set to 'false'. You need either to explicitly disable SSL by setting useSSL=false, or set useSSL=true and provide truststore for server certificate verification.
[ main] DbCommandExecutor INFO Successfully connected to database.
All done, your SCM database is configured correctly!
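For completeness, a sketch of steps 1 and 2 from a shell, assuming the MySQL 5.7 validate_password plugin is what enforces the policy (the variable name comes from that plugin, not from the output above):
# Relax the password validation policy
mysql -u root -p -e "SET GLOBAL validate_password_policy=LOW;"
# Re-set the root password so the simpler format is accepted
mysql -u root -p -e "ALTER USER 'root'@'localhost' IDENTIFIED BY 'Password1234';"
# Then rerun scm_prepare_database.sh as in step 3 above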
10-11-2017
08:10 AM
I actually installed a MySQL instance on the SCM server directly, and it exposed the exact same behaviour: [root@cdhscmtst ~]# /usr/share/cmf/schema/scm_prepare_database.sh -uroot -p'<root password>' --verbose mysql scmdb scmuser '<scmuser password>'
JAVA_HOME=/usr/java/latest/
Verifying that we can write to /etc/cloudera-scm-server
Database type: mysql
Database user: root
Executing: /usr/java/latest//bin/java -cp /usr/share/java/mysql-connector-java.jar:/usr/share/java/oracle-connector-java.jar:/usr/share/cmf/schema/../lib/* com.cloudera.enterprise.dbutil.DbProvisioner --create -h localhost -u root -H localhost -U scmuser -d scmdb -t mysql
Wed Oct 11 17:06:41 CEST 2017 WARN: Establishing SSL connection without server's identity verification is not recommended. According to MySQL 5.5.45+, 5.6.26+ and 5.7.6+ requirements SSL connection must be established by default if explicit option isn't set. For compliance with existing applications not using SSL the verifyServerCertificate property is set to 'false'. You need either to explicitly disable SSL by setting useSSL=false, or set useSSL=true and provide truststore for server certificate verification.
[ main] DbProvisioner ERROR Exception when creating/dropping database with user 'root' and jdbc url 'jdbc:mysql://localhost/?useUnicode=true&characterEncoding=UTF-8'
java.sql.SQLException: Access denied for user 'root'@'localhost' (using password: YES)
I kind of wonder if it might be related to some special character being misinterpreted in the passwords. I will disable the password policy in MySQL & run more tests.
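As a hypothetical illustration of that theory, single-quoting a password that contains shell metacharacters keeps the shell from mangling it (the passwords below are placeholders, not the real ones):
# $Pass is NOT expanded by the shell inside single quotes
ROOT_PASS='R00t$Pass#1'
SCM_PASS='Scm&Pass#2'
/usr/share/cmf/schema/scm_prepare_database.sh -uroot -p"$ROOT_PASS" --verbose mysql scmdb scmuser "$SCM_PASS"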
10-11-2017
07:37 AM
Same behaviour with the CDH 5.10.1 installer.
10-11-2017
07:18 AM
Hello, I'm working on installing a brand new cluster, on a fresh dedicated MySQL instance.
Server names: SCM: scmtst, MySQL: mysqltst001
CDH 5.12.1, freshly downloaded & installed
MySQL 5.7.19
MySQL connector 5.1.44
CentOS 7.3, configured according to Cloudera specs
When I run scm_prepare_database.sh on the SCM host, I keep getting "Access denied for user 'admin'@'scmtst'":
[root@scmtst ~]# /usr/share/cmf/schema/scm_prepare_database.sh -u admin -h mysqltst001 -P 3306 -p mysql scmdb scmuser
Enter database password:
Enter SCM password:
JAVA_HOME=/usr/java/latest/
Verifying that we can write to /etc/cloudera-scm-server
Wed Oct 11 14:50:28 CEST 2017 WARN: Establishing SSL connection without server's identity verification is not recommended. According to MySQL 5.5.45+, 5.6.26+ and 5.7.6+ requirements SSL connection must be established by default if explicit option isn't set. For compliance with existing applications not using SSL the verifyServerCertificate property is set to 'false'. You need either to explicitly disable SSL by setting useSSL=false, or set useSSL=true and provide truststore for server certificate verification.
[ main] DbProvisioner ERROR Exception when creating/dropping database with user 'admin' and jdbc url 'jdbc:mysql://mysqltst001:3306/?useUnicode=true&characterEncoding=UTF-8'
java.sql.SQLException: Access denied for user 'admin'@'scmtst' (using password: YES)
In order to check that the admin user was indeed set up properly, I installed the MySQL client on scmtst, and I can indeed create and drop databases at will when logged in as admin.
While investigating, I extracted the command line actually called by scm_prepare_database.sh, and once I explicitly added the passwords (-p & -P), it worked. The database got created, as well as the user & GRANTs:
[root@scmtst ~]# /usr/java/latest//bin/java -cp /usr/share/java/mysql-connector-java.jar:/usr/share/java/oracle-connector-java.jar:/usr/share/cmf/schema/../lib/* com.cloudera.enterprise.dbutil.DbProvisioner --create -h mysqltst001.lab.ams:3306 -u admin -H scmtst.lab.ams -U scmuser -d scmdb -t mysql -p 'admin password' -P 'scm password'
Wed Oct 11 15:51:52 CEST 2017 WARN: Establishing SSL connection without server's identity verification is not recommended. According to MySQL 5.5.45+, 5.6.26+ and 5.7.6+ requirements SSL connection must be established by default if explicit option isn't set. For compliance with existing applications not using SSL the verifyServerCertificate property is set to 'false'. You need either to explicitly disable SSL by setting useSSL=false, or set useSSL=true and provide truststore for server certificate verification.
=> Works.
However, the script then fails later on, on the com.cloudera.enterprise.dbutil.DbCommandExecutor step. The file /etc/cloudera-scm-server/db.properties looks good to me, yet it does not connect:
[ main] DbCommandExecutor INFO Unable to login using supplied username/password.
[ main] DbCommandExecutor ERROR Error when connecting to database.
It looks like the scm_prepare_database.sh script is not properly passing the passwords along when attempting to log in to the MySQL server. I could not find any similar error, hence this post.
Thanks,
Chris
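A rough sketch of the manual check described above, from the SCM host with the MySQL client installed (the database name connection_test is just a placeholder):
# Confirm the admin account can create and drop a database on the remote MySQL server
mysql -h mysqltst001 -P 3306 -u admin -p -e "CREATE DATABASE connection_test; DROP DATABASE connection_test;"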
Labels:
- Cloudera Manager
07-20-2017
07:04 AM
Hello, I fixed it. In the Spark2 configuration screen (in Cloudera Manager for the CDH cluster), the Hive Service was set to "none". I set it to Hive, and CDSW is now working as expected. Thanks!
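For anyone checking the same thing, a rough verification on the gateway/Workbench host after redeploying the client configuration (the exact path is an assumption based on a standard CDH parcel layout):
# A hive-site.xml should now ship with the Spark2 client configuration
ls -l /etc/spark2/conf/yarn-conf/hive-site.xml
# And it should point at the cluster's metastore
grep -A1 "hive.metastore.uris" /etc/spark2/conf/yarn-conf/hive-site.xml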
07-13-2017
07:14 AM
Hi Peter,
See the roles installed. (In case it matters: the Hive GW is the second listed role.)
From spark2-shell (Workbench host):
Using Scala version 2.11.8 (Java HotSpot(TM) 64-Bit Server VM, Java 1.7.0_67)
Type in expressions to have them evaluated.
Type :help for more information.
scala> spark.catalog.listTables.show()
+----+--------+-----------+---------+-----------+
|name|database|description|tableType|isTemporary|
+----+--------+-----------+---------+-----------+
+----+--------+-----------+---------+-----------+
From Hue running in the CDH cluster (table datascience): the table exists and can be queried from Hue, and the returned data is correct.
spark2-shell from a CDH node:
scala> spark.catalog.listTables.show()
// Detected repl transcript. Paste more, or ctrl-D to finish.
+----+--------+-----------+---------+-----------+
|name|database|description|tableType|isTemporary|
+----+--------+-----------+---------+-----------+
+----+--------+-----------+---------+-----------+
scala> spark.sql("describe database default").show
+-------------------------+--------------------------+
|database_description_item|database_description_value|
+-------------------------+--------------------------+
| Database Name| default|
| Description| default database|
| Location| file:/log/clouder...|
+-------------------------+--------------------------+
scala> spark.sql("describe formatted default.datascience").show
org.apache.spark.sql.catalyst.analysis.NoSuchTableException: Table or view 'datascience' not found in database 'default';
at org.apache.spark.sql.catalyst.catalog.SessionCatalog.requireTableExists(SessionCatalog.scala:138)
at org.apache.spark.sql.catalyst.catalog.SessionCatalog.getTableMetadata(SessionCatalog.scala:289)
at org.apache.spark.sql.execution.command.DescribeTableCommand.run(tables.scala:437)
at org.apache.spark.sql.execution.command.ExecutedCommandExec.sideEffectResult$lzycompute(commands.scala:58)
at org.apache.spark.sql.execution.command.ExecutedCommandExec.sideEffectResult(commands.scala:56)
at org.apache.spark.sql.execution.command.ExecutedCommandExec.doExecute(commands.scala:74)
at org.apache.spark.sql.execution.SparkPlan$$anonfun$execute$1.apply(SparkPlan.scala:114)
at org.apache.spark.sql.execution.SparkPlan$$anonfun$execute$1.apply(SparkPlan.scala:114)
at org.apache.spark.sql.execution.SparkPlan$$anonfun$executeQuery$1.apply(SparkPlan.scala:135)
at org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:151)
at org.apache.spark.sql.execution.SparkPlan.executeQuery(SparkPlan.scala:132)
at org.apache.spark.sql.execution.SparkPlan.execute(SparkPlan.scala:113)
at org.apache.spark.sql.execution.QueryExecution.toRdd$lzycompute(QueryExecution.scala:87)
at org.apache.spark.sql.execution.QueryExecution.toRdd(QueryExecution.scala:87)
at org.apache.spark.sql.Dataset.<init>(Dataset.scala:185)
at org.apache.spark.sql.Dataset$.ofRows(Dataset.scala:64)
at org.apache.spark.sql.SparkSession.sql(SparkSession.scala:592)
... 48 elided
Thanks!
07-13-2017
04:54 AM
Hi Peter,
I updated my test code as follows:
import org.apache.spark.sql.hive.HiveContext
spark.catalog.listTables.show()
val sqlContext = new HiveContext(sc)
sqlContext.sql("describe database default").show
sqlContext.sql("describe formatted default.mytable").show
sc.version
The results for SparkSession:
spark.catalog.listTables.show()
+----+--------+-----------+---------+-----------+
|name|database|description|tableType|isTemporary|
+----+--------+-----------+---------+-----------+
+----+--------+-----------+---------+-----------+
(There should be one table listed.)
The result for HiveContext:
sqlContext.sql("describe database default").show
+-------------------------+--------------------------+
|database_description_item|database_description_value|
+-------------------------+--------------------------+
| Database Name| default|
| Description| default database|
| Location| /user/hive/warehouse|
+-------------------------+--------------------------+
sqlContext.sql("describe formatted default.mytable").show
Name: org.apache.spark.sql.catalyst.analysis.NoSuchTableException
Message: Table or view 'mytable' not found in database 'default';
StackTrace: at org.apache.spark.sql.catalyst.catalog.SessionCatalog.requireTableExists(SessionCatalog.scala:138)
at org.apache.spark.sql.catalyst.catalog.SessionCatalog.getTableMetadata(SessionCatalog.scala:289)
at org.apache.spark.sql.execution.command.DescribeTableCommand.run(tables.scala:437)
at org.apache.spark.sql.execution.command.ExecutedCommandExec.sideEffectResult$lzycompute(commands.scala:58)
at org.apache.spark.sql.execution.command.ExecutedCommandExec.sideEffectResult(commands.scala:56)
at org.apache.spark.sql.execution.command.ExecutedCommandExec.doExecute(commands.scala:74)
at org.apache.spark.sql.execution.SparkPlan$$anonfun$execute$1.apply(SparkPlan.scala:114)
at org.apache.spark.sql.execution.SparkPlan$$anonfun$execute$1.apply(SparkPlan.scala:114)
at org.apache.spark.sql.execution.SparkPlan$$anonfun$executeQuery$1.apply(SparkPlan.scala:135)
at org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:151)
at org.apache.spark.sql.execution.SparkPlan.executeQuery(SparkPlan.scala:132)
at org.apache.spark.sql.execution.SparkPlan.execute(SparkPlan.scala:113)
at org.apache.spark.sql.execution.QueryExecution.toRdd$lzycompute(QueryExecution.scala:87)
at org.apache.spark.sql.execution.QueryExecution.toRdd(QueryExecution.scala:87)
at org.apache.spark.sql.Dataset.<init>(Dataset.scala:185)
at org.apache.spark.sql.Dataset$.ofRows(Dataset.scala:64)
at org.apache.spark.sql.SparkSession.sql(SparkSession.scala:592)
at org.apache.spark.sql.SQLContext.sql(SQLContext.scala:699)
While I do get a result for Describe Table, the description returned does not match the setting in Hive.
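A possible cross-check from a cluster node, to compare what Hive itself reports with what Spark2 printed above (assumption: the Hive CLI is installed on that host):
# Ask Hive directly for the default database description and its tables
hive -e "DESCRIBE DATABASE default; SHOW TABLES IN default;"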