Member since: 05-09-2016
Posts: 39
Kudos Received: 23
Solutions: 12
My Accepted Solutions
| Title | Views | Posted |
|---|---|---|
| | 1579 | 06-29-2017 03:11 PM |
| | 1010 | 06-28-2017 12:06 PM |
| | 962 | 06-28-2017 08:28 AM |
| | 2837 | 06-21-2017 06:19 AM |
| | 828 | 06-13-2017 01:12 PM |
06-28-2017
08:45 AM
1 Kudo
@prsingh You need to provide the Databricks CSV dependencies: either download the jars, or pull the package at run time.

1) Pull the dependency at run time:
pyspark --packages com.databricks:spark-csv_2.10:1.2.0
df = sqlContext.read.load('file:///root/file.csv',format='com.databricks.spark.csv',header='true',inferSchema='true')

or

2) Pass the jars while starting:
a) Download the jars:
wget http://search.maven.org/remotecontent?filepath=org/apache/commons/commons-csv/1.1/commons-csv-1.1.jar -O commons-csv-1.1.jar
wget http://search.maven.org/remotecontent?filepath=com/databricks/spark-csv_2.10/1.0.0/spark-csv_2.10-1.0.0.jar -O spark-csv_2.10-1.0.0.jar
b) Start the PySpark shell with the jar arguments:
./bin/pyspark --jars "spark-csv_2.10-1.0.0.jar,commons-csv-1.1.jar"
c) Load the file as a DataFrame:
df = sqlContext.read.load('file:///root/file.csv',format='com.databricks.spark.csv',header='true',inferSchema='true')

Let me know if the above helps!
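As a quick sanity check for either approach, here is a minimal sketch that pipes a short verification script into the shell. It assumes the same example path and package version as above; it is an illustration, not part of the original steps.

```bash
# Start pyspark with the spark-csv package and run a short check script.
# 'file:///root/file.csv' is the example path from the steps above.
pyspark --packages com.databricks:spark-csv_2.10:1.2.0 <<'EOF'
df = sqlContext.read.load('file:///root/file.csv',
                          format='com.databricks.spark.csv',
                          header='true', inferSchema='true')
df.printSchema()  # column names/types should match the CSV header
df.show(5)        # spot-check the first few rows
EOF
```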
03-27-2017
10:31 AM
@nyadav Setting the root.acl properties resolved the issue.
03-22-2017
12:17 PM
@nyadav Thanks for the information. I was able to insert after creating the DataFrame.
03-22-2017
08:37 AM
@prsingh That seems to be a bug. Let's disable the ACL check for ZooKeeper by adding -Dzookeeper.skipACL=yes to zookeeper-env.sh:
1) Stop HBase.
2) Add -Dzookeeper.skipACL=yes to zookeeper-env.sh:
export SERVER_JVMFLAGS="$SERVER_JVMFLAGS -Dzookeeper.skipACL=yes -Djava.security.auth.login.config={{zk_server_jaas_file}}"
3) From the ZooKeeper CLI, remove the stale znode: rmr /hbase-secure/table/hbase:acl (see the sketch after this list).
4) Revert the change from step 2 (remove -Dzookeeper.skipACL=yes).
5) Restart HBase again.
6) Try the grant; it should work now.
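For step 3, a minimal sketch of the znode removal run non-interactively. The zkCli.sh path and the -server address are assumptions (typical HDP client layout, local ZooKeeper); adjust for your cluster.

```bash
# Remove the stale HBase ACL znode via the ZooKeeper CLI.
# The zkCli.sh path and server address below are assumptions; adjust as needed.
/usr/hdp/current/zookeeper-client/bin/zkCli.sh -server localhost:2181 <<'EOF'
rmr /hbase-secure/table/hbase:acl
quit
EOF
```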
06-23-2018
01:48 PM
I am facing the same problem. I followed all the steps above but am still unable to get the delete command running. Note: my table has no sorting.
01-21-2017
06:18 PM
Thanks for confirming. So what I wrote is correct, i.e., changing dfs.blocksize; a restart will happen anyway.
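For reference, a minimal sketch of a related per-file override; unlike the cluster-wide dfs.blocksize change discussed above (which needs the restart), this sets the block size for a single write. The 256 MB value and the paths are illustrative assumptions, not from this thread.

```bash
# Write one file with an overridden block size (value in bytes).
# 268435456 (256 MB), localfile.dat, and /user/example/ are example values.
hdfs dfs -D dfs.blocksize=268435456 -put localfile.dat /user/example/
# Inspect the block layout of the written file:
hdfs fsck /user/example/localfile.dat -blocks
```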
09-22-2017
09:31 AM
Hi, I added a Spark node in an Oozie workflow but am getting this error every time: reason: Main class [org.apache.oozie.action.hadoop.SparkMain], exit code [101]