Member since: 09-10-2015
32 Posts
29 Kudos Received
3 Solutions
My Accepted Solutions
Title | Views | Posted
--- | --- | ---
 | 3580 | 10-04-2015 10:36 PM
 | 1327 | 09-30-2015 04:59 PM
 | 7914 | 09-26-2015 05:24 PM
10-04-2015
10:43 PM
When will HDP 2.x start supporting CentOS 7? I don't fully understand the python26 dependency, but I presume it is a major roadblock in moving to the new OS.
Labels:
- Hortonworks Data Platform (HDP)
10-04-2015
10:41 PM
Is there a tool that builds RPMs from the HDP tar.gz, the way Bigtop does? I am using Ambari and HDP.
I want to modify part of HDP and package it as an RPM.
Do I have to customize Bigtop, or use rpmbuild, to build the RPM?
Labels:
10-04-2015
10:36 PM
Before Ranger was integrated with the Sandbox, dfs.permissions in the Sandbox was set to false. The reason was to allow Hue and some other use cases to create databases and tables.
After Ranger was integrated, we emulated the same behavior by creating a global policy that allows everyone. If you go through the Sandbox security tutorials, the first step is to disable the global policy (for each component). If you disable the global HDFS policy in Ranger that allows everyone, then you should see what you expect from HDFS security permissions.
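For reference, the HDFS-side switch mentioned above lives in hdfs-site.xml; on HDP 2.x the property is dfs.permissions (later Hadoop releases rename it dfs.permissions.enabled). A minimal sketch:

```xml
<!-- hdfs-site.xml: turns off HDFS permission checking entirely (sandbox/demo use only) -->
<property>
  <name>dfs.permissions</name>
  <value>false</value>
</property>
```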
10-04-2015
10:35 PM
Hello,
I want to test the file permissions of HDFS, and during these tests I am seeing strange behavior from Hadoop.
I created a new directory as the user "root". The command used was "hadoop fs -mkdir /user/test".
After this I changed the permissions of the directory so that only the owner has read, write, and execute ("hadoop fs -chmod 700 /user/test").
I then copied a new file into this directory ("hadoop fs -put test.txt /user/test") and changed the permissions of that file as well ("hadoop fs -chmod 600 /user/test/test.txt"). I created a new user and a new group, and added the new user to that group.
With this new user I accessed the folder ("hadoop fs -ls /user/test") and deleted the file ("hadoop fs -rm /user/test/test.txt").
With those permissions this should not have been possible. I ran the same test with the same file on the UNIX filesystem, and there the deletion failed, which is the behavior I expected from HDFS. I used the HDP 2.3 Sandbox with the default configuration. Has anyone seen the same behavior, or did I make a mistake?
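For reference, the whole test collected as a script (a sketch only: "testuser" and "testgroup" are placeholder names, and the hadoop commands assume the HDP 2.3 Sandbox):

```shell
# Run as root on the Sandbox; testuser/testgroup are placeholders
groupadd testgroup
useradd -G testgroup testuser

hadoop fs -mkdir /user/test                  # create the directory as root
hadoop fs -chmod 700 /user/test              # rwx for owner only
hadoop fs -put test.txt /user/test           # add a file
hadoop fs -chmod 600 /user/test/test.txt     # rw for owner only

# As the new user: both commands should fail with these permissions,
# but succeed while Ranger's global allow-all HDFS policy is active
sudo -u testuser hadoop fs -ls /user/test
sudo -u testuser hadoop fs -rm /user/test/test.txt
```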
Labels:
- Apache Hadoop
- Apache Ranger
10-04-2015
07:00 PM
I am trying to install Hue 3.8.1 on an HDP 2.3 cluster running on SLES 11, following this blog post: http://gethue.com/hadoop-hue-3-on-hdp-installation-tutorial/. However, I am struggling to get all the prerequisite packages. These are the packages I am not able to install with zypper (YaST): krb5-devel, mysql-devel, openssl-devel, cyrus-sasl-devel, cyrus-sasl-gssapi, sqlite-devel, libtidy, libxml2-devel, libxslt-devel, openldap-devel, python-devel, python-setuptools. Are there any online repos available where I can get these? I was successful in installing Hue on a CentOS HDP cluster using the same steps.
Any help and pointers on this are really appreciated.
Labels:
- Cloudera Hue
10-04-2015
06:57 PM
Hi,
I took the HDPCD practice test on AWS, but I am facing a few problems with Sqoop. For Task 10, I used "sqoop export --connect jdbc:mysql://namenode:3306/flightinfo --table weather --export-dir /user/horton/weather --input-fields-terminated-by ',' --username root --password hadoop" and got the following:
Warning: /usr/hdp/2.2.0.0-2041/sqoop/sqoop/bin/../../hcatalog does not exist! HCatalog jobs will fail.
Please set $HCAT_HOME to the root of your HCatalog installation.
15/10/03 18:46:56 INFO sqoop.Sqoop: Running Sqoop version: 1.4.5.2.2.0.0-2041
15/10/03 18:46:56 WARN tool.BaseSqoopTool: Setting your password on the command-line is insecure. Consider using -P instead.
15/10/03 18:46:56 INFO manager.SqlManager: Using default fetchSize of 1000
15/10/03 18:46:56 INFO tool.CodeGenTool: Beginning code generation
15/10/03 18:46:57 ERROR manager.SqlManager: Error executing statement: com.mysql.jdbc.exceptions.jdbc4.MySQLSyntaxErrorException: Access denied for user 'root'@'%' to database 'flightinfo'
com.mysql.jdbc.exceptions.jdbc4.MySQLSyntaxErrorException: Access denied for user 'root'@'%' to database 'flightinfo'
Also, I am not able to copy the Pig scripts into the solutions folder as instructed:
cp -f flightdelays_clean.pig /home/horton/solutions/
cp: cannot create regular file '/home/horton/solutions/flightdelays_clean.pig': Permission denied
Am I missing something? Please help.
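The "Access denied" error above usually means the MySQL account has no privileges on the flightinfo database when connecting from a remote host. A hedged sketch of the usual fix (MySQL 5.x syntax, run in the mysql client as an admin; whether the practice exam expects you to do this yourself is an assumption):

```sql
-- 'hadoop' matches the password used on the sqoop command line above
GRANT ALL PRIVILEGES ON flightinfo.* TO 'root'@'%' IDENTIFIED BY 'hadoop';
FLUSH PRIVILEGES;
```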
Labels:
- Apache Pig
- Apache Sqoop
09-30-2015
04:59 PM
1 Kudo
Check out this tutorial which walks you through ORC support in Spark: http://hortonworks.com/hadoop-tutorial/using-hive-with-orc-from-apache-spark/
09-28-2015
02:06 PM
I have a main thread that opens a JDBC connection to HiveServer2; this connection object is shared by multiple threads. Each thread has a prepared statement, executes a select query (not CRUD), and does some processing with the result set. I am trying this with Hive because I have some legacy code from the product I work on that I don't want to change, and I know that this works with Oracle. Below is the stack trace of the exception.
org.apache.thrift.transport.TTransportException
at org.apache.thrift.transport.TIOStreamTransport.read(TIOStreamTransport.java:132)
at org.apache.thrift.transport.TTransport.readAll(TTransport.java:86)
at org.apache.thrift.transport.TSaslTransport.readLength(TSaslTransport.java:376)
at org.apache.thrift.transport.TSaslTransport.readFrame(TSaslTransport.java:453)
at org.apache.thrift.transport.TSaslTransport.read(TSaslTransport.java:435)
at org.apache.thrift.transport.TSaslClientTransport.read(TSaslClientTransport.java:37)
at org.apache.thrift.transport.TTransport.readAll(TTransport.java:86)
at org.apache.thrift.protocol.TBinaryProtocol.readAll(TBinaryProtocol.java:429)
at org.apache.thrift.protocol.TBinaryProtocol.readI32(TBinaryProtocol.java:318)
at org.apache.thrift.protocol.TBinaryProtocol.readMessageBegin(TBinaryProtocol.java:219)
at org.apache.thrift.TServiceClient.receiveBase(TServiceClient.java:69)
at org.apache.hive.service.cli.thrift.TCLIService$Client.recv_FetchResults(TCLIService.java:501)
at org.apache.hive.service.cli.thrift.TCLIService$Client.FetchResults(TCLIService.java:488)
at org.apache.hive.jdbc.HiveQueryResultSet.next(HiveQueryResultSet.java:360)
at hivetrial.RSIterator2.run(ConcurrentRSIteration2.java:60)
at java.lang.Thread.run(Unknown Source)
Trying to understand if this is a limitation.
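As far as I know, this is a known limitation: a Hive JDBC connection wraps a single Thrift transport, and that transport is not safe for concurrent use, so two threads fetching result sets through the same connection interleave their frames and trigger exactly this kind of TTransportException. The usual workaround is one connection per thread. A minimal sketch of the pattern using ThreadLocal; the integer counter stands in for DriverManager.getConnection(...) so the example is self-contained:

```java
import java.util.concurrent.atomic.AtomicInteger;
import java.util.function.Supplier;

public class PerThreadConnections {
    // Each thread that calls get() lazily receives its own instance.
    static <T> ThreadLocal<T> perThread(Supplier<T> factory) {
        return ThreadLocal.withInitial(factory);
    }

    public static void main(String[] args) throws InterruptedException {
        AtomicInteger created = new AtomicInteger();
        // Stand-in for a real factory such as
        // () -> DriverManager.getConnection("jdbc:hive2://host:10000/default")
        ThreadLocal<Integer> conn = perThread(created::incrementAndGet);

        Runnable worker = () -> {
            int first = conn.get();   // opens the "connection" once per thread
            int second = conn.get();  // subsequent calls reuse it
            if (first != second) throw new AssertionError("not reused");
        };
        Thread t1 = new Thread(worker);
        Thread t2 = new Thread(worker);
        t1.start(); t2.start();
        t1.join(); t2.join();

        System.out.println(created.get()); // two worker threads -> prints 2
    }
}
```

With a real driver, each worker would also close its own connection when done (or the threads would borrow from a connection pool, which gives the same isolation with bounded resources).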
Labels:
- Apache Hive
09-26-2015
05:24 PM
3 Kudos
After a bit of research, I found the hadoopsdk on CodePlex is a good place to start. For a very basic connection example, try this blog post, but note that the connection for HDInsight is slightly different now that it all goes through the Templeton (WebHCat) interface. This will get you going:

```csharp
var db = new HiveConnection(
    webHCatUri: new Uri("http://localhost:50111"),
    userName: (string)"hadoop",
    password: (string)null);
var result = db.ExecuteHiveQuery("select * from w3c");
```

If you are looking to do full-on MapReduce on HDInsight, then you probably want to take a look at the C# MapReduce examples with the SDK on CodePlex. Note that the default HDInsight install also comes with some good samples, which include a bit of data to play with and some PowerShell scripts and .NET code to get you started. If there are other recommendations, I am all ears.
09-26-2015
05:18 PM
1 Kudo
I am working on a solution where I will have a Hadoop cluster with Hive running, and I want to send jobs and Hive queries from a .NET application to be processed and to get notified when they are done. What is the recommended API or library here?
Labels:
- Apache Hadoop
- Apache Hive