Member since: 10-01-2015
Posts: 3933
Kudos Received: 1148
Solutions: 374

My Accepted Solutions
Title | Views | Posted
---|---|---
| 1128 | 05-03-2017 05:13 PM
| 978 | 05-02-2017 08:38 AM
| 1056 | 05-02-2017 08:13 AM
| 1321 | 04-20-2017 12:28 AM
| 1249 | 04-10-2017 10:51 PM
04-16-2021
06:39 AM
I am also getting the same issue. I restarted ZooKeeper, then the RegionServer, then the HBase Master, but the issue didn't resolve. I even deleted the HBase znode, but the issue is still there. Regards, Satya
04-07-2021
05:55 PM
I know it's an old post, but it looks like the issue is still unresolved. I came up with the following bash version (note: the first assignment must be SRC=$1, not src=$1, since the rest of the function reads $SRC):

function hdfsMoveMerge()
{
    SRC=$1
    DEST=$2
    if [[ "$SRC" == "$DEST" ]]
    then
        echo "source and dest are the same"
        return 1
    else
        echo "source and dest are not the same"
    fi
    # List every regular file under SRC (grep -v '^d' drops directories)
    hadoop fs -find "$SRC" | xargs -n 10 hadoop fs -ls -d | grep -v '^d' | awk '{print $8}' | while read -r file
    do
        # Build the destination path by swapping the SRC prefix for DEST
        newFP=${file/$SRC/$DEST}
        if hadoop fs -mkdir -p "$(dirname "$newFP")"
        then
            cnt=0
            if hadoop fs -test -f "$newFP"
            then
                # Destination already exists: find the first free "_copy(N)" name
                ((cnt++))
                while hadoop fs -test -f "${newFP}_copy($cnt)"
                do
                    ((cnt++))
                done
                hadoop fs -mv "$file" "${newFP}_copy($cnt)"
            else
                hadoop fs -mv "$file" "$newFP"
            fi
        fi
    done
}
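A quick usage sketch (the HDFS paths here are hypothetical): this walks every file under the source tree, recreates the directory layout under the destination, and renames collisions to <name>_copy(N):

hdfsMoveMerge /data/staging /data/warehouse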
03-24-2021
05:22 PM
@tcpip001 To set the hash key in your case, you must add an EvaluateJsonPath processor and create a new attribute for each property you're using. Following your example, for the hash key:

Property: hashKey
Value: $.k

For the range key, I believe the process is the same:

Property: rangeKey
Value: $.v

Finally, go to the PutDynamoDB processor and set these properties:

Hash Key Value: ${hashKey}
Range Key Value: ${rangeKey}

I hope this helps you.
03-24-2021
06:01 AM
Recently I installed the sandbox on my laptop. Everything is working except the Ambari page. When I go to localhost:8080 the page doesn't load; it's just a blank page, as attached. It asks for a username and password, but after I enter them nothing comes up, even though all pop-ups are enabled.
03-08-2021
07:06 AM
Hi @abenesova, I would suggest you open a new thread, as this is an older post. This will also be an opportunity to provide details specific to your environment, which could help others give a more accurate answer to your question. You can link this thread as a reference in your new post.
02-24-2021
05:47 AM
@romits I have to build a similar Docker image with more clients and a different version of HDP. I would like to know if the Dockerfile used to build this image is available somewhere. Thanks in advance.
01-13-2021
10:37 PM
I found the solution: in HDP 2.6.5, the SSH port for the sandbox VM is the standard port 22.
12-07-2020
11:36 AM
2020 update: what are the preferred data quality tools compatible with CDH for Hive, HBase, and Solr? Our team is looking at Apache Griffin. Regards, Nithya Koka
11-25-2020
09:14 AM
Yes, it's a big SIGH! I've tried dozens of different connection strings, from installing an older version of Python (3.7.4) so I could install sasl and PyHive, to basically everything else I could find out there, but it's still not working. So, my setup is Hive on Azure, and the DB connections have a server/host like "<server>.azurehdinsight.net" with port 443. I'm using DBeaver to connect to the Hive DB, and it uses a JDBC URL; the complete URL is something like "jdbc:hive2://<server>.azurehdinsight.net:443/default;transportMode=http;ssl=true;httpPath=/hive2". Can someone please help me out with which packages I need in order to successfully query Hive from Python? @pudnik26354, can you please post what worked for you? Thank you so much.
09-28-2020
01:01 PM
I killed two previous YARN applications in the queue, and then the insertion completed. They were in the "Running" state but weren't making any progress (I don't know why).
08-19-2020
09:23 AM
I have the same issue, and in my case it is a Kerberos-enabled cluster. I modified the beeline command as follows:

${HIVE_HOME}/bin/beeline -u "jdbc:hive2://<server_name>:10015/default ssl=true;principal=<principal_name>/<hostname>@<REALM>;auth=kerberos; '' ''" "$@"

I get two errors:
- Unknown HS2 problem when communicating with Thrift server.
- Error: Could not open client transport with JDBC Uri: jdbc:hive2://<node_name>:10015/default: Invalid status 21 (state=08S01,code=0)

I would appreciate any help.
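One thing worth checking: the URL above has a space instead of a semicolon between "default" and "ssl=true", so beeline may be dropping the session variables. A corrected sketch (placeholders unchanged; this is an assumption about the cause, not a confirmed fix):

${HIVE_HOME}/bin/beeline -u "jdbc:hive2://<server_name>:10015/default;ssl=true;principal=<principal_name>/<hostname>@<REALM>;auth=kerberos"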
07-07-2020
06:42 AM
I know this is a bit late to post, but I have a web app that scans the table and gets results based on the row key provided in the call, so it needs to support multithreading. Here's a snip of the scan:

try (ResultScanner scanner = myTable.getScanner(scan)) {
    for (Result result : scanner) {
        // logic using result.getValue() and result.getRow()
    }
}

I just saw that https://hbase.apache.org/1.2/devapidocs/org/apache/hadoop/hbase/client/Result.html is one of those classes that is not thread-safe, among others mentioned in this article. Is there an example of a fully thread-safe HBase app that scans results based on the row key provided, or anything similar? I'm looking for an efficient, solid example I can use for reference. I am now concerned that this piece of code might not yield proper results when I get simultaneous requests.
06-11-2020
01:27 PM
Our installation had the password hash in another table:

update ambari.user_authentication set authentication_key='538916f8943ec225d97a9a86a2c6ec0818c1cd400e09e03b660fdaaec4af29ddbb6f2b1033b81b00' where user_id='1';

Note: user_id=1 was the admin in my case.
05-22-2020
01:23 PM
@Shubham_Ranjan As this is an older post, you would have a better chance of receiving a resolution by starting a new thread. A new thread will also give you the opportunity to include details specific to your environment, which could help others provide a more accurate answer to your question.
05-13-2020
11:56 PM
Yes, this is possible. You need to kinit with a username that has been granted access to the SQL Server DB and tables. Integrated security then passes your credentials to SQL Server using Kerberos:

"jdbc:sqlserver://sername.domain.co.za:1433;integratedSecurity=true;databaseName=SCHEMA;authenticationScheme=JavaKerberos;"

This worked for me.
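A minimal sketch of the flow, assuming a Sqoop client with the SQL Server JDBC driver on its classpath (the principal, keytab, and host names are hypothetical):

# Obtain a Kerberos ticket for the account granted access to the SQL Server DB
kinit -kt /etc/security/keytabs/myuser.keytab myuser@DOMAIN.CO.ZA

# Any JDBC client on this node can then use integrated security, e.g. Sqoop:
sqoop list-tables \
  --connect "jdbc:sqlserver://servername.domain.co.za:1433;integratedSecurity=true;databaseName=SCHEMA;authenticationScheme=JavaKerberos;"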
05-06-2020
03:24 AM
This could be a permissions issue. Check the HiveServer2 log for the error; the log will be under /var/log/hive on the node you connect to.
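For example (the exact log file name varies by distribution and configuration, so treat this as a sketch):

# On the HiveServer2 node, watch the log while reproducing the error
tail -f /var/log/hive/hiveserver2.log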
04-24-2020
01:56 AM
Have you found a solution? I'm getting the same problem.
04-02-2020
01:52 AM
You can try logging in as the admin user and restarting the DataNodes from the Actions bar in the Dashboard. That worked for me, and may work for you too.
04-01-2020
11:42 AM
@bhara I just finished installing Hue 4.6.0 on HDP 2.6.5 using this repo: https://github.com/steven-dfheinz/HDP2-Hue4-Service If you have any issues, please open a new question here and tag me in it.
03-31-2020
03:06 AM
You should install both ambari-server and ambari-agent on the first node, i.e. the one where you want to install the HDFS service, for example. On the other nodes, install ambari-agent only. Don't forget to set the server hostname in ambari-agent.ini, and to update the hosts file with the IP and hostname of all machines. A sketch of the commands is below.
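A minimal sketch, assuming CentOS/RHEL nodes with the Ambari repo already configured (hostnames are hypothetical):

# On the first node (Ambari server + agent)
yum install -y ambari-server ambari-agent
ambari-server setup
ambari-server start

# On every other node (agent only)
yum install -y ambari-agent

# On all nodes: point the agent at the server in /etc/ambari-agent/conf/ambari-agent.ini
#   [server]
#   hostname=ambari-server.example.com
# ...and make sure /etc/hosts lists the IP and hostname of every machine.
ambari-agent start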
03-30-2020
09:54 AM
The DSN-less connection string below FINALLY worked for me, on Windows 10. I created a file DSN, then copied/pasted the string into the Python code as a template. Three lessons I learned from this struggle:

1) Kerberos is CASE SENSITIVE. Your Kerberos realm in the string MUST be uppercase.
2) The Cloudera driver doesn't like spaces between the semicolons in the string. Avoid them.
3) If you don't need connection pooling, turn it off with a pyodbc.pooling = False statement.

import pyodbc

strFileDSNAsAstring = "DRIVER=Cloudera ODBC Driver for Apache Hive;USEUNICODESQLCHARACTERTYPES=1;" \
    "SSL=0;SERVICEPRINCIPALCANONICALIZATION=0;SERVICEDISCOVERYMODE=0;SCHEMA=database;PORT=port;" \
    "KRBSERVICENAME=hive;KRBREALM=uppercaserealm;KRBHOSTFQDN=hostfqdndomain;INVALIDSESSIONAUTORECOVER=1;" \
    "HOST=host;HIVESERVERTYPE=2;GETTABLESWITHQUERY=0;ENABLETEMPTABLE=0;DESCRIPTION=Hive;" \
    "DELEGATEKRBCREDS=0;AUTHMECH=1;ASYNCEXECPOLLINTERVAL=100;APPLYSSPWITHQUERIES=1;CAIssuedCertNamesMismatch=1;"

try:
    pyodbc.pooling = False
    conn = pyodbc.connect(strFileDSNAsAstring, autocommit=True)
except:
    print("failure.")
else:
    conn.close()
    print("success.")
03-06-2020
07:22 AM
@Ham As this is an older post, you would have a better chance of receiving a resolution by starting a new thread. A new thread will also give you the opportunity to include details specific to your environment, which could help others provide a more accurate answer to your question.
02-19-2020
10:49 PM
With newer versions of Spark, the sqlContext is not loaded by default; you have to instantiate it explicitly:

scala> val sqlContext = new org.apache.spark.sql.SQLContext(sc)
warning: there was one deprecation warning; re-run with -deprecation for details
sqlContext: org.apache.spark.sql.SQLContext = org.apache.spark.sql.SQLContext@6179af64

scala> import sqlContext.implicits._
import sqlContext.implicits._

scala> sqlContext.sql("describe mytable")
res2: org.apache.spark.sql.DataFrame = [col_name: string, data_type: string ... 1 more field]

I'm working with Spark 2.3.2.
02-10-2020
08:47 AM
Awesome. It worked for me.
01-23-2020
11:46 AM
Okay, so I wrote an example NiFi process to do it: https://www.datainmotion.dev/2020/01/flank-stack-nifi-processor-for-kafka.html
01-14-2020
11:22 PM
--as-textfile maps all the columns to the respective datatypes mentioned in --map-column-hive.
--as-parquetfile does not change any datatype for the columns mentioned in --map-column-hive.
Please reply to this if you have found an answer. A sketch of the two variants is below.
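A minimal sketch of the two imports being compared (the connection string, table, and column names are hypothetical):

# Text import: --map-column-hive overrides the Hive type of the named column
sqoop import \
  --connect jdbc:mysql://db.example.com/sales --table orders \
  --hive-import --map-column-hive order_ts=string \
  --as-textfile

# Parquet import: same flag, but the stored Parquet schema keeps the original type
sqoop import \
  --connect jdbc:mysql://db.example.com/sales --table orders \
  --hive-import --map-column-hive order_ts=string \
  --as-parquetfile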
01-13-2020
09:17 PM
What happens when we clear cookies from the browser?
01-07-2020
09:16 AM
How did you resolve the issue?
12-25-2019
10:21 PM
We just need to put the phoenix-kafka-***-minimal jar file (available in the binary distribution of the appropriate Phoenix version, which needs to be downloaded and extracted) on the HBase CLASSPATH, and then execute the command as-is, pointing to the copied jar file.
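A minimal sketch of what that looks like, assuming the Phoenix Kafka consumer tool and hypothetical paths and versions (check the Phoenix docs for your version's exact class and options):

# Copy the minimal jar from the extracted Phoenix binary distribution
cp phoenix-<version>/phoenix-kafka-<version>-minimal.jar /usr/hdp/current/hbase-client/lib/

# Run the consumer with HBase's classpath on the Hadoop classpath
HADOOP_CLASSPATH=$(hbase classpath) hadoop jar \
  /usr/hdp/current/hbase-client/lib/phoenix-kafka-<version>-minimal.jar \
  org.apache.phoenix.kafka.consumer.PhoenixConsumerTool \
  --file /path/to/kafka-consumer.properties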
12-20-2019
07:16 AM
https://github.com/apache/oozie/blob/9c288fe5cea6f2fbbae76f720b9e215acdd07709/webapp/src/main/webapp/oozie-console.js#L384