I am using the Ambari HDP 2.4 sandbox with HAWQ, and on a different machine I use Python 2.6 with paramiko to run a large query against a HAWQ external table. When I run my script it seems to freeze. When I limit the query to 7000 rows it runs OK, but when I go above a limit of 8000 it freezes. I need some input on where to look to solve this issue. Running the same query from a Windows pgAdmin tool runs OK without limitation, so I guess it might be a buffer that fills up on the Python machine.
Peter, can you provide more details? What kind of external table? What data format? Is HAWQ processing your query, or just provisioning your data as file output? What query interface are you using? What happens when you use psql?
As I said, I run one machine with the Ambari HDP 2.4 sandbox with HAWQ. This sandbox has an HDFS on which I have put several common CSV files that have been loaded as external tables.
On a second machine (a clean CentOS install with NiFi installed) I run a Python script that uses the paramiko library. It executes a remote SSH command: psql -A -t -c "sql query"
I therefore think that HAWQ is processing the query.
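For reference, the remote call presumably looks something like the sketch below. This is a minimal reconstruction, not the actual script: the host name, credentials, table name, and the `build_psql_command` helper are all placeholder assumptions.

```python
def build_psql_command(sql):
    # Hypothetical helper: wraps the SQL in the same psql invocation as above.
    # -A: unaligned output, -t: tuples only (no header/footer), -c: run one command
    return 'psql -A -t -c "{0}"'.format(sql)

if __name__ == "__main__":
    import paramiko  # third-party; pip install paramiko

    client = paramiko.SSHClient()
    client.set_missing_host_key_policy(paramiko.AutoAddPolicy())
    # Placeholder host/credentials for the sandbox machine
    client.connect("sandbox.example.com", username="root", password="secret")

    cmd = build_psql_command("SELECT * FROM my_external_table")
    stdin, stdout, stderr = client.exec_command(cmd)
    # Reading everything at once can block once the result set grows large
    # enough to fill the SSH channel buffer (the symptom described above).
    data = stdout.read()
    client.close()
```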
Peter, this sounds like something you need to handle in the Python script, not in HAWQ (you can post your Python script here as well). You may want to post the scenario on Python community discussions for more ideas. Also see this, in case it helps: https://stackoverflow.com/questions/25260088/paramiko-with-continuous-stdout
Another troubleshooting exercise you could do is to install the psql client on your client machine and check whether psql works fine by itself (just to prove that the HAWQ server and client are fully functional). Instructions for the psql client below: