Could you please clarify your question further - are you asking about data integration with DB2, or data transfer with DB2, or using DB2 for storing HAWQ metadata?
HAWQ is itself an analytic database, built on Postgres database technology. It doesn't use DB2 in its architecture, but you could migrate data from DB2 into HAWQ if you wish. HAWQ stores its metadata internally in the same instance and doesn't support external databases such as MySQL or Oracle for its metastore.
In my case the data will be moved to HDFS via an ingestion process. We are looking to find out whether the user queries can be run as-is, or whether users will need to invest a lot of time changing their code.
HAWQ is built on Postgres 8.2 (some features have recently been added or backported, but the fork was from 8.2). Below is a document that highlights the differences between DB2 SQL and Postgres SQL: https://wiki.postgresql.org/images/d/d1/DB2UDB-to-PG.pdf
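To give a flavor of the rewrites involved, here are two common DB2-to-Postgres translations (illustrative only; the table and column names are made up):

```sql
-- DB2: special registers are selected from the SYSIBM.SYSDUMMY1 dummy table
SELECT CURRENT DATE FROM SYSIBM.SYSDUMMY1;
-- Postgres: no dummy table needed, and the register is a single identifier
SELECT CURRENT_DATE;

-- DB2: row limiting uses FETCH FIRST
SELECT * FROM orders ORDER BY order_date DESC
FETCH FIRST 10 ROWS ONLY;
-- Postgres 8.2: FETCH FIRST is not supported (it arrived in 8.4); use LIMIT
SELECT * FROM orders ORDER BY order_date DESC
LIMIT 10;
```

Differences like these are mechanical and exactly what the migration tools below automate; the harder cases are stored procedures and DB2-specific functions.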
There are also automated tools that can migrate SQL from DB2 to Postgres, e.g. Ispirer (http://www.ispirer.com/products/db2-luw-to-postgresql-migration).
HAWQ is a port of Greenplum for querying data in Hadoop. This is not going to be a simple migration of scripts with minimal changes. Tools are available (they require licenses; one is the link Muji pointed to, and a simple Google search will turn up others) to migrate scripts from DB2 to Greenplum, but that doesn't mean they will work right away on HAWQ. Once migrated to Greenplum, I would expect your scripts to need some further changes for HAWQ, though not as many as going from DB2 to HAWQ or Greenplum directly.
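One concrete example of a change migration tools won't produce for you: Greenplum/HAWQ DDL typically carries a distribution clause that DB2 scripts won't have. A sketch (table and column names are made up; check the docs for the storage options your version supports):

```sql
-- Greenplum/HAWQ tables declare how rows are spread across segments;
-- without DISTRIBUTED BY, a default distribution key is chosen for you
CREATE TABLE orders (
    order_id   bigint,
    customer   varchar(64),
    order_date date
)
DISTRIBUTED BY (order_id);
```

Also note that HAWQ stores tables append-only, so any UPDATE/DELETE statements in your existing DB2 scripts will likely need to be reworked rather than just translated.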