Created 05-03-2017 11:41 AM
Currently I am using Hortonworks HDP 2.6 to perform a TPC-H benchmark with a 10 GB scale factor, but when I execute query 19 only one reducer is used, and the query never finishes because it is complex. I tried to force the number of reducers with these commands (set mapred.reduce.tasks = 6; and set mapreduce.job.reduces = 6;). With these set, it would be logical to see 6 reducers being used, but it stays the same with only one reducer.
Any ideas why I can't increase the number of reducers?
What's the size of the data and the value of hive.exec.reducers.bytes.per.reducer?
Since Hive 0.14, the default value of this property is 256 MB; before that it was 1 GB. You might have to control the number of reducers using this property.
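A minimal sketch of tuning this in a Hive session (the 256 MB value shown is the Hive 0.14+ default; lower it to get more reducers for the same input size):

```sql
-- Lower the bytes-per-reducer threshold so the planner estimates more reducers.
-- For ~10 GB of input, 256 MB per reducer would suggest roughly 40 reducers.
SET hive.exec.reducers.bytes.per.reducer=268435456;  -- 256 MB

-- Or bypass the estimate entirely and force an explicit count for the session:
SET mapreduce.job.reduces=6;
```

Note that even with these set, Hive can still plan a single reducer for operations that inherently cannot be parallelized, such as a global ORDER BY or an ungrouped aggregate.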
But if your query is hanging for some other reason, then this will of course not help.
I explained how to manually set a parameter at runtime in beeline. Before doing this, you have to change something in the Ambari Hive settings.
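As a sketch, setting a parameter at runtime from beeline can look like this (the HiveServer2 host, port, and user below are placeholders, not values from this thread):

```shell
# Connect to HiveServer2 and apply session-level settings before the query.
# SET statements passed via -e apply only to this beeline session.
beeline -u "jdbc:hive2://hs2-host:10000/default" -n hive \
  -e "SET hive.exec.reducers.bytes.per.reducer=268435456;
      SET mapreduce.job.reduces=6;
      -- run the benchmark query here, e.g. TPC-H query 19" 
```

If HiveServer2 rejects the SET commands, the property you are trying to change may need to be whitelisted for runtime modification in the Hive configuration via Ambari first.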