Created 05-03-2017 11:41 AM
Currently I am using Hortonworks HDP 2.6 to run the TPC-H benchmark at a 10 GB scale factor. When I execute query 19, Hive allocates only one reducer, and because the query is complex it never finishes. So I tried to force the number of reducers with these commands (set mapred.reduce.tasks = 6; and set mapreduce.job.reduces = 6;). With those set, I would expect to see 6 reducers used, but the job still runs with only one reducer.
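For clarity, these are the exact statements I ran in the Hive session before the query:

```sql
-- Attempt to force the reducer count to 6 (old and new property names)
SET mapred.reduce.tasks = 6;
SET mapreduce.job.reduces = 6;
```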
Any ideas why I can't increase the number of reducers?
Created 05-03-2017 12:24 PM
What's the size of the data, and what is the value of hive.exec.reducers.bytes.per.reducer?
In Hive 0.14 and later the default value of this property is 256 MB; before that it was 1 GB. You might have to control the number of reducers through this property.
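As a rough sketch of the heuristic (assuming the usual Hive estimation, not something specific to your cluster): Hive sizes the reducer count from the total input divided by this property, capped by hive.exec.reducers.max, so lowering it should raise the count:

```sql
-- Hive's estimate (roughly):
--   reducers = min(ceil(total_input_bytes / bytes.per.reducer), hive.exec.reducers.max)
-- With ~10 GB of input, 256 MB per reducer would suggest about 40 reducers.
SET hive.exec.reducers.bytes.per.reducer=268435456;  -- 256 MB
```

Note that this only affects Hive's estimate; an explicit mapreduce.job.reduces, if honored, overrides it.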
But if your query is hanging for some other reason, then this will of course not help.
Created 05-03-2017 02:17 PM
The current value is 1073741. I tried decreasing it to see whether the number of reducers would rise, but the result is the same.
Created 05-03-2017 06:28 PM
Created on 10-12-2019 10:18 PM - edited 10-12-2019 10:19 PM
I explained how to manually set a parameter at runtime in beeline. Before doing this, you have to change a setting in the Ambari Hive configuration.
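A minimal sketch of what that looks like at the beeline prompt (assuming the parameter has already been allowed for runtime modification in the Ambari Hive settings, which is what the step above refers to):

```sql
-- Override the per-reducer input threshold for this session only
SET hive.exec.reducers.bytes.per.reducer=268435456;
-- Echo the property to confirm the override took effect
SET hive.exec.reducers.bytes.per.reducer;
```

Session-level SET changes apply only to the current connection; cluster-wide defaults still come from Ambari.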