Support Questions

How to manually set the number of reducers in Beeline on HDP 2.6? I already tried to set them manually with the commands set mapred.reduce.tasks = 6; and set mapreduce.job.reduces = 6;

Explorer

Hello,

Currently I am using Hortonworks HDP 2.6 to perform a TPC-H benchmark with a 10 GB scale factor, but when I execute query 19 it only gets one reducer, and the query never finishes because it is complex. So I tried to force the number of reducers with the commands above (set mapred.reduce.tasks = 6; and set mapreduce.job.reduces = 6;). With these settings it would be logical to see 6 reducers being used, but the job still runs with only one reducer.

Any ideas why I can't increase the number of reducers?

5 REPLIES

Super Guru
@Mário Rodrigues

What is the size of your data, and what is the value of hive.exec.reducers.bytes.per.reducer?

In Hive 0.14 and later the default value of this property is 256 MB; before that it was 1 GB. You may need to control the number of reducers through this property.
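For context, when mapreduce.job.reduces is left at its default of -1, Hive's reducer estimate is roughly min(hive.exec.reducers.max, ceil(stage input bytes / hive.exec.reducers.bytes.per.reducer)) — so at the 256 MB default, a stage reading 10 GB gets about 40 reducers, while a stage whose input is under 256 MB gets just one. A sketch of lowering the per-reducer target in Beeline (64 MB is an illustrative value, not a recommendation, and the SET is only accepted if the property is modifiable at runtime):

```sql
-- Ask for one reducer per ~64 MB of stage input instead of the 256 MB default
SET hive.exec.reducers.bytes.per.reducer=67108864;

-- Print the value currently in effect to confirm it took
SET hive.exec.reducers.bytes.per.reducer;
```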

But if your query is hanging for some other reason then this will of course not help.

Explorer

The current value is 1073741. I tried decreasing it to see if the number of reducers would go up, but the result is the same.

Super Guru
@Mário Rodrigues

What is the total amount of data? Maybe your data is too small and fits within a single reducer's capacity.

Guru

@Mário Rodrigues

Did you try setting mapreduce.job.reduces to the desired value? By default it is -1.
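For completeness, forcing the count in a Beeline session would look like the following sketch (assuming the property is allowed to be changed at runtime; setting it back to -1 restores Hive's own estimate):

```sql
SET mapreduce.job.reduces=6;   -- force exactly 6 reducers for subsequent queries
-- ... run the query ...
SET mapreduce.job.reduces=-1;  -- return to automatic estimation
```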

Expert Contributor

I explained how to manually set a parameter at runtime in Beeline. Before doing this, you have to change a setting in the Ambari Hive configuration.

Please refer: https://community.cloudera.com/t5/Support-Questions/params-that-are-allowed-to-be-modified-at-runtim...

Just use

hive.security.authorization.sqlstd.confwhitelist.append=mapreduce.job.reduces
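Putting it together, a sketch assuming SQL standard based authorization is enabled (which is what restricts runtime SET commands to a whitelist): add the append property to the Hive configuration in Ambari, restart HiveServer2, and the SET command is then accepted in Beeline instead of being rejected:

```sql
-- In Ambari (Hive configs), then restart HiveServer2:
--   hive.security.authorization.sqlstd.confwhitelist.append=mapreduce.job.reduces
-- Afterwards this succeeds in a Beeline session:
SET mapreduce.job.reduces=6;
```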