
Copy data from Cloudera HDFS to Azure Blob Storage


In CDH 5.10.2, we need to copy data from HDFS to Azure Blob Storage, but we are having problems writing the files.

  • We configured the Azure account and tested access from Azure Storage Explorer.
  • We configured core-site.xml with the credentials (account name + key) and restarted.
  • We tested the distcp command, but the following error appears:

    hadoop distcp /user/myuser/file1.txt wasb:// -log /usr/myuser/

    18/03/08 20:20:59 INFO tools.DistCp: Input Options: DistCpOptions{atomicCommit=false, syncFolder=false, deleteMissing=false, ignoreFailures=false, overwrite=false, append=false, useDiff=false, useRdiff=false, fromSnapshot=null, toSnapshot=null, skipCRC=false, blocking=true, numListstatusThreads=0, maxMaps=20, mapBandwidth=100, sslConfigurationFile='null', copyStrategy='uniformsize', preserveStatus=[], preserveRawXattrs=false, atomicWorkPath=null, logPath=null, sourceFileListing=null, sourcePaths=[/user/myuser/file1.txt, wasb://, -log], targetPath=/usr/myuser, targetPathExists=false, filtersFile='null'}
    18/03/08 20:20:59 INFO client.RMProxy: Connecting to ResourceManager at xxxx.xxxx.test/
    18/03/08 20:20:59 WARN impl.MetricsConfig: Cannot locate configuration: tried,
    18/03/08 20:20:59 INFO impl.MetricsSystemImpl: Scheduled snapshot period at 10 second(s).
    18/03/08 20:20:59 INFO impl.MetricsSystemImpl: azure-file-system metrics system started
    18/03/08 20:21:03 ERROR tools.DistCp: Exception encountered The value for one of the HTTP headers is not in the correct format.
        at org.apache.hadoop.fs.Globber.getFileStatus(
        at org.apache.hadoop.fs.Globber.doGlob(
        at org.apache.hadoop.fs.Globber.glob(
        at org.apache.hadoop.fs.FileSystem.globStatus(
        ...
    Caused by: The value for one of the HTTP headers is not in the correct format.
        at ...$CloudBlobContainerWrapperImpl.downloadAttributes(
        ...
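One thing that stands out in the command above: the wasb:// destination is incomplete. A WASB URI normally names both the blob container and the storage account. A minimal sketch of a complete destination (the container and account names here are made up, not values from the original post):

```shell
# Build a complete WASB destination URI for distcp.
# CONTAINER and ACCOUNT are placeholders for illustration only.
CONTAINER="mycontainer"
ACCOUNT="mystorageaccount"
DEST="wasb://${CONTAINER}@${ACCOUNT}.blob.core.windows.net/user/myuser/"

# The actual copy needs a configured Hadoop client, so the command is
# echoed here rather than executed:
echo "hadoop distcp /user/myuser/file1.txt $DEST"
```

With a real container and account name, running the echoed command copies file1.txt into that container, provided the account key is present in core-site.xml.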


"we config the core-site.xml with the credentials (Account + key) and restart"
What do you mean by "restart"? Is it an HDFS service restart across the whole cluster?
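For reference, the WASB credential entry in core-site.xml usually looks like the following (the account name and key are placeholders):

```xml
<!-- Access key for the Azure storage account; ACCOUNT is a placeholder -->
<property>
  <name>fs.azure.account.key.ACCOUNT.blob.core.windows.net</name>
  <value>YOUR_ACCESS_KEY</value>
</property>
```

After editing this file you would redeploy the client configuration and restart the affected services, which is why the restart question matters.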


I restarted the HDFS service and then the whole cluster, without results. In the end I used adl:// (Azure Data Lake) instead of wasb://.
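In case it helps others taking the adl:// route: ADLS Gen1 access is usually configured with an Azure AD service principal rather than an account key. A sketch of the core-site.xml entries, assuming the ClientCredential provider (all values are placeholders):

```xml
<!-- Hypothetical ADLS Gen1 OAuth settings; all values are placeholders -->
<property>
  <name>fs.adl.oauth2.access.token.provider.type</name>
  <value>ClientCredential</value>
</property>
<property>
  <name>fs.adl.oauth2.client.id</name>
  <value>APPLICATION_ID</value>
</property>
<property>
  <name>fs.adl.oauth2.credential</name>
  <value>CLIENT_SECRET</value>
</property>
<property>
  <name>fs.adl.oauth2.refresh.url</name>
  <value>https://login.microsoftonline.com/TENANT_ID/oauth2/token</value>
</property>
```

With these in place, a path like adl://ACCOUNT.azuredatalakestore.net/user/myuser/ can be used as the distcp destination.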


Hi @cgomezfl,


Regarding the original error message: "The value for one of the HTTP headers is not in the correct format."


This error will occur if the Azure Account Kind is not set properly when creating the storage account. To correct this, set the Account Kind to General Purpose and the Access type to Blob within the Blob service configuration.


Hope this helps!




Li Wang, Technical Solution Manager

