Support Questions


How to put data into HDFS from SQL Server on Azure Spark HDI

Hi, I can read data from SQL Server and have converted it into ORC; now I'm trying to push it into HDFS.


My PutHDFS configuration is below.

I'm just putting data into HDFS, so PutHDFS's success relationship is auto-terminated. Can you please help me get past the error below?
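The original screenshot of the PutHDFS configuration is not visible here. As a rough sketch of what a typical PutHDFS setup looks like (directory path and file locations below are placeholders, not the poster's actual values):

```
# Hypothetical PutHDFS processor properties
Hadoop Configuration Resources : /etc/hadoop/conf/core-site.xml,/etc/hadoop/conf/hdfs-site.xml
Directory                      : /data/orc
Conflict Resolution Strategy   : replace
```

The Hadoop Configuration Resources property is what tells PutHDFS which filesystem to write to, so errors at this processor usually trace back to those XML files.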




Super Collaborator


Please check here for the config of the PutHDFS processor to write to Azure:
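The linked article isn't reproduced in this thread, but a typical core-site.xml fragment for pointing PutHDFS at Azure Blob Storage (WASB), the default storage for many HDInsight clusters, looks roughly like this; the account name, container name, and key are placeholders:

```xml
<!-- Hypothetical core-site.xml fragment; replace account/container/key
     with your own values. -->
<configuration>
  <property>
    <name>fs.defaultFS</name>
    <value>wasb://mycontainer@myaccount.blob.core.windows.net</value>
  </property>
  <property>
    <name>fs.azure.account.key.myaccount.blob.core.windows.net</name>
    <value>PLACEHOLDER_STORAGE_KEY</value>
  </property>
</configuration>
```

Note that NiFi does not ship the Azure filesystem jars, so the hadoop-azure and azure-storage jars typically also need to be made available to the processor (e.g. via its Additional Classpath Resources property).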

@Harald Berghoff I will try this and push the data into HDFS. I'll update here. Thanks!

@Harald Berghoff I tried the steps in the article, but I'm getting the same error. I was hoping it would at least change the error, but it didn't. Any suggestions?

The solution provided by @Harald via the link is good enough. The only prerequisite is that the cluster should be on top of Azure Data Lake and not Azure Blob Storage. If it's a full cluster, the solution should be replicated on all nodes; if it's only an edge node, the solution should be applied only on the edge node. By solution I mean the solution in the linked article. Thanks.
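Replicating the fix on all nodes can be sketched as below; this is a hypothetical example, and the hostnames and config directory are placeholders, not values from this thread:

```shell
# Copy the Hadoop client config (core-site.xml, hdfs-site.xml) to every
# node that runs NiFi. Hostnames and paths are placeholders.
NODES="nifi-node1 nifi-node2 nifi-node3"
CONF_DIR=/etc/hadoop/conf
for host in $NODES; do
  # Print the command instead of running it; drop 'echo' to copy for real.
  echo scp "$CONF_DIR/core-site.xml" "$CONF_DIR/hdfs-site.xml" "$host:$CONF_DIR/"
done
```

After copying, the NiFi instance (or the PutHDFS processor) on each node would need a restart to pick up the new configuration.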
