Support Questions

connect Azure Data Lake Storage Gen2 to Linux Kernel using hadoop-fuse-dfs


Hello,

I am not sure if this is the proper section for this kind of question; I hope that is not a big problem.

Generally, I am trying to mount my Azure Data Lake Storage Gen2 to Linux using hadoop-fuse-dfs. ADLS Gen2 has HDFS features itself, so structurally it should work, at least from my point of view. However, with Storage Accounts in Azure you have to deal with security and authorization, and at this point I am not sure where to put the Account Key.

hadoop-fuse-dfs d abfs://analysis@centerofexcellence.dfs.core.windows.net /home/adminello/storage/ -oport=8020

Here analysis is the name of the filesystem and centerofexcellence is the name of the Storage Account. I also tried passing the key directly in the URI, like this:


hadoop-fuse-dfs dfs://analysis:"$key"@centerofexcellence.dfs.core.windows.net:8020 /home/adminello/storage/ -d
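
For context, my understanding (an assumption on my part, based on the Hadoop hadoop-azure/ABFS documentation) is that the native ABFS connector does not take the Account Key from the URI at all; with Shared Key auth it is read from core-site.xml. Something like this sketch, where `YOUR_ACCOUNT_KEY` is a placeholder and centerofexcellence is my Storage Account:

```xml
<!-- core-site.xml: Shared Key auth for ABFS (assumption: hadoop-fuse-dfs
     picks up the same Hadoop configuration as the hadoop CLI).
     The property name embeds the Storage Account's DFS endpoint host. -->
<property>
  <name>fs.azure.account.key.centerofexcellence.dfs.core.windows.net</name>
  <value>YOUR_ACCOUNT_KEY</value>
</property>
```

I am not sure whether the fuse-dfs binary shipped with CDH can talk to abfs:// URIs at all, or only to plain HDFS, so this may be beside the point.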

But nothing works; after both commands the mounted directory looks as follows:


"d????????? ? ? ? ? ? storage/"

and the debug output is:


INFO /data/jenkins/workspace/generic-package-ubuntu64-16-04/CDH5.16.2-Packaging-Hadoop-2019-06-03_03-26-17/hadoop-2.6.0+cdh5.16.2+2863-1.cdh5.16.2.p0.26~xenial/hadoop-hdfs-project/hadoop-hdfs/src/main/native/fuse-dfs/fuse_options.c:115 Ignoring option -d
INFO /data/jenkins/workspace/generic-package-ubuntu64-16-04/CDH5.16.2-Packaging-Hadoop-2019-06-03_03-26-17/hadoop-2.6.0+cdh5.16.2+2863-1.cdh5.16.2.p0.26~xenial/hadoop-hdfs-project/hadoop-hdfs/src/main/native/fuse-dfs/fuse_options.c:164 Adding FUSE arg /home/adminello/storage/
FUSE library version: 2.9.4
nullpath_ok: 0
nopath: 0
utime_omit_ok: 0
unique: 1, opcode: INIT (26), nodeid: 0, insize: 56, pid: 0
INIT: 7.26
flags=0x001ffffb
max_readahead=0x00020000
INFO /data/jenkins/workspace/generic-package-ubuntu64-16-04/CDH5.16.2-Packaging-Hadoop-2019-06-03_03-26-17/hadoop-2.6.0+cdh5.16.2+2863-1.cdh5.16.2.p0.26~xenial/hadoop-hdfs-project/hadoop-hdfs/src/main/native/fuse-dfs/fuse_init.c:98 Mounting with options: [ protected=(NULL), nn_uri=abfs://analysis@centerofexcellence.dfs.core.windows.net, nn_port=8020, debug=0, read_only=0, initchecks=0, no_permissions=0, usetrash=0, entry_timeout=60, attribute_timeout=60, rdbuffer_size=10485760, direct_io=0 ]
SLF4J: Failed to load class "org.slf4j.impl.StaticLoggerBinder".
SLF4J: Defaulting to no-operation (NOP) logger implementation
SLF4J: See http://www.slf4j.org/codes.html#StaticLoggerBinder for further details.
19/07/18 11:15:23 WARN util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
fuseConnectInit: initialized with timer period 5, expiry period 300
   INIT: 7.19
   flags=0x00000039
   max_readahead=0x00020000
   max_write=0x00020000
   max_background=0
   congestion_threshold=0
   unique: 1, success, outsize: 40

Maybe somebody can help; thanks in advance.