DataNode can't start because the root user owns the data directory
- Labels: Apache Hadoop
Created 08-16-2016 03:03 AM
Hi everyone,
My data is stored on Azure storage, but the Azure data directory is owned by root and I can't change the ownership to hdfs. Because of this, my DataNode can't start.
Is there another way to resolve this conflict?
The error is:
2016-08-16 02:13:05,270 INFO datanode.DataNode (LogAdapter.java:info(47)) - registered UNIX signal handlers for [TERM, HUP, INT]
2016-08-16 02:13:07,256 WARN datanode.DataNode (DataNode.java:checkStorageLocations(2439)) - Invalid dfs.datanode.data.dir /mnt/data : EPERM: Operation not permitted
    at org.apache.hadoop.io.nativeio.NativeIO$POSIX.chmodImpl(Native Method)
    at org.apache.hadoop.io.nativeio.NativeIO$POSIX.chmod(NativeIO.java:230)
    at org.apache.hadoop.fs.RawLocalFileSystem.setPermission(RawLocalFileSystem.java:727)
    at org.apache.hadoop.fs.FilterFileSystem.setPermission(FilterFileSystem.java:502)
    at org.apache.hadoop.util.DiskChecker.mkdirsWithExistsAndPermissionCheck(DiskChecker.java:140)
    at org.apache.hadoop.util.DiskChecker.checkDir(DiskChecker.java:156)
    at org.apache.hadoop.hdfs.server.datanode.DataNode$DataNodeDiskChecker.checkDir(DataNode.java:2394)
    at org.apache.hadoop.hdfs.server.datanode.DataNode.checkStorageLocations(DataNode.java:2436)
    at org.apache.hadoop.hdfs.server.datanode.DataNode.makeInstance(DataNode.java:2418)
    at org.apache.hadoop.hdfs.server.datanode.DataNode.instantiateDataNode(DataNode.java:2310)
    at org.apache.hadoop.hdfs.server.datanode.DataNode.createDataNode(DataNode.java:2357)
    at org.apache.hadoop.hdfs.server.datanode.DataNode.secureMain(DataNode.java:2538)
    at org.apache.hadoop.hdfs.server.datanode.DataNode.main(DataNode.java:2562)
2016-08-16 02:13:07,267 ERROR datanode.DataNode (DataNode.java:secureMain(2545)) - Exception in secureMain
java.io.IOException: All directories in dfs.datanode.data.dir are invalid: "/mnt/data/"
    at org.apache.hadoop.hdfs.server.datanode.DataNode.checkStorageLocations(DataNode.java:2445)
    at org.apache.hadoop.hdfs.server.datanode.DataNode.makeInstance(DataNode.java:2418)
    at org.apache.hadoop.hdfs.server.datanode.DataNode.instantiateDataNode(DataNode.java:2310)
    at org.apache.hadoop.hdfs.server.datanode.DataNode.createDataNode(DataNode.java:2357)
    at org.apache.hadoop.hdfs.server.datanode.DataNode.secureMain(DataNode.java:2538)
    at org.apache.hadoop.hdfs.server.datanode.DataNode.main(DataNode.java:2562)
2016-08-16 02:13:07,269 INFO util.ExitUtil (ExitUtil.java:terminate(124)) - Exiting with status 1
2016-08-16 02:13:07,278 INFO datanode.DataNode (LogAdapter.java:info(47)) - SHUTDOWN_MSG:
Created 08-16-2016 09:23 AM
Hi @pan bocun
The error says all directories in dfs.datanode.data.dir are invalid: "/mnt/data/" (at org.apache.hadoop.hdfs.server.datanode.DataNode.checkStorageLocations(DataNode.java:2445)).
Can you please share the value of the dfs.datanode.data.dir property in your HDFS config (hdfs-site.xml)?
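For reference, a quick way to look this up on the failing node, assuming the hdfs client is installed there (the /mnt/data path is taken from the log above):

```
# Print the DataNode data directories as resolved from the local config
hdfs getconf -confKey dfs.datanode.data.dir

# Confirm the directory from the error log exists, and note its owner and mode
ls -ld /mnt/data
```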
Created 08-16-2016 10:25 AM
@pan bocun check whether you have the correct permissions and ownership on this directory: /mnt
It should have hdfs:hadoop ownership and 755 permissions.
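A minimal shell sketch of that check and fix, assuming hdfs:hadoop is the HDFS service user and group on this install:

```
# Inspect current ownership and permissions
ls -ld /mnt /mnt/data

# Give the data directory to the HDFS service user with 755 permissions
sudo chown -R hdfs:hadoop /mnt/data
sudo chmod 755 /mnt/data

# Note: if the mount itself rejects chown/chmod (the EPERM in the log above),
# the data directory needs to live on a filesystem that honors POSIX permissions.
```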
Created 08-16-2016 12:50 PM
Thanks. In Ambari the DataNode data directory permission defaults to 775, but my folder is 777, so I changed 775 to 777 in Ambari.
My problem is solved.
Created 08-17-2016 02:49 PM
This link might be helpful: https://community.hortonworks.com/questions/1635/instructions-to-setup-wasb-as-storage-for-hdp-on-a....
It seems the properties below need to be verified on the DataNode that is failing (see the sketch after this list for one way to check them):
The following is a list of configurations that should be modified to configure WASB:
- fs.defaultFS = wasb://<containername>@<accountname>.blob.core.windows.net
- fs.AbstractFileSystem.wasb.impl = org.apache.hadoop.fs.azure.Wasb
- fs.azure.account.key.<accountname>.blob.core.windows.net = <storage_access_key>
- Even though WASB will be set as the fs.defaultFS, you still need to define DataNode directories for HDFS. As the intent here is to use WASB as the primary FS, you can set the HDFS DataNode directories to the temporary /mnt/resource mount point that is provided with Azure compute servers, if you only plan to use HDFS for temporary job files. DataNode directories: /mnt/resource/Hadoop/hdfs/data
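As mentioned above, here is a minimal sketch of how those settings could be verified on the failing DataNode host, assuming the hdfs CLI is available there; <containername> and <accountname> are placeholders for your own values:

```
# Check the WASB-related settings the node actually resolves
hdfs getconf -confKey fs.defaultFS                      # expect wasb://<containername>@<accountname>.blob.core.windows.net
hdfs getconf -confKey fs.AbstractFileSystem.wasb.impl   # expect org.apache.hadoop.fs.azure.Wasb

# Make sure the local DataNode directory on the temporary Azure mount exists
# and is owned by the HDFS user
sudo mkdir -p /mnt/resource/Hadoop/hdfs/data
sudo chown -R hdfs:hadoop /mnt/resource/Hadoop/hdfs/data
```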
