The problem comes from DAS (Data Analytics Studio): the default user logged in to DAS is "hive", so when I execute the following command
LOAD DATA INPATH '/user/maria_dev/drivers.csv' OVERWRITE INTO TABLE temp_drivers
it fails with the following error:
java.sql.SQLException: Error while processing statement: FAILED: Execution Error, return code 1 from org.apache.hadoop.hive.ql.exec.MoveTask. org.apache.hadoop.hive.ql.metadata.HiveException: Access denied: Unable to move source hdfs://sandbox-hdp.hortonworks.com:8020/user/maria_dev/drivers.csv to destination hdfs://sandbox-hdp.hortonworks.com:8020/warehouse/tablespace/managed/hive/temp_drivers/base_0000001: Permission denied: user=hive, access=WRITE, inode="/user/maria_dev":maria_dev:hdfs:drwxr-xr-x
Apparently user hive can't write into maria_dev's home directory; it's a common HDFS permission/authorization issue.
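One way to confirm and work around the permission problem is to grant the hive service user access to the directory directly. This is only a sketch assuming you have shell access to the sandbox and HDFS ACLs are enabled (dfs.namenode.acls.enabled=true); the path matches the error message above:

```shell
# Sketch, not a recommendation: grant the 'hive' user write access to
# maria_dev's HDFS home directory via an ACL, so Hive's MoveTask can
# relocate the source file. Run as the 'hdfs' superuser on the sandbox.
sudo -u hdfs hdfs dfs -setfacl -m user:hive:rwx /user/maria_dev

# Inspect the resulting ACL to verify the entry was added
sudo -u hdfs hdfs dfs -getfacl /user/maria_dev
```

Note that this widens access to the whole home directory, which is part of why logging in as maria_dev would be the cleaner fix.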
The HDP 3 sandbox no longer carries the Hive View; instead it comes with DAS.
So the solution would be either to log in to DAS as maria_dev or to grant user 'hive' the needed permission. I think the former is the better and cleaner solution.
However, after plenty of research I can't find any details on configuring authentication for DAS in the sandbox. Most materials are about using DataPlane, but I can't find that in the sandbox either. More interestingly, DAS doesn't have a login page at all in the sandbox, unlike Atlas etc., so I'm stuck here (if trying to log in as maria_dev)....
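In the meantime, a workaround that avoids touching maria_dev's home directory at all is to stage the CSV in an HDFS location the hive user can already write to, then LOAD from there. This is a sketch assuming the sandbox's /tmp in HDFS is world-writable (typically mode 1777); adjust paths if your layout differs:

```shell
# Workaround sketch: copy the file to a hive-writable staging area.
# LOAD DATA INPATH *moves* the file, so user 'hive' needs write access
# to the source directory as well as the file itself.
hdfs dfs -cp /user/maria_dev/drivers.csv /tmp/drivers.csv
hdfs dfs -chmod 666 /tmp/drivers.csv
```

Then in DAS, point the statement at the staged copy:

LOAD DATA INPATH '/tmp/drivers.csv' OVERWRITE INTO TABLE temp_drivers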