No write access permission on my user account itv000764g02

Dear Sir / Madam,

I am writing to ask why my user account does not have write access permission. I am trying to create and drop databases in the Spark2-Scala kernel on the Jupyter Hub provided by ITVERSITY LABS.

Here is what I have done: I started the Spark session, dropped any pre-existing itversity_demo database, and tried to create the database again, but it fails with warnings that I don't have write access.

However, I have managed to create and drop databases and tables in the Spark SQL CLI on Linux, but not in the Spark2-Scala kernel on Jupyter Hub, which is why I am contacting you.

The code is given below:

val spark = SparkSession.
    builder.
    config("spark.ui.port", "0").
    config("spark.sql.warehouse.dir", "/user/itversity/warehouse").
    enableHiveSupport.
    master("yarn").
    appName("Getting Started - Spark SQL").
    getOrCreate
spark = org.apache.spark.sql.SparkSession@64e8965d
org.apache.spark.sql.SparkSession@64e8965d
%%sql

DROP DATABASE IF EXISTS itversity_demo CASCADE
++
||
++
++

%%sql

CREATE DATABASE IF NOT EXISTS itversity_demo
Magic sql failed to execute with error:
org.apache.hadoop.hive.ql.metadata.HiveException: MetaException(message:Got exception: org.apache.hadoop.security.AccessControlException Permission denied: user=itv000764, access=WRITE, inode="/user/itversity/warehouse":itversity:itversity:drwxr-xr-x
at org.apache.hadoop.hdfs.server.namenode.FSPermissionChecker.check(FSPermissionChecker.java:496)
at org.apache.hadoop.hdfs.server.namenode.FSPermissionChecker.checkPermission(FSPermissionChecker.java:336)
at org.apache.hadoop.hdfs.server.namenode.FSPermissionChecker.checkPermissionWithContext(FSPermissionChecker.java:360)
at org.apache.hadoop.hdfs.server.namenode.FSPermissionChecker.checkPermission(FSPermissionChecker.java:239)
at org.apache.hadoop.hdfs.server.namenode.FSDirectory.checkPermission(FSDirectory.java:1909)
at org.apache.hadoop.hdfs.server.namenode.FSDirectory.checkPermission(FSDirectory.java:1893)
at org.apache.hadoop.hdfs.server.namenode.FSDirectory.checkAncestorAccess(FSDirectory.java:1852)
at org.apache.hadoop.hdfs.server.namenode.FSDirMkdirOp.mkdirs(FSDirMkdirOp.java:60)
at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.mkdirs(FSNamesystem.java:3407)
at org.apache.hadoop.hdfs.server.namenode.NameNodeRpcServer.mkdirs(NameNodeRpcServer.java:1161)
at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolServerSideTranslatorPB.mkdirs(ClientNamenodeProtocolServerSideTranslatorPB.java:739)
at org.apache.hadoop.hdfs.protocol.proto.ClientNamenodeProtocolProtos$ClientNamenodeProtocol$2.callBlockingMethod(ClientNamenodeProtocolProtos.java)
at org.apache.hadoop.ipc.ProtobufRpcEngine2$Server$ProtoBufRpcInvoker.call(ProtobufRpcEngine2.java:532)
at org.apache.hadoop.ipc.RPC$Server.call(RPC.java:1070)
at org.apache.hadoop.ipc.Server$RpcCall.run(Server.java:1020)
at org.apache.hadoop.ipc.Server$RpcCall.run(Server.java:948)
at java.security.AccessController.doPrivileged(Native Method)
at javax.security.auth.Subject.doAs(Subject.java:422)
at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1845)
at org.apache.hadoop.ipc.Server$Handler.run(Server.java:2952)

Hello @SAMalkani1,

Please go through the link below:

https://discuss.itversity.com/t/unable-to-create-or-use-hive-database-getting-permission-denied-error/23169/2
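In short, the stack trace shows HDFS denying WRITE to user=itv000764 on /user/itversity/warehouse, which is owned by itversity with mode drwxr-xr-x, so CREATE DATABASE cannot make a directory there. Assuming your account has a writable HDFS home directory under /user/itv000764 (the path is inferred from the username in the error, so adjust it to your actual home directory), a sketch of the corrected session would be:

```scala
import org.apache.spark.sql.SparkSession

// Point spark.sql.warehouse.dir at a directory your own user can write to,
// instead of the shared /user/itversity/warehouse owned by itversity.
// "/user/itv000764/warehouse" is an assumption based on the username in
// the AccessControlException; substitute your real HDFS home directory.
val spark = SparkSession.
    builder.
    config("spark.ui.port", "0").
    config("spark.sql.warehouse.dir", "/user/itv000764/warehouse").
    enableHiveSupport.
    master("yarn").
    appName("Getting Started - Spark SQL").
    getOrCreate

// With the warehouse under your own directory, the database's files land
// where you have write permission:
// spark.sql("CREATE DATABASE IF NOT EXISTS itversity_demo")
```

Note that spark.sql.warehouse.dir is only read when the session is first created, so restart the kernel (or stop the existing session) before rebuilding with the new path.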