FAILED: Execution Error, return code 1 from org.apache.hadoop.hive.ql.exec.DDLTask. MetaException(message:java.security.AccessControlException: Permission denied: user=rahulhadoop2016, access=WRITE, inode="/":hdfs:hdfs:drwxr-xr-x

I am trying to run the query below and I am getting the following error:

hive> add jar /home/rahulhadoop2016/hivexmlserde-1.0.5.3.jar;
Added [/home/rahulhadoop2016/hivexmlserde-1.0.5.3.jar] to class path
Added resources: [/home/rahulhadoop2016/hivexmlserde-1.0.5.3.jar]
hive> create external table mer_Instrument_Payload ( ML_CA_ILLIQUID STRING, ML_SCD_SECURITY_TYPE STRING)
> row format serde "com.ibm.spss.hive.serde2.xml.XmlSerDe"
> with serdeproperties
> (
> "column.xpath.ML_CA_ILLIQUID" = "/InstrumentMaster/Custom/CustomInstrument/ML_CA_ILLIQUID/text()",
> "column.xpath.ML_SCD_SECURITY_TYPE" = "/InstrumentMaster/Custom/CustomInstrument/ML_SCD_SECURITY_TYPE/text()"
> ) stored as
> inputformat "com.ibm.spss.hive.serde2.xml.XmlInputFormat"
> outputformat "org.apache.hadoop.hive.ql.io.IgnoreKeyTextOutputFormat"
> location '/home/rahulhadoop2016/xml_instrument/'
> tblproperties ("xmlinput.start" = "<Publication", "xmlinput.end" = "");
FAILED: Execution Error, return code 1 from org.apache.hadoop.hive.ql.exec.DDLTask. MetaException(message:java.security.AccessControlException: Permission denied: user=rahulhadoop2016, access=WRITE, inode="/":hdfs:hdfs:drwxr-xr-x
at org.apache.hadoop.hdfs.server.namenode.FSPermissionChecker.check(FSPermissionChecker.java:319)
at org.apache.hadoop.hdfs.server.namenode.FSPermissionChecker.checkPermission(FSPermissionChecker.java:219)
at org.apache.hadoop.hdfs.server.namenode.FSPermissionChecker.checkPermission(FSPermissionChecker.java:190)
at org.apache.hadoop.hdfs.server.namenode.FSDirectory.checkPermission(FSDirectory.java:1827)
at org.apache.hadoop.hdfs.server.namenode.FSDirectory.checkPermission(FSDirectory.java:1811)
at org.apache.hadoop.hdfs.server.namenode.FSDirectory.checkPathAccess(FSDirectory.java:1785)
at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.checkAccess(FSNamesystem.java:8558)
at org.apache.hadoop.hdfs.server.namenode.NameNodeRpcServer.checkAccess(NameNodeRpcServer.java:2064)
at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolServerSideTranslatorPB.checkAccess(ClientNamenodeProtocolServerSideTranslatorPB.java:1451)
at org.apache.hadoop.hdfs.protocol.proto.ClientNamenodeProtocolProtos$ClientNamenodeProtocol$2.callBlockingMethod(ClientNamenodeProtocolProtos.java)
at org.apache.hadoop.ipc.ProtobufRpcEngine$Server$ProtoBufRpcInvoker.call(ProtobufRpcEngine.java:640)
at org.apache.hadoop.ipc.RPC$Server.call(RPC.java:982)
at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:2313)
at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:2309)
at java.security.AccessController.doPrivileged(Native Method)
at javax.security.auth.Subject.doAs(Subject.java:422)
at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1724)
at org.apache.hadoop.ipc.Server$Handler.run(Server.java:2307)

Sorry, I resolved it. It's the path: the LOCATION clause in CREATE EXTERNAL TABLE is an HDFS path, not a local filesystem path, so Hive tried to create /home/rahulhadoop2016/xml_instrument under the HDFS root, where my user has no write permission.
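For anyone hitting the same error, a minimal sketch of the fix in my setup: stage the XML files in HDFS under the user's home directory and point LOCATION there instead of at a local path. The target directory name here is just my layout and the local source path assumes files already exist there; adjust both to your environment.

```shell
# Create a directory the user can actually write to in HDFS
# (under /user/<name>, not under the HDFS root "/"):
hdfs dfs -mkdir -p /user/rahulhadoop2016/xml_instrument

# Copy the local XML files into that HDFS directory:
hdfs dfs -put /home/rahulhadoop2016/xml_instrument/*.xml /user/rahulhadoop2016/xml_instrument/

# Then use the HDFS path in the DDL:
#   location '/user/rahulhadoop2016/xml_instrument/'
```

With the LOCATION pointing inside the user's own HDFS home, the NameNode permission check no longer needs WRITE access on "/".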