Reviewed-by: Hasko, Vladimir <vladimir.hasko@t-systems.com> Co-authored-by: Yang, Tong <yangtong2@huawei.com> Co-committed-by: Yang, Tong <yangtong2@huawei.com>
<a name="mrs_01_2026"></a>
<h1 class="topictitle1">What Directory Permissions Do I Need to Create a Table Using SparkSQL?</h1>
<div id="body1595920220421"><div class="section" id="mrs_01_2026__s86e1d6a6bbb040ebb52a8e7e0b058720"><h4 class="sectiontitle">Question</h4><p id="mrs_01_2026__a3e890b5ef6bb4503866e06e7a5e3bd28">The following error information is displayed when a new user creates a table using SparkSQL:</p>
<pre class="screen" id="mrs_01_2026__sc91213f6c67d4478b492f4d1709cd9cb">0: jdbc:hive2://192.168.169.84:22550/default> create table testACL(c string);
Error: org.apache.spark.sql.execution.QueryExecutionException: FAILED: Execution Error, return code 1 from
org.apache.hadoop.hive.ql.exec.DDLTask. MetaException(message:Got exception: org.apache.hadoop.security.AccessControlException
Permission denied: user=testACL, access=EXECUTE, inode="/user/hive/warehouse/testacl":spark:hadoop:drwxrwx---
at org.apache.hadoop.hdfs.server.namenode.FSPermissionChecker.checkAccessAcl(FSPermissionChecker.java:403)
at org.apache.hadoop.hdfs.server.namenode.FSPermissionChecker.check(FSPermissionChecker.java:306)
at org.apache.hadoop.hdfs.server.namenode.FSPermissionChecker.checkTraverse(FSPermissionChecker.java:259)
at org.apache.hadoop.hdfs.server.namenode.FSPermissionChecker.checkPermission(FSPermissionChecker.java:205)
at org.apache.hadoop.hdfs.server.namenode.FSPermissionChecker.checkPermission(FSPermissionChecker.java:190)
at org.apache.hadoop.hdfs.server.namenode.FSDirectory.checkPermission(FSDirectory.java:1710)
at org.apache.hadoop.hdfs.server.namenode.FSDirStatAndListingOp.getFileInfo(FSDirStatAndListingOp.java:109)
at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.getFileInfo(FSNamesystem.java:3762)
at org.apache.hadoop.hdfs.server.namenode.NameNodeRpcServer.getFileInfo(NameNodeRpcServer.java:1014)
at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolServerSideTranslatorPB.getFileInfo(ClientNamenodeProtocolServerSideTranslatorPB.java:853)
at org.apache.hadoop.hdfs.protocol.proto.ClientNamenodeProtocolProtos$ClientNamenodeProtocol$2.callBlockingMethod(ClientNamenodeProtocolProtos.java)
at org.apache.hadoop.ipc.ProtobufRpcEngine$Server$ProtoBufRpcInvoker.call(ProtobufRpcEngine.java:616)
at org.apache.hadoop.ipc.RPC$Server.call(RPC.java:973)
at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:2089)
at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:2085)
at java.security.AccessController.doPrivileged(Native Method)
at javax.security.auth.Subject.doAs(Subject.java:422)
at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1675)
at org.apache.hadoop.ipc.Server$Handler.run(Server.java:2083)
) (state=,code=0)</pre>
</div>
<div class="section" id="mrs_01_2026__s48370ac7288a4bdca63033ea13225608"><h4 class="sectiontitle">Answer</h4><p id="mrs_01_2026__ad818281a91e8403eafac8bfc8b0695a5">When you create a table using Spark SQL, the underlying system calls the Hive interface and creates a directory named after the table in the <span class="filepath" id="mrs_01_2026__fa533aa19fb8b476bb23e4a8349ff25f1"><b>/user/hive/warehouse</b></span> directory. Therefore, the user must have read, write, and execute permissions on the <span class="filepath" id="mrs_01_2026__f42ebef2e36774bacb9e30d15382c0c3e"><b>/user/hive/warehouse</b></span> directory, or belong to the group that owns it (the Hive group).</p>
<p id="mrs_01_2026__a4fa2e6169dfa46e6902297b44ad98fa3">The <span class="filepath" id="mrs_01_2026__fea509ba079f149a59b5b550ba973dccb"><b>/user/hive/warehouse</b></span> directory is specified by the <strong>hive.metastore.warehouse.dir</strong> parameter.</p>
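<p>The denial in the error output follows POSIX-style permission checks: the directory is owned by <strong>spark</strong>:<strong>hadoop</strong> with mode <strong>drwxrwx---</strong>, which grants no bits to "other" users, so <strong>testACL</strong> (neither the owner nor a member of the <strong>hadoop</strong> group) fails the EXECUTE traverse check. The following sketch illustrates this check logic; it is a simplified illustration, not the actual HDFS <strong>FSPermissionChecker</strong> implementation, and the function name is hypothetical:</p>

```python
def can_access(perm, owner, group, user, user_groups, want):
    """Illustrative POSIX-style check; not the real HDFS implementation.

    perm  -- permission string such as "drwxrwx---"
    want  -- one of "r", "w", "x"
    """
    bits = perm[1:]              # strip the file-type character ("d")
    if user == owner:
        cls = bits[0:3]          # owner bits
    elif group in user_groups:
        cls = bits[3:6]          # group bits
    else:
        cls = bits[6:9]          # "other" bits
    return want in cls

# testACL is neither the owner (spark) nor in the hadoop group,
# so it falls through to the empty "other" bits and is denied EXECUTE:
print(can_access("drwxrwx---", "spark", "hadoop", "testACL", [], "x"))  # False
# A user who is in the hadoop group would pass the same check:
print(can_access("drwxrwx---", "spark", "hadoop", "testACL", ["hadoop"], "x"))  # True
```

<p>This is why adding the user to the group that owns the warehouse directory, or granting directory permissions directly, resolves the error.</p>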
</div>
</div>
<div>
<div class="familylinks">
<div class="parentlink"><strong>Parent topic:</strong> <a href="mrs_01_2022.html">Spark SQL and DataFrame</a></div>
</div>
</div>