<a name="mrs_01_2041"></a>
<h1 class="topictitle1">Why Does the "Permission denied" Exception Occur When I Create a Temporary Table or View in Spark-beeline?</h1>
<div id="body1595920222655"><div class="section" id="mrs_01_2041__s0d98802bc243418f897c9136d4e21b82"><h4 class="sectiontitle">Question</h4><p id="mrs_01_2041__acc63db8be7bc42069b19220a861f350c">In normal mode, when I create a temporary table or view in spark-beeline, the error message "Permission denied" is displayed, indicating that I have no permissions on the HDFS directory. The error log information is as follows:</p>
<pre class="screen" id="mrs_01_2041__s0c0539a3f75844ec8719c606119c4c8d">org.apache.hadoop.security.AccessControlException Permission denied: user=root, access=EXECUTE, inode="/tmp/spark/sparkhive-scratch/omm/e579a76f-43ed-4014-8a54-1072c07ceeff/_tmp_space.db/52db1561-60b0-4e7d-8a25-c2eaa44850a9":omm:hadoop:drwx------</pre>
</div>
<div class="section" id="mrs_01_2041__s71abd3eaef2b4036b20f98ba2a81f386"><h4 class="sectiontitle">Answer</h4><p id="mrs_01_2041__ae70544e3b66e4b73b3a3aa7d62ee4913">In normal mode, if you run the spark-beeline command as a non-omm user, <strong id="mrs_01_2041__b103798750552337">root</strong> user for example, without specifying the <strong id="mrs_01_2041__b1510135718583">-n</strong> parameter, your account is still the root user. After spark-beeline is started, a new HDFS directory is created by JDBCServer. In the current version of DataSight, the user that starts the JDBCServer is <strong id="mrs_01_2041__b39557581752337">omm</strong>. In versions earlier than DataSight V100R002C30, the user is <strong id="mrs_01_2041__b150059767652337">root</strong>. Therefore, the owner of the HDFS directory is <strong id="mrs_01_2041__b152326489052337">omm</strong> and the group is <strong id="mrs_01_2041__b167432400552337">hadoop</strong>. The HDFS directory is used when you create a temporary table or view in spark-beeline and the user <strong id="mrs_01_2041__b84598505352337">root</strong> is a common user in HDFS and has no permissions on the directory of user <strong id="mrs_01_2041__b28420585252337">omm</strong>. As a result, the "Permission denied" exception occurs.</p>
<p id="mrs_01_2041__a711557dd806f460b96851788b4014f29">In normal mode, only user <strong id="mrs_01_2041__b78253484852337">omm</strong> can create a temporary table or view. To solve this problem, you can specify the <strong id="mrs_01_2041__b95143329652337">-n omm</strong> option for user <strong id="mrs_01_2041__b131828644052337">omm</strong> when starting spark-beeline. In this way, you have the permissions to perform operations on the HDFS directory.</p>
</div>
</div>
<div>
<div class="familylinks">
<div class="parentlink"><strong>Parent topic:</strong> <a href="mrs_01_2022.html">Spark SQL and DataFrame</a></div>
</div>
</div>