Yang, Tong 6182f91ba8 MRS component operation guide_normal 2.0.38.SP20 version
Reviewed-by: Hasko, Vladimir <vladimir.hasko@t-systems.com>
Co-authored-by: Yang, Tong <yangtong2@huawei.com>
Co-committed-by: Yang, Tong <yangtong2@huawei.com>
2022-12-09 14:55:21 +00:00


<a name="mrs_01_0625"></a>
<h1 class="topictitle1">Why Does the LoadIncrementalHFiles Tool Fail to Be Executed and "Permission denied" Is Displayed When Nodes in a Cluster Are Used to Import Data in Batches?</h1>
<div id="body1600571305226"><div class="section" id="mrs_01_0625__s0d75574469bd460f87878b6f2e2d2866"><h4 class="sectiontitle">Question</h4><p id="mrs_01_0625__p154064411257">Why does the LoadIncrementalHFiles tool fail with "Permission denied" when a manually created Linux user runs it on a DataNode of a normal cluster to import data in batches?</p>
<pre class="screen" id="mrs_01_0625__screen57210392546">2020-09-20 14:53:53,808 WARN [main] shortcircuit.DomainSocketFactory: error creating DomainSocket
java.net.ConnectException: connect(2) error: Permission denied when trying to connect to '/var/run/FusionInsight-HDFS/dn_socket'
at org.apache.hadoop.net.unix.DomainSocket.connect0(Native Method)
at org.apache.hadoop.net.unix.DomainSocket.connect(DomainSocket.java:256)
at org.apache.hadoop.hdfs.shortcircuit.DomainSocketFactory.createSocket(DomainSocketFactory.java:168)
at org.apache.hadoop.hdfs.client.impl.BlockReaderFactory.nextDomainPeer(BlockReaderFactory.java:804)
at org.apache.hadoop.hdfs.client.impl.BlockReaderFactory.createShortCircuitReplicaInfo(BlockReaderFactory.java:526)
at org.apache.hadoop.hdfs.shortcircuit.ShortCircuitCache.create(ShortCircuitCache.java:785)
at org.apache.hadoop.hdfs.shortcircuit.ShortCircuitCache.fetchOrCreate(ShortCircuitCache.java:722)
at org.apache.hadoop.hdfs.client.impl.BlockReaderFactory.getBlockReaderLocal(BlockReaderFactory.java:483)
at org.apache.hadoop.hdfs.client.impl.BlockReaderFactory.build(BlockReaderFactory.java:360)
at org.apache.hadoop.hdfs.DFSInputStream.getBlockReader(DFSInputStream.java:663)
at org.apache.hadoop.hdfs.DFSInputStream.blockSeekTo(DFSInputStream.java:594)
at org.apache.hadoop.hdfs.DFSInputStream.readWithStrategy(DFSInputStream.java:776)
at org.apache.hadoop.hdfs.DFSInputStream.read(DFSInputStream.java:845)
at java.io.DataInputStream.readFully(DataInputStream.java:195)
at org.apache.hadoop.hbase.io.hfile.FixedFileTrailer.readFromStream(FixedFileTrailer.java:401)
at org.apache.hadoop.hbase.io.hfile.HFile.isHFileFormat(HFile.java:651)
at org.apache.hadoop.hbase.io.hfile.HFile.isHFileFormat(HFile.java:634)
at org.apache.hadoop.hbase.tool.LoadIncrementalHFiles.visitBulkHFiles(LoadIncrementalHFiles.java:1090)
at org.apache.hadoop.hbase.tool.LoadIncrementalHFiles.discoverLoadQueue(LoadIncrementalHFiles.java:1006)
at org.apache.hadoop.hbase.tool.LoadIncrementalHFiles.prepareHFileQueue(LoadIncrementalHFiles.java:257)
at org.apache.hadoop.hbase.tool.LoadIncrementalHFiles.doBulkLoad(LoadIncrementalHFiles.java:364)
at org.apache.hadoop.hbase.tool.LoadIncrementalHFiles.run(LoadIncrementalHFiles.java:1263)
at org.apache.hadoop.hbase.tool.LoadIncrementalHFiles.run(LoadIncrementalHFiles.java:1276)
at org.apache.hadoop.hbase.tool.LoadIncrementalHFiles.run(LoadIncrementalHFiles.java:1311)
at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:76)
at org.apache.hadoop.hbase.tool.LoadIncrementalHFiles.main(LoadIncrementalHFiles.java:1333)</pre>
</div>
<div class="section" id="mrs_01_0625__section9854113819918"><h4 class="sectiontitle">Answer</h4><p id="mrs_01_0625__p13100330172619">If the client that the LoadIncrementalHFiles tool depends on is installed in the cluster and is on the same node as DataNode, HDFS creates short-circuit read during the execution of the tool to improve performance. The short-circuit read depends on the <strong id="mrs_01_0625__b12895115145410">/var/run/FusionInsight-HDFS</strong> directory (<strong id="mrs_01_0625__b11440954155417">dfs.domain.socket.path</strong>). The default permission on this directory is <strong id="mrs_01_0625__b12696205813546">750</strong>. This user does not have the permission to operate the directory.</p>
<p id="mrs_01_0625__p123131815194011">To solve the preceding problem, perform the following operations:</p>
<p id="mrs_01_0625__p13776126183610">Method 1: Create a user (recommended).</p>
<ol id="mrs_01_0625__ol84371210114011"><li id="mrs_01_0625__l57959cf11dc74b388d62a55b172f9fa6"><span>Create a user on Manager. By default, the user is added to the <strong id="mrs_01_0625__b1411161718586">ficommon</strong> group.</span><p><pre class="screen" id="mrs_01_0625__screen1084192916499">[root@xxx-xxx-xxx-xxx ~]# id test
uid=20038(test) gid=9998(ficommon) groups=9998(ficommon)</pre>
</p></li><li id="mrs_01_0625__li124971844104312"><span>Import data again.</span></li></ol>
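<p>Before re-importing, it may help to confirm that the user's group list really includes <strong>ficommon</strong>. A small helper such as the following can be reused for that check; the function name is ours, not part of MRS, and the demonstration uses the current user because the <strong>test</strong> user only exists on a cluster node.</p>

```shell
# Hypothetical helper: succeed only if user $1 belongs to group $2.
in_group() { id -Gn "$1" | grep -qw "$2"; }

# On a cluster node you would run: in_group test ficommon
# Demonstration with the current user and its primary group:
in_group "$(id -un)" "$(id -gn)" && echo "member"
```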
<p id="mrs_01_0625__p525610163916">Method 2: Add the current user to the ficommon group.</p>
<ol id="mrs_01_0625__ol32521012393"><li id="mrs_01_0625__li1825410153911"><span>Add the user to the <strong id="mrs_01_0625__b145732313595">ficommon</strong> group.</span><p><pre class="screen" id="mrs_01_0625__screen17533125384315">[root@xxx-xxx-xxx-xxx ~]# usermod -a -G ficommon test
[root@xxx-xxx-xxx-xxx ~]# id test
uid=2102(test) gid=2102(test) groups=2102(test),9998(ficommon)</pre>
</p></li><li id="mrs_01_0625__li1425210123918"><span>Import data again.</span></li></ol>
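<p>After either method, the user must be able to traverse the socket directory before the import is retried. A minimal accessibility check looks like the following; the function name is ours, and the demonstration runs against <strong>/tmp</strong> because the real path is only present on a cluster node.</p>

```shell
# Hypothetical check: directory exists and is traversable by the current user.
can_enter() { [ -d "$1" ] && [ -x "$1" ]; }

# On a cluster node you would run: can_enter /var/run/FusionInsight-HDFS
# Demonstration with a directory every user can traverse:
can_enter /tmp && echo "accessible"
```

<p>Note that group membership added with <strong>usermod -a -G</strong> takes effect only in new login sessions, so the check and the import should be run from a fresh login of the user.</p>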
</div>
</div>
<div>
<div class="familylinks">
<div class="parentlink"><strong>Parent topic:</strong> <a href="mrs_01_1638.html">Common Issues About HBase</a></div>
</div>
</div>