<a name="mrs_01_1052"></a>
<h1 class="topictitle1">Migrating Services of External Security Components Interconnected with Storm</h1>
<div id="body1590370637072"><div class="section" id="mrs_01_1052__section159271159777"><h4 class="sectiontitle">Migrating Services for Interconnecting Storm with HDFS and HBase</h4><p id="mrs_01_1052__p09791622111919">If the Storm services use the <strong id="mrs_01_1052__b89761444492853">storm-hdfs</strong> or <strong id="mrs_01_1052__b147645168192853">storm-hbase</strong> plug-in package for interconnection, you need to specify the following security parameters when migrating the services as instructed in <a href="mrs_01_1050.html">Completely Migrating Storm Services</a>.</p>
<pre class="screen" id="mrs_01_1052__screen78491262565">//Initialize Storm Config.
Config conf = new Config();

//Initialize the security plug-in list.
List<String> auto_tgts = new ArrayList<String>();
//Add the AutoTGT plug-in.
auto_tgts.add("org.apache.storm.security.auth.kerberos.AutoTGT");
//Add the AutoHDFS plug-in.
//If HBase is interconnected, use auto_tgts.add("org.apache.storm.hbase.security.AutoHBase") to replace the following:
auto_tgts.add("org.apache.storm.hdfs.common.security.AutoHDFS");

//Set security parameters.
conf.put(Config.TOPOLOGY_AUTO_CREDENTIALS, auto_tgts);
//Set the number of workers.
conf.setNumWorkers(3);

//Convert Storm Config to StormConfig of Flink.
StormConfig stormConfig = new StormConfig(conf);

//Construct FlinkTopology using TopologyBuilder of Storm.
FlinkTopology topology = FlinkTopology.createTopology(builder);

//Obtain the StreamExecutionEnvironment.
StreamExecutionEnvironment env = topology.getExecutionEnvironment();

//Add StormConfig to the job's environment variables so that it is available when Bolt and Spout are constructed.
//If Config is not required during the initialization of Bolt and Spout, do not set this parameter.
env.getConfig().setGlobalJobParameters(stormConfig);

//Submit the topology.
topology.execute();</pre>
<p id="mrs_01_1052__p15849182614566">After the preceding security plug-ins are configured, HDFSBolt and HBaseBolt do not perform unnecessary logins during initialization, because the security context has already been configured in Flink.</p>
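<p>For example, with the AutoHDFS plug-in configured as above, an HdfsBolt from the <strong>storm-hdfs</strong> plug-in package can be declared in the topology without any explicit Kerberos login code. The following sketch is illustrative only: the file system URL, output path, and component IDs are placeholder values and not part of the migration procedure itself.</p>
<pre class="screen">//Classes below come from the storm-hdfs plug-in package (org.apache.storm.hdfs.bolt.*).
//Declare the HdfsBolt as usual; no additional login is coded in its initialization,
//because the security context is supplied by the AutoHDFS plug-in configured above.
HdfsBolt hdfsBolt = new HdfsBolt()
        .withFsUrl("hdfs://hacluster")                                              //Placeholder HDFS URL
        .withFileNameFormat(new DefaultFileNameFormat().withPath("/user/storm/"))   //Placeholder output path
        .withRecordFormat(new DelimitedRecordFormat().withFieldDelimiter(","))
        .withRotationPolicy(new FileSizeRotationPolicy(128.0f, FileSizeRotationPolicy.Units.MB))
        .withSyncPolicy(new CountSyncPolicy(1000));

//Attach the bolt to the topology before calling FlinkTopology.createTopology(builder).
builder.setBolt("hdfsBolt", hdfsBolt, 1).shuffleGrouping("spout");</pre>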
</div>
<div class="section" id="mrs_01_1052__section394119176816"><h4 class="sectiontitle">Migrating Services of Storm Interconnected with Other Security Components</h4><p id="mrs_01_1052__p18768435191913">If plug-in packages such as <strong id="mrs_01_1052__b198398611892853">storm-kafka-client</strong> and <strong id="mrs_01_1052__b138826426892853">storm-solr</strong> are used for interconnection between Storm and other components, the previously configured security plug-ins need to be deleted before service migration.</p>
<pre class="screen" id="mrs_01_1052__screen118498262560">List<String> auto_tgts = new ArrayList<String>();
//keytab mode
auto_tgts.add("org.apache.storm.security.auth.kerberos.AutoTGTFromKeytab");

//Write the plug-in list configured on the client to the specified config parameter.
//Mandatory in security mode
//This configuration is not required in common mode, and you can comment out the following line.
conf.put(Config.TOPOLOGY_AUTO_CREDENTIALS, auto_tgts);</pre>
<p id="mrs_01_1052__p8849162695610">The AutoTGTFromKeytab plug-in must be deleted during service migration. Otherwise, the login will fail when Bolt or Spout is initialized.</p>
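<p>For reference, the following minimal sketch shows what the configuration might look like after the plug-in is deleted. It assumes the topology is otherwise built as in the earlier example; the commented-out lines mark exactly what is removed.</p>
<pre class="screen">//Initialize Storm Config as usual.
Config conf = new Config();

//Deleted during migration; leaving these lines in place causes the login failure described above.
//List<String> auto_tgts = new ArrayList<String>();
//auto_tgts.add("org.apache.storm.security.auth.kerberos.AutoTGTFromKeytab");
//conf.put(Config.TOPOLOGY_AUTO_CREDENTIALS, auto_tgts);

//Continue with the normal migration steps.
StormConfig stormConfig = new StormConfig(conf);
FlinkTopology topology = FlinkTopology.createTopology(builder);</pre>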
</div>
</div>
<div>
<div class="familylinks">
<div class="parentlink"><strong>Parent topic:</strong> <a href="mrs_01_1048.html">Migrating Storm Services to Flink</a></div>
</div>
</div>