To integrate the Java Kafka client with Apache Storm, follow these steps:
First, make sure your project includes the Kafka and Storm dependencies. For a Maven project, add the following to pom.xml (storm-kafka-client provides the Kafka spout used later; keep its version in line with your Storm version):
<!-- Kafka client -->
<dependency>
    <groupId>org.apache.kafka</groupId>
    <artifactId>kafka-clients</artifactId>
    <version>2.8.0</version>
</dependency>
<!-- Storm core -->
<dependency>
    <groupId>org.apache.storm</groupId>
    <artifactId>storm-core</artifactId>
    <version>2.3.2</version>
</dependency>
<!-- Storm's Kafka integration (KafkaSpout, KafkaSpoutConfig) -->
<dependency>
    <groupId>org.apache.storm</groupId>
    <artifactId>storm-kafka-client</artifactId>
    <version>2.3.2</version>
</dependency>
Next, create a Java class that sends messages to a Kafka topic. For example, create a file named SimpleKafkaProducer.java (the class is not called KafkaProducer here, because that name would shadow Kafka's own KafkaProducer class and the code would not compile):
import org.apache.kafka.clients.producer.KafkaProducer;
import org.apache.kafka.clients.producer.ProducerRecord;

import java.util.Properties;

public class SimpleKafkaProducer {
    public static void main(String[] args) {
        // Connection and serialization settings for the producer
        Properties props = new Properties();
        props.put("bootstrap.servers", "localhost:9092");
        props.put("key.serializer", "org.apache.kafka.common.serialization.StringSerializer");
        props.put("value.serializer", "org.apache.kafka.common.serialization.StringSerializer");

        // Send 100 simple key/value messages to the "my-topic" topic
        KafkaProducer<String, String> producer = new KafkaProducer<>(props);
        for (int i = 0; i < 100; i++) {
            producer.send(new ProducerRecord<>("my-topic", Integer.toString(i), Integer.toString(i * 2)));
        }
        producer.close();
    }
}
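Optionally, before wiring in Storm, you can confirm that the messages actually reached the topic by reading them back with a plain Kafka consumer. The following is a minimal sketch only; the class name ConsumerCheck and the group id "check-group" are illustrative choices, not part of the original setup:
import org.apache.kafka.clients.consumer.ConsumerRecord;
import org.apache.kafka.clients.consumer.ConsumerRecords;
import org.apache.kafka.clients.consumer.KafkaConsumer;

import java.time.Duration;
import java.util.Collections;
import java.util.Properties;

public class ConsumerCheck {
    public static void main(String[] args) {
        Properties props = new Properties();
        props.put("bootstrap.servers", "localhost:9092");
        props.put("group.id", "check-group");            // illustrative consumer group
        props.put("auto.offset.reset", "earliest");      // read the topic from the beginning
        props.put("key.deserializer", "org.apache.kafka.common.serialization.StringDeserializer");
        props.put("value.deserializer", "org.apache.kafka.common.serialization.StringDeserializer");

        try (KafkaConsumer<String, String> consumer = new KafkaConsumer<>(props)) {
            consumer.subscribe(Collections.singletonList("my-topic"));
            // Poll a few times and print whatever has been produced so far
            for (int i = 0; i < 5; i++) {
                ConsumerRecords<String, String> records = consumer.poll(Duration.ofSeconds(1));
                for (ConsumerRecord<String, String> record : records) {
                    System.out.println(record.key() + " -> " + record.value());
                }
            }
        }
    }
}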
Then create a Java class that defines the Storm topology and configures the spout that reads messages from the Kafka topic. For example, create a file named KafkaTopology.java. The class cannot itself be named KafkaSpout, because it uses Storm's org.apache.storm.kafka.spout.KafkaSpout (from the storm-kafka-client dependency) and the names would clash:
import org.apache.storm.Config;
import org.apache.storm.StormSubmitter;
import org.apache.storm.kafka.spout.KafkaSpout;
import org.apache.storm.kafka.spout.KafkaSpoutConfig;
import org.apache.storm.topology.TopologyBuilder;

public class KafkaTopology {
    public static void main(String[] args) throws Exception {
        // Configure the spout to read string key/value messages from "my-topic"
        KafkaSpoutConfig<String, String> spoutConfig = KafkaSpoutConfig
                .builder("localhost:9092", "my-topic")
                .setProp("key.deserializer", "org.apache.kafka.common.serialization.StringDeserializer")
                .setProp("value.deserializer", "org.apache.kafka.common.serialization.StringDeserializer")
                .build();

        // Wire the Kafka spout to the processing bolt
        TopologyBuilder builder = new TopologyBuilder();
        builder.setSpout("kafka-spout", new KafkaSpout<>(spoutConfig), 5);
        builder.setBolt("bolt", new KafkaBolt(), 5).shuffleGrouping("kafka-spout");

        // Submit the topology to a running Storm cluster
        Config config = new Config();
        config.setNumWorkers(3);
        StormSubmitter.submitTopology("kafka-topology", config, builder.createTopology());
    }
}
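If you only want to try the topology on a development machine without a running Storm cluster, you can run it in-process instead of calling StormSubmitter. This is a minimal sketch assuming Storm 2.x, where LocalCluster is AutoCloseable; the class name LocalKafkaTopology is illustrative:
import org.apache.storm.Config;
import org.apache.storm.LocalCluster;
import org.apache.storm.kafka.spout.KafkaSpout;
import org.apache.storm.kafka.spout.KafkaSpoutConfig;
import org.apache.storm.topology.TopologyBuilder;

public class LocalKafkaTopology {
    public static void main(String[] args) throws Exception {
        KafkaSpoutConfig<String, String> spoutConfig =
                KafkaSpoutConfig.builder("localhost:9092", "my-topic").build();

        TopologyBuilder builder = new TopologyBuilder();
        builder.setSpout("kafka-spout", new KafkaSpout<>(spoutConfig), 1);
        builder.setBolt("bolt", new KafkaBolt(), 1).shuffleGrouping("kafka-spout");

        // Run the topology in-process for a minute, then shut it down
        try (LocalCluster cluster = new LocalCluster()) {
            cluster.submitTopology("kafka-topology-local", new Config(), builder.createTopology());
            Thread.sleep(60_000);
            cluster.killTopology("kafka-topology-local");
        }
    }
}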
Finally, create a Java class that processes the messages emitted by the Kafka spout. For example, create a file named KafkaBolt.java (this simple printing bolt is unrelated to Storm's own KafkaBolt, which writes tuples back to Kafka):
import org.apache.storm.topology.BasicOutputCollector;
import org.apache.storm.topology.OutputFieldsDeclarer;
import org.apache.storm.topology.base.BaseBasicBolt;
import org.apache.storm.tuple.Tuple;

public class KafkaBolt extends BaseBasicBolt {
    @Override
    public void execute(Tuple input, BasicOutputCollector collector) {
        // By default the storm-kafka-client spout emits the fields "topic", "partition", "offset", "key" and "value"
        String message = input.getStringByField("value");
        System.out.println("Received message: " + message);
    }

    @Override
    public void declareOutputFields(OutputFieldsDeclarer declarer) {
        // This bolt only prints; it emits nothing downstream, so no output fields are declared
    }
}
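If you later want another bolt to consume the processed messages instead of only printing them, the bolt has to emit tuples and declare their fields. The following variant is a sketch of that pattern; the class name ForwardingBolt and the output field name "message" are invented for the example:
import org.apache.storm.topology.BasicOutputCollector;
import org.apache.storm.topology.OutputFieldsDeclarer;
import org.apache.storm.topology.base.BaseBasicBolt;
import org.apache.storm.tuple.Fields;
import org.apache.storm.tuple.Tuple;
import org.apache.storm.tuple.Values;

public class ForwardingBolt extends BaseBasicBolt {
    @Override
    public void execute(Tuple input, BasicOutputCollector collector) {
        String message = input.getStringByField("value");
        // Pass the Kafka message on to whatever bolt is wired after this one
        collector.emit(new Values(message));
    }

    @Override
    public void declareOutputFields(OutputFieldsDeclarer declarer) {
        // Downstream bolts read this tuple via getStringByField("message")
        declarer.declare(new Fields("message"));
    }
}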
With these pieces in place, the Java Kafka client and Apache Storm are integrated. Run SimpleKafkaProducer.java to send messages to the Kafka topic, then submit KafkaTopology to your Storm cluster (typically by packaging the project and using the storm jar command, or by running the LocalCluster variant above for local testing). The Kafka spout reads the messages from the topic and hands them to KafkaBolt.java for processing.