MapR Streams is a global publish-subscribe event streaming system for big data. It connects data producers and consumers worldwide in real time, with unlimited scale. Publishers (data producers) write data to one or more topics in MapR Streams. Subscribers (data consumers) to the topic can read the data instantaneously, anywhere across the globe.
MapR Streams is unique for two reasons. First, it is the first big-data-scale streaming system built into a converged data platform. Second, it is the only big data streaming system to support global event replication at Internet-of-Things (IoT) scale and reliability, providing failover endpoints across up to thousands of distributed clusters.
This guide describes the MapR Kafka implementation of the Spring Cloud Stream Binder and how data can be published to and consumed from MapR Event Streams.
To use the Apache Kafka binder, add it to your Spring Cloud Stream application using the following Maven coordinates:
<dependency>
  <groupId>org.springframework.cloud</groupId>
  <artifactId>spring-cloud-stream-binder-kafka</artifactId>
</dependency>
If you use Kafka 0.9, ensure that you exclude the kafka broker jar from the spring-cloud-starter-stream-kafka dependency, as follows:

<dependency>
  <groupId>org.springframework.cloud</groupId>
  <artifactId>spring-cloud-stream-binder-kafka</artifactId>
  <exclusions>
    <exclusion>
      <groupId>org.apache.kafka</groupId>
      <artifactId>kafka_2.11</artifactId>
    </exclusion>
  </exclusions>
</exclusions>
To integrate with MapR Event Streams, add the MapR-specific dependencies for MapR Streams/Kafka:

<dependency>
  <groupId>com.mapr.streams</groupId>
  <artifactId>mapr-streams</artifactId>
  <version>5.2.2-mapr</version>
</dependency>
<dependency>
  <groupId>org.apache.kafka</groupId>
  <artifactId>kafka-clients</artifactId>
  <version>0.9.0.0-mapr-1707</version>
</dependency>
<dependency>
  <groupId>org.apache.kafka</groupId>
  <artifactId>kafka_2.11</artifactId>
  <version>0.9.0.0-mapr-1707</version>
</dependency>
This producer example illustrates how to publish data to MapR Event Streams.

Step 1) Create the MapR stream path
maprcli stream create -path /tmp/springcloud
Step 2) Create a stream topic
maprcli stream topic create -path /tmp/springcloud -topic testbinder
Step 3) Validate that the topic has been created
maprcli stream topic list -path /tmp/springcloud
partitions  maxlag  logicalsize  topic       consumers  physicalsize
1           0       0            testbinder  0          0
Step 4) Start the Kafka Server
/opt/mapr/kafka/kafka-0.9.0/bin/kafka-server-start.sh /opt/mapr/kafka/kafka-0.9.0/config/server.properties
Step 5) Download the SpringStreamsProducer project from Git, go to the project directory, and run the producer with the Maven Spring Boot command: mvn spring-boot:run
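For the producer to publish to the stream created above, its output binding must point at the MapR topic. A minimal application.properties sketch, assuming standard Spring Cloud Stream binding property names (with the MapR Kafka client, a topic is addressed as stream-path:topic-name):

```properties
# Output binding for the Source.OUTPUT channel used by the producer.
# MapR Streams topics are addressed as <stream path>:<topic name>.
spring.cloud.stream.bindings.output.destination=/tmp/springcloud:testbinder
```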
The standard Spring Integration @InboundChannelAdapter annotation sends a message to the source's output channel, using the return value as the payload of the message.

@EnableBinding(Source.class)
public class TimerSource {
    @InboundChannelAdapter(value = Source.OUTPUT, poller = @Poller(fixedDelay = "1000", maxMessagesPerPoll = "1"))
    public ChatMessage timerMessageSource() {
        return new ChatMessage("MapR Data Technologies", System.currentTimeMillis());
    }
}
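The ChatMessage payload returned above is not shown in the post; a minimal sketch of such a POJO (the class exists in the project, but the field names here are assumptions) could look like:

```java
// Hypothetical payload class for the TimerSource example above.
// Field names (from, time) are assumptions; the binder serializes the object via its getters.
public class ChatMessage {
    private final String from;
    private final long time;

    public ChatMessage(String from, long time) {
        this.from = from;
        this.time = time;
    }

    public String getFrom() {
        return from;
    }

    public long getTime() {
        return time;
    }

    @Override
    public String toString() {
        return "ChatMessage{from='" + from + "', time=" + time + "}";
    }
}
```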
Step 6) Use the streamanalyzer utility to verify that messages are being published to the topic.
mapr streamanalyzer -path /tmp/springcloud -topics testbinder
Total number of messages: 65
mapr streamanalyzer -path /tmp/springcloud -topics testbinder
Total number of messages: 71
mapr streamanalyzer -path /tmp/springcloud -topics testbinder
Total number of messages: 74
This consumer example illustrates the input channel from which the service activator consumes messages.

@ServiceActivator(inputChannel = Sink.INPUT)
public void loggerSink(Object payload) {
    System.out.println("Received : " + payload);
}
Step 1) Download the SpringStreamsConsumer project from Git, go to the project directory, and run the consumer with the Maven Spring Boot command: mvn spring-boot:run
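As with the producer, the consumer's input binding has to reference the same MapR topic; the maprcli cursor output in this post shows a consumer group named mapr. A minimal application.properties sketch, assuming standard Spring Cloud Stream property names:

```properties
# Input binding for the Sink.INPUT channel consumed by the service activator.
# The group maps to the consumer group reported by 'maprcli stream cursor list'.
spring.cloud.stream.bindings.input.destination=/tmp/springcloud:testbinder
spring.cloud.stream.bindings.input.group=mapr
```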
Step 2) Use the maprcli command to verify the committed offset.

maprcli stream cursor list -path /tmp/springcloud -topic testbinder -json
{
  "timestamp":1506985469999,
  "timeofday":"2017-10-02 04:04:29.999 GMT-0700",
  "status":"OK",
  "total":1,
  "data":[
    {
      "consumergroup":"mapr",
      "topic":"testbinder",
      "partitionid":"0",
      "produceroffset":"1001", <-------------
      "committedoffset":"1002", <-------------
      "producertimestamp":"2017-10-02T04:04:15.606-0700",
      "consumertimestamp":"2017-10-02T04:04:15.606-0700",
      "consumerlagmillis":"0"
    }
  ]
}
Conclusion

In this blog, you have learned how to integrate MapR Event Streams with the Spring Cloud Kafka Binder, with producer and consumer examples.