
Connecting Apache Kafka With Mule ESB


Published on June 19, 2017

Author: JitendraBafna2

Source: slideshare.net


1. Connecting Apache Kafka With Mule ESB JITENDRA BAFNA

2. Connecting Apache Kafka With Mule ESB 1.0 Overview Apache Kafka originated at LinkedIn and became an open-source Apache project in 2011. Kafka is a message queuing system written in Java and Scala. It is a distributed publish-subscribe messaging system designed to be fast, scalable, and durable. Kafka has four core APIs:
• The Producer API allows an application to publish a stream of records to one or more Kafka topics.
• The Consumer API allows an application to subscribe to one or more topics and process the stream of records produced to them.
• The Streams API allows an application to act as a stream processor, consuming an input stream from one or more topics and producing an output stream to one or more output topics, effectively transforming input streams into output streams.
• The Connector API allows building and running reusable producers or consumers that connect Kafka topics to existing applications or data systems.
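As a minimal illustration of the Producer API, a Java sketch along these lines publishes a single record (the broker address localhost:9092 and the topic name muleesb are assumptions that match the setup used later in this deck):

import java.util.Properties;
import org.apache.kafka.clients.producer.KafkaProducer;
import org.apache.kafka.clients.producer.ProducerRecord;

public class SimpleProducer {
    public static void main(String[] args) {
        Properties props = new Properties();
        props.put("bootstrap.servers", "localhost:9092"); // broker address (assumed, matches this deck's setup)
        props.put("key.serializer", "org.apache.kafka.common.serialization.StringSerializer");
        props.put("value.serializer", "org.apache.kafka.common.serialization.StringSerializer");

        KafkaProducer<String, String> producer = new KafkaProducer<>(props);
        // Publish one record to the "muleesb" topic
        producer.send(new ProducerRecord<>("muleesb", "key-1", "Hello Kafka"));
        producer.close();
    }
}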

3. Connecting Apache Kafka With Mule ESB

4. Connecting Apache Kafka With Mule ESB 2.0 Components of Apache Kafka
• Topic: the name of a category or feed to which records are published. Topics are always multi-subscriber: a topic can have zero or more consumers that subscribe to the data written to it.
• Producers publish data to topics of their choice. A producer can publish data to one or more Kafka topics.
• Consumers consume data from topics. A consumer subscribes to one or more topics and consumes published messages by pulling data from the brokers.
• Partition: a topic may be split into many partitions, so it can handle an arbitrary amount of data.
• Partition offset: each message within a partition has a unique id known as its offset.
• Brokers are the servers responsible for maintaining the published data. Each broker may hold zero or more partitions per topic.
• Kafka cluster: one or more brokers together form a Kafka cluster.
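To make partitions and offsets concrete, a minimal Consumer API sketch in Java can print the partition and offset of each record it receives (the group id demo-group is illustrative; the broker address and topic name again match the setup used later in this deck):

import java.util.Collections;
import java.util.Properties;
import org.apache.kafka.clients.consumer.ConsumerRecord;
import org.apache.kafka.clients.consumer.ConsumerRecords;
import org.apache.kafka.clients.consumer.KafkaConsumer;

public class PartitionOffsetDemo {
    public static void main(String[] args) {
        Properties props = new Properties();
        props.put("bootstrap.servers", "localhost:9092"); // broker address (assumed)
        props.put("group.id", "demo-group");               // consumer group id (illustrative)
        props.put("key.deserializer", "org.apache.kafka.common.serialization.StringDeserializer");
        props.put("value.deserializer", "org.apache.kafka.common.serialization.StringDeserializer");

        KafkaConsumer<String, String> consumer = new KafkaConsumer<>(props);
        consumer.subscribe(Collections.singletonList("muleesb"));
        while (true) {
            ConsumerRecords<String, String> records = consumer.poll(1000);
            for (ConsumerRecord<String, String> record : records) {
                // Each record carries the partition it was read from and its offset within that partition
                System.out.printf("partition=%d offset=%d value=%s%n",
                        record.partition(), record.offset(), record.value());
            }
        }
    }
}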

5. Connecting Apache Kafka With Mule ESB 3.0 Apache Kafka Use Cases Below are some use cases where Apache Kafka can be considered.
3.1 Messaging In comparison to other messaging systems, Apache Kafka has better throughput and performance, partitioning, replication, and fault tolerance, which makes it a good solution for large-scale message processing applications.
3.2 Website Activity Tracking Website activity (number of views, number of searches, or any other actions users may perform) is published to central topics, with one topic per activity type. These feeds are available for subscription for a range of use cases including real-time processing, real-time monitoring, and loading into Hadoop or offline data warehousing systems for offline processing and reporting.
3.3 Metrics Kafka is often used for operational monitoring data. This involves aggregating statistics from distributed applications to produce centralized feeds of operational data.
3.4 Log Aggregation Kafka can be used across an organization to collect logs from multiple services and make them available in a standard format to multiple consumers.
3.5 Stream Processing Popular frameworks such as Storm and Spark Streaming read data from a topic, process it, and write the processed data to a new topic where it becomes available for users and applications. Kafka's strong durability is also very useful in the context of stream processing.

6. Connecting Apache Kafka With Mule ESB 4.0 Setup Zookeeper On Windows Server Now you will learn how to set up ZooKeeper on Windows Server. Make sure JRE 8 is installed and the JAVA_HOME path is set up in the environment variables.
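You can verify both prerequisites from a command prompt:

java -version
echo %JAVA_HOME%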

7. Connecting Apache Kafka With Mule ESB 4.1 Download & Install Zookeeper ZooKeeper also plays a vital role in serving many other purposes such as leader detection, configuration management, synchronization, and detecting when a new node joins or leaves the cluster.
• Download ZooKeeper from http://zookeeper.apache.org/releases.html and extract it (e.g. zookeeper-3.4.10).
• Go to your ZooKeeper conf directory (e.g. C:\zookeeper-3.4.10\conf).
• Rename the file zoo_sample.cfg to zoo.cfg.
• Open the zoo.cfg file in a text editor like Notepad or Notepad++.
• Search for dataDir=/tmp/zookeeper and update the path to dataDir=\zookeeper-3.4.10\data.
• Add two environment variables:
a. Add a system variable ZOOKEEPER_HOME = C:\zookeeper-3.4.10
b. Edit the system variable named Path and append ;%ZOOKEEPER_HOME%\bin;
• By default ZooKeeper runs on port 2181, but you can change the port by editing zoo.cfg.
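For reference, after these edits a minimal zoo.cfg for a standalone ZooKeeper might look like the following sketch (the dataDir value simply mirrors the path used in the step above; the other values are the defaults shipped in zoo_sample.cfg):

# zoo.cfg - minimal standalone configuration (sketch)
tickTime=2000
initLimit=10
syncLimit=5
dataDir=\zookeeper-3.4.10\data
clientPort=2181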

8. Connecting Apache Kafka With Mule ESB 4.2 Starting The Zookeeper Open the command prompt and run the command zkserver; it will start ZooKeeper on localhost:2181.

9. Connecting Apache Kafka With Mule ESB 5.0 Setup Apache Kafka On Windows Server Now you will learn how to set up Apache Kafka on Windows Server.
5.1 Download & Install Apache Kafka
• Download Apache Kafka from http://kafka.apache.org/downloads.html and extract it (e.g. kafka_2.11-0.9.0.0).
• Go to your Kafka config directory (e.g. C:\kafka_2.11-0.9.0.0\config).
• Open the file server.properties in a text editor like Notepad or Notepad++.
• Search for log.dirs=/tmp/kafka-logs and update the path to log.dirs=C:\kafka_2.11-0.9.0.0\kafka-logs.
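After the edit, the entries in server.properties most relevant to this single-broker setup might look like the sketch below (broker.id and zookeeper.connect are shown with their shipped defaults; only log.dirs is changed):

# server.properties - entries relevant to this setup (sketch)
broker.id=0
log.dirs=C:\kafka_2.11-0.9.0.0\kafka-logs
zookeeper.connect=localhost:2181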

10. Connecting Apache Kafka With Mule ESB 5.2 Starting Apache Kafka Server
• Open the command prompt and make sure you are at the path C:\kafka_2.11-0.9.0.0.
• Run the below command to start the Kafka server: .\bin\windows\kafka-server-start.bat .\config\server.properties

11. Connecting Apache Kafka With Mule ESB 6.0 Creating Topic On Apache Kafka Server Now we will create a topic with replication factor 1, as only one Kafka server is running.
• Open the command prompt and make sure you are at the path C:\kafka_2.11-0.9.0.0\bin\windows.
• Run the below command to create the topic: kafka-topics.bat --create --zookeeper localhost:2181 --replication-factor 1 --partitions 1 --topic muleesb
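Before wiring up Mule, you can optionally smoke-test the new topic with the console producer and consumer that ship with Kafka (the same commands appear in section 12.0); run each in its own command prompt from C:\kafka_2.11-0.9.0.0\bin\windows:

kafka-console-producer.bat --broker-list localhost:9092 --topic muleesb
kafka-console-consumer.bat --zookeeper localhost:2181 --topic muleesb --from-beginning

Anything typed into the producer prompt should appear in the consumer prompt.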

12. Connecting Apache Kafka With Mule ESB 7.0 Installing Anypoint Kafka Connector The Anypoint Connector for Kafka provides out-of-the-box connectivity with Kafka, allowing Mule applications to ingest real-time streaming data from Kafka and publish data to Kafka, so you can streamline business processes and move data between Kafka and enterprise applications and services. By default the Kafka connector is not part of the Mule palette; you can install it by connecting to Anypoint Exchange from Anypoint Studio. You just need to accept the license agreement, and at the end of the installation it will ask you to restart Anypoint Studio.

13. Connecting Apache Kafka With Mule ESB 8.0 Integrating Apache Kafka With Mule ESB as Producer We will implement a flow that publishes messages to the Apache Kafka server.
• Place the HTTP connector as the message source and configure it.
• Drag and drop the Apache Kafka connector and configure it by clicking the add button. Configure Bootstrap Servers, Producer Properties File and Consumer Properties File. Press OK.

14. Connecting Apache Kafka With Mule ESB

15. Connecting Apache Kafka With Mule ESB
• Configure the Operation to Producer, the Topic name, and the Key (a unique key that is published along with the message).
• Add the consumer.properties and producer.properties files to the Mule application build path (src/main/resources). Both properties files can be found at C:\kafka_2.11-0.9.0.0\config.
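As an illustration, the kind of entries these files contain is sketched below; the exact contents depend on the Kafka distribution and connector version, so treat the values (in particular the group id) as assumptions rather than required settings:

# producer.properties (sketch)
bootstrap.servers=localhost:9092
key.serializer=org.apache.kafka.common.serialization.StringSerializer
value.serializer=org.apache.kafka.common.serialization.StringSerializer

# consumer.properties (sketch)
bootstrap.servers=localhost:9092
# group id below is illustrative
group.id=mule-consumer-group
key.deserializer=org.apache.kafka.common.serialization.StringDeserializer
value.deserializer=org.apache.kafka.common.serialization.StringDeserializer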

16. Connecting Apache Kafka With Mule ESB 9.0 Integrating Apache Kafka With Mule ESB as Consumer We will implement a flow that consumes messages from the Apache Kafka server.
• Place the Apache Kafka connector as the message source and configure it by clicking the add button. Configure Bootstrap Servers, Producer Properties File and Consumer Properties File as shown above. Press OK.
• Configure the Operation to Consumer, the Topic name, and Partitions (the number of partitions you specified when creating the topic).

17. Connecting Apache Kafka With Mule ESB
• Drag and drop the File connector and configure it. This will be used to save the messages consumed from the Apache Kafka server.

18. Connecting Apache Kafka With Mule ESB 10.0 Mule Flow [Code]

<?xml version="1.0" encoding="UTF-8"?>
<mule xmlns:file="http://www.mulesoft.org/schema/mule/file"
      xmlns:http="http://www.mulesoft.org/schema/mule/http"
      xmlns:apachekafka="http://www.mulesoft.org/schema/mule/apachekafka"
      xmlns:tracking="http://www.mulesoft.org/schema/mule/ee/tracking"
      xmlns="http://www.mulesoft.org/schema/mule/core"
      xmlns:doc="http://www.mulesoft.org/schema/mule/documentation"
      xmlns:spring="http://www.springframework.org/schema/beans"
      xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
      xsi:schemaLocation="http://www.springframework.org/schema/beans http://www.springframework.org/schema/beans/spring-beans-current.xsd
        http://www.mulesoft.org/schema/mule/core http://www.mulesoft.org/schema/mule/core/current/mule.xsd
        http://www.mulesoft.org/schema/mule/http http://www.mulesoft.org/schema/mule/http/current/mule-http.xsd
        http://www.mulesoft.org/schema/mule/ee/tracking http://www.mulesoft.org/schema/mule/ee/tracking/current/mule-tracking-ee.xsd
        http://www.mulesoft.org/schema/mule/apachekafka http://www.mulesoft.org/schema/mule/apachekafka/current/mule-apachekafka.xsd
        http://www.mulesoft.org/schema/mule/file http://www.mulesoft.org/schema/mule/file/current/mule-file.xsd">

    <http:listener-config name="HTTP_Listener_Configuration" host="0.0.0.0" port="8081" doc:name="HTTP Listener Configuration"/>

    <apachekafka:config name="Apache_Kafka__Configuration" bootstrapServers="localhost:9092"
        consumerPropertiesFile="consumer.properties" producerPropertiesFile="producer.properties"
        doc:name="Apache Kafka: Configuration"/>

    <flow name="apache-kafka-producer">
        <http:listener config-ref="HTTP_Listener_Configuration" path="/kafka" allowedMethods="POST" doc:name="HTTP"/>
        <logger message="Message Published : #[payload]" level="INFO" doc:name="Logger"/>
        <apachekafka:producer config-ref="Apache_Kafka__Configuration" topic="muleesb"
            key="#[server.dateTime.getMilliSeconds()]" doc:name="Apache Kafka"/>
    </flow>

    <flow name="apache-kafka-consumer">
        <apachekafka:consumer config-ref="Apache_Kafka__Configuration" topic="muleesb" partitions="1"
            doc:name="Apache Kafka (Streaming)"/>
        <logger message="Message Consumed : #[payload]" level="INFO" doc:name="Logger"/>
        <file:outbound-endpoint path="src/test/resources/consumer" responseTimeout="10000" doc:name="File"/>
    </flow>
</mule>

19. Connecting Apache Kafka With Mule ESB 11.0 Testing You can use Postman to test the application. Send a POST request to the producer flow and it will publish the message to Apache Kafka. Once the message is published, it will be consumed by the consumer flow, which saves the message to the specified directory. For more details on testing, please watch the demonstration video that accompanies this slide deck.
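If you prefer the command line to Postman, an equivalent test request against the flow above would be the following (the endpoint comes from the listener configuration in section 10.0; the message body is arbitrary):

curl -X POST -d "Hello from Mule ESB" http://localhost:8081/kafka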

20. Connecting Apache Kafka With Mule ESB 12.0 Useful Apache Kafka Commands
• Start Zookeeper: zkserver
• Start Apache Kafka: .\bin\windows\kafka-server-start.bat .\config\server.properties
• Start Producer: kafka-console-producer.bat --broker-list localhost:9092 --topic topicName
• Start Consumer: kafka-console-consumer.bat --zookeeper localhost:2181 --topic topicName
• Create Topic: kafka-topics.bat --create --zookeeper localhost:2181 --replication-factor 1 --partitions 1 --topic topicName
• List Topics: kafka-topics.bat --list --zookeeper localhost:2181
• Describe Topic: kafka-topics.bat --describe --zookeeper localhost:2181 --topic topicName
• Consume messages from beginning: kafka-console-consumer.bat --zookeeper localhost:2181 --topic topicName --from-beginning
• Delete Topic: kafka-run-class.bat kafka.admin.TopicCommand --delete --topic topicName --zookeeper localhost:2181

21. Connecting Apache Kafka With Mule ESB 13.0 Conclusion Apache Kafka is a very powerful distributed, scalable and durable message queuing system. Mule ESB provides the Apache Kafka connector, which can publish messages to a Kafka server and consume messages from it (i.e. it can act as a producer as well as a consumer).

22. Thank You.
