  Flogo Use-Case - Event-based Integration


    JenVay Chong

    Preface

    TIBCO Integration consists of a broad range of capabilities served by several products, namely TIBCO BusinessWorks 5, TIBCO BusinessWorks 6/CE, and TIBCO Flogo. Each of these products can handle a broad range of use-cases on its own, so it should not be a surprise that many use-cases can be implemented in more than one of them. However, each product is in the stack for a reason: each serves certain use-cases better than the others, and we should understand when to use which. In this series of articles, we will take a deep dive into some of the use-cases best suited for TIBCO Flogo, including Web API-based Integration, Messaging/Event-based Integration, Data-as-a-Service Architecture, Simple Event Processing, Stream Event Processing, Standard Protocol-based IoT Integration, Edge Processing, API Ecosystem for Mobile/Web, and Mobile/Web Event-driven Integration.

    Event-based Integration

    The focus of this article is event-based integration. There are many messaging systems in the market today, each serving its own target use-cases and audiences, and TIBCO Flogo has connectors to many of them. In an event-based architecture, it is common to share a single event with many different consumers, each needing the event and its associated data to perform its specific function. In large enterprises it is also common to have several different types of messaging systems operating at the same time.

    In this scenario, consider the following requirements:

    • You have multiple types of messaging systems across which you need to share events. They cannot be consolidated because certain clients can only communicate with certain types of messaging systems.
    • Some modern messaging systems provide connectors or bridges to other messaging systems, but each system works with a different message representation/format, and the connectors or bridges often transform messages in a very specific, fixed way. You want control over how messages are transformed between the messaging systems.
    • The connectors and/or bridges currently available have limited capabilities when it comes to scaling, performance, and observability.

    Flogo is extremely well suited to handle scenarios like these. 

    • It has connectors to Apache Pulsar, Apache Kafka, TIBCO EMS, TIBCO eFTL, MQTT, Azure Service Bus, Amazon SNS, and Amazon SQS, to name a few, with more to come.
    • It gives you full control over how the message is transformed as it is consumed from one messaging system and published to another.
    • It can be scaled up or down, is extremely lightweight and performant, and provides observability out of the box.

    Apache Kafka to Apache Pulsar Implementation

    To illustrate the implementation of a scenario like this, we will use Flogo to consume messages from Apache Kafka and propagate them to Apache Pulsar.

    In this scenario, we assume the enterprise message schema below is used across all the messaging systems. A sample payload is shown below.

    {
      "environmentCode": "DEV",
      "messageId": "ABC-123",
      "correlationid": "XYZ-123",
      "receivedDateTime": "2024-01-01 12:00:00",
      "messageProperty": [ {"name": "CustomerId", "value": "A123"} ],
      "messagePayload": [ {"type": "JSON", "payload": "..."} ]
    }

    This enterprise schema can be defined in the SCHEMAS section of the Flogo flow. Note that when you enter the sample payload into the schema text box in Flogo, it is automatically converted into a JSON schema on exit.

    [Image: Sample payload entered in the SCHEMAS section and the resulting JSON schema]
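
    For a typed view of this envelope, the sketch below shows one possible Go representation of the sample payload above. The struct and field names here are illustrative assumptions, not something Flogo generates or requires.

    package envelope

    // MessageEnvelope is a hypothetical typed view of the enterprise message schema
    // shown in the sample payload above.
    type MessageEnvelope struct {
        EnvironmentCode  string            `json:"environmentCode"`
        MessageID        string            `json:"messageId"`
        CorrelationID    string            `json:"correlationid"`
        ReceivedDateTime string            `json:"receivedDateTime"`
        MessageProperty  []MessageProperty `json:"messageProperty"`
        MessagePayload   []MessagePayload  `json:"messagePayload"`
    }

    // MessageProperty is a name/value pair carried alongside the payload.
    type MessageProperty struct {
        Name  string `json:"name"`
        Value string `json:"value"`
    }

    // MessagePayload carries the business payload and its format.
    type MessagePayload struct {
        Type    string `json:"type"`
        Payload string `json:"payload"`
    }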

     

    This scenario requires us to connect to both Apache Kafka and Apache Pulsar. To do this, we define two connections in the CONNECTIONS section of the flow.

    [Image: KafkaConnection and PulsarConnection defined in the CONNECTIONS section]

     

    The KafkaConnection allows the configuration of connection parameters to Apache Kafka.

    [Image: KafkaConnection configuration]

    Also note the supported Auth Modes.

    [Image: Auth Modes supported by the Kafka connection]

     

    The PulsarConnection allows the configuration of connection parameters to Apache Pulsar.

    [Image: PulsarConnection configuration]

    Also note the supported authorization types.

    [Image: Authorization types supported by the Pulsar connection]

     

    Connection properties that may differ between environments are also automatically created in the APP PROPERTIES section of the flow.

    [Image: Automatically created connection properties in APP PROPERTIES]

     

    The image below shows a simple flow that accomplishes this use-case. A Kafka Consumer trigger consumes the message from a Kafka topic, the consumed message is then propagated to an Apache Pulsar Producer activity, and the message is finally marked as consumed using the Kafka Commit Offset activity. The flow also has optional Log Message activities to log the start and end of each message propagation.

    [Image: The Kafka to Pulsar flow]
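
    To relate the flow above to client-level code, here is a minimal, hypothetical Go sketch of the same consume-transform-produce-commit sequence written directly against the open-source segmentio/kafka-go and apache/pulsar-client-go clients. The broker addresses, topic names, and group ID are placeholders, and Flogo does not generate this code; the Kafka Consumer trigger, Pulsar Producer activity, and Kafka Commit Offset activity implement the equivalent behavior for you.

    package main

    import (
        "context"
        "log"

        "github.com/apache/pulsar-client-go/pulsar"
        kafka "github.com/segmentio/kafka-go"
    )

    func main() {
        ctx := context.Background()

        // Kafka consumer (placeholder broker, topic, and consumer group).
        reader := kafka.NewReader(kafka.ReaderConfig{
            Brokers: []string{"localhost:9092"},
            GroupID: "kafka2pulsar-bridge",
            Topic:   "enterprise.events",
        })
        defer reader.Close()

        // Pulsar client and producer (placeholder service URL and topic).
        client, err := pulsar.NewClient(pulsar.ClientOptions{URL: "pulsar://localhost:6650"})
        if err != nil {
            log.Fatal(err)
        }
        defer client.Close()

        producer, err := client.CreateProducer(pulsar.ProducerOptions{Topic: "enterprise.events"})
        if err != nil {
            log.Fatal(err)
        }
        defer producer.Close()

        for {
            // 1. Consume the next message; FetchMessage does not commit the offset.
            msg, err := reader.FetchMessage(ctx)
            if err != nil {
                log.Fatal(err)
            }

            // 2. Transform the envelope here if needed (the Flogo flow does this in its mapper).
            payload := msg.Value

            // 3. Publish to Pulsar.
            if _, err := producer.Send(ctx, &pulsar.ProducerMessage{Payload: payload}); err != nil {
                log.Printf("publish failed, offset not committed: %v", err)
                continue
            }

            // 4. Commit the Kafka offset only after the publish succeeds.
            if err := reader.CommitMessages(ctx, msg); err != nil {
                log.Printf("offset commit failed: %v", err)
            }
        }
    }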

     

    The Kafka Consumer Trigger allows you to configure the settings and behaviors of this consumer such as the Kafka Connection, the topic name, the consumer group ID, the Value Deserializer (String, JSON, Avro), Initial Offset (Newest, Oldest) and others.

    [Image: Kafka Consumer Trigger configuration]

     

    Likewise, the Apache Pulsar Producer Activity allows you to configure the settings and behavior of this activity. Notable settings include the Pulsar Connection, the Topic name, Compression Type (LZ4, ZLIB, ZSTD), and Message Format (String, JSON).

    [Image: Apache Pulsar Producer activity configuration]

     

    Mapping the Kafka message information over to the Pulsar message follows the same method and rules as mapping any other Flogo keys and properties.

    [Image: Mapping the Kafka message to the Pulsar message]
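
    Conceptually, the mapping step does what this hypothetical Go helper does: decode the consumed Kafka value, adjust whichever envelope fields need to change, and re-encode the result as the Pulsar payload. The function name and the example transformation are assumptions for illustration only.

    package bridge

    import (
        "encoding/json"
        "time"
    )

    // mapEnvelope decodes the consumed Kafka value, updates the fields the target
    // system needs, and re-encodes the envelope as the Pulsar payload.
    func mapEnvelope(kafkaValue []byte) ([]byte, error) {
        var envelope map[string]interface{}
        if err := json.Unmarshal(kafkaValue, &envelope); err != nil {
            return nil, err
        }
        // Example transformation: stamp the time the bridge handled the message.
        envelope["receivedDateTime"] = time.Now().Format("2006-01-02 15:04:05")
        return json.Marshal(envelope)
    }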

     

    To test this Kafka-to-Pulsar flow, we have also created an Apache Kafka Producer flow and an Apache Pulsar Consumer flow.

    Apache Kafka Producer

    The Apache Kafka Producer flow acts as the first step in the test, producing messages into an Apache Kafka topic. It is triggered by a simple repeating timer and uses the Kafka Producer activity.

    [Image: The Apache Kafka Producer flow]

    The Kafka Producer activity is configured to use the same Kafka Connection, with the MessageEnvelope JSON schema mapped using hard-coded values.

    [Image: Kafka Producer activity configuration and mapping]
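
    As a point of comparison, a hypothetical stand-alone Go producer that publishes the same hard-coded envelope to a Kafka topic (placeholder broker and topic, using segmentio/kafka-go) might look like this; the Flogo flow achieves the same result with its repeating timer and Kafka Producer activity.

    package main

    import (
        "context"
        "log"

        kafka "github.com/segmentio/kafka-go"
    )

    func main() {
        // Hard-coded MessageEnvelope mirroring the sample payload used by the test flow.
        envelope := []byte(`{
          "environmentCode": "DEV",
          "messageId": "ABC-123",
          "correlationid": "XYZ-123",
          "receivedDateTime": "2024-01-01 12:00:00",
          "messageProperty": [ {"name": "CustomerId", "value": "A123"} ],
          "messagePayload": [ {"type": "JSON", "payload": "..."} ]
        }`)

        // Placeholder broker address and topic name.
        writer := &kafka.Writer{
            Addr:  kafka.TCP("localhost:9092"),
            Topic: "enterprise.events",
        }
        defer writer.Close()

        if err := writer.WriteMessages(context.Background(), kafka.Message{Value: envelope}); err != nil {
            log.Fatal(err)
        }
    }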


     

    Apache Pulsar Consumer

    The Apache Pulsar Consumer flow is used in the last step of the test, where it consumes the messages that have been propagated by the Kafka-to-Pulsar flow. It also acknowledges receipt of each message after consumption.

    [Image: The Apache Pulsar Consumer flow]

    The configuration of the Apache Pulsar Consumer is below.

    [Image: Apache Pulsar Consumer Trigger configuration]
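
    For reference, an equivalent stand-alone consumer written against the open-source apache/pulsar-client-go client would subscribe, receive, and acknowledge in the same way. The sketch below is hypothetical, with a placeholder service URL, topic, and subscription name.

    package main

    import (
        "context"
        "log"

        "github.com/apache/pulsar-client-go/pulsar"
    )

    func main() {
        ctx := context.Background()

        // Placeholder service URL.
        client, err := pulsar.NewClient(pulsar.ClientOptions{URL: "pulsar://localhost:6650"})
        if err != nil {
            log.Fatal(err)
        }
        defer client.Close()

        // Placeholder topic and subscription name.
        consumer, err := client.Subscribe(pulsar.ConsumerOptions{
            Topic:            "enterprise.events",
            SubscriptionName: "kafka2pulsar-test",
            Type:             pulsar.Shared,
        })
        if err != nil {
            log.Fatal(err)
        }
        defer consumer.Close()

        for {
            // Receive the next propagated message and acknowledge it after processing.
            msg, err := consumer.Receive(ctx)
            if err != nil {
                log.Fatal(err)
            }
            log.Printf("received: %s", string(msg.Payload()))
            consumer.Ack(msg)
        }
    }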

     

    The code, written in Flogo 2.25.0 using the Visual Studio Code Extension, is provided below for your reference.

    KafkaProducer.flogo PulsarConsumer.flogo Kafka2Pulsar.flogo

