Kafka message visualization using Spotfire and Live Datamart
Hi Team,
Is a Kafka connector available for visualization using the TIBCO Spotfire tool? If yes, kindly send me samples of how to do this ASAP.
Thanks,
Arumugam D
(2) Answers
Hi Team,
Thanks for your quick reply. I need the StreamBase software (.exe) and the step-by-step installation process.
Thanks
Arumugam D
1 Comment
- Download StreamBase (it includes Live Datamart in the same installer): https://edelivery.tibco.com/storefront/en/eval/tibco-streambase/prod11032.html
- Install StreamBase, assuming you are on Windows, which is very likely given your use of the term .exe and your use of Spotfire: http://docs.streambase.com/latest/topic/com.streambase.sb.ide.help/data/html/install/installonwindows.html
- Get and install the Custom Data Source for TIBCO Spotfire® to Read Data from TIBCO® Live Datamart
- Start StreamBase Studio
- File > New > StreamBase Project. Follow the wizard prompts.
- Select the name of your project in the Package Explorer view.
- Right-click > Open LiveView Project View
- In the LiveView Project Viewer, right-click > New Data Table. Follow the wizard to create a data table whose fields correspond to the fields in your data that you want Spotfire to be able to get. You must define a primary index, so hopefully there's a natural one in your data.
- In the LiveView Project Viewer, select the Data Table icon for your table.
- Right-click > Add > Publisher > Next > Apache Kafka (optionally support recovery) > Finish
- You may have to edit the Kafka<mytablename>Publisher.sbapp to have the KafkaConsumer deserialize the messages into StreamBase tuples appropriately. You might have to learn some StreamBase EventFlow in order to do this; see the sketch after this list for the gist of that mapping.
- Select the name of your project in the Package Explorer view.
- Right-click > Run As > LiveView Project
- Unzip the live_datamart_custom_data_source_vx.x.x.x.zip file
- Install and Configure as described in Live Datamart Custom Data Source vx.x.x.x > Documentation > Custom Data Source User Guide.pdf
- Run Spotfire.
- Start building out your dxp: File > Open From > Live Datamart Snapshot...
- Have fun!
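To make the deserialization step above concrete, here's a minimal sketch in plain Java of the kind of mapping the generated Publisher has to perform, written against the standard Kafka client rather than the StreamBase adapter API. The broker address, topic name, and CSV payload with symbol/price/ts fields are all assumptions for illustration:

```java
import java.time.Duration;
import java.util.Collections;
import java.util.Properties;

import org.apache.kafka.clients.consumer.ConsumerRecord;
import org.apache.kafka.clients.consumer.ConsumerRecords;
import org.apache.kafka.clients.consumer.KafkaConsumer;

public class KafkaToTupleSketch {
    public static void main(String[] args) {
        Properties props = new Properties();
        props.put("bootstrap.servers", "localhost:9092"); // placeholder broker URI
        props.put("group.id", "liveview-publisher-demo"); // hypothetical consumer group
        props.put("key.deserializer",
                "org.apache.kafka.common.serialization.StringDeserializer");
        props.put("value.deserializer",
                "org.apache.kafka.common.serialization.StringDeserializer");

        try (KafkaConsumer<String, String> consumer = new KafkaConsumer<>(props)) {
            consumer.subscribe(Collections.singletonList("quotes")); // placeholder topic
            while (true) {
                ConsumerRecords<String, String> records =
                        consumer.poll(Duration.ofMillis(500));
                for (ConsumerRecord<String, String> record : records) {
                    // Assumed CSV payload "symbol,price,ts"; split it into typed
                    // fields that mirror the LiveView data table schema.
                    String[] parts = record.value().split(",");
                    String symbol = parts[0];                    // primary-index field
                    double price = Double.parseDouble(parts[1]);
                    long ts = Long.parseLong(parts[2]);
                    System.out.printf("tuple: symbol=%s price=%.2f ts=%d%n",
                            symbol, price, ts);
                }
            }
        }
    }
}
```

In the generated EventFlow application the same mapping would be expressed with adapter properties and operators rather than hand-written consumer code, but the field-by-field translation is the part you own.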

Hello,
There is no out-of-the-box connector available for Kafka from Spotfire directly yet, but you can use it via TIBCO StreamBase:
StreamBase offers built-in adapters for Kafka. These adapters allow you to publish and consume messages with minimal effort and also have extension points to support non-standard message payload and header (key).
Here is documentation:
https://docs.tibco.com/pub/streambase_cep/7.6.5/doc/html/adaptersguide/embeddedInputKafkaConsumer.html
StreamBase and Spotfire apply to different usage scenarios: StreamBase processes data as it arrives, while Spotfire at best can reload data on infrequent triggers.
The natural way of integrating StreamBase and Spotfire is Live Datamart. LDM stores the real-time data in tabular form suitable for consumption in Spotfire (there is a connector available), and StreamBase coordinates the data flow; for example, it can populate a table in LDM with data from Kafka, which can then be loaded into a Spotfire visualization.
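Before wiring up Spotfire, it can help to push a few test messages into the topic the StreamBase adapter will consume. Here is a minimal sketch using the standard Kafka Java client; the broker address, topic name, and payload format are assumptions matching the consumer sketch earlier in this thread:

```java
import java.util.Properties;

import org.apache.kafka.clients.producer.KafkaProducer;
import org.apache.kafka.clients.producer.ProducerRecord;

public class TestMessagePublisher {
    public static void main(String[] args) {
        Properties props = new Properties();
        props.put("bootstrap.servers", "localhost:9092"); // placeholder broker URI
        props.put("key.serializer",
                "org.apache.kafka.common.serialization.StringSerializer");
        props.put("value.serializer",
                "org.apache.kafka.common.serialization.StringSerializer");

        try (KafkaProducer<String, String> producer = new KafkaProducer<>(props)) {
            // Assumed CSV payload "symbol,price,ts" matching the LiveView table schema.
            producer.send(new ProducerRecord<>("quotes", "IBM",
                    "IBM,143.25," + System.currentTimeMillis()));
            producer.flush(); // make sure the message is sent before exiting
        }
    }
}
```

If the message shows up as a row in the LDM table, the Kafka-to-LDM leg of the pipeline is working and you can move on to the Spotfire side.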
1 Comment
Hi,
As can be inferred from the list of steps I posted in a comment on another answer to this question, somewhat easier than using the StreamBase Kafka Consumer Input Adapter directly in a StreamBase application is to generate a Live Datamart Embedded Publisher application using Apache Kafka as a data provider. There's a wizard in StreamBase Studio to walk you through the process of creating the Publisher. You have to edit the generated EventFlow application to specify how to deserialize the specific Kafka message contents into StreamBase tuples that correspond to the LiveView data table schema's fields, and also configure the Kafka Consumer's parameters with the broker URI and the topic name. See my other comment for some of the details.
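As a quick sanity check before filling in those adapter parameters, you can verify the broker URI and topic name with a few lines against the standard Kafka Java client (the address and topic below are placeholders):

```java
import java.util.Properties;

import org.apache.kafka.clients.consumer.KafkaConsumer;
import org.apache.kafka.common.PartitionInfo;

public class BrokerCheck {
    public static void main(String[] args) {
        Properties props = new Properties();
        props.put("bootstrap.servers", "localhost:9092"); // the broker URI you plan to give the adapter
        props.put("key.deserializer",
                "org.apache.kafka.common.serialization.StringDeserializer");
        props.put("value.deserializer",
                "org.apache.kafka.common.serialization.StringDeserializer");

        try (KafkaConsumer<String, String> consumer = new KafkaConsumer<>(props)) {
            // Lists the topic's partitions; helps confirm the broker is reachable
            // and the topic name is spelled correctly before configuring the adapter.
            for (PartitionInfo p : consumer.partitionsFor("quotes")) { // placeholder topic
                System.out.println(p);
            }
        }
    }
}
```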
