Integration with Solace Pub-Sub
What is a Solace queue?
A queue acts as both a destination where clients can publish messages to and as an endpoint that clients can bind consumers to and consume messages from. A queue is typically used in a point-to-point (PTP) messaging environment.
It’s also possible to add topic subscriptions to a queue so messages published to matching topics are delivered to the queue. (For more information, refer to Topic Subscriptions.) Therefore, it is also possible to use queues in a publish and subscribe (Pub/Sub) model.
What are the advantages of using queues?
- Queues remove direct dependencies between systems and help us decouple applications.
- Queues enable asynchronous communication: producers can add requests to the queue without waiting for them to be processed.
- In case of a system failure, the queue can store the data that still needs to be processed and deliver it later.
- By using queues, we can achieve event-driven integration.
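The decoupling and asynchronous-communication points above can be sketched with Python's standard-library queue (this is not the Solace API, just an illustration of the idea): the producer enqueues requests and returns immediately, while the consumer drains the queue at its own pace.

```python
import queue
import threading

# The producer enqueues work and moves on; the consumer processes
# independently, even if it runs at a different speed.
q = queue.Queue()
results = []

def producer():
    # The producer does not wait for processing -- it just enqueues.
    for i in range(5):
        q.put(f"request-{i}")

def consumer():
    while True:
        item = q.get()
        if item is None:          # sentinel: no more work
            break
        results.append(item.upper())
        q.task_done()

t = threading.Thread(target=consumer)
t.start()
producer()
q.put(None)                       # signal the consumer to stop
t.join()

print(results)  # ['REQUEST-0', 'REQUEST-1', 'REQUEST-2', 'REQUEST-3', 'REQUEST-4']
```

A broker like Solace plays the role of `q` here, but persists the queue outside both processes, so producer and consumer survive each other's restarts.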
Difference between Solace queue and Atom queue
| Solace Queue | Atom Queue |
| --- | --- |
| Solace provides its own user interface where we can manage our queues. | We need to manage the queues from Queue Management in the AtomSphere platform. |
| The Solace broker runs independently of the Atom. | It is dependent on the Atom; if the Atom goes down, the queue will not work. |
| We can customize the retry count; there is no limit on retries. | We have at most 6 retries. |
| We can configure the dead letter queue and change its name as we like. | We can't customize the configuration of the Atom's dead letter queue. |
Let's look at how we can integrate Solace PubSub+ Cloud with Boomi:
Step 1. First, we need to create an account in Solace Cloud. (To create an account, you can follow this link > Solace-Signup)
Step 2. After clicking on the link, you will be redirected to this page.
Now you need to fill in all the fields (fields marked with * are mandatory).
Step 3. After filling in all the fields, it should look like this:
Step 4. After signing up you will see the home page, now you need to click on ‘Cluster Manager’ to create one service.
Step 5. Click on ‘Create Service’ to create your own service.
Step 6. Create Service page will populate.
Note: In my case, I have already created one service with the 'Developer' service type, so an error is showing; when you create the service for the first time, you will not get this error.
Step 7. Give a Service Name, choose a Service Type and choose a Cloud.
Step 8. Now click on the cloud to choose the region for your cloud.
Step 9. After choosing the region, click on the 'Create Service' button, and your service will be created successfully.
Step 10. Go to the Cluster Manager tab again and click on the service that you just created.
Step 11.1. Go to the ‘Manage’ tab.
Step 11.2. Click on ‘Queues’.
Step 12. The Queues page will open; now click on the '+ Queue' button to create a queue.
Step 13. The ‘Create Queue’ window will open, now give the queue name and click on ‘Create’.
Step 14. Your queue(‘test’) is created successfully.
Now go to the Boomi AtomSphere platform to start integrating Solace with Boomi.
Step 15. Go to platform.boomi.com
Step 16. Login by giving your valid credentials (Username and Password).
Step 17. Home page will be displayed, now click on ‘Integration’.
Step 18. Click on ‘Create New’ button.
Step 19. Select ‘Process’ and click on ‘Create’.
Step 20. Choose the Start shape type as 'Connector' and choose the connector as 'Disk V2'.
Step 21. Click on ‘+’ icon in the Connection to configure the connection.
Step 22. Give the connection name as 'Disk', in the directory text field give the directory path from where you want to read the file, and finally click 'Save and Close'.
Step 23. Select action as “Get” and Click on “+” icon in Operation to configure operation.
Step 24. Give the operation name as “Read file” and click on “Save and Close”.
Step 25. Click on Parameter, then click on the '+' button.
Step 26. Choose Input as ID, and give the file name in Static value field. Now click on “Ok”.
Step 27. It should look like this, now click on “Ok”.
Step 28. Give the process name as “Send emp data to test queue”.
Step 29. Drag and drop one map shape to process canvas. Click on “+” button to create a map component.
Step 30. Choose the source profile as XML and import the Employee profile.
Step 31. Now click on “Ok”.
Step 32. Likewise, choose a flat file emp profile as the destination. Click on the '+' icon to create a flat file profile.
Step 33. After creating the flat file profile, map all the fields and click on 'Save and Close'.
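The map configured above turns the Employee XML into a flat file line by line. A plain-Python sketch of that transformation is below; the field names (id, name, dept) and the sample records are assumptions for illustration only, not the actual fields of your Employee profile.

```python
import csv
import io
import xml.etree.ElementTree as ET

# Hypothetical Employee XML -- your real profile's elements will differ.
employee_xml = """
<Employees>
  <Employee><id>101</id><name>Asha</name><dept>HR</dept></Employee>
  <Employee><id>102</id><name>Ravi</name><dept>IT</dept></Employee>
</Employees>
"""

def xml_to_flatfile(xml_text: str) -> str:
    """Map each <Employee> element to one comma-separated line."""
    root = ET.fromstring(xml_text)
    out = io.StringIO()
    writer = csv.writer(out)
    writer.writerow(["id", "name", "dept"])  # flat file header
    for emp in root.findall("Employee"):
        writer.writerow([emp.findtext(f) for f in ("id", "name", "dept")])
    return out.getvalue()

print(xml_to_flatfile(employee_xml))
```

In Boomi this field-by-field mapping is configured visually in the Map shape rather than in code; the sketch only shows the shape of the data before and after.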
Step 34. Now drag and drop a Solace PubSub+ connector. Click on the '+' button in Connection to configure the connection.
Step 35. Now you need to provide the Username, Password, API token, etc. to establish the connection. For these details, you need to go back to the Solace console page where we created the service and queue.
Step 36. Go to the Solace console page > click on "Cluster Manager" > click on the service which we created previously (refer to Step 10), go to the "Connect" tab and click on "Solace Messaging".
Step 37. Copy Username, Password, Message VPN and SMF Host.
Step 38. Now click on User & Account and click on "Token Management" to get the API token.
Step 39. Click on “Create Token”.
Step 40. Give a name to the Token and click on “Create Token” button.
Step 41. One API Token window will open, click on the ‘Copy’ button to copy the Token.
Step 42. Come back to the platform > Fill all connection details in the connector connection and hit “Test Connection” button to test the connection.
Step 43. Select the Atom and click on next.
Step 44. If the Success message displays then the connection is configured successfully, now click on “Finish” and click on “Save and Close”.
Step 45. Set the action to "Send" and click on '+' in the Operation to configure the operation.
Step 46. – Select Mode as "Persistent Transacted".
– As we are using a queue, select "Endpoint type" as "Queue".
– Give the queue name in the "Destination" (we have already created a queue previously; refer to Step 12).
– Now click "Save and Close".
Solace Modes
Direct:
The connector listens to one or many topic subscriptions. Events are read from the event broker and submitted to the process in a fire-and-forget fashion. While this mode offers the highest performance, the broker does not preserve events while the Boomi process is disconnected, and if the Boomi process fails, the events are lost.
Direct mode is typically used when out-of-date events are not valuable. For example, if events contain stock prices and an updated event arrives every five seconds, rather than processing old events, the process simply waits for a new event to arrive.
Persistent Non-Transacted:
Instead of listening to a topic, the connector listens to a queue that holds events that match the topic(s). The benefit of using a queue is that the broker preserves events even when the Boomi process disconnects, allowing the Boomi process to handle the events when it reconnects.
The broker removes the event from the queue as soon as it is picked up by the Boomi process. This increases performance over Persistent Transacted, but also means that events may be lost in the event of an unexpected Boomi process or Atom failure.
Persistent Transacted:
Similar to Persistent Non-Transacted, the connector listens to a queue that holds events that match the topic(s). The broker preserves events even when the Boomi process disconnects. However, in contrast to Persistent Non-Transacted, the broker only removes the event once the Boomi process completes successfully.
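The difference between the two persistent modes can be illustrated with a toy simulation (this is not the Solace API): the consumer crashes while handling the third event, and we check which events remain on the broker afterwards.

```python
# Simulate when the broker removes an event from the queue under each
# persistent mode, given a consumer that crashes on a specific event.
def deliver(events, mode, crash_on):
    broker_queue = list(events)   # events buffered on the broker
    processed = []
    while broker_queue:
        event = broker_queue[0]
        if mode == "persistent-non-transacted":
            broker_queue.pop(0)   # removed as soon as it is picked up
        try:
            if event == crash_on:
                raise RuntimeError("process failure")
            processed.append(event)
        except RuntimeError:
            break                 # the Boomi process / Atom goes down here
        if mode == "persistent-transacted":
            broker_queue.pop(0)   # removed only after successful processing
    return processed, broker_queue

events = ["e1", "e2", "e3", "e4"]
_, left_nt = deliver(events, "persistent-non-transacted", crash_on="e3")
_, left_tx = deliver(events, "persistent-transacted", crash_on="e3")

print(left_nt)  # ['e4']        -- e3 was removed on pickup, so it is lost
print(left_tx)  # ['e3', 'e4']  -- e3 stays on the broker for redelivery
```

In Direct mode there is no broker-side queue at all, so any events published while the process is disconnected are simply never seen.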
Step 47. After configuring everything it should look like this. Now click on “Ok”.
Step 48. Drag and drop one Stop shape at the last.
- We have developed this process to send Employee data to the ‘test’ queue.
- Now we will develop one listener process to fetch this Employee data from the ‘test’ queue.
Step 49. Create one more process.
Step 50. Select the Start shape as Connector type and choose the connector as Solace PubSub+ connector.
Step 51. Click on "Make the recommended changes for me" to disable Capture Run Dates. After that, select the connection (the Solace connection) which we previously configured (refer to Step 35).
Step 52. Action should be “Listen”, now click on ‘+’ in Operation to configure the operation.
Step 53. Select Mode as ‘Persistent Transacted’ and give the queue name to be listened (test) in “Destination”.
Now click on “Save and Close”.
Step 54. It should look like this. Now click “Ok”.
Step 55. Drag and drop one Disk V2 connector.
Step 56. Choose the connection that we have already created; the action should be "Create". Now click on the '+' icon in Operation to configure an operation.
Step 57. Now click on “Import”.
Step 58. Choose your Atom and click on Next.
Step 59. Choose File and click on Next.
Step 60. Now click on Finish.
Step 61. Now click on “Save and Close”.
Step 62. It should look like this, now click on “Ok”.
Step 63. Drag and drop a Set Properties shape to give the name of the file which we are creating on our disk.
Step 64. Click on ‘+’ to choose one property.
Step 65. The property type should be Document Property, the source type is Connectors, select the Disk V2 connector (as we are using Disk V2), and the property is File Name. Now hit "Ok".
Step 66. Now give the file name in Property Value.
Step 67. Give the File name and click on “Ok”.
Step 68. Drag and drop one stop shape at last.
- Now our listener process is done. Now we need to deploy this listener process.
Step 69. Click on "Create Packaged Component".
Step 70. Click on "Next: Add Details".
Step 71. Click on Create Packaged Component.
Step 72. Click on “Deploy”.
Step 73. Choose your Environment and click on “Next: Select Versions”.
Step 74. After that click on “Next: Review”, then click on “Deploy”. Your listener process will be deployed successfully.
Step 75. Now execute the previously developed process (Send emp data to test queue) once.
When we execute this process, our listener process will run automatically and create a file on our local disk.
Step 76. To check the execution of the listener process go to the process reporting.
I have executed the process 4 times, so 4 executions are shown, and the CSV file has been created on our disk.