
MQTT Sparkplug - Pushing data out of GeoSCADA

EcoStruxure Geo SCADA Expert Forum

Find out how SCADA systems and networks, like EcoStruxure Geo SCADA Expert, help industrial organizations maintain efficiency and process data for smarter decision-making with IoT, RTU and PLC devices.

MikeDash
Ensign

MQTT Sparkplug - Pushing data out of GeoSCADA

MQTT works great for getting data into GeoSCADA. I have followed @sbeadle's excellent videos for setting up MQTT.
I would like to know what the options are for using MQTT Sparkplug B to get data out of GeoSCADA.

We would like to have data coming into GeoSCADA from field devices (PLCs, Modbus, etc), and then have it pushed out of GeoSCADA into a custom historian, via MQTT Sparkplug.

Would this be possible with a modified version of the Sparkplug DDK?
Can DDK drivers be event-based (triggered when new data arrives)?
We want to be able to handle millions of incoming points, so polling isn't a good option.
 

Any advice or direction would be much appreciated.
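For reference, the topic namespace that any Sparkplug B publisher (the DDK or a custom one) would use is fixed by the Sparkplug specification. Here is a minimal sketch in Python; the helper name and the group/node/device IDs are illustrative, and real Sparkplug B payloads are protobuf-encoded, which is omitted here:

```python
# Sketch: building Sparkplug B topic strings per the spBv1.0 namespace.
# Group/edge-node/device IDs below are illustrative placeholders.

VALID_MESSAGE_TYPES = {
    "NBIRTH", "NDEATH", "DBIRTH", "DDEATH",
    "NDATA", "DDATA", "NCMD", "DCMD",
}

def sparkplug_topic(group_id, message_type, edge_node_id, device_id=None):
    """Return a Sparkplug B topic: spBv1.0/<group>/<type>/<node>[/<device>]."""
    if message_type not in VALID_MESSAGE_TYPES:
        raise ValueError(f"unknown Sparkplug message type: {message_type}")
    parts = ["spBv1.0", group_id, message_type, edge_node_id]
    if device_id is not None:
        parts.append(device_id)
    return "/".join(parts)

# Example: device data published under an edge node
print(sparkplug_topic("GeoSCADA", "DDATA", "Node1", "PLC42"))
# spBv1.0/GeoSCADA/DDATA/Node1/PLC42
```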
5 Replies
MikeDash
Ensign

Re: MQTT Sparkplug - Pushing data out of GeoSCADA

While looking at the other example projects, I found this project on GitHub:

 

https://github.com/GeoSCADA/Sample-DataFeeder

 

"The feeder uses the Advanced section of the client, which adds ‘on change’ functionality to points. In other words, if a point does not change then the data export process does not waste time interrogating the database and writing out unnecessary data."

 

"The purpose of the Data Feeder is to give a reasonably efficient way of exporting point data (value, quality and timestamps) to another system external to Geo SCADA."

 

@sbeadle do you think this will be scalable (to 1M+ tags) if I change the MQTT output from JSON to Sparkplug?

 

Thank you for all the excellent examples and great documentation & videos!

du5tin
Lieutenant

Re: MQTT Sparkplug - Pushing data out of GeoSCADA

We have taken the DataFeeder code, added onto it, and plan to release it as a supported product in the near future. SP-B is one of our planned outputs for the future.

 

Scalability... I think this is less a function of the protocol/data format and more about the source (Geo SCADA server) and destination (Event Hub/Time Series Insights, for example, can only ingest so much so fast). The DataFeeder code takes a fairly conservative approach to loading up the different parts of the system.
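That conservative approach can be sketched as chunked exporting: process changed points in fixed-size chunks with an optional pause between them, so neither the server nor the destination is hit with one huge burst. The chunk size and pause below are illustrative tuning knobs, not product values:

```python
# Sketch of conservative loading: export point ids in fixed-size chunks,
# pausing between chunks so the source server and the destination
# (e.g. an event hub) never see one huge burst.
import time

def export_in_chunks(point_ids, send_batch, chunk_size=500, pause_s=0.0):
    """Pass point_ids to send_batch in chunks; return the chunk count."""
    chunks = 0
    for i in range(0, len(point_ids), chunk_size):
        send_batch(point_ids[i:i + chunk_size])
        chunks += 1
        if pause_s:
            time.sleep(pause_s)  # back off between chunks
    return chunks

sent = []
n = export_in_chunks(list(range(1200)), sent.append, chunk_size=500)
print(n, [len(c) for c in sent])   # 3 [500, 500, 200]
```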

 

If you want assistance with setting something up feel free to reach out privately. I am willing to share some of the things we have learned along the way. 

sbeadle
Janeway

Re: MQTT Sparkplug - Pushing data out of GeoSCADA

Exporting 1M points would be a challenge (it depends on the data rate too)! At some point the Data Feeder sample code becomes limited, because:

a) The check interval would have to be quite long, as the system would get bogged down processing data reads. You might need a 15- or 30-minute (or longer) interval.

b) On startup the feeder checks and sets on-change in a call for every point. This would take a long time to get going from a start or changeover; it could take an hour to start.

Both of the above would load up the server, and you might then have to use a dedicated Standby-Only server.
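The startup cost in (b) is easy to estimate: registering on-change is one call per point, so at an assumed sustained rate (the 300 calls/s figure below is purely illustrative, not a benchmark) a million points takes on the order of an hour, in line with the estimate above:

```python
# Back-of-envelope: time to register on-change for every point at startup.
# The registration rate is an assumed illustrative figure, not a benchmark.
points = 1_000_000
calls_per_second = 300              # assumed sustained registration rate
minutes = points / calls_per_second / 60
print(f"~{minutes:.0f} minutes")    # ~56 minutes
```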

At this scale you'd be better off using the HSI software from the UK projects team. This uses the same internal mechanism as the WWH or eDNA historian feeds, and is a direct historian to historian link.

MikeDash
Ensign

Re: MQTT Sparkplug - Pushing data out of GeoSCADA

Thanks for the details, @sbeadle. Sorry for the slow reply; I was pulled onto another project.

 

"At this scale you'd be better off using the HSI software from the UK projects team."

Can you please point me in the right direction where to find more information about this?

 

Are there any other options you would consider:

  1. OPC-UA server - I see a post from March 2021 that says this is under consideration?
  2. OPC-DA/HDA - I was unable to get time-series (device-timestamped) data out.
  3. Any other driver/protocol/method for serving time-series data?

 

sbeadle
Janeway

Re: MQTT Sparkplug - Pushing data out of GeoSCADA

To inquire about the UK team's HSI, please contact your SE representatives and they can direct you to that offering.

 

Apart from using a DNP3 Slave RTU (suitable for relatively small point counts of a few thousand points), the only driver-based options are Pull, not Push.

 

The Client API, as described above, will allow you to go to 100K-ish (YMMV) points to 'watch' and export, with some care over architecture and update times.

 

Beyond that you need a high performance connector which works at database level. The eDNA and WWH/Insights connectors do this but only to their own database systems.