Has anyone already implemented data export (telemetry only) from any EBO/SBO to a cloud platform?
For example, exporting data from the building either to an Azure IoT Hub or an Azure storage component (Data Lake, Blob, ...).
Are there any initiatives within the group to implement this (e.g. through a custom SmartConnector extension)?
I've seen the EboIoTEdgeConnector project on git, but I'm not sure it fully fits the need for device-to-cloud communication.
Thanks for any useful links!
With every EBO --> Cloud integration there seem to be different requirements as far as where, how, and in what format the data needs to be sent to the cloud.
Personally, I have played around with doing this many different times (e.g. in Azure I have played around with sending data directly to IoT Hub, through IoT Edge, and to Blob Storage via an API in Azure). There have been no plans to make a 'standard' way to do this because, as I said, every party that wants to do this wants the data in their own format.
That said, some base SmartConnector code could likely be written so that people who want to implement something simply need to override a class and supply their own 'translation' logic. At the moment, though, there is no initiative to do this, and you would likely need to develop your own solution from scratch (I can probably send you some sample code I wrote a while ago if you want to shoot me an email: firstname.lastname@example.org).
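To make the "base code + overridable translation" idea concrete, here is a minimal Python sketch. All class and field names are hypothetical for illustration; the real SmartConnector framework is .NET-based and its API differs.

```python
from abc import ABC, abstractmethod

# Hypothetical sketch: a base exporter owns the generic export flow,
# and each project only overrides the translation step to produce its
# own payload format. Names are illustrative, not a real SmartConnector API.

class BaseCloudExporter(ABC):
    def export(self, samples):
        """Translate each trend sample, then hand the batch to the sink."""
        payload = [self.translate(s) for s in samples]
        return self.send(payload)

    @abstractmethod
    def translate(self, sample):
        """Per-project translation: map a raw sample to the target format."""

    def send(self, payload):
        # A real implementation would push to IoT Hub, Blob Storage, etc.
        return payload

class FlatJsonExporter(BaseCloudExporter):
    """One party's format choice: flat JSON-like dicts."""
    def translate(self, sample):
        point, ts, value = sample
        return {"point": point, "timestamp": ts, "value": value}

exporter = FlatJsonExporter()
result = exporter.export([("AHU1/SupplyTemp", "2020-01-01T00:00:00Z", 21.5)])
```

Another party could subclass the same base and emit, say, IoT Hub message bodies instead, without touching the shared export flow.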
The IoT Edge Connector that you found on git was a solution developed specifically for a third party, so it is very likely it won't meet your needs in terms of where/how the data is sent and received, and the format it is sent in.
Thank you for this complete answer!
In the meantime I've been in contact with the Global Engineering Center Of Excellence (EcoBuilding, Adam Summers' team).
We discussed it and they gave me some interesting feedback: as you say, they don't have an off-the-shelf SmartConnector exporting to Azure storage, but they have one able to generate CSV files from EWS history (EBO trends) and export them either locally, to AWS, or to an FTP server. I'm currently trying the latest version to export to an Azure VM running FileZilla Server.
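As a rough illustration of what that extension produces (not the actual CoE code), here is a stdlib-only sketch that turns EWS trend-log history records into CSV text ready to be written to a file and shipped to FTP or AWS. The column names are assumptions.

```python
import csv
import io

# Illustrative sketch: serialize EWS trend-log history records as CSV,
# the kind of file the extension exports locally, to AWS, or to FTP.
# Column names are assumptions for illustration.

def history_to_csv(records):
    """records: iterable of (point_id, iso_timestamp, value) tuples."""
    buf = io.StringIO()
    writer = csv.writer(buf)
    writer.writerow(["PointId", "Timestamp", "Value"])
    for point_id, ts, value in records:
        writer.writerow([point_id, ts, value])
    return buf.getvalue()

csv_text = history_to_csv([
    ("Floor1/RoomTemp", "2020-03-01T10:00:00Z", 20.8),
    ("Floor1/RoomTemp", "2020-03-01T10:15:00Z", 21.1),
])
```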
The other option I started to investigate was to implement a custom SmartConnector extension myself to export to Azure Blob Storage (or Data Lake Store).
If you've already done it, it could be interesting to share!
Thanks again!
Hi Dimitri. Have you implemented anything after your post?
I am running a cloud ES for test purposes connected to field SmartX controllers.
I also have a separate Azure PostgreSQL instance which is fed from the cloud ES. The ES does not need to be in the cloud environment, obviously; it can push the data to the Azure PostgreSQL with the correct settings.
Would be interested in seeing your approach
We finally set up a pilot project with the following design:
End-to-end use case overview
The idea was to be able to combine heterogeneous data sources for cross-analysis.
EBO: the trend logs corresponding to the data points we want to gather must first be configured on the EBO side
PME: default trend logs are already defined
The choice was made to use the SmartConnector to query both the EBO and PME systems through EWS (that's why the trend logs must be configured first).
We use a SmartConnector extension called TrendLogExportProcessor, which is able to periodically query an EWS system and gather all the data of the last period into a CSV file.
Then, this same extension can export the produced file to a cloud-based location (e.g. AWS S3 is supported; we used an FTP server).
We then developed a Data Transformation Pipeline (based on our Data Refinery platform) to refine the data through a workflow. The entry point of the pipeline is the FTP server fed by the SmartConnector.
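The refinement workflow can be pictured as a chain of small transformation steps applied to each row pulled from the landing zone. The sketch below is an assumption for illustration only; step names and the row format are invented and do not reflect the actual Data Refinery implementation.

```python
# Hypothetical sketch of a row-wise refinement pipeline: each step either
# transforms a record or drops it by returning None. Step names and the
# CSV row shape are illustrative assumptions, not the real Data Refinery.

def parse_row(row):
    """Raw CSV line from the landing zone -> structured record."""
    point, ts, value = row.split(",")
    return {"point": point, "timestamp": ts, "value": float(value)}

def drop_invalid(rec):
    """Filter step: discard records without a usable value."""
    return rec if rec["value"] is not None else None

def tag_site(rec, site="PilotBuilding"):
    """Enrichment step: attach a (hypothetical) site tag."""
    rec["site"] = site
    return rec

def run_pipeline(rows, steps):
    out = []
    for row in rows:
        rec = row
        for step in steps:
            rec = step(rec)
            if rec is None:  # a filter step dropped this record
                break
        if rec is not None:
            out.append(rec)
    return out

gold = run_pipeline(
    ["AHU1/SupplyTemp,2020-03-01T10:00:00Z,21.5"],
    [parse_row, drop_invalid, tag_site],
)
```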
The Data Refinery simplified architecture
The Building Digital Twin data pipeline implemented through the Data Refinery
Data As A Service:
The last step of the pipeline exports the produced gold dataset into the OpenDataSoft platform, which serves as the backend of the SE Exchange Data As A Service.
I have seen this design last year from DataLib @ Grenoble team (https://schneider-electric.opendatasoft.com/)
Actually, I am really interested in the data ingestion and the object-tagging capability.
Can we connect regarding your pilot project? I am really keen on finding more information.