EcoStruxure Geo SCADA Expert Forum
Schneider Electric support forum about installation, configuration, integration and troubleshooting of EcoStruxure Geo SCADA Expert (ClearSCADA, ViewX, WebX).
Posted: 2020-09-18 09:26 PM
Do you mean columns, or do you mean rows?
If you are thinking that you will run into an issue with too many columns, then I think your design is flawed. You should not have this many columns. I suspect that you will run into performance issues long before you start to hit the theoretical limits of the number of columns.
The SQL interface itself supports at most 255 columns, so that will be the absolute upper limit.
ODBC / OLEDB have similar limits.
Regarding the number of rows, I believe the RowId value is a 32-bit integer, so that puts a cap at a little over 2 billion. But the Server Configuration query limits will also put a cap on it.
If you're getting towards those sorts of row counts, then you likely also have a flawed design. Geo SCADA's Data Table functionality isn't really intended to be a replacement for a stand-alone database. You will almost certainly suffer from performance limitations if you try to query that many records from Geo SCADA.
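As a quick sanity check on that RowId ceiling (a sketch, on the assumption that RowId really is a signed 32-bit integer — the documentation may say otherwise):

```python
# Ceiling for a signed 32-bit integer RowId (assumption, not confirmed
# against the Geo SCADA documentation): 2^31 - 1.
MAX_ROWID = 2**31 - 1
print(MAX_ROWID)  # 2147483647 -- a little over 2 billion
```

In practice the Server Configuration query limits will bite long before this theoretical cap does.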
Posted: 2020-09-19 01:18 AM
Hi,
Thanks Bevan. I'm using the Automation Interface externally, and I'm looking at adding approx. 70 columns.
I have added 5 test columns through code and was hoping to create the rest in the next day or so.
Sounds like there should be no issues.
Thanks,
Posted: 2020-09-20 04:34 AM
70 columns does seem like a lot of data that is all strictly associated and that you would expect to retrieve all at once.
I really can't think of a situation that would actually need 70 columns of data.
That said, 70 columns shouldn't be a problem for most of the data platforms you'd encounter, although Excel CSV export might possibly have problems with it, and you might start to push into untested territory around some other interfaces (Crystal Reports, the ClearSCADA ODBC driver, etc.).
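For what it's worth, here's a quick sketch (plain Python, nothing Geo SCADA specific, with made-up column names) showing that a 70-column row survives a CSV round-trip fine — the limits tend to be in the consuming tools, not the format itself:

```python
import csv
import io

# Build one header row and one data row with 70 hypothetical columns.
columns = [f"col_{i}" for i in range(70)]
values = [str(i) for i in range(70)]

buf = io.StringIO()
writer = csv.writer(buf)
writer.writerow(columns)
writer.writerow(values)

# Read it back and confirm the column count is preserved.
buf.seek(0)
rows = list(csv.reader(buf))
print(len(rows[0]), len(rows[1]))  # 70 70
```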
Posted: 2020-09-22 03:09 PM
Also, the maximum recommended size of a data table is 100 MB, and if it's updated frequently it really shouldn't be anywhere near that size, as the whole blob is synchronised on each change.
Posted: 2020-09-23 08:16 PM
I did not know the whole blob would be sent for every change.
I really would have expected only 'transactions' to be sent.
I guess there's probably not much motivation to optimise this; it's a bit of a niche situation, and I wouldn't consider storing a volume of data approaching this in Data Table items to be 'recommended' usage of Geo SCADA anyway (he says, as he then thinks about those situations where he's probably got huge string tables stored in Geo SCADA right now... like an alarm instruction database that allows something like 4,000 characters per alarm instruction, across many tens of thousands of alarm instructions).
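To put rough numbers on that worry — 4,000 characters per instruction comes from the post above, but the instruction count and bytes-per-character are assumed figures for illustration only:

```python
# Back-of-envelope sizing of a hypothetical alarm-instruction table
# stored as a single data blob.
chars_per_instruction = 4000   # from the post
instruction_count = 50_000     # assumed: "many tens of thousands"
bytes_per_char = 1             # assumed ASCII; double this for UTF-16 storage

blob_bytes = chars_per_instruction * instruction_count * bytes_per_char
print(blob_bytes / 1_000_000, "MB")  # 200.0 MB
```

Even with these conservative assumptions, that's double the 100 MB recommended maximum mentioned earlier in the thread — and the whole thing would be re-synchronised on each change.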
Posted: 2020-09-29 12:01 PM
We use DataSets (different, but similar) with lots of columns to summarize data for oil and gas sites. Our master dataset has over 100 columns covering the various pieces of information possible at the different sites. The system has over 1,000 wells and maybe 25 major site types, with lots of different meters and types of equipment.
Because ClearSCADA JOIN performance is so terrible and the types of joins are limited, we decided to create this monster dataset instead. We were hitting major performance limitations joining two or more datasets together, and if we re-architected with separate datasets and joins we would need more than 5 joins to get all the information the operations group is looking for.
It actually works, and works much better than doing joins.
Posted: 2020-10-01 08:44 PM
That is definitely quite a wide dataset.
I'm surprised that there are 100 items of information per well, though; even a sewer pump station typically doesn't have that much meaningful information.
It is indeed challenging to get good performance out of Geo SCADA joins. You really need a unique, indexed ID column to bridge the tables, and it's also important that the ID column's index is cheap to compute.
Things like FullName might be nice for us humans to reference, but they are expensive for the computer: even when indexed, every modification requires a hash calculation, and hash collisions can still result in a partial linear search.
And an ID column that is not explicitly unique and indexed is worthless for join performance.
I don't believe Data Sets themselves can have index columns, so joins will always exhibit performance challenges there 😞
I think the same situation applies for Data Grids. I believe it is ONLY Data Tables that can have indexes.
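None of this is Geo SCADA specific, so the general point about cheap, indexed integer bridge columns can be sketched with SQLite (the table and column names here are made up for illustration):

```python
import sqlite3

con = sqlite3.connect(":memory:")
cur = con.cursor()

# Two tables bridged by an integer ID column -- cheap to compare and index,
# unlike joining on an expensive string such as FullName.
cur.execute("CREATE TABLE wells (id INTEGER PRIMARY KEY, name TEXT)")
cur.execute("CREATE TABLE meters (id INTEGER PRIMARY KEY, well_id INTEGER, reading REAL)")
cur.execute("CREATE INDEX idx_meters_well ON meters(well_id)")

cur.executemany("INSERT INTO wells VALUES (?, ?)",
                [(i, f"Well {i}") for i in range(1000)])
cur.executemany("INSERT INTO meters VALUES (?, ?, ?)",
                [(i, i % 1000, i * 0.1) for i in range(5000)])

# With the index in place, the planner can SEARCH one side of the join
# instead of doing a full SCAN for every row of the other side.
plan = cur.execute(
    "EXPLAIN QUERY PLAN "
    "SELECT w.name, m.reading FROM wells w JOIN meters m ON m.well_id = w.id"
).fetchall()
for row in plan:
    print(row[-1])  # one side shows SEARCH ... USING INDEX / PRIMARY KEY
```

Dropping the index (or joining on a long string column with no index) pushes the planner back towards nested full scans, which is essentially the performance wall described above.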