Schneider Electric support forum about installation, configuration, integration and troubleshooting of EcoStruxure Geo SCADA Expert (ClearSCADA, ViewX, WebX).
Posted: 2019-11-06 12:43 PM . Last Modified: 2023-05-03 12:26 AM
>>Message imported from previous forum - Category:ClearSCADA Software<<
User: sbeadle, originally posted: 2019-03-07 09:30:40 Id:379
I have this alarm (or something like it) - why does it occur, and what should I do about it?
This feature is designed to protect ClearSCADA from bad configuration or bad devices reporting too much data.
The Warning level just raises an alarm, and the Maximum will cause ClearSCADA to discard events for that time period.
They apply to Historic value data, Events, Alarm Summary and Configuration Change data.
From 2017 R2, the Events configuration includes a stream limit and an object limit. (The stream limit will be higher – it bunches together events in each range of object ids).
Q - Is there a place where I can find the number of records in each file? I want to set good values for these parameters.
You can:
a) Check the largest files in Server Status and in the snapshot files. See the section Historic | Historian and scroll right (record sizes are shown in another column).
b) Use a query to find largest files and details:
select top (100) * from CDBEventFile order by RecordCount desc
(Also tables CDBAlarmSummaryFile, CDBConfigChangesFile and CDBHistoricFile)
CDBHistoric data is one file per object per week.
Stream data in CDBEvent, CDBAlarmSummary and CDBConfigChanges is not one file per hour per object; it is one file per hour per stream. A stream is a range of object ids, e.g. 0 to 1023, then 1024 to 2047, and so on. You can set the stream size in the Server Config tool on the Historic Configuration pages - adjust the field 'Stream Size'.
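The mapping from object id to stream described above is just integer division by the stream width. A minimal sketch (the function name and the default width of 1024 are illustrative, taken from the example ranges above):

```python
def stream_index(object_id: int, stream_size: int = 1024) -> int:
    """Return the stream an object's event records land in, assuming
    streams are contiguous id ranges of width stream_size."""
    return object_id // stream_size

# With a stream width of 1024, ids 0..1023 share stream 0,
# ids 1024..2047 share stream 1, and so on.
print(stream_index(0))     # 0
print(stream_index(1023))  # 0
print(stream_index(1024))  # 1
```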
a) Only change this before commissioning, and it's a lot quicker if you delete data first (stop the server, delete the relevant historic folder and restart). Otherwise you can be waiting a long time at startup while existing files are reorganised. This is a good argument for doing system design early!
b) You are looking for a balance between file size (not too big) and the number of stream files (not too many). A single file larger than 50 MB can cause delays when queried, because the database is locked while the file is loaded into memory.
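To get a feel for that balance, you can estimate the hourly file count from the object count, stream width and retention period. A rough sketch, assuming one file per stream per hour as described above (the function name and all the numbers are hypothetical, not taken from any real system):

```python
def hourly_stream_files(num_objects: int, stream_size: int,
                        hours_retained: int) -> int:
    """Estimate how many hourly stream files exist, assuming one file
    per stream per hour and contiguous id ranges of width stream_size."""
    streams = -(-num_objects // stream_size)  # ceiling division
    return streams * hours_retained

# Hypothetical system: 20,000 objects, stream width 1024,
# 4 weeks of hourly files retained.
files = hourly_stream_files(20_000, 1024, 24 * 7 * 4)
print(files)  # 13440
```

Widening the stream reduces the file count but grows each file; narrowing it does the opposite, so you tune it against the 50 MB guideline above.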
c) Make sure you use the Events object-limit configuration on the root group to cap event logging on a per-object basis. This is a good way to prevent 'denial of service' by a field device logging too much data.
Hope this helps.
Steve