[Imported] Historic Files - Too many event records?
EcoStruxure Geo SCADA Expert Forum
Schneider Electric support forum about installation, configuration, integration and troubleshooting of EcoStruxure Geo SCADA Expert (ClearSCADA, ViewX, WebX).
Posted: 2019-11-06 12:43 PM. Last Modified: 2023-05-03 12:26 AM
>>Message imported from previous forum - Category: ClearSCADA Software<<
User: sbeadle, originally posted: 2019-03-07 09:30:40, Id: 379
I have this alarm (or something like it) - why does it occur, and what should I do about it?
This feature is designed to protect ClearSCADA from bad configuration or bad devices reporting too much data.
The Warning level just raises an alarm; the Maximum level causes ClearSCADA to discard events for that time period.
They apply to Historic value data, Events, Alarm Summary and Configuration Change data.
From 2017 R2, the Events configuration includes a stream limit and an object limit. (The stream limit will be higher, because it groups together events across each range of object ids.)
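If you want to see which objects are generating the most events before these limits trip, a query along these lines may help. This is only a sketch: it assumes the standard CDBEventJournal table with Source and RecordTime columns, and the timestamp literal and aggregate/ordering syntax may need adjusting for your version:
select top (20) Source, count(*) as EventCount from CDBEventJournal where RecordTime > { TS '2019-03-01 00:00:00' } group by Source order by EventCount desc
Replace the timestamp with the window you care about; the result gives a per-object event count you can compare against the object limit.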
Q - Is there a place where I can find the number of records for each file? I want to set good values for these parameters.
You can:
a) Check the largest files in Server Status, and the snapshot files. See the section Historic | Historian and scroll right. (Record sizes are shown in another column.)
b) Use a query to find the largest files and their details:
select top (100) * from CDBEventFile order by RecordCount desc
(The same applies to the tables CDBAlarmSummaryFile, CDBConfigChangesFile and CDBHistoricFile.)
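For completeness, the equivalent queries for the other file tables named above should look like this (assuming they expose the same RecordCount column, as the note above suggests):
select top (100) * from CDBHistoricFile order by RecordCount desc
select top (100) * from CDBAlarmSummaryFile order by RecordCount desc
select top (100) * from CDBConfigChangesFile order by RecordCount desc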
CDBHistoric data is one file per object per week.
Stream data in CDBEvent, CDBAlarmSummary and CDBConfigChanges is not one file per hour per object; it is one file per hour per stream. A stream is a range of object ids, going from 0 to (say) 1023, then the next from 1024 to 2047, and so on. You can set the stream size in the Server Config tool, in the Historic Configuration pages, by adjusting the 'Stream Size' field.
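As a worked example of the stream arithmetic (using the stream size of 1024 from above): an object with id 5000 falls in stream number floor(5000 / 1024) = 4, which covers ids 4096 to 5119, so its hourly event records share a file with every other object in that id range.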
a) Only change this before commissioning, and it is a lot quicker if you delete the data first (stop the server, delete the relevant historic folder and restart). Otherwise you can be waiting a long time at startup while the existing files are reorganised. This is a good argument for doing system design early!
b) You are looking for a balance between file size (not too big) and number of stream files (not too many!). A single file larger than 50 MB can cause delays when queried, as the database is locked while it is loaded into memory (see the rough sizing note below).
c) Make sure you use the Events object limit configuration on the root group to limit event logging on a per-object basis. This is a good way to prevent 'denial of service' by a field device logging too much data.
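Rough sizing note for b): file size is approximately record count multiplied by record size, so (purely as an assumed illustration) a stream file holding 500,000 records of around 100 bytes each is already at about 50 MB, the point where query delays start to become noticeable. Use the record sizes reported in Server Status for your own system rather than these example numbers.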