The biggest threat to the performance of any Blue Prism database is mismanagement of the database size, in particular letting Session Logs grow unchecked and table sizes exceed what can reasonably be managed.
Your Session Log archiving process should be mature and fit the requirements of your specific data retention policies.
Numerous methods can be implemented to ensure good Session log archiving. These include:
- Using the in-application archiving functionality to delete unwanted logs, if records are not required.
- Using the in-application archiving functionality to archive unwanted logs.
- Using the SQL scripts provided by Blue Prism Customer Support / Professional Services.
- Using the command-line AutomateC utility to perform session log archiving.
Consider exporting data to third-party applications such as Splunk, Power BI, Tableau, or SharePoint. Blue Prism is not a data warehouse, and the database should not be treated as one.
In Blue Prism 6.5 and later, it is highly recommended to take advantage of Data Gateways. These allow you to export session logs (and published dashboards) to repositories external to the Blue Prism database, which removes much of the pain of storing detailed, ageing logs in the database itself and subsequently having to archive them. Note that any Application Server hosting a Data Gateway should have at least 8GB of RAM.
Ensure the additional application housekeeping scripts, provided by Blue Prism on request, are in operation. These keep tables for which there is no natural housekeeping at a manageable size; neglecting to do so will eventually result in performance issues. The scripts trim the Audit Events, Schedule Logs and Work Queue Items tables, and should be run as SQL scripts.
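The exact housekeeping scripts should be obtained from Blue Prism, but their general shape is a batched, date-bounded delete. The sketch below is illustrative only: the table name `BPAAuditEvents` and the `eventdatetime` column are assumptions based on the standard Blue Prism schema, and the retention period is an example; always use the scripts supplied by Blue Prism in production.

```sql
-- Illustrative sketch only: trim audit events older than 6 months in small
-- batches, to avoid long blocking transactions and excessive log growth.
-- Table and column names (BPAAuditEvents, eventdatetime) are assumptions;
-- verify against your schema and the scripts Blue Prism provides.
DECLARE @cutoff datetime = DATEADD(MONTH, -6, GETUTCDATE());
DECLARE @rows int = 1;

WHILE @rows > 0
BEGIN
    DELETE TOP (10000) FROM BPAAuditEvents
    WHERE eventdatetime < @cutoff;

    SET @rows = @@ROWCOUNT;
END;
```

Deleting in fixed-size batches keeps each transaction short, so the trim can run during normal operation without holding long locks or bloating the transaction log.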
If your DBAs identify any new indexes that they believe would assist database operation, there is no need to ask Blue Prism's permission to apply them. However, your DBAs will thereafter be responsible for maintaining the database structure, and Blue Prism cannot guarantee that future database updates will not remove customer-created indexes.
Ensure the relevant tuning parameters are enabled. Specifically:
- AUTO_CREATE_STATISTICS and AUTO_UPDATE_STATISTICS should be switched on.
- Check indexes for fragmentation and, whenever fragmentation exceeds a defined threshold, either reorganize or rebuild them.
- Run DBCC CHECKDB regularly and act upon any output.
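As a sketch, the checks above can be performed with standard T-SQL. The database name `BluePrism` and the index target `dbo.BPASession` are placeholders for your environment; the 5%/30% fragmentation thresholds are the common Microsoft rule of thumb, not a Blue Prism mandate.

```sql
-- Ensure automatic statistics maintenance is enabled.
ALTER DATABASE [BluePrism] SET AUTO_CREATE_STATISTICS ON;
ALTER DATABASE [BluePrism] SET AUTO_UPDATE_STATISTICS ON;

-- Report index fragmentation; a common rule of thumb is to REORGANIZE
-- between roughly 5% and 30% fragmentation and REBUILD above 30%.
SELECT OBJECT_NAME(ips.object_id)        AS table_name,
       i.name                            AS index_name,
       ips.avg_fragmentation_in_percent
FROM sys.dm_db_index_physical_stats(DB_ID(), NULL, NULL, NULL, 'LIMITED') AS ips
JOIN sys.indexes AS i
  ON i.object_id = ips.object_id AND i.index_id = ips.index_id
WHERE ips.avg_fragmentation_in_percent > 5;

-- Example maintenance statements (table name is a placeholder):
ALTER INDEX ALL ON dbo.BPASession REORGANIZE;  -- light fragmentation
ALTER INDEX ALL ON dbo.BPASession REBUILD;     -- heavy fragmentation

-- Integrity check; review and act upon any errors reported.
DBCC CHECKDB ('BluePrism') WITH NO_INFOMSGS;
```

In practice these checks are usually scheduled via SQL Server Agent jobs or an established maintenance solution rather than run ad hoc.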
Conservative estimates for the size of your Blue Prism Database are as follows:
- Data file: 10GB per Digital Worker
- Log file: 50% of the data file
A well-maintained database, where session log archiving is performed regularly and effectively, should require only around half of the estimates given above. However, an auto-growth setting of 1024MB should still be configured on both the data and log files.
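Auto-growth can be set through SSMS or with T-SQL. The logical file names below (`BluePrism`, `BluePrism_log`) are placeholders; check your database's actual file names before applying.

```sql
-- Set a fixed 1024MB auto-growth increment on the data and log files.
-- Logical file names are placeholders; find yours with:
--   SELECT name, type_desc FROM sys.database_files;
ALTER DATABASE [BluePrism]
    MODIFY FILE (NAME = N'BluePrism', FILEGROWTH = 1024MB);
ALTER DATABASE [BluePrism]
    MODIFY FILE (NAME = N'BluePrism_log', FILEGROWTH = 1024MB);
```

A fixed-size increment is preferable to percentage growth, which produces ever-larger (and slower) growth events as the files expand.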