This article describes a range of best practices that you can apply in your configuration of the platform, aimed at optimising the performance and stability of your instance.
The recommendations are primarily meant to give you an understanding of what can affect the performance of the platform and what you can strive for in order to ensure high performance.
It might not be possible for you to apply best practices within all areas, but even applying them within one or a few areas will improve performance.
Please be aware that performance can be affected by many different factors, so if you continuously struggle with performance, you might need to investigate several combinations of the factors that can have an impact.
The article goes through several topics in Agillic where your configuration can have an impact on performance. The topics are not in any particular order, so pick the ones that apply to your instance. By the time you read this article, you might already have experienced some performance issues, or maybe you are simply interested in following best practices. In any case, you are expected to have a good understanding of the key functionalities in Agillic in order to follow the article, as well as a good understanding of what data you have on your instance and how your data infrastructure is set up.
Instance Clean Up
First of all, the amount of data and configuration on your instance can impact performance. Therefore, we encourage you to do regular cleanups of your instance, so you always have the most up-to-date data, content and configuration on the instance. Whether this should be done once, twice or thrice a year depends on how much you configure and send out every day and your concrete use cases.
However, as a general rule of thumb, content that has not been sent out for 24 months should be deleted. This goes for emails, SMSs and print files, as these take up unnecessary space on the instance and make the Channels module more complex to navigate.
You can also consider deleting old target groups, flows, promotions, events, and export profiles. In order to do this smartly, you can use the Clean Up Feature which can automatically delete old items within a timeframe that you define. Read more about this feature here.
Please note that referenced items cannot be deleted which is why we recommend deleting items in this order:
Flows
Start by deleting flows, as these have the least references.
You can delete multiple flows at a time by holding the SHIFT key (to mark a range) or the CONTROL key (to mark non-consecutive items).
Conditions and Steps
Next, delete places where content can be referenced. This would, for example, be steps where you have used an Email condition.
Content Items
When the flows are deleted, the content items no longer have references and you are able to delete them from the Channels module.
Please note that at the moment, it is not possible to delete multiple items at a time from the Channels module.
Target groups
Then, you can delete all the target groups that you no longer use in flows or conditions on flow steps.
Data Structures
Finally, the last thing you should delete is the data structures that you have set up on the instance. This is, for example, all the Global Data Tables, One-to-Many Tables and Data Lookups that you might have configured but no longer need. Be aware that by deleting a data table, you are also deleting all the data your recipients might have in this table on production.
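The deletion order above can be sketched as a simple dependency rule: an item can only be deleted once nothing still existing references it. The following is a minimal sketch of that idea; the reference map is invented for illustration and is not Agillic's actual data model.

```python
# Hypothetical sketch: items can only be deleted once no remaining
# item references them, so dependents go first. The reference map
# below is invented for illustration.
references = {
    "flow":         ["email", "target_group"],  # flows reference content and target groups
    "email":        ["data_table"],             # content can reference table fields
    "target_group": ["data_table"],             # conditions can reference table fields
    "data_table":   [],                         # data structures reference nothing
}

deleted, order = set(), []
while len(deleted) < len(references):
    for item in references:
        # deletable when no still-existing item references it
        if item not in deleted and not any(
            item in refs for other, refs in references.items() if other not in deleted
        ):
            deleted.add(item)
            order.append(item)

print(order)   # flows come out first, data tables last
```

This mirrors the recommended order in the article: flows first, then content, then target groups, and the data structures last.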
Target groups
If you experience performance issues when you work in the target group section, such as slow population counts or blank recipient windows, there are different configurations you can make to optimise this.
First of all, make sure you understand how target group population count works by reading this article.
The target group section can be one of the most central places where you experience performance issues, as this is where all data is loaded and evaluated for your recipients. This means that the more data you want evaluated, the longer it takes to get population counts.
Depending on your setup, you might be able to optimise this by following some guidance on how to configure your target group conditions:
- Condition order: Strive to place unique and indexed Person Data conditions at the top of your condition tree. This decreases the number of recipients that the larger calculations further down have to evaluate. Also strive to place data lookup conditions at the bottom of the condition tree.
- Target group conditions: Avoid having too many nested target group conditions (where you reference target group conditions inside other target group conditions). You should avoid having more than 3-4 nested levels. Too many nested target group conditions can result in:
- Slower target group evaluation due to unnecessarily complex conditions
- A lost overview of the relations between target groups, where user errors can occur during configuration. Are you, for example, checking the same field twice in two different target groups?
- Complex conditions: Conditions that evaluate across multiple data tables, such as data lookup conditions, can be the most expensive conditions in terms of population counts. If you are doing a lot of advanced conditioning based on lookups and One-to-Many data, consider whether you can consolidate some of this logic in daily data flows that use side effects to "flag" recipients as belonging to segment A or B. Also see the Data section of this article for more information.
- Nightly population counts: Be aware that Agillic conducts nightly population counts, so if you simply need a number for today's count, you can check the population column in the list of target groups in the target group section in the Data module.
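The condition-order advice above boils down to short-circuit evaluation: when conditions are combined with AND, a cheap and selective condition placed first means the expensive one only runs for the recipients that survive the cheap filter. A minimal sketch, with invented field names standing in for an indexed Person Data field and a costly data lookup:

```python
# Hypothetical sketch: cheap, selective conditions first means the
# expensive check runs for fewer recipients. All data is invented.
recipients = [
    {"country": "DK", "purchases": 3},
    {"country": "SE", "purchases": 0},
    {"country": "DK", "purchases": 0},
]

calls = {"cheap": 0, "expensive": 0}

def cheap_indexed_check(r):
    calls["cheap"] += 1
    return r["country"] == "DK"       # stands in for an indexed Person Data condition

def expensive_lookup_check(r):
    calls["expensive"] += 1
    return r["purchases"] > 0         # stands in for a costly data lookup condition

# Short-circuit AND: the expensive check only runs for recipients
# that already passed the cheap filter.
matched = [r for r in recipients
           if cheap_indexed_check(r) and expensive_lookup_check(r)]

print(len(matched), calls)   # expensive ran 2 times instead of 3
```

With the order reversed, the expensive check would run for every recipient; on a real instance with millions of recipients and genuinely costly lookups, that difference dominates the evaluation time.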
Flows
When it comes to flows and performance, there are some things you need to be aware of. Flows are central to all execution on your instance and are therefore designed to process a lot of data. However, there are still things you should avoid in order not to clutter data processing in the flows, and to make them more user friendly to work with.
- Consider how many flows you are executing at the same time. Multiple flow executions in the same minute may affect performance if you have more than 10 flows running simultaneously (depending on the target group size).
- If possible, avoid complex conditions on loops and split steps. You should also refrain entirely from using tracking conditions on steps.
- Steps that require more evaluation: Some step types require more processing than others. For example, a paid media step takes longer to execute than an email step. This also goes for extension steps that call external systems.
- There is no limit on how many steps you can use in a flow, but avoid cluttering your flow with the above-mentioned step types that require a lot of evaluation, as this increases the time until the flow is evaluated to its end.
- Export flows are useful for keeping your data ecosystem updated. However, consider whether you need to export all data, or whether you can optimise the export by exporting only deltas. To do this, you can, for example, use the fixed Person Data field LAST_UPDATED as a condition in your export target group.
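The delta-export idea in the last point can be sketched as a timestamp cutoff: only recipients whose LAST_UPDATED falls after the previous export run are included. The data below is invented for illustration; in Agillic the cutoff would be expressed as a target group condition on LAST_UPDATED rather than in code.

```python
# Hypothetical sketch of a delta export: only recipients updated
# since the last export run are included. Data is invented.
from datetime import datetime

recipients = [
    {"email": "a@example.com", "LAST_UPDATED": datetime(2024, 5, 1)},
    {"email": "b@example.com", "LAST_UPDATED": datetime(2024, 5, 9)},
    {"email": "c@example.com", "LAST_UPDATED": datetime(2024, 5, 10)},
]

# When the previous export ran; the target group condition would
# compare LAST_UPDATED against this cutoff.
last_export_run = datetime(2024, 5, 8)

delta = [r for r in recipients if r["LAST_UPDATED"] > last_export_run]
print([r["email"] for r in delta])   # only b and c are exported
```

Exporting only the changed rows keeps both the export flow and the receiving system's import small, instead of moving the full recipient base on every run.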
Promotions
When we talk about promotions in relation to performance, there are two things to bear in mind: promotions are about both content and evaluation (based on conditions). A promotion therefore consumes resources both through the content it contains and through the evaluation it performs based on your configuration.
Consequently, the same advice applies as for cleaning up content: delete old promotions that are not in use, and keep the target group conditions simple, as the more complex the conditions on the different promotions are, the longer each evaluation takes.
Furthermore, you should avoid using promotions for pure reporting purposes, because this increases the logged activity, which in turn has an impact on performance.
Data
The amount of data that you have on your instance is probably the most crucial part of your performance optimisation. How much and what data you should have in Agillic cannot be answered in one sentence, but as a rule of thumb we advise that you only keep the data in Agillic that you need for your marketing communication.
This means that you should not import all transactional data for all your recipients if you are not going to communicate based on it.
However, with this being said, there is some guidance as to the amount of data that you can store in order for One-to-Many Tables and Global Data Tables to perform well during execution. Please refer to these guidelines when building your data model.
The same goes for the number of recipients. It is realistic to have some overhead between active recipients and the total number of recipients on the instance (i.e. the count of the All recipients target group), but consider whether you are ever going to try to reactivate the inactive recipients or whether they should be deleted from Agillic. This is not only about optimising performance but also about minimising the risk of unnecessarily large sendouts.
- Do not have OTM or GDT fields just for the sake of having them. Only keep the ones you are actually planning to use for segmentation.
- Be aware that we do not recommend having more than 2000 recipients on your staging environment, and avoid loading more than 1000 rows of a Global Data Table into staging.
- Index the 5 Person Data fields in System Settings that you use most often for segmentations, as this will speed up evaluation of conditions based on these.
- Consider whether you need all columns in a One-to-Many Table, or whether you can delete some columns from the table and consolidate the data.
- Consider for how long you need historic data in Agillic. Depending on the size of your database, we recommend not storing transactional data, product data, subscription data and similar for more than 2-3 years. There can be good use cases where you would need data that old, but consider how often those use cases appear and whether they can be solved in another way, for example by consolidating the segment in your CRM platform.
- Be mindful when changing field types on tables that are already in use in data lookups, conditions or content. You will need to reconfigure your lookups after changing the fields.
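The 2-3 year retention advice above amounts to a simple age cutoff applied to table rows before (re)import. A minimal sketch, with an invented transaction table and an assumed retention window of roughly three years:

```python
# Hypothetical sketch of a retention rule: transactional rows older
# than the retention window are dropped. Table layout, field names
# and the window length are invented for illustration.
from datetime import datetime, timedelta

RETENTION = timedelta(days=3 * 365)   # roughly the suggested 2-3 years
today = datetime(2024, 6, 1)

transactions = [
    {"order_id": 1, "order_date": datetime(2020, 1, 15)},  # outside the window
    {"order_id": 2, "order_date": datetime(2023, 3, 2)},
    {"order_id": 3, "order_date": datetime(2024, 5, 20)},
]

kept = [t for t in transactions if today - t["order_date"] <= RETENTION]
print([t["order_id"] for t in kept])   # the 2020 row is pruned
```

Running a rule like this in your data pipeline, rather than in Agillic itself, keeps the One-to-Many Tables on the instance bounded in size over time.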
Publish
Remember to publish regularly to keep your staging and production environments aligned. This gives you a better overview when working on the instance. If you have flows that you are working on, or that for some other reason should not be executed on production, then do not set an execution method (trigger or schedule) on them.
Also, we advise you not to rely solely or heavily on partial publish. Some aspects of the Agillic components require other types of publishes, such as context parameters. While this does not impact performance per se, it can complicate the operation of your instance if all items in use are not published correctly.
Reports
The reporting module evaluates all activity data on the instance every night, so the updated numbers are ready when you log into the instance. However, sometimes you need to refresh a report during your workday. In that case, we recommend that you refresh one report at a time.
Details View
The Details View in the Recipients view in the Target Group editor can be useful for showing data fields and tables for the recipients in the target group. However, you should be mindful of how many fields you load into the view.
Instead of having one view with all the data fields, strive to have multiple smaller, restricted views. You could, for example, have a view with the 10 Person Data fields that hold the basic account information on the recipients, or a view with some transactional data tables for your recipients.
Any categorisation that fits your use case will optimise performance when loading the recipients' data in this view.