Migration Summary

Your segmentation inside FT CRM will be updated through real-time events. Once an event is sent, all segmentation based on that type of data is instantly updated for that player. From the date the event stream is fully implemented, the integration ensures that all segmentation stays up to date.
However, if you want segmentation to cover the player's activity from before the live stream was implemented, the historical data needs to be migrated into FT CRM. Below you can find an explanation of the process for achieving a successful data migration.
File requirements
  • Files must contain headers
  • Headers must be named exactly as in the templates you receive
  • The number of headers must be exactly the same as in the templates you receive
  • The headers must be in the exact order as in the template files
  • Files must be exported in UTF-8 encoding
  • Strings must be encapsulated in double quotes
  • Booleans should be represented as 0 and 1, not true and false
  • Files must have Unix-style line endings (LF), not Windows-style line endings (CRLF)
  • All timestamps must be in RFC 3339 format
  • Any Unicode glyphs or emojis in any field will be replaced with question marks
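As a minimal sketch of an export that satisfies these requirements, the snippet below writes a CSV with double-quoted strings, LF line endings, UTF-8 encoding, RFC 3339 timestamps, and booleans as 0/1. The column names are hypothetical; your actual headers must match the template you receive exactly.

```python
import csv
import io
from datetime import datetime, timezone

def rfc3339(dt: datetime) -> str:
    """Format a datetime as an RFC 3339 timestamp, e.g. 2024-01-31T12:00:00+00:00."""
    return dt.astimezone(timezone.utc).isoformat()

# Hypothetical column set -- replace with the headers from your template, in order.
HEADERS = ["player_id", "registered_at", "is_active"]

rows = [
    {
        "player_id": "p-1001",
        "registered_at": rfc3339(datetime(2024, 1, 31, 12, 0, tzinfo=timezone.utc)),
        "is_active": 1,  # booleans as 0/1, not true/false
    },
]

buf = io.StringIO()
# QUOTE_NONNUMERIC wraps every string field in double quotes; numerics stay bare.
# lineterminator="\n" forces LF (Unix) line endings instead of CRLF.
writer = csv.DictWriter(buf, fieldnames=HEADERS,
                        quoting=csv.QUOTE_NONNUMERIC, lineterminator="\n")
writer.writeheader()
writer.writerows(rows)

# newline="" prevents the platform from rewriting the LF endings on write.
with open("events.csv", "w", encoding="utf-8", newline="") as f:
    f.write(buf.getvalue())
```

Opening the output file with `encoding="utf-8"` and `newline=""` is what keeps the encoding and LF line endings intact on Windows as well.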


The following is needed prior to migration.

1. Real-time events

Events need to be sent in a production environment, and the data flow needs to be reconciled so that it matches your database. For instance, you can take a few sample users who registered after the event stream was enabled and verify that their segmentation in FT CRM matches your database.
The most common causes of data discrepancies are setup issues with the events: certain events not being sent at all, events getting lost, or a segmentation mapping that doesn't match the definition you expect.
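A reconciliation check of this kind can be sketched as below. The player IDs, field names, and values are placeholders; in practice one side would be loaded from your own database and the other from FT CRM.

```python
# Hypothetical segmentation snapshots for a few sample users who registered
# after the event stream was enabled.
local_db = {
    "p-1001": {"total_deposits": 250.0, "bets_placed": 40},
    "p-1002": {"total_deposits": 0.0, "bets_placed": 3},
}
ft_crm = {
    "p-1001": {"total_deposits": 250.0, "bets_placed": 40},
    "p-1002": {"total_deposits": 0.0, "bets_placed": 5},  # discrepancy
}

def reconcile(local, remote):
    """Return (player_id, field, local_value, remote_value) for every mismatch."""
    mismatches = []
    for player_id, fields in local.items():
        remote_fields = remote.get(player_id, {})
        for field, value in fields.items():
            if remote_fields.get(field) != value:
                mismatches.append((player_id, field, value, remote_fields.get(field)))
    return mismatches

for player_id, field, ours, theirs in reconcile(local_db, ft_crm):
    print(f"{player_id}: {field} differs (local={ours}, FT CRM={theirs})")
```

Any mismatch it reports points at one of the discrepancy causes above: a misconfigured event, an event that was never sent or was lost, or a segmentation mapping that differs from your definition.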


2. SFTP

The operator needs to provide Fast Track with access to an SFTP where the data files can be uploaded. The most common setup is an S3 bucket on your AWS account.

3. Dry-run

Your Integrations Manager will provide you with a template CSV file to be used for a dry-run exercise; this is done to prevent issues with the real migration. We expect you to send us an extract of data in exactly the same format as the template, and ideally it should include the full history of data.
The dry run will include two different types of data sets.

Data set          Description
Event Table       Individual transactions
Activity Table    Aggregated values
Once the file is received, Fast Track can plan a dry-run migration and provide you with the results. It's important that the results are properly reconciled before the actual migration takes place.
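The relationship between the two data sets can be sketched as below: Activity Table rows are roll-ups of the individual transactions in the Event Table. The column names and transaction types are illustrative, not the actual template schema.

```python
from collections import defaultdict

# Hypothetical Event Table rows: one row per individual transaction.
event_rows = [
    {"player_id": "p-1001", "type": "deposit", "amount": 100.0},
    {"player_id": "p-1001", "type": "deposit", "amount": 150.0},
    {"player_id": "p-1002", "type": "deposit", "amount": 25.0},
]

def aggregate(events):
    """Roll individual transactions up into one aggregated row per player,
    mirroring what the Activity Table holds."""
    totals = defaultdict(lambda: {"deposit_count": 0, "deposit_sum": 0.0})
    for row in events:
        if row["type"] == "deposit":
            totals[row["player_id"]]["deposit_count"] += 1
            totals[row["player_id"]]["deposit_sum"] += row["amount"]
    return dict(totals)

activity_rows = aggregate(event_rows)
```

Reconciling the dry-run results can then include a cross-check of this kind: the aggregated values must equal what the individual transactions sum to.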


Once the prerequisites are ready, you will need to schedule a date and time to migrate the data into the Fast Track system. Please contact your Integrations Manager to reserve a slot.


  1. Fast Track will scale down the services at the agreed time; this is done to avoid any new events being processed during the migration.
  2. The operator will generate the migration files and upload them to the SFTP. It's important that the uploaded files follow the same format as the verified dry-run, and that the extract runs up to the exact time when the services were scaled down.
  3. Fast Track will download the files and import the data into temporary tables, ready to be verified. We will then upload the results to your SFTP for reconciliation.
  4. The operator verifies the migration results and informs Fast Track once completed.
  5. Fast Track runs the import against the real database.
  6. Once completed, Fast Track will scale the services back up, events will start being processed again, and the migration is complete.
Note that services will be scaled down during the migration process; no Activities based on real-time events will be able to fire during this period.
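The extract cutoff in step 2 can be sketched as below: rows stamped after the agreed scale-down time are excluded, so the file lines up exactly with the moment event processing stopped and those later events flow in live once the services are scaled back up. The column name and cutoff value are illustrative.

```python
from datetime import datetime

# Agreed scale-down time (RFC 3339); everything after this moment is excluded
# from the extract, since those events will be processed live after scale-up.
CUTOFF = datetime.fromisoformat("2024-06-01T10:00:00+00:00")

rows = [
    {"player_id": "p-1001", "created_at": "2024-06-01T09:59:59+00:00"},
    {"player_id": "p-1002", "created_at": "2024-06-01T10:00:01+00:00"},  # after cutoff
]

extract = [r for r in rows
           if datetime.fromisoformat(r["created_at"]) <= CUTOFF]
```

Filtering on the exact scale-down timestamp is what prevents the same event from appearing both in the migrated file and in the live stream.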