API Data Integrity
Data integrity is important in any system, but it is critical in financial systems. The challenge grows when ownership of data processing is split across multiple systems.
Most of the data itself, such as the project and vintage attributes, serial numbers and other underlying data, is owned by the registry. That data is the source of truth, and it is therefore declared read-only on-chain. However, minting tokens is a process owned by the Bridge Core, so it is imperative that we mint exactly the number of tokens backed by the credits we hold in the collateralization pool.
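The collateralization constraint above can be sketched as a simple invariant: the minted token supply must never exceed the credits held in the pool. This is an illustrative Python sketch, not the Bridge's actual code; the names `pool_credits`, `minted_supply` and `CollateralizationError` are assumptions.

```python
# Hypothetical sketch of the minting invariant: minted supply must be
# fully backed by credits in the collateralization pool.

class CollateralizationError(Exception):
    """Raised when a mint would exceed the credits backing it."""

def assert_fully_collateralized(pool_credits: int, minted_supply: int) -> None:
    """Refuse any state where minted tokens exceed credits in the pool."""
    if minted_supply > pool_credits:
        raise CollateralizationError(
            f"minted supply {minted_supply} exceeds pool credits {pool_credits}"
        )

def mint(pool_credits: int, minted_supply: int, amount: int) -> int:
    """Mint `amount` tokens only if the pool still covers the new supply."""
    new_supply = minted_supply + amount
    assert_fully_collateralized(pool_credits, new_supply)
    return new_supply
```

Checking the invariant before every mint, rather than reconciling afterwards, keeps an over-mint from ever being committed.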
We use a number of techniques to keep the data consistent.
In essence, the Bridge is idempotent: sending the same request twice is safe, and the Bridge won't process it twice. We have applied this principle across the application, and it requires a number of checks and balances all the way through the stack.
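One common way to achieve this, shown here as a hedged Python sketch rather than the Bridge's actual implementation, is for each request to carry a deduplication key: the first request with a given key runs the side effect and stores the result, and any retry with the same key replays the stored result instead of processing again. The `IdempotentProcessor` name and in-memory store are assumptions for illustration; a real system would persist the keys durably.

```python
# Minimal idempotency sketch: duplicate requests (same key) replay the
# stored result instead of re-running the side effect.

class IdempotentProcessor:
    def __init__(self):
        self._results = {}  # idempotency key -> stored result

    def process(self, key: str, handler, *args):
        if key in self._results:       # duplicate request: replay the result
            return self._results[key]
        result = handler(*args)        # first time: run the side effect once
        self._results[key] = result
        return result
```

For example, submitting the same mint request twice with the same key would execute the handler once and return the same result both times.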
The system scales horizontally for performance, which makes it effectively multi-threaded. Processing the same event at the same time must not result in it being processed twice, so we have taken care across the application to eliminate race conditions. In an event-driven system like the Bridge, without that care there would be several places where messages could ultimately be processed twice, even with idempotent mechanics in play.
In the case of a system failure, the Bridge is able to pick up where it left off while guaranteeing it does not process anything multiple times. This requires careful consideration of the ordering of processes and the ability to safely re-run processes that have partially completed.
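The resume-after-failure behaviour can be sketched as a journalled pipeline: each step records its completion durably, so a re-run after a crash skips finished steps and continues from the first incomplete one. This is an illustrative Python sketch; the `run_pipeline` shape and the dict standing in for a durable journal are assumptions, and each step must itself be idempotent since a crash can land between a step finishing and its completion being recorded.

```python
# Sketch of crash-safe resumption: completed steps are journalled, so a
# re-run after a failure skips them and continues from where it left off.

def run_pipeline(steps, journal: dict):
    """`steps` is an ordered list of (name, fn); `journal` persists progress."""
    for name, fn in steps:
        if journal.get(name):    # already completed before the crash: skip
            continue
        fn()                     # steps must be idempotent in case the
        journal[name] = True     # journal write below is lost in a crash
```

Running the pipeline again after a mid-run failure then executes each step exactly once overall, which is the "pick up where it left off" guarantee described above.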