[Discussion] Decentralizing Safe’s Transaction Service & Creating a Sustainable Revenue Model for the SAFE Token

Thanks for the post and bringing this topic up again!

Agree that the transaction service is a centralized point of failure. It serves 2 main purposes:

  1. Indexing historical onchain data - “indexer”
  2. Storing tx data and pre-confirmations before they are executed - “queue”
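To make the "queue" role concrete, here is a minimal sketch of what a queue entry could look like: a proposed Safe transaction keyed by its EIP-712 safeTxHash, collecting owner signatures until the threshold is met. The class and field names are illustrative assumptions, not the transaction service's actual schema.

```python
# Hedged sketch of a queue entry: collect owner confirmations for a proposed
# Safe tx until the owner threshold is reached. Fields are illustrative.
from dataclasses import dataclass, field

@dataclass
class QueuedTx:
    safe_tx_hash: str                 # EIP-712 hash identifying the tx
    tx_data: bytes                    # serialized Safe transaction
    threshold: int                    # number of owners required to execute
    confirmations: dict[str, bytes] = field(default_factory=dict)  # owner -> signature

    def add_confirmation(self, owner: str, signature: bytes) -> None:
        """Record an owner's pre-confirmation (signature) for this tx."""
        self.confirmations[owner] = signature

    def is_executable(self) -> bool:
        """The tx can be executed once enough owners have confirmed."""
        return len(self.confirmations) >= self.threshold
```

Nothing here requires heavy infrastructure, which is the point: almost any actor could store and serve entries like this.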

The indexer is much heavier and should ideally be run by large infrastructure providers such as Alchemy. Privacy is less of an issue since the data is historical and onchain anyway. Of course, the question of right incentives and economic soundness as part of a token model remains. This seems to be what The Graph originally set out to do, right? Can we draw any learnings from them?

The queue is much more lightweight in terms of processing overhead. It is what the original post is mainly about, right? The pool of actors able to run it is much bigger. The question of incentives remains, of course: are users willing to pay for each pre-confirmation, or do they prefer free but less available services? The queue is also the component where privacy is more relevant. While both topics matter, I feel we should look at privacy and decentralization separately.

Privacy:
To the point of encrypting via e.g. HPKE: assuming the Safe has multiple owners, the tx data would just be encrypted for each owner separately, correct?

(Side topic and not so relevant for this proposal: We will have to showcase ways to encrypt data with a Safe that also allows for changing owner structure. This would enable all kinds of use cases leveraging end2end encryption with Safes.)

Decentralization:
I don’t have many comments at this point on the proposed Waku/EigenLayer idea. However, I was thinking of the OG Gnosis MultisigWallet, which simply put all tx data + confirmations onchain. At Safe we have an internal, more lightweight prototype of this approach ready to be published soon, which should help us explore how to decentralize the transaction queue.
