
Event publishers

Events

The following event types are available:

  • COMMITTED_TRANSACTIONS
  • SAVED_METADATA
  • UPDATED_MAPPING
  • REVERTED_TRANSACTION

Publishing to HTTP

An HTTP publisher has been available out of the box since version 1.3.x. It lets you push ledger events to an HTTP server of your choice, in a webhook-like fashion. It can be enabled by setting the configuration variables PUBLISHER_HTTP_ENABLED and PUBLISHER_TOPIC_MAPPING, or the equivalent command-line flags shown below.

info

For HTTP publishing on Formance Cloud, please refer instead to the webhooks section of The Formance Platform.

Example

# Publishing COMMITTED_TRANSACTIONS events
ledger serve \
--publisher-http-enabled \
--publisher-topic-mapping='COMMITTED_TRANSACTIONS:http://localhost:3042/numary-hook'

# Publishing all events
ledger serve \
--publisher-http-enabled \
--publisher-topic-mapping='*:http://localhost:3042/numary-hook'

On the receiving end, the HTTP request body will be formatted as follows:

{
  "date": "2022-05-12T14:31:07.808419+02:00",
  "type": "COMMITTED_TRANSACTIONS",
  "payload": {
    "transactions": [
      {
        "postings": [
          {
            "source": "world",
            "destination": "central_bank",
            "amount": 100,
            "asset": "GEM"
          }
        ],
        "reference": "",
        "metadata": {
          "description": "A testing transaction."
        },
        "txid": 0,
        "timestamp": "2022-05-12T12:31:07Z"
      }
    ],
    "volumes": {
      "central_bank": {
        "GEM": {
          "input": 100,
          "output": 0
        }
      },
      "world": {
        "GEM": {
          "input": 0,
          "output": 100
        }
      }
    }
  },
  "ledger": "example-ledger-1"
}
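
A webhook receiver only needs to decode this envelope. Below is a minimal sketch in Go using only the standard library; the struct fields mirror the JSON above, while the handler path and port simply match the --publisher-topic-mapping example, and the per-type handling is an assumption to adapt to your own needs.

package main

import (
    "encoding/json"
    "log"
    "net/http"
    "time"
)

// Envelope mirrors the event body shown above. The payload shape depends on
// the event type, so it is kept as raw JSON and decoded per type.
type Envelope struct {
    Date    time.Time       `json:"date"`
    Type    string          `json:"type"`
    Payload json.RawMessage `json:"payload"`
    Ledger  string          `json:"ledger"`
}

func hookHandler(w http.ResponseWriter, r *http.Request) {
    var ev Envelope
    if err := json.NewDecoder(r.Body).Decode(&ev); err != nil {
        http.Error(w, err.Error(), http.StatusBadRequest)
        return
    }
    switch ev.Type {
    case "COMMITTED_TRANSACTIONS":
        log.Printf("ledger %s committed transactions: %s", ev.Ledger, ev.Payload)
    default:
        log.Printf("unhandled event type %s from ledger %s", ev.Type, ev.Ledger)
    }
    w.WriteHeader(http.StatusOK)
}

func main() {
    // Path and port match the --publisher-topic-mapping example above.
    http.HandleFunc("/numary-hook", hookHandler)
    log.Fatal(http.ListenAndServe(":3042", nil))
}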

Publishing to Kafka

An integration with Kafka has been available out of the box since version 1.3.x. It lets you push ledger events to a Kafka broker of your choice, bringing easy and reliable extensibility. It can be enabled by setting the configuration variables PUBLISHER_KAFKA_ENABLED, PUBLISHER_KAFKA_BROKER and PUBLISHER_TOPIC_MAPPING.

Other variables are available for further authentication and configuration.

info

Access to Kafka publishing is not yet available in Formance Cloud.

Example

# Publishing COMMITTED_TRANSACTIONS events to the default topic
ledger serve \
--publisher-kafka-enabled \
--publisher-kafka-broker="localhost:9092" \
--publisher-topic-mapping='COMMITTED_TRANSACTIONS:default'

# Publishing all events to the default topic
ledger serve \
--publisher-kafka-enabled \
--publisher-kafka-broker="localhost:9092" \
--publisher-topic-mapping='*:default'

On the consuming end, the message published to the mapped topic will be formatted as follows:

{
  "date": "2022-05-12T14:31:07.808419+02:00",
  "type": "COMMITTED_TRANSACTIONS",
  "payload": {
    "transactions": [
      {
        "postings": [
          {
            "source": "world",
            "destination": "central_bank",
            "amount": 100,
            "asset": "GEM"
          }
        ],
        "reference": "",
        "metadata": {
          "description": "A testing transaction."
        },
        "txid": 0,
        "timestamp": "2022-05-12T12:31:07Z"
      }
    ],
    "volumes": {
      "central_bank": {
        "GEM": {
          "input": 100,
          "output": 0
        }
      },
      "world": {
        "GEM": {
          "input": 0,
          "output": 100
        }
      }
    }
  },
  "ledger": "example-ledger-1"
}
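
Any Kafka client can then consume these messages from the mapped topic. The sketch below uses the third-party github.com/segmentio/kafka-go reader as one possible choice; the broker address and topic mirror the example above, and the consumer group name is an arbitrary assumption.

package main

import (
    "context"
    "encoding/json"
    "log"

    "github.com/segmentio/kafka-go"
)

// Envelope mirrors the message payload shown above.
type Envelope struct {
    Date    string          `json:"date"`
    Type    string          `json:"type"`
    Payload json.RawMessage `json:"payload"`
    Ledger  string          `json:"ledger"`
}

func main() {
    // Broker and topic match the ledger serve example above;
    // the consumer group ID is arbitrary.
    r := kafka.NewReader(kafka.ReaderConfig{
        Brokers: []string{"localhost:9092"},
        Topic:   "default",
        GroupID: "ledger-events-consumer",
    })
    defer r.Close()

    for {
        msg, err := r.ReadMessage(context.Background())
        if err != nil {
            log.Fatal(err)
        }
        var ev Envelope
        if err := json.Unmarshal(msg.Value, &ev); err != nil {
            log.Printf("skipping malformed message: %v", err)
            continue
        }
        log.Printf("received %s event from ledger %s", ev.Type, ev.Ledger)
    }
}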