This repository has been archived by the owner on Oct 17, 2023. It is now read-only.

Adaptors

Dj Walker-Morgan edited this page Mar 27, 2017 · 3 revisions

To get data from a database or to write data to a destination, Transporter uses Adaptors: software connectors built into Transporter to handle input and output. Running `transporter about` lists the available adaptors. An adaptor can be configured as a Source, where it reads from a database and sends messages into the Transporter pipeline, or as a Sink, where it receives messages from the pipeline and writes them to a database.
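As a sketch of how a Source and a Sink pair up in a pipeline.js file — the URIs, variable names and namespace filters below are illustrative, not defaults:

```javascript
// pipeline.js -- a minimal Source-to-Sink pipeline (illustrative values)
var source = mongodb({
  "uri": "mongodb://localhost/mydb"        // example source database
})

var sink = elasticsearch({
  "uri": "https://localhost:9200/myindex"  // example destination
})

// Read every namespace matching the regex and save each message to the sink.
t.Source("source", source, "/.*/").Save("sink", sink, "/.*/")
```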

elasticsearch

The Elasticsearch adaptor can only be a Sink and write to Elasticsearch. Incoming messages from the pipeline become JSON documents in an Elasticsearch index.
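A hedged sketch of an Elasticsearch Sink in pipeline.js; the URI and credentials are placeholders:

```javascript
// Elasticsearch can only be a Sink; the target index is part of the URI.
var es = elasticsearch({
  "uri": "https://user:password@localhost:9200/myindex"  // illustrative URI
})
```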

mongodb

The MongoDB adaptor can read from (Source) and write to (Sink) MongoDB databases of all types. Messages are converted to and from JSON documents by the adaptor. When reading from a MongoDB instance, it can be put into tail mode which, once the initial data has been copied, allows it to generate a synchronizing stream of changes. When writing, the MongoDB adaptor can be put into bulk mode to send multiple updates or new records in a single batch.
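A sketch of MongoDB on both ends of a pipeline, assuming the `tail` and `bulk` options described above; URIs are illustrative:

```javascript
// MongoDB as a Source in tail mode: copies existing data, then streams changes.
var src = mongodb({
  "uri": "mongodb://localhost/sourcedb",
  "tail": true
})

// MongoDB as a Sink in bulk mode: batches writes instead of sending them singly.
var dst = mongodb({
  "uri": "mongodb://localhost/destdb",
  "bulk": true
})

t.Source("src", src, "/.*/").Save("dst", dst, "/.*/")
```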

postgres

The PostgreSQL/Postgres adaptor can read from (Source) and write to (Sink) PostgreSQL databases. When reading, Postgres table rows are converted to a flat JSON document with column names as keys and row values as values. Using logical replication, reading can also produce a stream of changes from the table. When writing, the message's top-level key/values are mapped to column names and values.
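A sketch of a Postgres Source using logical replication; the option names `tail` and `replication_slot` are assumptions here, as is the slot name, so check them against your Transporter build:

```javascript
// Postgres as a Source; logical replication streams row changes after the
// initial copy. The replication slot must already exist on the server.
var pg = postgres({
  "uri": "postgres://user:password@localhost/mydb?sslmode=disable",  // illustrative
  "tail": true,                       // assumption: enables change streaming
  "replication_slot": "transporter"   // assumption: name of the logical slot
})
```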

rethinkdb

The RethinkDB adaptor can read from (Source) and write to (Sink) RethinkDB databases. Messages are converted to and from JSON documents by the adaptor. When reading from a RethinkDB cluster, it can be put into tail mode which, once the initial data has been copied, allows it to generate a synchronizing stream of changes.
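A sketch of a RethinkDB Source in tail mode; the URI is illustrative:

```javascript
// RethinkDB as a Source: initial copy, then a stream of changes (tail mode).
var rdb = rethinkdb({
  "uri": "rethink://localhost:28015/mydb",  // illustrative cluster address
  "tail": true
})
```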

file

The file adaptor can read (Source) and write (Sink) disk files, standard input/output, or pipes. When writing, the adaptor writes a single JSON string generated from the Message contents. When reading, the adaptor assumes each line read is a JSON object, parses it, and populates the Message with the result.
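A sketch of a file-to-stdout pipeline, useful for inspecting what a Source emits; the paths are illustrative:

```javascript
// Read newline-delimited JSON from a file and print each message to stdout.
var src = file({ "uri": "file:///tmp/input.json" })  // illustrative path
var out = file({ "uri": "stdout://" })               // write to standard out

t.Source("src", src, "/.*/").Save("out", out, "/.*/")
```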

rabbitmq

The RabbitMQ adaptor handles publish (Sink) and subscribe (Source) messaging for Transporter. It reads and writes message bodies as JSON objects encoded as strings. The routing key for RabbitMQ can be set in the configuration or extracted from a named field in the messages.
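A sketch of a RabbitMQ Sink with a fixed routing key; the option name `routing_key` and the URI are assumptions for illustration — as the text notes, the key can alternatively be taken from a named field in each message:

```javascript
// RabbitMQ as a Sink: message bodies are published as JSON strings.
var rmq = rabbitmq({
  "uri": "amqp://guest:guest@localhost:5672/",  // illustrative broker URI
  "routing_key": "transporter"                  // assumption: fixed routing key
})
```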