Adrestia is the new codename for all Cardano tooling and includes tooling that will have long-term support.
Adrestia recommends powering light clients with GraphQL. This is done by using cardano-db-sync to dump the blockchain data into a database and then using cardano-graphql to serve that data.
This provides an API useful to light wallets for Cardano.
For some endpoints, we use SQL queries directly (either because they aren't possible in GraphQL or because we need the performance of raw SQL queries). However, we initially wanted every query to be in GraphQL so that any project using yoroi-graphql-migration-backend can use our REST API as an intermediate step to running GraphQL queries directly.
To run this, you will need to run the following:
- cardano-node
- cardano-db-sync
Development build (with hot reloading):
# install the right version of Node
nvm use
nvm install
# install dependencies
npm install
# run the server
npm run dev
The server will then run at http://localhost:8082. You can query it using curl (ex: curl http://localhost:8082/bestblock).
There is no easy way to configure runtime settings. However, you can edit lines 23-26 of src/index.ts to change the port, the GraphQL URI, et cetera.
There are tests which run by querying your local cardano-db-sync and cardano-graphql. You can run them by doing the following:
# run the server on a terminal window
npm run dev
# run the tests on a different terminal
npm run test
For addresses, refer to CIP5 for how they should be encoded. Notably, we support:
- addr
- addr_test
- stake
- stake_test
- addr_vkh

We recommend querying using payment key hashes (addr_vkh) when possible; otherwise you may miss addresses for a wallet, such as mangled base addresses or other address types like pointer addresses.
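As a quick illustration, a client can check which of the prefixes above an address uses before building a request. This helper is a hypothetical sketch (not part of this backend); it only inspects the CIP5 prefix and does not validate the bech32 checksum or payload.

```typescript
// Hypothetical helper: classify an address string by its CIP5 bech32 prefix
// so a client can decide which query style applies. Prefix check only; this
// does not validate the bech32 payload.
type AddressKind =
  | "payment" | "payment-testnet"
  | "stake" | "stake-testnet"
  | "payment-key-hash" | "unknown";

function classifyAddress(addr: string): AddressKind {
  // check longer prefixes first, since e.g. "addr" is a prefix of "addr_vkh"
  if (addr.startsWith("addr_vkh")) return "payment-key-hash";
  if (addr.startsWith("addr_test")) return "payment-testnet";
  if (addr.startsWith("addr")) return "payment";
  if (addr.startsWith("stake_test")) return "stake-testnet";
  if (addr.startsWith("stake")) return "stake";
  return "unknown"; // e.g. a Byron-era base58 address
}
```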
txs/utxoForAddresses
Input

Up to 50 addresses in the request
{
// byron addresses, bech32 address, bech32 stake addresses or addr_vkh
addresses: Array<string>
}
Output
Array<{
utxo_id: string, // concat tx_hash and tx_index
tx_hash: string,
tx_index: number,
block_num: number, // NOTE: not slot_no
receiver: string,
amount: string,
assets: Asset[],
}>
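For example, a client might total the returned amounts per receiving address. This is a sketch (the `Utxo` shape mirrors the response above, with the `assets` field omitted for brevity); amounts are lovelace strings, so `BigInt` avoids precision loss:

```typescript
// Sketch: sum UTxO amounts per receiving address. Amounts are returned as
// strings because lovelace values can exceed Number.MAX_SAFE_INTEGER.
interface Utxo {
  utxo_id: string;
  tx_hash: string;
  tx_index: number;
  block_num: number;
  receiver: string;
  amount: string;
}

function sumByReceiver(utxos: Utxo[]): Map<string, string> {
  const totals = new Map<string, bigint>();
  for (const u of utxos) {
    totals.set(u.receiver, (totals.get(u.receiver) ?? BigInt(0)) + BigInt(u.amount));
  }
  // convert back to strings for display / JSON serialization
  const out = new Map<string, string>();
  for (const [addr, v] of totals) out.set(addr, v.toString());
  return out;
}
```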
account/registrationHistory
Input

{
// bech32 stake address
addresses: Array<string>
}
Output
{
[addresses: string]: Array<{|
slot: number,
txIndex: number,
certIndex: number,
certType: "StakeRegistration"|"StakeDeregistration",
|}>
}
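One common use of this history is deciding whether a stake key is currently registered. A sketch (the event shape mirrors the response above; the function name is illustrative):

```typescript
interface RegistrationEvent {
  slot: number;
  txIndex: number;
  certIndex: number;
  certType: "StakeRegistration" | "StakeDeregistration";
}

// A stake key is currently registered iff its most recent certificate
// (ordered by slot, then txIndex, then certIndex) is a StakeRegistration.
function isRegistered(history: RegistrationEvent[]): boolean {
  if (history.length === 0) return false;
  const sorted = [...history].sort(
    (a, b) => a.slot - b.slot || a.txIndex - b.txIndex || a.certIndex - b.certIndex
  );
  return sorted[sorted.length - 1].certType === "StakeRegistration";
}
```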
account/state
Input

{
// bech32 stake addresses
addresses: Array<string>
}
Output
{
[addresses: string]: null | {|
poolOperator: null, // not implemented yet
remainingAmount: string, // current remaining rewards
rewards: string, // all the rewards ever added (not implemented yet)
withdrawals: string // all the withdrawals that have ever happened (not implemented yet)
|}
}
account/rewardHistory
Input

{
// bech32 stake address
addresses: Array<string>
}
Output
{
[addresses: string]: Array<{
epoch: number,
reward: string,
poolHash: string,
}>
}
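For example, lifetime rewards for one stake address can be totaled from this history (a sketch; the entry shape mirrors the response above, and `BigInt` is used because reward values are lovelace strings):

```typescript
interface RewardEntry {
  epoch: number;
  reward: string; // lovelace, as a string
  poolHash: string;
}

// Sketch: total rewards across all epochs for a single stake address.
function totalRewards(history: RewardEntry[]): string {
  return history
    .reduce((acc, r) => acc + BigInt(r.reward), BigInt(0))
    .toString();
}
```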
pool/info
Input

{
poolIds: Array<string> // operator key (pool id)
}
Output
{
[poolId: string]: null | {|
info: {
name?: string,
description?: string,
ticker?: string,
... // other stuff from SMASH.
},
history: Array<{|
epoch: number,
slot: number,
tx_ordinal: number,
cert_ordinal: number,
payload: Certificate // see `v2/txs/history`
|}>
|}
}
pool/delegationHistory
Input

{
poolRanges: Dictionary<string, Dictionary<string, {fromEpoch: number, toEpoch?: number}>> // operator key (pool id), fromEpoch and toEpoch are inclusive
}
Output
[
{|
epoch: number;
poolHash: string;
slot: number;
tx_ordinal: number;
cert_ordinal: number;
payload: Certificate | null;
info: {
name?: string;
description?: string;
ticker?: string;
homepage?: string;
}
|}
]
txs/utxoSumForAddresses
Input

Up to 50 addresses in the request
{
addresses: Array<string>
}
Output
{
sum: ?string
}
v2/addresses/filterUsed
Input

Up to 50 addresses in the request
{
// byron addresses, bech32 address or addr_vkh
addresses: Array<string>
}
Output
Array<string>
v2/txs/history
Since short rollbacks are common (by design) in Cardano Shelley, your app needs to be ready for this. The pagination mechanism should help make this easy for you.

To handle pagination, we use `after` and `untilBlock` fields that refer to positions inside the chain. Usually, pagination works as follows:
- Query the `bestblock` endpoint to get the current tip of the chain (and call this `untilBlock`)
- Look up the last transaction your application has saved locally (and call this `after`)
- Query everything between `untilBlock` and `after`. If `untilBlock` no longer exists, re-query. If `after` no longer exists, mark the transaction as failed and re-query with an earlier transaction
- If more results were returned than the maximum responses you can receive for one query, find the most recent transaction included in the response, set this as the new `after`, and then query again (with the same value for `untilBlock`)
Note: this endpoint will throw an error if either the `untilBlock` or `after` fields no longer exist inside the blockchain (allowing your app to handle rollbacks). Notably, the error codes are:
- 'REFERENCE_BLOCK_MISMATCH'
- 'REFERENCE_TX_NOT_FOUND'
- 'REFERENCE_BEST_BLOCK_MISMATCH'
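The loop described above can be sketched as follows. `queryHistory` is a stand-in for a POST to `v2/txs/history` (injected here so the sketch stays self-contained and synchronous); rollback handling via the error codes above is omitted for brevity, the page size of 50 matches the documented maximum, and the sketch assumes responses are ordered oldest-first, as the `after`-based pagination implies:

```typescript
interface TxRef { block: string; tx: string; } // the "after" reference
interface Tx { hash: string; block_hash: string | null; }

// Sketch of the pagination loop. The last element of a full page becomes
// the next "after" reference; a short page means we reached the tip.
function fetchAllHistory(
  queryHistory: (after: TxRef | undefined, untilBlock: string) => Tx[],
  untilBlock: string,
  after?: TxRef,
  pageSize = 50
): Tx[] {
  const all: Tx[] = [];
  for (;;) {
    const page = queryHistory(after, untilBlock);
    all.push(...page);
    if (page.length < pageSize) return all;
    const newest = page[page.length - 1];
    after = { block: newest.block_hash!, tx: newest.hash };
  }
}
```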
Input
Up to 50 addresses in the request
{
// byron addresses, bech32 address, bech32 stake addresses or addr_vkh
addresses: Array<string>,
// omitting "after" means you query starting from the genesis block
after?: {
block: string, // block hash
tx: string, // tx hash
},
untilBlock: string, // block hash - inclusive
}
Output
Up to 50 transactions are returned. Use pagination with the `after` field to get more.
Array<{
// information that is only present if block is included in the blockchain
block_num: null | number,
block_hash: null | string,
tx_ordinal: null | number,
time: null | string, // timestamp with timezone
epoch: null | number,
slot: null | number,
// information that is always present
type: 'byron' | 'shelley',
hash: string,
last_update: string, // timestamp with timezone
tx_state: 'Successful' | 'Failed' | 'Pending',
inputs: Array<{ // these will be ordered by the input transaction id asc
address: string,
amount: string,
id: string, // concatenation of txHash || index
index: number,
txHash: string,
assets: Asset[]
}>,
outputs: Array<{ // these will be ordered by transaction index asc.
address: string,
amount: string,
assets: Asset[]
}>,
withdrawals: Array<{|
address: string, // hex
amount: string
|}>,
certificates: Array<{|
kind: 'StakeRegistration',
rewardAddress:string, //hex
|} | {|
kind: 'StakeDeregistration',
rewardAddress:string, // hex
|} | {|
kind: 'StakeDelegation',
rewardAddress:string, // hex
poolKeyHash: string, // hex
|} | {|
kind: 'PoolRegistration',
poolParams: {|
operator: string, // hex
vrfKeyHash: string, // hex
pledge: string,
cost: string,
margin: number,
rewardAccount: string, // hex
poolOwners: Array<string>, // hex
relays: Array<{| ipv4: string|null,
ipv6: string|null,
dnsName: string|null,
dnsSrvName: string|null,
port: string|null |}>,
poolMetadata: null | {|
url: string,
metadataHash: string, //hex
|},
|},
|} | {|
kind: 'PoolRetirement',
poolKeyHash: string, // hex
epoch: number,
|} | {|
kind: 'MoveInstantaneousRewardsCert',
rewards: { [addresses: string]: string }, // dictionary of stake addresses to their reward amounts in lovelace
pot: 0 | 1 // 0 = Reserves, 1 = Treasury
|}>
}>
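As an example of consuming this response, the net lovelace change a transaction causes for one address can be computed from its inputs and outputs (a sketch; only the fields used are typed, and assets are ignored):

```typescript
interface TxIO { address: string; amount: string; }

// Sketch: received minus spent for a given address within one transaction.
// Amounts are lovelace strings, so BigInt avoids precision loss.
function netChange(tx: { inputs: TxIO[]; outputs: TxIO[] }, address: string): string {
  const sum = (ios: TxIO[]) =>
    ios.filter(io => io.address === address)
       .reduce((acc, io) => acc + BigInt(io.amount), BigInt(0));
  return (sum(tx.outputs) - sum(tx.inputs)).toString();
}
```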
v2/bestblock
Input

None (GET request)
Output
{
// 0 if no blocks in db
height: number,
// null when no blocks in db
epoch: null | number,
slot: null | number,
hash: null | string,
}
txs/signed
Input

{
// base64 encoding of the transaction
signedTx: string,
}
Output
[]
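Building the request body amounts to base64-encoding the raw signed transaction bytes. A sketch (assumes a Node.js environment for `Buffer`):

```typescript
// Sketch: wrap raw signed-transaction bytes into the body expected by
// txs/signed. Buffer is Node.js-specific; in browsers use btoa or similar.
function buildSignedTxBody(txBytes: Uint8Array): { signedTx: string } {
  return { signedTx: Buffer.from(txBytes).toString("base64") };
}
```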
status
This endpoint is used to test whether or not the server can still be reached and get any manually flagged errors.
Input
None (GET request)
Output
{
isServerOk: boolean, // heartbeat endpoint for server. If you want the node status, use v2/importerhealthcheck instead
isMaintenance: boolean, // manually set and indicates you should disable ADA integration in your app until it returns false. Use to avoid weird app-side behavior during server upgrades.
serverTime: number, // in millisecond unix time
}
v2/importerhealthcheck
This endpoint is used to check whether or not the underlying node is properly syncing.

Input
None (GET request)
Output
200 status if things look good. Error if the node is not syncing.