No pending funding transaction found! #1205

Closed
holzeis opened this issue Sep 1, 2023 · 10 comments · Fixed by #1236
Labels: bug (Something isn't working), prod-environment

Comments


holzeis commented Sep 1, 2023

I just opened a new JIT channel on mainnet and did not pay the JIT channel-opening fee, because the funding transaction could not be found upon the channel-ready event.

The following error happened when trying to register the funding transaction to get paid.

Failed to fetch transaction: 5430bccdf192cf1df8475e9d72bb6403cd8d714e2fad064fcdc6795bffb7c44a from esplora. Error: reqwest::Error { kind: Decode, source: Error("expected value", line: 1, column: 1) } 

This results in no funding transaction getting registered, and thus the subsequent payment fails because we do not know how much in fees has to be paid.
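
For context, the failing flow is roughly the following. This is only an illustrative sketch with made-up names, not the actual ln-dlc-node code: on the channel-ready event we query esplora's `GET /tx/:txid` endpoint for the funding transaction to learn its fee, and only then can the fee be charged.

```rust
// Illustrative sketch only: fetch the funding transaction from esplora's
// `GET /tx/:txid` endpoint to learn its fee before charging the app.
use anyhow::{Context, Result};
use serde::Deserialize;

#[derive(Deserialize)]
struct EsploraTx {
    txid: String,
    // Absolute fee in sat as reported by esplora's `GET /tx/:txid`.
    fee: u64,
}

async fn fetch_funding_tx(esplora_url: &str, txid: &str) -> Result<EsploraTx> {
    let url = format!("{esplora_url}/tx/{txid}");
    reqwest::get(url)
        .await
        .context("esplora request failed")?
        .json::<EsploraTx>()
        .await
        // This is where the reported decode error surfaces if the body is not JSON.
        .with_context(|| format!("Failed to fetch transaction: {txid} from esplora"))
}
```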

holzeis added the bug (Something isn't working) and prod-environment labels on Sep 1, 2023

holzeis commented Sep 1, 2023

I think @luckysori's suggestion to set up an async task that regularly checks whether fees still have to be paid may be the less failure-prone approach here (see the sketch below).
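
A minimal sketch of what that could look like, assuming a tokio runtime; `collect_pending_jit_fees` is a hypothetical helper, not an existing function:

```rust
// Sketch of a periodic fee-collection task, assuming a tokio runtime.
// `collect_pending_jit_fees` is hypothetical and stands in for walking all
// channels with an unpaid JIT opening fee and retrying registration/payment.
use std::time::Duration;

fn spawn_fee_collector() {
    tokio::spawn(async {
        let mut interval = tokio::time::interval(Duration::from_secs(60));
        loop {
            interval.tick().await;
            if let Err(e) = collect_pending_jit_fees().await {
                tracing::warn!("Failed to collect pending JIT channel fees: {e:#}");
            }
        }
    });
}

async fn collect_pending_jit_fees() -> anyhow::Result<()> {
    // Placeholder: look up channels with outstanding fees and try to settle them.
    Ok(())
}
```

The point being that a transient esplora failure would then only delay fee collection by one tick instead of losing the fee entirely.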


holzeis commented Sep 1, 2023

This might also be fixed by running our own esplora instance.


bonomat commented Sep 4, 2023

I assume the problem was that we checked too fast for the transaction. It was eventually confirmed.
https://mempool.space/tx/5430bccdf192cf1df8475e9d72bb6403cd8d714e2fad064fcdc6795bffb7c44a

I wonder if we should bother with this or try to build out the proper LSP flow #843?

bonomat assigned and then unassigned himself on Sep 5, 2023

bonomat commented Sep 5, 2023

Unassigning myself again. I believe this requires a different fix than just using our self-hosted esplora instance.
I propose to


holzeis commented Sep 5, 2023

> I assume the problem was that we checked too fast for the transaction. It was eventually confirmed.

The broadcast should have happened over the same esplora endpoint, and it was definitely broadcast before we made the query. Why do you think that we checked too fast? Shouldn't we also be able to fetch unconfirmed transactions?


bonomat commented Sep 5, 2023

Exactly, the broadcasting worked, so it should have picked up the tx from the mempool. My assumption, though, is that there was a race between broadcasting and querying for the transaction.

For the log above, do you have the timestamp it was printed?

I have the same error, and the timestamps are dangerously close together:

in the coordinator

{"timestamp":"2023-09-01T07:19:53.969466643Z","level":"INFO","fields":{"message":"Broadcasting transaction","txid":"2b5ff594f8318ead5a79cb3eba978f9a8ba62e9a313e05fd728f4558b03e2792","raw_tx":"....."},"target":"ln_dlc_node::ldk_node_wallet"}

in the app:

2023-09-01 10:19:54.251  LogLevel.ERROR  Failed to fetch transaction: 2b5ff594f8318ead5a79cb3eba978f9a8ba62e9a313e05fd728f4558b03e2792 from esplora. Error: reqwest::Error { kind: Decode, source: Error("expected value", line: 1, column: 1) }   

The difference in time is about 300 ms. It may very well be that we tried to fetch the transaction before blockstream.info was done processing the broadcast and therefore did not find it yet.
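
If that is the case, a small retry with backoff on the fetching side would bridge the gap. A rough sketch under that assumption; `fetch_tx` only stands in for the actual esplora query:

```rust
// Sketch: retry the esplora lookup with exponential backoff to bridge the gap
// between broadcasting a transaction and it becoming queryable.
use std::time::Duration;

async fn fetch_tx_with_retry(txid: &str, attempts: u32) -> anyhow::Result<serde_json::Value> {
    let mut delay = Duration::from_millis(500);
    let mut last_err = None;
    for _ in 0..attempts {
        match fetch_tx(txid).await {
            Ok(tx) => return Ok(tx),
            Err(e) => {
                last_err = Some(e);
                tokio::time::sleep(delay).await;
                delay *= 2; // 500 ms, 1 s, 2 s, ...
            }
        }
    }
    Err(last_err.unwrap_or_else(|| anyhow::anyhow!("no attempts made")))
}

// Stand-in for the actual esplora query (`GET /tx/:txid`).
async fn fetch_tx(txid: &str) -> anyhow::Result<serde_json::Value> {
    let url = format!("https://blockstream.info/api/tx/{txid}");
    Ok(reqwest::get(url).await?.json().await?)
}
```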

bonomat self-assigned this on Sep 5, 2023

holzeis commented Sep 5, 2023

My guess is that we ran into a 429 Too Many Requests error. But we should probably log the response we are getting to be sure.


bonomat commented Sep 6, 2023

I don't think so; we would have seen this in the log, as shown here: #658 (comment)


holzeis commented Sep 6, 2023

> I don't think so, we would have seen this in the log as shown here: #658 (comment)

I don't think so; look at the error:

Error: reqwest::Error { kind: Decode, source: Error("expected value", line: 1, column: 1) }

The response couldn't be parsed. We do not check whether the HTTP request was actually successful. My guess is that if we did check, we would have seen a 429 error.
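
For illustration, a sketch of the kind of check meant here (not the existing code): inspect the status before decoding and include the raw body of failed responses in the error, so a 429 page or a "Transaction not found" becomes visible.

```rust
// Sketch: check the HTTP status before trying to decode the body as JSON,
// and surface the raw body of failed responses in the error.
use anyhow::{bail, Context, Result};

async fn fetch_tx_checked(esplora_url: &str, txid: &str) -> Result<serde_json::Value> {
    let response = reqwest::get(format!("{esplora_url}/tx/{txid}"))
        .await
        .context("esplora request failed")?;

    let status = response.status();
    if !status.is_success() {
        let body = response.text().await.unwrap_or_default();
        bail!("esplora returned {status} for {txid}: {body}");
    }

    response
        .json()
        .await
        .context("failed to decode esplora response")
}
```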


bonomat commented Sep 6, 2023

> Error: reqwest::Error { kind: Decode, source: Error("expected value", line: 1, column: 1) }

Well, I guess we won't be able to determine who is right or wrong, because in both cases a response is returned that cannot be parsed as JSON.

In case of 429 we get:

<html>\r\n<head><title>429 Too Many Requests</title></head>\r\n<body>\r\n<center><h1>429 Too Many Requests</h1></center>\r\n<hr><center>nginx</center>\r\n</body>\r\n</html>\r\n

in case of 404 (tx not found), we get

Transaction not found
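
For what it's worth, a sketch (not the project's code) of how the two cases could be told apart by status code rather than by body:

```rust
// Sketch: map esplora's 404 and 429 responses to distinct errors instead of
// letting both surface as a generic JSON decode failure.
use reqwest::StatusCode;

#[derive(Debug)]
enum EsploraError {
    TxNotFound,
    RateLimited,
    Other(StatusCode, String),
}

async fn classify(response: reqwest::Response) -> Result<serde_json::Value, EsploraError> {
    match response.status() {
        StatusCode::NOT_FOUND => Err(EsploraError::TxNotFound), // "Transaction not found"
        StatusCode::TOO_MANY_REQUESTS => Err(EsploraError::RateLimited), // nginx 429 page
        status if status.is_success() => response
            .json()
            .await
            .map_err(|e| EsploraError::Other(status, e.to_string())),
        status => Err(EsploraError::Other(
            status,
            response.text().await.unwrap_or_default(),
        )),
    }
}
```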

So let's just hope this was fixed in #1236 or when using our own esplora instance :)

p.s. I wrote a little client and bombarded them with requests... now I believe they blacklisted me forever 😅

Caused by:
    0: error trying to connect: dns error: failed to lookup address information: nodename nor servname provided, or not known
    1: dns error: failed to lookup address information: nodename nor servname provided, or not known
    2: failed to lookup address information: nodename nor servname provided, or not known
