TLV JS: Introduce JS library for TLV #5184
This is definitely heading in the right direction. Thanks again for continuing to contribute here!
I will say I think it's worth matching a bit more of the behavioral aspects of the Rust libraries. Namely, you wrote:

"Parsing still succeeds if the tlv data is not the right length. entry just returns null for a partial last entry and entries just excludes the last partial entry."

In Rust, this would fail to parse, so we probably want that to happen in JS as well. We might even need to introduce some errors similar to the Rust ones.
solana-program-library/libraries/type-length-value/src/state.rs
Lines 143 to 145 in f35dc5f
```rust
if tlv_data.len() < value_end {
    return Err(ProgramError::InvalidAccountData);
}
```
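To sketch what mirroring that Rust behavior could look like in JS (the names TlvInvalidAccountDataError and checkEntryBounds are hypothetical, not part of this PR):

```typescript
// Hypothetical sketch: mirror Rust's ProgramError::InvalidAccountData by
// throwing on truncated TLV data instead of silently skipping the partial
// last entry.
export class TlvInvalidAccountDataError extends Error {
    constructor() {
        super('Invalid account data');
        this.name = 'TlvInvalidAccountDataError';
    }
}

export function checkEntryBounds(tlvData: Buffer, valueEnd: number): void {
    // Same bounds check as the Rust snippet above
    if (tlvData.length < valueEnd) {
        throw new TlvInvalidAccountDataError();
    }
}
```

A dedicated error class keeps the failure distinguishable from generic errors, similar to the specific ProgramError variants on the Rust side.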
I also think we should try to use similar nomenclature for the class and method names to avoid confusion, i.e. TlvState, get_first_bytes, etc. Obviously we don't need the separated mut API in here, though.
token/js/src/extensions/tlvData.ts (outdated):
```typescript
function readTLVNumberSize<T>(
    buffer: Buffer,
    size: TLVNumberSize,
    offset: number,
    constructor: (x: number | bigint) => T
): T {
    switch (size) {
        case 1:
            return constructor(buffer.readUInt8(offset));
        case 2:
            return constructor(buffer.readUInt16LE(offset));
        case 4:
            return constructor(buffer.readUInt32LE(offset));
        case 8:
            return constructor(buffer.readBigUInt64LE(offset));
    }
}
```
This is a neat function, but our TLV library is using u32 for length and Token2022 is using u16, so we can probably do away with handling u64 and eliminate the need for bigint. What do you think?

```rust
pub struct Length(PodU16); // Token2022
pub struct Length(PodU32); // TLV library
```
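A sketch of what the simplified length reader could look like once u64/bigint support is dropped (LengthSize and readLength are illustrative names, not the PR's actual API):

```typescript
// Illustrative only: with lengths limited to u16 (Token2022) or u32 (the TLV
// library), a plain JS number is always sufficient and bigint can be dropped.
type LengthSize = 2 | 4;

function readLength(buffer: Buffer, size: LengthSize, offset: number): number {
    return size === 2 ? buffer.readUInt16LE(offset) : buffer.readUInt32LE(offset);
}
```

Narrowing the size union also lets TypeScript prove the switch/ternary is exhaustive, so no fallthrough case is needed.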
But the discriminator could still be 8 bytes, right? This function is currently used for parsing both the discriminator and the length.
Yes, but it's not a u64, it's just an array, so it doesn't need to be represented as a bigint.
Yep, that makes sense. This was only added because you can then supply a BigInt or number to the entry function as well as a Buffer.
I can see that you're going for a more generalized approach here, and I'm not sure this idea I'm going to suggest will be as slick as I imagine it, but what do you think about instead using an interface and/or super class that together mock the trait? That's what a lot of these tests are doing: just creating objects that implement it. In JS, I understand we would have to somehow replicate Rust's trait behavior.
Agreed. Let's keep it similar to the Rust implementation. I also think it makes sense to keep the naming the same.
I think it would definitely make sense to add some form of SplDiscriminate here. I think probably the easiest (and fastest) implementation could be a single function that takes a discriminator string and turns it into bytes/bigint/number, which you can then throw into the entry function:

```typescript
export function splDiscriminator(key: string, length = 8): Buffer {
    // TODO: make sure this also works in browser without needing to polyfill node crypto/buffer?
    const digest = createHash('sha256').update(key).digest();
    return digest.subarray(0, length);
}
```
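A quick usage sketch of that function; the key string below is only illustrative, and the function is repeated here so the example is self-contained:

```typescript
import { createHash } from 'crypto';

// Same shape as the snippet above: a discriminator is the first `length`
// bytes of the SHA-256 digest of the key string.
function splDiscriminator(key: string, length = 8): Buffer {
    const digest = createHash('sha256').update(key).digest();
    return digest.subarray(0, length);
}

// Example key (illustrative, not taken from this PR)
const disc = splDiscriminator('spl-example-interface:execute');
// disc is an 8-byte Buffer: the first 8 bytes of the key's SHA-256 hash
```

Returning a Buffer keeps the result easy to compare byte-for-byte against on-chain account data.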
Ok, I think we managed to fix the CI! Let me know when you're ready for another review.
Yes, I like this approach! I think that's perfect. The more I think about it, the more a generalized approach as you've written makes sense, because as I mentioned above we want to not do what we did in Rust for Token2022 and TLV, and instead share all TLV operations across this library. If Token2022 weren't being audited, I'd vote to do this in Rust as well using the Rust TLV lib, and modify the Rust code accordingly.
Can we roll a separate library for this stuff, rather than working it into Token JS? I think after some of our discussion it makes sense to introduce new ones, and then they can be imported into the metadata library as well as token (#5228). For now, let's just keep the SplDiscriminate and TLV stuff in one JS library. We could probably add a folder to the type-length-value library for it.
I think that makes sense. What location in the repository would you suggest for that? There is the libraries directory, though none of the libraries in it currently ship a JS package.
You're right, but that's only because none of those libraries have any JS counterparts 😂 I'd say just stick a js folder in there.
This is coming together nicely! I left comments mostly around simplifying the provided discriminator(s). I think if we can strip some of that complexity away, we'll be in good shape.
Thanks again!
LGTM. Thanks for putting this together and addressing all the feedback!
Resolves #5127.
In this PR I have isolated parsing of TLV data into its own file. For ticket #5127, I think this is fine for now, as it allows us to very quickly extract TLV data parsing into its own library if we ever need it.
The TLVData class adds two helper functions for reading TLV data:

- entry(type) returns the first entry where the type matches the supplied type (preferred)
- entries() returns a map of all entries in the TLV data

Parsing still succeeds if the TLV data is not the right length: entry just returns null for a partial last entry, and entries just excludes the last partial entry.

The TLVData class also allows you to specify the length/span of both the type and the length part of the TLV, which should make parsing TLV data much easier in the future.
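As a standalone illustration of that lenient behavior (this is a simplified sketch, not the PR's actual class; one-byte type and length fields keep the example short, whereas the real class lets you configure both spans):

```typescript
// Simplified sketch: parse TLV entries where both the type and the length
// field are a single byte. A truncated final entry is excluded rather than
// causing a failure, matching the behavior described above.
function entries(tlvData: Buffer): Map<number, Buffer> {
    const result = new Map<number, Buffer>();
    let offset = 0;
    while (offset + 2 <= tlvData.length) {
        const type = tlvData.readUInt8(offset);
        const length = tlvData.readUInt8(offset + 1);
        const valueEnd = offset + 2 + length;
        if (valueEnd > tlvData.length) break; // partial last entry: excluded
        result.set(type, tlvData.subarray(offset + 2, valueEnd));
        offset = valueEnd;
    }
    return result;
}
```

Note this is the permissive variant; per the review discussion above, the break could instead throw to match the stricter Rust behavior.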
I added a couple of simple tests for parsing and reading TLV data.
This new class is not actually being used yet anywhere in the code. I plan to create (at least) two follow up PRs that switch out the old implementation for this new one once this PR is merged:
- getExtensionData in ./token/js/extension/extensionType.ts
- getExtraAccountMetaAccount in ./token/js/extension/transferHook/state.ts