Replies: 2 comments
-
Same issue here.
-
Recently my mentor asked me to deploy a transformer model on an FPGA using hls4ml.
I am not sure this is feasible, because the model is large and may need to buffer a large amount of intermediate data.
So, is it possible, or worth trying?
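The buffering concern can be made concrete with a rough back-of-envelope estimate of the largest intermediate tensors in one encoder layer, compared against typical FPGA on-chip memory. This is only a sketch; all model dimensions below are hypothetical examples, not taken from the discussion:

```python
# Rough estimate of on-chip buffer needs for one transformer encoder layer.
# All dimensions are hypothetical placeholders for illustration.
seq_len = 128   # tokens per inference
d_model = 256   # embedding width
d_ff = 1024     # feed-forward hidden width
bits = 16       # fixed-point width, e.g. ap_fixed<16,6> as commonly used in hls4ml

# Largest intermediate tensors that may need buffering on chip:
attn_scores = seq_len * seq_len   # attention score matrix (per head)
ff_hidden = seq_len * d_ff        # feed-forward hidden activations
activations = seq_len * d_model   # layer input/output activations

total_bits = (attn_scores + ff_hidden + activations) * bits
total_kib = total_bits / 8 / 1024
print(f"~{total_kib:.0f} KiB of intermediate buffer per layer")  # ~352 KiB
```

Even these modest example dimensions already require hundreds of KiB per layer, so for a large model the intermediate buffers can quickly exceed the BRAM of a mid-range FPGA, which is why sequence length, precision, and per-layer reuse matter so much for feasibility.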