[Example Request] #4667
Comments
HubGab-Git added five commits to HubGab-Git/amazon-sagemaker-examples that referenced this issue on Oct 6, 2024.
Hi @math-sasso, I’ve created a notebook for training the Flan T5 Small model to keep costs low, since I’m doing this on my personal account for training purposes. Please find the notebook at the link below and let me know what you think. Feel free to ask for any further adjustments!
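For context, here is a minimal sketch of the core training cell such a notebook might contain. It assumes the SageMaker Python SDK's JumpStartEstimator accepts tensorboard_output_config like the core estimators do, and that the training container writes TensorBoard event files under the configured local path; the model ID, instance type, channel name, and S3 paths are placeholders, not the actual notebook's values:

```python
import sagemaker
from sagemaker.debugger import TensorBoardOutputConfig
from sagemaker.jumpstart.estimator import JumpStartEstimator

session = sagemaker.Session()
bucket = session.default_bucket()

# Where SageMaker uploads the TensorBoard event files written by the training container.
tensorboard_config = TensorBoardOutputConfig(
    s3_output_path=f"s3://{bucket}/jumpstart-flan-t5-small/tensorboard",
    container_local_output_path="/opt/ml/output/tensorboard",
)

# Flan-T5 Small keeps instance costs low while exercising the same fine-tuning workflow.
estimator = JumpStartEstimator(
    model_id="huggingface-text2text-flan-t5-small",  # placeholder model ID; a Llama 3 ID could be swapped in
    instance_type="ml.g5.xlarge",                    # placeholder instance type
    tensorboard_output_config=tensorboard_config,    # assumption: forwarded to the training job
)

# Channel name and dataset path are placeholders; each JumpStart model documents its expected channels.
estimator.fit({"training": f"s3://{bucket}/datasets/my-finetune-data/"})
```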
Describe the use case example you want to see
I would like to know how I can use TensorBoard with SageMaker JumpStart. Could you please provide an example?
It would be amazing if the example specifically trained a Llama 3 model (8B is fine for testing purposes).
How would this example be used? Please describe.
I am training my models but cannot get the loss curves. I would like to export them to TensorBoard.
Describe which SageMaker services are involved
SageMaker JumpStart
Describe what other services (other than SageMaker) are involved
TensorBoard
Describe which dataset could be used. Provide its location in s3://sagemaker-sample-files or another source.
You can use your own example datasets.
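Once a training job configured this way finishes, one way to inspect the loss curves is to download the event files from the configured s3_output_path and point TensorBoard at the local directory. This is a sketch under the assumption that the container actually emitted event files there; the bucket and prefix are placeholders matching the sketch above:

```python
from sagemaker.s3 import S3Downloader

# Download the TensorBoard event files produced by the training job (placeholder URI).
S3Downloader.download(
    s3_uri="s3://<your-bucket>/jumpstart-flan-t5-small/tensorboard",
    local_path="./tb-logs",
)

# Then launch TensorBoard against the downloaded logs, e.g. from a terminal:
#   tensorboard --logdir ./tb-logs --port 6006
```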