
Always set 0 as the port for distributed training (?) #468

Open
frostedoyster opened this issue Jan 31, 2025 · 1 comment
Labels
Discussion Issues to be discussed by the contributors

Comments

@frostedoyster
Collaborator

"In Linux, if you specify 0 as the port number when creating a socket, the system will automatically assign an available port from the dynamic or ephemeral port range."

Source: ChatGPT

That would allow us to remove one parameter for models that can do distributed training (e.g. )
@Luthaf
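
The quoted behavior is easy to verify with a short sketch using only Python's standard `socket` module (not tied to any training code):

```python
import socket

# Bind to port 0: the OS assigns a free port from the ephemeral range.
sock = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
sock.bind(("", 0))
assigned_port = sock.getsockname()[1]  # the port the OS actually chose
print(f"OS-assigned port: {assigned_port}")
sock.close()
```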

frostedoyster added the Discussion label on Jan 31, 2025
@Luthaf
Member

Luthaf commented Feb 3, 2025

That sounds like a good idea; ideally we would print the port we picked somewhere in the output?

We could also keep the option to manually specify a port in the hypers, but default it to 0.
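
For illustration, a minimal sketch of that defaulting scheme (the `resolve_port` helper and the hypers handling are hypothetical, not the project's actual code):

```python
import logging
import socket


def resolve_port(requested_port: int = 0) -> int:
    """Return the port to use for distributed training.

    A value of 0 means "let the OS pick a free ephemeral port";
    any other value is used as-is (manual override from the hypers).
    """
    if requested_port != 0:
        return requested_port
    # Ask the OS for a free port, then release it so the distributed
    # backend can bind it itself. Note the small race window: another
    # process could grab the port between close() and the re-bind.
    with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as sock:
        sock.bind(("", 0))
        port = sock.getsockname()[1]
    logging.info("Using port %d for distributed training", port)
    return port
```

The resolved port could then be handed to the backend, e.g. exported as `MASTER_PORT` before `torch.distributed` initializes with the `env://` rendezvous; the log line covers the request above to print the picked port.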
