Hey @loong - love this repo and it's helping me improve my Go skills.
After adding what I think is the desired solution, the tests continue to fail, even when I include a 5-second delay.
```diff
 func Crawl(url string, depth int, wg *sync.WaitGroup) {
+	rateLimit := time.Second * 5
+	throttle := time.NewTicker(rateLimit)
+	defer throttle.Stop()
 	defer wg.Done()

 	if depth <= 0 {
 		return
 	}

 	body, urls, err := fetcher.Fetch(url)
 	if err != nil {
 		fmt.Println(err)
 		return
 	}

 	fmt.Printf("found: %s %q\n", url, body)

 	wg.Add(len(urls))
 	for _, u := range urls {
 		// Do not remove the `go` keyword, as Crawl() must be
 		// called concurrently
+		<-throttle.C // rate limit client calls
 		go Crawl(u, depth-1, wg)
 	}
 	return
 }
```
Go version:
go version go1.21.0 darwin/amd64
Please let me know if I'm missing something here. Thanks!
@p-duke Hi! Have you figured it out by now?
The issue is that the throttle ticker gets initialized each time Crawl() is called, meaning you are creating a brand-new ticker on every call!
You can initialize one ticker and pass it down to Crawl() as a parameter. Here is an example: https://github.com/loong/go-concurrency-exercises/pull/19/files
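For reference, here is a minimal sketch of that approach (not the exact contents of the PR): the ticker is created once in main and the same instance is handed to every recursive Crawl call, so all goroutines block on one shared rate limit. `fetcher` is the mock fetcher already defined in the exercise, and the tick interval below is just an illustrative value.

```go
package main

import (
	"fmt"
	"sync"
	"time"
)

// Crawl receives the shared ticker as a parameter instead of creating its own.
func Crawl(url string, depth int, wg *sync.WaitGroup, throttle *time.Ticker) {
	defer wg.Done()

	if depth <= 0 {
		return
	}

	body, urls, err := fetcher.Fetch(url)
	if err != nil {
		fmt.Println(err)
		return
	}
	fmt.Printf("found: %s %q\n", url, body)

	wg.Add(len(urls))
	for _, u := range urls {
		<-throttle.C // every goroutine waits on the same shared ticker
		go Crawl(u, depth-1, wg, throttle)
	}
}

func main() {
	// Create the ticker once and pass it down to every Crawl call.
	throttle := time.NewTicker(500 * time.Millisecond)
	defer throttle.Stop()

	var wg sync.WaitGroup
	wg.Add(1)
	go Crawl("http://golang.org/", 4, &wg, throttle)
	wg.Wait()
}
```

With the ticker created once, the recursive calls no longer reset the rate limit, which is why the per-call ticker in the original snippet never actually throttled anything.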