Parallelize time-consuming parts when there are many local branches #130
base: main
Conversation
It's good that this will be faster. The reason I put off parallelization is that the GitHub API has a rate limit, and an error may occur if there are too many requests. I'll check the specifications around that.
Yes, I had the same concern too. However, the GitHub API queries that fetch the corresponding PRs for a set of commit hashes are already chunked into groups of six hashes because of the variable length limit (256 characters) in the GraphQL schema. So, even with ~30 local branches, the number of actual GitHub API requests would be fewer than 10 for a single execution of the command. I think this shouldn't be an issue in practice as long as the user is logged in, unless the user runs the command repeatedly in a short period.
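For illustration, the chunking described above could look like the sketch below. The `chunkHashes` helper, its signature, and the `main` example are assumptions made for this sketch; the actual behaviour lives in `shared.GetQueryHashes` in this repository.

```go
package main

import "fmt"

// chunkHashes splits commit hashes into groups of at most chunkSize so that
// each resulting GraphQL search query stays under the 256-character limit.
func chunkHashes(hashes []string, chunkSize int) [][]string {
	var chunks [][]string
	for len(hashes) > 0 {
		n := chunkSize
		if len(hashes) < n {
			n = len(hashes)
		}
		chunks = append(chunks, hashes[:n])
		hashes = hashes[n:]
	}
	return chunks
}

func main() {
	// ~30 branches -> 30 head commit hashes -> 5 requests with groups of 6
	hashes := make([]string, 30)
	for i := range hashes {
		hashes[i] = fmt.Sprintf("hash%02d", i)
	}
	fmt.Println(len(chunkHashes(hashes, 6))) // 5
}
```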
Parallelization looks good!
The API rate limit seems to be affected mainly by the number of requests per minute, but I don't think it will cause an error unless there is a very large number of local branches.
https://docs.github.com/en/graphql/overview/rate-limits-and-node-limits-for-the-graphql-api#secondary-rate-limits
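If the secondary rate limits ever did become a problem, one possible mitigation (not part of this PR; `queryPullRequests` and the result fields are hypothetical stand-ins) would be to cap the number of in-flight requests with a buffered-channel semaphore:

```go
// queryWithLimit is a sketch: it bounds how many PR queries run concurrently.
// Assumes "context" and "sync" are imported; queryPullRequests stands in for
// the real GraphQL call.
func queryWithLimit(ctx context.Context, queryHashes []string, limit int) []pullRequestResult {
	sem := make(chan struct{}, limit)
	results := make(chan pullRequestResult, len(queryHashes))

	var wg sync.WaitGroup
	for _, hashes := range queryHashes {
		wg.Add(1)
		go func(hashes string) {
			defer wg.Done()
			sem <- struct{}{}        // acquire a slot
			defer func() { <-sem }() // release it when done
			results <- queryPullRequests(ctx, hashes)
		}(hashes)
	}
	wg.Wait()
	close(results)

	var all []pullRequestResult
	for r := range results {
		all = append(all, r)
	}
	return all
}
```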
}
oids, err := connection.GetLog(ctx, branch.Name)
if err != nil {
	return
If the git process returns an error, the user should be informed of the error.
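One way to address this (a sketch only; the result struct and its `err` field are assumed rather than taken from the actual diff) is to send the error back over the goroutine's result channel so the caller can report it to the user:

```go
oids, err := connection.GetLog(ctx, branch.Name)
if err != nil {
	// surface the git error to the caller instead of silently dropping it
	results <- pullRequestResult{err: err}
	return
}
```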
}

queryHashes := shared.GetQueryHashes(branches)
results := make(chan pullRequestResult, len(queryHashes))
I want to unify the naming with the goroutine below, so how about calling it something like `prChan`?
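To make the suggestion concrete, here is a simplified sketch of the fan-out with the proposed name; `queryPullRequests` and the result fields are placeholders, not the actual code in this diff:

```go
queryHashes := shared.GetQueryHashes(branches)
prChan := make(chan pullRequestResult, len(queryHashes)) // was `results`

for _, hashes := range queryHashes {
	go func(hashes string) {
		// each goroutine sends exactly one value, matching the buffer size
		prChan <- queryPullRequests(ctx, hashes)
	}(hashes)
}

for range queryHashes {
	r := <-prChan
	if r.err != nil {
		// handle or collect the error
	}
	// collect r.prs ...
}
```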
I have more than 30 local branches that I'm working with.
This is partly because we are using Graphite.dev to make stacked, fine-grained, small-sized PRs, increasing the overall number of PRs.
After inserting simple `Println()` calls around the code, I found that `applyCommits()` and querying the PRs by commit hashes from GitHub were the two major bottlenecks with a large number of local branches. This PR simply parallelizes them using goroutines and wait groups, as sketched below.
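As a rough illustration of the approach (not the exact diff; the real signature of `applyCommits` and the branch type are assumed here), the per-branch work can be fanned out with a wait group:

```go
var wg sync.WaitGroup
for i := range branches {
	wg.Add(1)
	go func(branch *shared.Branch) {
		defer wg.Done()
		applyCommits(ctx, branch, connection) // hypothetical per-branch call
	}(&branches[i])
}
wg.Wait() // wait for every branch before moving on to the PR queries
```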
A simple benchmark:
Before:
After:
You can see that most of the time was spent waiting for I/O.