Performance improvements for fetching large changelogs #31990
isaac57 started this conversation in Suggest an Idea
Changelog performance issues and improvements
I am running a self-hosted Renovate instance inside Docker using the `latest` tag. We are targeting repositories in Bitbucket Cloud. Periodically we see a huge increase in runtime for Renovate: our runs typically average about 10 minutes, but they can exceed an hour when certain PRs are open. Checking the logs, I can see that fetching changelogs for certain dependencies takes a long time if those dependencies are far out of date (I've seen up to 40 minutes for a single dependency). And since Renovate re-fetches the changelogs on every run for an open PR (I assume to make sure the changelogs weren't updated), this creates a bottleneck and inefficiencies on our Renovate server. I can think of a few options to resolve this:
1.) Add a timeout when fetching changelogs
2.) Don't re-fetch the changelogs unless the PR needs to be updated for another reason
3.) Add a limit on the number of releases Renovate checks for changelogs. For example, if we are upgrading from version 1.0.1 to 1.0.100 (we are behind by 100 releases), only fetch changelogs from the most recent x releases. A rough sketch of this idea follows the list.
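To make options 1.) and 3.) concrete, here is a minimal TypeScript sketch. It is not Renovate's actual code: the `Release` type and the `fetchChangelogForRelease` stub are hypothetical placeholders for whatever retrieves a single release's notes. The idea is simply to cap the number of releases whose changelogs are fetched and to bound the total time spent per dependency:

```typescript
// Hypothetical sketch only -- not Renovate's internals.
interface Release {
  version: string;
}

// Placeholder for whatever actually retrieves one release's changelog
// (e.g. a call to the GitHub/Bitbucket releases API).
async function fetchChangelogForRelease(release: Release): Promise<string | null> {
  return null;
}

async function fetchChangelogs(
  releases: Release[],       // assumed ordered oldest -> newest
  maxReleases = 10,          // option 3: only the most recent N releases
  timeoutMs = 60_000,        // option 1: overall time budget per dependency
): Promise<Map<string, string>> {
  const results = new Map<string, string>();
  const deadline = Date.now() + timeoutMs;

  // Only look at the newest `maxReleases` releases.
  for (const release of releases.slice(-maxReleases)) {
    if (Date.now() > deadline) {
      // Give up gracefully instead of stalling the whole run.
      break;
    }
    const notes = await fetchChangelogForRelease(release);
    if (notes) {
      results.set(release.version, notes);
    }
  }
  return results;
}
```

Option 2.) would sit a level above this: skip the fetch entirely when the open PR doesn't otherwise need to be rebuilt.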