Improve retry mechanism (#194)

When a ChunkedEncodingError occurs, request and response are not set, so there is no way to get the URL that caused the error.
With this change all URLs are retried. The max_retries parameter is decreased on each attempt so that we do not get stuck in an infinite loop.

I also considered waiting before retrying, but for now I don't see any benefit to it.

Relates to #188.
This commit is contained in:
Marc Wrobel
2023-11-26 19:00:32 +01:00
committed by GitHub
parent 37683f9677
commit 1e65a048b0
2 changed files with 18 additions and 23 deletions


@@ -33,6 +33,8 @@ print(f"::group::{PRODUCT}")
all_versions = {}
next_page_url = URL
# Do not try to fetch multiple pages in parallel: it raises a lot of ChunkedEncodingErrors and
# makes the overall process slower.
while next_page_url:
next_page_url = fetch_releases(all_versions, next_page_url)
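The retry mechanism described above can be sketched as follows. This is a hypothetical illustration, not the repository's actual `fetch_releases` implementation: the page-fetching callable is injected and the exception class is a stand-in for `requests.exceptions.ChunkedEncodingError`, so the decreasing `max_retries` budget can be shown in isolation.

```python
class ChunkedEncodingError(Exception):
    """Stand-in for requests.exceptions.ChunkedEncodingError."""


def fetch_with_retry(fetch_page, url, max_retries=5):
    """Call fetch_page(url), retrying the same URL on ChunkedEncodingError.

    max_retries is decremented on every retry, so the recursion is
    bounded and cannot turn into an infinite loop. Once the budget is
    exhausted, the error is re-raised to the caller.
    """
    try:
        return fetch_page(url)
    except ChunkedEncodingError:
        if max_retries <= 0:
            raise  # give up instead of retrying forever
        return fetch_with_retry(fetch_page, url, max_retries - 1)
```

Because every URL goes through the same wrapper, a transient error on any page is retried, while a persistently failing page still surfaces its exception after the budget runs out.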