When a ChunkedEncodingError occurs, `request` and `response` are not set, so there is no way to tell which URL caused the error. With this change, all URLs are retried. The max_retries parameter is decreased on each attempt so that we do not get stuck in an infinite loop. I also considered waiting before retrying, but for now I don't see any benefit to it. Relates to #188.
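The retry behaviour described above can be sketched as follows. This is a minimal, hypothetical illustration, not the repository's actual code: `fetch_with_retries` and the `fetch` callable are made-up names, and a local `ChunkedEncodingError` class stands in for `requests.exceptions.ChunkedEncodingError` so the sketch is self-contained.

```python
class ChunkedEncodingError(Exception):
    """Stand-in for requests.exceptions.ChunkedEncodingError."""


def fetch_with_retries(fetch, url, max_retries=3):
    """Retry a flaky fetch, decrementing max_retries on each failure.

    Hypothetical sketch of the commit's idea: because the counter is
    decreased every time, a permanently failing URL re-raises instead
    of looping forever.
    """
    while True:
        try:
            return fetch(url)
        except ChunkedEncodingError:
            if max_retries <= 0:
                raise  # retries exhausted: propagate the error
            max_retries -= 1
```

A transient failure that succeeds within the budget returns normally; once `max_retries` hits zero, the exception propagates to the caller.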
@@ -33,6 +33,8 @@ print(f"::group::{PRODUCT}")

 all_versions = {}
 next_page_url = URL

+# Do not try to fetch multiple pages in parallel: it raises a lot of ChunkedEncodingErrors and
+# makes the overall process slower.
 while next_page_url:
     next_page_url = fetch_releases(all_versions, next_page_url)