Replies: 2 comments
-
You can enqueue it again with a different `unique_key`.
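For illustration, a minimal sketch of that approach, assuming a recent Crawlee for Python where crawlers live in `crawlee.crawlers` (the crawler class, the `div.content` check, and the `#manual-retry` suffix are placeholder choices, not part of the library):

```python
from crawlee import Request
from crawlee.crawlers import ParselCrawler, ParselCrawlingContext

crawler = ParselCrawler()

@crawler.router.default_handler
async def handler(context: ParselCrawlingContext) -> None:
    # Placeholder check: decide in whatever way fits your site that the
    # page did not download properly.
    if not context.selector.css('div.content'):
        # A unique_key that differs from the one already seen makes the
        # deduplication layer accept the same URL again, as a brand-new
        # request with a fresh retry counter.
        retry = Request.from_url(
            context.request.url,
            unique_key=f'{context.request.url}#manual-retry',
        )
        await context.add_requests([retry])
```

The drawback is that you have to invent the new unique key yourself and make sure it has not been used before; the `always_enqueue` option mentioned in the next comment automates exactly that.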
-
I am afraid this is not possible.
You can use `request = Request.from_url('https://crawlee.dev/', always_enqueue=True)`, which will generate a "unique" unique key for every such request.
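A short sketch of that, with the enqueue call shown as a comment because it depends on where in your code you construct the request:

```python
from crawlee import Request

# Per the answer above: always_enqueue=True makes Crawlee generate a "unique"
# unique key for this request, so the deduplication layer will not suppress
# the URL even though it has been processed before. The re-enqueued request
# starts over with a fresh retry counter; the old counter is not carried
# over or decremented.
request = Request.from_url('https://crawlee.dev/', always_enqueue=True)

# Inside a request handler you would then hand it to the queue, e.g.:
# await context.add_requests([request])
```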
-
Crawlee optimizes requests by suppressing redundant URLs and giving up on a URL after reaching a configurable retry limit. This is great, but I encountered an edge case.
Situation: You're extracting data from a page and realize that it wasn’t downloaded properly.
Ideal Goal: Re-queue the URL while decrementing its retry counter.
Alternative: Add the URL back to the queue with a fresh retry counter.
How can I achieve either of these? A sketch of the spot where this comes up is below.
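To make the edge case concrete, here is roughly where it happens (the crawler class and the selector are just examples):

```python
from crawlee.crawlers import ParselCrawler, ParselCrawlingContext

crawler = ParselCrawler()

@crawler.router.default_handler
async def handler(context: ParselCrawlingContext) -> None:
    rows = context.selector.css('table.results tr')  # example selector
    if not rows:
        # The HTTP request itself succeeded, so Crawlee considers the URL
        # handled, but the page clearly did not download properly. This is
        # the point where I would like to send context.request back into
        # the queue.
        ...
```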