Problem
Pagination methods have hardcoded sleep durations:
- searchAll(): 1000ms between pages
- getUserTweetsAll(): 1000ms between pages
- getBookmarksAll(): 500ms between pages
These values are baked into the code. Users who want faster scraping (at the risk of rate limits) or slower scraping (for safety) have no way to adjust.
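To illustrate the issue, here is a minimal sketch of what such a hardcoded delay looks like inside a pagination loop. The function shape, the `Tweet` type, and the `fetchPage` callback are assumptions for illustration, not the library's real API:

```typescript
// Hypothetical sketch (names assumed, not the actual x-reader internals).
type Tweet = { id: string; text: string };
type Page = { tweets: Tweet[]; next?: string };

const sleep = (ms: number) => new Promise<void>((resolve) => setTimeout(resolve, ms));

// Paginate until there is no next cursor, sleeping a fixed amount between pages.
async function searchAll(fetchPage: (cursor?: string) => Promise<Page>): Promise<Tweet[]> {
  const out: Tweet[] = [];
  let cursor: string | undefined;
  do {
    const page = await fetchPage(cursor);
    out.push(...page.tweets);
    cursor = page.next;
    if (cursor) await sleep(1000); // baked-in delay: the value users cannot tune today
  } while (cursor);
  return out;
}
```

Because `1000` is a literal inside the loop, the only way to change the pacing is to edit the source.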
Proposed solution
Add a pageDelay option to the client constructor and per-method overrides:
```ts
const client = new XReaderClient({
  cookies: { ... },
  pageDelay: 1500, // default between pages
});
```

Or per-command via CLI:

```sh
x-reader search "query" --all --delay 2000
x-reader bookmarks --all --delay 500
```
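A sketch of how the option could be resolved, assuming the convention that a per-call override wins over the constructor default, which in turn wins over the current built-in value. This is a proposal sketch, not the shipped API; `ClientOptions` and `resolveDelay` are hypothetical names:

```typescript
// Proposal sketch: delay resolution with per-call override > constructor
// default > built-in fallback. Names are hypothetical.
interface ClientOptions {
  pageDelay?: number; // default ms between pages for all *All() methods
}

class XReaderClient {
  constructor(private opts: ClientOptions = {}) {}

  // Effective delay for one paginated call; `fallback` preserves today's
  // per-method hardcoded value so existing behavior is unchanged.
  resolveDelay(override?: number, fallback = 1000): number {
    return override ?? this.opts.pageDelay ?? fallback;
  }
}
```

Using `??` rather than `||` matters here: it lets callers pass `0` to disable the delay entirely, since `0` is falsy but not nullish.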
Additional context
This pairs well with #1 (retry with backoff). Together they give users full control over request pacing.