GALLERY-DL(1)                         gallery-dl Manual                         GALLERY-DL(1)
NAME
gallery-dl - download image galleries and collections
SYNOPSIS
gallery-dl [OPTION]... URL...
DESCRIPTION
gallery-dl is a command-line program to download image galleries and collections from several image hosting sites. It is a cross-platform tool with many configuration options and powerful filenaming capabilities.
OPTIONS
- -h, --help
- Print this help message and exit
- --version
- Print program version and exit
- -f, --filename FORMAT
- Filename format string for downloaded files ('/O' for "original" filenames)
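- Example (the keyword names are illustrative; use '-K' to see what a given site actually provides): gallery-dl -f "{id}.{extension}" URL names each downloaded file after its ID and original extension.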
- -d, --destination PATH
- Target location for file downloads
- -D, --directory PATH
- Exact location for file downloads
- -X, --extractors PATH
- Load external extractors from PATH
- --user-agent UA
- User-Agent request header
- --clear-cache MODULE
- Delete cached login sessions, cookies, etc. for MODULE (ALL to delete everything)
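- Example: gallery-dl --clear-cache ALL removes every cached entry; passing a module name instead (for instance 'pixiv', used here purely as an illustration) limits the deletion to that module.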
- -U, --update-check
- Check if a newer version is available
- -i, --input-file FILE
- Download URLs found in FILE ('-' for stdin). More than one --input-file can be specified
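- Example (the file name is a placeholder): gallery-dl -i urls.txt downloads every URL listed in urls.txt, one URL per line.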
- -I, --input-file-comment FILE
- Download URLs found in FILE. Comment them out once they have been downloaded successfully.
- -x, --input-file-delete FILE
- Download URLs found in FILE. Delete them once they have been downloaded successfully.
- --no-input
- Do not prompt for passwords/tokens
- -q, --quiet
- Activate quiet mode
- -w, --warning
- Print only warnings and errors
- -v, --verbose
- Print various debugging information
- -g, --get-urls
- Print URLs instead of downloading
- -G, --resolve-urls
- Print URLs instead of downloading; resolve intermediary URLs
- -j, --dump-json
- Print JSON information
- -J, --resolve-json
- Print JSON information; resolve intermediary URLs
- -s, --simulate
- Simulate data extraction; do not download anything
- -E, --extractor-info
- Print extractor defaults and settings
- -K, --list-keywords
- Print a list of available keywords and example values for the given URLs
- -e, --error-file FILE
- Add input URLs which returned an error to FILE
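- Example (file names are placeholders): gallery-dl -e failed.txt -i urls.txt collects failing input URLs in failed.txt, which can later be fed back with -i failed.txt to retry them.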
- -N, --print [EVENT:]FORMAT
- Write FORMAT during EVENT (default 'prepare') to standard output. Examples: 'id' or 'post:{md5[:8]}'
- --print-to-file [EVENT:]FORMAT FILE
- Append FORMAT during EVENT to FILE
- --list-modules
- Print a list of available extractor modules
- --list-extractors [CATEGORIES]
- Print a list of extractor classes with description, (sub)category and example URL
- --write-log FILE
- Write logging output to FILE
- --write-unsupported FILE
- Write URLs that are emitted by other extractors but cannot be handled to FILE
- --write-pages
- Write downloaded intermediary pages to files in the current directory to debug problems
- --print-traffic
- Display sent and read HTTP traffic
- --no-colors
- Do not emit ANSI color codes in output
- -R, --retries N
- Maximum number of retries for failed HTTP requests or -1 for infinite retries (default: 4)
- --http-timeout SECONDS
- Timeout for HTTP connections (default: 30.0)
- --proxy URL
- Use the specified proxy
- --source-address IP
- Client-side IP address to bind to
- -4, --force-ipv4
- Make all connections via IPv4
- -6, --force-ipv6
- Make all connections via IPv6
- --no-check-certificate
- Disable HTTPS certificate validation
- -r, --limit-rate RATE
- Maximum download rate (e.g. 500k or 2.5M)
- --chunk-size SIZE
- Size of in-memory data chunks (default: 32k)
- --sleep SECONDS
- Number of seconds to wait before each download. This can be either a constant value or a range (e.g. 2.7 or 2.0-3.5)
- --sleep-request SECONDS
- Number of seconds to wait between HTTP requests during data extraction
- --sleep-extractor SECONDS
- Number of seconds to wait before starting data extraction for an input URL
- --no-part
- Do not use .part files
- --no-skip
- Do not skip downloads; overwrite existing files
- --no-mtime
- Do not set file modification times according to Last-Modified HTTP response headers
- --no-download
- Do not download any files
- -o, --option KEY=VALUE
- Additional options. Example: -o browser=firefox
- -c, --config FILE
- Additional configuration files
- --config-yaml FILE
- Additional configuration files in YAML format
- --config-toml FILE
- Additional configuration files in TOML format
- --config-create
- Create a basic configuration file
- --config-status
- Show configuration file status
- --config-open
- Open configuration file in external application
- --config-ignore
- Do not read default configuration files
- -u, --username USER
- Username to login with
- -p, --password PASS
- Password belonging to the given username
- --netrc
- Enable .netrc authentication data
- -C, --cookies FILE
- File to load additional cookies from
- --cookies-export FILE
- Export session cookies to FILE
- --cookies-from-browser BROWSER[/DOMAIN][+KEYRING][:PROFILE][::CONTAINER]
- Name of the browser to load cookies from, with optional domain prefixed with '/', keyring name prefixed with '+', profile prefixed with ':', and container prefixed with '::' ('none' for no container (default), 'all' for all containers)
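- Example (profile name and URL are placeholders): gallery-dl --cookies-from-browser firefox:myprofile URL loads cookies from the Firefox profile 'myprofile'.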
- -A, --abort N
- Stop the current extractor run after N consecutive file downloads have been skipped
- -T, --terminate N
- Stop the current and parent extractor runs after N consecutive file downloads have been skipped
- --filesize-min SIZE
- Do not download files smaller than SIZE (e.g. 500k or 2.5M)
- --filesize-max SIZE
- Do not download files larger than SIZE (e.g. 500k or 2.5M)
- --download-archive FILE
- Record all downloaded or skipped files in FILE and skip downloading any file already in it
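- Example (the file name is a placeholder): gallery-dl --download-archive archive.db URL; running the same command again later skips every file already recorded in archive.db.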
- --range RANGE
- Index range(s) specifying which files to download. Each can be a single index, a range, or a slice (e.g. '5', '8-20', or '1:24:3')
- --chapter-range RANGE
- Like '--range', but applies to manga chapters and other delegated URLs
- --filter EXPR
- Python expression controlling which files to download. Files for which the expression evaluates to False are ignored. Available keys are the filename-specific ones listed by '-K'. Example: --filter "image_width >= 1000 and rating in ('s', 'q')"
- --chapter-filter EXPR
- Like '--filter', but applies to manga chapters and other delegated URLs
- -P, --postprocessor NAME
- Activate the specified post processor
- --no-postprocessors
- Do not run any post processors
- -O, --postprocessor-option KEY=VALUE
- Additional post processor options
- --write-metadata
- Write metadata to separate JSON files
- --write-info-json
- Write gallery metadata to an info.json file
- --write-tags
- Write image tags to separate text files
- --zip
- Store downloaded files in a ZIP archive
- --cbz
- Store downloaded files in a CBZ archive
- --mtime NAME
- Set file modification times according to metadata selected by NAME. Examples: 'date' or 'status[date]'
- --rename FORMAT
- Rename previously downloaded files from FORMAT to the current filename format
- --rename-to FORMAT
- Rename previously downloaded files from the current filename format to FORMAT
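- Example (the format string is illustrative): gallery-dl --rename "{id}.{extension}" URL renames files that were previously saved as "{id}.{extension}" so they match the currently configured filename format; --rename-to performs the conversion in the opposite direction.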
- --ugoira FMT
- Convert Pixiv Ugoira to FMT using FFmpeg. Supported formats are 'webm', 'mp4', 'gif', 'vp8', 'vp9', 'vp9-lossless', 'copy', 'zip'.
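- Example: gallery-dl --ugoira mp4 URL converts downloaded Ugoira animations to MP4 (FFmpeg must be available).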
- --exec CMD
- Execute CMD for each downloaded file. Supported replacement fields are {} or {_path}, {_directory}, {_filename}. Example: --exec "convert {} {}.png && rm {}"
- --exec-after CMD
- Execute CMD after all files were downloaded. Example: --exec-after "cd {_directory} && convert * ../doc.pdf"
EXAMPLES
- gallery-dl URL
- Download images from URL.
- gallery-dl -g -u <username> -p <password> URL
- Print direct URLs from a site that requires authentication.
- gallery-dl --filter 'type == "ugoira"' --range '2-4' URL
- Apply filter and range expressions. This will only download the second, third, and fourth file whose type value is equal to "ugoira".
- gallery-dl r:URL
- Scan URL for other URLs and invoke gallery-dl on them.
- gallery-dl oauth:SITE-NAME
- Gain OAuth authentication tokens for deviantart, flickr, reddit, smugmug, and tumblr.
FILES
- /etc/gallery-dl.conf
- The system-wide configuration file.
- ~/.config/gallery-dl/config.json
- Per-user configuration file.
- ~/.gallery-dl.conf
- Alternate per-user configuration file.
BUGS
https://github.com/mikf/gallery-dl/issues
AUTHORS
Mike Fährmann <mike_faehrmann@web.de>
and https://github.com/mikf/gallery-dl/graphs/contributors
SEE ALSO
gallery-dl.conf(5)
2025-01-12                              1.28.4