I've been collecting reference images for design work - Pinterest boards with thousands of pins, Pixiv artists with hundreds of works, Instagram accounts I wanted to archive. Right-clicking everything wasn't going to work.
Tried a few options first. Browser extensions mostly grabbed thumbnails instead of original images. Some desktop apps worked but quality was inconsistent. Writing my own scripts with requests/BeautifulSoup got blocked pretty quickly.
Eventually came across gallery-dl. It's a command-line tool that handles a bunch of image hosting sites. The main thing that stood out was it actually downloads the original files - no compression, no resizing.
## Quality difference
Ran a test on a Pixiv artist to compare:
| Method | Resolution | File size |
|---|---|---|
| Right-click save | 1200x1200 | ~400 KB |
| Gallery-dl | 6000x6000 (original) | ~8 MB |
That's the difference - gallery-dl pulls the exact file the artist uploaded, instead of the resized preview the page serves for display.
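For scale, the numbers in the table work out to roughly 20x the data and 25x the pixels:

```python
# Rough numbers from the comparison above
original_bytes = 8 * 1024 * 1024   # ~8 MB, gallery-dl original
saved_bytes = 400 * 1024           # ~400 KB, right-click save

print(round(original_bytes / saved_bytes))    # 20 (x the data)
print((6000 * 6000) // (1200 * 1200))         # 25 (x the pixels)
```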
## Installation

```shell
pip install gallery-dl
```

Or on macOS:

```shell
brew install gallery-dl
```
## Basic usage

Just paste the URL:

```shell
# Pixiv - all works from an artist
gallery-dl "https://www.pixiv.net/users/12345"

# Instagram - all posts from a profile
gallery-dl "https://www.instagram.com/username/"

# Pinterest - entire board
gallery-dl "https://www.pinterest.com/username/board-name/"
```
It creates folders automatically:

```
Pixiv/
├── artist_12345/
│   ├── 8473291_1_p0.png
│   └── 8473291_1_p1.png
└── artist_67890/
```
## Cookies (for restricted content)
Some content requires login - R-18 content on Pixiv, private Instagram accounts, etc. This is the part that confused me initially.
Gallery-dl doesn't handle login forms. You export cookies from your browser after logging in manually:
**Get a cookie export extension**
"Get cookies.txt LOCALLY" for Chrome/Edge works well. There's also "cookies.txt" for Firefox.
**Export and configure**

1. Log into the site in your browser
2. Click the extension and export cookies.txt
3. Create the config file: `~/.config/gallery-dl/config.json`
Add to `config.json`:

```json
{
    "extractor": {
        "pixiv": {
            "cookies": "~/cookies/pixiv.txt"
        },
        "instagram": {
            "cookies": "~/cookies/instagram.txt"
        }
    }
}
```
Or extract cookies from your browser directly (simpler) - the value is a list naming the browser:

```json
{
    "extractor": {
        "pixiv": {
            "cookies": ["chrome"]
        }
    }
}
```

Supported browsers include `chrome`, `chromium`, `firefox`, `safari`, and `opera`.
## Options I use

Filter by resolution:

```shell
gallery-dl --filter "width >= 1920" "URL"
```

Download a specific range (good for re-running):

```shell
gallery-dl --range 1-50 "URL"
gallery-dl --range 51-100 "URL"
```

Skip files you already have:

```shell
gallery-dl --download-archive downloaded.txt "URL"
```
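Under the hood this is deduplication by ID (gallery-dl actually keeps the archive in a SQLite database, whatever extension you give the file). The skip logic amounts to a seen-set - a minimal Python sketch with made-up IDs:

```python
def filter_new(ids, archive):
    """Yield only IDs not already in the archive, recording them as seen."""
    for work_id in ids:
        if work_id not in archive:
            archive.add(work_id)
            yield work_id

archive = {"pixiv8473291"}                              # already downloaded
queue = ["pixiv8473291", "pixiv9000001", "pixiv9000002"]
print(list(filter_new(queue, archive)))  # ['pixiv9000001', 'pixiv9000002']
```

Re-running the same URL with the same archive file therefore only fetches what's new.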
Custom filenames:

```json
{
    "extractor": {
        "pixiv": {
            "filename": "{user[id]}_{id}_{title}.{extension}"
        }
    }
}
```
## Issues I ran into

**403 Forbidden with cookies configured**

Turned out the cookie format was wrong: some extensions export JSON, but gallery-dl expects the Netscape cookies.txt format. "Get cookies.txt LOCALLY" exports the right format.
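Netscape format is plain text: one cookie per line, seven tab-separated fields (domain, subdomain flag, path, secure flag, expiry, name, value). A quick way to sanity-check an export before pointing gallery-dl at it - a rough heuristic, not a full validator:

```python
def looks_like_netscape(text):
    """Heuristic check: JSON exports start with '[' or '{';
    Netscape lines are comments or 7 tab-separated fields."""
    stripped = text.lstrip()
    if stripped.startswith(("[", "{")):
        return False                      # a JSON export, not Netscape
    for line in stripped.splitlines():
        line = line.strip()
        if not line or line.startswith("#"):
            continue                      # blank lines and comments are fine
        if len(line.split("\t")) != 7:
            return False
    return True

netscape = ".pixiv.net\tTRUE\t/\tTRUE\t1767225600\tPHPSESSID\tabc123"
print(looks_like_netscape(netscape))                    # True
print(looks_like_netscape('[{"name": "PHPSESSID"}]'))   # False
```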
**Permission denied on Windows**

Some filenames had characters Windows doesn't allow. Fixed by restricting path characters:

```json
{
    "extractor": {
        "path-restrict": "windows",
        "path-replace": "_"
    }
}
```
**Getting rate limited**

Pixiv started blocking after ~200 downloads. Added delays:

```json
{
    "extractor": {
        "sleep": 3,
        "sleep-request": 1
    }
}
```

`sleep` waits between file downloads; `sleep-request` waits between HTTP requests.
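The sleep options apply a fixed delay between requests. The general pattern, independent of gallery-dl, is a throttle around the fetch call - a sketch with a placeholder `fetch` function standing in for the real download:

```python
import time

def throttled(urls, delay=3.0, fetch=lambda url: f"ok:{url}"):
    """Fetch URLs one at a time, waiting `delay` seconds between requests.
    `fetch` is a placeholder for the real download call."""
    results = []
    for i, url in enumerate(urls):
        if i:                      # no need to sleep before the first request
            time.sleep(delay)
        results.append(fetch(url))
    return results

print(throttled(["a", "b"], delay=0.1))   # ['ok:a', 'ok:b']
```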
**Instagram images still low res**

Needed to explicitly enable original quality:

```json
{
    "extractor": {
        "instagram": {
            "include": "posts,stories",
            "videos": true
        }
    }
}
```
## Supported sites

The `--list-extractors` flag shows everything. Main ones I've used:
- Art: Pixiv, DeviantArt, ArtStation
- Social: Instagram, Twitter/X, Reddit
- Pinterest: Boards, profiles, individual pins
- Booru-style: Danbooru, Gelbooru
- Misc: Imgur, Flickr, Tumblr
## Notes
- Don't hammer servers - use reasonable delays between requests
- Downloaded content is for personal use
- Cookie files contain login credentials - don't share them
- Cookies expire eventually - re-export if you start getting 403 errors
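The expiry is actually readable from the file itself: the fifth field of each Netscape cookie line is a Unix timestamp, so a stale export can be spotted before blaming the site. A rough Python sketch:

```python
import time

def expired_cookies(text, now=None):
    """Return (name, expiry) pairs for cookies whose expiry has passed.
    Expects Netscape-format lines; an expiry of 0 means a session cookie."""
    now = time.time() if now is None else now
    stale = []
    for line in text.splitlines():
        fields = line.strip().split("\t")
        if len(fields) != 7 or line.lstrip().startswith("#"):
            continue                       # skip comments and malformed lines
        expiry = int(fields[4])
        if 0 < expiry < now:
            stale.append((fields[5], expiry))
    return stale

sample = ".pixiv.net\tTRUE\t/\tTRUE\t1000000000\tPHPSESSID\tabc"
print(expired_cookies(sample, now=2000000000))  # [('PHPSESSID', 1000000000)]
```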
Gallery-dl repo: github.com/mikf/gallery-dl