Danbooru

Code 403 while fetching posts

Posted under Bugs & Features

Hello, I run a Discord bot that fetches posts for certain tags from Danbooru every hour or so, but since early/mid January it has stopped working: I get a 403 and a Cloudflare captcha prompt. I've read other threads about this that mentioned changing the User-Agent; I tried that, but it didn't work.

I am using the PyBooru library to fetch the data, but it doesn't work even when I make the request myself. Only my server is affected; if I run the program on my own computer, it isn't blocked. Am I IP banned? Is there a way to solve this?
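In case it helps anyone else, here's roughly what I'm doing when I bypass PyBooru and request it myself. This is only a sketch: the bot name, contact address, and tag are placeholders, and the explicit User-Agent is the workaround the other threads suggested (a descriptive UA that identifies the bot, rather than a randomized browser-style one):

```python
import json
import urllib.parse
import urllib.request

DANBOORU_URL = "https://danbooru.donmai.us/posts.json"  # public JSON endpoint


def build_headers(contact: str) -> dict:
    # Descriptive User-Agent naming the bot plus a contact address;
    # "MyDiscordBot" and the contact string are placeholders.
    return {"User-Agent": f"MyDiscordBot/1.0 ({contact})"}


def fetch_posts(tags: str, limit: int = 20) -> list:
    """Fetch up to `limit` posts matching `tags` from /posts.json."""
    query = urllib.parse.urlencode({"tags": tags, "limit": limit})
    req = urllib.request.Request(
        f"{DANBOORU_URL}?{query}",
        headers=build_headers("me@example.com"),  # placeholder contact
    )
    # A 403 raises urllib.error.HTTPError here instead of failing silently.
    with urllib.request.urlopen(req, timeout=10) as resp:
        return json.load(resp)
```

Even with the custom User-Agent set like this, I still get the 403 from my server.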

DM me your server’s IPv4 and IPv6 addresses and I’ll check the IP ban list for you.

Btw, how does your bot handle timeouts and other errors reported by Danbooru? Does it retry immediately, or does it wait until the next run an hour later?
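If it retries, something like exponential backoff between attempts is usually the safe pattern. A minimal sketch (the helper names are made up, and `fetch` stands in for whatever call the bot makes):

```python
import random
import time


def backoff_delays(max_retries: int = 3, base: float = 2.0) -> list:
    """Exponentially growing delays in seconds, with a little jitter."""
    return [base * 2 ** attempt + random.uniform(0, base)
            for attempt in range(max_retries)]


def fetch_with_retries(fetch, max_retries: int = 3, base: float = 2.0):
    """Call fetch(); on failure, sleep and retry with growing delays.

    After max_retries failures, re-raise the last error so the caller
    can give up until the next scheduled run.
    """
    last_error = None
    for delay in backoff_delays(max_retries, base):
        try:
            return fetch()
        except Exception as err:
            last_error = err
            time.sleep(delay)
    raise last_error
```

The point is to avoid hammering the server with immediate retries when it's already returning errors.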

I'm having the exact same issue. For me it's not a Discord bot, though, but a website I host locally on my PC. There's also no Cloudflare captcha.

Every time I try to fetch some posts it returns a 403. Adding my API key doesn't make a difference.

The fetches worked fine until a few days ago, so maybe something changed that isn't present in the docs yet.

Alzariel said:

I'm having the exact same issue. For me it's not a Discord bot, though, but a website I host locally on my PC. There's also no Cloudflare captcha.

Every time I try to fetch some posts it returns a 403. Adding my API key doesn't make a difference.

The fetches worked fine until a few days ago, so maybe something changed that isn't present in the docs yet.

See the responses in topic #22838, topic #22822, and topic #23284.

Talulah said:

See the responses in topic #22838, topic #22822, and topic #23284.

Well, there's not really an option to change the user-agent in the browser (at least in Chromium-based ones). But the browser already sends a user-agent header with every request (Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/109.0.0.0 Safari/537.36 OPR/95.0.0.0).

So my guess is that this user-agent is among the ones that got banned.

Your network was blocked due to abuse by other users. In particular, someone was trying to download the entire site one post at a time using a ton of different IPs and randomly generated user agents.

I've unblocked your network, but it may be blocked again if the abuse continues.

(For anyone watching: don't try to download the entire site one post at a time. Use the database dumps or at least use /posts.json?limit=200 to get 200 posts at a time. And don't use a million IPs to try to get around rate limits or bans. If you do that your whole network or hosting provider will be blocked.)
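To spell out the `limit=200` advice: if you genuinely need many posts, page through the API in batches of 200 rather than one request per post. A rough sketch (the fetcher name, contact address, and page cap are placeholders; `limit` and `page` are the query parameters mentioned above):

```python
import json
import urllib.parse
import urllib.request

API = "https://danbooru.donmai.us/posts.json"


def page_url(tags: str, page: int, limit: int = 200) -> str:
    """Build a /posts.json URL fetching `limit` posts per request."""
    query = urllib.parse.urlencode({"tags": tags, "limit": limit, "page": page})
    return f"{API}?{query}"


def fetch_all(tags: str, max_pages: int = 10) -> list:
    """Fetch up to max_pages pages of 200 posts each, one request per page."""
    posts = []
    for page in range(1, max_pages + 1):
        req = urllib.request.Request(
            page_url(tags, page),
            headers={"User-Agent": "MyFetcher/1.0 (me@example.com)"},  # placeholder
        )
        with urllib.request.urlopen(req, timeout=10) as resp:
            batch = json.load(resp)
        if not batch:  # empty page: nothing more to fetch
            break
        posts.extend(batch)
    return posts
```

One request per 200 posts instead of 200 requests, and no reason to rotate IPs.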

Talulah said:

You said "website [you] host", so I assumed programmatic access. Without knowing exactly what you're doing, it's hard to diagnose an issue like this.

No, your assumption is right: it's a website I wrote myself and host locally. The site simply fetches the last couple of posts every few minutes and displays them (same as the Danbooru site itself, just without having to refresh manually).

The problem is just that browsers prevent you from setting the User-Agent header (see https://bugs.chromium.org/p/chromium/issues/detail?id=571722), so that workaround doesn't work for me.

I guess there's always the possibility of writing something outside the browser.

evazion said:

Your network was blocked due to abuse by other users. In particular, someone was trying to download the entire site one post at a time using a ton of different IPs and randomly generated user agents.

I've unblocked your network, but it may be blocked again if the abuse continues.

(For anyone watching: don't try to download the entire site one post at a time. Use the database dumps or at least use /posts.json?limit=200 to get 200 posts at a time. And don't use a million IPs to try to get around rate limits or bans. If you do that your whole network or hosting provider will be blocked.)

Thanks, it's working again!

Just for the future: is fetching 20 posts every minute too much? I can always increase the page size or fetch less frequently if it is.
