IRCaBot 2.1.0
GPLv3 © acetone, 2021-2022
#saltr
/2024/05/22
dr|z3d you could spider your site and grab a list of urls and just put all the prohibited ones in the list.
dr|z3d however, if your site links to those resources, and those links are visible, then just take down the links.
dr|z3d the http blocklist is mostly intended for blocking access to non-public (i.e. non-linked) resources on your site, or urls that don't exist but keep getting probed by a vuln scanner.
not_bob_afk I just don't put anything on a public server that I don't want the world to see.
cumlord Just paranoid of someone trying to go somewhere they’re not supposed to, but it seems like there are multiple ways to do it
not_bob_afk And, if I do, it's for a very, very limited time.
dr|z3d vetting what's actually visible or accessible on your site is the first thing to do.
cumlord That is the smartest way to do it notbob
dr|z3d here's an example.. you run a blog with a login feature, but the login links have been purposely removed from the site, and you access the login over localhost.
not_bob_afk Or, in the case of my blog, there is no way to edit anything via http
dr|z3d so you might want to add /login as a block, and then any time someone tries to access it, their dest will be logged if not already logged, and you can pass the dest log to the tunnel filter so they get an instant ban.
not_bob_afk It's just not an option, so it can't be exploited.
dr|z3d I'm specifically responding to cumlord, not_bob_afk.
cumlord That is kind of what I have going, probably overthinking it
dr|z3d another possible use is adding a hidden, non-visible link to a non-existent resource on your site that people won't see.
dr|z3d "why would I do that?" you might ask..
cumlord There is no way to edit anything without SSH to the local machine, and iptables blocks everything else
dr|z3d well, here's the thing.. a web spider will see the link, attempt to load it, and voila! no more spider.
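A sketch of what such a hidden tripwire link might look like in a page template; the /trap path is hypothetical and would also need an entry in the blocklist:

    <!-- invisible to human visitors; a spider that follows it gets its dest blocked -->
    <a href="/trap" style="display:none">.</a>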
cumlord I saw someone seems to have done that on terminus, very good idea
cumlord Might just do that
dr|z3d you have to weigh that against the benefits of promotion on an i2p search engine.
dr|z3d in any event, it seems a no-brainer to block a list of urls that are being constantly accessed only by a vuln scanner. there's a fairly large list, and they only have to hit one blocked url and then they're gone.
dr|z3d > /sql.dump and friends.
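A minimal sketch of the kind of url blocklist being described, using only paths mentioned in this conversation; the http_blocklist.txt filename and comment support come up later in the log, so treat the exact format as illustrative:

    # ~/.i2p/http_blocklist.txt (illustrative entries)
    /login
    /sql.dump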
dr|z3d an alternative to adding links to the site is to add a Disallow rule to your robots.txt file.
dr|z3d well-behaved spiders will obey the rules, badly behaved spiders won't, so you can target the badly behaved spiders that way.
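For reference, a Disallow rule of the kind described might look like this; /login comes from the earlier example, /admin is purely hypothetical:

    # robots.txt at the site root
    User-agent: *
    Disallow: /login
    Disallow: /admin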
snex i mean what do you mean by "block" these urls? if your http server returns 404 on them, what else do you want?
dr|z3d or rather, badly behaved spiders may explicitly attempt to load all disallowed urls
dr|z3d 404 is more information than you might want to share with an attacker.
dr|z3d better if the attacker doesn't even get as far as the webserver.
snex if you are giving "special" responses for certain urls, this tells the attacker extra info as opposed to just 404 as you would for any other junk url
cumlord Almost the opposite of clearnet, though I think my first website was defaced within a day when I was a wee lad or lass
cumlord Appreciate the rundown, I’ll have to take a closer look at robots.txt
dr|z3d not really, snex.
dr|z3d because that "special response" is denied access to the entire site.
snex great the attacker knows you are protecting that url for some reason. now they get a new IP and find a way to access it
dr|z3d and the special response isn't that special at all, from the attacker's perspective. just a hung connection :)
dr|z3d all they know is that access to a site stopped working. remember, we're talking about i2p here, so new ip means nothing.
dr|z3d and why do I have to be protecting a url? I don't; maybe I've seen a vuln scanner attempt to access a non-existent url and decided it belongs in the blocklist.
snex serving up a 404 takes basically no resources and trips no alerts for attackers
dr|z3d a 404 will confirm or deny the presence of a file on the server. that's more than enough info.
snex no it wont
dr|z3d it will in terms of public access.
snex you can 404 for things that are actually there but require auth
dr|z3d oh, sure, you can fire off a 404 instead of a 403 or whatever.
dr|z3d but either way, you're giving the attacker info that you could just eliminate entirely.
snex doing something abnormal gives them more info than doing something normal
dr|z3d if you're trying to make a point, I'm not clear what it is.
snex standard security practice is to treat it like any other request for a non-existent resource
dr|z3d have you witnessed the catalog of vuln/exploit urls constantly being spidered on _this_ network?
snex not specifically but ive run public http servers for at least a decade and see what they try to do to me
dr|z3d (usually by up to 4 dests in tandem)
snex lots of wpadmin.cgi or whatever the fuck. just 404 and move on
dr|z3d do what you like, no one's telling you to enforce a blocklist.
dr|z3d but if you want the added protection of a blocklist that kills the scanning dests dead, it's now available.
dr|z3d I've seen the scanner too many times all over my sites to be happy with just 404'ing it away.
cumlord Yeah, on clearnet that makes sense. I’m not as accustomed to what to look out for here
dr|z3d I can give you a blocklist derived from observing the behavior of the vuln scanner if you want it, cumlord
dr|z3d you'll want to check it and ensure it won't block any urls your site is legitimately serving, but for the most part they're obvious signs of a script looking for weaknesses.
dr|z3d with just a blocklist, the default behavior is to close the connection socket so the request just hangs/times out, but you can pipe the resulting dests to a tunnel filter if you want to ban the requesting dests.
cumlord That’d be helpful please do send it
dr|z3d feel free to share *privately* with anyone else that wants to test.
dr|z3d link via pm.
cumlord As long as it keeps them away, maybe experiment with some sort of bot trap, haven’t tried that yet either
cumlord Will do
dr|z3d if you want to block the dests, you'll need a tunnel filter definition with a deny rule pointing at .i2p/http_blocklist_clients.txt
dr|z3d your tunnel filter file (which needs to be configured per server tunnel) will minimally look something like this:
dr|z3d deny file /home/i2p/.i2p/http_blocklist_clients.txt
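Fleshed out with comments, a minimal filter definition of that shape might look like the following; the filename is hypothetical, and any additional directives depend on your setup and the I2P+ filter syntax:

    # e.g. /home/i2p/.i2p/filter_myserver.txt, referenced from the server tunnel's settings
    # deny any client destination listed in the blocklist-generated file
    deny file /home/i2p/.i2p/http_blocklist_clients.txt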
dr|z3d new update pushed to /dev/ mesh with more anti-throttle checks.
dr|z3d looks like you came back quick after a router restart, snex?
snex no, just standard irc choke
dr|z3d oh, ok.
dr|z3d thought you might have updated.
dr|z3d I'm finding router.blockOldRouters=false helps on one specific router where the floodfill count is/was declining after a while.
dr|z3d dunno if that'll help your situation, might do.
dr|z3d there's also more tolerance for older routers in the latest builds, irrespective of enabling that config.
dr|z3d (which also may/may not help your situation)
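For anyone trying that setting, it's an advanced configuration entry; a sketch assuming the stock file location:

    # ~/.i2p/router.config, or set via the console's advanced configuration page
    router.blockOldRouters=false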
dr|z3d also in latest build, regex characters in url strings in blocklist should now be handled correctly, and http_blocklist.txt now supports #commented lines.
dr|z3d nice tarpit image, not_bob_afk. only the hands!
dr|z3d (give it away)
not_bob_afk dr|z3d: I Know....
dr|z3d I know you know :)
not_bob_afk Hands are hard.
not_bob_afk I've had a few people complain "Where are my posts!?"
not_bob_afk I know where they are, in the bin.
not_bob_afk But, I wanted a polite way to say it.
dr|z3d two words.
dr|z3d quality threshold.
not_bob_afk Correct.
dr|z3d you should give in-painting a go to fix hands.
dr|z3d that image, though, quality stuff. looks mostly legit, scanned from a manga comic.
not_bob_afk Yep, that's why I used it.
not_bob_afk I was like "frell the hands!"
not_bob_afk And, it's just the one hand really.
dr|z3d still waiting for sd3 models. it may be a while :|
dr|z3d currently Stability is scrambling around for a buyer. I don't think releasing SD3 is top of their list of priorities.
not_bob_afk Now I understand.
dr|z3d I heard Sean Parker mentioned as a possible buyer, or part of a group.
dr|z3d (Napster co-founder, first president of Facebook)
not_bob_afk It's a very useful tool.
dr|z3d just hope they don't lose that open source ethic. :|
not_bob_afk That would not be good.
not_bob_afk Trying an inpaint now.
not_bob_afk Lets see if it helps.
StormyCloud second outproxy switched over to new stack, please shout out if you notice any issues
StormyCloud second outproxy server*
dr|z3d still smells of eggs and cabbage, StormyCloud :)
not_bob_afk Does it come with a free kitten?
StormyCloud I mean... I'm not opposed to sending users free kittens. Unsure how the post office will feel about this new venture tho
not_bob_afk Likely not well.
dr|z3d bonzai kittens!
dr|z3d or rather, bonsai kittens.
not_bob_afk That site is still funny.
StormyCloud yall showing your age there :P
not_bob_afk Before 1913 you could send children in the mail.
not_bob_afk You can still send bees in the mail.
StormyCloud poor driver that drops that package
not_bob_afk Even more silly. You can ship a mailbox in the mail.
snex re: recent attacks, it seems like tor is also experiencing a lot of bullshit lately
dr|z3d Tor's been super flaky for at least a couple of weeks
uop23ip those "new" 2.5.1 routers are gone for me in netdb
dr|z3d yeah, they get banned when they're encountered now in +
dr|z3d takes a while, and occasionally you may see 1 or 2 reappear.
dr|z3d I see approx 1400 banned locally here.
dr|z3d curl 127.0.0.1:7657/profiles?f=3 |grep -i unban | grep Invalid |grep 2.5.1 | wc -l
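The same one-liner with the URL quoted so the shell can't mangle the query string, assuming the default console port 7657:

    curl -s '127.0.0.1:7657/profiles?f=3' | grep -i unban | grep Invalid | grep 2.5.1 | wc -l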