IRCaBot 2.1.0
GPLv3 © acetone, 2021-2022
#i2p-dev
/2025/07/27
&zzz
+R4SAS
+RN
+RN_
+Stormycloud
+T3s|4
+dr|z3d
+hagen
+hk
+mareki2p
+orignal
+postman
+qend-irc2p
+snex
+wodencafe
Arch
BubbRubb
C341
ContestI2PTeam
Daddy
Danny
DeltaOreo
FreefallHeavens
HowardPlayzOfAdmin
Irc2PGuest62770
Irc2PGuest67581
Irc2PGuest82088
Onn4l7h
Onn4|7h
Over
Sleepy
SlippyJoe
Teeed
aargh
acetone_
ardu
b3t4f4c3___
cumlord
death
dr4wd3_
f00b4r
nilbog-
nnm--
not_bob_afk
ohThuku1
onon_
phil
poriori_
profetikla
r00tobo
rapidash
shiver_
solidx66
thetia
u5657
uop23ip
w8rabbit
weko_
wew
x74a6
zzz Your ROOT_URL in app.ini is "https://i2pgit.org/", it's unlikely matching the site you are visiting.
zzz Mismatched ROOT_URL config causes wrong URL links for web UI/mail content/webhook notification/OAuth2 sign-in.
zzz This instance is configured to run under HTTPS (by ROOT_URL config), you are accessing by HTTP. Mismatched scheme might cause problems for sign-in/sign-up.
zzz eyedeekay, can't log in in-net. cleared cookies, didn't work. tried b32, didn't work
zzz tried clearnet, unresponsive
eyedeekay son of a gun sorry I missed your message earlier, it looks like it's back up now
eyedeekay next time it goes down it's going to go down for a few minutes while I apply an update
zzz eyedeekay, still unable to log in
zzz with the ROOT URL message as above ^^^
eyedeekay Yeah sorry I'm actually in there right now with it shut down which is why it's 502ing, just a sec I'll put it back up
zzz no not 502, error message about ROOT URL
zzz but if you're in the middle of it just holler when it's back
eyedeekay oh weird I was getting 502 until a second ago
zzz Your ROOT_URL in app.ini is "https://i2pgit.org/", it's unlikely matching the site you are visiting.
zzz Mismatched ROOT_URL config causes wrong URL links for web UI/mail content/webhook notification/OAuth2 sign-in.
zzz This instance is configured to run under HTTPS (by ROOT_URL config), you are accessing by HTTP. Mismatched scheme might cause problems for sign-in/sign-up.
zzz and if I try to log in anyway it just returns to login page
eyedeekay Yeah, it's supposed to be getting that from a header, but I turned it off while chasing the mystery memory leak. I can turn it back on, because it's clearly not the culprit
eyedeekay OK I just logged out and logged back in in regular Firefox, now I'll try it in Private Browsing and Chromium, see if that changes anything...
eyedeekay let me see if I've got some kind of weird session stuff hanging around here on the server too, maybe that's it
eyedeekay be a few more minutes
eyedeekay OK it's going to 502 for a while, backups and doctor are running before I update it again
eyedeekay OK you should be able to log in again zzz
zzz similar banner but logged in successfully
zzz Your ROOT_URL in app.ini is "http://i2pgit.org:3000/", it's unlikely matching the site you are visiting.
zzz Mismatched ROOT_URL config causes wrong URL links for web UI/mail content/webhook notification/OAuth2 sign-in.
eyedeekay Yeah it's still not setting it from the header like it should but it will let you log in with http again
zzz yup, thanks
eyedeekay No problem I'll keep at it
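For reference, the banner zzz pasted is Gitea warning that ROOT_URL in app.ini does not match the scheme/host/port visitors actually use, which the banner itself says can break sign-in links. A minimal sketch of the relevant [server] settings, using the http://i2pgit.org:3000/ value from the later banner; the surrounding keys and values are illustrative assumptions, not taken from this log:

    [server]
    PROTOCOL  = http
    DOMAIN    = i2pgit.org
    HTTP_PORT = 3000
    ; must match exactly what visitors have in the address bar, scheme included
    ROOT_URL  = http://i2pgit.org:3000/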
eyedeekay Going to have something exciting to show off in go-i2p soon too
eyedeekay Like maybe today
zzz 503 again ((((((
zzz after trying to create a PR
eyedeekay OK maybe the log will say something useful about the PR though
eyedeekay well that's fuckin' weird... it didn't oom. didn't crash at all.
eyedeekay Something different...
eyedeekay Not only that I think the PR is there too
dr|z3d LOL at exciting. zzz *loves* exciting.
zzz the 503 comes from i2ptunnel ofc, if the socket to the server times out
zzz that timeout is pretty short
zzz do you have inline hooks that are taking a long time?
zzz yes, the PR is there
zzz gitea is supposed to be fast, yet everything from ssh to web is now just so darn slow, maybe you and drz can compare configs?
zzz i2ptunnel local socket timeouts should show up in the router logs
dr|z3d here's what I'd do: 1) make github the canonical source. 2) mirror all required repos from github onto gitea. 3) create a custom tunnel that maps github ssh to localhost. 4) profit.
zzz when we first switched over it was ducky
zzz then we got stuck on some nightly we couldn't back out of that was hangy/crashy, and we were waiting for a new release. that was months ago
zzz is the release out yet?
dr|z3d I pull from git regularly (gitea), never seen any issues the likes of which you're experiencing.
eyedeekay No not inline hooks, the thing is asking for an obscene amount of memory and dying, like it's trying to allocate 800GB of RAM and I am chasing reasons why
dr|z3d oh, and 5) lose the complexity. I think we discussed this before: don't do local CI, rely on github's.
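If the mirror-from-github route dr|z3d outlines were taken, Gitea's pull-mirror behaviour is also driven from app.ini; a hedged sketch, with interval values that are illustrative rather than taken from this log:

    [mirror]
    ENABLED = true
    ; how often pull mirrors re-sync from the upstream (github, in this plan)
    DEFAULT_INTERVAL = 8h
    MIN_INTERVAL = 10m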
eyedeekay We tried the release version and it kept doing the same thing
eyedeekay Local CI has been shut down this whole time
dr|z3d turn off indexing for now.
zzz hmph
eyedeekay I turned off indexing yesterday
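Assuming "indexing" here means Gitea's code search indexer, turning it off is a single app.ini switch:

    [indexer]
    ; disable code search indexing of repository contents
    REPO_INDEXER_ENABLED = false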
zzz I know nothing, so I'll stay out of it except to report issues
eyedeekay Keep doing so, eventually one of them will have to be the clue I need
eyedeekay Thanks for reporting them and bearing with me as best you can
dr|z3d purge all your local archives, eyedeekay
eyedeekay Did that too
dr|z3d ok. what about nodejs/npm. did you let gitea handle that, or you installed via the OS?
eyedeekay package manager supplied npm and node, from debian stable-backports
dr|z3d had some issues with gitea not compiling properly with OS-installed nodejs iirc, when I removed that, things resolved.
dr|z3d and how are you compiling gitea after updates?
eyedeekay I didn't observe any compile-time issues but I'm doing the build in github CI on ubuntu 22.04 with gitea's `make build` target
dr|z3d ok, `make build` is fine.
eyedeekay Yeah, that's the target they use for production. My whole plan was to make a production-grade, zero-configuration gitea-for-I2P variant, and the worst part is that it's something in the data on our server that's doing it
eyedeekay If I deploy it and start using it on a different server with different data, it works fine, runs for weeks
eyedeekay So it's either in the config or the repositories themselves
dr|z3d could be the indexer (db), maybe it's still using the index and all you did was turn off the runner in the configs.
eyedeekay Hm, maybe...
dr|z3d apparently the search index doesn't get deleted unless you delete the repo.
dr|z3d old issue, but the info seems valid.
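If the concern is leftover index data rather than the setting itself, the on-disk code index lives at the path given by REPO_INDEXER_PATH; the path below is Gitea's documented default (relative to APP_DATA_PATH), not something confirmed for this server:

    [indexer]
    REPO_INDEXER_ENABLED = false
    ; default on-disk location of the bleve code index
    REPO_INDEXER_PATH = indexers/repos.bleve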
dr|z3d might be worth installing phpMyAdmin if you haven't already and taking a look at the db to see if there are any obvious entries that could be screwing things up.
eyedeekay check this out, it's possible it goes all the way back to mtn: warning: refs/tags/i2p_0_6_1_11: refMissingNewline: misses LF at the end
eyedeekay ^ That does not happen with post-mtn tags
dr|z3d if you think that's the issue, may try deleting the repo, mirroring it from github, then converting to a local repo while you ponder whether or not to make github the canonical source.
dr|z3d you'll want to backup first, of course, because you'll want the issues and everything else you're not storing on github.
dr|z3d but given that things were initially working fine, I'd suggest the LF issues are a red herring.
eyedeekay The bad tags are on github too
dr|z3d ok, well, that's probably not the issue you want to be chasing right now.
eyedeekay Could be I guess re: red herring
eyedeekay Well, there's another data point here though, which is that the git operations that are consuming excessive RAM and deadlocking are all diffs between $tag and master
eyedeekay which gitea calls when someone or something views a page that shows the diff
eyedeekay Which is why I thought caching might help
dr|z3d > As a workaround, just find all commit-graph-chain.lock files and delete them.
dr|z3d not sure that's entirely on point, but I guess it can't hurt to see how many of those files you've got hanging around.
eyedeekay Yup that's in the reboot script actually
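The lock files in that workaround come from git's commit-graph machinery. As a hedged sketch only (nothing in this log confirms it helps here), commit-graph reading and regeneration can be switched off per repository in its git config:

    [core]
    	# stop git from reading commit-graph files in this repository
    	commitGraph = false
    [gc]
    	# stop gc from rewriting them (and re-creating the chain/lock files)
    	writeCommitGraph = false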
dr|z3d I think gitea has its own caching mechanism, might be worth bumping the lifetime of that.
eyedeekay That's for caching dependencies in CI
eyedeekay but that PR looks pretty similar to what I'm seeing
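For the Gitea-side cache being suggested, the lifetime is an app.ini value; a hedged sketch with commonly documented defaults (the actual values on this server aren't in the log), plus the last-commit cache that sits behind commit and diff listings in versions that have it:

    [cache]
    ADAPTER  = memory
    ; how long cached items live before being dropped
    ITEM_TTL = 16h

    [cache.last_commit]
    ITEM_TTL = 8760h
    COMMITS_COUNT = 1000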
dr|z3d yeah, things like archives and stuff. generating those can definitely hold things up.
dr|z3d iirc you can turn that off, archive generation.
eyedeekay Nope. It's on-demand the first time and cached afterward
eyedeekay Which was the first thing that went wrong; now we handle that with a cronjob and quotas
eyedeekay It filled up the disk
eyedeekay But this is different, this is looking for memory, and like, deeply abnormal amounts of it
eyedeekay Server Uptime: 2025-07-27T20:59:38Z
dr|z3d yeah. it'll do that if left unsupervised. I just turn that feature off entirely. you want an archive? github will supply.
eyedeekay Current Goroutines
eyedeekay Current Memory Usage: 131 MiB
eyedeekay Total Memory Allocated: 254 GiB
eyedeekay Memory Obtained: 423 MiB
eyedeekay What's the config for that? I did not find such a thing
dr|z3d [repository]
dr|z3d DISABLE_DOWNLOAD_SOURCE_ARCHIVES = true
dr|z3d also have a look at:
dr|z3d [repo-archive]
dr|z3d ENABLED = false
eyedeekay Oh it was already there, I guess I did find it and forgot.
eyedeekay That's consistent with the UI, though: the archive downloads are gone
dr|z3d in case you decide to turn the indexer back on, you might want to set something like:
dr|z3d MAX_FILE_SIZE = 5242880
eyedeekay Mine was approximately 1/5 that when it was on
dr|z3d ok, good.
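For scale, 5242880 bytes is 5 * 1024 * 1024, i.e. 5 MiB, and the key sits in the same [indexer] section as the enable switch; a sketch assuming the indexer were turned back on:

    [indexer]
    REPO_INDEXER_ENABLED = true
    ; skip indexing any file larger than 5 MiB (5 * 1024 * 1024 = 5242880 bytes)
    MAX_FILE_SIZE = 5242880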