Hacker News

INTPenis · today at 2:44 PM

Lazy has nothing to do with it, codeberg simply doesn't work.

Most of my friends who use Codeberg are staunch Cloudflare opponents, but Cloudflare is what keeps GitLab alive. The fact of life is that these services are being attacked non-stop and need some sort of DDoS filter.

Codeberg has that Anubis thing now, I guess? But they still have downtime, and the worst thing ever for me as a developer is having the urge to code and not being able to access my remote. That is what murders the impression of a product like Codeberg.

Sorry, just being frank. I want all competitors to large monopolies to succeed, but I also want to be able to do my job/passion.


Replies

embedding-shape · today at 2:53 PM

Maybe I'm too old school, but for me both GitHub and Codeberg are asynchronous "I want to send/share the code somehow" tools, not "my active workspace that I require to do work". But reading

> the worst thing ever for me as a developer is having the urge to code and not being able to access my remote.

makes it seem like GitHub/Codeberg has to be online for you to be able to code. Is that really the case? If so, how does that happen? Do you only edit code directly in the GitHub web UI, or how else does one end up in that situation?

freedomben · today at 2:54 PM

I've had the same experience.

Philosophically, I think it's terrible that Cloudflare has become a middleman for a huge and important swath of the internet. As a user, it largely makes my life much worse: it limits my browser, my ability to protect myself via VPNs, etc., and I am just browsing normally, not attacking anything.

Pragmatically, though, as a webmaster/admin/whatever you want to call it nowadays, Cloudflare is basically a necessity. I've started putting things behind it because if I don't, 99%+ of my traffic is bots, and often bots clearly scanning for vulnerabilities (I run mostly zero-PHP sites, yet my traffic logs are often filled with requests like /admin.php and /wp-admin.php and all the WordPress things), plus constant crawls from clearly-not-search-engines that download everything and use robots.txt as a guide of what to crawl rather than what not to crawl. I haven't been DDoSed yet, but I've had images and PDFs and other things downloaded so many times by these bots that it costs me money. For some things where I or my family are the only legitimate users, I can just firewall-cmd all IPs except my own, but even then it's maintenance work I don't want to have to do.
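For that family-only case, the firewalld version of "block everything except my own IP" is only a few commands. This is a hedged sketch, not a recommendation: `eth0` and `203.0.113.7` are placeholders for the real interface and address, and it needs root with firewalld running.

```shell
# Sketch: put the interface in the default-deny "drop" zone, then
# punch one hole for a known-good address. eth0 and 203.0.113.7 are
# placeholders; run as root with firewalld active.
firewall-cmd --permanent --zone=drop --change-interface=eth0
firewall-cmd --permanent --zone=drop \
  --add-rich-rule='rule family="ipv4" source address="203.0.113.7" accept'
firewall-cmd --reload
```

The drop zone silently discards everything by default, so that single rich rule is the entire allow-list, which is also why it turns into maintenance work whenever your home IP changes.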

I've tried many of the alternatives, and they often fail even on legitimate use cases. I've been blocked more by the alternatives than I have by Cloudflare, especially that one that does a proof of work. It works about 80% of the time, but that 20% is really, really annoying, to the point that when I see that screen pop up I just browse away.

It's really a disheartening state we find ourselves in. I don't think my principles/values have been tested more in the real world than the last few years.

frevib · today at 4:00 PM

OP is about GitHub. Have you seen the GitHub uptime monitor? It's at 90% [1] for the last 90 days. I use both Codeberg and GitHub a lot, and GitHub has, by far, more problems than Codeberg. Sometimes I notice slowdowns on Codeberg, but that's it.

[1] https://mrshu.github.io/github-statuses/

kjuulh · today at 2:57 PM

My own git server has been hit severely by scrapers. They're scraping everything: commits, comparisons between commits, API calls for files, everything.

And it's pretty much all of them: ByteDance, OpenAI, AWS, Claude, and various others I couldn't recognize. I basically just had to block all of them to get reasonable performance for a server running on a mini-PC.

I was going to move to Codeberg at some point, but they had downtime when I was considering it; I'd rather deal with that myself, then.

ori_b · today at 3:38 PM

> But they still have downtime

Thank God GitHub is... oh.

https://mrshu.github.io/github-statuses/

prmoustache · today at 3:02 PM

The whole point of git is to be decentralized, so there is no reason for you not to have your current version available even when a remote is offline.
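A quick sketch of what that decentralization means in practice: every clone carries the full history, so only `push`/`fetch` ever touch the remote. The repo below is a throwaway created just for illustration:

```shell
# Simulate working while the forge is down, in a throwaway repo.
cd "$(mktemp -d)"
git init -q
git -c user.name=demo -c user.email=demo@example.invalid \
    commit -q --allow-empty -m "initial commit"

# All of this works with the remote (or the whole network) offline:
git log --oneline               # full history is stored locally
git switch -q -c offline-work   # branching is a local operation
echo "note" > notes.txt
git add notes.txt
git -c user.name=demo -c user.email=demo@example.invalid \
    commit -q -m "work done while the forge was down"

# Only this step needs the remote; retry it once the forge is back:
# git push -u origin offline-work
```

Only publishing your work and fetching someone else's actually block on the remote being up.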

zelphirkalt · today at 3:45 PM

It has probably happened at some point, but personally I have not experienced Codeberg downtime yet. The other day, however, GitHub was down again. I have not used GitLab for a while; when I did, it worked fine, and its CI seems saner to me than GitHub's, but GitLab is not the snappiest user experience either.

Well, Codeberg doesn't have all the features I used on GitLab, but for my own projects I don't really need them either.

iamkonstantin · today at 5:15 PM

> for me as a developer is having the urge to code and not being able to access my remote

I think that's the moment when you choose to self-host whatever git wrapper you prefer. It really isn't that complicated to do, and it even allows for some fun (as in cheap and productive) setups where your forge is on your local network, or really close to your region, and you (maybe) only mirror or back up to a bigger system like Codeberg/GitHub.

In our case, we also use that as an opportunity to mirror OCI/package repositories for the dependencies we use in our apps and during development, so not only are builds faster, but we also don't abuse free web endpoints with our CI/CD requests.
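A minimal sketch of that layout, with plain local directories standing in for the LAN forge and the off-site mirror (the remote names and paths are made up for illustration; in real use they would be SSH URLs):

```shell
# "forge" plays the self-hosted server on the LAN,
# "backup" plays the Codeberg/GitHub mirror.
forge="$(mktemp -d)/project.git"
backup="$(mktemp -d)/project.git"
git init -q --bare "$forge"
git init -q --bare "$backup"

# A working clone pushes day-to-day to the nearby forge...
cd "$(mktemp -d)"
git init -q
git -c user.name=demo -c user.email=demo@example.invalid \
    commit -q --allow-empty -m "initial commit"
git remote add origin "$forge"
git remote add mirror "$backup"
git push -q origin HEAD

# ...and occasionally mirrors everything (all branches and tags) off-site.
git push -q --mirror mirror
```

`git push --mirror` makes the backup an exact copy of all refs, so a cron job or a post-receive hook on the forge can keep the off-site side current without any manual steps.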

nfredericks · today at 5:13 PM

I agree. I switched to Codeberg but switched back after a few months. Funny enough, I found there to be more unreported downtime on Codeberg than GitHub.

maelito · today at 5:11 PM

> Lazy has nothing to do with it, codeberg simply doesn't work.

I've been working on it for months now; it does work, lol.

z3t4 · today at 5:02 PM

I find it ironic that Git was made to get rid of central repos, and then we re-introduced them.

mixmastamyk · today at 3:34 PM

[flagged]

youarewashed · today at 3:11 PM

[flagged]
