• 4 Posts
  • 392 Comments
Joined 2 years ago
Cake day: July 1st, 2023


  • “it’s a bit of a straw man from your side to act like the discussion is about multiplayer”

    The top of this thread:

    If a **multiplayer-only** game shuts down official servers, and you can’t self-host within the game, they should owe players a separate server binary they can run, or a partial refund for breaking the game. It should not be hard, especially if it’s a known constraint when they develop the game.

    Emphasis mine.



  • You’re cool with it until you realize that they only want to do this to gain from it personally. And guaranteed, they will protect their own IP, and the IP of every large corporation.

    It’s just that you and small businesses will no longer have the benefit of intellectual property. Megacorps can steal whatever they want with impunity, since they are the only true holders of intellectual property.

    That sounds good on paper until you look at the long history of these people and how everything they do is focused entirely on their own benefit over that of others. They stand to gain something here, and guaranteed they aren’t going to let themselves lose anything either.

    It’s the same sort of situation as AI regulation. Sam Altman and OpenAI want the United States to crack down and make it extremely difficult to develop new models. Why? So that they have no competition. They already got their foot in the door; now they want to close that door on everyone else.

    This is very likely the same sort of situation.


  • No, it isn’t. This is a crazy, ignorant comment that just hand-waves away the problem I presented because it’s not convenient for your stance.

    If you’re going to comment, don’t comment in bad faith; that’s not the kind of discussion we need on Lemmy.

    The problem begets the solution. And damn near every modern MMO has a significant set of challenges that they have built technological solutions for, and those solutions drive more complicated infrastructure.



  • That’s a good call-out.

    There are a few things I do right now:

    1. All of my public DNS entries for the certs point at Cloudflare, not my IP.
    2. My internal network’s DNS resolver resolves those domains to an internal address, so I don’t rely on NAT reflection.
    3. I drop all connections to those domains in Cloudflare with rules.
    4. In Caddy, I drop all connections that come from a non-internal IP range for all internal services. Additionally, I drop all connections from subnets that should not be allowed to access those services (the network is segmented into VLANs).
    5. I use Tailscale to avoid having routes from the Internet into my internal services for when I’m not at home.
    6. For externally accessible routes, I have entirely separate configurations that proxy access to them, and external DNS still points to Cloudflare, which has very restrictive rules on allowable connections.

    Hopefully this information helps someone else that’s also trying to do this.
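    A sketch of what point 4 might look like in a Caddyfile — hostnames and subnet ranges here are made up for illustration, so adjust them to your own VLAN layout:

```
# Hypothetical Caddyfile: only internal subnets may reach an internal service.
internal-app.example.com {
	# Match anything NOT coming from the LAN or Tailscale ranges...
	@external not remote_ip 192.168.10.0/24 100.64.0.0/10
	# ...and drop those connections outright.
	abort @external

	reverse_proxy 192.168.10.50:8080
}
```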


  • I just:

    1. Have my router set up with DNS entries for the domains I want to resolve locally, pointing them to:
    2. A reverse proxy (Caddy) with automatic certificate management via its built-in ACME client, connected to the Cloudflare API. Any time I add a new domain or subdomain to reverse-proxy to a particular device on my network, a valid certificate is automatically generated for me. Certificates are also automatically renewed.
    3. Navigation I do within my local network to these domains gets real certificates, and my traffic never goes out to the Internet.

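    Assuming a Caddy build that includes the Cloudflare DNS plugin (github.com/caddy-dns/cloudflare), step 2 can be sketched roughly like this — the domain, token variable, and upstream address are placeholders:

```
# Hypothetical Caddyfile: issue and renew certs via Cloudflare's DNS-01
# challenge, so no inbound port 80/443 exposure is needed for validation.
*.home.example.com {
	tls {
		dns cloudflare {env.CLOUDFLARE_API_TOKEN}
	}
	reverse_proxy 192.168.10.50:8080
}
```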




  • We have a principal engineer on our team who is pushing this sort of style, hard.

    It’s essentially obfuscation: no one else on the team can really review, never mind understand and maintain, what they write. It’s all just functional abstractions on top of abstractions; every little thing is a function. Even property/field access is extracted out to a function instead of just… using dot notation like a normal person.
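    A hypothetical illustration of the pattern being described (names made up) — trivial accessor functions layered over plain fields, versus just using dot notation:

```python
from dataclasses import dataclass

@dataclass
class User:
    name: str
    age: int

# The style in question: every property access becomes its own function...
get_name = lambda user: user.name
get_age = lambda user: user.age
describe = lambda user: f"{get_name(user)} ({get_age(user)})"

# ...when ordinary dot notation says the same thing directly:
def describe_plain(user: User) -> str:
    return f"{user.name} ({user.age})"

u = User("Ada", 36)
assert describe(u) == describe_plain(u)
```

    Both produce identical results; the extra layer only adds names a reviewer must chase down.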


  • You might not necessarily have to fork BitTorrent. Instead, if you have your own protocol for grouping the data and breaking it into manageable chunks of a particular size, each of those chunks can be an actual, full torrent. Then you won’t have to worry about completion levels on those torrents, and you can rely on the protocol to do its thing.

    Instead of trying to modify the protocol, modify the process that you use the protocol with.
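    A minimal Python sketch of that idea — split a payload into fixed-size chunks, each of which would then be published as its own ordinary torrent; the chunk size and manifest format here are made up for illustration:

```python
import hashlib

CHUNK_SIZE = 4  # bytes per chunk; a real grouping layer might use e.g. 256 MiB

def split_into_chunks(data: bytes, chunk_size: int = CHUNK_SIZE):
    """Break data into fixed-size pieces; each piece becomes one full torrent."""
    chunks = [data[i:i + chunk_size] for i in range(0, len(data), chunk_size)]
    # The manifest records chunk order and content hashes, so the grouping
    # layer can reassemble the payload once each per-chunk torrent completes.
    manifest = [hashlib.sha1(c).hexdigest() for c in chunks]
    return chunks, manifest

def reassemble(chunks, manifest):
    """Verify each chunk against the manifest, then concatenate in order."""
    assert [hashlib.sha1(c).hexdigest() for c in chunks] == manifest
    return b"".join(chunks)

data = b"example payload"
chunks, manifest = split_into_chunks(data)
assert reassemble(chunks, manifest) == data
```

    BitTorrent itself stays untouched: each chunk is a complete, independently seedable torrent, and the custom logic lives entirely in the wrapper.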