• 14 Posts
  • 307 Comments
Joined 6 months ago
Cake day: November 5th, 2024




  • The internet tankies are just a small, mostly self-isolating group of authoritarian west-haters who wank each other off about how much theory they’ve read. They do not reach out to engage with the rest of the internet for any reason other than that their usual circlejerk spankbank ran dry and they need to refill it with new material. That entails baiting unsuspecting people into arguing with them by spouting rehearsed, reused, easily disproven talking points, so the unsuspecting person stays engaged for a few replies until they’re deep enough into a comment thread that nobody else is going to read it. That’s when their friends show up and they begin jerking each other off over you.

    If you get to this point, you lost. Further replies add to their spankbank. Not replying proves them right and adds to their spankbank.

    The only winning move is not to reply.





  • There’s a problem nobody here is talking about: a lot of people can’t get any peace in their own homes. It’s fine to say people should ease off on concerts, but a constant low-level noise also gives you tinnitus in the long run. Cars, neighbours, appliances, you name it. The building I live in has a constant low drone that’s driving me mad, and there’s nothing I can do about it. My fridge resonates with my walls. My downstairs neighbours play loud music almost every day, often until 2 a.m. My upstairs neighbours have kids who run along the walls between 20:00 and 23:00. I live next to a parking lot where I’ve had to go out and yell at people at shit o’clock in the morning because they just sit there in their running cars and stare. There are plenty of boy-racer cars with tuned exhausts driving circles around my apartment. The ice cream truck parks right outside my window and hammers its bell at the strangest times. There’s never any quiet!








  • Okay, but I wasn’t arguing morality or about children posting nudes of themselves. I’m just telling you that works submitted into the public domain can’t be retracted, and that there are models trained exclusively on open data, which a lot of AI haters don’t know, don’t understand, or won’t acknowledge. That’s all I’m saying. AI is not bad; corporations make it bad.

    The law isn’t a reliable compass for what is or isn’t right.

    Fuck yea it ain’t. I’m the biggest copyright and IP law hater on this platform, and I’ll get ahead of the next 10 replies by saying no, it’s not because I want to enable mindless corporate content scraping; it’s because human creativity shouldn’t be boxed in. It should be shared freely, lest our culture be lost.



  • Tor is just slightly harder to keep listed on the same tables as commercial VPN hosts because it’s so dynamic. Anyone can spin up a node and be a relay or, for the brave/foolish, an exit node in a few minutes.

    Actually, Tor relays and exits are published, public knowledge, and you will be on every list that cares about listing them within hours of spinning up a relay or exit.


  • I can’t make you understand more than you’re willing to understand. Works in the public domain are forfeited for eternity; you don’t get to come back in 10 years and go ‘well, actually, I take it back’. That’s not how licensing works. That’s not victim blaming; that’s telling you not to license your nudes in a manner that lets people use them freely.



  • I actually think it’s very interesting how nobody in this community seems to know or understand how these models work, or even vaguely follow the open-source development of them. The first models didn’t have this problem; it was when OpenAI realized there was money to be made that they started scraping the internet and training illegally, and consequently a billion other startups did the same, because that’s how Silicon Valley operates.

    This is not an issue of AI being bad, it’s an issue of capitalist incentive structures.



  • I think his argument is that the models initially needed lots of data to verify and validate their current operation. Subsequent advances may have allowed those models to be created cleanly, but those advances relied on tainted data, thus making the advances themselves tainted.

    It’s not true; you can train a model from the ground up on properly licensed or open data, so you don’t inherit anything. What you’re describing is called finetuning, which is where you “re-train” an existing model to do something specific because it’s much cheaper than training from the ground up.
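    To make that distinction concrete, here’s a toy sketch (a hypothetical one-parameter model; every name and number below is made up, nothing here comes from a real training pipeline): training from scratch starts gradient descent from fresh weights, while finetuning starts from weights someone else already fit, so a from-scratch run sees only the data you feed it and inherits nothing.

```python
# Toy model y = w * x with true w = 2.0; "pretrained" here just means
# starting gradient descent from weights near the optimum (hypothetical
# illustration, not a real LLM training setup).

def steps_to_converge(w, data, lr=0.01, tol=1e-4, max_steps=10_000):
    """Plain gradient descent on mean squared error; returns step count."""
    for step in range(max_steps):
        grad = sum(2 * (w * x - y) * x for x, y in data) / len(data)
        if abs(grad) < tol:
            return step
        w -= lr * grad
    return max_steps

data = [(x, 2.0 * x) for x in range(1, 6)]

from_scratch = steps_to_converge(w=0.0, data=data)  # fresh init: inherits nothing
finetuned = steps_to_converge(w=1.9, data=data)     # starts from "pretrained" weights
```

    The finetuned run converges in fewer steps because it starts near the optimum; the from-scratch run reaches the same place from its own data alone, which is the whole point about provenance.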