• 0 Posts
  • 42 Comments
Joined 11 months ago
Cake day: July 25th, 2023



  • You mean the always-on, GPS-enabled, internet-connected microphone and camera which is also likely beaconing over Bluetooth and NFC, and contains all of my most personal data including my name, contacts, unencrypted chats facilitated by major cell phone carriers, photos, emails, and other personal files which are also likely synced with a cloud service operated by major multi-national corporations, and also stores biometric data such as facial recognition, fingerprints, time spent sleeping, and even heart rate and number of steps taken, assuming you have “fitness” features enabled?

    With those last couple items, these massive companies that regularly share data with law enforcement are literally tracking your every step and nearly every beat of your heart.

    Well, don’t worry about that, I’ve got ExpressVPN.


  • While you make a valid point here, mine was simply that once something is out there, it’s nearly impossible to remove. At a certain point, the nature of the internet is that you no longer control the data that you put out there. Not that you no longer own it and not that you shouldn’t have a say. Even though you initially consented, you can’t guarantee that any site will fulfill a request to delete.

    Should authors and artists be fairly compensated for their work? Yes, absolutely. And yes, these AI generators should be built upon properly licensed works. But there’s something really tricky about these AI systems. The training data isn’t discrete once the model is built. You can’t just remove bits and pieces. The data is abstracted. The company would have to (and probably should have to) build a whole new model with only properly licensed works. And they’d have to rebuild it every time a license agreement changed.

    That technological design makes it all the more difficult, both in terms of proving that unlicensed data was used and in terms of responding to requests to remove said data. You might be able to get a language model to reveal something solid that indicates where it got its information, but it isn’t simple or easy. And it’s even more difficult with visual works.

    There’s an opportunity for the industry to legitimize here by creating a method to manage data within a model, but they won’t do it without an incentive like millions of dollars in copyright lawsuits.


  • Oh man, I thought it would be your typical excessive force types of issues. Nope.

    “One of the SROs had been terminated by the Chicago Police Department in 2019 for sexual misconduct with a minor.” “One of the officers, Darius Alexander, 50… was caught witnessing a drug transaction, arrested 2 males and took 2 females from the same incident and coerced one of the underage girls into contact…”

    Sounds like the perfect authority figure to place in a school, right? Maybe we wouldn’t need an SRO if the schools weren’t actively inviting predators in.


  • From a legal standpoint, I sort of get it. One risk of the fediverse is that data from federated servers is cached locally. That could put server owners in legal jeopardy for hosting illegal content. However, if the server is actively moderated and owners respond responsibly to takedown requests, they should be okay - in the US at least, and assuming current protections for service providers remain intact.

    I think a good option (if technically feasible) could be to have the choice to de-cache communities or servers that are questionable and make it so that data is transmitted live from the federated server when requested by a client. That would add load to both the local and federated servers though, especially if volume is high.
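    To make the idea above concrete, here’s a minimal sketch of what that de-cache option might look like: posts from flagged communities are never stored locally, just proxied live from the origin server per request, while everything else is cached as usual. All the names, data structures, and the fetch stub here are invented for illustration - this isn’t any real Lemmy or ActivityPub API.

    ```python
    # Hypothetical "de-cache" sketch: communities on this list are never
    # persisted locally; their content is fetched from the origin server
    # on every client request and passed straight through.
    DECACHED_COMMUNITIES = {"questionable@remote.example"}

    # (community, post_id) -> content stored at federation time
    local_cache = {}

    def fetch_remote(community, post_id):
        # Stand-in for an HTTP fetch from the federated origin server.
        return f"live:{community}/{post_id}"

    def serve_post(community, post_id):
        if community in DECACHED_COMMUNITIES:
            # Pass through to the client without ever writing to disk.
            return fetch_remote(community, post_id)
        key = (community, post_id)
        if key not in local_cache:
            local_cache[key] = fetch_remote(community, post_id)
        return local_cache[key]
    ```

    The trade-off is exactly the one noted above: every request for a de-cached community costs a round trip to the origin server, so both sides pay for it when volume is high.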