>put messages into someone else's system
>don't read privacy policy
>someone else uses your messages
surprisedpikatchu.jpg
Seriously. What would be surprising is if they were not. Proprietary System gonna Proprietary System.
Just the other day, @rottingleaf@lemmy.zip and I "designed" a new messenger to combat things like this: https://lazysoci.al/comment/9619656
Your idea doesn't sound too difficult to implement, but I don't know if people would want to store all these messages locally when the vast majority of people are used to having their shit stored elsewhere. Additionally, if you wanted to target enterprise users, they would likely want to have all their messages centralised for auditing purposes.
Other than that, I think it's a pretty neat idea
I think that's the issue. We're all so used to the idea of free storage and we're not cognizant of the consequences. If we start holding some of our chips in our own hands, all these corporations won't be able to sell us out and abuse us so easily.
Also thank you!
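A minimal sketch of that local-first idea, assuming the client simply appends every message to a SQLite file on the user's own disk (the names `open_archive` and `archive_message` are made up for illustration, not part of any real messenger):

```python
# Minimal sketch of a local-first message archive: every message the client
# sends or receives lands in a SQLite file the user controls. The names
# open_archive / archive_message are hypothetical, not from any real project.
import sqlite3
from datetime import datetime, timezone

def open_archive(path="my_messages.db"):
    db = sqlite3.connect(path)
    db.execute(
        """CREATE TABLE IF NOT EXISTS messages (
               id      INTEGER PRIMARY KEY,
               sent_at TEXT NOT NULL,
               peer    TEXT NOT NULL,
               body    TEXT NOT NULL
           )"""
    )
    return db

def archive_message(db, peer, body):
    # The message lives on the user's own disk, not on a provider's servers.
    db.execute(
        "INSERT INTO messages (sent_at, peer, body) VALUES (?, ?, ?)",
        (datetime.now(timezone.utc).isoformat(), peer, body),
    )
    db.commit()

if __name__ == "__main__":
    db = open_archive()
    archive_message(db, "@rottingleaf@lemmy.zip", "what if the client kept everything locally?")
    for row in db.execute("SELECT sent_at, peer, body FROM messages"):
        print(row)
```

The point is only that holding your own chips can be as mundane as a file you control; sync and encryption would sit on top of something like this.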
It's what's going to happen. It's what always happens. And on a side note, by the way, I guaran-fucking-tee you that it's what's going to eventually happen with Discord as well. I have zero doubt about it.
I'm surprised that Slack beat Discord to it.
But yeah, you're right. We need to invest our time, energy and support into self hosted solutions.
Do you know how to break the cycle? Use open-source software. Use standard protocols that aren't locked behind some greedy corporation.
Why not take the features from Discord/Slack and integrate them into a new IRC or Jabber protocol?
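For what it's worth, the open-protocol route is already approachable. Here's a rough sketch of speaking plain IRC (RFC 2812) straight over a socket, with Libera.Chat used as an example network and a made-up nick and channel, so no vendor API sits in the path:

```python
# Rough sketch: talk the open IRC protocol (RFC 2812) directly over a TCP
# socket. The network is just an example; NICK and CHANNEL are made up.
import socket

HOST, PORT = "irc.libera.chat", 6667
NICK, CHANNEL = "beehaw_demo_nick", "#beehaw-demo"

with socket.create_connection((HOST, PORT)) as sock:
    def send(line):
        # IRC is line-oriented: every command ends with CRLF.
        sock.sendall((line + "\r\n").encode("utf-8"))

    # Register the connection (NICK, then USER).
    send(f"NICK {NICK}")
    send(f"USER {NICK} 0 * :{NICK}")

    buf = b""
    while True:
        data = sock.recv(4096)
        if not data:
            break  # server closed the connection
        buf += data
        *lines, buf = buf.split(b"\r\n")
        for raw in lines:
            line = raw.decode("utf-8", errors="replace")
            print(line)
            if line.startswith("PING"):
                # Mandatory keepalive reply, or the server drops us.
                send("PONG " + line.split(" ", 1)[1])
            elif " 001 " in line:
                # Numeric 001 = registration accepted; now join and talk.
                send(f"JOIN {CHANNEL}")
                send(f"PRIVMSG {CHANNEL} :hello over a plain, open protocol")
```

Everything your client stores or logs from that stream is yours; the "features" would have to be rebuilt on top, which is the hard part.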
Has anybody tried Revolt? It looks really cool. Like a proper open source alternative to Discord. But I never had the opportunity to try it with anyone, so I don't know.
Interesting, the name sounds familiar. It's not based on Matrix, and they're planning encrypted messages. Oh, you can self-host it!
Mumble hasn't fed any of my data to a megacorp!
Technically, being open source or free à la GPL isn't enough. Protocols aren't enough.
You need a guarantee that you own your data.
I mean, if the Threadiverse has enough volume to be useful, someone -- probably many people -- are going to be logging and training things off it too.
That's the nature of public shit posting.
The real issue is that tech creeps and other pests think they own my shit posting.
At this point, I think the genie is out of the bottle. I feel like unless you're on some p2p encrypted chat, anything typed into the internet is getting scraped. I'm sure everyone at this point has had at least one comment scraped and used for language model stuff.
I don't like it. But it seems like corporations will always find ways to make money off of other people no matter what
To be honest, if someone thought that public things on the internet are not getting scraped, I am not sure what to tell them… Search engines have been doing it since the beginning of search engines, it is no wonder that the same would be done to train AI.
Corporations are bad and yet still follow laws (in the west). The bigger issue is state actors. Especially the non-democratic ones.
It's not "no matter what". Under the system we have, they are not only not punished for doing so, they are heavily incentivized to do so. There are ways to punish bad actors that de-incentivize other potential bad actors; our politicians actively choose to prioritize these bad actors' ability to do harm over the well-being of the population.
I honestly don't get the outrage over that. I feel like I'm in the minority on that, though. I don't care if linguistic statistics are gathered from my public comments. Knock yourself out.
This story is about "private" messages on a free hosted service, and I think their users are just being naive if they think this is beyond the pale. But I get the feeling of violation at least a little.
I know of a few security companies that use Slack to work together, and that includes a shitton of private data, source code, and confidential information.
Guess whoever introduced the company to the Slack service fucked up by not reading their policies.
Or they're using the paid tier
I'm working in fintech, and we share PII through DMs all the time (for investigation purposes). I'd be really surprised if the AI would need to train on that.
This was definitely a fuckup from Slack but as I've understood it, the "AI training" means that they're able to suggest emoji reactions to messages.
Not sure how to think about this, but here's some additional info from Slack: https://slack.engineering/how-we-built-slack-ai-to-be-secure-and-private/
Edit: Just to pick the main point from the article:
Slack AI principles to guide us.
AI training to suggest emoji reactions? Really? 😂
Interesting how MS is the reasonable one here: all their Copilot stuff clearly separates paying business customers from the free consumer stuff for training / not training.
Slack, however, has gone and said they will train on everything, and ONLY the paying companies can request to opt out.
Too bad, so sad for all those small dev teams that have been using the "free" version of Slack. No option to opt out.
Wasn't there a competitor named Mattermost?
a FLOSS competitor?
We need to watermark insert something into our watermark posts that watermark can be traced back to its origin watermark if the AI starts training watermark on it.
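A tongue-in-cheek sketch of that joke, with made-up helpers: interleave a marker token into a post, then check later whether scraped text still carries it. Real text watermarking relies on far subtler statistical signals than this.

```python
# Purely illustrative watermarking gag: sprinkle a marker word into a post,
# then test whether a copy of the text still carries it.
MARKER = "watermark"

def add_watermark(text, every=4, marker=MARKER):
    # Insert the marker after every `every`-th word of the original post.
    words = text.split()
    out = []
    for i, word in enumerate(words, start=1):
        out.append(word)
        if i % every == 0:
            out.append(marker)
    return " ".join(out)

def carries_watermark(text, every=4, marker=MARKER, threshold=0.5):
    # Check whether the marker still shows up where we expect it to.
    words = text.split()
    hits = sum(1 for i in range(every, len(words), every + 1) if words[i] == marker)
    expected = max(1, len(words) // (every + 1))
    return hits / expected >= threshold

if __name__ == "__main__":
    post = add_watermark("We need to insert something into our posts that can be traced back to its origin")
    print(post)
    print(carries_watermark(post))
```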