IIRC Chrome stores your local cookies/session data in a place malware could also attack. Probably the same idea for other browsers.
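Rough sketch of what that looks like in practice (the Linux profile path is an assumption, Windows/macOS differ, and newer Chrome versions encrypt values with an OS-level key that code running as the same user can typically still unlock):

```python
# not an attack tool -- just illustrating that the cookie store is an
# ordinary SQLite file readable by anything running under your account
import sqlite3
from pathlib import Path

# default Linux profile location (assumption; adjust per OS/profile)
cookie_db = Path.home() / ".config/google-chrome/Default/Cookies"

con = sqlite3.connect(f"file:{cookie_db}?mode=ro&immutable=1", uri=True)
rows = con.execute("SELECT host_key, name, value, encrypted_value FROM cookies LIMIT 5")
for host, name, value, encrypted in rows:
    # modern Chrome leaves `value` empty and stores an OS-key-encrypted blob instead
    print(host, name, value or f"<{len(encrypted)} encrypted bytes>")
```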
I'm not sure I fully understand the issue here. If we're ok with that info being trivially retrievable by a bad actor, why isn't this ok?
Like I get you may not like it, and it's a target, but there are already lots of targets that have gotten a pass based on user permissions. Is it just the breadth of potential info? With the cookies you could potentially log into someone's bank account.
First, false equivalency.
Second, we’re not okay with cookies and session data being stored somewhere they could leak — it’s why we’re doing everything possible to stop that from happening (GDPR alone is one effect of this).
Third, if you can’t see a difference between cookies, which actually can be secured via proper encryption and signing, and a literally unencrypted database built from OCRed screenshots (taken every couple of minutes) that requires an opt-out and is a very small slippery slope away from that data making its way back to Microsoft’s own servers for their own greedy pursuits... then I’m not sure what to tell you.
Recall is wrong. And it’s indefensible. Period.
If you think it’s okay, then feel free to open up everything about who you are and what you do on your Copilot+ PC to Microsoft. I, for one, among many, will choose to secure my information as best as possible, including never using another Microsoft product again if at all possible. And I’ve already done so for myself.
GDPR has little to do with this. People use site cookies to stay logged in between sessions, etc. I'd guess most browser users use and want that functionality. If you're fully opting out and don't even keep persistent sessions, I'm guessing you're in a small minority of users here.
I'm not aware of any non-trivial, readily available, built-in encryption for cookies. There are easy-to-find libraries that exist just to pull cookies (stored locally, including session tokens) out of the browser.
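For instance (rough sketch, assuming the third-party browser_cookie3 package is installed; exact function names may vary by version):

```python
# rough sketch: pulling the current user's browser cookies with an off-the-shelf library
import browser_cookie3

jar = browser_cookie3.chrome()   # or .firefox(), or .load() to grab whatever browsers are present
for cookie in jar:
    if "session" in cookie.name.lower():
        # printing only a prefix -- the full value is effectively the login token
        print(cookie.domain, cookie.name, (cookie.value or "")[:8] + "...")
```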
To clear up a bit more misinformation from your response: this is an offline feature. The data doesn't go back to Microsoft; it works even if your computer is disconnected from the internet. If you consider their word on this to be a lie, that's your right, but until proven, it isn't a fact.
Not at all true; GDPR is the exact reason you see so many sites these days letting users know that they store cookies and asking you to accept that. Hence my point that we, as a global society, are trying to do something about this, even if it's something as simple as cookie-use disclosure on sites -- it's a start.
Never once said I did.
You're correct that data-at-rest encryption doesn't really exist for cookies, but data-in-flight encryption does, via TLS. Also, signing cookies and SameSite origin restrictions are common practice these days, and if implemented properly they make it quite improbable that cookies can be abused to leak logins to those sites.
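For example, a minimal sketch (hypothetical Flask app, just to illustrate signing plus the Secure/HttpOnly/SameSite flags, not any particular site's setup):

```python
# minimal sketch of a signed, SameSite-restricted session cookie
from flask import Flask, make_response
from itsdangerous import URLSafeTimedSerializer

app = Flask(__name__)
signer = URLSafeTimedSerializer("server-side-secret")  # placeholder secret

@app.route("/login")
def login():
    token = signer.dumps({"user_id": 42})   # signed, so tampering is detectable server-side
    resp = make_response("logged in")
    resp.set_cookie(
        "session", token,
        secure=True,        # only ever sent over HTTPS (the data-in-flight part)
        httponly=True,      # not readable from page JavaScript
        samesite="Strict",  # not attached to cross-site requests
    )
    return resp
```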
For the moment, that's what they say, yes. And that's the problem, especially since it's turned on by default. This -- is not -- something -- Microsoft has earned trust for.
But you are free to believe them all you want -- the rest of us who have seen what Microsoft has done these past 40 years use that as a guide to judge -- and history is usually a very good judge.
browser data is a potential liability, sure, but you have tools to manage it. you can delete pages or entire websites, you can use private windows, you can purge history older than 6 months or something like that, and at least a few browsers have a "forget" button that wipes out the last two hours of history. similar deals with cookies and other data, and we've collectively decided the benefit of having browser data is worth the risk.
not so here. Recall is a record of everything you've ever done on your PC. you can't selectively delete things like you can with browser history, the app and website exclusion is only as good as whatever Recall is using to detect apps and websites, and you can't redact sensitive info after the fact. people are generally okay with browser history and data because they know they have fine-grained controls to manage it, controls Recall doesn't have
So if they had a UI with buttons to 'pause for X length' (could be forever) and 'forget the last X length' (once again, could be forever), but everything else stayed the same, would it be acceptable?
Like I'm genuinely curious here.
if i were designing a recall program, here's how i would do it: it would take a screenshot every five seconds, OCR it, then run it through local quantized image-recognition and word-association neural networks, and toss everything into a CryFS vault. when launching the recall program, you have to provide the password to unlock the vault so it can read and write to it.
it can only run in the foreground (so you have to keep the window open for it to run, no closing it and forgetting about it), and it will display a status indicator in your system tray that provides a menu to pause or stop recording.
afterwards, you can mark any text or region of the screen for redaction, and it'll redact it across all screenshots and delete it from the database; you can delete individual screenshots or entire periods of time; and there will be an easily accessible self-destruct option that shreds the database (i.e. overwriting it with random garbage 21 times before deleting it off the disk). this is all offline and the application will not request network access
i'm just making this up on the fly, so there are absolutely security and privacy considerations i forgot about, but this is the bare minimum i would like to see
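to make that concrete, here's a very rough, untested python sketch of just the capture loop -- the library choices (mss for screenshots, pytesseract for OCR) and the vault path are my assumptions, and the vault is assumed to already be mounted/unlocked:

```python
# very rough sketch of the capture loop described above (not the real Recall)
import time
import sqlite3
from datetime import datetime
from pathlib import Path

import mss                 # cross-platform screenshots
import pytesseract         # OCR wrapper (needs the tesseract binary installed)
from PIL import Image

VAULT = Path.home() / "recall-vault"   # intended to be a CryFS (or similar) mountpoint
VAULT.mkdir(exist_ok=True)

db = sqlite3.connect(VAULT / "index.db")
db.execute("CREATE TABLE IF NOT EXISTS snapshots (taken_at TEXT, image TEXT, text TEXT)")

def capture_once(sct):
    """grab the primary monitor, OCR it, and index the result inside the vault."""
    ts = datetime.now().strftime("%Y%m%d-%H%M%S")
    img_path = VAULT / f"{ts}.png"
    sct.shot(mon=1, output=str(img_path))                      # screenshot of monitor 1
    text = pytesseract.image_to_string(Image.open(img_path))   # searchable text
    db.execute("INSERT INTO snapshots VALUES (?, ?, ?)", (ts, str(img_path), text))
    db.commit()

if __name__ == "__main__":
    with mss.mss() as sct:
        while True:             # foreground loop; Ctrl+C (or the tray control) stops it
            capture_once(sct)
            time.sleep(5)       # one snapshot every five seconds
```

the redaction, deletion, and self-destruct pieces would all sit on top of that snapshots table.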
IIUC it wouldn't be able to start automatically then, right? I mean, I guess you could drag it into startup, but it would need the password to start. From a security-minded perspective that's good, but from a user perspective it kind of sucks. I already unlocked the computer; as a user I'd just want it to 'work'.
There is always a tug of war between the best level of security and user experience. I guess the best security is to get rid of the human element, though... so eh.
Always being forced to the foreground makes it even less convenient, and kind of odd. I dig the status tray control though. I don't see this functionality as being useful if you have to remember to turn it on; if I remembered what I was doing well enough to turn it on, I'd just write down whatever I'd otherwise forget. To me it's about allowing the user to pick their comfort level.
I figure the CryFS piece could be a BitLocker volume with a different key than the base C: drive's key to get similar protection. In theory it could also be based on the C: drive's BitLocker key for a less secure, but still hardware-level-secured, middle ground. I'd have to think about it more.
The other stuff mentioned is basically what it already does locally in terms of OCR and recognition... just with proprietary local recipes.
Thanks for your thoughts.
that's true, but since this is a record of everything you've ever done, i feel this is the irreducible minimum for security. a separate password prompt would signal to the less technically-minded users that this is Serious
this is a design pattern i borrowed from Linux (my OS of choice). modern Linux apps require your explicit permission to run in the background, so most of them don't even bother with running in the background at all. that said, i suppose it can run in the background, as long as the status indicator is sufficiently noticeable, but you'd have to go into the settings and flip that switch yourself
i imagine that it would become a habit, or you'd set it to run on startup. my use case would be turning it on for specific tasks like research or shopping, where you might only later remember that that one thing you saw was actually really valuable
can a user-installed app do that?
When you go on the internet you are accessing content on other people's computers. You are saying, "I want such and such document". There's an inherent lack of privacy in browsing the internet. You can try to be private about it, but ultimately you're not changing that you're requesting data from other people's computers and sending them data.
When you are doing something else on your PC besides browsing the web, Recall is still taking screenshots and tracking you: what apps you use, what pictures you view, and many other things that might be completely offline and that you don't necessarily want a history of stored on your PC, with screenshots and searchable summaries. Do you want each and every one of your fap sessions recorded? Why would you want any of your offline activity recorded?
What if you forget to pause this feature and someone finds these screenshots? Who cares, right? What if you're a closeted gay teen living in a conservative country and your family finds the history?
Then there are people who don't understand computers using offline business software for accounting or whatever, and even if they store their data files on an encrypted drive or something, Recall is taking screenshots of everything they do. If they don't even know it's happening, their PC could have years of data that could be stolen from them at any point in the future, even if they never open those encrypted files again. Obviously, if their computer is pwned, then the hackers could just take the unencrypted files when they're next accessed, but Recall snapshots everything all the time, even if you delete it.
Edit a nude photo of yourself on your PC and forget to turn off Recall, then later decide to delete the photo... Too bad, Recall still has it.
It's a feature that's... okay if you want it, but it should not be part of the operating system, and it definitely shouldn't be opt-out. It should be an app that you install with deliberate purpose, if and only if you want it and understand the security and privacy risks.
Microsoft instead wants to install it by default and probably turn it on by default. Even if it ends up being opt-in, MS has a long history of asking people to enable features in misleading ways. And the vast majority of Windows users don't understand computers!
I tend to agree with a lot of what is said here. Though, to be clear, it is (assuming they're honest) local-only.
If it were an opt-in feature with robust configuration options, including encrypting the DB with a key tied to your login session and auto-locking it on log off/reboot (until you log in again): is that good enough, or would folks then say we should assume the account is also compromised?
I'm trying to disambiguate between generalized AI dislike, Microsoft dislike, Windows dislike, distrust, etc., to consider a world where this exists in Windows and people who would use the feature could feel comfortable with it.
In other words, consider an app that did the same thing. What security constraints would be expected?
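For example (a hypothetical sketch using the keyring and cryptography packages, not anything Recall actually does): park the index's encryption key in the OS credential store, which is only unlocked by your login session:

```python
# hypothetical sketch: encrypt snapshot text with a key held in the OS
# credential store (Windows Credential Manager / macOS Keychain / Secret Service)
import keyring
from cryptography.fernet import Fernet

SERVICE, ACCOUNT = "recall-like-app", "index-key"   # made-up identifiers

def get_or_create_key() -> bytes:
    key = keyring.get_password(SERVICE, ACCOUNT)
    if key is None:
        key = Fernet.generate_key().decode()
        keyring.set_password(SERVICE, ACCOUNT, key)  # only retrievable while you're logged in
    return key.encode()

fernet = Fernet(get_or_create_key())

def store(ocr_text: str) -> bytes:
    """encrypt OCR text before it ever hits the database."""
    return fernet.encrypt(ocr_text.encode())

def load(blob: bytes) -> str:
    return fernet.decrypt(blob).decode()
```

Of course, anything running as you could ask the keyring for the same key, which is exactly the "assume the account is also compromised" question.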