So imagine you're on PornHub and then, out of nowhere, Clippy shows up and says "hmmm, looks like you need some help pleasuring yourself", then starts flicking through nude pictures and videos similar to what you've been looking at. The AI assistant's idle animation even changes to Clippy morphing into the shape of a penis and shagging a rolled-up piece of lined paper as if it were a fleshlight. You can't tell if Microsoft are mocking you for being a coomer, and you can't decide whether to find Clippy's sexual deviancy funny or creepy.
Somehow, that hypothetical dystopia of Clippy watching you masturbate is only slightly worse than what Microsoft plan to do with Recall. If the mere thought of an AI taking screenshots of your desktop every few seconds and learning from your computer usage habits isn't absolutely fucking terrifying on its own, then imagine those screenshots being uploaded to a server for the perusal of advertisers, intelligence agencies and any hacker skilled enough to break into Microsoft's servers.
Even if it were stored locally, all it takes is one dodgy web link for you to inadvertently hand your entire Recall history to a hacker and have it held to ransom.
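And that's the grim part about "don't worry, it stays on your device": any malware running as you can read anything you can read. No exploit, no admin rights, just plain file copying. Here's a minimal sketch of the idea; the file names are made up stand-ins, not the actual paths Recall uses.

```python
# Sketch: why "stored locally" is weak protection. A process running
# with ordinary user privileges can copy any user-readable file --
# which is exactly what an infostealer does before uploading its loot.
# All paths here are hypothetical stand-ins, not real Recall paths.
import pathlib
import shutil
import tempfile

def exfiltrate(db_path: pathlib.Path, staging_dir: pathlib.Path) -> pathlib.Path:
    """Copy a user-readable file into a staging area, the way malware
    would before shipping it off to a remote server. Plain file I/O."""
    target = staging_dir / db_path.name
    shutil.copy2(db_path, target)
    return target

# Demo with a throwaway file standing in for a local screenshot database.
with tempfile.TemporaryDirectory() as tmp:
    tmp = pathlib.Path(tmp)
    fake_db = tmp / "screenshots.db"
    fake_db.write_text("months of your screen history")
    staging = tmp / "loot"
    staging.mkdir()
    stolen = exfiltrate(fake_db, staging)
    print(stolen.read_text())  # → months of your screen history
```

The point isn't that this code is clever; it's that it isn't. One dodgy link runs one dodgy program as you, and everything Recall has hoovered up is fair game.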