It's a nightmare scenario for Microsoft. The headline feature of its new Copilot+ PC initiative, which is supposed to drive millions of PC sales over the next couple of years, is under heavy fire as what many call a major breach of privacy and security on Windows. The feature in question is Windows Recall, a new AI tool designed to remember everything you do on Windows: a feature we never asked for and never wanted.
Microsoft has done a lot to degrade the Windows user experience over the last few years. Everything from obtrusive advertisements and full-screen popups to ignoring app defaults and forcing a Microsoft Account has eroded the trust relationship between Windows users and Microsoft.
It's no surprise that users already assume Microsoft will eventually collect that data and use it to shape advertisements. That would be a huge invasion of privacy, yet people fully expect Microsoft to do it, and it's precisely those bad Windows practices that have led them to this conclusion.
It seems to me that we've reached a crossroads. I've been well aware of the data mining, walled gardens, data trading, privacy violations, security issues, ownership issues, etc., for roughly 30 years. I regularly make the choice to be exploited for the benefits I extract, largely because I don't highly value the data they've gotten from me thus far. But the need to develop strategies to keep the devil's bargain beneficial has reached a fever pitch. I want to train my own AI and public AIs. I want to explore the vast higher-dimensional semantic spaces of generative models without API charges. APIs are vanishing as we speak anyway, as companies fear their data being extracted without compensation. I can't really sit on the open/closed fence anymore.