• gravitas_deficiency@sh.itjust.works · +83 / -1 · edited · 4 months ago

    It’s entirely a nonstarter for entire fucking industries. That’s not hyperbole. I work in one of them.

    Edit: scratch that. If any infosec team, anywhere, in any industry, at any corporation or organization, doesn't categorically refuse to certify for use any system running MS Recall, they should be summarily fired and blackballed from the industry. It's that bad. For real: this is how secrets (as in, cryptographic ones) get leaked; see the sketch at the end of this comment. The exposure and liability inherent to this service are comical in the extreme. This may actually kill the product.

    E2: to the title's implication that such trust can be earned: it kinda can't. That's the whole point of really good passwords and secrets (private keys, basically): nobody else knows them. Trying to dance around that is fundamentally futile. Also: who am I kidding, this shit will sell like hotcakes. Everyone's on fucking Facebook, and look how horrifically they exploit everyone's data for goddamn everything. This isn't much worse than that to the average mostly-tech-illiterate consumer.
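    For what it's worth, here's a rough sketch of the secrets failure mode, with an invented database path, table, and column names (not Recall's actual layout): once everything ever shown on screen is OCR'd into a local plaintext store, recovering a private key or API token that was ever displayed is a trivial pattern match.

    ```python
    # Hypothetical sketch: grep an OCR'd screen-history database for credentials
    # that were ever visible on screen. Path, table, and column names are invented.
    import re
    import sqlite3
    from pathlib import Path

    DB_PATH = Path.home() / "AppData" / "example_activity_history.db"  # made-up location

    SECRET_PATTERNS = [
        re.compile(r"-----BEGIN (?:RSA |EC |OPENSSH )?PRIVATE KEY-----"),  # PEM private keys
        re.compile(r"AKIA[0-9A-Z]{16}"),                                   # AWS access key IDs
        re.compile(r"ghp_[A-Za-z0-9]{36}"),                                # GitHub personal tokens
    ]

    conn = sqlite3.connect(DB_PATH)
    for row_id, ocr_text in conn.execute("SELECT id, ocr_text FROM captures"):
        for pattern in SECRET_PATTERNS:
            if pattern.search(ocr_text or ""):
                print(f"capture {row_id}: hit for {pattern.pattern}")
    conn.close()
    ```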

  • Juki@lemmy.world · +50 / -2 · 4 months ago

    For all the invasive problems this feature causes, what the fuck does it actually do? The ability to ask an AI what website you were on last Thursday? Who needs this garbage?

    • XTL@sopuli.xyz · +20 / -2 · 4 months ago

      The most evil company that ever existed needs it. So you will have it by default.

    • Z4rK@lemmy.world · +10 · 4 months ago

      The concept is useful. A well-known early articulation of the idea is Vannevar Bush's famous "As We May Think" article all the way back in 1945, which conceptualized a "Memex" machine that would augment human capabilities like memory and recall. A lot of people need help with this and already use devices for it daily: notes, map lookups of where you parked, Find My for devices, analytics for photo libraries, and so on.

      The only issue here is the implementation.

    • Zeppo@sh.itjust.works · +27 / -2 · 4 months ago

      It doesn't transmit the data; it supposedly stores it locally. The issue is that it's a huge, convenient plaintext trove of information if the system is compromised.
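      To illustrate what "compromised" means in practice, here is a minimal hypothetical sketch (the file location and schema are made up, not Recall's real ones): any code already running as the logged-in user can simply copy and dump the whole history, because disk encryption only protects data at rest, not data on a live, unlocked session.

      ```python
      # Hypothetical sketch: a process running as the logged-in user copies and dumps
      # a local screen-activity database. Path and schema are invented for illustration.
      import shutil
      import sqlite3
      from pathlib import Path

      SOURCE_DB = Path.home() / "AppData" / "example_activity_history.db"  # made-up location
      STOLEN_COPY = Path("loot.db")

      shutil.copy2(SOURCE_DB, STOLEN_COPY)              # no special privileges needed
      conn = sqlite3.connect(STOLEN_COPY)
      rows = conn.execute(
          "SELECT captured_at, app_name, ocr_text FROM captures ORDER BY captured_at"
      )
      for captured_at, app_name, ocr_text in rows:
          print(f"[{captured_at}] {app_name}: {ocr_text[:80]}")  # the user's history, in plaintext
      conn.close()
      ```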

  • Dariusmiles2123@sh.itjust.works · +23 · 4 months ago

    When I read this, I'm glad I ain't using Windows anymore.

    If it were turned off by default, it would be different, since people would be consciously choosing to enable it. But having it turned on by default should be illegal.

    As some people are saying, a lot of this isn’t gonna be legal in some countries.

  • MyOpinion@lemm.ee · +13 · 4 months ago

    What this opens the door to is Microsoft being able to get your database and ask it questions as if it were talking to you: an AI agent of you that they can do whatever they like with. This is insanely dangerous.

    • ripcord@lemmy.world · +6 · 4 months ago

      Particularly since they’re requiring everyone to log in using credentials via their infrastructure.

      They absolutely have a way in.

  • AutoTL;DR@lemmings.world (bot) · +3 · 4 months ago

    This is the best summary I could come up with:


    This, as many users in infosec communities on social media immediately pointed out, sounds like a potential security nightmare.

    Copilot+ PCs are required to have a fast neural processing unit (NPU) so that processing can be performed locally rather than sending data to the cloud; local snapshots are protected at rest by Windows' disk encryption technologies, which are generally on by default if you've signed into a Microsoft account; neither Microsoft nor other users on the PC are supposed to be able to access any particular user's Recall snapshots; and users can choose to exclude specific apps or (in most browsers) individual websites from Recall's snapshots.

    This all sounds good in theory, but some users are beginning to use Recall now that the Windows 11 24H2 update is available in preview form, and the actual implementation has serious problems.

    Security researcher Kevin Beaumont, first in a thread on Mastodon and later in a more detailed blog post, has written about some of the potential implementation issues after enabling Recall on an unsupported system (which is currently the only way to try Recall since Copilot+ PCs that officially support the feature won’t ship until later this month).

    The short version is this: in its current form, Recall takes screenshots and uses OCR to grab the information on your screen; it then writes the contents of windows, plus records of different user interactions, to a locally stored SQLite database to track your activity.
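    As a rough sketch of that screenshot, OCR, and SQLite loop (the libraries and schema below are illustrative stand-ins, not what Microsoft actually ships):

    ```python
    # Illustrative pipeline in the shape described above: periodically grab the screen,
    # OCR it, and append the recognized text to a local SQLite database.
    # Pillow/pytesseract and the schema are stand-ins, not Recall's implementation.
    import sqlite3
    import time

    import pytesseract            # OCR wrapper around Tesseract
    from PIL import ImageGrab     # full-screen capture

    conn = sqlite3.connect("activity_history_example.db")
    conn.execute(
        "CREATE TABLE IF NOT EXISTS captures ("
        "id INTEGER PRIMARY KEY, captured_at REAL, app_name TEXT, ocr_text TEXT)"
    )

    def snapshot(app_name: str = "unknown") -> None:
        """Take one screenshot, OCR it, and store the recognized text."""
        image = ImageGrab.grab()                      # capture the current screen
        text = pytesseract.image_to_string(image)     # plain-text OCR result
        conn.execute(
            "INSERT INTO captures (captured_at, app_name, ocr_text) VALUES (?, ?, ?)",
            (time.time(), app_name, text),
        )
        conn.commit()

    while True:                                       # snapshot every few seconds
        snapshot()
        time.sleep(5)
    ```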

    Data is stored on a per-app basis, presumably to make it easier for Microsoft’s app-exclusion feature to work.


    The original article contains 710 words, the summary contains 260 words. Saved 63%. I’m a bot and I’m open source!