Users' Windows 11 Recall database and screenshots can be accessed from another account

Security concerns surrounding one of Microsoft's new AI features have begun to pop up.

Later this month, Microsoft will release the first batch of Copilot+ PCs, laptops designed with AI in mind that ship with several unique AI programs out of the box. One of these is Recall, which essentially allows users to go back and view their previous activity across any app or browser on their computer. As users have begun to test the Recall feature, some are pointing out how easy it is for personal information to fall into the wrong hands.

In a recent opinion piece, Ars Technica held a magnifying glass over Microsoft’s new Recall feature. While Microsoft has stated that there will be proper encryption on the Copilot+ PC devices that ship with Recall, this isn’t the case for those testing the feature on other hardware. In a blog post, security researcher Kevin Beaumont laid out the massive security risk that comes with Recall.

[Image: The Microsoft Copilot+ PC logo next to a laptop display. Source: Microsoft]

The way Recall works is that it constantly takes screenshots of whatever’s on your screen and stores them in a database that you can search to find exactly what you were doing at a given date and time. In its current form, this information can be easily accessed by someone using the same computer, even if they’re logged into a different account. The database can also be read by malware in the event of an infection.
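To illustrate how thin the protection currently is: Beaumont’s write-up describes the preview builds keeping everything in an ordinary SQLite file (ukg.db) under the user’s AppData folder, with the screenshots saved alongside it. Below is a minimal sketch of reading that database with stock Python. The path layout comes from his analysis of preview builds and may change before release, so treat it as an assumption rather than a documented interface.

```python
import glob
import sqlite3

# Preview-build location Beaumont reported; the GUID directory name varies
# per machine, so glob for it. (An assumption, not a documented path.)
DB_PATTERN = r"C:\Users\*\AppData\Local\CoreAIPlatform.00\UKP\*\ukg.db"

for db_path in glob.glob(DB_PATTERN):
    # Open read-only with a plain SQLite client. No decryption step is
    # needed, which is the crux of the complaint: any process or co-user
    # with read access to the file can read the history.
    uri = "file:" + db_path.replace("\\", "/") + "?mode=ro"
    con = sqlite3.connect(uri, uri=True)
    try:
        # List the tables rather than guessing at the schema.
        tables = con.execute(
            "SELECT name FROM sqlite_master WHERE type = 'table'"
        ).fetchall()
        print(db_path, [name for (name,) in tables])
    finally:
        con.close()
```

Whether another local account can reach the file depends on the folder’s ACLs, but Beaumont’s point stands either way: the only barrier is ordinary file permissions, not encryption.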

We won’t know just how big of a security risk Recall is until the Copilot+ PCs ship later this month. That said, stick with Shacknews for important stories out of the tech industry.

News Editor

Donovan is a journalist from Maryland. His oldest gaming memory is playing Pajama Sam on his mom's desktop during weekends. Pokémon Emerald, Halo 2, and the original Star Wars Battlefront 2 were some of the most influential titles in awakening his love for video games. After interning for Shacknews throughout college, Donovan graduated from Bowie State University in 2020 with a major in broadcast journalism and joined the team full-time. He is a huge film fanatic and will talk with you about movies and games all day. You can follow him on Twitter @Donimals_

From The Chatty
    • reply
      June 4, 2024 10:06 PM

      [deleted]

      • reply
        June 4, 2024 10:06 PM

        [deleted]

      • reply
        June 4, 2024 11:26 PM

        Yeah, I read through the analysis and I frankly can't believe MS is actually implementing it this way.

        Breaches are going to happen and they're going to get sued.

        • reply
          June 4, 2024 11:34 PM

          I'm sure they'll improve it, but why not fix the obvious problems before turning it on by default? This seems a bit crazy.

          • reply
            June 4, 2024 11:35 PM

            And not only do they turn it on by default, there's no way to turn it off during setup; it'll always run at least a bit.

            The only option during install is to bring up the settings later, once it's already running.

            • reply
              June 5, 2024 7:35 AM

              [deleted]

              • reply
                June 5, 2024 10:18 AM

                Even there the admin needs to set it to disabled for the profile. It defaults to on.

                • reply
                  June 5, 2024 10:19 AM

                  Though I'm not suggesting that part is a big deal; it's just one more thing for admins to know about.

          • reply
            June 5, 2024 1:01 PM

            Announcing it in this form is still a huge mistake. I don’t know what they’re thinking.

            I wonder if the devil incarnate responsible for Teams made this. It would make total sense. It’s a product that does the opposite of every good convention.

        • reply
          June 5, 2024 12:20 AM

          They won't get sued. There are a ton of breaches and exposures caused by poor coding on Windows machines; it's so common that hardening practices are now as common as the exposures themselves. You would just disable Recall via a GPO or SCCM policy.
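          (For the specifics: the Group Policy in question appears to be "Turn off saving snapshots for use with Recall," which Microsoft's ADMX documentation backs with a DisableAIDataAnalysis registry value, so a GPO or SCCM baseline just pushes that value fleet-wide. A minimal local sketch with Python's winreg, run elevated; the key and value names come from that documentation and are worth verifying before deploying.)

          ```python
          import winreg

          # Policy value behind "Turn off saving snapshots for use with Recall"
          # (per Microsoft's ADMX docs -- verify before relying on it). A GPO or
          # SCCM baseline sets the same value centrally; this does it locally.
          KEY_PATH = r"SOFTWARE\Policies\Microsoft\Windows\WindowsAI"

          key = winreg.CreateKeyEx(winreg.HKEY_LOCAL_MACHINE, KEY_PATH, 0,
                                   winreg.KEY_SET_VALUE)
          try:
              # 1 = snapshot saving disabled for every user on the machine.
              winreg.SetValueEx(key, "DisableAIDataAnalysis", 0,
                                winreg.REG_DWORD, 1)
          finally:
              winreg.CloseKey(key)
          ```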

          • reply
            June 5, 2024 12:58 AM

            We'll see. I think as implemented this is irresponsible enough that you'll see action.

            • reply
              June 5, 2024 12:59 AM

              And frankly more people should be held responsible for breaches to begin with.

        • reply
          June 5, 2024 7:34 AM

          [deleted]

      • reply
        June 4, 2024 11:32 PM

        Wow that seems to have a number of design flaws

      • reply
        June 5, 2024 12:10 AM

        Wow, that is almost unbelievable. I’d definitely never use it on a machine I don’t have 100% control over.

      • reply
        June 5, 2024 12:57 AM

        Can you tell if it's on? Could an employer enable this on employees machines without them knowing? I suppose they can do stuff like this already if they want.

        • reply
          June 5, 2024 7:41 AM

          [deleted]

        • reply
          June 5, 2024 7:52 AM

          it needs that copilot shit installed. If you don't have that, you are safe
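          (A rough way to check for yourself is to look for the data store Recall keeps for the current user. A minimal sketch; the path comes from Beaumont's preview-build analysis and is an assumption, not a documented contract.)

          ```python
          import glob
          import os

          # Preview-build location of Recall's per-user database, per Beaumont's
          # write-up (an assumption; the shipping layout may differ).
          pattern = os.path.expandvars(
              r"%LOCALAPPDATA%\CoreAIPlatform.00\UKP\*\ukg.db"
          )
          print("Recall data store found" if glob.glob(pattern)
                else "No Recall data store")
          ```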

        • reply
          June 5, 2024 7:52 AM

          Technically the machines belong to the company, not the employee. There is likely already all sorts of shit on there for protection and monitoring.

        • reply
          June 5, 2024 7:56 AM

          Any org that does that would be stupid, it would be a discovery nightmare. I can’t imagine most places would be ok with enabling this at all.

          • reply
            June 5, 2024 10:02 AM

            It's an absolute compliance nightmare for any regulated industry like finance or healthcare.

            • reply
              June 5, 2024 10:36 AM

              Yeah, exactly what you want: a shit ton of untagged information about your company and its clients spread across every machine in your organization. A data governance nightmare too.

      • reply
        June 5, 2024 9:49 AM

        [deleted]

      • reply
        June 5, 2024 10:23 AM

        gnu history is all i need

      • reply
        June 5, 2024 10:30 AM

        I love how this was implemented in the world's most popular OS exactly the way an entry level developer would have done it.

      • reply
        June 5, 2024 10:34 AM

        I can't really think of a time when I'd ever need this feature.

      • reply
        June 5, 2024 10:40 AM

        I wouldn’t trust any company enough to use a feature like this, let alone Microsoft with their recent and historic security failings.

        • reply
          June 5, 2024 8:45 PM

          They haven't earned this level of trust. But speed to market is everything now, so damn the torpedoes.

      • reply
        June 5, 2024 11:15 AM

        [deleted]

      • reply
        June 5, 2024 11:55 AM

        [deleted]

      • reply
        June 5, 2024 1:05 PM

        [deleted]

      • reply
        June 5, 2024 1:32 PM

        Who fucking comes up with this shit?

        • reply
          June 5, 2024 2:18 PM

          LLM and AI blitz folks, similar thinking to BLOCKCHAIN levels of frenzy.

          Stable Diffusion and Copilot and whatnot need fresh meat to “improve,” so here we go with OSes of all kinds making content on behalf of their users to then point cloud AI tools at.

        • reply
          June 5, 2024 4:20 PM

          [deleted]

        • reply
          June 5, 2024 4:21 PM

          [deleted]

    • reply
      June 5, 2024 1:05 PM

      I like the idea of a high battery life windows laptop, can't I just disable all this AI shit and use it as a normal computer?

    • reply
      June 5, 2024 2:14 PM

      Can you imagine the goons in these meetings who just want to steal all of your information and pass it off as AI? Lol, this is pathetic!

      • reply
        June 5, 2024 2:20 PM

        It’s clear the LLM sources are already drying up and they need new ones

        So each device will start doing this, and MS is just at least being honest

    • reply
      June 5, 2024 2:45 PM

      [deleted]
