Leaked Docs Show What Phones Cellebrite Can (and Can’t) Unlock

The leaked April 2024 documents, obtained and verified by 404 Media, show Cellebrite could not unlock a large chunk of modern iPhones.

Cellebrite, the well-known mobile forensics company, was unable to unlock a sizable chunk of modern iPhones available on the market as of April 2024, according to leaked documents verified by 404 Media.

The documents, which also show which Android handsets and operating system versions Cellebrite can access, provide granular insight into the very recent state of mobile forensic technology. Mobile forensics companies typically do not release details on which specific models their tools can or cannot penetrate, instead using vague terms in marketing materials. The documents obtained by 404 Media, which are given to customers but not published publicly, show how fluid and fast-moving the success, or failure, of mobile forensic tools can be, and highlight the constant cat-and-mouse game between hardware and operating system manufacturers like Apple and Google and the hacking companies looking for vulnerabilities to exploit.

Analysis of the documents also comes after the FBI announced it had successfully gained access to the mobile phone used by Thomas Matthew Crooks, the suspected shooter in the attempted assassination of former President Donald Trump. The FBI has not released details on what brand of phone Crooks used, and it has not said how it was able to unlock his phone.

The documents are titled “Cellebrite iOS Support Matrix” and “Cellebrite Android Support Matrix” respectively. An anonymous source, who said they obtained the files from a Cellebrite customer, recently sent the full PDFs to 404 Media. GrapheneOS, a privacy- and security-focused Android-based operating system, previously published screenshots of the same documents online in May, but the material did not receive wider attention beyond the mobile forensics community.

For all locked iPhones able to run iOS 17.4 or newer, the Cellebrite document says “In Research,” meaning they cannot necessarily be unlocked with Cellebrite’s tools. For earlier iterations of iOS 17, from 17.1 to 17.3.1, Cellebrite says it does support the iPhone XR and iPhone 11 series. Specifically, the document says Cellebrite recently added support for those models to its Supersonic BF [brute force] capability, which claims to gain access to phones quickly. But for the iPhone 12 and up running those operating system versions, Cellebrite says support is “Coming soon.”
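The documents do not say how fast Supersonic BF actually runs, but simple keyspace arithmetic shows why brute forcing a passcode is plausible at all, and why longer passcodes help. A minimal illustration, with attempt rates that are hypothetical examples rather than Cellebrite figures:

```python
# Illustrative keyspace math for numeric passcode brute forcing.
# The attempt rates below are hypothetical, not figures from the
# leaked Cellebrite documents.

def worst_case_hours(digits: int, attempts_per_second: float) -> float:
    """Hours needed to try every possible numeric passcode of a given length."""
    keyspace = 10 ** digits  # e.g. 4 digits -> 10,000 possible codes
    return keyspace / attempts_per_second / 3600

# A 4-digit passcode has 10,000 possibilities; a 6-digit one has 1,000,000.
for digits in (4, 6):
    for rate in (10, 100):  # hypothetical attempts per second
        hours = worst_case_hours(digits, rate)
        print(f"{digits} digits at {rate}/s: {hours:.1f} hours worst case")
```

In practice Apple’s Secure Enclave enforces escalating delays between failed attempts, which is exactly the protection a brute-force tool has to bypass before arithmetic like this applies.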

A SECTION OF THE IOS DOCUMENT. IMAGE: 404 MEDIA.

The iPhone 11 was released in 2019. The iPhone 12 was launched the following year. In other words, even on the iOS 17 versions it did support, Cellebrite could only unlock iPhone models that were nearly five years old.

The most recent version of iOS in April 2024 was 17.4.1, which was released in March 2024. Apple then released 17.5.1 in May. According to Apple’s own publicly released data from June, the vast majority of iPhone users have upgraded to iOS 17, with the operating system installed on 77 percent of all iPhones, and 87 percent of iPhones introduced in the last four years. The data does not break down what percentage of those users are on each iteration of iOS 17, though.

Cellebrite offers a variety of mobile forensics tools. That includes the UFED, a hardware device that can extract data from a physically connected mobile phone. The UFED is a common sight in police departments across the country and world, and is sometimes used outside of law enforcement too. Cellebrite also sells Cellebrite Premium, a service that either gives the client’s UFED more capabilities, is handled in Cellebrite’s own cloud, or comes as an “offline turnkey solution,” according to a video on Cellebrite’s website.

That video says that Cellebrite Premium is capable of obtaining the passcode for “nearly all of today’s mobile devices, including the latest iOS and Android versions.”

That claim does not appear to be reflected in the leaked documents, which show that, as of April, Cellebrite could not extract data from locked iPhones running iOS 17.4.

The second document shows that Cellebrite does not have blanket coverage of locked Android devices either, although it covers most of those listed. Cellebrite cannot, for example, brute force a Google Pixel 6, 7, or 8 that has been turned off to get the user’s data, according to the document. The most recent version of Android at the time of the Cellebrite documents was Android 14, released October 2023. The Pixel 6 was released in 2021.

A SECTION OF THE ANDROID DOCUMENT. IMAGE: 404 MEDIA.

Cellebrite confirmed the authenticity of the documents in an emailed statement to 404 Media. “Similar to any other software company, the documents are designed to help our customers understand Cellebrite’s technology capabilities as they conduct ethical, legally sanctioned investigations—bound by the confines of a search warrant or an owner’s consent to search. The reason we do not openly advertise our updates is so that bad actors are not privy to information that could further their criminal activity,” Victor Ryan Cooper, senior director of corporate communications and content at Cellebrite, wrote.

“Cellebrite does not sell to countries sanctioned by the U.S., EU, UK or Israeli governments or those on the Financial Action Task Force (FATF) blacklist. We only work with and pursue customers who we believe will act lawfully and not in a manner incompatible with privacy rights or human rights,” the email added. In 2021 Al Jazeera and Haaretz reported that a paramilitary force in Bangladesh was trained to use Cellebrite’s technology.

Cellebrite is not the only mobile forensics company targeting iOS devices. Grayshift makes a product called the GrayKey, which originally was focused on iOS devices before expanding to Android phones too. It is not clear what the GrayKey’s current capabilities are. Magnet Forensics, which merged with Grayshift in 2023, did not immediately respond to a request for comment.

Cellebrite’s Android-focused document also explicitly mentions GrapheneOS in two tables. As well as being an operating system that the privacy-conscious might use, 404 Media has spoken to multiple people in the underground industry selling secure phones to drug traffickers who said some of their clients have moved to using GrapheneOS in recent years.

Daniel Micay, founder of GrapheneOS, told 404 Media that GrapheneOS joined a Discord server whose members include law enforcement officials and which is dedicated to discussions around mobile forensics. “We joined and they approved us, with our official GrapheneOS account, but it seems some cops got really mad and got a mod to ban us even though we didn’t post anything off topic or do anything bad,” Micay said.

There is intense secrecy around the community of mobile forensics experts that discuss the latest unlocking tricks and shortcomings with their peers. In 2018 at Motherboard, I reported that law enforcement officials were trying to hide their emails about phone unlocking tools. At the time, I was receiving leaks of emails and documents from inside mobile forensics groups. In an attempt to obtain more information, I sent public records requests for more emails.

“Just a heads up, my department received two public records request[s] from a Joseph Cox at Motherboard.com requesting 2 years of my emails,” a law enforcement official wrote in one email to other members. I learned of this through a subsequent leak of that email. (404 Media continues to receive leaks, including a recent set of screenshots from a mobile forensics Discord group).

Google did not respond to a request for comment. Apple declined to comment.

  • disguy_ovahea@lemmy.world:

    I’ve read about the NSA exploiting third-party push notification systems from Snowden. I’ve only read the opposite of your claim about Apple providing a backdoor for the government. Do you have a source so I can read more about it?

    • poVoq@slrpnk.net:

      Apple is a US based company. They are legally required to comply and not openly speak about it when a court asks them to do so.

      Same for Google obviously, but since their software on end-devices is much more diverse, it is much harder to do.

      • disguy_ovahea@lemmy.world:

        Apple’s legal policy is very clear. They do not have access to data on customer devices, and therefore cannot provide said data to law enforcement or government officials.

        They’ve fought the backdoor request in court 11 times and won. They are not mandated to comply by law if they themselves do not have access.

        They can and will provide iCloud data in response to a warrant.

        https://www.apple.com/legal/privacy/law-enforcement-guidelines-us.pdf

        • notanaltaccount@lemmy.world:

          This is an irrelevant and stupid comment. If Apple is required by a gag order not to disclose an exploit, their legal policy would not violate the gag order.

          You think if they did build an exploit after a demand they would write about it in their legal notices? That’s just dumb.

          • disguy_ovahea@lemmy.world:

            You claimed that Apple intentionally places vulnerabilities in updates to allow governmental access, right?

            They’ve repeatedly claimed in court that they do not have access to on-device data or transmissions, as stated in their legal privacy policy. If they created a vulnerability, then they could exploit it, making the above claims both perjury and breach of contract.

            Please clarify if I misunderstood your comment.

            • poVoq@slrpnk.net:

              No, I claimed that they are legally required to do so, should a court ask them to and would very likely also be hit with a gag order to not tell anybody about it.

They can easily claim in retrospect that they can’t access data, but that says nothing about them being forced to change this for future updates. And besides that, for sure they have a clause in their end-user contract that exempts necessary legal compliance.

              Thus saying that you should always update because upstream always has your best interest in mind is certainly false in the case of Apple.

    • unexposedhazard@discuss.tchncs.de:

      It’s not a matter of “have they done it”; it’s just that the US feds can legally force any corporation to do as they please under threat of violence while binding them with gag orders.

      Even if there is no proof so far, there is no reason to believe that they haven’t done so, because they legally can.

      It’s also specifically disallowed to deploy any kind of system that automatically notifies the public in case law enforcement compromises a company’s servers as part of an investigation. Apple used to have such a warrant canary but removed it some years ago.

      See Warrant Canary
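The warrant canary idea mentioned above is simple enough to sketch: a provider publishes a dated statement that it has received no secret orders, and users treat a stale or missing statement as a silent alarm, since removing the statement is not the same as disclosing an order. A minimal sketch (the 30-day freshness threshold is a hypothetical policy, not any provider’s actual one):

```python
# Sketch of the warrant canary concept: the canary "trips" not by
# announcing anything, but by the dated statement going stale.
from datetime import date, timedelta

def canary_tripped(last_published: date, today: date,
                   max_age: timedelta = timedelta(days=30)) -> bool:
    """True when the canary statement has not been refreshed recently enough."""
    return today - last_published > max_age

# A statement refreshed a week ago is fine; one silent for months is not.
print(canary_tripped(date(2024, 7, 1), date(2024, 7, 8)))   # False
print(canary_tripped(date(2024, 4, 1), date(2024, 7, 1)))   # True
```

Real canaries also tend to be cryptographically signed so a third party cannot forge a fresh statement, but the staleness check above is the core of the mechanism.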

        • unexposedhazard@discuss.tchncs.de:

          It’s a closed source operating system, dude, they can do whatever they want. Just because there are lawsuits doesn’t mean they don’t have a backdoor to your phone.

          They can literally force Apple to pretend that they didn’t give them a backdoor. There are basically no limits to law enforcement powers as long as they say the word “terrorism”.

          • disguy_ovahea@lemmy.world:

            I’m not for trusting corporations either, but you don’t have a source, only theories.

            Snowden is pretty much the authority on NSA vulnerabilities, and he hasn’t released any proof that Apple has a backdoor on their devices. The only thing he’s demonstrated is how the NSA has used MITM on third-party push notifications, which Apple has since encrypted and relayed to obfuscate.

            If you have a source, I’m down to read it. Otherwise, you’re just speculating.

            • BobGnarley@lemm.ee:

              So, no evidence that they have compelled Apple to do so, but that bit about how the US government can force a company’s hand and force them not to talk about it is ONE HUNDRED PERCENT true.

              So it isn’t like the dude is just pulling stuff out of his ass. Apple did use to have a warrant canary as well, and they don’t anymore. You should watch the interview with Lavabit’s creator and how the feds wanted to take over his shit and fake his SSL, and he took them to court about it and still can’t talk about it due to the (undemocratic) gag order.

              https://proton.me/blog/us-warrantless-surveillance

            • unexposedhazard@discuss.tchncs.de:

              Yeah, I am just speculating. But so are people believing that it’s secure. Just like not having proof against god doesn’t make god real. If something can’t be proven to be secure, it should be assumed that it’s compromised. There is no solution to this other than proving that it’s secure by releasing the source code.

              • disguy_ovahea@lemmy.world:

                Well, they’ve built a trillion dollar business on that claim to privacy. If it turned out to be false, the class action suit and customer loss would be tremendous.

                • notanaltaccount@lemmy.world:

                  That’s bullshit. They would be insulated from that since they were required to put in exploits. The government doesn’t say “You are required to do A B C” and then let people sue over A B or C.

            • notanaltaccount@lemmy.world:

              This view is incredibly naive. If a backdoor exploit was added by one group of developers who did the code for the cellular modem and network parts of the operating system, only certain people would be aware of it, on a need-to-know basis. You could have other ignorant law enforcement officers unaware of the exploit making demands in court, as well as Apple’s legal department fighting requests, and the exploit would still be there. It is incredibly naive, frankly stupid, to believe that the lack of a leak about closed source code means an exploit like that probably doesn’t exist. Demanding proof, given closed source code, gag orders, and large development teams, only some of whom could know about an exploit and gag order, is just not realistic.

              • disguy_ovahea@lemmy.world:

                Where’s the benefit? Apple would be putting their business model at stake to support this. They don’t collect or sell user data, so they’re not gaining any equity. What’s their motive to breach their customer agreement and commit perjury in front of Congress?

                • notanaltaccount@lemmy.world:

                  They don’t need to incur a benefit. If the government contacts their legal department with an order saying we need to talk to your developers in charge of iOS as part of a court order, they are required to do it and can face jail and fines if they don’t comply.

                  If the government says “don’t say there’s an exploit,” it’s not going to be disclosed in their policy, and they will be protected for lying by omission if a court is requiring that.

                  What is their motive? Avoiding contempt.

                  Are you this naive about closed source code or the power of jails and fines to persuade people to lie? How would this threaten their business model? How would anyone know about the exploit given their code is closed source and law enforcement regularly use parallel construction? No one can audit the code. Do you know what closed source code is or how a gag order works?

                  • disguy_ovahea@lemmy.world:

                    Did you not read the articles I’ve linked above? They absolutely do not need to comply. That’s what they’ve fought and won 11 times in court.

          • jet@hackertalks.com:

            The conversation below this message goes really off the rails.

            If anyone wants to look into the actual legal theory of compelling businesses to do things, look up national security letters.

            And yes, the legal framework exists to compel businesses to do something for national security purposes and not disclose that they’ve done it.

            I just want to remind everybody fighting below: security is about capabilities, not intentions. It doesn’t matter whether a company would do something if it had a choice, just that it can. That’s how you model your threats: by what people can do, not what you think they will do.