
DMG Security

Hi, I'm trying to determine the following for many of my business clients, who are considering switching away from Apple products to open source because of the recent NSA disclosures.



Does Apple's DMG format have a backdoor that allows Apple access to the data within it in order to comply with government requests?



I'm not suggesting that the AES encryption itself has an inherent issue; I want to know whether anyone has been able to independently verify that Apple's DMG implementation lacks a backdoor that can bypass AES.



We already know there's a backdoor on iPhones that Apple will use to access data for law enforcement.



http://news.cnet.com/8301-13578_3-57583843-38/apple-deluged-by-police-demands-to-decrypt-iphones/



Is the Apple DMG format likewise subverted with a backdoor?



Thank you.



P.S. This is a sincere security question for clients that have proprietary information and want to know what their risks are with Apple DMGs. Please don't bring politics into this.

Posted on Sep 16, 2013 1:47 AM


17 replies
Question marked as Best reply

Sep 16, 2013 7:42 AM in response to Cowicide Moo

First of all, iPhones have nothing to do with DMG files. The default passcode on an iPhone is 4 numerical digits. Assuming it takes one second to attempt to enter a passcode, one could try all possible passcodes in less than 3 hours. I am sure that Apple has the ability to try passcodes faster than that and has the ability to disable automatic wiping after 10 failed attempts. Only Apple has access to the hardware and software designs to do this.


DMG files allow full passphrases that would take tens of thousands of years to decrypt that way. iPhones support those too but few people use them because they are hard to type on a small device.
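
To put rough numbers on that, here is a back-of-the-envelope sketch in Python; the guess rates are illustrative assumptions, not measurements of any real device or tool.

# Back-of-the-envelope brute-force estimates; the attempt rates are assumptions.
SECONDS_PER_YEAR = 365 * 24 * 3600

def exhaust_seconds(alphabet_size: int, length: int, attempts_per_second: float) -> float:
    """Time to try every combination of `length` symbols from an alphabet."""
    return (alphabet_size ** length) / attempts_per_second

# 4-digit numeric passcode at one guess per second (the figure used above).
print(exhaust_seconds(10, 4, 1.0) / 3600, "hours")                 # ~2.8 hours

# 12-character printable-ASCII passphrase, even at a billion guesses per second.
print(exhaust_seconds(95, 12, 1e9) / SECONDS_PER_YEAR, "years")    # ~17 million years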


Those same government agencies that need Apple's help to break even this simple passcode are, however, able to infiltrate secure open-source networks like Tor and the Internet all by themselves. Therefore, it seems like Apple offers a far more secure option for your clients. People debate whether or not the NSA and the FBI have the legal authority to break into these networks, but obviously they have the technological means. With a valid search warrant, government agencies have the legal authority to break into Apple encryption but still can't do it by themselves.


There are always risks when one is trying to use technology whose details are beyond one's technical abilities. Using open source would multiply those risks many times because, even without encryption, it is so much harder to use and understand. You may be worried about a backdoor in Apple technology, but there are 10,000 other things to be worried about with open source. For all you know, your open source software is uploading your passwords in plain text.

Sep 16, 2013 10:56 AM in response to Cowicide Moo

I do not know why you are worried about this. You have spent too much time talking to conspiracy theorists.


I'm not suggesting that the AES encryption itself has an inherent issue; I want to know whether anyone has been able to independently verify that Apple's DMG implementation lacks a backdoor that can bypass AES.


I'd think you could verify this. Find some other implementation of AES encryption, create a file, run both encryption methods against it, and see whether the resulting files are the same. If they are, Apple must be using the same algorithm.
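
For what it's worth, here is a minimal sketch of that kind of independent check, assuming Python's cryptography package. Note that two encrypted copies of the same data won't normally be byte-identical (the image key and IVs are generated randomly), so the usual way to check an AES implementation is against a published known-answer vector, such as the AES-128 example from FIPS-197:

# Known-answer test: the AES-128 example from FIPS-197, Appendix C.1.
from cryptography.hazmat.primitives.ciphers import Cipher, algorithms, modes

key       = bytes.fromhex("000102030405060708090a0b0c0d0e0f")
plaintext = bytes.fromhex("00112233445566778899aabbccddeeff")
expected  = bytes.fromhex("69c4e0d86a7b0430d8cdb78070b4c55a")

encryptor  = Cipher(algorithms.AES(key), modes.ECB()).encryptor()
ciphertext = encryptor.update(plaintext) + encryptor.finalize()

print("matches FIPS-197 vector:", ciphertext == expected)   # expect True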


Another thought.


Let's say there was a backdoor. Someone would find out about it eventually. Apple would be sued. Apple products would be abandoned en masse. Apple would be in danger of going out of business.


Adding a backdoor isn't worth it in any way for Apple.

Sep 16, 2013 10:40 PM in response to etresoft

First of all, iPhones have nothing to do with DMG files. The default passcode on an iPhone is 4 numerical digits. Assuming it takes one second to attempt to enter a passcode, one could try all possible passcodes in less than 3 hours. I am sure that Apple has the ability to try passcodes faster than that and has the ability to disable automatic wiping after 10 failed attempts. Only Apple has access to the hardware and software designs to do this.


And that's my point. It's not that I think the protection or the challenge to break in is the same; it's the concept of a backdoor that may be similar. If Apple has created backdoors to disable wiping, etc. in iPhones, then it's also possible they've created a backdoor within DMGs that bypasses the AES encryption or makes it easier to break.


I've also contacted various experts who are working on reverse engineering the DMG format (again). Hopefully that'll shed new light on it as well.


Using open source would multiply those risks many times because, even without encryption, it is so much harder to use and understand. You may be worried about a backdoor in Apple technology, but there are 10,000 other things to be worried about with open source. For all you know, your open source software is uploading your passwords in plain text.


I agree that one shouldn't be lulled into a false sense of security with open source solutions, and certainly not all open source projects are alike, but at the same time there are compelling reasons to choose open source over proprietary code as long as you make a calculated, educated decision.


Here's a good example of the pros and cons. Bruce Schneier has more on open source here as well. If one isn't properly educated on the value of open source, one might miss those advantages. IMO, beyond the market-share myth, one of the reasons that Mac OS X is more secure than Windows is that its kernel is based upon open source code that's been battle-hardened over many years.


Thanks again for your input.

Sep 16, 2013 10:48 PM in response to Mark Jalbert

Thank you, Mark. I'd seen that article some time ago, but it doesn't appear to show that there's a backdoor as much as there are some workarounds for brute force attacks. Then again, the writer was able to give the cracker clues as to what the missing digits of the password were, and I'm sure that vastly helped to crack the password.


It's a very interesting read nonetheless, thank you for looking into it!

Sep 16, 2013 11:23 PM in response to rccharles

I do not know why you are worried about this. You have spent too much time talking to conspiracy theorists.


Some of my various clients work with data that has significant financial value, and it would severely damage their business interests if said data ended up in the wrong hands. Whether it be from an unscrupulous government employee or a competitor who also happens to be a quasi-governmental entity with special access, their data needs to be protected as well as possible. Many of my clients also have to share their valuable data with those that are authorized to see it. Therefore, it's not realistic to expect that none of their encrypted data will ever get accidentally picked up by others along the way, whether through the accidental loss of a laptop or via corporate espionage, etc.


It's just the way it is.


Find some other implementation of AES encryption, create a file, run both encryption methods against it, and see whether the resulting files are the same. If they are, Apple must be using the same algorithm.


I'm thinking there may be something within the DMG format (and the way it's integrated with OS X) that can bypass the encryption entirely. In other words, without using the backdoor, an attacker would face the same challenge, so that kind of output comparison wouldn't reveal anything.
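
To illustrate what I mean, here is a purely hypothetical sketch in Python, not a claim about how Apple's format actually works: a key-escrow style backdoor could live in the image header while the bulk encryption stays ordinary AES, so the ciphertext would look completely normal.

# Hypothetical key-escrow sketch -- NOT Apple's format, just the concept.
# The payload is still honest AES, so the ciphertext looks normal; the
# weakness would be an extra, vendor-readable copy of the volume key.
import os
from cryptography.hazmat.primitives import hashes
from cryptography.hazmat.primitives.ciphers.aead import AESGCM
from cryptography.hazmat.primitives.kdf.pbkdf2 import PBKDF2HMAC

volume_key = os.urandom(32)                      # random per-image key, as usual

# Normal slot: volume key wrapped under a key derived from the passphrase.
salt = os.urandom(16)
kek = PBKDF2HMAC(hashes.SHA256(), 32, salt, 200_000).derive(b"user passphrase")
user_nonce = os.urandom(12)
user_slot = AESGCM(kek).encrypt(user_nonce, volume_key, None)

# Hypothetical escrow slot: the same volume key wrapped under a key only the
# vendor holds. Whoever holds escrow_key can open the image without the
# passphrase, yet the encrypted payload itself is indistinguishable.
escrow_key = os.urandom(32)                      # stand-in for a vendor-held key
escrow_nonce = os.urandom(12)
escrow_slot = AESGCM(escrow_key).encrypt(escrow_nonce, volume_key, None)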


I should also make it clear that I really don't strongly suspect that Apple has a backdoor in their DMGs, especially considering the subject is currently being allowed to be discussed here, but at the same time I'm not one to go on beliefs, etc. when it comes to protecting my clients' assets.


Adding a backdoor isn't worth it in any way for Apple.


That may be so, but it's also known that the government will attempt to make it worth their while with various incentives including outright payouts.


Let's say there was a backdoor. Someone would find out about it eventually. Apple would be sued. Apple products would be abandoned en masse. Apple would be in danger of going out of business.


What corporations have done in the past, when they're caught, is deny it's a backdoor, claim it's an accidental security flaw, and issue a fix for it. The government will never let them admit it anyway and will issue NSLs and the like to stop disclosure. I've had clients burned by this in the past.


If there are multiple backdoors yet to be discovered, then there's still access even after closing the discovered "accidental" hole, of course. Also, recently leaked classified docs have shown that Microsoft, Google, etc. all cooperate on varying levels to weaken their own security for the government, and they aren't losing much business. The fact that there's not a lot of choice when it comes to mainstream operating systems helps, I'm sure. I doubt anyone is going to go out of business anytime soon. There just aren't many other places for customers to go. Then again, if the large corps keep it up, they do risk losing more customers to open source alternatives down the road.


Thanks for your input. It's very much appreciated.

Sep 17, 2013 5:42 AM in response to Cowicide Moo

Cowicide Moo wrote:


Thank you, Mark. I'd seen that article some time ago, but it doesn't appear to show that there's a backdoor as much as there are some workarounds for brute force attacks.


It gives you support that there is none. He has 15,000 followers on Twitter, and none knew of a "back door". (The author of that article made an effort to find a way into his DMG when he lost his password, and only had success when he narrowed the forgotten string down to 6 forgotten characters.)
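
For context, narrowing the unknown portion to 6 characters is what made that feasible at all; a rough sketch of the arithmetic (the guess rate is an assumption for illustration):

# Why narrowing the forgotten part to 6 characters matters: the search space
# collapses to something an offline cracker can cover in days.
alphabet = 95                      # printable ASCII
unknown_chars = 6
guesses_per_second = 1e6           # assumed offline rate, illustration only

seconds = (alphabet ** unknown_chars) / guesses_per_second
print(round(seconds / 86400, 1), "days")   # roughly 8.5 days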



BTW, if a backdoor were found, you would see the story in the NYT, not on the Apple Discussion Boards. It's safe to say that you will not find an answer here.


The best way to protect sensitive data is to keep it off public networks (i.e., the "cloud"). If that is not feasible, then there will always be a risk (and there always is a risk: an unscrupulous employee can sell it to a competitor).

Sep 17, 2013 5:45 AM in response to Cowicide Moo

Cowicide Moo wrote:


And that's my point. It's not that I think the protection or the challenge to break in is the same; it's the concept of a backdoor that may be similar. If Apple has created backdoors to disable wiping, etc. in iPhones, then it's also possible they've created a backdoor within DMGs that bypasses the AES encryption or makes it easier to break.


That is not a backdoor. A backdoor is something that a vendor ships to customers. Manufacturers of hardware devices like routers and printers are notorious for shipping backdoors. These companies do only hardware, not software. Any software they do is done poorly.


It is also not a backdoor to just try every password to unlock a phone or decrypt a file. I was just speculating on how it might be done. It might also be possible to just copy the image of the phone's memory and decrypt that. The only reason Apple would be involved is because this is a hardware device and Apple can do it faster, cheaper, and more efficiently than individual law enforcement agencies.


I agree that one shouldn't be lulled into a false sense of security with open source solutions, and certainly not all open source projects are alike, but at the same time there are compelling reasons to choose open source over proprietary code as long as you make a calculated, educated decision.


It is only a few political activists that use "open source" as a criterion. Most people use cost, reliability, support, and install base. Sometimes open source has the best solution and sometimes it doesn't. The idea that open source is more secure is based on the so-called "Linus's Law". But Torvalds never actually said that, and it is a proven myth. Some projects are good and some are not. Some are open source and some are not. That's really all you can say.


one of the reasons that Mac OS X is more secure than Windows is that its kernel is based upon open source code that's been battle-hardened over many years.


Not true at all. OS X is based on UNIX, which is inherently more secure than Windows because it was designed to be so. The OS X kernel is not "battle-hardened". It was just a forgotten college research project until Apple based its new OS on it. The Mach kernel brings a lot of good things to OS X, but security is not its focus. If your criterion for security is "battle-hardened over many years", then you would expect other operating systems such as Solaris, Linux, and even Windows to be secure as well. And, in fact, modern versions of these operating systems are secure.


I don't know what you are trying to prove here. You are claiming that OS X is secure, or rather, it would be if Apple didn't fill it full of backdoors. Your arguments are not logical. Statistically speaking, there are very few people that even use Macs. Of those, there are very, very few that know how to create encrypted DMGs. Why on earth would Apple put a backdoor into the operating system so that the NSA could have access to the handful of encrypted DMGs that a few people have? That just doesn't make any sense. That is a huge risk for virtually zero return. If you want a high return on your hacking investment, you focus on the most popular (Windows), the most available (servers running Apache and PHP), and the most transparent (open source). That is exactly what hackers and various governments have done.


If you are worried about DMGs then don't use them. If you think Truecrypt is the gold standard, then use that. Being open source, you don't have to worry about Apple backdoors in Truecrypt. Anyone can download the source and create their own backdoors. You don't need court orders to force Apple to help and you don't need to wait for Apple to get around to it. All you need is data, time, money, and cryptographic expertise. Who has all of that? Wait, weren't you worried about the NSA? Bummer, dude.

Sep 17, 2013 6:07 AM in response to Cowicide Moo

Have a read of this

Reflections on Trusting Trust
Ken Thompson


A paper written by Thompson in '84. To summarize:


Moral

The moral is obvious. You can't trust code that you did not totally create yourself. (Especially code from companies that employ people like me.) No amount of source-level verification or scrutiny will protect you from using untrusted code. In demonstrating the possibility of this kind of attack, I picked on the C compiler. I could have picked on any program-handling program such as an assembler, a loader, or even hardware microcode. As the level of program gets lower, these bugs will be harder and harder to detect. A well installed microcode bug will be almost impossible to detect.

(emphasis added)


So unless you built and programmed the machine yourself (or your clients did) there is no absolute way to confirm the trustworthiness of the system. Given that the bulk of the machines are assembled in China and I assume the microcode is loaded there, all bets are really off.

Sep 17, 2013 11:01 AM in response to Cowicide Moo

http://news.cnet.com/8301-13578_3-57583843-38/apple-deluged-by-police-demands-to-decrypt-iphones/


I'll add my two cents about this article. I'm with etresoft here.


This article does not prove there is a backdoor to the iPhone.


It is well known that if you have hardware access to the device you can get the data out. Apple's expertise would be copying out the data from the flash memory chips.


The article doesn't even claim there is a backdoor.

"It's not clear whether that means Apple has created a backdoor for police -- which has been the topic of speculation in the past -- whether the company has custom hardware that's faster at decryption, or whether it simply is more skilled at using the same procedures available to the government. Apple declined to discuss its law enforcement policies when contacted this week by CNET."



"Elcomsoft claims its iOS Forensic Toolkit can perform a brute-force cryptographic attack on a four-digit iOS 4 or iOS 5 passcode in 20 to 40 minutes. "Complex passcodes can be recovered, but require more time," the company's marketing literature says. But the iPhone 5 doesn't appear in Elcomsoft's list of devices that can be targeted. "


You need to remember these are lawyers who aren't using exact technical terms. They're using 'unlocked', but unlocked isn't a synonym for backdoor; unlocked just means getting the data out.


If you're relying on a 4-digit numeric passcode to protect your data, you get what you deserve.


Apple doesn't have to blindly follow federal orders to do something like add a backdoor to their software. They have lawyers who can invoke attorney-client privilege and go talk with their congressional representatives. I'm sure that would create a big fuss in Congress.


Your whole worry about a backdoor in DMG files and iOS devices means you haven't done your research.


Robert

Sep 18, 2013 8:22 PM in response to etresoft

Thanks for the response.

That is not a backdoor.


I see your point; it's not technically a backdoor, but one does have to wonder why Apple doesn't make it much more difficult to circumvent the "security wipe", something anyone can accomplish with a jailbreak and some third-party tools. While semantically it's not a "backdoor", it's still a flaw in the implementation. We may never know if that was done purposely or not.


It is also not a backdoor to just try every password to unlock a phone or decrypt a file.


Agreed, but I've never implied that in the first place.


The only reason Apple would be involved is because this is a hardware device and Apple can do it faster, cheaper, and more efficiently than individual law enforcement agencies.


Recent NSA disclosures have shown there may be other reasons. Experts so far really can't confirm whether Apple has an iPhone backdoor, better decryption capabilities, or simply more advanced techniques. That's where it stands. It really depends on the security expert you talk to. Some lean strongly towards a backdoor while others lean strongly against the possibility. But you won't find any real experts who will say they know for sure just yet.


I wouldn't have leaned towards an iPhone backdoor before the recent NSA disclosures showed that there's more than meets the eye when it comes to government/corporate collaboration. Of course, I hope I'm wrong on this.


It is only a few political activists that use "open source" as a criterion.


I'm not sure what you mean by using it as a "criterion", but many corporations use open source technology. Times have changed. I've worked with a multi-billion-dollar publishing company that uses it extensively for a host of reasons, including (in their case) better security.


More on this: http://www.dmst.aueb.gr/dds/pubs/jrnl/2012-JSS-OSS-Industry-Use/html/SG11.html


Sometimes open source has the best solution and sometimes it doesn't.


Sure, that's basically what I've said already in my previous post.


The idea that open source is more secure is based on the so-called "Linus's Law". But Torvalds never actually said that, and it is a proven myth.


You linked to "Facts and Fallacies of Software Engineering" by Robert L. Glass. It's a bit more complex than what you imply. Linus's Law doesn't pertain specifically to security bugs; rather, it applies to all bugs, including security bugs.


Studies have shown that open source projects have fewer bugs than proprietary projects. But you can also find studies (often industry-sponsored, e.g. by Microsoft) that show the opposite.


Like I said earlier, not all open source projects are alike. Some projects don't have enough participation (relative to their number of lines of code) to be more secure, while other very popular projects have a tremendous number of "eyeballs" in relation to the amount of code. As I said, there are compelling reasons to choose open source over proprietary code as long as you make a calculated, educated decision.


If you have a small number of lines of code and a tremendous amount of input, it's very likely to be more secure than proprietary code, where you have to trust rather than verify.


Not true at all. OS X is based on UNIX, which is inherently more secure than Windows because it was designed to be so. The OS X kernel is not "battle-hardened". It was just a forgotten college research project until Apple based its new OS on it.


Your logic falls apart there. Apple didn't convert it over to a proprietary kernel until around 2006. It was most certainly battle-hardened over those years. I agree that's not literally the only reason it's more secure than Windows, but I never said it that way. Even Apple itself says that embracing open source contributes to its security. Microsoft, on the other hand, has had, to its detriment, a much more difficult relationship with open source over the years, to say the least.


Being open source, you don't have to worry about Apple backdoors in Truecrypt. Anyone can download the source and create their own backdoors.


That's not how it works; changes are peer-reviewed, and releases are verified with a simple MD5 check. You seem to be very focused on disparaging open source with inaccurate info, and I'm not sure why.
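
For example, checking a downloaded release against the project's published digest takes only a few lines; a minimal sketch follows (the filename and digest below are made up for illustration, and SHA-256 is preferable to MD5 these days):

import hashlib

# Hypothetical values for illustration; substitute the real download
# and the digest published by the project.
DOWNLOAD = "truecrypt-7.1a-installer.dmg"
PUBLISHED_SHA256 = "0000000000000000000000000000000000000000000000000000000000000000"

def file_digest(path: str) -> str:
    """Hash the file in chunks so large images don't need to fit in memory."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(1 << 20), b""):
            h.update(chunk)
    return h.hexdigest()

if file_digest(DOWNLOAD) == PUBLISHED_SHA256:
    print("Checksum matches the published value.")
else:
    print("Checksum mismatch -- do not trust this download.")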


Why on earth would Apple put a backdoor into the operating system so that the NSA could have access


Because the government may have required them to do so. It's well known that some (including the FBI) will use Apple products because of their better security (for various reasons), which can thwart investigations, etc. The government has even said as much.


You are claiming that OS X is secure, or rather, it would be if Apple didn't fill it full of backdoors.


That's inaccurate. I've never claimed that OS X is full of backdoors, or even that it has a single backdoor. I'm merely asking about the possibility of a backdoor in the DMG format and/or the OS in light of recent NSA disclosures. I'm far from the only one asking these questions since the disclosures were made public; many security experts worldwide are asking them too.


If you are worried about DMGs then don't use them.


I'm personally not worried about DMGs. I have clients that have implemented the technology, and I'm investigating in forums all around the world to gather consensus. This Apple thread is only one of about 30 where I've initiated the discussion. Unfortunately, out of all the other forums, this is the only one where I've been met with hostility and derision, by the way. But I'm not one to be deterred by bullies.


If you think Truecrypt is the gold standard, then use that.


I use many more security implementations aside from TrueCrypt. I've never implied it's a "gold standard" in all situations. It does have more features than DMG, like plausible deniability, etc. But for all I know, the DMG format is just as secure or more secure overall.

All you need is data, time, money, and cryptographic expertise. Who has all of that? Wait, weren't you worried about the NSA? Bummer, dude.


Please be polite, thanks. When you resort to an insulting demeanor it only makes me take you less seriously.


Most top security experts have said that if certain encryption is implemented carefully and properly, even the NSA can't crack it. The math behind encryption is sound, and the NSA can't magically crack the best implementations despite their vast resources. Most experts say that the NSA very likely relies on mistakes in implementation and possible backdoors.


I do appreciate your challenging input, overall. Thank you.

Sep 18, 2013 8:40 PM in response to Tony T1

BTW, if a backdoor were found, you would see the story in the NYT, not on the Apple Discussion Boards. It's safe to say that you will not find an answer here.


I'm posting this issue in many forums, not just here. I'm gathering consensus. In some private forums and in email exchanges with programmers who have reverse engineered the DMG format (as best they can), I'm getting considerable info on this. Also, the more the questions are asked, the more likely a whistleblower may come forward, etc. That is, if there's anything to hide in the first place. But, like I've said, I find that unlikely, and I'm only performing due diligence for clients.


It gives you support that there is none. He has 15,000 followers on Twitter, and none knew of a "back door". (The author of that article made an effort to find a way into his DMG when he lost his password, and only had success when he narrowed the forgotten string down to 6 forgotten characters.)


I agree: the more that people dig into the format, the more it supports the idea that there may not be a backdoor. You're perhaps also making an inadvertent argument that the DMG format should be open sourced, since even more people could then inspect the code for flaws, etc.


I've already stated that I think it's unlikely that there's a backdoor in the DMG format, and that the article on cracking the format didn't support that there's a backdoor. I'm simply trying to rely less on guesswork and more on solid evidence and gathering consensus. You should also note that I previously pointed out that I was well aware he provided the crackers with clues about what his password was, and that's why it was much easier to crack.


Thanks for your input.

Sep 18, 2013 9:07 PM in response to rccharles

This article does not prove there is a backdoor to the iPhone.


I agree, but the jury is still out on this if you consult security experts as I have.

The article doesn't even claim there is a backdoor: "It's not clear whether that means Apple has created a backdoor for police -- which has been the topic of speculation in the past -- whether the company has custom hardware that's faster at decryption, or whether it simply is more skilled at using the same procedures available to the government. Apple declined to discuss its law enforcement policies when contacted this week by CNET."


Right, it's not clear. I should have worded it differently when linking to that article. I think what is clear is that there's a certain circumvention that Apple employs, as opposed to an outright backdoor. While it is unlikely, it's still not really clear whether there's an outright, hidden backdoor in place or not.


If you're relying on a 4-digit numeric passcode to protect your data, you get what you deserve.


Agreed.


Apple doesn't have to blindly follow federal orders to do something like add a backdoor to their software.


That's debatable. Recent NSA disclosures and other events show otherwise. It all depends on whether the company or corporation is willing to fight (and how hard).


http://www.wired.com/threatlevel/2013/09/lavabit-appeal/


http://news.cnet.com/8301-13578_3-57593538-38/how-the-u.s-forces-net-firms-to-cooperate-on-surveillance/


http://news.softpedia.com/news/The-NSA-Forces-Most-Software-Makers-to-Build-Backdoors-into-Their-Products-380920.shtml


http://www.washingtonpost.com/investigations/us-intelligence-mining-data-from-nine-us-internet-companies-in-broad-secret-program/2013/06/06/3a0c0da8-cebf-11e2-8845-d970ccb04497_story.html



Then again, now that there have been NSA disclosures and they've forced a national discussion, there's hope that companies like Apple will fight back alongside others and that there'll be far more transparency in the future:


http://www.macrumors.com/2013/07/17/apple-to-team-up-with-tech-companies-to-ask-for-greater-nsa-transparency/


Your whole worry about a backdoor in DMG files and iOS devices means you haven't done your research.


Welp, that's why I'm here and in about 30 other forums. Thank you for your participation!

Sep 18, 2013 9:14 PM in response to Frank Caggiano

So unless you built and programmed the machine yourself (or your clients did) there is no absolute way to confirm the trustworthiness of the system. Given that the bulk of the machines are assembled in China and I assume the microcode is loaded there, all bets are really off.


It was very prescient (and ironic) of him to say that in 1984. He was very right; this was just revealed the other day:


http://www.extremetech.com/extreme/166580-researchers-find-new-ultra-low-level-method-of-hacking-cpus-and-theres-no-way-to-detect-it


I agree it's difficult, but I wouldn't go to the extreme of saying it's impossible to secure data to a reasonable degree. I've never had any illusions (nor should anyone else) that there's such a thing as absolute security in this world, but what we should do is try our best, whether we're with a small business, a large corporation, or within the government.


Thank you very much for your link and input.
