
I'm guessing there are two things Apple is worried about. The first is using hot code push to change the purpose of the app after release, e.g., switching a business app into a video game. The second is using hot code push to violate app store review guidelines, like the use of private APIs.

You can do hot "code" push techniques that allow the first but not the second, by letting apps update HTML and JS that calls back into pre-existing native code. That's what Cordova / PhoneGap does. I'd guess that Apple will just ban the app and the developer if they catch it.

It appears that Rollout started using some API that would enable it to do the second, and Apple is preemptively making sure that it doesn't happen. The wording of the rejection is based on passing computed parameters to introspection routines.
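To make "passing computed parameters to introspection routines" concrete, here is a toy Python sketch (not Rollout's actual code, and `getpid` is just a stand-in symbol): the function name is assembled at runtime, so a static scan of the source never sees the literal string being looked up.

```python
import ctypes

# Illustrative only: resolve a symbol by a name computed at runtime,
# dlsym()-style. A reviewer scanning the source for forbidden symbol
# names would never see the literal "getpid".
libc = ctypes.CDLL(None)        # search symbols already loaded into the process
name = "get" + "p" + "id"       # computed at runtime, not a string literal
fn = getattr(libc, name)        # lookup by string, like dlsym(RTLD_DEFAULT, name)
pid = fn()                      # calls getpid(); any exported symbol would do
```

This is exactly why Apple objects to computed arguments: the set of APIs an app can reach is no longer knowable at review time.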



> I'm guessing there are two things Apple is worried about.

I'm sure it worries about them but there's a much larger, riskier scenario.

Once you start downloading and executing binary code from untrusted sources (i.e., not the App Store) anything can go wrong.

1. An iOS app doesn't care about security, hot-loads code from some non-HTTPS source, and gets man-in-the-middle'd.

2. An iOS app hot-loads code in a secure manner, but the server from which the code is served becomes compromised.

3. A malicious employee at an iOS app vendor pushes harmful code out via her company's app.
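A minimal sketch of the mitigation for scenarios 1 and 2 (hypothetical names and key; not any vendor's real scheme): the app only executes a pushed payload whose signature verifies against a key baked into the shipped binary.

```python
import hmac
import hashlib

# Illustrative only. HMAC is symmetric, so a key extracted from the
# binary lets an attacker sign payloads too; a real scheme would use
# asymmetric signatures. And it does nothing about scenario 3, where
# the insider holds the signing key.
SIGNING_KEY = b"example-key-baked-into-the-app"

def sign(payload: bytes) -> str:
    return hmac.new(SIGNING_KEY, payload, hashlib.sha256).hexdigest()

def load_if_authentic(payload: bytes, signature: str) -> None:
    if not hmac.compare_digest(sign(payload), signature):
        raise ValueError("rejected: signature mismatch")
    exec(payload.decode())    # the dangerous step, gated on verification

good = b"print('hot-pushed code ran')"
load_if_authentic(good, sign(good))    # verifies, then executes
```

Even done right, this only moves trust from the network to the vendor's key management, which is the larger point of the scenarios above.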

Now, I'm not a fan of Apple's policies. I think there should be a "guys, I know what I'm doing" mode where I'm allowed to download code from untrusted sources. Just like on Android or macOS.

However, I sympathize with them here. For nearly a decade people have been downloading code from the App Store with the understanding that it is safe to do so. Even I appreciate this much of the time... I'm an engineer but I'm busy. I can't audit every app I download. I wish there were other options, but I find huge value in the fact that I don't have to worry about an App Store app screwing my device. And it's a big reason why I recommend iOS despite its flaws to older family members.


> Now, I'm not a fan of Apple's policies. I think there should be a "guys, I know what I'm doing" mode where I'm allowed to download code from untrusted sources.

This exists today and has for a long time; it just costs you money for the "privilege".


You can compile and run apps on your own devices without a paid developer membership: https://www.google.com/amp/s/9to5mac.com/2015/06/10/xcode-7-...


They've nerfed this so that the app will only run for a week before you must push another build.


You can drop a .ipa file into iTunes and load/run the app on a phone that's syncing to that copy of iTunes...


Getting the .ipa signed still costs money (or has a week-long timebomb), and you can't run unsigned code.


> The second is using hot code push to violate app store review guidelines, like the use of private APIs.

I've never understood this part. Why doesn't iOS simply prevent apps from calling private APIs?


You aren't supposed to call private APIs in your code, but your app is definitely making private API calls all the time since the libraries provided by the platform are running in-process.


Preventing it in a technical way is far from easy: If your app calls public API X, which as part of its implementation calls private API Y, your compiler only needs a declaration of Y to output the function call / ObjC message send. Nothing in the language prevents it, and the code is executed natively unlike Java.
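A toy sketch of that structure (hypothetical names, and Python's underscore convention standing in for "private"): the public entry point calls the private routine internally, and nothing in the language stops an app from calling the private one directly once it knows the name.

```python
def _decode_frame(data):
    # "Private" helper, analogous to private API Y: the underscore is
    # convention only; the runtime enforces nothing.
    return data[::-1]

def decode(data):
    # Public API X, which calls the private routine as part of its
    # implementation. Blocking _decode_frame outright would break
    # every legitimate caller of decode().
    return _decode_frame(data)

# Nothing prevents an app from invoking the private routine directly:
direct = _decode_frame("abc")
```

The same holds in C or Objective-C: a declaration (or a selector string) is all the compiler needs to emit the call.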


They control the software and the hardware though: it seems possible to allow a specific region of memory (i.e., their public API) to call a specific region of memory (private API), and segfault for anything else that tries?


At best all they could do is change literally every single private function call everywhere to inspect the return address and see if that return address is in a system framework or is in the app image. But this would be huge overhead, a real pain in the ass, and also not even reliable, because all you have to do is pick a public function, figure out the address of the `ret` instruction, push your $pc onto the stack, and then call the private function passing the address of that `ret` instruction as the return address. The private function will see that this address is in a system framework, and so will work, and then it will return, passing control to that `ret` instruction which immediately returns back to your real caller function.

So no, there's no way for Apple to technologically make it impossible to call private functions. The only actual solution there would be to completely rewrite the OS such that literally every call into a framework that an app makes actually goes over IPC (so that way apps can't even attempt to invoke private functions since they won't be linked into the app), but that would probably be crazy slow which is why nobody does that.
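A Python analogue of the return-address check sketched above (the framework file names are hypothetical): the "private" function looks one frame up and only proceeds when the caller lives in a system-framework image. As the comment explains, native code can forge the return address, so this illustrates the shape and per-call cost of the check, not a real defense.

```python
import inspect

# Stand-ins for system framework images; app code lives elsewhere.
FRAMEWORK_FILES = {"UIKitCore.py", "Foundation.py"}

def private_api():
    # Inspect the immediate caller -- the analogue of checking the
    # return address on every private function call.
    caller_file = inspect.stack()[1].filename.rsplit("/", 1)[-1]
    if caller_file not in FRAMEWORK_FILES:
        raise PermissionError("private API called from app code")
    return 42
```

Walking the stack on every call is exactly the kind of overhead the comment above describes, which is one reason no platform does this at the function-call level.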


Many private APIs are methods on objects which are part of public APIs, so there's no "region" of memory which cleanly corresponds to private APIs.


But they could theoretically create that, as they own the entire chain: dev env, tools, OS, software, hardware (CPU included). I know it's not currently the case, sure, but my point was that they could do it.


They "own" a compiler, but not all of them.

Say they implement the scheme you mention in clang and the LLVM linker, so the function bodies of their public APIs end up placed in that privileged region of memory, and those of their private APIs end up in the restricted region.

Nothing prevents gcc from producing object files that tell the linker "this user function is part of Apple's public APIs". And nothing prevents people from using a different linker anyway, one that would put private API function bodies outside the restricted region of memory.

The only real way to achieve that would be to move all their frameworks to the kernel, which would be all sorts of problematic.


> Say they implement the scheme you mention in clang and the LLVM linker, so the function bodies of their public APIs end up placed in that privileged region of memory, and those of their private APIs end up in the restricted region.

Agreed that this design is fundamentally flawed, but that's because the coder is providing the implementations of private code. Providing that is Apple's job.

Put privileged code into a dynamically-linked library that Apple provides. Only code in that block of memory can call private APIs. Pretty straightforward to implement, and requires nothing fancy from the kernel.

Of course this only works if you can prevent the attacker from corrupting memory.


I don't know if iOS does randomization of loading addresses, but if so, that'd be a disadvantage.

And well, in any case they need to maintain compatibility with current apps for who knows how many years.


> I don't know if iOS does randomization of loading addresses, but if so, that'd be a disadvantage.

Such a scheme wouldn't stop ASLR. The loader just needs to tell the verification code where it put the privileged libraries.

> And well, in any case they need to maintain compatibility with current apps for who knows how many years.

Do they? I think Apple could easily order everyone to switch over to a more secure compiler with a one year deadline.


Presumably because many private APIs are used behind the scenes by public APIs and the security model must allow applications to run them.


Because some of these APIs are useful in an enterprise app setting that aren't distributed via the App Store. Like Disney applications on their turnstile devices at Disney World.



