Android vs iOS: A Developer’s Perspective

Since its inception, Whereoscope has been an iPhone-only shop. The genesis of this iPhone-fetishism goes all the way back to the very first iPhone: in a way, it was the potential that device tantalizingly dangled before us that pushed Mick and me to pick up our shovels and give this whole entrepreneur thing a go. I still believe that the iPhone is a truly paradigm-shifting device; that by releasing it, Apple fundamentally and irrevocably changed the course of computing. But as the song goes, “the times they are a-changin’.” Android has made considerable gains on iPhone in terms of functionality, polish and mind-share. The time had come for us to put aside our prejudices, and actually put together an app to see what would happen. What could possibly go wrong?!

“I hate this thing!”

We were lucky enough to score a couple of developer handsets from Google, through Y Combinator (we’re YC S2010 alumni). I took them home, unwrapped them, and started playing with one of them (a Motorola Droid). About ten minutes later I was on the phone (my iPhone — no way was I going to be gracing this thing with phone-calls) to Mick saying “I hate this thing!” I just didn’t get it. My iPhone had always seemed so intuitive, but on Android I had to shift my mental model: I had always assumed that the buttons on the bottom of it were optional extras, in much the same way that you can generally use the phone with just the touchscreen and not worry about the trackball thing. It definitely took some getting used to, and even now when I show Whereoscope on Android to iPhone users, I need to explain the basics of navigating an Android phone to them before they can use it.

The good news is that things definitely got better from there. Whilst I still think that Android’s initial user experience needs to be improved (it works well, but I really feel like they’re going to lose so many people in those first 30 seconds of use), I have to say that developing on Android after having worked on iPhone is a bit like waking up from a vivid nightmare — the kind where your dog, Fluffy, is being chased by killer robots who are having a bad hair day — and realising that actually, things are ok.

Garbage Collection

iPhone doesn’t have it. Seriously, Apple: What The? Various luminaries have waxed geeky on why garbage collection is so great, so I’ll keep this brief. Basically there are two reasons it’s awesome. First, I really don’t have time to learn the rules of memory ownership, when to free, and all that stuff for yet another language. Second, I just can’t think of another language feature that accelerates development as much as garbage collection. Android has it, and for anything not pushing the envelope in terms of what the hardware can do (i.e., almost everything), it works really well. I might have understood why Apple didn’t include it in the first couple of iPhones — it does come at a computational premium — but they really don’t have an excuse any more. It’s a big enough deal that I would advise the first-time mobile app developer to start on Android for this reason alone.
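To make the contrast concrete, here's a minimal sketch (mine, not from the original post) of the bookkeeping GC removes: in Java on Android, intermediate objects are simply abandoned and the collector reclaims them, where pre-ARC Objective-C would need a matching retain/release (or autorelease) for each allocation.

```java
import java.util.ArrayList;
import java.util.List;

// Minimal sketch of what GC buys you: every object below is simply
// abandoned when it becomes unreachable, and the collector reclaims it.
// The equivalent manually-managed Objective-C would need explicit
// ownership bookkeeping for each allocation.
public class GcSketch {
    static String buildGreeting(String name) {
        StringBuilder sb = new StringBuilder(); // never explicitly freed
        sb.append("Hello, ").append(name).append("!");
        return sb.toString();                   // sb becomes garbage here
    }

    public static void main(String[] args) {
        List<String> greetings = new ArrayList<>();
        for (int i = 0; i < 3; i++) {
            greetings.add(buildGreeting("user" + i));
        }
        // The three StringBuilders are already collectible; nothing leaked.
        System.out.println(greetings.get(2)); // prints "Hello, user2!"
    }
}
```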

Documentation

Android has really great documentation. And I don’t mean superficial interface definitions and the like, I mean the really meaty stuff. Part of this is that the Android approach is fundamentally to expose everything to the developer, rather than try to hide important stuff on the (somewhat condescending) assumption that the platform developer knows better than you. I won’t go into details, but it is absolutely no exaggeration to say that we’ve spent weeks devising and performing increasingly peculiar experiments to figure out how to get iOS to do what we want. We’ve implemented hideous — and I really mean that — hideous hacks on iPhone to achieve what is trivial and well documented on Android.

I think this is what people mean when they say that “Android is open.” I don’t know if that’s a fair characterisation or not, but there’s something to it. I think it might be fairer to say that Android is very explicit in its communication with the developer. iOS swaps that control and communicativeness for presumptive simplicity. There are reasons for both approaches, but to my mind the upshot is this: both platforms make the mundane easy, but Android makes the difficult possible, while iOS makes some difficult things easier at the cost of excluding use-cases Apple never imagined.

Provisioning

Somewhere inside Apple, there’s a guy who is receiving untold, nay, unspeakable pleasures by inflicting on the development community a kind of suffering that is as acute as it is pointless. That pain comes in the form of a series of hoops you’re forced to jump through in order to turn your phone into a development handset. There are provisioning profiles, ad-hoc builds, certificates, and countless screens that I clicked through, not really caring what they did, because they brought me closer to being able to run my code on my phone. On Android, you check one option in preferences. That’s it.

Publishing

Having been battered and abused by the iPhone App Store review process, it came as a bit of a shock to upload my binary to the Android Marketplace and have it appear only seconds later on my phone. In a way it was a bit anti-climactic. Despite the palpable sense of relief at my realisation that clicking “Publish” really is the last step, I’m actually not sure if this is a good thing or not. Broadly, I think that being able to rapidly get new versions to users is a good thing, but looking around the Android Marketplace, my observation has been that the software is of lower quality, on average. I can’t back that up with hard numbers, but I think the argument can be made that Apple’s process, for all its warts, does encourage better software. I know we have spent time making sure things are “just right” on iPhone, where I think we might not on Android; it’s a lot easier to think “we’ll just push another version tomorrow.” I’ll be interested to see how this plays out.

IDEs, Simulators, The Usual Suspects

To develop on iPhone, you need a Mac. This sucked, as it meant that the first step on my journey to iPhone fame and fortune was to drop $2K on a computer that I didn’t really want (I still view it as a bit of a toy; you’ll have to pry my Thinkpad from my cold, dead hands). iOS development uses Xcode, which some people love. I can tolerate Xcode, but at the end of the day there’s something I can’t quite put my finger on that just rubs me the wrong way about it. It might just be that I haven’t had that epiphany that so many Mac owners seem to, where they just can’t imagine using anything but their Mac. Anyway, Android development is done in Eclipse (you can do it other ways, but Eclipse is the recommended path). I have to admit that I expected to hate Eclipse (Vim is my weapon of choice), but I’ve found it surprisingly good. The Java UI toolkits are still a bit weird, but overall it’s actually fairly pleasant to use, assuming you have enough RAM to use it (in fairness, my Mac has twice as much RAM as the Thinkpad I do Android dev on). And you don’t need a Mac to run Eclipse.

The emulators provided by both platforms are also kinda interesting. The iPhone developer platform ships with what they call a “Simulator” — it’s basically a build of the iOS runtime for 64-bit Mac OS on Intel. To run your code in the simulator, you actually have to build a separate binary, and the code all executes basically at the full speed of the host computer. We’ve actually been bitten by this before, because it’s really easy to believe that your code is crazy fast when your main interaction with it is on a quad-core Core i5 chip, instead of a single-core ARM chip. The Android dev kit takes a different approach: they actually use qemu to emulate the whole system — you make a disk image, and it boots up, and you’re actually running Android inside a virtual machine on your computer. You use the same binaries in the emulator as you do on the phone. Except that you don’t: because it’s a full emulated environment, running stuff in the emulator is crazy slow, so you end up running all your code on a phone all the time. Which I think is wise — you’ll know exactly how fast your code can be expected to run in production that way.
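One practical consequence of the simulator running at host speed: when you care about performance, measure it explicitly on real hardware rather than trusting how the app "feels" on your desktop. A crude (hypothetical) timing harness, using nothing but `System.nanoTime()`, is enough to make the point:

```java
// Sketch of a crude on-device timing check -- not from the original
// post. The point: numbers gathered on a desktop-speed simulator say
// little about a single-core ARM handset, so measure where you ship.
public class TimingSketch {
    static long sumSquares(int n) {
        long total = 0;
        for (int i = 0; i < n; i++) {
            total += (long) i * i;
        }
        return total;
    }

    public static void main(String[] args) {
        long start = System.nanoTime();
        long result = sumSquares(1_000_000);
        long elapsedMs = (System.nanoTime() - start) / 1_000_000;
        // elapsedMs on a 2010-era handset will dwarf the same figure on
        // a Core i5 -- which is exactly why simulator timings mislead.
        System.out.println("sumSquares = " + result + " (" + elapsedMs + " ms)");
    }
}
```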

The Market

In an attempt to vindicate our decision to develop for Android, I wandered into an AT&T store the other day and had a chat with the sales assistant there. She said to me that they sell about 1 Android handset for every 3 iPhones. For her, Android was still a bit of a fringe platform. But you add to that all the carriers in the US that don’t have iPhone, and even anecdotally, that’s a lot of units. Despite quipping at the start of this article that we have somewhat of an iPhone fetish, the truth is that we go where the users are (as any entrepreneur worth their salt will). There comes a point where Android development could be 5 times harder than iPhone and you’d still do it, because without it you’d fail. I don’t know if we’re at that point yet, but the equation for me was surprisingly tilted in favor of Android thanks to the simplicity of developing for it. Having gone through the process, I have a clearer understanding of Google’s strategy: I think they hope to win because their platform is so much more developer friendly. It’s a strategy that Google knows well, because that’s what the web is: it is dramatically simpler to prototype a web application than it is to hack something up in C++ (especially Visual C++, which taught me that there is a reason they’re called “compilers” and not “compliers”). And y’know, it really could work. It’ll be interesting to see how this all ends up.

Whereoscope is available on the Android Marketplace (tap here if you’re reading this on your Android phone). It’s free for now, so get it while it’s hot!

If you use AppBrain you can download it from here.

You can also download Whereoscope from the iPhone App Store.

193 Responses to Android vs iOS: A Developer’s Perspective

  1. Alan Hogan says:

    I’m sorry, but your second word is a typo. Should be “its.”

    • James says:

      You’re absolutely right. How embarrassing. I’ve fixed it and uploaded the corrected article. Thanks for pointing this out!

      • Chris Davies says:

        I’m sorry, But Alan Hogan is flat-out wrong. James you are absolutely correct: The possessive apostrophe must NEVER be used when the “its” could be reasonably replaced with the word “his” or “her”. (Your usage makes sense, even though an application doesn’t have a sex.)

        This is a common mistake made by people with little formal education.

        Its is entirely correct. Stick to your guns and don’t let morons try to ruin your written English.

      • James says:

        Hi Chris, I’m pretty sure Alan was right: the initial version had a redundant possessive apostrophe. I’ve since updated the article, so the current version incorporates Alan’s comments. It sounds like we’re all in agreement!

      • Space Gorilla says:

        The easy way to remember this is there are only two forms of the word, the possessive (its) and the one that means it is (it’s).

      • Russell says:

        I love this. People that care about using proper grammar and spelling. Normally, it seems that writers don’t want to hear about their mistakes and so I rarely bother. Thanks for wanting to get it right. (Now I’m praying I haven’t committed some mistake!)

  2. thrill says:

    Can you check the link/QR? I keep getting ‘could not be found’ with either one.

    • James says:

      The link and QR code work on my handset — a Nexus One. I just re-checked them. You’ll need Android 2.2 to be able to run it — is it possible you’re running into issues there?

      • thrill says:

        Sadly, that’s the problem I suppose, as I have a Verizon Galaxy S, provided by a phone company that likes to update their software slightly less frequently than the sunspot cycle – though it seems the system should instead give a ‘version too low’ or some similar error.

      • James says:

        Sorry to hear that. We’re also pretty new to developing on Android, so it’s entirely possible that something else is at play. Let us know if you do get to the bottom of it. I’m happy to email you a binary if you’d like to try it out — shoot me an email at and I’ll hook you up (though we still don’t have support for anything older than 2.2).

  3. Guy says:

    I’m new to Android (using a Galaxy S–Android 2.1) but so far the UI seems quite intuitive and very comparable to my iPod Touch. I can’t imagine that an actual iPhone would be that different. So I don’t really understand your comment that people will be turned off within 30 seconds of use… ?

    • James says:

      It could very easily just be me, and the fact that my mind has been contaminated by Apple thinking. However, my experience has been that people who haven’t used Android handsets before (irrespective of whether they’ve used iPhone or not) find the 4 buttons down the bottom confusing. There’s no visual cue to hint that that’s what they should be using; people look for stuff to appear on the screen when they’re using a touch based UI. Once you realise that you really do need to use the buttons, and to what extent application developers treat the UI like a stack, you “get it” and it becomes intuitive. I just think that’s not the way people think when they first approach the device.

      I actually think it would improve matters considerably to have a mode where the 4 buttons are mirrored actually on the touch screen. Show that to beginners, and prompt them to turn it off later.
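The "UI as a stack" model James describes can be sketched, outside any real Android API, as a plain stack of screens where the hardware back button pops the top entry (a toy model of my own, not Whereoscope code):

```java
import java.util.ArrayDeque;
import java.util.Deque;

// Toy model (no real Android classes) of the back stack: opening a
// screen pushes it; the back button pops it, revealing what was under.
public class BackStackSketch {
    private final Deque<String> screens = new ArrayDeque<>();

    void open(String screen) {
        screens.push(screen);
    }

    // Returns the screen revealed by pressing "back", or null if we
    // just left the last screen (i.e. backed out of the app).
    String back() {
        screens.pop();
        return screens.peek();
    }

    public static void main(String[] args) {
        BackStackSketch nav = new BackStackSketch();
        nav.open("family map");
        nav.open("settings");
        System.out.println(nav.back()); // prints "family map"
    }
}
```

Once a new user internalises that model — back always pops, it never navigates forward — the hardware buttons stop being mysterious.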

      • Justin says:

        I totally agree.

        I used an ipod touch as my smart device for three years and android was seriously strange to me. But then again Blackberry has hard navigation buttons too so it’s not that out of line. But yeah anybody coming from the world of iOS will find themselves pointlessly looking in the top left for that back button.

  4. Loving my iPhone and loving garbage collection is what pushed us to port Mono to the iPhone.

    Today there are some 1,200 apps on the AppStore using Mono. Either using Unity3D where Mono is used as the scripting engine for the game, or using MonoTouch, which is mostly a C# binding to the native iOS APIs.

  5. Shocked says:

    Gee, someone who hates macs, and can’t be bothered to learn memory management in objective-c — which is actually easier than dealing with Java’s cruft and overhead– decides that he likes android better? Color me shocked!

    It sounds like I’ve spent more time writing java code than you’ve been programming. No way competent programmers are able to develop faster for android than iOS. The 5 times worse figure sounds like an understatement, actually.

    • James says:

      To be fair, I also can’t be bothered to learn memory management in Java or Python or C#. I’m pretty even-handed in my apathy.

      And I don’t hate my Mac — it’s great for watching Netflix!

      • Confused says:

        Saying you lack the time to learn proper programming techniques like memory management makes me worry about your products on embedded systems. Yes, even with Java there are ways to write memory efficient code. The garbage collector should never be considered a god, because especially in Java, it isn’t.

      • virgil says:

        @confused – you really should change your attitude. It seems to me that you did no serious programming in anything other than C, otherwise I can’t explain it.
        I teach compiler design, I have a course on garbage collection, so yes, I know very well the tradeoffs. And I fully support James on this: GC is essential for developers – going back to C++ from C# I absolutely hated the lack of GC. I’ve got used to it (again), but I still dislike it.

        GC penalty is nowhere nearly as bad as most people assume – not with a generational garbage collector. And the GC may always improve – your memory management will not.

        (BTW, the GC is not “in Java” – it’s in Dalvik, or JVM , or whatever VM. Big distinction. The HotSpot VM’s GC is actually pretty amazing).

        One last thing – an anecdotal evidence: my first program (more complex than “hello world”) that ran perfectly on the very first attempt to run it (after the compilation errors were all fixed) was a homework I did in the university. Incidentally, it was also the very first program I did in Java – a new language that I decided to “try out” for that homework (a semester project). I was fully convinced of the merits of GC ever since.

      • jeremiah says:

        “shocked” and “confused” are just that. Java is a fine development platform, and it has its nuances, just like any other development platform.

        Anyone that says Obj-C is easier to develop with than Java doesn’t know Java to the extent they know Obj-C, or simply doesn’t like Java for whatever reason. Neither are valid reasons to profess that Obj-C is a better language.

      • Apuku says:

        Having written a fair amount of both Java and ObjC, I have to say I prefer Java. Manual reference counting – c’mon – it sucks. iOS has some nice stuff in it, but the memory management is painful.

      • Nate says:

        Lol… I remember when my Mac couldn’t even do that! Now they’ve got Steam and everything. *sniff* they grow up so fast…

        Too bad you can’t say the same about the rude fanboys.

      • DP says:

        I agree. As someone who went from C++ to C# I see no need in modern programming languages to not have GC. There are bigger problems that you need to worry about.

        That said if you really like it then you still have your avenues to handle it. But I think it speaks highly of his products that he has a bigger picture view of software then just memory management.

    • Some Guy says:

      If it takes you more than an hour to read and understand the Cocoa Touch memory management rules, or more than a day to understand Objective-C, then you’re in the wrong line of work. You should be writing VB apps for middle managers at a drywall wholesaling company.

      • virgil says:

        Some Smart Guy – I get your point that “real programmers do explicit memory management”. But let me tell you that you’re a softie – in fact, I’ve heard it from veterans that real programmers in fact use assembly language!!!

  6. steve says:

    Yeah, I’m a die-hard vim user. The vi plugin for Eclipse is your best bet to remove all those extraneous j’s and k’s from your source. It is ever so worth the small price of $20 (or whatever 15 euro is now).

    It has the most useful bits of vim … there’s one thing I always forget is missing (and am forgetting now). The :g commands are not there, but it’s easy enough to duck into to vim since those are usually some sort of task-based dealio and not a day-to-day editing thing (for me at least).

  7. Memory ownership rules are the same everywhere: they’re object ownership rules. You learn them once, and you know them forever. Then you learn to apply them correctly, and you never leak or overrelease again.

    I don’t mean that garbage collection isn’t useful, it certainly is! But if you can’t learn memory ownership rules you’re either very stupid (which I doubt) or you haven’t really tried. Developers are paid for what they can do, which is tightly related to what they know. Nobody’s going to pay a developer for what he can’t be bothered to learn.

    If learning was easy, the world would look a lot different.

    I’m sorry if this sounds really harsh or trollish, it wasn’t meant to.

    • James says:

      Hi there. Valid points, definitely. I guess to my mind, I view memory management as the same kind of skill as being able to remember all the phone numbers that you need to call. Once upon a time, this was important, arguably a critical part of being able to make effective use of your mobile phone. But these days, mobile phones come with perfectly good address books, relieving the user of the need to do that work.

      Now, I’m totally capable of learning the rules of yet another language, and indeed I had to in order to do the small amount of iPhone dev that I did do (the iPhone client has been Mick’s domain). My point is only that this is a task that the platform can handle for me, so I don’t understand why it is that Apple have chosen not to provide this function.

      Whatever you think of what’s worth paying for, skills-wise, I don’t think it makes sense to pay for skills if the computer can do the work.

      • I won’t say I love managing memory, but I do appreciate the deep understanding it’s given me of what’s going on when my application is running. And that understanding is why I’m glad I learned it.

        You’re never going to get away from doing memory management entirely. There are quirks and specifics to every system, including (maybe especially) collected ones. Indeed, a quick Google search shows lots of people fighting memory leaks on Android.

        Because I spent a few hours to learn object ownership rules once, years ago, my code in that future environment where neither of us does explicit memory management will probably leak less than yours. Indeed, that’s probably true right now: I have demonstrated that my code has zero memory leaks. Not “one or two I can’t do anything about,” or “I think it’s okay,” but a demonstrated zero.

        As to why Apple didn’t provide garbage collection: My guess is that the original iPhone (and probably the iPhone 3G) simply didn’t have the power to do it well. I mean, there were enough complaints about the phone being slow as it was! I suspect we’ll see GC in a future iPhone, though I’ve no idea when. And when that day comes, I’ll still understand exactly what’s going on in my code and know I have no leaks.

    • MrMan says:

      You are thinking like a ronin, an employee. Pridefully listing your accomplishments. James is thinking like a business owner – “how can I reduce the cost of X?” Where cost is human resources (time and $$), the price of a Mac, etc. I am proud that you can do correct memory management, but even done correctly, is it more efficient, ie less costly, overall, to do it by hand? For the vast majority of apps, the answer is no.

      • Yes, it is. Because someone who understands memory management rules will churn out code that works on my first try, and someone not used to GC collection will churn out code that needs multiple releases to fix.

        This is classic software development: The sooner you fix the problem, the cheaper it is to fix. It’s *all* about money.

      • I mean someone not used to memory management rules. :) Even with GC, it doesn’t save you. If anything, it’s just another few feet of rope with which to hang yourself and your users.

  8. Chris B says:

    It’d be nice if iOS development allowed Objective-C’s garbage collection, and I expect it will some day. I am not a big fan of having to use ObjC, but I think the more interesting issue here is that you’re claiming to pursue Android because there are more users. I’ll be interested to see how your downloads compare.

    There are certainly more Android handsets out there, but it seems that the iOS App Store is a far more appealing store to be in. When was the last time you heard someone rave about an Android app? I don’t think I’ve heard it a single time. I have a slew of friends using Android phones, but the only reason anyone ever gives for using one is that it’s not AT&T. Whereas with iPhone, you constantly hear people talking about cool apps. Daring Fireball did a pretty good article on this. I’d love to see Android compete in earnest with the iPhone, but I just don’t think it’s really even close today.

    As for dev tools and such, we all love to argue the merits of different ones. As said, I’m not a big Objective-C fan, but I have no interest in going back to writing Java either. I fully agree that the iOS deployment process is a royal pain. The first time you start doing that stuff, with all the provisioning profiles, and having to put UDID’s of phones in, and all that, man, it’s a massive buzzkill! I do agree that it helps limit the crap in the App Store, but clearly plenty still gets through, and I’d like to see Apple make this process easier.

    Finally, in terms of alternatives, I would suggest folks take a look at Appcelerator Titanium – whether for iOS or Android. We just submitted our first app to the app store, and used Titanium for it. Our app is NOT cookie-cutter Titanium, and we spent a lot of time polishing it, and so on. We wound up writing the whole thing in Coffeescript (which I prefer to JavaScript :) which was a nice improvement. Titanium is pretty good, and the key is that, depending on your background, you can get up to speed a lot faster, if say you’re familiar with JavaScript (more than ObjC), and/or you want some cross platform stuff, and so on. You still need to understand the platform you’re targeting, but it’s a great way to get in the game if you aren’t familiar with ObjC. It won’t work for every app, but as you mention, it will for “most” apps. So, it’s a consideration for folks for their first app, or as a way to prototype an app faster, etc.

    • James says:

      As you can imagine, we’re also pretty interested to see how Android uptake compares to iPhone.

      Most of the apps I hear people rave about are games at this point, but there’s definitely still a place for specialist apps like ours.

      Titanium sounds interesting!

    • dieselmachine says:

      “I have a slew of friends using Android phones, but the only reason anyone ever gives for using one is that it’s not AT&T.”

      Funny, I know a bunch of people with Android phones, and the reason they give is “it’s not Apple”.

      • Chris B says:

        Yes, whether it’s “it’s not AT&T” or “it’s not Apple”, same diff really. I mean, they’re buying a phone/device not *for* that device, but because they don’t like the company/companies of another device. In other words, they aren’t buying it based on the merits of the device itself, but because they don’t want a particular carrier or company. That is a fine reason, but it doesn’t do anything to show that an Android is a “better” device. The Apple haters can be what they want, and we need them, but if the day finally comes that the iPhone is on say Verizon or something other than AT&T, it’ll be interesting to see how many of the folks that bought Android only to get on another network will switch back.

        This won’t hold true for the Apple haters, and again, that’s fine. I want to see competition, and I’d love to see the Android phones do really well, as I think they have some great tech in them. I just think if you’re judging this from a business point of view, Android isn’t that appealing yet.

  9. k says:

    Which thinkpad do you use? Can you post your spec? I right now have an old tseries ;and since I am graduating, I am looking to upgrade. Thanks.

    • James says:

      Thinkpad X61. Core 2 Duo. 2GB RAM. Runs Ubuntu 64 bit (64 bit was a good call). 750GB spinning rust disk. Nothing special by today’s standards, and probably about due for an upgrade. The X201 looks like a likely candidate.

      • JasonK says:

        Let me guess…you bought a lower-end Macbook at the time and compared it to a more expensive Thinkpad right? I have a Macbook Pro…2008 model…the last one before the unibody machines came out. It’s a far nicer machine than any of the thinkpads I’ve used at work so far and I’ve used three different models….currently on a T410. It runs OS X..I boot camped Win7-64. I run Ubuntu 64bit in vmware and can run the bootcamp partition in vmware as well. Toy my ass.

      • James says:

        You guessed wrong. My Thinkpad is both older and lower-specced than my MBP (both of them; after all the failures the first one had, they replaced it with a new one). The delta in reliability is what makes it a toy for me: as I noted elsewhere, I’ve had to send my MBP back to the shop 3 times, after treating it with kid gloves (it just looked fragile to me). You wouldn’t believe the punishment my TP has withstood.

        My MBP is well-specced (crazily so), and it can do some cool stuff. But if it’s doing that cool stuff at an Apple repair center…

      • Jeff Barbose says:

        The delta in reliability is what makes it a toy for me: as I noted elsewhere, I’ve had to send my MBP back to the shop 3 times, after treating it with kid gloves (it just looked fragile to me).

        You need to look up the definition of “toy”. Maybe your particular purchase is a lemon, but Santa doesn’t bring a gamut of technological, mechanical and entertainment devices whose only commonality is poor QC to all the girls and boys at christmastime.

        No Mac for sale today is a “toy”.

        The last time the appellation “toy” was used for Macs was when people were still using DOS and called the original Macintosh a “graphical toy”, saying that “that silly graphical interface” was for “toy computers” and would never go anywhere and that “real computer users” used the command line.

      • James says:

        So Mick and I both bought the same model of Mac. We’ve both had the mainboard fail (and I’ve heard that story from a bunch of others who bought MBPs of similar vintage — apparently the cooling isn’t designed well or something?). We’ve both had fans fail (which is worrisome). Mine had the graphics chip fail, and then the display fail. Maybe it’s not a toy, but it’s kinda fun to play with, and empirically has not met the reliability standards I expect of a critical work tool.

        I’m not denying that they’ve crammed a lot of cool technology into it. But when it was discovered that people spill liquid on their computers (of course, you try not to, but it happens), Apple’s response was to put moisture sensors in the machine so that they wouldn’t be liable for repairs. Lenovo’s response was to put drain holes in the bottom of the machine so that would keep working. To my mind, that’s Apple acting like it’s a toy rather than something they expect you to rely on.

        But whatever. I’m glad you like your computer. I only ever intended to share my own experiences, and frankly I’m glad yours have been different to mine. Maybe macs aren’t toys, but empirically, mine have not been industrial strength tools either.

  10. Michael says:

    If the lack of selectivity in the approval process of Android apps becomes a real problem, someone will step up to offer something the like of the Appstore. Another great thing about openness is that it doesn’t have to be all google all the time.

  11. jinushaun says:

    I’m gonna have to call BS on the documentation argument. One of my biggest gripes with Android is documentation, which I feel is nothing more than a list of methods and properties explaining what I already know. Thanks for nothing, Google. I can use Eclipse’s code completion for that. Where are the examples? Where are the programming guides on how to use a particularly important UI control? Telling me that a parameter is an int doesn’t tell me anything. What does that named constant do?! Android docs don’t tell me this. iOS docs do. The Android docs have NEVER once helped me out during development. I’ve always had to rely on either Eclipse’s code completion for the API “documentation” or web searches on how to use SDK classes. The iOS docs, on the other hand, are invaluable and I find myself keeping them open all day long.

    And don’t get me started on the nearly worthless emulator. The only thing it is good for is verifying that my app doesn’t crash on start up. Good luck using it for anything else.

    As for Eclipse, I personally despise that IDE and Android’s dev tools in general. It all feels so duct taped together. Made by DIY Linux geeks for DIY Linux geeks. That’s not a compliment. However, that’s not to say that the iPhone tools are perfect. The separation between XC and IB doesn’t make sense and the UI conventions in both apps are “unorthodox” to put it kindly.

  12. jabwd says:

    no gc? oh noz! we cannot make our app run slower, damm you apple!
    seriously take those 10 minutes you took for writing this article and read NSObject class reference. you are just a lazy jerk

  13. mk says:

    The feeling you get with XCode is that it slows you down. I’m a java developer by day, iPhone dev by night. Eclipse/Java are _light years_ ahead of XCode/Objective-C. My gripes with XCode include everything from the build process, the God-awful debugger, Interface Builder (what a POS!), and all the way down to little minutiae like key bindings that just don’t make sense (try selecting a block to indent it, every other IDE in the world uses tab, XCode uses Cmd-] wtf?).

    Not to mention Objective-C, which, as you allude to with garbage collection, is a far, FAR inferior language compared to Java. There are things like passing undefined messages to objects, which only generates a warning at compile time, and sometimes those warnings don’t appear in XCode — so when your code doesn’t work, you’re left scratching your head. String management is A.W.F.U.L. Why isn’t the + operator overloaded for concatenation?

    Finally, Objective-C is a very awkward language to use at the keyboard. All the symbols you have to use like object notation [] and nested calls [[UIView alloc] init], string designation @, are bypassed in java, making typing in the actual language much faster.

    Thanks for the article, now off to download the android SDK! :)

    • jay says:

      Overall you make fair points.

      Coming from iOS I find the layout process in InterfaceBuilder to be better than Android’s awkward and convoluted layout mechanisms. (which is understandable given that Android needs to run on so many different layouts, screen sizes etc)

    • Jean-Denis Muys says:

      Funny. Take your post. Invert it. That’s my take (and experience). Want to torture me? Ask me to do Java. Want to give me a breath of fresh air? Give me Objective-C (or Smalltalk).

      And please, don’t – ever – ask me to work with Eclipse again.

    • Matej K. says:

      Calling interface builder a POS pretty much puts your entire comment into perspective.

    • David Liu says:

      On the other hand…

      Eclipse/Java development runs in a Java IDE, which chugs along at a snail’s pace on my dev computer. I have to wait for everything, from switching files to changing between editor and Debug mode. Writing XML for doing layouts? How silly!

      Java throws exceptions for calling methods on null! It requires me to write a getter/setter myself!

      And Eclipse’s autocomplete pales in comparison to Xcode’s autocomplete! Ugh!

      Honestly, a lot of the things you mention are actually quirks I appreciate in XCode. And I don’t get what you mean by the debugger; works fine for me in every case that I’ve used it. In Android however, try debugging a service that runs on a separate process. Headaches abound.

      Once you understand the tools you’re using, everything’s smooth sailing.

    • Erik says:

      I think there is a lot of unfair criticism of Objective-C and xCode here. Most of it sounds like it comes from people who have worked in other development environments and recently switched, only to discover that everything isn’t exactly as they were used to.

      > My gripes with XCode include everything from the build process, the God-awful debugger, Interface Builder (what a POS!)

      Could you be more specific? I think Interface builder is the best interface designer I ever used. It makes it really easy to follow UI guidelines. I don’t know of any designer that makes it that easy to deal with the Model-View-Controller pattern.

      > There’s things like passing undefined messages to objects which only generate warning at compile time, and sometimes those warnings don’t appear in XCode — so when your code doesn’t work, you’re left scratching your head.

      Objective-C is a dynamic language just like Python or Ruby. I don’t hear people calling those languages crap because the compiler doesn’t tell you about passing undefined messages. Dynamic typing has advantages and disadvantages. Objective-C is a happy middle ground because it allows you to give type hints so the compiler can warn you about sending the wrong message.

      > String management is A.W.F.U.L, why isn’t the + operator overloaded for concatenation?
      It is a little awkward in places, but NSString is in many ways more flexible than Java strings. E.g. you can do quite a lot with:
      NSString *foobar = [NSString stringWithFormat:@"%@%@", aStringObject, anotherStringObject];

      And with categories you can easily add your own string methods to NSString. Something you can’t do in Java.
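      For comparison, a minimal sketch of the Java side of that concatenation trade-off (the class and method names here are made up for illustration, not any real API):

```java
// Java's two common ways to build the same string: the overloaded
// + operator and String.format (the rough analogue of NSString's
// stringWithFormat:). "Concat" is a made-up class name.
public class Concat {
    public static String join(String a, String b) {
        return a + b; // operator overloading, which Objective-C lacks
    }

    public static String formatted(String a, String b) {
        return String.format("%s%s", a, b); // analogous to stringWithFormat:
    }
}
```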

      > key bindings that just don’t make sense (try selecting a block to indent it, every other IDE in the world uses tab, XCode uses Cmd-] wtf?).

      This is just pure ignorance. Those are the standard keybindings for indentation in ALL Mac apps. Nothing to do with xCode. Why should xCode break the standard?

      > All the symbols you have to use like object notation [] and nested calls [[UIView alloc]

      An IDE will usually complete “[UIView alloc]” for you. But yeah, Objective-C is a bit verbose. I guess it is a tradeoff: it takes longer to write but is faster to read and understand later.

      • mk says:

        To all the people asking why I don’t like IB, I’ll give you a very good example why I gave up on it. If you happened to add images to your project using the “create folder references” option instead of the other one (which creates files in a single directory on disk, and makes managing a gazillion image files damn near impossible), you now lose the ability to see those files in IB when you try to add it to a UIImageView.

        Try it: add an image using “create folder references” to your resources folder. Then go to IB, create a UIImageView and assign your image to it, using the reference. In IB, the UIImageView won’t actually contain the image (just a blue “?”), but if you run it (and you set your path correctly) it will show up in the simulator. Fine, I can have empty images in IB for simple projects, but for anything a little more complex, this is useless.

        I didn’t realize that cmd-[ is OS X-wide. If that's the case, it's a bad standard and I maintain that [] are really awkward to type.

        “Java throws exceptions for calling methods on null! It requires me to write a getter/setter myself!”

        If you are calling methods on null, I’m sorry, but you are a bad programmer in any language. Regarding getter/setter, Eclipse has a macro that does this for you. In XCode, you have to switch between .h and .m files and write @synthesize and @property manually for each field.

        Tell me, how much easier is this:
        // in .h
        UIView *abcd;
        @property (retain) UIView* abcd;
        //in .m
        @synthesize abcd;


        private UIView abcd;
        //Go to Source->Generate getters/setters->check abcd (and all other fields)->OK. Done.
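        For what it’s worth, the accessor pair that Eclipse generates from that menu command looks roughly like this (a sketch; “Screen” and the field type are placeholders, not real SDK classes):

```java
// Roughly what Eclipse's Source -> Generate Getters and Setters
// emits for a single private field. "Screen" is a placeholder
// class name for this sketch.
public class Screen {
    private Object abcd;

    public Object getAbcd() {
        return abcd;
    }

    public void setAbcd(Object abcd) {
        this.abcd = abcd;
    }
}
```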

        Anyway, I don’t pretend I know anything about Android dev – I write enterprise apps for my day job, so everything I said about Eclipse applies only from my perspective dealing with Eclipse in the enterprise world (beans, ant, app servers, etc).

        It’s funny but the reaction to my post exemplifies exactly what I always thought about Macs (in general): fully embrace their technologies, and you’ll be happy as a clam; attempt to do things your way, you’ll end up feeling like me, unhappy, grumpy and with less hair, wishing there was something nearly as polished as OSX/iPhone so you can finally breathe.

  14. Fredrik Olsson says:

    No time to learn memory management? Just how long does it take to memorize:
    • Alloc, copy and retain = I own it
    • Don’t take ownership of delegates.

    Longer than writing the try-catch for an IOException I imagine?
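    For reference, the try/catch boilerplate being compared against is the familiar Java pattern below (a minimal sketch; the class name and path handling are invented for illustration):

```java
import java.io.FileReader;
import java.io.IOException;

// The checked-exception ceremony Java imposes on a single I/O call:
// try/catch/finally around anything that declares IOException.
// "ReadFile" and its method are invented for this sketch.
public class ReadFile {
    public static String firstCharOrEmpty(String path) {
        FileReader reader = null;
        try {
            reader = new FileReader(path);
            int c = reader.read();
            return c == -1 ? "" : String.valueOf((char) c);
        } catch (IOException e) {
            // A handler (or a throws clause) is mandatory at every call site.
            return "";
        } finally {
            if (reader != null) {
                try { reader.close(); } catch (IOException ignored) {}
            }
        }
    }
}
```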

  15. Dimitris says:

    I will only agree on the documentation argument.

    It seems to me that you have only been tolerating Mac/iPhone, and that you never “got” the beauty of it. You yearn for your beloved managed environments (Java, etc.) that allow you to be a lazier developer at the cost of the user’s runtime. As you also noticed, Eclipse eats tons of RAM precisely because of the crappy managed platform it is built on.

    Personally, I have been writing Java enterprise apps for nearly a decade now. I have also been playing with Objective-C for about a year and I am loving it. The best C-based OO language, IMO. As far as the GC argument goes, Objective-C 2.0 has it, so you will see it in iOS sooner or later. Until then, however, having to think about clearing up your own mess makes you a better programmer and your users happier, don’t you think?

    • Danny says:

      “You yearn for your beloved managed environments (Java, etc.) that allow you to be a lazier developer at the cost of the user’s runtime.”

      Exactly. I cannot think of a single managed application (Java or otherwise) that wasn’t either mediocre or downright horrible. Developing in a managed environment means giving responsibility away to the runtime vendor, and no two VMs are alike.

      It’s simple – if you want to write successful, high-quality software, LEARN YOUR TOOLS.

      Java promised development on the cheap and failed on the desktop because the result was invariably badly designed, slow programs that consumed unnecessary resources. Java was the buzzword of the late 90s, and the result was a generation of developers who don’t know jack about how to really program. Let’s not perpetuate this fallacy on the mobile platform!

  16. snk_kid says:

    Garbage collection should be optional, because at the moment garbage collectors are a hindrance for games. I know this as a fact because I’m in the industry; we talk about these things all the time, and I know someone who has just gone through the pain of dealing with Android/Dalvik’s garbage collector.

    Games have very predictable memory patterns. GCs cause pauses in games, so most serious games written in GC’ed languages use object pools to reduce the pause times, but this doesn’t catch all cases of (hidden) allocations, and you cannot completely disable the garbage collector (depending on the language), so in the end almost all current garbage collectors just get in the way rather than being useful.
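    The object-pool technique mentioned above can be sketched in a few lines; this is a generic illustration in Java (class names invented), not code from any particular engine:

```java
import java.util.ArrayDeque;
import java.util.Deque;

// A minimal object pool: instances are recycled instead of being
// allocated per frame, so the collector has little or nothing to
// reclaim during gameplay. Names are invented for this sketch.
public class BulletPool {
    public static class Bullet {
        public double x, y;
        Bullet reset(double x, double y) { this.x = x; this.y = y; return this; }
    }

    private final Deque<Bullet> free = new ArrayDeque<Bullet>();

    public Bullet obtain(double x, double y) {
        // Reuse a pooled instance if one is available; allocate only
        // when the pool is empty (ideally just during warm-up).
        Bullet b = free.isEmpty() ? new Bullet() : free.pop();
        return b.reset(x, y);
    }

    public void release(Bullet b) {
        free.push(b); // returned instances are reused, not collected
    }
}
```

    The caveat above still applies: pooling only helps with allocations you control; hidden allocations inside the runtime or libraries still feed the collector.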

    You do realize that the Android 2.3 platform/SDK was very recently released, and it’s now pushing more support for native development for game dev using the NDK, which means not using any garbage collector or Java. Check here:

    Having said that, garbage collectors are great for most other applications that don’t have soft real-time requirements.

    • James says:

      Yeah, I totally agree. I think it will eventually become a non-issue, but right now optional is a good way to go. Apple just doesn’t provide the option, so you’re stuck dealing with memory management even for tasks where it makes no real difference to performance; the only cost is wasted precious developer time.

      • Tharsman says:

        There is always a difference in performance. You may not see much speed improvement, but the GC is still running there, in the background, nibbling and chewing away at the battery life of the mobile device.

        Perhaps there should be options to allow developers to still write such programs, and leave it to the user to decide if they want to use them or not. Apple relaxed its development rules a while back, so I’d not be shocked if eventually we see things like REALbasic compile iOS apps. At that point, I would hope, as a consumer, that Apple visually identifies non-native apps in the App Store.

        For the time being, if the choice is pure native code without GC, or a VM language with GC, I’d rather they take the former path.

        Oh, and one more thing: when I jumped into iOS development, I got myself a cheap Mac mini for 500 bucks. Given that no cellphone is faster than the mini, there was no reason to go any higher than that, and even then it wouldn’t make much sense to go higher if you just want it as a dev machine. I do regret it a bit, but only because I have come to love the OS so much that I now refuse to use Windows for anything but games. Had I known I would like the OS so much, I would have gone for a superior iMac or MacBook model.

  17. William Roe says:


    There doesn’t appear to be an icon for the Whereoscope app in the market, so it would look like a prototype to people searching for apps.

    Nice article, thanks

  18. Daryl Teo says:

    My biggest gripe with XCode is the fact that it doesn’t help me a lot.

    Why doesn’t it generate stubs for me when I am writing interface files?
    Why can’t I generate stubs when I’m implementing protocols?

    Maybe XCode 4 will fix these.

  19. sichy says:

    Try MonoTouch; it has all you need and much more for iOS development, with C# and GC and all the nice .NET stuff.

  20. [...] läsning. Part of this is that the Android approach is fundamentally to expose everything to the developer, [...]

  21. mliving says:

    Funny, the manager at my local Telus store told me when I purchased my Android powered HTC Desire that almost 75% of the people coming in looking for an iPhone left with an Android phone after a proper introduction to the Android powered product line.

    Oh, and how about that iPhone compression? NOT! That’s right, kids… even those snappy new iPhone 4s still do not have any data compression whatsoever. Explains why AT&T loves the iPhone and their need for huge and expensive data plans.

    Android will win the day. Maybe not as fast as everyone would like but most are getting really tired of Apple’s growing pissy corporate attitude and are starting to “think different”. (:

  22. David Bloom says:

    Thanks for the article, this is really cool!

    I wandered into an AT&T store the other day and had a chat with the sales assistant there. She said to me that they sell about 1 Android handset for every 3 iPhones.
    AT&T’s Android phones are not as nice (or heavily promoted) as the ones offered by other carriers. I’d expect to see more enthusiasm about Android in, say, a Verizon or T-Mobile store :-).

  23. [...] Android vs iOS: A Developer’s Perspective Source: Whereoscope Blog Excerpt: [...]

  24. Manoj Waikar says:

    Nice to read the points you mentioned. I never owned an iPhone and have directly started using Nexus One – maybe the user experience could be a little less than an iPhone, but for me, a phone whose OS gets updated regularly trumps everything else.

    By the way are you aware of this –

    I am not sure, if 100% of the APIs are exposed this way, but if they are, this could be a killer language (and it’s a dialect of PG’s favorite language too) :)

  25. Clintonio says:

    I just want to point out that Android has a better market share than Apple worldwide.

    • James says:

      Yeah, I hope so. At least for the purposes of this dev work. I’ve found it hard to get meaningful figures from sources I can trust on actual market share. Apple has accused Google of misreporting, and I suspect it’s also the case that most of the “iOS devices” they count are iPod Touches, which is not so useful to us.

      Part of the point of this is to find out what the market share really means. I’ll likely post a follow-up at some point once we have some actual data.

    • Oliversl says:

      Are you talking about “market share sales” or “market share”?

      Android is going to surpass Apple in the near future, but today there are more iOS devices than Android devices in the world.

      This is a delicate subject, so that’s all I’m going to say about it. Or not :)

    • abu says:

      Android is clearly overtaking iOS in smartphone market share, but the iOS device market is still quite a bit bigger than Android’s (roughly, Apple has sold around 70 million iPhones + 45 million iPod touches + 10 million iPads, about 125 million devices in total).
      If your apps aren’t phone-specific, this has to be taken into account.

      Also, the App Store is available in around 70 countries, vs. 32 for the Android Market.

      • Tharsman says:

        I ponder how things will change if Apple opens up. Right now they are heavily handicapped due to being locked up with AT&T.

        Also, the iPhone (I think) just got released in China, one of the biggest potential markets, where Android had already been for a while, and I hear they cannot meet demand.

        It would be interesting to look at worldwide numbers by next summer, especially if the fabled Verizon iPhone comes out.

  26. Bill says:

    Do you guys see a similarity between Android and the Palm Pilot as far as software development goes? So much crap available and so much of it untested.

    • James says:

      We haven’t really looked into Palm that much. At the moment, iPhone and Android seem to be where it’s at.

      • Bill says:

        I’m sorry; I should have been more clear. Palm _Pilot_ software, as in for the device us old guys carried around ten years ago.

      • James says:

        Gotcha. Sorry. And now I feel old!

        I used to have a Palm V at the turn of the century, and I thought it was really cool. There were a lot of shitty apps, for sure. But back then, the whole concept was sufficiently revolutionary that I was willing to overlook basically any wart. Of course, that didn’t really end up working out so well for them.

        There are some parallels, I guess, but I don’t know that you can really make an informed comparison: the app stores of today simplify the process so much that it’s just a totally different dynamic than the old sites you went to for Palm Pilot software. That means you get different users, with different priorities, and you need to solve different problems to keep your users happy.

        So, in summary: I’m not sure :)

        But thanks for giving me something to ponder!

  27. steve says:

    Well, Garbage Collection isn’t supported on the iPhone because it doesn’t work for long-running applications.

    (It is the same reason on Mac OS X.)

    So while a short round of an iPhone game wouldn’t be a problem, having some application open for some days can pose a problem.

    • James says:

      I was under the impression that Obj-C 2.0 on OSX did do GC?

      • steve says:

        Sure, but even Apple says “don’t use it for anything that has to run for a longer period of time”, because GC in Objective-C is best described not as “shit” but as practically “non-existent”.

        Their garbage collector has no compaction, which means it is only a question of time until it eats all the memory and crashes.

    • Have you got a reference for Apple making that claim? I’ve seen no such warning, and it’s certainly not in Apple’s documentation on the tradeoffs of GC vs. reference counting.

    • Jean-Denis Muys says:

      Sorry Steve, but this claim is simply false. GC on Mac OS X works fine, thank you very much. I can leave my GC’d app running for months on end with no problem whatsoever. I have instrumentation to prove it.

      • steve says:

        And I have math and science to prove you false.

        This isn’t something about having an opinion, it is about facts.

        – Objective-C has a conservative Garbage Collector

        – Objective-C’s Garbage Collector doesn’t do compaction

        == Applications using it will run out of memory. Some will run out faster, some slower.

        Get over it.

      • Ken says:

        @steve The Objective-C garbage collector is not conservative, it’s exact except for the stack, which is conservatively scanned (and any raw buffers that have been marked as needing conservative scanning). Even if it was, that doesn’t imply what you’re saying.

        You say that even Apple says not to use it for anything long running? Where do you see that? I have never heard anything like that.

    • Danny says:

      “Well, Garbage Collection isn’t supported on the iPhone because it doesn’t work for long-running applications.

      (It is the same reason on Mac OS X.)”

      Absolute bollocks. Core Data applications use GC extensively and I’ve never seen anything like what you’re describing.

    • Matej K. says:

      That’s just BS. ObjC GC doesn’t do compaction because it can’t move chunks of memory, as that could invalidate existing pointers. But you get the same problem – heap fragmentation – with manual memory management.

  28. copernicus says:

    Careful, there are those in the mac community that take exception to the smallest criticism of their platform when they don’t share it.

    I agree with your article. Development for iOS is a pain, to put it mildly. It requires you to use OSX, which I personally find very unproductive, especially since I use multiple large monitors, and obj-c can be slow in surprising circumstances. I can’t think of anything polite to say about XCode either, except that it’s free as long as you provide your name, address, phone number, postal code, blood type, etc.

    Also, no flash.

    It’s pretty obvious that HTML5 cannot and will not compete with flash to those of us using the latest networking features of flash hero.

    The only good thing is that there is exception handling in iOS, whereas you have a couple of hoops to jump through to get it on Android. I’ve got development environments for all mobile devices now. The best, by far, surprised me: it’s Windows Phone 7, mainly because the emulator is supreme. It’s like the Android one except it runs at full speed, and it’s just as free (aka, I downloaded it without entering any information). iOS development is honestly a big, big pain, and while I’m sure it will remain strong in the mobile market, I’m also quite sure that it will lose its dominance.

    • daniel says:

      obviously you don’t use a Mac, you sound just as pompous

      • copernicus says:

        I’ve used a Mac plenty. At one point it was my primary development environment. I got tired of fighting the interface to be productive, and had to resort to using the command line. Once I got to that point, I just switched to Linux. I virtualize Windows to maximize my software ecosystem. OSX *would* be a great OS if it weren’t for the handholding and the rigid adherence to Fitts’s Law (which it is applying incorrectly in this age of massive screen real estate).

        The last time I tried using OSX as my primary development platform was 2 months ago. By day 3 I was so fed up I installed Linux on the machine. Now that machine is again an OSX machine which I only use for iPhone compiling; it sits headless in a corner of my office. My main development machine is now a Linux box that costs less than a Mac Pro but is at least twice as powerful.

        The OP of this article was much more graceful in his criticism of OSX than I ever will be, but I assume that’s because he hasn’t actually had to work on one for long enough. OSX did do one thing right, and that is to be a fully UNIX-compliant system. On Windows you play around with cygwin or msys and the effect is still lackluster.

    • Jean-Denis Muys says:

      Let me balance your take with mine.

      Development for iOS is a dream, to put it mildly. It requires you to use OSX, which I personally find very productive, especially since I use multiple large monitors, and obj-c is fast in surprising circumstances. I can’t think of anything polite to say about Eclipse either, except that it’s free, as long as you tolerate something bloated, slow, cumbersome, multi-screen hostile, etc.

      Also, no flash. (sigh of relief. My battery agrees)

      It’s pretty obvious that HTML5 cannot and will not compete with flash to almost everybody. I even deleted Flash from my computer altogether. Haven’t missed it for a sec ever since.

      One of many good things is that there is exception handling in iOS, whereas you have a couple of hoops to jump through to get it on Android. I’ve got development environments for all mobile devices now. The best, by far, did not surprise me: it’s Apple’s, mainly because the emulator is supreme. It’s like the Android one except it runs at full speed, and it’s just as free (aka, I didn’t pay a dime). iOS development is honestly a dream come true, and while I’m sure it will remain strong in the mobile market, I’m also quite sure that it will increase its dominance.

      • steve says:

        Can someone pull him out of Steve’s reality-distortion field?

      • Darwin says:

        Can someone pull you out of your juvenile junior-college programmer’s mindset? No? Didn’t think so.

      • copernicus says:

        Congratulations. You just validated my opening remarks.

        Also, to correct you, flash has capabilities that are beyond those in HTML5, and beyond those that will probably ever appear in HTML6. This is not something people developing standard apps would know about. Your take on flash vs HTML5 is at odds with reality. Flash 10.1 has introduced networking capabilities far beyond anything offered in HTML5.

    • Darwin says:

      The no flash comment exposed you as the pretender you are.

      • mk says:

        Copernicus, while I agree with your sentiment about XCode and OSX in general, I do have to agree that your Flash comment was a little “wtf?”-inducing. It seems so far removed from any discussion about Eclipse or XCode, that I have to wonder what your motive is?

      • copernicus says:

        Stop trolling. You’re making all OSX users look ignorant, which I know isn’t the case.

        No flash in the browser is no flash. People aren’t going to install an app that depends on the AIR runtime. Remote rendering flash and serving a video stream is also overly complex and prone to problems.

        There is no browser-based Flash runtime for iOS. I doubt there ever will be any time soon.

  29. nickprocaccini says:


  30. Oliversl says:

    Nice, but please rename the topic to:
    “Android vs iOS: A junior Developer’s Perspective”

    Seems to me that you are just starting to get to know Mac, Eclipse, iOS and Android. That’s why I suggest putting the word “junior” in the subject. No big deal; no one is born knowing. You could also use “new developer” or “my first steps into mac/eclipse/ios/android development”.

    BTW, ThinkPads rule! Best notebooks out there.

  31. TTom says:

    I understand that it might be easier to develop for a single device on Android, but have you gone through the hoops of developing for the multiple devices with different screen resolutions/hardware/Android OS versions yet? Just the posts from the Angry Birds developers made me think that it would be more hassle than it’s worth, but they may have a harder time due to full-screen game development being more screen-resolution/hardware dependent. I’m assuming/hoping not all apps are that dependent.

  32. MyFreeWeb says:

    LOL. Developing for Android is a huge pain in the ass. You have to support lots of different screen resolutions, non-multitouch screens, slower phones… And Java sucks.

    With the iPhone, it’s just Retina and normal display. And the Simulator is better than any emulators. Well, you have to test your app’s performance on a real device, but when you debug it on your Mac… it opens instantly. Starting the Android emulator takes FIVE minutes on my Mac mini. Damn, FIVE MINUTES! This is madness.

    • steve says:

      Uh … there is “Retina” and the “normal Display”. There is Apple’s “Simulator” and the other “emulators”.

      You shouldn’t take all these Apple PR pills at once.

      • cake says:

        What the flip are you smoking?

        The simulator is irrelevant since it runs at the same size as the target resolution. And wtf is Apple’s “emulator”?

  33. Sam says:

    I recommend the vrapper plugin for Eclipse; it gives you the basic vi commands in Eclipse. It’s still not perfect, but it makes Eclipse much more bearable to use.

  34. Danny says:

    “I really don’t have time to learn the rules of memory ownership and when to free, and all that stuff for yet another language”

    Sorry, but with an attitude like that, you shouldn’t be calling yourself an experienced programmer. Do you refuse to use inheritance because you can’t be bothered to learn how it’s implemented? Or unicode strings because you can’t be bothered to learn about localization?

    Using GC as a magic bullet will hurt you in the long run, and it also prevents you from using certain design patterns, such as RAII or the command pattern, which usually require strong transfer-of-ownership rules. To an experienced programmer using smart pointers, the lack of garbage collection in C++ is NO handicap – quite the opposite.

    In C/C++, I can determine exactly when an object will be freed. GC removes that control and leaves me at the mercy of the runtime to do the right thing, which may be radically different between platforms. As a former VB and Java developer who had to contend with memory leaks in the vendors’ runtimes, I know how frustrating this can be.

    “It’s a big enough deal that I would advise the first-time mobile app developer to start on Android for this reason alone.”

    Memory management is a vital skill in programming. Recommending that developers switch to a platform that *appears* not to require those skills is like recommending that we all switch to VB because it has a drag-and-drop UI designer. No thinking required.

    • steve says:

      As long as he doesn’t plan to develop operating systems or drivers for it, wasting time with memory management is just … wasted time.

      There are so many things in computer science _happening now_ that it is stupid to look at things which were current 10 years ago, just because Apple can’t be bothered with actually developing a decent GC.


  36. CU Illini says:

    Thanks for the great post, I really enjoyed reading about your experience. I am thinking about developing an Android app, mostly as a hobby, not as a business.

    While I am familiar with programming and CS concepts, I am not familiar at all with Android. So I was wondering if you had any recommendations for books/sites that you used to teach yourself Android. I know is a good reference, but I was looking for more of an example-based source.


    • James says:

      Hey. I’m pretty much of the mindset that the best way to learn is by doing. I just dived straight in and went googling when I got stuck. As I said, the docs are pretty good, but stackoverflow was a great resource when there were gaps.

  37. JulesLt says:

    For a free app, you go where the users are. For paid software, you go where the customers are. I see that as the big difference between the two platforms – for now.

    I’m sure the size of the Android market will mean that it will eventually contain a subset of higher-spending customers equivalent to the iOS market in size.

  38. Jason says:

    Garbage collection is a good thing, and it’s a bit baffling that Objective-C on OSX has it but Objective-C on iOS doesn’t, but it’s not a deal-breaker. Not having GC is an inconvenience, and poorly written programs do leak faster, and it’s like going back to the 90s, but you can work past all that.

    The code-signing, on the other hand: that’s a frustrating experience. It’s simply not documented well enough by Apple. I had no idea what all the bits did until I read some blog entries by developers who had reverse-engineered it all. I’m still not entirely comfortable with code-signing on iOS.

    Your bit on simulators makes no sense; you like the Android simulator because it’s so slow and useless that you don’t use it, and that makes it better? Agreed the Apple simulator could be a bit more realistic when it comes to performance.

    • James says:

      I don’t really like either of them. But looking at it in terms of results, the Apple approach did bite us and waste development time. The Android approach didn’t, and I suspect it won’t, because you’ll always develop on the handset. To be fair, shipping with no emulator would have had the same outcome, and I think that would have been a perfectly valid choice.

      It’s fair to call it a rookie mistake. But with the pace of technology’s progress, we all become rookies with great frequency.

      They both suck, but Android happened to fall on the right side of that one. I suspect it wasn’t by design.

      • Jason says:

        I think it’s nearly impossible (at least terribly difficult) for a simulator to accurately model performance. The C64 emulators have taken years to get to that level of accuracy, and that’s with a nearly 30 year old piece of hardware.

        You always test performance on the actual device. Simulators are for testing functionality.

  39. Craig Hunter says:

    Interesting perspectives, thanks for sharing. Couple comments jumped out at me.

    1) if you think the iOS cert/provisioning process is painful, you should see what some of the other mobile platforms are like. Some of them require you to submit certificate requests and wait a couple days to get approved. At least Apple does it in a matter of seconds, and following their setup recipe is fairly easy. The process is definitely not Apple-like in terms of simplicity though. Thing is, there’s a reason for all of this, and it’s to prevent any old schlub or hacker from deploying apps on the device that can cause mischief. Android is a lot less secure in this regard. Someday I expect this will come back to bite them.

    2) I don’t have any real gripes with memory management in Obj-C. In fact, I’d say that having even a basic mastery of it helps you write very good code. Done right, manual memory management is very efficient. It’s not hard at all.

    3) I don’t have any allegiance to particular languages (started coding in Fortran for crying out loud). They all have pluses and minuses. After coding in Java on Android and Obj-C on iOS, I prefer Obj-C more and like Java less, but that can go either way for a given developer. I formed some negative opinions of Java back in the 90s, and they haven’t changed a whole lot (unfortunately).

    4) I wouldn’t count on an emulator/simulator/VM for any app development. It can give you the wrong impression in *both* directions. You mentioned that the iOS simulator can make things appear too fast, but it’s completely the opposite when writing OpenGL code that doesn’t get acceleration in the sim like it does on the device. That GL code will always run faster on the device unless there are CPU-bound issues in the code. Bottom line, I think it’s naive to use an emulator/sim/VM for anything more than simple evaluation on *any* platform.

    5) As far as IDEs go, I much prefer Xcode and its intimate integration into the SDK and documentation. Eclipse does work fine, but its more open/extensible architecture doesn’t give me that “plugged in” feel I get with Xcode. Google has good documentation across the board in general, but Apple’s is quite good too (and its integration into Xcode at the source level is nice). As you note, some stuff is simply undocumented in iOS, and it can be a pain to learn by feeling around in the dark.

    6) I wish you guys success on Android. In terms of the earnings to investment ratio for a quality paid app, I still feel like my precious developer dollars are better spent on iOS right now. In my limited experience on Android, I think financial success is more heavily dependent on having a high volume free app that brings in good ad revenue. That can be a disheartening goal to chase after.

    • Darwin says:

      I lol’ed when he said he likes developing in a VM. What a dumbass.

    • James says:

      Wow. There’s a lot there. I’ll have to give fairly brief replies. Sorry.

      1. I’m willing to believe that there’s worse than iOS, but that doesn’t absolve them of engineering it to be as complex as it is.

      2. Bottom line: while I can see arguments for allowing callouts to unmanaged code, I don’t think it’s a strength of the platform that it disallows managed code. I agree that learning to manage memory is a useful thing to learn for the purposes of understanding your computer (I do think C should be taught in university undergrad). But that’s not a good reason to slow down development of production code where the GC/no-GC decision won’t impact the performance of the app or the happiness of your customers.

      3. Yeah, I’m not in love with Java. There were a few things I really missed from Python when I started with Java, and it seems overly bureaucratic in places. I think it works pretty well for what they’re doing though.

      4. Good point. I wasn’t aware of the OpenGL thing. See my other comments on this above.

      5. Fair enough on IDEs. It’s subjective, and I tried to be clear that I was just sharing my experience of it. A lot of people do love XCode, so there must be something to it.

      6. Thanks! We’ll have to see.

  40. Woah, this is the first time I read someone actually saying they had a better experience developing for Android rather than iOS. I read all the time people hating Eclipse, the emulator and the overall experience…

    • Darwin says:

      Eclipse has been a bloated unreliable pig for some years. Too bad James the script kiddie doesn’t have enough experience to know that.

      • James says:

        It works fine for me. It’s definitely more bloated than vim, but I found it pleasant enough to work in. Not sure what that has to do with experience though — surely the only relevant experience here is that of my having used Eclipse?

        It also doesn’t take 9GB of disk space, like the current version of XCode.

      • Respect says:

        So let’s see…

        On one side we have someone who comes to the developer’s blog and drops comment spam to the tune of six replies and ad hominem attacks on both the developer and the platform they chose to support.

        On the other side we have a credible developer with credible real world success and a credible subjective comparison of the two platforms.


        You are providing an extremely valuable application to Android. Many appreciate your support, dedication, and interactivity.

  41. Jackifus says:

    You wrote:

    I really don’t have time to learn the rules of memory ownership and when to free, and all that stuff for yet another language.

    And then wrote:

    Part of this is that the Android approach is fundamentally to expose everything to the developer, rather than try to hide important stuff on the (somewhat condescending) assumption that the platform developer knows better than you.

    Sounds like the platform developer does know better than you.

    Aside from that, there’s the desire for language simplification and the simultaneous, contradictory desire for deeper access (or maybe a Java environment is just what you are asking for…).

    I appreciate the balanced perspective and assessment.

    • James says:

      To me the difference is this: opening up access to the GPS, etc, is something that will fundamentally make our app better. There is simply no amount of manual memory management that will improve the user experience of our app. For us, those are the sort of details that should be abstracted away.

      Now, I also appreciate that game developers (for example) can’t tolerate the unpredictability of having the garbage collector do its thing at random times. For them, memory management is a detail that can’t be abstracted away, and having control over it will make their app better. It’s also probable that for them, low-level access to GPS (etc) won’t improve the user experience of their app.

      Now, it’s not my position that there are no cases where manual memory management is the right call. I think the approach of managed by default, with the ability to call unmanaged code is probably the right balance here. But my point stands: iOS does not provide memory management. You are forced to endure the overhead even when there is no benefit to so doing (ie: most of the time). Arguably you are forced to endure needless complexity in working with location services on Android, but I just don’t see that as anywhere near the same order of magnitude: it’s isolated, and well documented, and it won’t increase in complexity as the rest of your program increases in size.

      • Jackifus says:

        >>To me the difference is this: opening up access to the GPS, etc, is something that will fundamentally make our app better. There is simply no amount of manual memory management that will improve the user experience of our app.

        Well put – and point taken.

      • Jeff Barbose says:

        There is simply no amount of manual memory management that will improve the user experience of our app.

        If you’ve written horrible code that hogs memory and taxes your GC, causing the UI to pause and sputter and lag, then I’d say your statement is patently false because of its absolutism.

        A fast, responsive UI is one of the most important aspects of a high quality user experience.

      • James says:

        If we did that, sure. But the app, as it stands, really doesn’t have a memory management bottleneck. The comment applies only to our app. I am sure that it’s possible to write an app that abuses the GC system as you suggest, but we try not to do that — I’m pretty sure the current app doesn’t do that anyway.

        Definitely agree that keeping the UI responsive is important. But GC isn’t getting in the way of that right now for us.

      • Jeff Barbose says:

        If we did that, sure. But the app, as it stands, really doesn’t have a memory management bottleneck. The comment applies only to our app. I am sure that it’s possible to write an app that abuses the GC system as you suggest, but we try not to do that — I’m pretty sure the current app doesn’t do that anyway.

        Well, then aren’t you doing 80% of the memory management work yourself, and relying on the GC to catch the spots you missed?

        I grant you this is not an insignificant amount of time savings in a project, but it certainly demotes GC from being a panacea and, given the simplicity of the rules for memory management in Cocoa, the original author’s overstatement is that much more over the top.

  42. [...] Android vs iOS: A Developer’s Perspective Since its inception, Whereoscope has been an iPhone-only shop. The genesis of this iPhone-fetishism goes all the way [...]

  43. Darwin says:

    Yes many people will be and are turned off by Android at first use.
    You won’t make any money on Android.
    Enjoy the fragmentation.
    Being proud of not knowing memory management and saying “iPhone” doesn’t have it is the first really obvious clue that you don’t know what you are talking about.
    Not impressed.

  44. Darwin says:

    I think what we have all learned here is Steve is a juvenile dick who doesn’t know anything about programming. But he thought we would be impressed with his wisdom. Too bad there are so many people calling you out on your bullshit little Stevie.

  45. Jeff Barbose says:

    Memory management in Cocoa on iPhone (Cocoa on the Mac has GC):

    1. If you allocated the memory for an object, you’re responsible for releasing it.
    2. If you need to keep an object you didn’t allocate around for longer than the local scope, retain it.
    3. If you retained an object, release it when you’re done with it.

    OMG Memory Management is soooo hard.

    GC gives you the freedom to write crappy, memory-abusive code, but that doesn’t mean you shouldn’t know what’s going on even in a GC environment, largely in the form of the above 3 rules, when you’re writing code.

    If you don’t like spending all your time in Xcode, spend it in another editor and script it (that is, if it doesn’t already contain built-in support for the Xcode build system) to use the Xcode command-line tool ‘xcodebuild’.

    Try TextMate or BBEdit, for example.

  46. McCarthy says:

    “Secondly, I just can’t think of another language feature that accelerates development as much as garbage collection.”

    Functions? If-statements? :-)

  47. psymac says:

    Despite the hassle of developing for the iPhone/iPad vs Android, hasn’t it been well documented that the profit return from iOS apps is far greater than from Android?

  48. [...] Whereoscope guys have put up a well-publicised post explaining why they prefer Android development to iOS. One of the main gripes they had was the lack [...]

  49. phil says:

    You lost some cred when you said you think the mac is a toy and prefer the thinkpad. Battery life beats all other equivalent laptops, trackpad is unmatched, nice and thin, excellent build quality, most reliable hardware, great display, mag safe is very cool. PC hardware is clunky, still haven’t met PC hardware I like. Which model do you have? Perhaps I like my mac hardware so much because I have a 2010 15″ MBP 8 Gig i7 1680×1200 display. If you have a 3 YO plastic MB maybe I get it.

    Anyway, I mostly agree on all your other points. At the very least iOS should support optional GC. People can call you a dumbass, etc but fact is most apps would do fine with GC. And it would be good for end users as well as there would be fewer memory leaks.

    Not only is all the provisioning/cert stuff a very manual PITA, it’s also very buggy. Often the solution is to delete everything and start over again. Developer experience matters.

    XCode is 10 years behind other IDEs. I don’t really like Eclipse but it’s light years ahead of XCode. Only the most ardent fan-boy could defend the POS that XCode is. Try renaming a project in XCode and then tell me how great it is.

    • James says:

      Comments on the MBP being a toy were based on it having had to go back to the shop for repairs on 3 separate occasions. I think of it as a toy because it just wasn’t reliable enough to do work on. To their credit, after the 3rd time I had to send it back, they replaced it with a new unit, but by then I’d already migrated anything important off it because I just couldn’t trust it. My TP has suffered some serious punishment (I always treat my MBP with kid gloves because I expect it to break all the time), and just keeps ticking. I’ve heard similar stories from other owners of MBPs and TPs. I agree that the specs are nice and they’re pretty computers and “feel nice” and all that, but that’s not enough to pass the “toy” bar for me. It’s gotta be go-anywhere, kicked, dropped, spilled-martinis-on, stepped-on reliable. You try to treat them well, but if you’re using it > 14 hours a day, every day, on aircraft, at cafes, etc, those sorts of things will happen.

      Mag safe is cool. I’ll definitely give you that. That’s an idea that really should have caught on more broadly.

      Also, what’s with those weird glass screens? My original MBP was a matte screen, and it was pretty good, honestly. I let the guys handling the replacement convince me that the shiny glass screen would be better than the matte one, but there’s just no way in which it’s better — I hugely regret that. So it’s an unreliable computer that I can’t use outdoors.

      • phil says:

        I guess mileage varies. I’ve never had a problem in the 3 years that I’ve had a macbook. Same for my wife.

        But I don’t spill stuff on mine either :)

        I read recently (consumer reports or something) that macs came in first for reliability but can’t find the link. Here is a link from last year (PC Mag):,2817,2368154,00.asp

        They should be the most reliable given how much they cost of course!

        I regret not opting for the matte screen on my latest MBP too. $50 extra I think (lame!)

  50. Alex Skorulis says:

    I’ve had a pretty similar experience going from iphone to android. Though when it comes to memory management sometimes android annoys me. Especially when creating applications that are image heavy I find android doesn’t give me enough control and the application runs out of memory.
    Actually I would probably be happy if activities got memory warnings like on iphone.

  51. jrock says:

    thanks for this posting. i’m an aspiring iOS dev and have some java experience.

    could not agree more on the complexity with the certs. i had a situation where i couldn’t get an app submitted due to cert issues and i had to rebuild my entire environment which was very frustrating.

    i don’t find the memory management in iOS to be a deterrent to my coding. if anything, i feel i have more control on the code’s performance. that isn’t to say that i haven’t encountered puzzling or erroneous memory leaks though.

    to each their own regarding the ide. i’m in visual studio for my day job and i think that trumps any ide out there – :).

    • James says:

      Yeah, I haven’t used Visual Studio since VC++ 6. I assume it’s gotten better since then. :)

    • Erik says:

      Do you like the UI of Visual Studio or just the functionality provided? It has a lot of good functionality, but I can’t see why people would like the UI. It is overrun by toolbar buttons, and configuration options are hard to find. In short, I think it is a lot messier than, say, Xcode or NetBeans, which tend to show only those toolbar buttons relevant to what you are doing right now, and whose configuration options can be found by filtering/searching, etc.

      • jrock says:

        i agree the toolbars can lead to ui overload, but they don’t bother me much. i find the intellisense (autocomplete) to be second to none. the ability to “pin” a variable which shows a value in the code as opposed to the output window is cool. basic functionality seems optimized in vs, like real time code searching and being able to traverse a call hierarchy. maybe its just me.

  52. cfaherty says:

    It’s the breakpoint support in the Android emulator which causes the apps to run slowly. If you terminate the app launched via debug, and run your app from the screen you should see it run near native speed. I use that often, because it still logs to the console.

    Having the Android OS sources available is fantastic. Stick them under android-8/sources (an example for API-8) and you will have full breakpoint single-step into them.

  53. [...] Masturbates on Hacker News for other reasons) started out with this lazy and inane blog post: Android vs iOS: A Developer’s Perspective.  I’m going to dissect and refute the post later on here, but what got me heated was that [...]

  54. guest says:

    What about how hard it is to sell Android apps? Google doesn’t care about developers when it comes to making some money out of apps. The Android Market’s bizarre behavior makes it impossible for several users to download the app. I get several emails on a daily basis from users complaining they are not able to download the app from the Android Market. The market is so buggy that a lot of my paid apps are not visible.
    Developing for Android might be easy, but making money is really, really hard.

  55. Erik says:

    A note on string concatenation, which people frequently complain about in Objective-C.

    NSArray *strings = [NSArray arrayWithObjects:@"here", @"be", @"dragons", nil];

    [strings componentsJoinedByString:@" "];

    This will give the string @"here be dragons". This is, btw, a lot more efficient than doing, say, "here" + "be" + "dragons" in Java. Only one string buffer allocation is performed to put all three strings into one, while a temporary string buffer will be created for each + operator in the Java case.

    • James says:

      Gotta say, it was a bit of a surprise to see that Java didn’t have a builtin equivalent of Python’s str.join() method. Not a big deal, but also a bit of a “what the?” moment.
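James’s surprise is fair: until Java 8 added String.join, there was no built-in equivalent of Python’s str.join. A minimal sketch of rolling your own with a single StringBuilder, so the whole result is built in one growing buffer (the JoinDemo/join names here are ours, not from any library):

```java
public class JoinDemo {
    // Hypothetical helper: joins parts with separator using one StringBuilder,
    // so only a single buffer grows as the pieces are appended.
    static String join(String separator, String... parts) {
        StringBuilder sb = new StringBuilder();
        for (int i = 0; i < parts.length; i++) {
            if (i > 0) sb.append(separator);
            sb.append(parts[i]);
        }
        return sb.toString();
    }

    public static void main(String[] args) {
        System.out.println(join(" ", "here", "be", "dragons")); // prints "here be dragons"
    }
}
```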

    • McCarthy says:

      Eh, yes and no. Yes if you do it in a loop. But if you say “here”+”be”+”dragons” with string literals, as you did right there, the Java compiler builds exactly the same bytecode as if you’d typed “herebedragons” in your file. (Try it!)

      Java’s dumb but it’s not *that* dumb. :-)
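McCarthy’s folding point can be checked directly: a constant String expression is folded and interned at compile time, while concatenation involving a runtime value allocates a fresh object (the FoldDemo class name is ours):

```java
public class FoldDemo {
    public static void main(String[] args) {
        // "here" + "be" + "dragons" is a constant expression, so the compiler
        // folds it into the single interned literal "herebedragons" and both
        // sides below refer to the same String object.
        String folded = "here" + "be" + "dragons";
        System.out.println(folded == "herebedragons"); // prints "true"

        // With a runtime value involved, concatenation builds a new String,
        // so identity differs even though the contents are equal.
        String middle = "be";
        String built = "here" + middle + "dragons";
        System.out.println(built == "herebedragons");      // prints "false"
        System.out.println(built.equals("herebedragons")); // prints "true"
    }
}
```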

    • jrock says:

      of course there’s also:

      NSString *strConcat = [[NSString alloc] initWithFormat:@"%@ %@ %@", @"here", @"be", @"dragons"];

      and don’t forget to release the memory when finished – :).

      [strConcat release];

      • Jeff Barbose says:

        No, there’s

        NSString *strConcat = [NSString stringWithFormat:@".."];

        and since you didn’t directly allocate the memory for it (the NSString class method did), the returned NSString instance is autoreleased, so you don’t have to do anything to it, because at the end of this cycle of the run loop, the object will be released if no other object has retained it.

        So don’t overstate the complexity here.


        NSArray *strings = [NSArray arrayWithObjects:@"here", @"be", @"dragons", nil];

        “strings” is also returned as autoreleased, so you need to retain it if you want to keep it around. If you’re just printing it out or otherwise using it as a temp variable, you don’t have to release it as it will be released at the end of the current runloop cycle.

        Thar be no releases here. ;)

      • jrock says:

        i’m not trying to start a pissing match, but i try to avoid convenience methods when possible. i’d rather control the memory lifespan than rely on the autorelease pool. to each their own.

  56. freescifistories says:


    How long did it take to port from objective-c to java?

    My problem is that the engine of my app (as opposed to the interface) is coded in C for portability. I don’t really want to have to port the whole engine into Java.


    • James says:

      Not long — a few weeks. It’s probably also a bit generous to call it a port; I did the minimum I could to make something that works enough to test the market. The point about writing the core of your app in C for portability is interesting — once that would have been completely uncontroversial, but these days even using a lowest-common-denominator language like C isn’t enough. I’ll be interested to hear how you solve this.

      For us, part of it was that so much of our functionality is implemented in web services on our server. Of course, that’s not a strategy that will work for everyone.

  57. [...] Android vs. iOS: A Developer’s Perspective ( [...]

  58. [...] Android vs. iOS: A Developer’s Perspective ( [...]

  59. harveststorm says:

    Nice article, and I do agree on the point about why a Mac is needed to develop iOS apps. Why not open up to the PC too?

  60. Ricardo santos says:

    Garbage collection? Don’t have the time?

    Then don’t program.

    Garbage collectors waste memory and are slower than requesting memory and then freeing it. Since the iPhone just crashes the application when it is out of memory, the decision not to have garbage collection was wise.

    As for the iPhone, the worst part is that you need to be a member of the $100-a-year club in order to make an app for your own phone. And then you need to renew the app certificate every 3 months or so. Why couldn’t they allow people to create apps for their personal use? Maybe because it would have been almost impossible for them to have a monopolized store.

    • And yet, that has not been a problem for Unity applications (many of them on the top seller list every month) and Mono based applications.

      Garbage collection helps developers focus on features rather than administrivia. Sometimes you need the administrivia, but most of the time, you really don’t.

      • jrock says:

        Hi Miguel – I know your name from the Mono/.NET project. I’m going to plead ignorance when it comes to Mono, but I have a few questions if you don’t mind. One, how does the memory footprint of an app developed in Mono compare to an app developed with the native iPhone tools, and two, how long after new features are deployed by Apple are you able to incorporate them into Mono? Thanks!

  61. nick says:

    It’s great to see an article that arouses so much debate. As soon as the word Apple is mentioned in a blog, all the zealots appear. I find it strange that people would want to own a handheld computer that is generally less powerful than a low-range P4 when most professionals spend most of their day around a PC or laptop.

  62. Kevin says:

    You hit a lot of the same points I did on my blog at

    I was coming from Android to iPhone but ran into a lot of the same issues.

    I still have not gotten my whole developer certificate issue resolved with Apple. I think they just stopped reading my emails. I have not yet been able to run my code on the actual device.

    Good thing the iPhone emulator is fast otherwise I would have given up. I debug my stuff right on my Android phone. This also means if a company guy wants to demo the Android version we just put it right on his phone. Not allowed to do that with the iPhone.

    There are a lot of things that annoy me about Xcode. I find it funny when I read iPhone dev posts on how Xcode is great, but now that Xcode 4.0 is coming out they will finally admit the old one has issues. Now that Apple has deemed them as something that needs to be fixed, they can admit it was crappy all along.

    I think Apple-only people think the PC is crap because PC people complain, and complain a lot. We do for sure, but I think the Apple side does not complain enough; they run along happy with what they have instead of pushing for something better.

    Don’t like Eclipse? Try IntelliJ or NetBeans or JEdit or a number of other IDEs. Don’t like Xcode? Hmm, not much else you can choose from. PC devs push other PC devs.

    I enjoyed the post.

  63. [...] TalkA podcast by developers for developerslinks for 2010-12-09by delicious on December 9, 2010Android vs iOS: A Developer’s Perspective « Whereoscope BlogDeveloper Interview Series: TweetDeck for Android’s Max Howell | Android News, Reviews, [...]

  64. [...] Android vs iOS: A Developer’s Perspective « Whereoscope Blog [...]

  65. dtanders says:

    “I still believe that the iPhone is a truly paradigm shifting device; that by releasing it, Apple fundamentally and irrevocably changed the course of computing.”

    No, it wasn’t, and no, they didn’t. They hopped on board and gave mobile computing a nice shove in the direction it was already going by making an approachable product with a slick UX. They did the same thing with MP3 players with the iPod. Anyone already using a real smartphone to its fullest in the pre-iPhone/iTouch days would tell you the same thing: we could do all this and more before – it was just harder (sometimes much harder) to accomplish. Kudos to Apple for giving the industry a kick in the pants, but their mobile products are still overpriced toys as far as I’m concerned.

    • James says:

      Fair perspective. I also had a string of so-called “smart” phones before my first iPhone. I actually tried using one of them again after having had my iPhone for a year, and I just couldn’t manage it. I think I really overestimated how “smart” it was when I had it.

      My comments were really based on results. You’re right that a lot of (all of?) the technology already existed, but the general populace hadn’t really made that leap, IMO — everyone was using dumb, or at best “moderately bright”, phones. I think of the iPhone as paradigm-shifting explicitly because it was the iPhone that actually succeeded in shifting people in that direction.

      The “harder, much harder” part is really the key to me. Unless it’s easy, people won’t bother. That’s where Apple succeeded, and that is why I think they deserve credit for being the ones to shift the paradigm.

      I mean “paradigm”, incidentally, in approximately the sense that Thomas Kuhn used it. He spoke of science being defined by what scientists do, and came under a lot of fire for it, because it’s kind of a useless definition — what’s a scientist? and what’s science? But to my mind, until the people involved are *actually* using the device, the paradigm hasn’t shifted; to my mind that’s the strongest test you can apply to his thinking: is a community of people doing something different? If so, you’ve got a paradigm shift.

      • dtanders says:

        Then I guess we diverge on our basic criteria. I want it to do more out of the box, even if there is a learning curve. Speaking of learning curves, I actually don’t find Apple products particularly intuitive, so it’s possible I’m just too different for them at this point.

        Oh and I feel your pain regarding XCode, but don’t understand the Eclipse love – its only redeeming feature is its price. Maybe XCode lowered your expectations so much that Eclipse looks amazing in comparison?

      • James says:

        “Maybe XCode lowered your expectations so much that Eclipse looks amazing in comparison?” — possibly. I don’t hate XCode, I just can’t get myself to like it either. XCode and Eclipse are the only two IDEs I’ve really used enough to form an opinion of. But it’s a fairly minor gripe, really. As long as I can get text in there *somehow* and hit “build”, I’m happy enough. I think it’s probably something to do with Eclipse’s shortcuts being easier for me to learn because they had a certain amount of familiarity for some reason.

        But that one is totally subjective. I don’t think it’s even worth speculating on. The only piece of data I’d put forward in support of Eclipse is that I can run it anywhere. Oh, that and it takes < 9.5GB of disk space (which I was shocked to see the latest build of XCode took. That's just insane).

  66. [...] Android vs iOS: A Developer’s Perspective « Whereoscope Blog [...]

  67. [...] [article] Android vs. iOS: A developer’s perspective Somewhere inside Apple, there’s a guy who is receiving untold, nay, unspeakable pleasures by inflicting on the development community a kind of suffering that is as acute as it is pointless. That pain comes in the form of a series of hoops that one is forced to jump through in order to turn your phone into a development handset. There’s provisioning profiles, ad-hoc builds, certificates, and countless screens that I clicked through, not really caring what they did, because they brought me closer to being able to run my code on my phone. On Android, you check one option in preferences. That’s it. [...]

  68. [...] 3. Android vs iOS: A Developer’s Perspective [...]


  70. [...] subscription is linked to your email address, so if you change iPhones or switch from iPhone to Android, your subscription will still be the [...]

  71. Lars Vogel says:

    Thanks for the summary. I also hope that the open platform will win against a “I tell you what to do” platform.

  72. [...] Aquí os dejo el enlace: Android vs iOS: A Developer’s Perspective. [...]

  73. [...] adoption Android has engendered. James Gregory, co-founder of application developer Whereoscope, talks about the shift from iOS thinking to Android thinking: Whilst I still think that Android’s initial [...]

  74. [...] engendered. James Gregory, co-founder of application developer Whereoscope, talks about the shift from iOS thinking to Android thinking: [...]

  75. [...] adoption Android has engendered. James Gregory, co-founder of application developer Whereoscope, talks about the shift from iOS thinking to Android [...]

  76. Tony says:

    A word on the “openness” of Android vs iPhone – while it’s true that Android doesn’t have the upfront review process, they will suspend a developer’s account without notice or warning, and importantly will not give a reason for doing so, other than saying that the terms and conditions have been breached. It’s then up to the developer to work out why/how/when/really?! Too bad if you disagree; there’s no reversing the decision, no discussion entered into.

    I’d rather be told upfront what’s acceptable. There’s no worse feeling than having a successful app in the store for a couple of months and then having it disappear all of a sudden.

  77. Ron says:

    You have just stated exactly what so many others of us were thinking when we undertook iOS development. But you said it much more eloquently than I am capable of doing. Thanks for the post.

  78. [...] become a verifiable genre of blog post unto themselves in the last few years (further examples: 1, 2). These subjects have been covered quite a bit, so I’m not going to rehash what I think is [...]
