Sordid tales of Flash, Java, Blu-ray, and other things Apple doesn’t want to play with anymore
Almost every day lately, yet another story comes to light about Apple deprecating or declining to support major technologies. This week was particularly dramatic – the week Apple deprecated Java and removed Flash from Mac OS X. The summer was replete with Flash references all over the press, leading to an antitrust inquiry into Apple’s practices. Blu-ray support has been teased for several years. Of course, it is the rare exception when Apple comes right out and says it categorically will not support something. It prefers to string its customers along like a politician for as long as possible. It wants both sides to buy its products.
If you’ve watched Apple closely long enough, you already know what it’s going to do. Apple will try its best to sideline these three technologies while it finishes building its fully walled garden, in the hopes of ensuring that no future technology like them comes along later for it to defend against. Each of these three technologies has managed to leak onto its platforms in some way despite Apple’s attempts to stop them. Apple wants to ensure that can’t happen again while also plugging the vestiges of the leaks, and this week it made major progress towards that goal. (Note: I may use ‘platforms’ in the plural in this post, but Apple has only one platform, which we’ll call iOS for now since that’s Apple’s latest name for it; however, that means I am defining it more broadly than they do at present – when I say iOS here, I generally mean the entire ecosystem of Cocoa-based operating systems from Apple.) It’s the same OS on the iPhone, on the desktops where it’s called Mac OS X, and even on AppleTV, where Apple hasn’t yet publicized that apps can be written for it. They are separated only by minor source code branching internally.
Let’s review the history of each of these technologies and how it relates to Apple’s platform.
Adobe Flash has always been a bastard child. It was a hack that allowed simple animations and, later, quality web video that needed a more dependable system than the web browser clusterf*ck of the late ‘90s and the early part of this century could provide. Nobody really likes Flash, but it did solve a set of major deficiencies in the web in a way that didn’t require glacial standards bodies to spend a decade debating. Flash was slow, it was poorly maintained, it was rife with security flaws, but fundamentally it was needed by many sites, it dramatically improved the visual architecture of the web, and there was no competing web technology to challenge it. Little of that has changed since then. Meanwhile, HTML5 is not a serious replacement yet, and the rate at which a maximally political standard like HTML5 can be improved to compete with Flash means that won’t change anytime soon.
During the early days of Mac OS X, Apple was weak. It was in its resurrection period, with Jobs taking over a deeply troubled company. It was in no position to make demands. Adobe was acting like a typical corporate enterprise by de-prioritizing software for Apple’s platforms, assuming Windows would be the future based on Apple’s near-death experience. Flash, however, ran just fine on Mac OS X, and Apple included it by default on all Macs – because users expected it, everyone used it (though few users even realized it was what made many websites work), and Apple needed to embrace as many partners as it could back then.
Time passes. Apple gives birth to a little ego boost known as the iPod and later the iPhone. The iPod was more of an experiment (and a rush job) – it represented no strategic progress on the software front beyond iTunes on the desktop. Not to take away from the iPod’s success, but it’s the iPhone that changed everything in 2007. What seems opaque to many is that the iPhone is almost pure Mac OS X. It’s almost exactly the same thing running on your desktop Mac. The iPhone was a direct port of Mac OS X to a mobile processor with a new user interface and a telephone application (yes, that’s somewhat simplified). The success of these products allowed Apple to adopt a significantly less submissive posture as it skyrocketed towards its current position as the second largest company in the world.
The iPhone was pretty clearly going to be a runaway success. The idea that Blackberry’s glorified ‘we’ve combined a calculator, pager, and telephone!’ OS or Symbian’s ‘so convoluted and badly designed it might as well be Windows’ OS would survive the onslaught of the integrated iPhone ecosystem and full Mac OS X underpinning that Apple had brought to the mobile game was comical to those who understood the underlying technologies.
Meanwhile, the initial iPhone had no “apps,” so the lack of Flash, Java, and anything else seemed perfectly fine for a mobile phone, considering that the major competing platforms at the time, like Blackberry, could hardly reproduce any web page, much less make a beautiful rendition with multitouch. Apple’s strength was its platform – a full desktop platform, though under major memory and CPU constraints on the iPhone, so Apple did have to be very careful about what it allowed to run. Blackberry, Symbian, and Windows Mobile were all based on true mobile operating systems, so they had to build upwards – a path that eventually became untenable, because rebuilding web browsers and such from the ground up always results in an underfeatured browser. Apple was able to build top-down: it already had the web browser running on Mac OS X and just needed a new user interface that removed features rather than adding them. That’s quite a bit simpler.
A company in Apple’s 1997–2006 submissive mode could simply have been creative to ensure that Flash content ran on the OS. Apple could have worked with Adobe (or just implemented it themselves) to make portions of Flash run – for instance, just the video elements, while skipping the extended Farmville-style animations and sprites that are the real dogs of Flash. Suffice it to say that making Flash “run” for the most part on the iPhone has always been fairly easy without any of the problems that Apple purports could occur – and this path could still be taken at any time. There are many gray areas of partial support to exploit and reasonable ways to ensure security is maintained, if one is so inclined. The question has always been whether Apple wanted it to happen or not. It’s not an engineering, performance, or security issue. Do not be fooled. Yes, such issues exist in theory for a nonsensical direct port, but a proper design for mobile can of course eliminate them. All of this should now be obvious in retrospect, as Google’s Android platform has broadly deployed Flash on the same kind of hardware as the iPhone. The fact that my wife’s phone has Flash yet she doesn’t care, while I care a great deal but can’t get Flash, tells me something is very wrong.
It all goes back to the platform. The one piece none of Apple’s competitors had is its platform – iOS. Apple already had a first-class web browser (which supported Flash on Mac OS X, because anyone could build software for Mac OS X and add extensions to its browser). Mac OS X encouraged openness as an OS because it came of age during Apple’s make-friends period. Apple spent many years embracing the ‘community’ as part of the resurrection. Mac OS X is based on ‘Darwin’, which is another name for NeXT’s Unix-based OS dating way back into the ‘80s, renamed Darwin as part of an ‘embrace the community’ open-sourcing effort in the resurrection period.
Apple adopted Flash as a core component of Mac OS X shipped on every system. Apple adopted Java and dedicated engineers to it. Both of them were supported despite their many years of endless security flaws. Apple wanted to embrace and make friends. It sorely needed them. Apple made Java a mostly first-class development path for Mac OS X. Fundamentally, Apple needed and succeeded in creating a full alternative to Microsoft Windows that supported all the esoteric technologies and sub-platforms users wanted like Flash and Java.
In 2005, at the latter end of this make-friends period, Apple joined the Blu-ray consortium. This was entirely a defensive move, as most of the standard was already in place. Microsoft had managed to get its VC-1 video codec (basically a codeword for Windows Media) into the Blu-ray spec, and Apple feared getting left behind. Apple didn’t like Blu-ray. They never did. But they felt they had to get in front of the spec or get left behind, as Blu-ray was adopting many potentially competitive technologies. QuickTime and H.264 video were a major push back then, as Apple saw them as strategically consistent with its iPod roadmap.
Of course, Microsoft was behaving in exactly the same disingenuous way, inserting its standards into Blu-ray even though it would later play no role in Blu-ray – even preventing the Xbox from using it. Blu-ray may have adopted VC-1, but Java as the core platform was still too much for Microsoft to swallow after so many years of ‘embrace and extend’ Java wars with Sun. That same year, Blu-ray ratified Java as its platform. You could argue this was a victory for Apple, because Apple had long ‘supported’ Java and at least Blu-ray hadn’t adopted some more odious Microsoft solution as it had with codecs, but in reality it was just another platform threat to Apple, and they had to see it negatively. We can now see in retrospect that, despite Apple succeeding in getting H.264 on the mandatory list for Blu-ray, virtually everything is actually encoded using VC-1.
If Apple was to support Blu-ray, every technology adopted for Blu-ray would need to be licensed and implemented by Apple on their desktop systems. So the risk here was pretty high if certain anti-Apple technologies had been adopted. As it is, VC-1 and Java together were a formidable set of what Apple saw as anti-Apple technologies. They represented a video format that had already defined a non-Apple DRM system and a platform over which they had no control. Apple’s favorite H.264 was in there somewhere but the rest of Blu-ray was a confusing mess of non-Apple standards, or a ‘bag of hurt’ as Steve said. That meant it was a threat to the iTunes media ecosystem (DRM) and the iPhone apps ecosystem (Java). The only reason to implement it would be if users demanded it and Apple had to extend an olive branch to those users as it frequently needed to do during the make-friends period. It’s safe to say it has been years since Apple felt the need to do what users wanted rather than whatever is seen as building the larger ecosystem for Apple itself.
Beyond one positive reference in 2005 by Jobs shortly after joining the Blu-ray consortium, essentially nothing has been done on Blu-ray by Apple. It’s clear they either changed their mind or never intended to implement Blu-ray. In 2010, at an Apple internal all-hands, Jobs was asked by an employee when they would introduce Blu-ray. The response was that it would happen after Blu-ray took off in the marketplace. This sounded particularly disconnected from reality, as most people in the world were and still are under the impression that Blu-ray already took off. At around 20% of US households with a 70% year-over-year growth rate, it’s safe to say Blu-ray will be here a long while. The argument that it hasn’t taken off is the distortion field at work. If you say it enough times, maybe it will become true?
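To see why the “hasn’t taken off” framing rings hollow, here is a back-of-envelope projection using only the figures cited above (roughly 20% of US households, roughly 70% year-over-year growth). The damping of growth as the market saturates is my own illustrative assumption, not real market data:

```python
# Rough Blu-ray household-adoption projection.
# Starting point and growth rate are the figures cited in the text;
# the saturation damping is an assumed, purely illustrative model.

adoption = 0.20   # ~20% of US households (cited figure)
growth = 0.70     # ~70% year-over-year growth (cited figure)

for year in range(1, 4):
    # Growth slows as adoption approaches 100% of households.
    adoption = min(1.0, adoption * (1 + growth * (1 - adoption)))
    print(f"year +{year}: ~{adoption:.0%} of households")
```

Even with growth damped aggressively, adoption reaches a majority of households within a few years – hardly a format that failed to take off.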
It’s the kind of silly semantic debate that would never have taken place in the make-friends Apple period. If the iPhone had never been introduced, we would have seen Blu-ray support on Mac OS X in 2008. There hasn’t been a competitor for Blu-ray in years. HD-DVD was declared dead by effectively the end of 2007. Apple had expressed public support for Blu-ray in 2005. There was no competition, and essentially there still isn’t. Only someone who has never travelled outside of major cities with broadband Internet would think downloading movies could be a viable replacement for Blu-ray anytime soon (sure, downloading movies is a lovely additional feature when you can use it and don’t need full quality). On a trip to Egypt a few weeks ago, I was happy and surprised just to find power and water, much less high-speed Internet drops. On a cruise, I couldn’t even watch a fairly static security camera stream at home, much less 720p streaming video. Watching movies on laptops is a primary use case, and it’s not clear which decade will provide reliable broadband worldwide, but it sure isn’t this one. The idea that Blu-ray hasn’t ‘taken off’ or has been ‘beaten’ by downloads is clear distortion.
By late 2007, Apple had finally realized it was all about its platform, the media that played on it, and the applications that ran on it. That was what would allow it to compete long-term. The platform covered every major front end: the Mac OS X incarnation covered the desktops, iPhone OS covered mobile, and now iOS covers TV (iOS arguably being the name used for all three of them). Blu-ray was simply another platform that wasn’t Mac OS X and would thus cause problems down the road. Whether it was licensing issues with Microsoft codecs or the Java platform, anyone who really knows Apple should have seen back then that Apple would find a way never to support Blu-ray if it could. Apple no longer needed its friends as much as it had in the resurrection. It needed to solidify the walls around the garden it was building that made virtually everything not invented at Apple a threat.
In the resurrection period, Apple was quick to embrace other technologies. Before NeXT was acquired in 1997, NeXT had developed a web-based platform known as WebObjects. This was simply another Cocoa platform based on Objective-C. It was an insanely expensive solution for developing enterprise web apps. In 2001, Apple transitioned WebObjects from its native platform technologies to Java. This was due to market pressures, as Java was becoming very popular for enterprise server technologies at the time. Apple didn’t have the market strength necessary to hold the line on Objective-C back then.
Over the next few years, Apple increasingly open-sourced WebObjects and moved it out of its core technology set. It was soon made free and was eliminated from the product line by 2007. WebObjects, in its original conception, was a way to write native Cocoa applications for the web. Once it went Java, it was no longer strategic. It wasn’t part of the iOS platform. It was simply another way to develop Java apps for the web, and Java was increasingly contrary to Apple’s strategy of native Cocoa applications – a strategy driven by the fact that Apple could exercise ultimate control over that entire ecosystem.
For the iPhone, Apple needed something very similar to the original native WebObjects it had just spent six years killing. For the first iPhone, Apple introduced a silly solution for writing iPhone ‘web apps’. That was simply a sleight-of-hand trick to avoid admitting ‘we don’t support apps yet, but feel free to access websites.’ The fact that they were able to convince some in the drooling press that these represented ‘applications’ remains an amusement. Internally, rest assured, there was never any mystery that real applications and real developers would be forthcoming. It was a matter of engineering all the infrastructure, such as the App Store, and finishing public APIs to ensure Apple had control over what they were now certain would be their ace in the hole: their platform.
They couldn’t just pre-announce products or detail their entire future strategy, as that would have terrified all the partners they’d spent years cultivating. The most ironic part is that, had Apple supported Java on iPhone OS, these ‘web apps’ could easily have been much more functional with a well-developed, Java-embracing strategy. The fact that Apple did not enable that should have made their plans clear to observers. Apple soon introduced the ability to write full native Objective-C Cocoa applications for iPhone OS – as long as you didn’t use Flash or Java (thus ensuring, among other things, that your source code, not just your app, would work only on Apple’s operating system and development environment).
Apple’s platform is unusual in that nobody else uses any form of it. Building something for it requires a complete code rewrite to port to anything else in the industry today – entirely true at the user interface layer and often true for backend code. To give credit where credit is due, I’d also note that Apple’s development environment and language are arguably much better on the whole than any of its competitors’, but the walled garden is certainly enforced by the fact that every single technology in the ecosystem is fully Apple-controlled.
While I made the point above that enabling Flash on the iPhone would be fairly easy if Apple wanted it, enabling Java would be literally trivial. From a technical perspective, the fact that it is not supported is pure politics. Java has been ported to and deployed on many small devices without issue. With Flash, at least, one does need to redesign some things and likely drop some features altogether. The lack of Java isn’t a technical issue.
Java represents a foreign web technology that doesn’t advance Apple’s platform. Blu-ray’s core platform is Java. WebObjects had become polluted by Java a decade ago, when Apple wasn’t treating the platform as its core strength and needed market friends. Apple’s cornerstone technology is Objective-C Cocoa, not Java. Flash and Java compete with Cocoa today. Apple wants to eliminate these other technologies it sees as threats. More recently, Google has based its Android operating system on Java, and its market share has been going through the roof. It represents the first very credible threat to the iPhone. This has of course only increased Apple’s anti-Java resolve and likely led to this week’s dramatic Java deprecation announcement – a situation really unthinkable not too long ago. Being everybody’s friend and the supporter of Java is no longer in Apple’s best interest. Java is not only a platform threat to iOS, but now that Oracle has acquired Sun, Oracle has demonstrated that it’s going to be a troublesome parent that could sue anyone at any time for specious reasons. Apple could just as easily have used Oracle’s lawsuit against Google as a good reason to deprecate Java. That kind of legal uncertainty is just not tolerable for Apple when Java is a platform competitor and there is very little reason to include it on client platforms – users barely notice its absence on the iPhone, unlike Flash.
Android is bursting Apple’s bubble around its walled garden. For the first time since the iPhone introduction, Apple feels a very real threat. In September, it took the inconceivable step of relenting on some of its many restrictions by allowing apps that include Flash-based components. After impassioned anti-Flash arguments from Jobs only a few months before, Apple actually allowed some forms of Flash in the store.
It’s very important to see what that September announcement is not, however. It is nothing like Apple allowing Flash on its platform. No. If you write an Objective-C Cocoa app that happens to include a segment that runs in pre-loaded Flash (you can’t download it from a website dynamically – it must have been part of your app from the moment it appeared on the store), that is allowed. But that was never what people wanted. That was only interesting to the few developers who had gone down that path. Flash itself, and especially Flash in the browser, is still completely off-limits. So really, there is nothing to see here. Nothing relevant changed.
There is a set of platforms, each with its corporate backers, against which Apple is defending. We’ll ignore Oracle. Google is the Java platform, but with APIs entirely unique to Google’s environment. RIM also uses Java, but again entirely customized for its environment. Microsoft now has Windows Phone 7 with its old-school Microsoft .NET platform. Adobe is just a vendor, but Sony has adopted Chumby for its Dash platform, and Chumby is just a Flash environment. So you see that each of these technologies like Java and Flash (nobody cares about .NET anymore, sorry) isn’t just a foreign execution environment – they’re actually market-level competitors for Apple the company, and a threat to its entire platform.
Interestingly, all of the competing platforms are non-native. They’re all interpreted platforms (which generally means slower) like Java, Flash, and .NET. Yes, some try to get around this via compilation optimizations, but any real technologist knows that if you want performance, you want native code, and only Apple plays there. For the record, Android has a hacked native code solution, but it’s definitely a second-class citizen – you sure aren’t going to write your whole app in the NDK, as it’s just not possible.
Meanwhile, Android runs Flash as well. Amusingly, despite being based on Java, Android does not run Java apps. This is purely a market-forces issue. Flash is becoming a major competitive feature to use against the iPhone, so Google wanted it right away. Running client-side Java apps is entirely irrelevant, as almost no websites use that. (Java as a technology never went mainstream anywhere other than the enterprise Internet-application space it took hold of back in the WebObjects days – not counting essentially proprietary uses like Android, of course.)
Android’s use of Flash does however blow up many of Apple’s arguments against it. Now that the Emperor is wearing no clothes, what is left other than to admit it doesn’t want Flash because it sees Flash as a competing platform threat to iOS? Flash and Java (and by corollary, Blu-ray) are other people’s toys and can’t be depended upon. If we play with them, the toys might get taken away, their parents might get mad, who knows what crazy thing might happen!
If cross-pollination of runtimes occurred, apps would be more portable between platforms, but portability is not good if you’re trying to build a walled garden where toys can grow up without foreign influences and lawsuits.
Apple doesn’t just have an ecosystem now. It has evolved into an economy. Apple has essentially all of the mobile developers. It has built a system where those developers can make money off of Apple – though of course they give up the ability to make their apps easily portable. The implicit message: we want you living off Apple, and ideally depending on it. If you’re too big, or try to impact the platform itself, all bets are off.
I admit I both love Apple and want to see all three of these technologies supported on both Mac OS X and iOS (yes, Blu-ray support in iOS would be necessary for a complete AppleTV product). For better or worse, I know enough about these three to realize there are no real obstacles to that happening beyond politics.
I have to wonder whether Apple, at its newly grand size and flush with incredible amounts of cash, really has to worry so much about its walled garden anymore, though. It’s all very convincing to look at this from Apple’s perspective and see why one would try to eliminate these three technologies, but Apple is now in a position to outmatch Google dramatically on the openness front. If Apple were to allow these technologies in the appropriate places, it seems to me it would only solidify its position as market leader. As it is now, Apple creates holes in the market by holding what is so clearly a political line on technologies that users want or need. Google and others are trying to fill these holes, more or less successfully as the case may be.
If that short-sighted protectionism stays in place, I fear Apple may be shooting itself in the foot.