Apple can own I/O to own the universe (diff.substack.com)
115 points by ejz | 2021-04-05 19:57:15 | 94 comments




This article has an odd, simplified, and biased description of history. Apple pushed USB hard and early. The article carefully never claims Apple owned FireWire exclusively--it was a partnership with Sony and Panasonic that had certain benefits.

I'm not sure what alternative to the 30-pin connector the author would have been pleased with. Personally, I'm very glad it wasn't micro or mini USB. The Lightning port was shipping years ahead of something like USB-C. Sure, it would be nice if they switched now. I'm sure even if they used a standard port they'd still have a certification program because Apple loves control. I'm confident the licensing money is just a perk. Look at the iPad Pro. They used USB-C instead of extending Lightning.

Bluetooth headphones had been disappointing for decades. After being disappointed over and over I had settled on completely proprietary digital wireless headphones (much like wireless mice until BluetoothLE). Then Apple started making some great headphones that added a bit of "magic" around Bluetooth. I hear AirPods work great outside of Apple's ecosystem, too. They still have some of the "warts" of Bluetooth, like a battery-draining, low-latency, low-quality mono mode for phone/chat and a higher-quality mode for music, and they sometimes get stuck in the wrong mode--but it's nice they're using a widely supported standard.


> The Lightning port was shipping years ahead of something like USB-C. Sure, it would be nice if they switched now.

If Apple can fix their supply of USB-C ports so they uniformly, consistently last for more than just 6-12 months (right now some do and some don't), I'd be on board for switching. As it is, it is frustratingly common for Apple Lightning plugs to wear out the fingers and for Apple USB-C ports to lose their "click" (even when just standing in a laptop cradle on a desk and otherwise covered with a port plug when carried around).


> it is frustratingly common for Apple Lightning plugs to wear out the fingers

I have only experienced this with 3rd party cables, but never ones made by Apple.


Another reason why Lightning is inferior to USB: USB cables can be cheap and of good quality.

You're being heavily downvoted but this is my experience as well.

Even Lightning cables from "alright" brands like Amazon Basics have lots of issues, whereas the only microUSB cables I've had issues with are the really bargain basement stuff that comes bundled with aliexpress junk. It seems the required tolerances for Lightning are way higher (=more expensive).

I don't have enough experience with cheap USB-C cables to make generalizations there though.


I haven't heard of or experienced those issues directly =/ The only issue I've hit myself is lint getting packed into the port which is fairly easily cleaned but annoying if you don't have a tool handy. I have heard lots of stories about the cable fatiguing. My suspicion is that's because of using the phone while charging; heating and bending the cable near the connector. I haven't experienced this (I tend not to charge while using it) but have friends that do.

I haven't seen or heard of an issue with USB-C ports. My beefs are that there aren't enough of them, you can't get more with a hub, and generic power supplies and cables have only been available in the past couple of years. The older devices I had with USB-C I got rid of when they fixed the keyboard. I definitely had them longer than 6-12 months.


> it is frustratingly common for Apple Lightning plugs to wear out the fingers

This is a feature. If the metal inside the port wore out instead repairs would be a lot more expensive.

I’ve also only experienced lightning plug wear with cheap Amazon Basics or gas-station cables. The Apple lightning cables have much better made plugs, but they have a different flaw: the super-thin outer cover detaches from the connector housing over time, exposing the shielding wire.


It was also one of the major benefits in the (fast) transition from mini USB to micro USB! Mini USB had a bad habit of wearing out the port instead of the plug.

In Lightning, the moving/springy parts are in the port, not the cable (unlike USB-C where it's the cable that wears out)

I have an iPhone and AirPods, both with the Lightning port. I charge them every night with $7-$10 Qi chargers from Amazon. I've read that charging wirelessly is slower, but that's supposed to be better for the battery.

Rarely do I use a Lightning cable anymore. I bought an Apple Watch and there's nothing but an option to wirelessly charge it. Before that I would have been skeptical about the idea that Apple is not going to switch to USB-C but instead just use wireless charging. Now it seems reasonable.

PS: I had used various Bluetooth headphones before. The killer features of the AirPods Pro are their small size (I can fit the case in my coin pocket) and the ability to skip 30 seconds ahead in podcasts.

I had used various Garmin Vivofits and loved that they lasted a year on a coin battery or two. But I found that "completing my circles" on my Apple Watch was just enough gamification that I'm moving significantly more than I did before I got the watch. It does make me understand why some health care organizations give them out to members to try and get them to move more. I switched from the SE to the 6 because the SE takes 2 1/2 hours to charge instead of the 6's 1 1/2. It's still annoying to have something that requires such attention to keeping it charged and then charges so slowly. But the trade-off was worth it for me.


> Personally, I'm very glad it wasn't micro or mini USB.

Why is the 30-pin connector better than a well-made micro USB connector? As a consumer, I can think of a number of advantages going with the standard USB port.


> Why is the 30-pin connector better than a well-made micro USB connector?

Apple's 30-pin connector was released in 2003. And was undoubtedly in development for at least a year or two before that. It served Apple for nine years until 2012. It carried audio, video, FireWire, USB, and also charged connected devices.

In the span of those nine years, USB:

- didn't have a battery charging specification until 2007

- didn't have a power delivery specification until 2012

- deprecated mini-A and mini-B in 2007

- introduced a new connector in 2007

USB has always been a mess. Apple's connectors are the stability that people should strive for, not deride.


Except in the hands of kids, who manage to destroy cables readily--the original iPad's 30-pin and even Lightning. USB-C is way better.

My kids have destroyed countless Lightning cables and micro USB cables. They have also destroyed 3 Amazon Fire tablets and a Nexus tablet by yanking on the cable, but they've never destroyed any Apple devices this way. That's why the Lightning connector is a better design.

I thought the Lightning adapter was a big improvement, but my kids have managed to break the tip off multiple times in multiple iPads... fortunately I've been able to get the pieces out. The other issue is that for some reason, after too much use, they fail to connect/charge, which isn't always a cable issue but something intrinsic to the port. The other thing I found is my kids have plugged the Lightning adapter into an electrical outlet on a surge protector... they did this with USB-C too, so there's some confusion there.

USB-C seems more robust but it may have other issues I haven't discovered yet.


Breaking the tip of the cable is better than breaking the port.

As you have proven, it is usually possible to get those parts back out of the port, which is hopefully undamaged, even though the tip of the cable broke off inside.


> The other thing I found is my kids have plugged the Lightning adapter into an electrical outlet on a surge protector...

Yikes! Oh well at least they are protected from power surges...


USB-C was finalised in 2014, two years after Lightning was shipped (so possibly up to four years after Apple had started development of Lightning).

It also requires a larger port than Lightning, so you can see why Apple would be okay with adding it to iPads, but not to iPhones.


The 30-pin connector did predate micro USB by several years. If you're talking about tech features, the 30-pin supported (at various points) FireWire, USB, play controls, line-level analog audio, and video--for example, there was a 30-pin to VGA adapter. AFAIK micro USB isn't capable of almost any of these things. I believe things like USB audio came later and the 30-pin cable supported that, too.

Even though the connector is much wider I found it way more reliable and easier to plug in without looking at the bottom of the device. It also seems way sturdier when jamming your player into a speaker dock. You mention "well-made" cables. Micro USB is now 14 years old and I don't know of a reliable brand (maybe Anker?). I still constantly throw out cables not knowing if they're broken, not rated for use, or what.

I love standards. I don't know why it took decades to come up with a reversible connector. I've been waiting for USB-C to take off for a while. Sadly, cable compatibility seems to be a mess (between power, speed, Thunderbolt compatibility, etc.).


I absolutely loathe micro USB. I am constantly having to throw away cables that are a few months old because they no longer reliably charge my devices (PS4 controllers, Fire tablets, Bose headphones), and that's with me buying the most reliable cables I can find (Anker). Micro USB really is terrible...

I never ever had a micro usb cable fail.

I often had problems with USB C in my previous phone on the phone side.

Apple cables fray almost immediately.


My issues have never been the Micro USB cables, but the ports. They seem to often not be solidly secured into the device. (Amazon's Kindles' ports are pretty rock solid, to their credit.)

IIRC the major innovation for USB-C is to put the spring loaded component into the cable rather than the device. This is brilliant, since springs wear out over time and it’s much easier to replace a cable than a device in most (all?) situations.

Micro-USB-A also has the movable part in the cable (the two little teeth on the flat side). You might be thinking of Mini-USB-A.

Yes, I was wrong. I assumed that micro USB A and B did not have spring components in the cable. It's only the pre-micro USB cables that lack this improvement.

The problem with the ports isn't (in my experience) with spring-loaded components wearing out: rather, like ceejayoz said, I've seen a number of cases where the port just... moves around inside the device, so that you can't plug the cable in properly. Either it gets offset, so that the port is simply inaccessible through the slot in the case, or pushing in on it just makes the port itself push further into the case. (In that case, one can generally force it in a few times before things start to really break within the case.)

I know of no specific reason why this should be more common with micro-USB than with other connectors, but I've only seen it happen with micro-USB in person.


It's more likely the solder is cracking off the PCB. Jiggling the port around forces it back into good contact and it'll sporadically work.

The real problem is we don't bolt the ports to the chassis in most cases; we just put them on the PCB with little to brace them against the direction of applied force.


As it turns out, this is addressed in the class specification. I wonder if it's the spec that's deficient or whether the devices in question are non-compliant.

See page 40: https://www.usb.org/sites/default/files/CabConn20.pdf


My first "hack" as a teenager was soldering a USB cable into the place the USB mini port was on a palm I was using as an ereader. The port would just become detached from the board if you looked at it funny; so this worked pretty well. I could just plug the whole device in to charge from then on.

Then I realized that the three volts supplied by a pair of AA batteries were really more of a suggestion. That summer, lots of gadgets I had became "USB powered".


Apple cables used to be the worst cables when it comes to fraying. I had never had a cable fray except apple cables, but I also haven't had an apple cable fray in a few years now. It seems like whatever terrible way they used to make the cables has been fixed

> After being disappointed over and over I had settled on completely proprietary digital wireless headphones (much like wireless mice until BluetoothLE). Then Apple started making some great headphones that added a bit of "magic" around Bluetooth. I hear AirPods work great outside of Apple's ecosystem, too.

It's funny that you mention Bluetooth LE for mice. Audio has been stuck on classic Bluetooth due to bandwidth requirements. But that's fixed with Bluetooth 5, which gives us audio over BLE. There are a bunch of functional improvements other than low power radio. Lower latency and better pairing for instance. I think BT 5.0 headphones will be a huge improvement.

Given the timing of the release of AirPods and the development of the BT5.0 spec, I've had a theory that Apple borrowed features from BT5.0 in the case where you use AirPods with Apple products.


Everyone I've talked to who's worked on Bluetooth hates it.

Except BLE, which I think is a different and simpler protocol.


I concur with this. I helped build a Bluetooth-compatible health device for a hackathon, and I was responsible for the Bluetooth integration. Coming into this knowing absolutely nothing about the Bluetooth spec and not knowing the difference between Bluetooth and BLE, I tried to use both and only successfully built a BLE integration (as it turns out, it was the right protocol for transmitting low-dimensionality biometrics).
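For what it's worth, once the BLE plumbing works, the application-side part of an integration like this is mostly decoding a characteristic's byte payload. Here's a minimal sketch in Python of parsing the standard GATT Heart Rate Measurement characteristic (0x2A37); the function name and return shape are my own, but the byte layout follows the Bluetooth Heart Rate Service spec:

```python
import struct

def parse_heart_rate_measurement(payload: bytes) -> dict:
    """Decode a GATT Heart Rate Measurement (0x2A37) value.

    Field layout per the Bluetooth Heart Rate Service spec:
      byte 0: flags -- bit 0: HR is uint16 (else uint8),
              bit 3: energy expended present, bit 4: RR-intervals present
      then the heart rate, then the optional fields, little-endian.
    """
    flags = payload[0]
    offset = 1

    if flags & 0x01:                      # 16-bit heart rate value
        (bpm,) = struct.unpack_from("<H", payload, offset)
        offset += 2
    else:                                 # 8-bit heart rate value
        bpm = payload[offset]
        offset += 1

    energy_kj = None
    if flags & 0x08:                      # energy expended (uint16, kilojoules)
        (energy_kj,) = struct.unpack_from("<H", payload, offset)
        offset += 2

    rr_s = []
    if flags & 0x10:                      # RR-intervals in units of 1/1024 s
        while offset + 2 <= len(payload):
            (rr,) = struct.unpack_from("<H", payload, offset)
            rr_s.append(rr / 1024.0)
            offset += 2

    return {"bpm": bpm, "energy_kj": energy_kj, "rr_s": rr_s}
```

With a BLE library such as bleak, something like this would sit inside the notification callback registered on the 0x2A37 characteristic; the decoding itself is the same either way.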

Bose also makes great Bluetooth headphones, but only the high-end ones. Cheaper Bluetooth headphones (of any brand) are awful.

> I hear AirPods work great outside of Apple's ecosystem, too.

I worked as a "technical expert" at apple during school (recently), and I can say this is not exactly true. The biggest problems with airpods were invariably customers trying to use them with non-Apple devices. As far as I understand, airpods are unable to update their own firmware without being paired to an Apple device. Pairing was also much, much more difficult with airpods than standard bluetooth headsets.


It's always worth noting that Apple played a significant role in creating USB-C. There have been (unproven) "rumors" that Apple invented USB-C, but regardless, a quarter of the engineers listed on the (initial) USB-C spec working group were Apple engineers.

It's somewhat the reverse, in a way. Apple, and other companies, clearly would benefit from a once and always connector that far exceeds the demands of whatever USB will need in the future...

... except Lightning is the production version of an early rejected Type C prototype. It was rejected because external facing pins are a common and well known fault point in connector designs.

Those external pins on Lightning plug tabs are the most common failure point of their cables.


Do you know what the trade offs in terms of cable vs design reliability are for USBC vs Lightning out of interest?

> except Lightning is the production version of an early rejected Type C prototype

Do you have a source for this? Curious to read more about it.


It's better than a failure of the port on the device.

Apple not only pushed USB (a standard created by Intel and universally ignored by PC vendors until the candy-colored iMacs created a sensation), but with it replaced the Apple Desktop Bus (ADB)--a proprietary serial connection.

Doesn't fit the "Apple is a walled garden" narrative though :p


> Sure, it would be nice if they switched now

I don't know. It has a nice ecosystem around it. Everyone has a bunch of lightning cables everywhere now. Just like the 30-pin it's not perfect, but is so widely used there's one almost everywhere you need.


To "airpods" example - I think that it is more about Apple making great product than Apple enforcing control. Better eco system control is more of useful side effect.

There are other players who can make great products, like good-quality headphones with standardized components, because they care about the product. Although it's not common behaviour, it means it can be done. So I think most manufacturers just don't care enough, probably because most people don't care either :-(


> most people don't care either

There's the slight problem of having no practical way of knowing which set of headphones will work properly; even most reviews skip these crucial details and just talk about the "warm sound" etc.

If the headsets were marked with things like "kinda works as long as you hold it right", "works ok as long as you have a Samsung phone", "works flawlessly when paired with both your phone and your computer at the same time" at the time of purchase, I'm pretty sure people would let it affect their choice.

That's one of the benefits of staying within your ecosystem of choice; if it doesn't work properly Samsung or Apple can't blame it on someone else - it's their problem to fix.


> To "airpods" example - I think that it is more about Apple making great product than Apple enforcing control.

You cannot set up a HomePod without an iCloud account (over and above an Apple ID).

You cannot install even free apps on iOS without an Apple ID, which requires a telephone number.

It is indeed about control and the acquisition of identified user behavior data. Apple has growth targets to hit and only so much can be done via hardware sales. Services revenue doesn't grow automatically without deep, intense data on a user's habits.


[Citation needed]

What's the proposed mechanism here? How do you believe Apple is using "identified user behavior data" to grow services revenue? Since, unlike their major competitors, they don't actually do targeted advertising.


This is false, the ads in the App Store vary by at least location and I would be very surprised if they didn't vary by your purchase history.

Yes, you cannot do that and I would prefer that they enable these features too.

But really, for most lay people it is better, safer, and more convenient to just use Apple's chosen method - connecting everything with one account/Apple ID. And I think the *primary reason* is a better product for users. Overall it is easier to control the user experience if you tightly control the whole platform.

I am not sure if it would even be possible to create a broad and well-integrated platform similar to what Apple has now with general openness. Is there an example of this in any other tech domain?

Counter-example: the Linux desktop is nice, functional, and very open, but so complicated and fragmented... it never took off and remains a niche among desktops so far.


The problem with the article is that it thinks wireless is simply a better chip (Apple Silicon), and that is it. It is not. It is the combination of software, chip, and antenna.

Apple could sell the W1 or W2. But they would have to provide firmware updates (as they do now for AirPods) to third parties, and Apple will definitely not provide the source code. And since Apple does not own or understand third-party implementations of their design, it makes debugging and quality control of MFi practically impossible. (They tried that with HomeKit.)

And with that, the whole point of the article was moot.


Yes, and: Apple's UX is getting better. Controlling all the devices has enabled better integration. Stuff like Handoff and Siri-capable multiple devices. Often still frustrating, but definitely improving.

This is what APIs and standards are for.

> So what will the new computing paradigm be? The industry seems to have converged on what Rick Osterloh at Google called “ambient computing,” which is a bigger version of IoT: pretty much everything will be a computer, and pretty much everything will be connected all the time in a way that doesn’t necessarily require human intervention.

If "the industry" (whatever that means) thinks that the future of computing is to have computers everywhere, either they have lost sight of reality or they are going to try dangerously hard to push this dystopian future.

Every previous computing paradigm came with some annoyances, but also with significant gains for humanity.

Where is the gain to anyone in having computers "everywhere"? What real problems does this paradigm solve?


No idea why you are getting downvoted for asking a hard question.

It's hard because we've barely made progress in any facet of technology except computing. Our systems are fragile and not future-proof. We haven't learned. We've gone from discovering the secrets of the natural world and modelling it to internet-connected toasters in less than a century.

This is called decline.


This implies that we've stopped trying to discover secrets of the natural world, which is the farthest thing from the truth.

Basic science is still going on. True, the US isn't as good at funding it as we used to be, but it hasn't stopped, and I doubt it ever will.

More importantly, the fact that we're making "internet connected toasters" (and various other experimental gadgets of questionable practical utility) does absolutely nothing to detract from the fact that we are still "discovering the secrets of the natural world".

There are a lot of people in the world. There's plenty of room for some to be doing amazing science while others make weird gadgets, no matter what you or I think of those gadgets.


> Our systems are fragile and not future proof. We haven't learned...This is called decline.

It’s a sign of technical immaturity. A Cambrian explosion of precarious innovation is standard for any new technology of any import (you can look back merely a couple of centuries to see the same in steam engines, trains, bridges, bicycles, automobiles, aircraft, pharmaceuticals...).

Be excited to live through one such!


> Where is the gain to anyone in having computers "everywhere"? What real problems does this paradigm solve?

You can find some answers by asking people how they use their smart speakers or watch.


So just to set timers?

Yes. Setting timers is pretty useful.

With the watch did you consider that the fall alert monitor can call for help? Or how the heart rate monitor can detect abnormalities that should be checked out?

None of this might interest you, but surely you can see how some people find it valuable.


This is a take I very much agree with. It's also borne out in modern innovations: the smartphone is the success it is not because it's ambient - but because it's personal. Small enough to always be on you, and powerful enough to do all the basic functions we need of a computer. It more or less is the basic defining digital boundary of our identities - and the reason why user controllable and ownable phones is so appealing to us tech-types.

I think the industry will push for smart everything, and I think it'll all be met with the increasing distrust and derision which has come from the realization that everything that has a computer is basically spying on you for someone else. If there's hope in this dystopic vision, it surely lies with the open source smartphone platform movements and reasonably egalitarian bodies like the EU pushing for consumer rights and standards.


This won't happen because it would require Apple to share their software as well as hardware. The seamless pairing and automatic handoff enabled by W1 and H1 relies on a very sophisticated software stack.

There's no way Apple is going to distribute that software to third parties, and even if they did there's no way they would be able to guarantee the quality and compliance of all of that on third party platforms. Apple is all about user experience and buggy flakey third party systems in the W1/H1 ecosystem is absolutely off the table.

The way to do this is for third parties to come together and come up with open standards for automatic pairing and seamless device handoff. The problem with that is I suspect it's incredibly hard.


Is there a convincing reason why this software can't be baked into the chip, and exposed to the containing device as digital output? I don't see one.

I find this comment amusing. I've been using a $5 pair of knock-off AirPods Pro with my iPhone and they behave exactly like the real deal; handoff and renaming work flawlessly. Pretty sure they don't have a W1 chip.

A lot of media has been repeating the statement "AirPods are a new platform", which I think is just plain stupid. Because even though they are a great product, a pair of earbuds simply cannot be a platform just because someone looked at the total revenue and compared it to the first iPods.

This however is a great write-up showing how the wireless connectivity that powers AirPods might be turned into a platform in the future, if Apple wants to broaden its tech rather than keep it as a competitive advantage.


AirPod Pros have changed the way I use my iPhone.

More podcasts, more Spotify, more YouTube/Netflix on the phone rather than through AirPlay / Chromecast, and more Siri.

Now if only more of Siri was processed on device (which is slowly but surely happening), I can see AirPods+Siri becoming a “platform”.


Would that have been the case with any non-branded wireless earbud, given the same performance? What makes you associate the AirPod to this change in behaviour?

Apple marketing combined with submarine advertising like the linked article here.

I don’t think this is submarine advertising. Apple is big enough, prominent enough, and loved/hated enough to have an active community of pilotfish analyzing and speculating (and a big enough community of readers to pay the pilotfish in likes and even dollars).

I’m sure Apple does use submarine ads, just don’t think this is an example.


As long as the earbuds were: small, wireless, don't fall out during exercise, active noise cancelation, long battery life (although AirPod Pros could have even longer battery life), simple portable charger, easy interface.

It also helps A LOT that I never have any issues with the Bluetooth (not previously true with non-AirPods I had used, and given up on).

I'm also ok with the current touch interface on AirPods. However, I wish Apple would let me fully remap the controls to any keyboard key or shortcut instead of just "start/stop music", "next song", and "transparency/noise cancelation". Specifically, I want the AirPod controls to let me easily skip ahead during podcasts (15-30 seconds), unmute or mute in Zoom/Slack/etc., and control the volume.

Ideally, this sort of flexibility will be how Apple turns AirPods into a "platform".


I could say exactly the same thing about my on-ear Bose wireless noise cancellation headphones. They are bigger, heavier and more cumbersome than AirPods, but so what. They still do the job, and I don't call them a "platform" any more than I would call my PC's keyboard a "platform".

Keyboards were the first platform on a computer.

You should instead think of "new platform" in the context of the mouse. It offered a significantly different user experience. We just happened to call that platform the "Graphical User Interface"; it could as easily have been called "the mouse platform" in its early days and made perfect sense.

It's not ideal to call AirPods a platform, but lend the writer some poetic license; what they are describing doesn't currently have a canonical phrase in the zeitgeist.


By that definition, you can call any paradigm-shifting invention a "platform" and drown all remaining meaning of the word in buzzword-speak. Let's just call it how it is - it's a new kind of peripheral, or a new interaction paradigm that unlocks new things. It's revolutionary, yes, but that doesn't make it a platform.

> By that definition, you can call any paradigm-shifting invention a "platform"

I'd argue that, until proven otherwise, you probably should.

> it's a new kind of peripheral, or a new interaction paradigm that unlocks new things.

Look, you're right! No doubt about that. The only problem with your mental model is that it is limiting, rather than expanding.

Even if they are wrong to call it a "platform", using wording that evokes an "expanding mental model" has benefits.

I'll give you an example of where your thinking might cause "limitation": specifically, the iPhone. In 2007, when the touch screen was added to a phone, Apple could have left everything else about the phone the same, i.e. "touch is just a new interaction paradigm"--basically a Palm Pilot without the App Store or a browser. However, they realized "touch isn't just a new interaction paradigm, it's a whole new enabler/platform of <stuff not previously possible>, let's push to create that and see what happens."

People might be very wrong that personalized audio, as a new interaction paradigm, will lead to a new enabler/platform of <stuff not previously possible>. But just because it might be wrong or not lead anywhere doesn't mean we should discourage its pursuit.


Said more succinctly, benevolent/inconsequential "lying" for society's or your own improvement can be a very useful modus operandi.

[0] https://www.youtube.com/watch?v=3LopI4YeC4I [1] https://www.npr.org/2021/03/12/976337203/useful-delusions-ex...


I would argue that the $10 bluetooth earbuds I got at Menards have done exactly the same for the way I use my iPad mini. (I was still using wired headphones exclusively until about a year ago, but started to find it limiting with a device that can't fit in my pocket.)

The word platform is dead. Now it means nothing. It's joined the ranks of 'service as a service', 'platform on demand', and so on.

Maybe I’m in the minority here, but I feel that when Apple pushes an I/O standard, it’s been because there isn’t really a better option with wide adoption. When they pushed FireWire, it was at least 10x faster than the SCSI interfaces it replaced while being a significantly smaller connector.

Lightning and the iPod dock connector are the only ones that are proprietary. But from all accounts Apple is planning to get rid of the port altogether, which would leave Bluetooth as the audio connector of choice.


In regard to proprietary I/O, we have already seen the EU pushing for a standardised USB connector, which may hopefully come very soon [0]

[0] https://ec.europa.eu/growth/sectors/electrical-engineering/r...


This already happened. The industry settled around microUSB, and then later migrated to USB-C. Apple didn't like microUSB so they invented Lightning, and stuck with that while USB-C was developed and started being used by the other manufacturers.

This article seems to be asserting that Airpods have a Wifi radio (in addition to BT and NFC). I can't find anything to support that including the article he cites.

It does rather devalue his argument about why airpods are so good if he doesn't even understand which wireless standards are involved.

I have no experience with Airpods, but if they're good it's because of two things:

- Apple actually invests the time and effort needed for a good end-to-end system, from RF and antenna design all the way up to user-facing software.

- They control the whole stack: basically every line of code, and every gate in every chip involved in that end-to-end link. That means that proper development can achieve the best possible result.

I have some Samsung Galaxy Buds (and a Samsung phone). I have found them to be excellent from a stability and ease of use point of view. Admittedly less good from an audio quality point of view (which isn't a problem for me as I use them almost exclusively for podcasts and phone calls at which they do a good job).

At this point Apple really owns most things. Either by having officially bought them or by controlling platforms that things have become coupled to.

Apple doesn't want to own the Universe, it wants to own the profitable universe.

Thus it leaves Huawei, ZTE, Xiaomi, Baofeng, and Samsung to make cheap and unprofitable phones.

It wants to leave Dell and HP to make unprofitable PCs, etc.

If Apple were really to increase market share in phones and PCs, they would do so at the expense of their own profits. It is much better to let the "competitors" stumble on... For instance, the zombie OS Android has great value to Apple, because if it didn't exist then antitrust regulators would be at Apple's throat -- Apple's real genius is getting all of these other companies to pay to keep Apple out of antitrust court.


People in upper middle class U.S. circles have a very skewed perception of the popularity of iPhones (almost everyone has them). The vast majority of mobile phones in use worldwide run Android.

That's true but that doesn't mean Huawei is making great profits selling phones in India.

An Android phone is good for talking on the phone, getting SMS, and throwing out after 2 years. If it charges when you plug it into the charger 90% of the time, you are doing pretty well.

In rural areas of the United States we have a system of factory constructed housing which is scaled in cost to the low cost of land.

Go into such a "trailer" (I have) and you will likely find somebody with an Amazon Fire tablet who is not really sure that it has anything to do with Amazon or that it is a Fire, etc.

It is a great piece of hardware for what it costs, you can get most of the software you want to work on it by sideloading and have better luck than Bluestacks, SHIELD and similar solutions. (No matter what they will serve up offensive ads for games you will never play.)

People say "Android tablets are dead," but no, the Fire is very viable if you want something to read PDF files on the bus and not feel bad about treating it rough, because the replacement cost is so low.


>For instance the zombie OS Android has great value to Apple because if it didn't exist then antitrust regulators would be at Apple's throat

I was more responding to the "zombie OS" comment than the profits idea. I think I basically agree.


> and throw it out after 2 years.

I've only had one Android device remain usable significantly longer than 1 year.


>The vast majority of mobile phones in use worldwide run Android.

True, but irrelevant to the original point: Apple claims the lion's share of mobile profits.

Apple is perfectly happy for their competitors to focus on raw sales numbers vs. profits. I would be too if I were them.

Another analogy: complaining you didn't win the Presidency while citing the popular vote instead of the electoral college.


It's the Pareto principle: A small portion of the market is where most of the money is.

> Baofeng

I didn't know Baofeng made phones. I thought they only made radio HTs.


I think they make coats too.

Since when is Android a "zombie" OS? It's going strong and numerous, with many people who outright prefer it to iOS. It's not just an OS for "the poors".

These speculations gloss over some important points. Apple doesn’t control anything but their own ecosystem (even that only to some extent).

FireWire never made it big because they were too draconian in licensing. Ditto on their HomeKit certification: they ended up relaxing their security chip requirements to get even the low level of adoption they have today. They were arguably the primary enabler/driver of USB, SCSI, and the Sony rigid "floppy," but they did not initiate any of those.

Also, their M.O. is to dominate profit share, not market share. This doesn't always require them to commoditize their complement; often they basically ignore it.


> But why do the Airpods “just work?”

Ours don’t; the microphone went early on. It takes some fiddling to get them to connect to the intended phone, and sometimes the left earbud has to be put in and taken out 3-4 times to get sound. My wife and I almost always use our wired Lightning earphones in preference to the AirPods. We are both still resentful that our nice ordinary headphones no longer plug into our phones. The only reason we stick with Apple at this point is the better privacy.


Legal | privacy